 

reading rows of big csv file in python

I have a very big csv file which I cannot load into memory in full. So I want to read it piece by piece, convert each piece into a numpy array, and then do some more processing.

I already checked: Lazy Method for Reading Big File in Python?

But the problem there is that it uses a plain file reader, and I am unable to find any option for specifying a size in csv.reader.

Also, since I want to convert rows into a numpy array, I don't want to read any line in half. So rather than specifying a byte size, I want something where I can specify a "number of rows" in the reader.

Is there any built-in function or easy way to do it?

Shweta asked Oct 31 '22

1 Answer

The csv.reader won't read the whole file into memory. It lazily iterates over the file, line by line, as you iterate over the reader object. So you can just use the reader as you normally would, but break out of your iteration after you've read however many lines you want to read. You can see this in the C code used to implement the reader object.
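For example, a minimal sketch of stopping after a fixed number of rows (using an in-memory `io.StringIO` here to stand in for a large file opened with `open("big.csv", newline="")`):

```python
import csv
import io

# Stand-in for a large file; real code would use open("big.csv", newline="")
f = io.StringIO("a,1\nb,2\nc,3\nd,4\ne,5\n")

reader = csv.reader(f)
chunk = []
for i, row in enumerate(reader):
    chunk.append(row)
    if i + 1 >= 2:  # stop after 2 rows
        break

# Only the first 2 lines have been parsed; the reader (and file) can be
# resumed later exactly where it left off.
```

Because the reader is just an iterator over the file, breaking early leaves both the reader and the underlying file positioned at the next unread row.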

Initializer for the reader object:
static PyObject *
csv_reader(PyObject *module, PyObject *args, PyObject *keyword_args)
{
    PyObject * iterator, * dialect = NULL;
    ReaderObj * self = PyObject_GC_New(ReaderObj, &Reader_Type);

    if (!self)
        return NULL;

    self->dialect = NULL;
    self->fields = NULL;
    self->input_iter = NULL;
    self->field = NULL;
    // stuff we don't care about here
    // ...
    self->input_iter = PyObject_GetIter(iterator);  // here we save the iterator (file object) we passed in
    if (self->input_iter == NULL) {
        PyErr_SetString(PyExc_TypeError,
                        "argument 1 must be an iterator");
        Py_DECREF(self);
        return NULL;
    }

static PyObject *
Reader_iternext(ReaderObj *self)  // This is what gets called when you call `next(reader_obj)` (which is what a for loop does internally)
{
    PyObject *fields = NULL;
    Py_UCS4 c;
    Py_ssize_t pos, linelen;
    unsigned int kind;
    void *data;
    PyObject *lineobj;

    if (parse_reset(self) < 0)
        return NULL;
    do {
        lineobj = PyIter_Next(self->input_iter);  // Equivalent to calling `next(input_iter)`
        if (lineobj == NULL) {
            /* End of input OR exception */
            if (!PyErr_Occurred() && (self->field_len != 0 ||
                                      self->state == IN_QUOTED_FIELD)) {
                if (self->dialect->strict)
                    PyErr_SetString(_csvstate_global->error_obj,
                                    "unexpected end of data");
                else if (parse_save_field(self) >= 0)
                    break;
            }
            return NULL;
        }

As you can see, next(reader_object) calls next(file_object) internally. So both are consumed line by line, without ever reading the entire file into memory.
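Putting this together for the original use case, here is a sketch of a chunking helper built on `itertools.islice` (the function name `read_chunks` and the chunk size are my own choices, not part of the csv module; each yielded chunk could be passed to `numpy.array(chunk)` for further processing):

```python
import csv
import io
from itertools import islice

def read_chunks(reader, n):
    """Yield lists of up to n rows until the reader is exhausted."""
    while True:
        chunk = list(islice(reader, n))
        if not chunk:
            return
        yield chunk  # e.g. numpy.array(chunk, dtype=float) for processing

# Stand-in for a large file; real code would use open("big.csv", newline="")
f = io.StringIO("1,2\n3,4\n5,6\n7,8\n9,10\n")
chunks = list(read_chunks(csv.reader(f), 2))
```

Each call to `islice(reader, n)` advances the reader by at most n rows, so memory usage stays bounded by the chunk size regardless of the file's total length.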

dano answered Nov 15 '22