
How to incrementally write into a json file

I am writing a program which requires me to generate a very large JSON file. The traditional way is to build a list of dictionaries and dump it with json.dump(), but the list has grown so big that even the total memory plus swap space cannot hold it before it is dumped. Is there any way to stream it to a JSON file, i.e., write the data into the JSON file incrementally?

fanchyna asked Mar 22 '16 14:03


2 Answers

I know this is a year late, but the question is still open and I'm surprised json.iterencode() has not been mentioned.

The potential problem with iterencode in this case is that you would want an iterative handle on the large data set by using a generator, and the json encoder does not serialize generators.

The way around this is to subclass the list type and override the __iter__ magic method so that it yields the output of your generator.

Here is an example of such a list subclass.

class StreamArray(list):
    """
    Converts a generator into a list object that is JSON serialisable
    while still retaining the iterative nature of a generator.

    I.e. it converts it to a list without having to exhaust the generator
    and keep its contents in memory.
    """
    def __init__(self, generator):
        self.generator = generator
        # Report a non-zero length up front so the encoder does not
        # short-circuit and emit an empty list.
        self._len = 1

    def __iter__(self):
        self._len = 0
        for item in self.generator:
            yield item
            self._len += 1

    def __len__(self):
        """
        The JSON encoder calls len() to decide whether the object can be
        encoded as a list, so a truthy value must be returned here.
        """
        return self._len

The usage from here on is quite simple: get the generator handle, pass it into the StreamArray class, pass the StreamArray object into iterencode(), and iterate over the chunks. The chunks are JSON-formatted output that can be written directly to file.

Example usage:

import json

# Function that will iteratively generate a large set of data.
def large_list_generator_func():
    for i in range(5):
        chunk = {'hello_world': i}
        print('Yielding chunk: ', chunk)
        yield chunk

# Write the contents to file:
with open('/tmp/streamed_write.json', 'w') as outfile:
    large_generator_handle = large_list_generator_func()
    stream_array = StreamArray(large_generator_handle)
    for chunk in json.JSONEncoder().iterencode(stream_array):
        print('Writing chunk: ', chunk)
        outfile.write(chunk)

The output shows that yields and writes happen interleaved:

Yielding chunk:  {'hello_world': 0}
Writing chunk:  [
Writing chunk:  {
Writing chunk:  "hello_world"
Writing chunk:  : 
Writing chunk:  0
Writing chunk:  }
Yielding chunk:  {'hello_world': 1}
Writing chunk:  , 
Writing chunk:  {
Writing chunk:  "hello_world"
Writing chunk:  : 
Writing chunk:  1
Writing chunk:  }
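For what it's worth, the same wrapper can also be passed straight to json.dump(), which iterates over the encoder's chunks internally and writes each one for you. A self-contained sketch (the class is repeated here so the snippet runs on its own):

```python
import json

class StreamArray(list):
    """Wraps a generator so the JSON encoder can stream it."""
    def __init__(self, generator):
        self.generator = generator
        self._len = 1  # truthy so the encoder does not emit an empty list

    def __iter__(self):
        self._len = 0
        for item in self.generator:
            yield item
            self._len += 1

    def __len__(self):
        return self._len

def large_list_generator_func():
    for i in range(5):
        yield {'hello_world': i}

# json.dump writes each encoded chunk to the file as it is produced,
# so the full JSON string is never held in memory.
with open('/tmp/streamed_dump.json', 'w') as outfile:
    json.dump(StreamArray(large_list_generator_func()), outfile)
```

The resulting file can be read back with an ordinary json.load().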
Werner Smit answered Sep 20 '22 15:09

Sadly the json library does not have any incremental writing facilities, and therefore cannot do what you want.

That's clearly going to be a very large file - would some other representation be more appropriate?

Otherwise the best suggestion I can make is to dump each list entry to an in-memory structure and write them out yourself with the necessary delimiters ([ at the beginning, , between entries and ] at the end) to construct the JSON that you need.

If formatting is important, you should know that the wrapper text your program writes will destroy correct indentation, but indentation is only for humans, so it makes no difference to the semantics of the JSON structure.
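That manual approach can be sketched as follows (generate_records is a hypothetical stand-in for the real data source):

```python
import json

def generate_records():
    """Hypothetical generator standing in for the real data source."""
    for i in range(3):
        yield {"record": i}

with open("/tmp/incremental.json", "w") as outfile:
    outfile.write("[")
    for i, record in enumerate(generate_records()):
        if i:
            outfile.write(",")       # delimiter between entries
        json.dump(record, outfile)   # only one entry is in memory at a time
    outfile.write("]")
```

The file produced this way is valid JSON and can be read back with a normal json.load().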

holdenweb answered Sep 22 '22 15:09