Streaming a large list of data as JSON using Json.NET

Using ASP.NET MVC, I would like to write a JsonResult that streams the JSON string to the client rather than converting all of the data into a JSON string at once and then streaming it back to the client. I have actions that need to send very large data sets (over 300,000 records) as JSON, and I think the basic JsonResult implementation is not scalable.

I am using Json.NET, and I am wondering if there is a way to stream chunks of the JSON string as they are produced.

// Current implementation: serializes everything into one big string,
// then writes it to the response in a single call.
// (Data and formatting are supplied by the surrounding action.)
response.Write(Newtonsoft.Json.JsonConvert.SerializeObject(Data, formatting));
response.End();

// I know I can use the JsonSerializer instead:
Newtonsoft.Json.JsonSerializer serializer = new Newtonsoft.Json.JsonSerializer();
serializer.Serialize(textWriter, Data);

However, I am not sure how to get the chunks written to the textWriter, write them to the response, and call response.Flush() repeatedly until all 300,000 records have been converted to JSON.

Is this possible at all?

1 Answer

Assuming your final output is a JSON array and each "chunk" is one item in that array, you could try something like the following JsonStreamingResult class. It uses a JsonTextWriter to write the JSON to the output stream, and uses a JObject as a means to serialize each item individually before writing it to the writer. You could pass the JsonStreamingResult an IEnumerable implementation which can read items individually from your data source so that you don't have them all in memory at once. I haven't tested this extensively, but it should get you going in the right direction.

// Requires: System.Collections, System.IO, System.Text, System.Web.Mvc,
// Newtonsoft.Json, and Newtonsoft.Json.Linq.
public class JsonStreamingResult : ActionResult
{
    private readonly IEnumerable itemsToSerialize;

    public JsonStreamingResult(IEnumerable itemsToSerialize)
    {
        this.itemsToSerialize = itemsToSerialize;
    }

    public override void ExecuteResult(ControllerContext context)
    {
        var response = context.HttpContext.Response;
        response.ContentType = "application/json";
        response.ContentEncoding = Encoding.UTF8;

        JsonSerializer serializer = new JsonSerializer();

        using (StreamWriter sw = new StreamWriter(response.OutputStream))
        using (JsonTextWriter writer = new JsonTextWriter(sw))
        {
            writer.WriteStartArray();
            foreach (object item in itemsToSerialize)
            {
                // Serialize one item at a time so the full result set is
                // never held in memory as a single JSON string.
                JObject obj = JObject.FromObject(item, serializer);
                obj.WriteTo(writer);

                // Push this item's JSON out to the client immediately.
                writer.Flush();
            }
            writer.WriteEndArray();
        }
    }
}
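
For completeness, here is a minimal sketch of how a controller might use this result. The OrdersController, the Order type, and GetOrdersLazily are hypothetical names introduced for illustration; the point is that the enumerable uses yield return, so records are produced one at a time rather than materialized up front.

// Hypothetical usage sketch -- the controller, the Order type, and
// GetOrdersLazily are illustrative names, not part of the answer above.
public class OrdersController : Controller
{
    public ActionResult AllOrders()
    {
        // The enumerable is consumed lazily inside ExecuteResult,
        // so records stream to the client as they are produced.
        return new JsonStreamingResult(GetOrdersLazily());
    }

    private IEnumerable<Order> GetOrdersLazily()
    {
        // In a real action this would iterate a data reader or similar;
        // yield return keeps only one record in memory at a time.
        for (int i = 0; i < 300000; i++)
        {
            yield return new Order { Id = i, Name = "Order " + i };
        }
    }
}

public class Order
{
    public int Id { get; set; }
    public string Name { get; set; }
}

Note that JObject.FromObject still builds a JSON tree for each item, but only one item is buffered at a time; if even that overhead matters, you could instead call serializer.Serialize(writer, item) directly for each item.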