I am trying to parse JSON incrementally, i.e. based on a condition.
Below is my JSON message; I am currently using JavaScriptSerializer to deserialize it.
string json = @"{""id"":2,
""method"":""add"",
""params"":
{""object"":
{""name"":""test"",
""id"":""1""},
""position"":""1""}
}";
JavaScriptSerializer js = new JavaScriptSerializer();
Message m = js.Deserialize<Message>(json);
The Message class is shown below:
public class Message
{
public string id { get; set; }
public string method { get; set; }
public Params @params { get; set; }
public string position { get; set; }
}
public class Params
{
public string name { get; set; }
public string id { get; set; }
}
The above code parses the message with no problems, but it parses the entire JSON at once. I want it to proceed with parsing only if the "method" parameter's value is "add". If it is not "add", then I don't want it to parse the rest of the message. Is there a way to do incremental parsing based on a condition in C#? (Environment: VS 2008 with .NET 3.5)
I have to admit I'm not as familiar with JavaScriptSerializer, but if you're open to using JSON.NET, it has a JsonReader that acts much like a DataReader.
using (var jsonReader = new JsonTextReader(myTextReader))
{
    while (jsonReader.Read())
    {
        // evaluate the current token and whether it's the property you want
        if (jsonReader.TokenType == JsonToken.PropertyName && (string)jsonReader.Value == "method")
        {
            jsonReader.Read(); // advance to the value of "method"
            if ((string)jsonReader.Value == "add")
            {
                // do what you want
            }
            else
            {
                break; // stop parsing the rest of the message
            }
        }
    }
}
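For what it's worth, `myTextReader` above can be any `TextReader`. A minimal sketch of wiring one up over the JSON string from the question (only `StringReader`, `File.OpenText`, and `JsonTextReader` are assumed here, all standard):

```csharp
using System.IO;
using Newtonsoft.Json;

// For an in-memory string, wrap it in a StringReader:
TextReader myTextReader = new StringReader(json);

// For a large file, stream it instead -- this is where incremental
// parsing pays off, since the whole document is never loaded at once:
// TextReader myTextReader = File.OpenText("message.json");
```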
Here are the generic and simple methods I use to parse, load and create very large JSON files. The code uses the now pretty much standard JSON.NET library. Unfortunately the documentation isn't very clear on how to do this, but it's not very hard to figure out either.
The code below assumes a scenario where you have a large number of objects that you want to serialize as a JSON array, and vice versa. We want to support very large files whose size is limited only by your storage device (not memory). So when serializing, the method takes IEnumerable<T>, and while deserializing it returns the same. This way you can process the entire file without being limited by memory.
I've used this code on files several GB in size with reasonable performance.
//Serialize a sequence of objects as a JSON array into the specified file
public static void SerializeSequenceToJson<T>(this IEnumerable<T> sequence, string fileName)
{
    using (var fileStream = File.CreateText(fileName))
        SerializeSequenceToJson(sequence, fileStream);
}

//Deserialize the specified file into an IEnumerable, assuming it holds an array of JSON objects
public static IEnumerable<T> DeserializeSequenceFromJson<T>(string fileName)
{
    using (var fileStream = File.OpenText(fileName))
        foreach (var responseJson in DeserializeSequenceFromJson<T>(fileStream))
            yield return responseJson;
}

//Utility methods to operate on streams instead of files
public static void SerializeSequenceToJson<T>(this IEnumerable<T> sequence, TextWriter writeStream, Action<T, long> progress = null)
{
    using (var writer = new JsonTextWriter(writeStream))
    {
        var serializer = new JsonSerializer();
        writer.WriteStartArray();
        long index = 0;
        foreach (var item in sequence)
        {
            if (progress != null)
                progress(item, index++);

            serializer.Serialize(writer, item);
        }
        writer.WriteEnd();
    }
}

public static IEnumerable<T> DeserializeSequenceFromJson<T>(TextReader readerStream)
{
    using (var reader = new JsonTextReader(readerStream))
    {
        var serializer = new JsonSerializer();
        if (!reader.Read() || reader.TokenType != JsonToken.StartArray)
            throw new Exception("Expected start of array in the deserialized json string");

        while (reader.Read())
        {
            if (reader.TokenType == JsonToken.EndArray) break;
            var item = serializer.Deserialize<T>(reader);
            yield return item;
        }
    }
}
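As a usage sketch (the `Person` type and the file name are made up for illustration, and the methods above are assumed to be in a static class that's in scope), streaming a large sequence out and back looks like:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical record type used only for this example.
public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

class Demo
{
    static void Main()
    {
        // Build the sequence lazily; items are produced one at a time.
        IEnumerable<Person> people = Enumerable.Range(0, 1000)
            .Select(i => new Person { Name = "P" + i, Age = i % 100 });

        // Stream the sequence to disk as one JSON array.
        people.SerializeSequenceToJson("people.json");

        // Stream it back; only one item is materialized at any moment,
        // so file size is bounded by storage, not memory.
        int count = DeserializeSequenceFromJson<Person>("people.json").Count();
        Console.WriteLine(count);
    }
}
```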
If you take a look at Json.NET, it provides a non-caching, forward-only JSON parser that will suit your needs.
See the JsonReader and JsonTextReader classes in the documentation.