I see a feature of the Jackson JSON parser as an inherent problem for my case. I have to parse a file of unknown origin that might not comply with the JSON format and can therefore contain multiple identical key names at the same level. In that case, calling getFieldNames() on the node returns only one entry among those duplicates, and get(String) gives me only one of the JSON nodes sharing that key, whereas I need all of them. Any comments or solutions on this?
Most JSON parsers will either reject such input out of hand or silently keep only one of the values, since duplicate keys at the same nesting level are discouraged by the JSON specification (object names "should" be unique). However, certain parsers allow you to handle the duplicates in a variety of ways.
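To see the problem with the default behaviour, here is a minimal sketch (assuming Jackson Databind on the classpath and a made-up input in which "tag" is repeated):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class DuplicateKeyDemo {
    public static void main(String[] args) throws Exception {
        // "tag" appears twice at the same nesting level
        String json = "{\"tag\":\"first\",\"tag\":\"second\"}";
        JsonNode root = new ObjectMapper().readTree(json);
        // The tree model keeps only one of the values; the other is silently dropped
        System.out.println(root.get("tag"));
    }
}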
One way to handle this in Jackson would be to map the regular attributes into an entity class and then collect the potential duplicates via a @JsonAnySetter method.
import com.fasterxml.jackson.annotation.JsonAnySetter;
import com.google.common.collect.LinkedListMultimap;
import com.google.common.collect.ListMultimap;

public class Bag {
    // Guava multimap: keeps every value bound to a key, in insertion order
    final ListMultimap<String, Object> multimap = LinkedListMultimap.create();

    // regular properties, constructors etc.

    // Invoked by Jackson for every property without a matching setter,
    // including repeated occurrences of the same key
    @JsonAnySetter
    public void add(final String key, final Object value) {
        multimap.put(key, value);
    }
}
Note the use of a multimap: regular hash maps cannot contain duplicate keys, so a multimap is a requirement for a working solution. After deserializing the input file, all 'regular' JSON attributes will be mapped to their corresponding entity properties, whereas all unmapped attributes, including the duplicates, will be stored in the multimap and remain available for manual processing.
final List<Object> duplicatedValues = multimap.get(someKey);
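As a minimal, hypothetical sketch of end-to-end usage (assuming Jackson Databind and Guava on the classpath, a made-up input string in which "tag" is repeated, and direct field access for brevity, i.e. the demo class lives in the same package as Bag), deserialization and lookup might look like this:

import java.util.List;
import com.fasterxml.jackson.databind.ObjectMapper;

public class BagDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical input: "tag" occurs twice at the same nesting level
        String json = "{\"tag\":\"first\",\"tag\":\"second\"}";

        Bag bag = new ObjectMapper().readValue(json, Bag.class);

        // Both occurrences are preserved, in document order
        List<Object> tags = bag.multimap.get("tag");
        System.out.println(tags); // [first, second]
    }
}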
Alternatively, you could create a custom deserializer, which will receive all tokens (regardless of whether they are duplicates or not).
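For illustration, such a deserializer might look like the following sketch (the class name MultimapDeserializer is made up; it assumes a flat object with scalar values only, and it would still need to be registered, e.g. via @JsonDeserialize(using = ...) on a field or through a SimpleModule):

import java.io.IOException;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.google.common.collect.LinkedListMultimap;
import com.google.common.collect.Multimap;

// Hypothetical deserializer that collects every field it encounters, duplicates included
public class MultimapDeserializer extends JsonDeserializer<Multimap<String, String>> {

    @Override
    public Multimap<String, String> deserialize(JsonParser p, DeserializationContext ctxt)
            throws IOException {
        Multimap<String, String> result = LinkedListMultimap.create();
        // Walk the token stream directly, so repeated keys are never collapsed
        while (p.nextToken() != JsonToken.END_OBJECT) {
            String key = p.getCurrentName();
            p.nextToken();                      // advance to the value token
            result.put(key, p.getValueAsString());
        }
        return result;
    }
}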