I have TSV log files in which one column is populated by a JSON string.
I want to parse that column with JsonLoader in a Pig script. Every example I have seen uses JsonLoader on files where each row is just a JSON string; my rows also contain other columns that I want to skip, and I don't know how to do that.
The file looks like this:
foo bar {"version":1; "type":"an event"; "count": 1}
foo bar {"version":1; "type":"another event"; "count": 1}
How can I do that?
You are looking for the JsonStringToMap UDF provided in Elephant Bird: https://github.com/kevinweil/elephant-bird/search?q=JsonStringToMap&ref=cmdform
Sample File:
foo bar {"version":1, "type":"an event", "count": 1}
foo bar {"version":1, "type":"another event", "count": 1}
Pig Script:
REGISTER /path/to/elephant-bird.jar;
DEFINE JsonStringToMap com.twitter.elephantbird.pig.piggybank.JsonStringToMap();
raw = LOAD '/tmp/file.tsv' USING PigStorage('\t') AS (col1:chararray,col2:chararray,json_string:chararray);
parsed = FOREACH raw GENERATE col1,col2,JsonStringToMap(json_string);
ILLUSTRATE parsed; -- Just to show the output
Pre-processing (JSON as chararray/string):
-------------------------------------------------------------------------------------------------------
| raw | col1:chararray | col2:chararray | json_string:chararray |
-------------------------------------------------------------------------------------------------------
| | foo | bar | {"version":1, "type":"another event", "count": 1} |
-------------------------------------------------------------------------------------------------------
Post-processing (JSON as map):
-------------------------------------------------------------------------------------------------
| parsed | col1:chararray | col2:chararray | json:map(:chararray) |
-------------------------------------------------------------------------------------------------
| | foo | bar | {count=1, type=another event, version=1} |
-------------------------------------------------------------------------------------------------
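As a follow-up sketch (not part of the original answer): once the JSON string has been converted to a map, individual fields can be pulled out with Pig's map dereference operator '#' and cast to concrete types. This assumes the map field keeps the alias json shown in the ILLUSTRATE output above (otherwise give it an explicit alias in the FOREACH) and that the keys match the sample data.
-- Hypothetical continuation of the script above; field names come from the sample JSON.
events = FOREACH parsed GENERATE
    col1,
    col2,
    (int)       json#'version' AS version,
    (chararray) json#'type'    AS type,
    (int)       json#'count'   AS count;
DUMP events;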
Here's the same question asked just two days ago: How do you decode JSON in Pig that comes from a column?