I am working on a bash script (using jq for JSON parsing).
I searched for a while for both requirements but could not find anything concrete. Please advise. The highlighted steps below (marked with ***) are the points where I need help.
Sample Flow:
create empty FINAL array
for (eachService in serviceList)
a. CURL <service_response> returning JSON array of objects
b. use jq filters to parse JSON response, apply some logic and modify elements in response as needed
c. ***add this JSON array to FINAL array***
***LOOP through FINAL array, one object at a time, and write each to CSV.***
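The flow above might be sketched roughly as follows. This is only an illustration: the two here-strings stand in for the curl responses (real code would use `resp=$(curl -s "$service/path")`), and step b is left as a no-op placeholder.

```shell
#!/bin/bash
# Sample payloads standing in for the two curl responses (step a).
curl1='[{"id":"123","startDate":"2016-12-09T00:00:00Z","calls":4},
        {"id":"456","startDate":"2016-12-09T00:00:00Z","calls":22}]'
curl2='[{"id":"789","startDate":"2016-12-09T00:00:00Z","calls":8},
        {"id":"147","startDate":"2016-12-09T00:00:00Z","calls":10}]'

final='[]'   # create empty FINAL array
for resp in "$curl1" "$curl2"; do
  # step b: apply per-element jq logic here if needed, e.g. resp=$(jq '...' <<< "$resp")
  # step c: append this response's array onto FINAL
  final=$(jq -n --argjson a "$final" --argjson b "$resp" '$a + $b')
done

# final step: one CSV row per object
jq -r '.[] | [.id, .startDate, .calls] | @csv' <<< "$final"
```

`--argjson` passes each value to jq as parsed JSON rather than a string, so `$a + $b` is plain array concatenation.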
Sample Data:
CURL Response 1 (ex: $curl1):
[
{
"id":"123",
"startDate": "2016-12-09T00:00:00Z",
"calls":4
},
{
"id":"456",
"startDate": "2016-12-09T00:00:00Z",
"calls":22
}
]
CURL Response 2 (ex: $curl2):
[
{
"id":"789",
"startDate": "2016-12-09T00:00:00Z",
"calls":8
},
{
"id":"147",
"startDate": "2016-12-09T00:00:00Z",
"calls":10
}
]
NEEDED OUTPUT ($final):
[
{
"id":"123",
"startDate": "2016-12-09T00:00:00Z",
"calls":4
},
{
"id":"456",
"startDate": "2016-12-09T00:00:00Z",
"calls":22
},
{
"id":"789",
"startDate": "2016-12-09T00:00:00Z",
"calls":8
},
{
"id":"147",
"startDate": "2016-12-09T00:00:00Z",
"calls":10
}
]
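For reference, given the two sample responses above, producing `$final` is a one-liner: `-s` (`--slurp`) reads both arrays into a single array of arrays, and `add` concatenates them. (Variable names follow the question; the payloads are the sample data above.)

```shell
#!/bin/bash
curl1='[{"id":"123","startDate":"2016-12-09T00:00:00Z","calls":4},{"id":"456","startDate":"2016-12-09T00:00:00Z","calls":22}]'
curl2='[{"id":"789","startDate":"2016-12-09T00:00:00Z","calls":8},{"id":"147","startDate":"2016-12-09T00:00:00Z","calls":10}]'

# Slurp the two JSON texts into [[...],[...]], then flatten with add.
final=$(printf '%s\n' "$curl1" "$curl2" | jq -s 'add')
jq 'length' <<< "$final"
```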
jq can deal with multiple input arrays. You can pipe the whole output of the loop to it:
for service in "${services[@]}"; do   # assumes services is a bash array
    curl -s "$service/path"
done | jq -r '.[] | [.id, .startDate, .calls] | @csv'
Note that the CSV transformation can be done with the `@csv` builtin.
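`@csv` expects an array and pairs with `-r` (raw output) so the CSV line isn't wrapped in an extra layer of JSON quoting. A minimal standalone illustration:

```shell
#!/bin/bash
# @csv turns a JSON array into one CSV line: strings keep their quotes, numbers don't.
out=$(echo '{"id":"123","startDate":"2016-12-09T00:00:00Z","calls":4}' |
  jq -r '[.id, .startDate, .calls] | @csv')
echo "$out"
# "123","2016-12-09T00:00:00Z",4
```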
As @hek2mlg pointed out, it should be possible to invoke jq just once. If the input is sufficiently uniform (admittedly, maybe a big "if"), you could even avoid having to name the fields explicitly, e.g.:
$ for service in "${services[@]}"; do
    curl -s "$service/path"
done | jq -sr 'add[] | [.[]] | @csv'
Output:
"123","2016-12-09T00:00:00Z",4
"456","2016-12-09T00:00:00Z",22
"789","2016-12-09T00:00:00Z",8
"147","2016-12-09T00:00:00Z",10
Note that using -s allows you to perform arbitrary computations on all the inputs, e.g. counting them.
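For instance, counting all objects across the responses (with the sample payloads inlined in place of the curl calls):

```shell
#!/bin/bash
# Two JSON texts on one stream; -s slurps them into [[...],[...]].
count=$(printf '%s\n' \
  '[{"id":"123","calls":4},{"id":"456","calls":22}]' \
  '[{"id":"789","calls":8},{"id":"147","calls":10}]' |
  jq -s 'add | length')
echo "$count"
# 4
```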