If I run:
cat <file> | jq 
I get:
{
  "user": "alex",
  "num": "486",
  "time": "Thu Jun  6 16:26:06 PDT 2019",
  "pwd": "/Users/alex/codes/ores/prompt-command",
  "pid": 11047,
  "exit_code": 0,
  "cmd": "echo '123'"
}
{
  "user": "john",
  "num": "487",
  "time": "Thu Jun  6 16:26:24 PDT 2019",
  "pwd": "/Users/alex/codes/ores/prompt-command",
  "pid": 11108,
  "exit_code": 5,
  "cmd": "echo '456'"
}
{
  "user": "alex",
  "num": "488",
  "time": "Thu Jun  6 16:26:59 PDT 2019",
  "pwd": "/Users/alex/codes/ores/prompt-command",
  "pid": 11141,
  "exit_code": 5,
  "cmd": "echo '789'"
}
but instead of all those fields, I just want some output like:
alex echo '123'
alex echo '789'
so I tried this:
cat <file> | jq -r '.user .cmd'
but that didn't work; I got this error:
jq: error (at <stdin>:63): Cannot index string with string "cmd"
I also want to filter it so I only see my commands, something like:
cat <file> | jq -r '.user=alex .cmd'
Use @tsv to generate tab-separated values as output:
jq -r '[.user, .cmd] | @tsv' <yourfile
...emits, given your input file:
alex    echo '123'
john    echo '456'
alex    echo '789'
...though if you're filtering for only your user account, you can just print cmd directly, since the user value is known:
jq -r 'select(.user == "alex") | .cmd' 
When you write .user .cmd you are asking for the "cmd" field of the JSON object at .user. To obtain both the .user and .cmd values, you could use the "," operator:
.user, .cmd
The above, however, will produce two lines.  There are many options for emitting multiple values on a single line.  You might wish to consider using string interpolation; or wrapping the values in square brackets and then using one of @csv, @tsv, or join/1; or using the -j command-line option.
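For example, here are sketches of the string interpolation and join/1 approaches; the input filename and the select filter for "alex" are assumptions based on the question:
jq -r 'select(.user == "alex") | "\(.user) \(.cmd)"' <yourfile
jq -r 'select(.user == "alex") | [.user, .cmd] | join(" ")' <yourfile
Either of these produces space-separated lines matching the desired output:
alex echo '123'
alex echo '789'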
This is all pretty clearly explained in the standard jq documentation (see e.g. https://stackoverflow.com/tags/jq/info), as is the use of select for making a selection.