I'm trying to reformat the output of the last command, e.g.:
last -adn 10 | head -n -2 | awk -F' {2,}' '{ print "USER:",$1,"IP:",$5 }'
>> last -adn 10 | head -n -2
root pts/0 Tue Jul 10 13:51 still logged in 10.102.11.34
reboot system boot Fri Jun 22 09:37 (18+04:19) 0.0.0.0
I would like my output to be something like:
>> last -adn 10 | head -n -2 | awk -F' {2,}' '{ print "USER:",$1,"IP:",$5 }'
USER: root IP: 10.102.11.34 TIME: Tue Jul 10 13:51
I've tried every method described here, and I can't figure out why this is not working for me. When executing that command, it simply stores the whole line in $1, and the others are blank.
awk -F' {2,}'   # means: two or more spaces
awk -F'  +'     # means: one space, then one or more spaces
These two commands mean the same thing: use a run of two or more spaces as the field delimiter.
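To see the equivalence in action, here is a minimal sketch (the input string is made up for illustration): the `'  +'` separator splits on runs of two or more spaces, so single spaces inside a field are preserved.

```shell
# Split on runs of two or more spaces; the single space in
# "bar baz" is kept inside field 2.
echo 'foo  bar baz   qux' | awk -F'  +' '{ print $1 "|" $2 "|" $3 }'
# → foo|bar baz|qux
```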
The awk variables $1 through $NF represent the fields of each record and should not be confused with shell positional parameters, which use the same style of names. Inside an awk script, $1 refers to field 1 of the record and $2 to field 2.
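A quick illustration of the distinction (sample input invented for this sketch): because the awk script is in single quotes, the shell never expands $1 and $2 there; awk interprets them as field references.

```shell
# $1 and $2 here are awk field references, not shell parameters:
# the single quotes keep the shell from touching them.
echo 'alice 10.0.0.5' | awk '{ print "user=" $1, "ip=" $2 }'
# → user=alice ip=10.0.0.5
```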
grep should be slightly faster, because awk does more with each input line than just search it for a regexp: if a field is referenced in the script (which it isn't in this case), awk splits each input line into fields based on the field-separator value and populates its builtin variables.
Older versions of gawk do not enable interval expressions (such as {2,}) by default; you can enable them with the --re-interval flag. Recent gawk (4.0 and later) enables them by default, but other awk implementations may not support them at all.
If you can't enable the flag, you can use a more explicit match. For example:
$ echo 'foo  bar baz' |
  awk -F'[[:space:]][[:space:]][[:space:]]*' '{print $2}'
bar baz
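Putting the pieces together for the original question, here is a sketch using the portable two-or-more-whitespace separator. The sample line and the field numbers ($1 for the user, $5 for the IP, $3 for the login time) follow the last output shown above; the exact columns can vary between systems, so adjust the field numbers to your own last output.

```shell
# Feed a sample "last -adn" line through awk, splitting on runs of
# two or more whitespace characters. '[[:space:]][[:space:]]+' is
# plain ERE, so no interval-expression support is needed.
printf 'root     pts/0        Tue Jul 10 13:51   still logged in    10.102.11.34\n' |
awk -F'[[:space:]][[:space:]]+' '{ print "USER:", $1, "IP:", $5, "TIME:", $3 }'
# → USER: root IP: 10.102.11.34 TIME: Tue Jul 10 13:51
```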