I'm trying to parse a text file. The file has username, address, and phone in the following format:
Name: John Doe1
address : somewhere
phone: 123-123-1234
Name: John Doe2
address : somewhere
phone: 123-123-1233
Name: John Doe3
address : somewhere
phone: 123-123-1232
Only for almost 10k users :) What I would like to do is convert those rows to columns, for example:
Name: John Doe1 address : somewhere phone: 123-123-1234
Name: John Doe2 address : somewhere phone: 123-123-1233
Name: John Doe3 address : somewhere phone: 123-123-1232
I would prefer to do it in bash, but if you know how to do it in Python that would be great too. The file that has this information is /root/docs/information. Any tips or help would be much appreciated.
One way with GNU awk:

awk 'BEGIN { FS="\n"; RS=""; OFS="\t\t" } { print $1, $2, $3 }' file.txt
Results:
Name: John Doe1 address : somewhere phone: 123-123-1234
Name: John Doe2 address : somewhere phone: 123-123-1233
Name: John Doe3 address : somewhere phone: 123-123-1232
Note that I've set the output field separator (OFS) to two tab characters (\t\t). You can change this to whatever character or set of characters you please. This relies on GNU awk's paragraph mode (RS=""), so it expects a blank line between records. HTH.
With a short Perl one-liner:
$ perl -ne 'END{print "\n"}chomp; /^$/ ? print "\n" : print "$_\t\t"' file.txt
Output:
Name: John Doe1 address : somewhere phone: 123-123-1234
Name: John Doe2 address : somewhere phone: 123-123-1233
Name: John Doe3 address : somewhere phone: 123-123-1232
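Since you said a Python option would be welcome too, here is a minimal sketch. Rather than relying on blank lines between records (as the awk and Perl answers do), it starts a new record at each "Name:" line, so it works either way. The function name flatten_records and the inline sample data are my own; point it at your /root/docs/information file instead.

```python
def flatten_records(lines, sep="\t\t"):
    """Join each Name/address/phone record onto one line, separated by `sep`.

    A new record starts whenever a line begins with "Name:"; blank lines
    are skipped, so the input may or may not have separators between records.
    """
    rows, current = [], []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        if line.startswith("Name:") and current:
            rows.append(sep.join(current))
            current = []
        current.append(line)
    if current:  # flush the last record
        rows.append(sep.join(current))
    return rows


if __name__ == "__main__":
    # Replace this sample with: open("/root/docs/information") as f: ...
    sample = [
        "Name: John Doe1",
        "address : somewhere",
        "phone: 123-123-1234",
        "Name: John Doe2",
        "address : somewhere",
        "phone: 123-123-1233",
    ]
    for row in flatten_records(sample):
        print(row)
```

For 10k records this is a single streaming pass, so it should be quick; to write the result out, redirect stdout or write the returned rows to a file.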