I need to pick some numbers out of some text files. I can pick out the lines I need with grep, but I didn't know how to extract the numbers from those lines. A colleague showed me how to do this from bash with Perl:
cat results.txt | perl -pe 's/.+(\d\.\d+)\.\n/\1 /'
However, I usually code in Python, not Perl. So my question is, could I have used Python in the same way? I.e., could I have piped something from bash to Python and then gotten the result straight to stdout? ... if that makes sense. Or is Perl just more convenient in this case?
As time passes, Python seems to be taking Perl's place for this kind of work: it is fast, well suited to processing large amounts of text, and has powerful built-in facilities.
A rough recipe for converting simple Perl scripts to Python: remove the ';' at the end of each line, replace the curly brackets with indentation, and convert variable names from $x, %x, or @x to plain x.
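As a toy illustration of those steps (a sketch of my own, not part of the original post), a tiny Perl loop and its Python counterpart:

# Perl original:
#   foreach my $w ("foo", "bar", "baz") { print "$w\n"; }
#
# Python version: no semicolons, indentation instead of curly braces,
# and a plain name instead of the sigil-prefixed $w.
for w in ["foo", "bar", "baz"]:
    print(w)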
Yes, you can use Python from the command line: python -c <stuff> will run <stuff> as Python code. Example:
python -c "import sys; print(sys.path)"
There isn't a direct equivalent to Perl's -p option (the automatic line-by-line input/output processing), but that's mostly because Python doesn't use the same concept of $_ and whatnot that Perl does - in Python, all input and output is done manually (via raw_input()/input() and print/print()).
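If you want something closer to Perl's -p behaviour, a minimal sketch of my own (read stdin line by line, transform, print) looks roughly like this; the upper-casing is just a placeholder transformation:

import sys

# Rough stand-in for Perl's -p loop: read each line from stdin,
# transform it, and print it back out.
for line in sys.stdin:
    line = line.rstrip("\n")   # drop the trailing newline, like chomp
    print(line.upper())        # placeholder transformation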
For your particular example:
cat results.txt | python -c "import re, sys; print(''.join(re.sub(r'.+(\d\.\d+)\.\n', r'\1 ', line) for line in sys.stdin))"
(Obviously somewhat more unwieldy. It's probably better to just write the script to do it in actual Python.)
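For instance, a standalone script along these lines (a sketch, not from the original answer; the regex and the space-separated output mirror the one-liner above) would do the same job:

import fileinput
import re

# Pull the trailing decimal number (e.g. "3.14") out of each matching line.
# fileinput reads the files named on the command line, or stdin if none.
pattern = re.compile(r'.+(\d\.\d+)\.$')

for line in fileinput.input():
    match = pattern.match(line.rstrip('\n'))
    if match:
        print(match.group(1), end=' ')
print()

Saved as, say, extract.py (a hypothetical name), it could be run as python3 extract.py results.txt or fed from a pipe.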