I'm reading a file in bash, taking values out of each line and saving them to another file. The file has ~100k lines, and it takes around 25 minutes to read and rewrite them all.
Is there a faster way to write to the file? Right now I'm just iterating through the lines, parsing some values and saving them like this:
while IFS= read -r line; do
    zip="$(echo "$line" | cut -c 1-8)"
    echo "$zip"
done < file_one.txt
Everything works fine and the values are parsed correctly; I just want to know how I can optimize the process (if I even can).
Thanks
The bash loop only slows it down, especially the part where you invoke an external program (cut) once per iteration. You can do all of it in one cut:
cut -c 1-8 file_one.txt
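Since you mentioned saving the values to another file, you can redirect the output of that single command; a minimal sketch, where file_two.txt is just a placeholder name for your output file:

# Extract the first 8 characters of every line in one pass
# and write them to a new file (file_two.txt is a placeholder name).
cut -c 1-8 file_one.txt > file_two.txt

A single cut invocation processes the whole file in one pass instead of spawning a new process for each of the ~100k lines, which is where most of the 25 minutes was going.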