Using sed or similar how would you extract lines from a file? If I wanted lines 1, 5, 1010, 20503 from a file, how would I get these 4 lines?
What if I have a fairly large number of lines to extract? If I had a file with 100 lines, each containing a line number that I want to extract from another file, how would I do that?
If you want your file to be split based on the number of lines in each chunk rather than the number of bytes, you can use split's -l (lines) option. With -l 1000, each output file will have 1,000 lines except, of course, for the last one, which may have fewer.
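For example (the file name bigfile and the chunk_ prefix are just placeholders), something like this would produce chunk_aa, chunk_ab, and so on, each 1,000 lines long except possibly the last:

split -l 1000 bigfile chunk_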
Syntax for reading a file line by line in a Bash/Unix/Linux shell: the -r option passed to the read command prevents backslash escapes from being interpreted, and adding IFS= before read prevents leading/trailing whitespace from being trimmed: while IFS= read -r line; do COMMAND_on "$line"; done < input
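A slightly fuller sketch of the same loop, assuming the input is input.txt and the per-line command is just a printf:

while IFS= read -r line; do
    printf '%s\n' "$line"
done < input.txt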
With awk it's as simple as:
awk 'NR==1 || NR==5 || NR==1010 || NR==20503' "file"
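Since 20503 is the last line you want, you can also tell awk to stop reading there, which helps on a big file; something like:

awk 'NR==1 || NR==5 || NR==1010 || NR==20503 {print} NR==20503 {exit}' "file"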
@OP, you can do this more easily and efficiently with awk. For your first question:
awk 'NR~/^(1|5|1010|20503)$/{print}' file
For the second question:
awk 'FNR==NR{a[$1];next}(FNR in a){print}' file_with_linenr file
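FNR==NR is true only while awk is reading the first file, so the line numbers get stored as keys of the array a; while reading the second file, a line is printed whenever its line number is in a. Assuming the numbers live in line_numbers.txt, one per line, and the data in data.txt, the call would look like:

awk 'FNR==NR{a[$1];next}(FNR in a){print}' line_numbers.txt data.txt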
Something like "sed -n '1p;5p;1010p;20503p'. Execute the command "man sed" for details.
For your second question, I'd transform the input file into a bunch of sed(1) commands to print the lines I wanted.
This ain't pretty and it could exceed command length limits under some circumstances*:
sed -n "$(while read a; do echo "${a}p;"; done < line_num_file)" data_file
Or its much slower but more attractive, and possibly better-behaved, sibling:
while read a; do echo "${a}p;"; done < line_num_file | xargs -I{} sed -n \{\} data_file
A variation:
xargs -a line_num_file -I{} sed -n \{\}p\; data_file
You can speed up the xargs versions a little bit by adding the -P option with some large argument like, say, 83 or maybe 419 or even 1177, but 10 seems as good as any.
*xargs --show-limits </dev/null can be instructive.
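If the list of line numbers is long enough to hit those limits, one more option (file names here are just placeholders) is to turn the list into a sed script and feed it in with -f, which keeps the command line short no matter how many lines you want:

sed 's/$/p/' line_num_file > print_lines.sed
sed -n -f print_lines.sed data_file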