I have a variable:
var='/path/to/filename.ext
/path/to/filename2.ext
/path/to/filename3.ext'
I want to put all the strings separated by a newline into an array:
declare -a arr
Based on numerous posts here on StackOverflow, I found a couple of ways:
# method 1: while loop
while read line; do
arr+=($line)
done <<< "$var"
# method 2: readarray
readarray -t arr <<< "$var"
# method 3:
IFS=$'\n'
arr=("$var")
However, before I learned all these methods, I was using another one, namely:
# method 4 (not working in the current situation)
IFS=$'\n'
read -a arr <<< "$var"
This is not working, because it will only store the first string of var in arr[0]. I don't understand why it doesn't work in situations where the delimiter is a newline, while it does work with other delimiters, e.g.:
IFS='|'
strings='/path/to/filename.ext|/path/to/filename2.ext|'
read -a arr <<< "$strings"
Is there something that I'm missing?
Edit: Removed my own answer that argued you cannot use read for this purpose. It turns out you can.
It turns out that your answer is wrong: yes, you can! You need to use the -d switch to read:

-d delim
The first character of delim is used to terminate the input line, rather than newline.
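For instance, a tiny sketch of -d on its own, with a comma as an arbitrary delimiter (first is just a scratch variable):
# read up to the first ',' instead of up to a newline
read -r -d ',' first <<< 'a,b,c'
echo "$first"    # prints: a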
If you use it with an empty argument, bash uses the null byte as a delimiter:
$ var=$'/path/to/filename.ext\n/path/to/filename2.ext\n/path/to/filename3.ext'
$ IFS=$'\n' read -r -d '' -a arr < <(printf '%s\0' "$var")
$ declare -p arr
declare -a arr='([0]="/path/to/filename.ext" [1]="/path/to/filename2.ext" [2]="/path/to/filename3.ext")'
Success. Here we're using a process substitution with printf that just dumps the content of the variable with a trailing null byte, so that read is happy and returns a success return code. You could use:
IFS=$'\n' read -r -d '' -a arr <<< "$var"
In this case, the content of arr is the same; the only difference is that the return code of read is 1 (failure).
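Because of that non-zero return code, this form needs a little care under set -e, or anywhere the exit status is checked (a sketch, assuming you only care about the array being filled):
# the here-string has no trailing null byte, so read reports failure even though arr is filled
IFS=$'\n' read -r -d '' -a arr <<< "$var" || true    # swallow the status, e.g. under set -e
declare -p arr                                       # same three elements as before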
As a side note: there's a difference between
$ IFS=$'\n'
$ read ...
and
$ IFS=$'\n' read ...
The former sets IFS globally (i.e., IFS will retain this value for the remaining part of the script, until you modify it again, of course): you very likely don't want to do that!
The latter only sets IFS for the read command. You certainly want to use it that way!
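A small sketch to make the difference visible (first is just a scratch variable; printf %q is only used to show the current value of IFS):
# per-command assignment: IFS is changed only for this one read
IFS=$'\n' read -r first <<< "$var"
printf '%q\n' "$IFS"    # still the default: $' \t\n'
# global assignment: IFS stays changed until you reset it yourself
IFS=$'\n'
read -r first <<< "$var"
printf '%q\n' "$IFS"    # now: $'\n'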
Another side note, about your method 1: you're missing quotes, you're not setting IFS to empty, and you're not using the -r flag to read. This is bad:
while read line; do
arr+=($line)
done <<< "$var"
This is good:
while IFS= read -r line; do
arr+=( "$line" )
done <<< "$var"
Why?
- Without IFS=, you'll get leading and trailing spaces removed.
- Without -r, some backslashes will be understood as escape characters (\', a trailing \, \\, and maybe others).
- Without the quotes in ( "$line" ), you'll get word splitting and filename expansion turned on: you don't want that in case your input contains spaces or glob characters (like *, [, ?, etc.).
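A minimal sketch of all three effects, using a made-up input with leading spaces, a backslash sequence, and a glob character (bad and good are just scratch array names):
# input crafted to hit the three problems above (hypothetical filenames)
sample=$'  spaced name.ext\nback\\slash.ext\nglob*.ext'
bad=()
while read line; do bad+=($line); done <<< "$sample"
declare -p bad     # leading spaces stripped, words split, backslash eaten, * may glob-expand
good=()
while IFS= read -r line; do good+=( "$line" ); done <<< "$sample"
declare -p good    # exactly three elements, preserved verbatim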