I have a file and it consists of multiple lists, like below:
[234,343,234]
[23,45,34,5]
[354,45]
[]
[334,23]
I am trying to read it line by line and append everything to a single list in Python. How do I do that?
This is what I tried so far:
with open("pos.txt","r") as filePos:
pos_lists=filePos.read()
new_list=[]
for i in pos_lists.split("\n"):
print(type(i)) #it is str i want it as list
new_list.extend(i)
print(new_list)
Thanks in advance.
You can try these:
>>> from ast import literal_eval
>>> with open(YOURTEXTFILE) as f:
...     final_list = [literal_eval(elem) for elem in f.readlines()]
>>> final_list
[[234, 343, 234], [23, 45, 34, 5], [354, 45], [], [334, 23]]
Or,
>>> from ast import literal_eval
>>> with open(YOURTEXTFILE) as f:
...     final_list = sum(map(literal_eval, f.readlines()), [])
>>> final_list
[234, 343, 234, 23, 45, 34, 5, 354, 45, 334, 23]
Whichever you want.
The same thing can be done with Python's built-in eval(). However, it is not recommended to use eval() on untrusted input; use ast.literal_eval() instead, which only accepts a very limited set of data types. For more on this, see Using python's eval() vs. ast.literal_eval()?
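As a rough illustration (the exact error text varies by Python version, and the second string is a made-up example): literal_eval happily parses plain literals, but refuses anything that would require running code, whereas eval() would execute it.
>>> from ast import literal_eval
>>> literal_eval("[234, 343, 234]")            # plain literals are fine
[234, 343, 234]
>>> literal_eval("__import__('os').getcwd()")  # arbitrary expressions are rejected
Traceback (most recent call last):
  ...
ValueError: malformed node or string: ...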
You can use ast.literal_eval:
>>> import ast
>>> res = []
>>> with open('f.txt') as f:
...     for line in f:
...         res.append(ast.literal_eval(line))
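Note that this gives you a list of lists. If you want one flat list, as in the question, a small variation of the same idea is to extend instead of append; a sketch assuming 'f.txt' holds the same lines shown above:
>>> import ast
>>> res = []
>>> with open('f.txt') as f:
...     for line in f:
...         res.extend(ast.literal_eval(line))
...
>>> res
[234, 343, 234, 23, 45, 34, 5, 354, 45, 334, 23]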