Essentially, I have a huge file that contains multiple words per line, each separated by a space. Kind of like this:
WORD WORD WORD WORD
ANOTHER
WORD SCRABBLE BLAH
YES NO
What I want to do is put all the words in the file into one huge list. I tried using split, but that didn't account for the newlines (\n).
Reading via for line in f splits on newlines and is memory-efficient (it reads one line at a time), but putting everything into one huge list is not. Anyway, if you insist:
huge_list = []
with open(huge_file, "r") as f:
    for line in f:
        huge_list.extend(line.split())
To read the whole file into memory as a string, use f.read() instead:
with open(huge_file, "r") as f:
    huge_list = f.read().split()
Input file (words separated by spaces and newlines):
WORD WORD WORD WORD
ANOTHER
WORD SCRABBLE BLAH
YES NO
Output of both examples:
>>> huge_list
['WORD', 'WORD', 'WORD', 'WORD', 'ANOTHER', 'WORD', 'SCRABBLE', 'BLAH', 'YES', 'NO']
>>>
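If the list is only needed to iterate over the words once (counting, filtering, and so on), a generator avoids holding every word in memory at the same time. This is a minimal sketch; the helper name words_in and the file path are assumptions for illustration:

```python
def words_in(path):
    """Yield words one at a time from a file of space/newline-separated words."""
    with open(path, "r") as f:
        for line in f:
            # str.split() with no arguments splits on any whitespace,
            # so both spaces and trailing newlines are handled.
            yield from line.split()

# Example usage: count the words without building the full list.
# total = sum(1 for _ in words_in("huge_file.txt"))
```

The trade-off is that a generator can only be consumed once, so if you need random access or multiple passes, materializing the list (as in the answers above) is the right call.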