Does the csv.reader object read the entire file into memory? If I have a big file, would it crash because of low memory, or does it act only as a pointer so I can process the file line by line?
import csv

with open('RawData.csv', 'r') as file:
    csvreader = csv.reader(file, delimiter=',')
    for row in csvreader:
        print(row)
From the csv.reader documentation:

Return a reader object which will iterate over lines in the given csvfile. csvfile can be any object which supports the iterator protocol and returns a string each time its __next__() method is called — file objects and list objects are both suitable.

(Emphasis mine.)
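To illustrate that point, here is a minimal sketch (not part of the question's code, and the sample strings are made up): csv.reader will happily consume any iterable that yields strings, such as a plain list, so no file object is required at all.

import csv

lines = ['a,b,c', '1,2,3']            # any iterable of strings works, not just a file
for row in csv.reader(lines, delimiter=','):
    print(row)                        # ['a', 'b', 'c'], then ['1', '2', '3']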
What you have is a wrapper around the file object. The file pointer does all the dirty work of efficiently iterating over the lines of your file, and the csv module's Reader parses those lines as they're read in.
So yes, +1 for memory friendliness and efficiency.
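As a rough sketch of what that means in practice (the filename is the one from your question; the row counting is just placeholder per-row work), you can walk an arbitrarily large CSV one row at a time, and memory use stays flat because only the current row is ever held:

import csv

row_count = 0
with open('RawData.csv', 'r', newline='') as file:   # newline='' is the recommended mode for csv
    csvreader = csv.reader(file, delimiter=',')
    for row in csvreader:
        row_count += 1        # do your per-row processing here instead of collecting rows
print(row_count)

If you did want everything in memory at once, list(csvreader) would do it, and that is exactly the step to avoid for big files.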