The best practice for handling an EOF error in Python on any platform is to catch the exception; since no action needs to be taken, we simply use the pass keyword in the except block.
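For illustration, a minimal sketch of that pattern, reading lines from standard input with input() (the print call is just a placeholder for real processing):
try:
    while True:
        line = input()   # input() raises EOFError when there is no more input
        print(line)      # placeholder for whatever processing is needed
except EOFError:
    pass                 # nothing to do at EOF; simply stop reading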
EOF stands for End Of File. It is the point at which the program cannot read any more data from the file, which means the program has read the whole file through to the end.
EOF stands for End of File in Python. An "unexpected EOF" error means that the interpreter reached the end of our program before all the code could be parsed. This error typically occurs when we leave the body of a loop (while/for) empty, or when we omit a closing parenthesis or bracket in a block of code.
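As a rough, runnable illustration (compile() is used here only so the error can be triggered and caught in a single snippet; the exact message wording varies by Python version):
try:
    # Missing closing parenthesis: the parser hits the end of the source early.
    compile("total = (1 + 2", "<example>", "exec")
except SyntaxError as exc:
    print(exc.msg)   # e.g. "unexpected EOF while parsing" or "'(' was never closed"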
Python doesn't have a built-in EOF detection function, but that functionality is available in two ways: f.read(1) will return an empty result (b'' for binary files, '' for text files) if there are no more bytes to read. The second way is to use f.readline(), which returns an empty string once the end of the file has been reached.
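A minimal sketch of the first approach ('somefile' is just a placeholder name here):
with open('somefile', 'rb') as f:
    while True:
        byte = f.read(1)
        if byte == b'':      # an empty bytes object means we have reached EOF
            break
        # ... process the single byte here ...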
Loop over the file to read lines:
with open('somefile') as openfileobject:
    for line in openfileobject:
        do_something()
File objects are iterable and yield lines until EOF. Using the file object as an iterable uses a buffer to ensure performant reads.
You can do the same with stdin (no need to use raw_input()):
import sys
for line in sys.stdin:
    do_something()
To complete the picture, binary reads can be done with:
from functools import partial
with open('somefile', 'rb') as openfileobject:
    for chunk in iter(partial(openfileobject.read, 1024), b''):
        do_something()
where chunk will contain up to 1024 bytes at a time from the file, and iteration stops when openfileobject.read(1024) starts returning empty byte strings.
You can imitate the C idiom in Python. To read a buffer of up to max_size bytes, you can do this:
with open(filename, 'rb') as f:
    while True:
        buf = f.read(max_size)
        if not buf:
            break
        process(buf)
Or, a text file line by line:
# warning -- not idiomatic Python! See below...
with open(filename, 'rb') as f:
    while True:
        line = f.readline()
        if not line:
            break
        process(line)
You need to use the while True / break construct, since there is no EOF test in Python other than the lack of bytes returned from a read.
In C, you might have:
while ((ch != '\n') && (ch != EOF)) {
    // read the next ch and add to a buffer
    // ..
}
However, you cannot have this in Python:
while (line = f.readline()):
    # syntax error
because assignments are not allowed in expressions in Python (although recent versions of Python can mimic this using assignment expressions, see below).
It is certainly more idiomatic in Python to do this:
# THIS IS IDIOMATIC Python. Do this:
with open('somefile') as f:
    for line in f:
        process(line)
Update: Since Python 3.8 you may also use assignment expressions:
while line := f.readline():
    process(line)
The Python idiom for opening a file and reading it line-by-line is:
with open('filename') as f:
    for line in f:
        do_something(line)
The file will be automatically closed at the end of the above code (the with construct takes care of that).
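Roughly speaking, the with statement above is doing the equivalent of the following simplified sketch (the real context-manager protocol involves __enter__ and __exit__, but the cleanup idea is the same):
f = open('filename')
try:
    for line in f:
        do_something(line)
finally:
    f.close()   # runs even if do_something() raises an exception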
Finally, it is worth noting that line will preserve the trailing newline. This can be easily removed using:
line = line.rstrip()
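Note that rstrip() with no argument removes all trailing whitespace, not just the newline; if only the newline should go, rstrip('\n') can be used instead. A quick illustration:
line = "some data  \n"
line.rstrip()        # 'some data'   -- all trailing whitespace removed
line.rstrip('\n')    # 'some data  ' -- only the trailing newline removed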
You can use the code snippet below to read a file line by line, until the end of the file:
line = obj.readline()          # obj is an already-opened file object
while line != '':
    # Do Something
    line = obj.readline()
While there are suggestions above for "doing it the Python way", if you really want logic based on EOF, then I suppose exception handling is the way to do it (in Python 3, input() raises EOFError just as raw_input() does in Python 2):
try:
    line = raw_input()
    ... whatever needs to be done in case of no EOF ...
except EOFError:
    ... whatever needs to be done in case of EOF ...
Example:
$ echo test | python -c "while True: print raw_input()"
test
Traceback (most recent call last):
File "<string>", line 1, in <module>
EOFError: EOF when reading a line
Or send EOF at a raw_input() prompt by pressing Ctrl-Z on Windows or Ctrl-D on Linux.
In addition to @dawg's great answer, here is the equivalent solution using the walrus operator (Python >= 3.8):
with open(filename, 'rb') as f:
    while buf := f.read(max_size):
        process(buf)