I would like a way to do `for line in file` in Python, where the end of line is redefined to be any string that I want. Another way of saying it is that I want to read records from a file rather than lines; I want it to be just as fast and convenient as reading lines.

This is the Python equivalent of setting Perl's `$/` input record separator, or using `Scanner` in Java. It doesn't necessarily have to use `for line in file` (in particular, the iterator may not be a file object), just something equivalent that avoids reading too much data into memory.
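To make the goal concrete, this is roughly the best I can do today, and it reads the whole file into memory before splitting on the custom separator, which is exactly what I want to avoid (the file name and the `ZZZ` separator here are just placeholders):

```
# Works, but slurps the entire file before splitting on the record separator.
with open('data.txt') as f:
    for record in f.read().split('ZZZ'):
        print(record)
```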
See also: Add support for reading records with arbitrary separators to the standard IO stack
There is nothing in the Python 2.x `file` object, or the Python 3.3 `io` classes, that lets you specify a custom delimiter for `readline`. (The `for line in file` loop is ultimately using the same code as `readline`.)
But it's pretty easy to build it yourself. For example:
```
def delimited(file, delimiter='\n', bufsize=4096):
    buf = ''
    while True:
        newbuf = file.read(bufsize)
        if not newbuf:
            # EOF: whatever is left in the buffer is the final record.
            yield buf
            return
        buf += newbuf
        lines = buf.split(delimiter)
        # Everything except the last piece is a complete record; the last
        # piece may be cut off mid-record, so keep it in the buffer.
        for line in lines[:-1]:
            yield line
        buf = lines[-1]
```
Here's a stupid example of it in action:
```
>>> import io
>>> s = io.StringIO('abcZZZdefZZZghiZZZjklZZZmnoZZZpqr')
>>> d = delimited(s, 'ZZZ', bufsize=2)
>>> list(d)
['abc', 'def', 'ghi', 'jkl', 'mno', 'pqr']
```
If you want to get it right for both binary and text files, especially in 3.x, it's a bit trickier. But if it only has to work for one or the other (and for one Python version or the other), you can ignore that.
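For instance, a bytes-only variant of the generator above (my adaptation, not part of the original answer) only needs the buffer, the delimiter, and the yielded records to all be `bytes`:

```
def delimited_bytes(file, delimiter=b'\n', bufsize=4096):
    # Same approach as delimited() above, but for files opened in binary
    # mode: the buffer, the delimiter, and the yielded records are bytes.
    buf = b''
    while True:
        newbuf = file.read(bufsize)
        if not newbuf:
            yield buf
            return
        buf += newbuf
        records = buf.split(delimiter)
        for record in records[:-1]:
            yield record
        buf = records[-1]
```

Handling both at once means either keeping two nearly identical functions or choosing the empty `''`/`b''` starting buffer based on the type of `delimiter`.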
Likewise, if you're using Python 3.x (or `io` objects in Python 2.x) and want to make use of the buffers that are already being maintained in a `BufferedIOBase`, instead of just putting a buffer on top of that buffer, that's trickier. The `io` docs do explain how to do everything… but I don't know of any simple examples, so you're really going to have to read at least half of that page and skim the rest. (Of course, you could just use the raw files directly… but not if you want to find Unicode delimiters…)
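As a rough sketch of that idea (my own, assuming an `io.BufferedReader` over a regular file and a single-byte delimiter, not a recipe from the docs), you can use `peek()` to look at the bytes the reader has already buffered without consuming them:

```
def delimited_buffered(f, delimiter=b'\n'):
    # Sketch only: reuse the buffer an io.BufferedReader already maintains,
    # via peek(), instead of stacking another buffer on top of it. Assumes
    # a regular (non-interactive) file and a single-byte delimiter, so the
    # delimiter can never be split across two peek() windows.
    assert len(delimiter) == 1
    record = b''
    while True:
        chunk = f.peek()                  # buffered bytes; at most one raw read
        if not chunk:                     # empty peek on a regular file means EOF
            yield record
            return
        idx = chunk.find(delimiter)
        if idx == -1:
            record += f.read(len(chunk))  # consume everything we peeked at
        else:
            record += f.read(idx)         # consume up to the delimiter
            f.read(1)                     # drop the delimiter itself
            yield record
            record = b''
```

Since `open('data.bin', 'rb')` gives you a `BufferedReader` by default (the file name is just a placeholder), `delimited_buffered(open('data.bin', 'rb'), b'\x00')` would iterate over NUL-separated records.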