I have a number of large comma-delimited text files (the biggest is about 15GB) that I need to process using a Python script. The problem is that the files sporadically contain DOS EOF (Ctrl-Z) characters in the middle of them. (Don't ask me why, I didn't generate them.) The other problem is that the files are on a Windows machine.
On Windows, when my script encounters one of these characters, it assumes it is at the end of the file and stops processing. For various reasons, I am not allowed to copy the files to any other machine. But I still need to process them.
Here are my ideas so far:

1. Read the file a character at a time and write everything except the chr(26) bytes to a new file. This would work, but it would take approximately forever.
2. Use sed to eliminate the EOF characters. Unfortunately, as far as I can tell, sed on Windows has the same problem and will quit when it sees the EOF.
3. Open the file in a Notepad-type program and do a find-and-replace. But it turns out that Notepad-type programs don't cope well with 15GB files.

My IDEAL solution would be some way to just read the file as text and simply ignore the Ctrl-Z characters. Is there a reasonable way to accomplish this?
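One reasonable way is to open the file in binary mode, where Ctrl-Z is just an ordinary byte, and strip those bytes from each line as you stream through the file. Here is a minimal sketch of that idea, assuming the data is plain ASCII or UTF-8 CSV (clean_lines and data.csv are names invented for this example):

import csv

def clean_lines(path, encoding="utf-8"):
    # Binary mode reads straight past Ctrl-Z; we delete the byte ourselves.
    with open(path, "rb") as f:
        for raw in f:
            yield raw.replace(b"\x1a", b"").decode(encoding)

# csv.reader accepts any iterable of strings, not just a file object.
for row in csv.reader(clean_lines("data.csv")):
    pass  # process each row here

Because the filtering happens while streaming, no temporary copy of the 15GB file is ever needed.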
To read a large text file in Python, we can use the file object as an iterator: it returns the lines one by one, so the whole file is never read into memory, which makes this approach suitable for very large files.
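As a quick illustration (big.csv is a placeholder name), counting the lines of a huge file this way keeps memory use flat no matter how big the file is:

with open("big.csv", "r", encoding="utf-8") as f:
    count = sum(1 for _ in f)  # the iterator yields one line at a time
print(count)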
On Windows, CTRL-Z is treated as the end-of-file character when a file is read in text mode; binary-mode reads are unaffected.
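A tiny self-contained demo (not part of the original answer) shows that the byte is harmless once the file is opened in binary mode:

with open("demo.txt", "wb") as f:
    f.write(b"before\x1aafter\n")  # embed a Ctrl-Z (chr(26)) mid-file
with open("demo.txt", "rb") as f:
    print(f.read())  # b'before\x1aafter\n' -- the read is not truncated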
It's easy to use Python to delete the DOS EOF chars; for example,
import sys

def delete_eof(fin, fout):
    """Copy fin to fout, dropping every DOS EOF byte (chr(26))."""
    BUFSIZE = 2 ** 15
    EOFCHAR = b"\x1a"  # chr(26) as a bytes object, so this also runs on Python 3
    data = fin.read(BUFSIZE)
    while data:
        # translate(None, EOFCHAR) deletes the EOF bytes and changes nothing else
        fout.write(data.translate(None, EOFCHAR))
        data = fin.read(BUFSIZE)

ipath = sys.argv[1]
opath = ipath + ".new"
with open(ipath, "rb") as fin, open(opath, "wb") as fout:
    delete_eof(fin, fout)
That takes a file path as its first argument, and copies the file, minus the chr(26) bytes, to the same file path with .new appended. Fiddle to taste.
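For example, if the script is saved as delete_eof.py (a file name chosen here for illustration), running it from the Windows command prompt against a hypothetical C:\data\huge.csv:

python delete_eof.py C:\data\huge.csv

writes the cleaned copy to C:\data\huge.csv.new, which can then be processed normally.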
By the way, are you sure that DOS EOF characters are your only problem? It's hard to conceive of a sane way in which they could end up in files intended to be treated as text files.