I am trying to print a formatted list of tuples to stdout. For this, I use the str.format method. Everything works fine, but when I pipe the output to see the first lines using the head command, an IOError occurs.
Here is my code:
# creating the data
data = []
for i in range(0, 1000):
    pid = 'pid%d' % i
    uid = 'uid%d' % i
    pname = 'pname%d' % i
    data.append((pid, uid, pname))

# find the longest string for each field
pids, uids, pnames = zip(*data)
max_pid = len(max(pids, key=len))
max_uid = len(max(uids, key=len))
max_pname = len(max(pnames, key=len))

# my template for the formatted strings
template = "{0:%d}\t{1:%d}\t{2:%d}" % (max_pid, max_uid, max_pname)

# print the formatted output to stdout
for pid, uid, pname in data:
    print template.format(pid, uid, pname)
And here is the error I get after running the command: python myscript.py | head
Traceback (most recent call last):
File "lala.py", line 16, in <module>
print template.format(pid, uid, pname)
IOError: [Errno 32] Broken pipe
Can anyone help me with this?
I tried to put the print in a try-except block to handle the error, but after that another message appeared in the console:
close failed in file object destructor:
sys.excepthook is missing
lost sys.stderr
I also tried to flush the data immediately with two consecutive sys.stdout.write and sys.stdout.flush calls, but nothing happened.
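Roughly, that attempt looked like this (using the same template and data as above):

import sys

for pid, uid, pname in data:
    sys.stdout.write(template.format(pid, uid, pname) + "\n")
    sys.stdout.flush()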
In general, I try to catch the most specific error I can get away with. In this case it is BrokenPipeError (on Python 3; on Python 2 the same condition surfaces as IOError with errno 32, as in the traceback above), which is raised once head has printed its lines, exits, and closes the read end of the pipe:
import sys

try:
    # I usually call a function here that generates all my output:
    for pid, uid, pname in data:
        print(template.format(pid, uid, pname))
except BrokenPipeError:
    pass  # Ignore. Something like head is truncating output.
finally:
    sys.stderr.close()
If this is at the end of execution, I find I only need to close sys.stderr. If I don't close sys.stderr, I still get a BrokenPipeError message at interpreter shutdown, just without a stack trace. This seems to be the minimal fix for writing tools that output to pipelines.
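Putting the pieces together, a minimal self-contained version (Python 3; the data is just the placeholder data from the question) might look like this:

import sys

def main():
    # placeholder data, as in the question
    data = [('pid%d' % i, 'uid%d' % i, 'pname%d' % i) for i in range(1000)]
    try:
        for pid, uid, pname in data:
            print('%s\t%s\t%s' % (pid, uid, pname))
    except BrokenPipeError:
        pass  # something like head closed the pipe early
    finally:
        # closing stderr suppresses the "Exception ignored" message
        # the interpreter would otherwise print at shutdown
        sys.stderr.close()

if __name__ == '__main__':
    main()

Running python3 myscript.py | head with this version prints the first ten lines and exits quietly.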