I have about 1 million files (simulation outputs). I want to store one specific piece of information from each of them in a single file. I have a for loop that goes up to 1M, with a counter to track its progress. The script gets killed somewhere between iteration 875000 and 900000. I thought it might be a disk-space problem.
When I run df -h or df /, I see about 68G available.
What are other possible reasons a Python script may be killed? How can I investigate this further?
"killed" generally means that the process received some signal that caused it to exit. In this case since it is happening at the same time of the script there is a good chance that it is a broken pipe, the process is trying to read from or write to a file handle that has been closed on the other end.
Drop a pidfile somewhere (e.g. /tmp). Then you can check whether the process is running by checking whether the PID in the file still exists. Don't forget to delete the file when you shut down cleanly, and check for it when you start up.
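A minimal sketch of the pidfile idea; the path and function names here are just examples:

    import os
    import sys

    PIDFILE = "/tmp/myscript.pid"  # example location

    def acquire_pidfile():
        # Refuse to start if another instance is already running.
        if os.path.exists(PIDFILE):
            with open(PIDFILE) as f:
                old_pid = int(f.read().strip())
            try:
                os.kill(old_pid, 0)  # signal 0: existence check only
                sys.exit("already running as PID %d" % old_pid)
            except ProcessLookupError:
                pass  # stale pidfile from an unclean shutdown
        with open(PIDFILE, "w") as f:
            f.write(str(os.getpid()))

    def release_pidfile():
        # Call this on clean shutdown.
        try:
            os.remove(PIDFILE)
        except FileNotFoundError:
            pass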
Ctrl + C on Windows can be used to terminate a Python script, and Ctrl + Z on Unix will suspend (freeze) its execution. If you press Ctrl + C while a script is running in the console, the script ends and raises a KeyboardInterrupt exception.
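Catching KeyboardInterrupt lets a long loop stop cleanly instead of dying mid-write; a sketch, with process_file standing in for the real per-file work:

    def process_file(i):
        pass  # stand-in for the actual work on each file

    try:
        for i in range(1_000_000):
            process_file(i)
    except KeyboardInterrupt:
        # Ctrl+C lands here, so partial results can be flushed first.
        print("interrupted, exiting cleanly")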
On a Linux system, check the output of dmesg. If the process is getting killed by the kernel, there will be an explanation there. The most probable reasons: out of memory, or out of file descriptors.
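The OOM killer logs lines containing "Out of memory", so they are easy to spot. If you prefer to check from Python, a sketch (reading dmesg may require elevated privileges on some distributions):

    import subprocess

    # Scan the kernel log for OOM-killer activity.
    out = subprocess.run(["dmesg"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "Out of memory" in line or "Killed process" in line:
            print(line)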
Usually, you get the killed message when the program runs out of RAM (as opposed to hard disk space, which you have in plenty). You should keep a watch on main memory: run top and look at the memory being taken by your program, or alternatively use a tool like guppy (https://pypi.python.org/pypi/guppy/) to track memory utilization programmatically.
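If you'd rather avoid a dependency, the standard-library resource module can report peak memory from inside the loop. A sketch, with paths and extract_info standing in for the question's actual loop (on Linux, ru_maxrss is in kilobytes):

    import resource

    def extract_info(path):
        pass  # stand-in for the real per-file work

    paths = []  # stand-in for the list of 1M output files

    for i, path in enumerate(paths):
        extract_info(path)
        if i % 10_000 == 0:
            peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
            print("%d: peak RSS %.0f MiB" % (i, peak_kb / 1024))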
I would hazard a guess that you are creating some big in-memory data structure while processing the files, and perhaps not de-allocating it as you iterate through them.
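If so, the fix is to write each result out as you go instead of accumulating all of them; a sketch, with extract_info and the filenames as stand-ins:

    def extract_info(path):
        return ""  # stand-in: the piece of information, as a string

    paths = []  # stand-in for the 1M simulation-output files

    # Anti-pattern: this list grows with every one of the 1M files.
    # results = [extract_info(p) for p in paths]

    # Better: write each result immediately so memory use stays flat.
    with open("summary.txt", "w") as out:
        for path in paths:
            out.write(extract_info(path) + "\n")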