I write deep learning software in Python with the TensorFlow library under Windows. Sometimes I mistakenly load too much into memory and the computer stops responding; I cannot even kill the process.
Is it possible to limit the memory and CPU usage of Python scripts under Windows? I use PyCharm as an editor. Under UNIX systems there seems to be the option of using resource.RLIMIT_VMEM, but under Windows I get the error "No module named resource".
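For contrast, the UNIX-only approach mentioned above might be sketched like this (limit_memory is a hypothetical helper name; on Linux the resource module exposes RLIMIT_AS, which caps total address space, rather than RLIMIT_VMEM):

```python
import sys

def limit_memory(max_bytes):
    """Cap this process's address space at max_bytes.
    Returns False where the resource module is unavailable (e.g. Windows)."""
    try:
        import resource  # UNIX-only; on Windows this raises ImportError
    except ImportError:
        return False
    soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    # The soft limit may never exceed the hard limit.
    new_soft = max_bytes if hard == resource.RLIM_INFINITY else min(max_bytes, hard)
    resource.setrlimit(resource.RLIMIT_AS, (new_soft, hard))
    return True
```

Once the limit is in place, allocations beyond it raise MemoryError instead of freezing the machine, which is exactly the behavior that has no direct Windows equivalent in the standard library.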
This is a common problem when running resource-intensive processes, since the total amount of memory required can be hard to predict.
If the main issue is the whole system freezing, you can create a watchdog process that kills the offending process before that happens. It is a bit hacky, not as clean as the UNIX solution, and it costs a little overhead, but at least it can save you a restart!
This can easily be done in Python using the psutil package. The short script below checks every 10 seconds; whenever more than 90% of virtual memory is in use, it kills the python.exe process that is using the most memory:
import time
import psutil

while True:
    if psutil.virtual_memory().percent > 90:
        processes = []
        for proc in psutil.process_iter():
            if proc.name() == 'python.exe':
                processes.append((proc, proc.memory_percent()))
        # Sort by memory share and kill the hungriest python.exe process.
        sorted(processes, key=lambda x: x[1])[-1][0].kill()
    time.sleep(10)
This can also be adapted for CPU usage, using psutil.cpu_percent().
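The CPU variant might look like the sketch below (find_cpu_hog is a hypothetical name; the 80% threshold and the sampling intervals are illustrative, and the processes are ranked by per-process cpu_percent instead of memory_percent):

```python
import psutil

def find_cpu_hog(threshold=80.0):
    """Return the python.exe process with the highest CPU share,
    or None if system-wide CPU usage is below the threshold."""
    # System-wide usage, averaged over a 1-second sampling window.
    if psutil.cpu_percent(interval=1.0) <= threshold:
        return None
    candidates = []
    for proc in psutil.process_iter(['name']):
        try:
            if proc.info['name'] == 'python.exe':
                # Per-process CPU share over a short sampling window.
                candidates.append((proc, proc.cpu_percent(interval=0.1)))
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass  # process exited or is protected; skip it
    return max(candidates, key=lambda x: x[1])[0] if candidates else None
```

A watchdog loop would then call find_cpu_hog() every few seconds and kill() whatever it returns, mirroring the memory version above.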