I need to run a CPU- and memory-heavy Python script (analyzing and altering a lengthy WAV file) as a background process on my web server (a VPS), between HTTP requests.
The script takes up to 20 seconds to run and I am concerned about its impact on my server's performance. Is there a good approach to either lower the priority of the process, periodically cede control to the OS, or otherwise protect the performance of my modest server?
You can use cpulimit on a Linux-based server. It lets you cap a process's CPU usage (specified as a percentage), even for scripts that are already running, and its usage is straightforward.
It's available in the Debian repositories, so you can install it easily with apt-get:
apt-get install cpulimit
Typical ways to use cpulimit include:
# To limit CPU usage of the program called foo to 75%:
cpulimit -e foo -l 75
# To limit CPU usage of the process with PID 1582 to 50%:
cpulimit -p 1582 -l 50
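To apply this to the script from the question, a common pattern is to start the job in the background and hand its PID to cpulimit via `$!`. A sketch under stated assumptions: the `python3 -c` one-liner stands in for the real WAV-processing script, and the snippet falls back to a plain `wait` where cpulimit isn't installed:

```shell
# Start the heavy job in the background; $! then holds its PID.
# (The python3 -c one-liner is a stand-in for the real WAV script.)
python3 -c "import time; time.sleep(2)" &
JOB_PID=$!

if command -v cpulimit >/dev/null 2>&1; then
    # Cap the job at 50% of one core; cpulimit exits when the job does.
    cpulimit -p "$JOB_PID" -l 50
else
    # Fallback so the sketch still completes if cpulimit is absent.
    wait "$JOB_PID"
fi
```

Because cpulimit only needs a PID, this works without any changes to the Python script itself.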
Assuming it's a UNIX server, you could use the nice command to lower the process's priority. That should do the trick.
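From inside the Python script itself you can combine both ideas the question raises: `os.nice()` raises the process's niceness at startup, and short sleeps between chunks of work periodically cede the CPU to the OS. A minimal sketch, where the chunked averaging is a hypothetical stand-in for the real WAV analysis:

```python
import os
import time

def process_in_chunks(samples, chunk_size=100_000):
    """Process audio data in slices, yielding the CPU between slices.

    The per-chunk averaging below is a placeholder for the real
    WAV analysis; the nice/sleep pattern is the point.
    """
    # Raise our niceness by 10 so interactive work (e.g. HTTP requests)
    # is scheduled ahead of us. Non-root processes may only increase it.
    os.nice(10)

    results = []
    for start in range(0, len(samples), chunk_size):
        chunk = samples[start:start + chunk_size]
        results.append(sum(chunk) / max(len(chunk), 1))  # stand-in work
        # Briefly sleep so the OS can run other processes; even a short
        # pause between chunks keeps the server responsive.
        time.sleep(0.01)
    return results

print(process_in_chunks(list(range(1000)), chunk_size=250))
# → [124.5, 374.5, 624.5, 874.5]
```

Unlike cpulimit, this needs no extra software on the server, but it does require you to be able to edit the script.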