Below is a snippet of the code I am running, which uses multiprocessing to fire HTTP requests in parallel. When run from the console it hangs at "requests.get(url)", neither proceeding nor throwing an error.
import multiprocessing
import requests

def echo_100(q):
    ...
    print("before")
    r = requests.get(url)
    print("after")
    ...
    q.put(r)

q = multiprocessing.Queue()
p = multiprocessing.Process(target=echo_100, args=(q,))  # args must be a tuple
p.start()
p.join()
resp = q.get()
multiprocessing is a package that supports spawning processes using an API similar to the threading module. It offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock by using subprocesses instead of threads.
Python supports multiprocessing, which means running multiple processes simultaneously. With it, you can write a program that assigns many tasks to run at the same time, saving time and effort.
It can significantly speed up a program, especially one with many CPU-intensive tasks: multiple functions run concurrently, each on a different CPU core, which improves CPU utilization. A minimal example is sketched below.
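Here is a minimal sketch of that idea using multiprocessing.Pool; the square function and the input range are illustrative assumptions, not part of the question's code.

import multiprocessing

def square(n):
    # A stand-in for a CPU-bound task
    return n * n

if __name__ == "__main__":
    with multiprocessing.Pool() as pool:        # one worker per CPU core by default
        results = pool.map(square, range(10))   # distribute the work across processes
    print(results)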
On macOS, there seems to be a bug in how proxy settings are read from the operating system. I don't know the exact details, but it sometimes causes requests to hang when used with multiprocessing. You can try to work around the problem by disabling OS proxy lookup entirely, like this:
import requests

session = requests.Session()
session.trust_env = False  # don't read proxy settings from the OS
r = session.get(url)
That fixed it for me.
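For context, here is a sketch of how that fix could be applied inside the worker from the question; the url value is a placeholder and putting the status code on the queue (rather than the whole Response) is an assumption made to keep the queued value small and picklable.

import multiprocessing
import requests

url = "http://example.com"  # placeholder URL

def echo_100(q):
    session = requests.Session()
    session.trust_env = False       # don't read proxy settings from the OS
    r = session.get(url)
    q.put(r.status_code)            # put a small, picklable value on the queue

if __name__ == "__main__":
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=echo_100, args=(q,))
    p.start()
    resp = q.get()                  # read before join() to avoid blocking on a full queue
    p.join()
    print(resp)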