
Python: multiprocessing and requests

Below is a snippet of the code I am running, which uses multiprocessing to fire HTTP requests in parallel. When I run it from the console, it hangs at requests.get(url), neither proceeding nor raising an error.

import multiprocessing
import requests

def echo_100(q):
    ...
    print("before")
    r = requests.get(url)
    print("after")
    ...
    q.put(r)

q = multiprocessing.Queue()
p = multiprocessing.Process(target=echo_100, args=(q,))  # args must be a tuple
p.start()
p.join()
resp = q.get()
Asked May 26 '15 by Aman Gupta


1 Answer

On Mac OS, there seem to be some bugs reading proxy settings from the operating system. I don't know the exact details, but it sometimes causes requests to hang when using multiprocessing. You could try to circumvent the problem by disabling OS proxies entirely, like this:

import requests

session = requests.Session()
session.trust_env = False  # don't read proxy settings from the OS
r = session.get(url)       # url as in the question

That fixed it for me.

Answered Oct 10 '22 by Kevin Ferguson