Let's say that I have a way to send an HTTP request to a server. How is it possible to send two (or more) of these requests to the server at the same time? For example, maybe by forking a process? How can I do it? (Also, I'm using Django.)
# This example is not tested...
import json

import requests

def tester(request):
    server_url = 'http://localhost:9000/receive'
    payload = {
        'd_test1': '1234',  # note: the original had 'd_test2' twice; dict keys must be unique
        'd_test2': 'demo',
    }
    json_payload = json.dumps(payload)
    # requests computes Content-Length itself, so only Content-Type is needed
    headers = {'Content-Type': 'application/json'}
    response = requests.post(server_url, data=json_payload,
                             headers=headers, allow_redirects=True)
    if response.status_code == requests.codes.ok:
        print('Headers: {}\nResponse: {}'.format(response.headers, response.text))
Thanks!
There are two basic ways to generate concurrent HTTP requests: via multiple threads or via asynchronous programming. In the multi-threaded approach, each request is handled by its own thread. In asynchronous programming, there is (usually) one thread and an event loop, which periodically checks for the completion of a task.
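As a rough illustration of the asynchronous approach, here is a minimal sketch using asyncio from the standard library (Python 3.7+); since requests is a blocking library, the blocking calls are handed off to a thread pool behind the event loop (the URLs here are hypothetical):

import asyncio

import requests

async def fetch(url):
    loop = asyncio.get_running_loop()
    # requests is blocking, so hand the call off to the default thread pool
    return await loop.run_in_executor(None, requests.get, url)

async def main():
    urls = ['http://localhost:9000/receive'] * 2  # hypothetical URLs
    responses = await asyncio.gather(*(fetch(url) for url in urls))
    for response in responses:
        print(response.status_code)

asyncio.run(main())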
Both multithreading and multiprocessing allow Python code to run concurrently. Only multiprocessing will allow your code to be truly parallel. However, if your code is IO-heavy (like HTTP requests), then multithreading will still probably speed up your code.
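For example, a minimal sketch of the multi-threaded approach using the standard threading module, assuming (as the answer below also does) a tester(url) function that takes a URL rather than a Django request:

import threading

urls = ['http://localhost:9000/receive'] * 2  # hypothetical URLs

threads = [threading.Thread(target=tester, args=(url,)) for url in urls]
for t in threads:
    t.start()  # all requests are now in flight concurrently
for t in threads:
    t.join()   # wait for every request to finish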
I think you want to use threads here rather than forking off new processes. While threads are bad in some cases, that isn't true here. Also, I think you want to use concurrent.futures instead of using threads (or processes) directly.
For example, let's say you have 10 URLs, and you're currently fetching them one at a time, like this:
results = list(map(tester, urls))
But now, you want to send them 2 at a time. Just change it to this:
import concurrent.futures

with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(tester, urls))
If you want to try 4 at a time instead of 2, just change the max_workers value. In fact, you should probably experiment with different values to see what works best for your program.
If you want to do something a little fancier, see the documentation; the main ThreadPoolExecutor example is almost exactly what you're looking for.
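A sketch along the lines of that documentation example, assuming the same tester function: submit() gives you one Future per task, so you can handle each result (or exception) as soon as it completes.

import concurrent.futures

urls = ['http://localhost:9000/receive'] * 10  # hypothetical URLs

with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
    # map each Future back to the URL it was created for
    future_to_url = {pool.submit(tester, url): url for url in urls}
    for future in concurrent.futures.as_completed(future_to_url):
        url = future_to_url[future]
        try:
            result = future.result()
        except Exception as exc:
            print('{} generated an exception: {}'.format(url, exc))
        else:
            print('{} fetched successfully'.format(url))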
Unfortunately, in Python 2.7 this module doesn't come with the standard library, so you will have to install the backport from PyPI.
If you have pip installed, this should be as simple as:

pip install futures

… or maybe sudo pip install futures on Unix. And if you don't have pip, go get it first (follow the link above).
The main reason you sometimes want to use processes instead of threads is that you've got heavy CPU-bound computation, and you want to take advantage of multiple CPU cores. In Python, threads can't effectively use all your cores, because of the Global Interpreter Lock. So, if the Task Manager/Activity Monitor/whatever shows that your program is using 100% CPU on one core while the others are all at 0%, processes are the answer. With futures, all you have to do is change ThreadPoolExecutor to ProcessPoolExecutor.
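That is, the thread-based snippet above becomes, with nothing else changed:

import concurrent.futures

# note: with processes, tester and its arguments must be picklable
with concurrent.futures.ProcessPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(tester, urls))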
Meanwhile, sometimes you need more than just "give me a magic pool of workers to run my tasks". Sometimes you want to run a handful of very long jobs instead of a bunch of little ones, or load-balance the jobs yourself, or pass data between jobs, or whatever. For that, you want to use multiprocessing or threading instead of futures.
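For instance, here is a rough sketch of managing the workers yourself with multiprocessing, passing results back through a queue (the worker function and URLs are illustrative, not from the original answer):

import multiprocessing

urls = ['http://localhost:9000/receive'] * 4  # hypothetical URLs

def worker(url, queue):
    queue.put((url, tester(url)))  # send the result back to the parent

if __name__ == '__main__':
    queue = multiprocessing.Queue()
    procs = [multiprocessing.Process(target=worker, args=(url, queue))
             for url in urls]
    for p in procs:
        p.start()
    results = [queue.get() for _ in procs]  # one result per worker
    for p in procs:
        p.join()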
Very rarely, even that is too high-level, and you want to directly tell Python to create a new child process or thread. For that, you go all the way down to os.fork (on Unix only) or thread.
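For example, a bare-bones, Unix-only sketch with os.fork:

import os

pid = os.fork()
if pid == 0:
    # child process: do the work, then exit without running cleanup handlers
    tester('http://localhost:9000/receive')  # hypothetical URL
    os._exit(0)
else:
    # parent process: wait for the child to finish
    os.waitpid(pid, 0)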
I would use gevent, which can launch these all in so-called green threads:
# Monkey-patch the standard library so requests becomes gevent-compatible
from gevent import monkey; monkey.patch_all()

import requests

# Make a pool of greenlets to make your requests
from gevent.pool import Pool
p = Pool(10)

urls = [..., ..., ...]  # your URLs go here
results = p.map(requests.get, urls)
Of course, this example submits gets, but the pool is generalized to map inputs onto any function, including, say, yours that makes requests. These greenlets will run nearly as simultaneously as they would using fork, but are much faster and much lighter-weight.
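For example, to reuse the tester function from the question (again assuming it takes a URL rather than a Django request object):

results = p.map(tester, urls)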