I have a function that is side-effect free. I would like to run it for every element in an array and return an array with all of the results.
Does Python have something built in, ideally a parallel version of map, to generate all of the values?
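For reference, the sequential version is just the built-in map (the names f and xs here are placeholders); I'm looking for a parallel equivalent of this:

def f(x):
    # any side-effect-free function
    return x * x

xs = [1, 2, 3, 4]
results = list(map(f, xs))      # [1, 4, 9, 16]
# equivalently: results = [f(x) for x in xs]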
Python now has the concurrent.futures module, which is the simplest way of getting map to work with either multiple threads or multiple processes.
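As a minimal sketch (square is a stand-in for your own side-effect-free function), the same executor.map call works with threads or processes just by swapping one class name:

import concurrent.futures

def square(x):  # placeholder for your side-effect-free function
    return x * x

if __name__ == '__main__':
    xs = list(range(10))

    # Threads: fine for I/O-bound work; all threads share one GIL.
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
        print(list(executor.map(square, xs)))

    # Processes: sidestep the GIL for CPU-bound work. Note that square
    # must be defined at module top level so worker processes can import it.
    with concurrent.futures.ProcessPoolExecutor(max_workers=4) as executor:
        print(list(executor.map(square, xs)))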
Python does support multi-threading: the standard library ships a threading module, and the GIL does not prevent you from creating threads. What the GIL does prevent, on the CPython interpreter, is true multi-core execution of Python bytecode via threads, since only one thread can run bytecode at a time.
Use cases for multiprocessing: multiprocessing outshines threading when the program is CPU-intensive and doesn't have to do much I/O or user interaction. A process may have multiple threads; those threads share memory and are the units of execution within a process.
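To illustrate the shared-memory point, here is a small sketch (counter and worker are illustrative names only): threads inside one process can mutate the same object directly, which is exactly why a lock is needed:

import threading

counter = {'n': 0}  # one object, visible to every thread in this process
lock = threading.Lock()

def worker():
    for _ in range(100_000):
        with lock:  # shared memory means concurrent updates need a lock
            counter['n'] += 1

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter['n'])  # 400000: every thread updated the same dict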
Python is NOT a single-threaded language. Python processes typically run on a single core because of the GIL, not because threads are unavailable. Despite the GIL, libraries that perform computationally heavy work, such as numpy, scipy, and pytorch, use C-based implementations under the hood and can make use of multiple cores.
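As a hedged illustration of that last point (assuming a NumPy build whose BLAS releases the GIL, which is typical): the heavy matrix multiply below runs in C, so several Python threads can keep multiple cores busy at once:

import threading
import numpy as np

a = np.random.rand(2000, 2000)

def heavy():
    # np.dot executes in compiled BLAS code and releases the GIL,
    # so these calls can run on different cores in parallel.
    np.dot(a, a)

threads = [threading.Thread(target=heavy) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()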
Try concurrent.futures.ThreadPoolExecutor.map in the Python standard library (new in version 3.2).
Similar to map(func, *iterables) except:
- the iterables are collected immediately rather than lazily;
- func is executed asynchronously and several calls to func may be made concurrently.
A simple example (modified from ThreadPoolExecutor Example):
import concurrent.futures
import urllib.request

URLS = [
    'http://www.foxnews.com/',
    'http://www.cnn.com/',
    'http://europe.wsj.com/',
    'http://www.bbc.co.uk/',
]

# Retrieve a single page and return its contents
def load_url(url, timeout):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as conn:
            return conn.read()
    except Exception:
        # You may need a better error handler; this swallows all failures.
        return b''

# We can use a with statement to ensure threads are cleaned up promptly
with concurrent.futures.ThreadPoolExecutor(max_workers=20) as executor:
    # executor.map preserves the input order of URLS in the results
    results = list(executor.map(lambda url: load_url(url, 60), URLS))

print('Done.')
Try the Pool.map function from multiprocessing:
http://docs.python.org/library/multiprocessing.html#using-a-pool-of-workers
It's not multithreaded per se, but that's actually a good thing: because of the GIL, multithreading in CPython buys you little for CPU-bound work, and separate processes avoid that limit entirely.
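A minimal sketch of Pool.map applied to the original question (cube is a placeholder for your side-effect-free function); the __main__ guard matters because worker processes may re-import the module:

from multiprocessing import Pool

def cube(x):  # placeholder for your side-effect-free function
    return x ** 3

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        results = pool.map(cube, [1, 2, 3, 4, 5])
    print(results)  # [1, 8, 27, 64, 125]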