I am using Keras for some ML and have this generator for the data and labels:
def createBatchGenerator(driving_log, batch_size=32):
    batch_images = np.zeros((batch_size, 66, 200, 3))
    batch_steering = np.zeros(batch_size)
    while True:
        for i in range(batch_size):
            x, y = get_preprocessed_row(driving_log)
            batch_images[i] = x
            batch_steering[i] = y
        yield batch_images, batch_steering
When I use it locally it runs fine, but when I run it on an AWS g2.2xlarge with a GPU, I get this error "ValueError: generator already executing". Can someone please help me resolve this?
Python uses the Mersenne Twister as its core random number generator. It produces 53-bit precision floats and has a period of 2**19937-1. The underlying implementation in C is both fast and threadsafe. The functions of the random module are thread safe, as are the methods on an instance of random.Random.
Yes, Keras is thread safe, if you pay a little attention to it. In fact, in reinforcement learning there is an algorithm called Asynchronous Advantage Actor Critic (A3C) where each agent relies on the same neural network to tell it what to do in a given state. In other words, each thread calls into the same shared model concurrently.
ZipFile is not thread safe.
Local variables and parameters are always thread-safe. Instance variables, class variables, and global variables may not be thread-safe (but they might be).
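As a small self-contained sketch of that distinction (plain Python; the Counter class and its names are illustrative, not from the original code):

```python
import threading

class Counter:
    def __init__(self):
        self.value = 0                # instance variable: shared across threads
        self.lock = threading.Lock()  # guards access to self.value

    def increment(self, n):
        # n and i are locals, so each thread gets its own copies (thread-safe);
        # self.value is shared state, so we protect the update with the lock.
        for i in range(n):
            with self.lock:
                self.value += 1

counter = Counter()
threads = [threading.Thread(target=counter.increment, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # 4000 with the lock; without it, updates can be lost
```

Without the `with self.lock:` the read-modify-write on `self.value` can interleave between threads and silently drop increments.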
You need to make a generator that supports multi-threading, so that it is safe for the generator to be called by two threads at once:
import threading

import numpy as np

class createBatchGenerator:
    def __init__(self, driving_log, batch_size=32):
        self.driving_log = driving_log
        self.batch_size = batch_size
        self.lock = threading.Lock()

    def __iter__(self):
        return self

    def __next__(self):
        # Only one thread at a time may build a batch.
        with self.lock:
            batch_images = np.zeros((self.batch_size, 66, 200, 3))
            batch_steering = np.zeros(self.batch_size)
            for i in range(self.batch_size):
                x, y = get_preprocessed_row(self.driving_log)
                batch_images[i] = x
                batch_steering[i] = y
            return batch_images, batch_steering
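An equivalent approach, if you would rather keep your original generator function unchanged, is to wrap any iterator in a small lock-guarded adapter so only one thread can advance it at a time. This is a sketch with a stand-in generator; `threadsafe_iter` and `counting_batches` are illustrative names, not Keras API:

```python
import threading

class threadsafe_iter:
    """Serialize next() calls on an underlying iterator with a lock."""
    def __init__(self, it):
        self.it = iter(it)
        self.lock = threading.Lock()

    def __iter__(self):
        return self

    def __next__(self):
        with self.lock:
            return next(self.it)

def counting_batches():
    # Stand-in for a data generator: yields dummy "batches" forever.
    i = 0
    while True:
        yield i
        i += 1

gen = threadsafe_iter(counting_batches())
results = []
results_lock = threading.Lock()

def consume(n):
    for _ in range(n):
        batch = next(gen)  # safe: only one thread enters the generator at a time
        with results_lock:
            results.append(batch)

threads = [threading.Thread(target=consume, args=(100,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results) == list(range(400)))  # True: each batch produced exactly once
```

Calling `next()` on a plain generator from two threads at once is exactly what raises "ValueError: generator already executing"; the lock in `__next__` prevents that overlap.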