I am relatively new to Python, and while attempting to train a chatbot I hit the error "UnboundLocalError: local variable 'logs' referenced before assignment". I used model.fit to train:
model.fit(x_train, y_train, epochs=7)
The full traceback is:
UnboundLocalError Traceback (most recent call last)
<ipython-input-10-847c83704a3f> in <module>()
2 x_train,
3 y_train,
----> 4 epochs=7
5 )
1 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py in _method_wrapper(self, *args, **kwargs)
64 def _method_wrapper(self, *args, **kwargs):
65 if not self._in_multi_worker_mode(): # pylint: disable=protected-access
---> 66 return method(self, *args, **kwargs)
67
68 # Running inside `run_distribute_coordinator` already.
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_batch_size, validation_freq, max_queue_size, workers, use_multiprocessing)
854 logs = tmp_logs # No error, now safe to assign to logs.
855 callbacks.on_train_batch_end(step, logs)
--> 856 epoch_logs = copy.copy(logs)
857
858 # Run validation.
UnboundLocalError: local variable 'logs' referenced before assignment
I ran this in Google Colab; the notebook is here: https://colab.research.google.com/drive/18uTvvKYDrd8CQi31kg6vX2Dbxg1gD20X?usp=sharing
I used the chatterbot/english dataset from Kaggle: https://www.kaggle.com/kausr25/chatterbotenglish
It may be a small thing to check, but make sure your validation data is not empty. Somehow it can end up that way.
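A minimal sanity check along those lines, assuming the validation split lives in arrays named x_val and y_val (hypothetical names, not from the original post); x_train, y_train and model are the ones from the question:

import numpy as np

def check_not_empty(name, *arrays):
    # An empty split means fit() never runs a batch, which is exactly
    # what leaves the local variable 'logs' unassigned.
    for a in arrays:
        if len(np.asarray(a)) == 0:
            raise ValueError(f"{name} is empty - check how the split was built.")

check_not_empty("training data", x_train, y_train)
check_not_empty("validation data", x_val, y_val)  # x_val / y_val are hypothetical names

model.fit(x_train, y_train, epochs=7, validation_data=(x_val, y_val))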
This looks similar to a problem I ran into while working with small datasets; it is covered in this thread: #38064. I solved my particular issue by setting a smaller batch_size, in my case:
batch_size = 2
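Applied to the fit() call from the question, that would look roughly like this (a sketch; 2 is simply the value that worked in my case and may need tuning to your dataset size):

model.fit(x_train, y_train, epochs=7, batch_size=2)

Going by the traceback, logs is only assigned inside the per-batch loop (line 854, "No error, now safe to assign to logs"), so anything that results in zero batches being run in an epoch can trigger this error.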
I saw this issue when my dataset was not loaded properly or the dataset file was missing, meaning there was not a single record available for the code to process. I suspect it can also occur when the available dataset is very small.
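A rough check for that case, assuming the Kaggle corpus was unzipped locally (the file path and the size check below are illustrative, not from the original notebook):

import os

DATASET_PATH = "chatterbot_english/ai.yml"  # hypothetical path - point this at your copy of the Kaggle data

if not os.path.exists(DATASET_PATH):
    raise FileNotFoundError(f"Dataset file is missing: {DATASET_PATH}")

# After building the training arrays, confirm at least one record was loaded;
# zero records means fit() has nothing to iterate over.
assert len(x_train) > 0, "Zero training records loaded - check the dataset loading code."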