I installed TensorFlow and successfully went through the MNIST demo. Now I am trying to run the seq2seq demo, but it is not working for me.
I cloned a version of their GitHub repo and attempted to run some of the commands listed there from the repo root.
$ bazel run -c opt ./tensorflow/models/rnn/translate/translate.py
ERROR: Bad target pattern './tensorflow/models/rnn/translate/translate.py': package names may contain only A-Z, a-z, 0-9, '/', '-' and '_'.
INFO: Elapsed time: 0.115s
ERROR: Build failed. Not running target.
No surprise here, as it doesn't really make sense to have Bazel execute a Python script directly.
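Presumably Bazel wants a target label of the form //package:target rather than a file path. If it helps anyone, bazel query can list what a package actually defines (run from the repo root):
$ bazel query //tensorflow/models/rnn/translate:all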
Later in the tutorial, I tried:
$ bazel run -c opt //tensorflow/models/rnn/translate:translate \
--data_dir ./data_dir --train_dir ./checkpoints_directory \
--en_vocab_size=40000 --fr_vocab_size=40000
Unrecognized option: --data_dir
If I remove the parameters from the invocation above, it attempts (and fails) to build the entire TensorFlow project before executing translate. This is not what I want, as I have already installed TensorFlow successfully with pip.
The last thing I tried was running the script directly with Python:
$ python ./tensorflow/models/rnn/translate/translate.py
Traceback (most recent call last):
File "./tensorflow/models/rnn/translate/translate.py", line 28, in <module>
from tensorflow.models.rnn.translate import data_utils
ImportError: No module named translate
Environment info: OS X 10.11.1, Python 2.7.10 (Anaconda)
There are two ways to run the script:
1) Separate the script arguments with -- as part of bazel run:
bazel run -c opt //tensorflow/models/rnn/translate:translate -- \
--data_dir ./data_dir --train_dir ./checkpoints_directory \
--en_vocab_size=40000 --fr_vocab_size=40000
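Everything after the first -- is passed through to the built binary instead of being parsed as a Bazel option; without the separator, Bazel tries to interpret --data_dir itself, which is exactly the "Unrecognized option" error above.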
2) Build the target, then run the binary from ./bazel-bin/:
bazel build -c opt //tensorflow/models/rnn/translate:translate
./bazel-bin/tensorflow/models/rnn/translate/translate \
--data_dir ./data_dir --train_dir ./checkpoints_directory \
--en_vocab_size=40000 --fr_vocab_size=40000
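Note that bazel run executes the binary from its runfiles directory rather than from your shell's working directory, so relative paths such as ./data_dir may not end up where you expect. Absolute paths are safer; a sketch with hypothetical locations:
$ bazel run -c opt //tensorflow/models/rnn/translate:translate -- \
--data_dir /tmp/seq2seq_data --train_dir /tmp/seq2seq_checkpoints \
--en_vocab_size=40000 --fr_vocab_size=40000
As for the direct python invocation from the question: executed by file path, Python resolves tensorflow.models.rnn.translate against the pip-installed tensorflow package rather than the clone, and (assuming the pip package does not ship the translate sources) the import fails with "No module named translate".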