I can see that this question has been asked before here: tensorflow-has-no-attribute-compat
The answer given was to install the
Microsoft Visual C++ 2015-2019 Redistributable (x64)
That did not work for the previous poster, and it has not worked for me either. I have Visual Studio 2019 installed. I downloaded the redistributable anyway and ran a repair of it just in case, but I am still getting the same error.
That said, I have not found a valid solution for this anywhere on Google or Stack Overflow. Here are the details of what I have installed:
tensorflow-gpu 2.1
python 3.7.7
CUDA 10.1
Anaconda 3.7
It looks like the GPU started successfully:
2020-06-28 07:19:47.851257: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library cudart64_101.dll
Python packages for 3.7.7. All packages were installed via pip; only TensorFlow was installed via conda. I don't think that makes a difference — I used conda so I could pin version 2.1, since pip would automatically install 2.2.
Package Version
---------------------- ----------------------------
absl-py 0.9.0
appdirs 1.4.4
astor 0.7.1
astroid 2.4.2
astropy 4.0.1.post1
attrs 19.3.0
backcall 0.2.0
bayesian-optimization 1.2.0
black 19.10b0
bleach 3.1.5
blinker 1.4
brotlipy 0.7.0
cachetools 4.1.0
certifi 2020.6.20
cffi 1.14.0
chardet 3.0.4
click 7.1.2
cloudpickle 1.3.0
colorama 0.4.3
confuse 1.3.0
cryptography 2.9.2
cycler 0.10.0
decorator 4.4.2
defusedxml 0.6.0
entrypoints 0.3
future 0.18.2
gast 0.2.2
gitdb 4.0.5
GitPython 3.1.3
google-auth 1.17.2
google-auth-oauthlib 0.4.1
google-pasta 0.2.0
grpcio 1.27.2
gym 0.17.2
h2o 3.30.0.5
h5py 2.10.0
htmlmin 0.1.12
idna 2.10
ImageHash 4.1.0
importlib-metadata 1.7.0
invoke 1.4.1
ipykernel 5.3.0
ipython 7.16.1
ipython-genutils 0.2.0
ipywidgets 7.5.1
isort 4.3.21
jedi 0.17.1
Jinja2 2.11.2
joblib 0.15.1
json5 0.9.5
jsonschema 3.2.0
jupyter 1.0.0
jupyter-client 6.1.3
jupyter-console 6.1.0
jupyter-core 4.6.3
jupyterlab 2.1.5
jupyterlab-server 1.1.5
kaggle 1.5.6
Keras 2.4.3
Keras-Applications 1.0.8
Keras-Preprocessing 1.1.0
kiwisolver 1.2.0
lazy-object-proxy 1.4.3
lightgbm 2.3.1
llvmlite 0.33.0
Markdown 3.2.2
MarkupSafe 1.1.1
matplotlib 3.2.2
mccabe 0.6.1
missingno 0.4.2
mistune 0.8.4
mkl-service 2.3.0
nbconvert 5.6.1
nbdime 2.0.0
nbformat 5.0.7
networkx 2.4
notebook 6.0.3
numba 0.50.1
numpy 1.19.0
oauthlib 3.0.1
opt-einsum 0+untagged.56.g2664021.dirty
packaging 20.4
pandas 1.0.5
pandas-profiling 2.8.0
pandocfilters 1.4.2
parso 0.7.0
path 13.1.0
path.py 12.4.0
pathspec 0.8.0
patsy 0.5.1
phik 0.10.0
pickleshare 0.7.5
Pillow 7.1.2
pip 20.1.1
plotly 4.8.2
prometheus-client 0.8.0
prompt-toolkit 3.0.5
protobuf 3.12.3
py4j 0.10.9
pyasn1 0.4.8
pyasn1-modules 0.2.7
pycparser 2.20
pyglet 1.5.0
Pygments 2.6.1
PyJWT 1.7.1
pylint 2.5.3
pyOpenSSL 19.1.0
pyparsing 2.4.7
pyreadline 2.1
pyrsistent 0.16.0
PySocks 1.7.1
pyspark 3.0.0
python-dateutil 2.8.1
python-slugify 4.0.0
pytz 2020.1
PyWavelets 1.1.1
pywin32 228
pywinpty 0.5.7
PyYAML 5.3.1
pyzmq 19.0.1
qtconsole 4.7.5
QtPy 1.9.0
regex 2020.6.8
requests 2.24.0
requests-oauthlib 1.2.0
retrying 1.3.3
rsa 4.6
scikit-learn 0.23.1
scipy 1.5.0
seaborn 0.10.1
Send2Trash 1.5.0
setuptools 47.3.1.post20200616
six 1.15.0
smmap 3.0.4
statsmodels 0.11.1
tabulate 0.8.7
tangled-up-in-unicode 0.0.6
tensorboard 2.2.2
tensorboard-plugin-wit 1.6.0.post3
tensorflow 2.1.0
tensorflow-estimator 2.2.0
termcolor 1.1.0
terminado 0.8.3
testpath 0.4.4
text-unidecode 1.3
threadpoolctl 2.1.0
toml 0.10.1
tornado 6.0.4
tqdm 4.46.1
traitlets 4.3.3
typed-ast 1.4.1
urllib3 1.24.3
visions 0.4.4
wcwidth 0.2.5
webencodings 0.5.1
Werkzeug 0.16.1
wheel 0.34.2
widgetsnbextension 3.5.1
win-inet-pton 1.1.0
wincertstore 0.2
wrapt 1.12.1
zipp 3.1.0
This is the error I get when I try to import TensorFlow. (NOTE: I get this error on both the GPU and the CPU install of TensorFlow 2.1.)
Python 3.7.7 (default, May 6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
2020-07-05 09:48:26.577683: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library cudart64_101.dll
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow\__init__.py", line 101, in <module>
from tensorflow_core import *
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_core\__init__.py", line 46, in <module>
from . _api.v2 import compat
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_core\_api\v2\compat\__init__.py", line 39, in <module>
from . import v1
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_core\_api\v2\compat\v1\__init__.py", line 32, in <module>
from . import compat
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_core\_api\v2\compat\v1\compat\__init__.py", line 39, in <module>
from . import v1
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_core\_api\v2\compat\v1\compat\v1\__init__.py", line 29, in <module>
from tensorflow._api.v2.compat.v1 import app
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_core\_api\v2\compat\__init__.py", line 39, in <module>
from . import v1
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_core\_api\v2\compat\v1\__init__.py", line 32, in <module>
from . import compat
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_core\_api\v2\compat\v1\compat\__init__.py", line 39, in <module>
from . import v1
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_core\_api\v2\compat\v1\compat\v1\__init__.py", line 667, in <module>
from tensorflow_estimator.python.estimator.api._v1 import estimator
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_estimator\__init__.py", line 10, in <module>
from tensorflow_estimator._api.v1 import estimator
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_estimator\_api\v1\estimator\__init__.py", line 10, in <module>
from tensorflow_estimator._api.v1.estimator import experimental
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_estimator\_api\v1\estimator\experimental\__init__.py", line 10, in <module>
from tensorflow_estimator.python.estimator.canned.dnn import dnn_logit_fn_builder
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_estimator\python\estimator\canned\dnn.py", line 33, in <module>
from tensorflow_estimator.python.estimator import estimator
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_estimator\python\estimator\estimator.py", line 53, in <module>
from tensorflow_estimator.python.estimator import util as estimator_util
File "G:\ProgramFiles\Anaconda37\envs\tensorflow\lib\site-packages\tensorflow_estimator\python\estimator\util.py", line 75, in <module>
class _DatasetInitializerHook(tf.compat.v1.train.SessionRunHook):
AttributeError: module 'tensorflow' has no attribute 'compat'
Special note for Conda users:
When one runs conda install tensorflow, it installs TensorFlow 2.1.0 but brings with it tensorflow-estimator 2.2.0. To fix this mismatch, simply run
conda install tensorflow-estimator==2.1.0
after installing tensorflow 2.1.0 in Conda.
This advice is valid until Conda switches to TF 2.2.0 (or, better yet, to 2.3.0).
Credit to this TF GitHub thread.
UPDATE: as of 20-Aug-2021, Conda has TF 2.4 for Windows and 2.5 for Linux. Check here to see the current state of TF support in Conda.
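The mismatch described above can be sketched as a small version check (versions_match is a hypothetical helper for illustration, not a TensorFlow API):

```python
# Sketch of the failure mode above: tensorflow 2.1.0 paired with
# tensorflow-estimator 2.2.0 breaks the import, so the two packages'
# major.minor version parts should agree.
def versions_match(tf_version: str, estimator_version: str) -> bool:
    """Return True when the major.minor parts of the two versions agree."""
    return tf_version.split(".")[:2] == estimator_version.split(".")[:2]

print(versions_match("2.1.0", "2.2.0"))  # the broken pairing -> False
print(versions_match("2.1.0", "2.1.0"))  # the fixed pairing  -> True
```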
The specs you have given are correct:
tensorflow-gpu 2.1
python 3.7.7
CUDA 10.1
Anaconda 3.7
But with TensorFlow 2.x, first install
pip install tensorflow-estimator==2.1.*
then install
pip install tensorflow-gpu
and then run the following to check the number of GPUs:
import tensorflow as tf
print("Num GPUs Available: ", len(tf.config.experimental.list_physical_devices('GPU')))
This is usually caused by a broken tensorflow-estimator package. Simply run
pip install tensorflow-estimator==2.1.*
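To confirm the fix took effect, the two installed versions can be read back with the standard library (report_versions is a hypothetical helper; importlib.metadata requires Python 3.8+, so on 3.7 the importlib-metadata backport listed above provides the same API):

```python
# Sketch: look up installed distribution versions so the tensorflow /
# tensorflow-estimator pairing can be verified quickly.
from importlib.metadata import version, PackageNotFoundError

def report_versions(packages):
    """Map each package name to its installed version, or None if absent."""
    result = {}
    for pkg in packages:
        try:
            result[pkg] = version(pkg)
        except PackageNotFoundError:
            result[pkg] = None
    return result

print(report_versions(["tensorflow", "tensorflow-estimator"]))
```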