I'm relatively new to Theano and I want to run the MNIST example on my GPU, but I get the following output:
Using gpu device 0: GeForce GTX 970M (CNMeM is disabled)
Loading data...
Building model and compiling functions...
WARNING (theano.gof.compilelock):
Overriding existing lock by dead process '9700' (I am process '10632')
DEBUG: nvcc STDOUT mod.cu
Creating library
C:/Users/user/AppData/Local/Theano
/compiledir_Windows-8-6.2.9200-Intel64_Family_6_Model_71_Stepping_1_GenuineIntel-3.4.3-64
/tmp55nlvvvo/m25b839e7715203be227800f03e7c8fe8.lib
and object
C:/Users/user/AppData/Local/Theano
/compiledir_Windows-8-6.2.9200-Intel64_Family_6_Model_71_Stepping_1_GenuineIntel-3.4.3-64
/tmp55nlvvvo/m25b839e7715203be227800f03e7c8fe8.exp
It keeps printing DEBUG messages without ever producing any output from the MNIST example. I have a working nvcc installation:
C:\Users\user>nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2015 NVIDIA Corporation
Built on Tue_Aug_11_14:49:10_CDT_2015
Cuda compilation tools, release 7.5, V7.5.17
And my .theanorc file:
[global]
floatX = float32
device = gpu0
[nvcc]
fastmath = True
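As a side note, .theanorc uses INI syntax, so a quick way to rule out a malformed config file is to parse it with the standard library. This sketch embeds the contents above as a string for illustration; on Windows the real file lives in the user's home directory:

```python
import configparser

# The .theanorc contents from above, inlined as a string for illustration;
# on Windows the actual file is typically %USERPROFILE%\.theanorc.
theanorc = """
[global]
floatX = float32
device = gpu0

[nvcc]
fastmath = True
"""

cfg = configparser.ConfigParser()
cfg.read_string(theanorc)

# If these print the expected values, the file at least parses correctly.
print(cfg["global"]["device"])    # gpu0
print(cfg["nvcc"]["fastmath"])    # True
```

If parsing fails or a section is missing, Theano would silently fall back to its defaults, so this is worth checking before suspecting the compiler itself.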
How can I solve this?
I had a similar problem. After searching Google, I found the relevant code in Theano's source: https://github.com/Theano/Theano/blob/master/theano/sandbox/cuda/nvcc_compiler.py
p = subprocess.Popen(
    cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
nvcc_stdout_raw, nvcc_stderr_raw = p.communicate()[:2]
console_encoding = getpreferredencoding()
nvcc_stdout = decode_with(nvcc_stdout_raw, console_encoding)
nvcc_stderr = decode_with(nvcc_stderr_raw, console_encoding)
if nvcc_stdout:
    # this doesn't happen to my knowledge
    print("DEBUG: nvcc STDOUT", nvcc_stdout, file=sys.stderr)
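The capture-and-decode step above can be reproduced with plain standard-library calls; here is a minimal self-contained sketch, using a trivial Python command as a stand-in for the actual nvcc invocation:

```python
import locale
import subprocess
import sys

# Stand-in for the nvcc command line; any process that writes to stdout
# will do for demonstrating the capture-and-decode pattern.
cmd = [sys.executable, "-c", "print('mod.cu')"]

p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout_raw, stderr_raw = p.communicate()[:2]

# Decode the captured bytes with the console's preferred encoding,
# mirroring what Theano's nvcc_compiler does.
encoding = locale.getpreferredencoding()
stdout = stdout_raw.decode(encoding, errors="replace")
stderr = stderr_raw.decode(encoding, errors="replace")

if stdout:
    # Theano echoes any captured nvcc stdout as a DEBUG line.
    print("DEBUG: nvcc STDOUT", stdout.strip(), file=sys.stderr)
```

So the DEBUG line is not an error by itself: it simply echoes whatever the child process wrote to stdout.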
So it seems nvcc is producing output on stdout. In my case, the output looks like:
DEBUG: nvcc STDOUT mod.cu
DEBUG: nvcc STDOUT mod.cu
Sometimes the program works fine after this, and sometimes it doesn't; it's very strange. Sorry, I cannot comment yet, so I'm posting this as an answer.