I am having an issue importing pyarrow into my Jupyter notebook. To give some context, I created a virtual environment named rc_env. I've uninstalled and reinstalled pyarrow quite a few times, but this is the message I received when I most recently installed it:
pip install pyarrow
Requirement already satisfied: pyarrow in ./Environments/rc_env/lib/python3.8/site-packages (4.0.1)
Requirement already satisfied: numpy>=1.16.6 in ./Environments/rc_env/lib/python3.8/site-packages (from pyarrow) (1.21.0)
So the package is in my environment. As another sanity check, I used the pip list command to see the Python packages installed in my environment:
(rc_env) LTA00015JFK:~ adenner$ pip list
Package Version
----------------------------- ---------
appnope 0.1.2
argon2-cffi 20.1.0
async-generator 1.10
attrs 21.2.0
backcall 0.2.0
beautifulsoup4 4.9.3
bleach 3.3.0
boto3 1.17.103
botocore 1.20.103
cachetools 4.2.2
certifi 2021.5.30
cffi 1.14.5
chardet 4.0.0
cycler 0.10.0
debugpy 1.3.0
decorator 5.0.9
defusedxml 0.7.1
dictor 0.1.7
entrypoints 0.3
et-xmlfile 1.1.0
google 3.0.0
google-api-core 1.30.0
google-api-python-client 2.11.0
google-auth 1.32.1
google-auth-httplib2 0.1.0
google-auth-oauthlib 0.4.4
google-cloud 0.34.0
google-cloud-bigquery 2.20.0
google-cloud-bigquery-storage 2.5.0
google-cloud-core 1.7.1
google-cloud-vision 2.3.2
google-crc32c 1.1.2
google-resumable-media 1.3.1
googleapis-common-protos 1.53.0
grpcio 1.38.1
httplib2 0.19.1
idna 2.10
ipykernel 6.0.0
ipython 7.25.0
ipython-genutils 0.2.0
ipywidgets 7.6.3
jedi 0.18.0
Jinja2 3.0.1
jmespath 0.10.0
jsonschema 3.2.0
jupyter 1.0.0
jupyter-client 6.1.12
jupyter-console 6.4.0
jupyter-core 4.7.1
jupyterlab-pygments 0.1.2
jupyterlab-widgets 1.0.0
kiwisolver 1.3.1
libcst 0.3.19
MarkupSafe 2.0.1
matplotlib 3.4.2
matplotlib-inline 0.1.2
mistune 0.8.4
mypy-extensions 0.4.3
nbclient 0.5.3
nbconvert 6.1.0
nbformat 5.1.3
nest-asyncio 1.5.1
notebook 6.4.0
numpy 1.21.0
oauthlib 3.1.1
openpyxl 3.0.7
packaging 20.9
pandas 1.2.5
pandas-gbq 0.15.0
pandocfilters 1.4.3
parso 0.8.2
pexpect 4.8.0
pickleshare 0.7.5
Pillow 8.3.0
pip 21.1.3
prometheus-client 0.11.0
prompt-toolkit 3.0.19
proto-plus 1.19.0
protobuf 3.17.3
ptyprocess 0.7.0
pyarrow 4.0.1
pyasn1 0.4.8
pyasn1-modules 0.2.8
pycparser 2.20
pydata-google-auth 1.2.0
Pygments 2.9.0
pyparsing 2.4.7
pyrsistent 0.18.0
python-dateutil 2.8.1
pytz 2021.1
PyYAML 5.4.1
pyzmq 22.1.0
qtconsole 5.1.1
QtPy 1.9.0
requests 2.25.1
requests-oauthlib 1.3.0
rsa 4.7.2
s3transfer 0.4.2
scipy 1.7.0
seaborn 0.11.1
Send2Trash 1.7.1
seshat 0.8.5
setuptools 57.0.0
six 1.16.0
soupsieve 2.2.1
terminado 0.10.1
testpath 0.5.0
titlecase 2.2.0
tornado 6.1
tqdm 4.61.1
traitlets 5.0.5
typing-extensions 3.10.0.0
typing-inspect 0.7.1
uritemplate 3.0.1
urllib3 1.26.6
uuid 1.30
wcwidth 0.2.5
webencodings 0.5.1
wheel 0.36.2
widgetsnbextension 3.5.1
XlsxWriter 1.4.3
Now when I open up python and try to import the module, I receive the following error message:
(rc_env) LTA00015JFK:~ adenner$ python
Python 3.8.5 (v3.8.5:580fbb018f, Jul 20 2020, 12:11:27)
[Clang 6.0 (clang-600.0.57)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyarrow
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/adenner/Environments/rc_env/lib/python3.8/site-packages/pyarrow/__init__.py", line 63, in <module>
import pyarrow.lib as _lib
ImportError: dlopen(/Users/adenner/Environments/rc_env/lib/python3.8/site-packages/pyarrow/lib.cpython-38-darwin.so, 2): Symbol not found: ____chkstk_darwin
Referenced from: /Users/adenner/Environments/rc_env/lib/python3.8/site-packages/pyarrow/libarrow.400.dylib
Expected in: /usr/lib/libSystem.B.dylib
in /Users/adenner/Environments/rc_env/lib/python3.8/site-packages/pyarrow/libarrow.400.dylib
I have tried using both anaconda3 and miniconda3 environments, but unfortunately I keep receiving the same error message. I am also using macOS High Sierra version 10.13.6. Any suggestions?
This is a known bug with the latest pyarrow release: https://issues.apache.org/jira/browse/ARROW-13108. You will need macOS 10.15+ to run the wheel builds.
As you are already in an environment created by conda, you could instead use the pyarrow conda package. This will work on macOS 10.9+ and is even the preferred way to install pyarrow:
conda install -c conda-forge pyarrow
Alternatively using mamba:
mamba install -c conda-forge pyarrow
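Once the conda package is installed, a quick sanity check along these lines (a minimal sketch; the column name and values are arbitrary) should confirm that the native library now loads:
import pyarrow as pa

print(pa.__version__)
# building a small in-memory table exercises pyarrow.lib, the module that failed to load
table = pa.table({"a": [1, 2, 3]})
print(table)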
You can try JupyterLab instead of Jupyter Notebook. I had this problem when I used Notebook, but pyarrow works fine for me in JupyterLab.
In my case I was having problems with import pyarrow.parquet as pq when running my code in JupyterLab. When I ran the same code as a script, as in if __name__ == "__main__": my_func(), where my_func() contained the import, it worked. After I reinstalled JupyterLab, the problem disappeared in Jupyter as well.
conda install -c conda-forge jupyterlab
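For reference, the script pattern mentioned above looks roughly like this (my_func and the parquet file name are placeholders):
import pyarrow.parquet as pq

def my_func():
    # "data.parquet" is a placeholder path
    table = pq.read_table("data.parquet")
    print(table.num_rows)

if __name__ == "__main__":
    my_func()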
I have previously installed TensorFlow on several machines but am stuck installing it on my new laptop with an RTX 2060. No matter what combination of versions I try, I get the same error. I found similar issues online, and it seems the problem is a version conflict between cuDNN and TensorFlow.
Here's my installation and the error.
Currently, I have CUDA v10.0.130 and cudnn-10.0-windows10-x64-v7.6.0.64 to match the TensorFlow installation, as in the image. tf.__version__ = 1.13.1. The Python version is 3.6. The cuDNN libraries are copied into the CUDA installation folder. I also tried TensorFlow 1.14 with Python 3.7 and got the same results.
I'm installing TensorFlow with Anaconda:
conda install tensorflow-gpu
Traceback (most recent call last):
File "<ipython-input-1-c77ea08f5c30>", line 1, in <module>
runfile('C:/Users/mazat/Documents/Python/MVTools/player_detector/player_detector_testing.py', wdir='C:/Users/mazat/Documents/Python/MVTools/player_detector')
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 827, in runfile
execfile(filename, namespace)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 110, in execfile
exec(compile(f.read(), filename, 'exec'), namespace)
File "C:/Users/mazat/Documents/Python/MVTools/player_detector/player_detector_testing.py", line 399, in <module>
player_detector_run()
File "C:/Users/mazat/Documents/Python/MVTools/player_detector/player_detector_testing.py", line 392, in player_detector_run
glavnaya(dropbox_folder,gamename,mvstatus)
File "C:/Users/mazat/Documents/Python/MVTools/player_detector/player_detector_testing.py", line 247, in glavnaya
__,box1,score = yolo_class.detect_images(im2[ii].astype('uint8'))
File "C:\Users\mazat\Documents\Python\MVTools\player_detector\yolo3\yolo3.py", line 181, in detect_images
K.learning_phase(): 0
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\client\session.py", line 929, in run
run_metadata_ptr)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\client\session.py", line 1152, in _run
feed_dict_tensor, options, run_metadata)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\client\session.py", line 1328, in _do_run
run_metadata)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\client\session.py", line 1348, in _do_call
raise type(e)(node_def, op, message)
UnknownError: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[node conv2d_1/convolution (defined at C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\keras\backend\tensorflow_backend.py:3650) ]]
Caused by op 'conv2d_1/convolution', defined at:
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\spyder_kernels\console\__main__.py", line 11, in <module>
start.main()
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\spyder_kernels\console\start.py", line 318, in main
kernel.start()
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\ipykernel\kernelapp.py", line 563, in start
self.io_loop.start()
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tornado\platform\asyncio.py", line 148, in start
self.asyncio_loop.run_forever()
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\asyncio\base_events.py", line 438, in run_forever
self._run_once()
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\asyncio\base_events.py", line 1451, in _run_once
handle._run()
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\asyncio\events.py", line 145, in _run
self._callback(*self._args)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tornado\ioloop.py", line 690, in <lambda>
lambda f: self._run_callback(functools.partial(callback, future))
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tornado\ioloop.py", line 743, in _run_callback
ret = callback()
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tornado\gen.py", line 787, in inner
self.run()
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tornado\gen.py", line 748, in run
yielded = self.gen.send(value)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\ipykernel\kernelbase.py", line 365, in process_one
yield gen.maybe_future(dispatch(*args))
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tornado\gen.py", line 209, in wrapper
yielded = next(result)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\ipykernel\kernelbase.py", line 272, in dispatch_shell
yield gen.maybe_future(handler(stream, idents, msg))
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tornado\gen.py", line 209, in wrapper
yielded = next(result)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\ipykernel\kernelbase.py", line 542, in execute_request
user_expressions, allow_stdin,
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tornado\gen.py", line 209, in wrapper
yielded = next(result)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\ipykernel\ipkernel.py", line 294, in do_execute
res = shell.run_cell(code, store_history=store_history, silent=silent)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\ipykernel\zmqshell.py", line 536, in run_cell
return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\IPython\core\interactiveshell.py", line 2855, in run_cell
raw_cell, store_history, silent, shell_futures)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\IPython\core\interactiveshell.py", line 2881, in _run_cell
return runner(coro)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\IPython\core\async_helpers.py", line 68, in _pseudo_sync_runner
coro.send(None)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\IPython\core\interactiveshell.py", line 3058, in run_cell_async
interactivity=interactivity, compiler=compiler, result=result)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\IPython\core\interactiveshell.py", line 3249, in run_ast_nodes
if (await self.run_code(code, result, async_=asy)):
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\IPython\core\interactiveshell.py", line 3326, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-1-c77ea08f5c30>", line 1, in <module>
runfile('C:/Users/mazat/Documents/Python/MVTools/player_detector/player_detector_testing.py', wdir='C:/Users/mazat/Documents/Python/MVTools/player_detector')
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 827, in runfile
execfile(filename, namespace)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 110, in execfile
exec(compile(f.read(), filename, 'exec'), namespace)
File "C:/Users/mazat/Documents/Python/MVTools/player_detector/player_detector_testing.py", line 399, in <module>
player_detector_run()
File "C:/Users/mazat/Documents/Python/MVTools/player_detector/player_detector_testing.py", line 392, in player_detector_run
glavnaya(dropbox_folder,gamename,mvstatus)
File "C:/Users/mazat/Documents/Python/MVTools/player_detector/player_detector_testing.py", line 137, in glavnaya
yolo_class=YOLO(model_name,script_dir, res)
File "C:\Users\mazat\Documents\Python\MVTools\player_detector\yolo3\yolo3.py", line 39, in __init__
self.boxes, self.scores, self.classes = self.generate()
File "C:\Users\mazat\Documents\Python\MVTools\player_detector\yolo3\yolo3.py", line 68, in generate
if is_tiny_version else yolo_body(Input(shape=(None,None,3)), num_anchors//3, num_classes)
File "C:\Users\mazat\Documents\Python\MVTools\player_detector\yolo3\model.py", line 72, in yolo_body
darknet = Model(inputs, darknet_body(inputs))
File "C:\Users\mazat\Documents\Python\MVTools\player_detector\yolo3\model.py", line 48, in darknet_body
x = DarknetConv2D_BN_Leaky(32, (3,3))(x)
File "C:\Users\mazat\Documents\Python\MVTools\player_detector\yolo3\utils.py", line 16, in <lambda>
return reduce(lambda f, g: lambda *a, **kw: g(f(*a, **kw)), funcs)
File "C:\Users\mazat\Documents\Python\MVTools\player_detector\yolo3\utils.py", line 16, in <lambda>
return reduce(lambda f, g: lambda *a, **kw: g(f(*a, **kw)), funcs)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\keras\engine\base_layer.py", line 457, in __call__
output = self.call(inputs, **kwargs)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\keras\layers\convolutional.py", line 171, in call
dilation_rate=self.dilation_rate)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\keras\backend\tensorflow_backend.py", line 3650, in conv2d
data_format=tf_data_format)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 851, in convolution
return op(input, filter)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 966, in __call__
return self.conv_op(inp, filter)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 591, in __call__
return self.call(inp, filter)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 208, in __call__
name=self.name)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\ops\gen_nn_ops.py", line 1026, in conv2d
data_format=data_format, dilations=dilations, name=name)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 788, in _apply_op_helper
op_def=op_def)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\util\deprecation.py", line 507, in new_func
return func(*args, **kwargs)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\framework\ops.py", line 3300, in create_op
op_def=op_def)
File "C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\tensorflow\python\framework\ops.py", line 1801, in __init__
self._traceback = tf_stack.extract_stack()
UnknownError (see above for traceback): Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[node conv2d_1/convolution (defined at C:\Users\mazat\Anaconda3\envs\tf_gpu\lib\site-packages\keras\backend\tensorflow_backend.py:3650) ]]
Here's also the result of conda list.
I get the same issue if I don't force the Python or TensorFlow versions and instead install the default TensorFlow 1.14 and Python 3.7.
(tf_gpu_tds) C:\Users\mazat>conda list
# packages in environment at C:\Users\mazat\Anaconda3\envs\tf_gpu_tds:
#
# Name Version Build Channel
_tflow_select 2.1.0 gpu
absl-py 0.8.0 py37_0
alabaster 0.7.12 py37_0
asn1crypto 0.24.0 py37_0
astor 0.8.0 py37_0
astroid 2.3.1 py37_0
attrs 19.1.0 py37_1
babel 2.7.0 py_0
backcall 0.1.0 py37_0
blas 1.0 mkl
bleach 3.1.0 py37_0
ca-certificates 2019.9.11 hecc5488_0 conda-forge
certifi 2019.9.11 py37_0
cffi 1.12.3 py37h7a1dbc1_0
chardet 3.0.4 py37_1003
cloudpickle 1.2.2 py_0
colorama 0.4.1 py37_0
cryptography 2.7 py37h7a1dbc1_0
cudatoolkit 10.0.130 0
cudnn 7.6.0 cuda10.0_0
cycler 0.10.0 py_1 conda-forge
cytoolz 0.10.0 py37hfa6e2cd_0 conda-forge
dask-core 2.5.0 py_0 conda-forge
decorator 4.4.0 py37_1
defusedxml 0.6.0 py_0
docutils 0.15.2 py37_0
entrypoints 0.3 py37_0
freetype 2.9.1 ha9979f8_1
gast 0.3.2 py_0
grpcio 1.16.1 py37h351948d_1
h5py 2.9.0 py37h5e291fa_0
hdf5 1.10.4 h7ebc959_0
icc_rt 2019.0.0 h0cc432a_1
icu 58.2 ha66f8fd_1
idna 2.8 py37_0
imageio 2.5.0 py37_0 conda-forge
imagesize 1.1.0 py37_0
intel-openmp 2019.4 245
ipykernel 5.1.2 py37h39e3cac_0
ipython 7.8.0 py37h39e3cac_0
ipython_genutils 0.2.0 py37_0
isort 4.3.21 py37_0
jedi 0.15.1 py37_0
jinja2 2.10.1 py37_0
joblib 0.13.2 py37_0
jpeg 9b hb83a4c4_2
jsonschema 3.0.2 py37_0
jupyter_client 5.3.3 py37_1
jupyter_core 4.5.0 py_0
keras-applications 1.0.8 py_0
keras-base 2.2.4 py37_0 anaconda
keras-gpu 2.2.4 0 anaconda
keras-preprocessing 1.1.0 py_1
keyring 18.0.0 py37_0
kiwisolver 1.1.0 py37he980bc4_0 conda-forge
lazy-object-proxy 1.4.2 py37he774522_0
libpng 1.6.37 h2a8f88b_0
libprotobuf 3.9.2 h7bd577a_0
libsodium 1.0.16 h9d3ae62_0
libtiff 4.0.10 hb898794_2
markdown 3.1.1 py37_0
markupsafe 1.1.1 py37he774522_0
matplotlib-base 3.1.1 py37h2852a4a_1 conda-forge
mccabe 0.6.1 py37_1
mistune 0.8.4 py37he774522_0
mkl 2019.4 245
mkl-service 2.3.0 py37hb782905_0
mkl_fft 1.0.14 py37h14836fe_0
mkl_random 1.1.0 py37h675688f_0
nbconvert 5.6.0 py37_1
nbformat 4.4.0 py37_0
networkx 2.3 py_0 conda-forge
numpy 1.16.5 py37h19fb1c0_0
numpy-base 1.16.5 py37hc3f5095_0
numpydoc 0.9.1 py_0
olefile 0.46 py37_0
openssl 1.1.1c hfa6e2cd_0 conda-forge
packaging 19.2 py_0
pandas 0.25.1 py37ha925a31_0 anaconda
pandoc 2.2.3.2 0
pandocfilters 1.4.2 py37_1
parso 0.5.1 py_0
pickleshare 0.7.5 py37_0
pillow 6.1.0 py37hdc69c19_0
pip 19.2.3 py37_0
prompt_toolkit 2.0.9 py37_0
protobuf 3.9.2 py37h33f27b4_0
psutil 5.6.3 py37he774522_0
pycodestyle 2.5.0 py37_0
pycparser 2.19 py37_0
pyflakes 2.1.1 py37_0
pygments 2.4.2 py_0
pylint 2.4.2 py37_0
pyopenssl 19.0.0 py37_0
pyparsing 2.4.2 py_0
pyqt 5.9.2 py37h6538335_2
pyreadline 2.1 py37_1
pyrsistent 0.15.4 py37he774522_0
pysocks 1.7.1 py37_0
python 3.7.4 h5263a28_0
python-dateutil 2.8.0 py37_0
pytz 2019.2 py_0
pywavelets 1.0.3 py37h452e1ab_1 conda-forge
pywin32 223 py37hfa6e2cd_1
pyyaml 5.1.2 py37he774522_0 anaconda
pyzmq 18.1.0 py37ha925a31_0
qt 5.9.7 vc14h73c81de_0
qtawesome 0.6.0 py_0
qtconsole 4.5.5 py_0
qtpy 1.9.0 py_0
requests 2.22.0 py37_0
rope 0.14.0 py_0
scikit-image 0.15.0 py37he350917_2 conda-forge
scikit-learn 0.21.3 py37h6288b17_0
scipy 1.3.1 py37h29ff71c_0
setuptools 41.2.0 py37_0
sip 4.19.8 py37h6538335_0
six 1.12.0 py37_0
snowballstemmer 1.9.1 py_0
sphinx 2.2.0 py_0
sphinxcontrib-applehelp 1.0.1 py_0
sphinxcontrib-devhelp 1.0.1 py_0
sphinxcontrib-htmlhelp 1.0.2 py_0
sphinxcontrib-jsmath 1.0.1 py_0
sphinxcontrib-qthelp 1.0.2 py_0
sphinxcontrib-serializinghtml 1.1.3 py_0
spyder 3.3.6 py37_0
spyder-kernels 0.5.2 py37_0
sqlite 3.29.0 he774522_0
tensorboard 1.14.0 py37he3c9ec2_0
tensorflow 1.14.0 gpu_py37h5512b17_0
tensorflow-base 1.14.0 gpu_py37h55fc52a_0
tensorflow-estimator 1.14.0 py_0
tensorflow-gpu 1.14.0 h0d30ee6_0
termcolor 1.1.0 py37_1
testpath 0.4.2 py37_0
tk 8.6.8 hfa6e2cd_0
toolz 0.10.0 py_0 conda-forge
tornado 6.0.3 py37he774522_0
traitlets 4.3.2 py37_0
urllib3 1.24.2 py37_0
vc 14.1 h0510ff6_4
vs2015_runtime 14.16.27012 hf0eaf9b_0
wcwidth 0.1.7 py37_0
webencodings 0.5.1 py37_1
werkzeug 0.16.0 py_0
wheel 0.33.6 py37_0
win_inet_pton 1.1.0 py37_0
wincertstore 0.2 py37_0
wrapt 1.11.2 py37he774522_0
xz 5.2.4 h2fa13f4_4
yaml 0.1.7 vc14h4cb57cf_1 [vc14] anaconda
zeromq 4.3.1 h33f27b4_3
zlib 1.2.11 h62dcd97_3
zstd 1.3.7 h508b16e_0
In the end, I got my answer from this GitHub issue. After several reinstalls and restarts, these lines of code made all the difference:
import tensorflow as tf
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
tf.keras.backend.set_session(tf.Session(config=config))
I'm still not sure what was happening, but I guess my suggestion would be to make sure you restart your computer often enough. Now both environments, TF 1.13.1 + Python 3.6 and TF 1.14 + Python 3.7, work for me.
Please note that, for compatibility reasons, the code for newer (>= 2.0) versions of TensorFlow is:
import tensorflow as tf
config = tf.compat.v1.ConfigProto()
config.gpu_options.allow_growth = True
tf.compat.v1.keras.backend.set_session(tf.compat.v1.Session(config=config))
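If you are on TensorFlow 2.x and don't need the v1 compatibility layer at all, an alternative sketch (assuming a 2.x install) is to enable memory growth directly on the physical GPUs:
import tensorflow as tf

# ask TensorFlow to allocate GPU memory on demand instead of grabbing it all at
# startup, which is what allow_growth did for the v1 Session above
for gpu in tf.config.experimental.list_physical_devices("GPU"):
    tf.config.experimental.set_memory_growth(gpu, True)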
Since a compatibility issue is the most probable cause, as we discussed in the comments, I looked up the tested TensorFlow versions with respect to the CUDA and cuDNN versions. You can find them here.
Please feel free to update the status of your problem after you set up your environment according to the given link.
I hope it gets resolved.
EDIT: In case you're on Windows, I'd like to add an updated link.
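As a quick check while matching your versions against that list, something like this (tf.test.is_gpu_available works in TF 1.x; it is deprecated in 2.x) tells you whether TensorFlow can actually initialize CUDA/cuDNN:
import tensorflow as tf

print(tf.__version__)
# in TF 1.x this tries to create a GPU device, so it fails if cuDNN cannot initialize
print(tf.test.is_gpu_available())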
I'm trying to use Airflow with a Vertica database as the backend for metadata.
I have correctly configured the airflow.cfg file, providing the connection string for Vertica and the schema name.
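For reference, the relevant line in airflow.cfg looks roughly like this (host, port, credentials and database name are placeholders, and the URL scheme assumes the sqlalchemy-vertica-python dialect from the package list below):
[core]
# hypothetical values; replace with your Vertica host, credentials and database
sql_alchemy_conn = vertica+vertica_python://airflow_user:password@vertica-host:5433/airflow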
This is the error I keep getting when I try to run airflow initdb:
Traceback (most recent call last):
File "/srv/python/virtualenvs/airflow/bin/airflow", line 32, in <module>
args.func(args)
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/airflow/bin/ cli.py", line 1096, in initdb
db.initdb(settings.RBAC)
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/airflow/util s/db.py", line 91, in initdb
upgradedb()
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/airflow/util s/db.py", line 358, in upgradedb
command.upgrade(config, 'heads')
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/alembic/comm and.py", line 254, in upgrade
script.run_env()
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/alembic/scri pt/base.py", line 427, in run_env
util.load_python_file(self.dir, 'env.py')
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/alembic/util /pyfiles.py", line 81, in load_python_file
module = load_module_py(module_id, path)
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/alembic/util /compat.py", line 83, in load_module_py
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/airflow/migr ations/env.py", line 92, in <module>
run_migrations_online()
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/airflow/migr ations/env.py", line 82, in run_migrations_online
compare_type=COMPARE_TYPE,
File "<string>", line 8, in configure
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/alembic/runt ime/environment.py", line 812, in configure
opts=opts
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/alembic/runt ime/migration.py", line 172, in configure
return MigrationContext(dialect, connection, opts, environment_context)
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/alembic/runt ime/migration.py", line 111, in __init__
self.impl = ddl.DefaultImpl.get_by_dialect(dialect)(
File "/srv/python/virtualenvs/airflow/lib/python3.6/site-packages/alembic/ddl/ impl.py", line 65, in get_by_dialect
return _impls[dialect.name]
KeyError: 'vertica'
This is the list of packages installed in a dedicated virtualenv:
alembic 0.9.10
apache-airflow 1.10.3
Babel 2.6.0
certifi 2019.3.9
chardet 3.0.4
Click 7.0
colorama 0.4.1
configparser 3.5.3
croniter 0.3.29
cx-Oracle 7.1.2
defusedxml 0.5.0
dill 0.2.9
docutils 0.14
Flask 1.0.2
Flask-Admin 1.5.3
Flask-AppBuilder 1.12.3
Flask-Babel 0.12.2
Flask-Caching 1.3.3
Flask-Login 0.4.1
Flask-OpenID 1.2.5
Flask-SQLAlchemy 2.3.2
flask-swagger 0.2.13
Flask-WTF 0.14.2
funcsigs 1.0.0
future 0.16.0
gitdb2 2.0.5
GitPython 2.1.11
gunicorn 19.9.0
idna 2.8
iso8601 0.1.12
itsdangerous 1.1.0
Jinja2 2.10
json-merge-patch 0.2
lockfile 0.12.2
lxml 4.3.3
Mako 1.0.8
Markdown 2.6.11
MarkupSafe 1.1.1
numpy 1.16.2
ordereddict 1.1
pandas 0.24.2
pendulum 1.4.4
pip 19.0.3
psutil 5.6.1
psycopg2 2.8.2
psycopg2-binary 2.8.2
Pygments 2.3.1
python-daemon 2.1.2
python-dateutil 2.8.0
python-editor 1.0.4
python3-openid 3.1.0
pytz 2019.1
pytzdata 2019.1
PyYAML 5.1
requests 2.21.0
setproctitle 1.1.10
setuptools 41.0.0
six 1.12.0
smmap2 2.0.5
SQLAlchemy 1.2.18
sqlalchemy-vertica-python 0.4.4
tabulate 0.8.3
tenacity 4.12.0
text-unidecode 1.2
thrift 0.11.0
tzlocal 1.5.1
unicodecsv 0.14.1
urllib3 1.24.1
vertica-python 0.9.1
Werkzeug 0.14.1
wheel 0.33.1
WTForms 2.2.1
zope.deprecation 4.4.0
Thanks