How can I install fbprophet without the wheel build error? - python

I use Python 3.10 without conda and am trying to install fbprophet in order to do time series analysis. Nevertheless, I get the error below.
Does anyone have an idea how to avoid this problem? I tried installing bdist_wheel and pystan alongside it, and I also tried installing conda. However, nothing changed.
Thanks to whoever answers me :)
Bruce
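(For reference, the attempts mentioned above correspond roughly to the commands below. This is a reconstruction for illustration rather than the exact invocations; the bdist_wheel command itself comes from the wheel package.)
# hypothetical reconstruction of the attempted installs
pip install wheel pystan
pip install fbprophet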
│ exit code: 1
╰─> [55 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build\lib
creating build\lib\fbprophet
creating build\lib\fbprophet\stan_model
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "C:\Users\33646\AppData\Local\Temp\pip-install-ayf6nuuh\fbprophet_7b8991beef76435da2b024d4df8dce33\setup.py", line 122, in <module>
setup(
File "c:\Users\33646\AppData\Local\Programs\Python\Python310\lib\site-packages\setuptools\__init__.py", line 87, in setup
return distutils.core.setup(**attrs)
File "c:\Users\33646\AppData\Local\Programs\Python\Python310\lib\site-packages\setuptools\_distutils\core.py", line 177, in setup
return run_commands(dist)
File "c:\Users\33646\AppData\Local\Programs\Python\Python310\lib\site-packages\setuptools\_distutils\core.py", line 193, in run_commands
dist.run_commands()
File "c:\Users\33646\AppData\Local\Programs\Python\Python310\lib\site-packages\setuptools\_distutils\dist.py", line 968, in run_commands
self.run_command(cmd)
...
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multi_normal_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multi_normal_lpdf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multi_normal_prec_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multi_normal_prec_lpdf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multi_normal_rng.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
...
Collecting fbprophet
Using cached fbprophet-0.7.1.tar.gz (64 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Requirement already satisfied: Cython>=0.22 in c:\users\33646\appdata\local\programs\python\python310\lib\site-packages (from fbprophet) (0.29.32)
Collecting cmdstanpy==0.9.5
Using cached cmdstanpy-0.9.5-py3-none-any.whl (37 kB)
Collecting pystan>=2.14
Using cached pystan-3.5.0-py3-none-any.whl (13 kB)
Requirement already satisfied: numpy>=1.15.4 in c:\users\33646\appdata\local\programs\python\python310\lib\site-packages (from fbprophet) (1.23.0)
Requirement already satisfied: pandas>=1.0.4 in c:\users\33646\appdata\local\programs\python\python310\lib\site-packages (from fbprophet) (1.4.3)
Requirement already satisfied: matplotlib>=2.0.0 in c:\users\33646\appdata\local\programs\python\python310\lib\site-packages (from fbprophet) (3.5.2)
Collecting LunarCalendar>=0.0.9
Using cached LunarCalendar-0.0.9-py2.py3-none-any.whl (18 kB)
Collecting convertdate>=2.1.2
Using cached convertdate-2.4.0-py3-none-any.whl (47 kB)
Collecting holidays>=0.10.2
Downloading holidays-0.15-py3-none-any.whl (181 kB)
------------------------------------ 181.3/181.3 kB 684.2 kB/s eta 0:00:00
Requirement already satisfied: setuptools-git>=1.2 in c:\users\33646\appdata\local\programs\python\python310\lib\site-packages (from fbprophet) (1.2)
Requirement already satisfied: python-dateutil>=2.8.0 in c:\users\33646\appdata\local\programs\python\python310\lib\site-packages (from fbprophet) (2.8.2)
Requirement already satisfied: tqdm>=4.36.1 in c:\users\33646\appdata\local\programs\python\python310\lib\site-packages (from fbprophet) (4.64.0)
Requirement already satisfied: pymeeus<=1,>=0.3.13 in c:\users\33646\appdata\local\programs\python\python310\lib\site-packages (from convertdate>=2.1.2->fbprophet) (0.5.11)
Requirement already satisfied: korean-lunar-calendar in c:\users\33646\appdata\local\programs\python\python310\lib\site-packages (from holidays>=0.10.2->fbprophet) (0.2.1)
Collecting hijri-converter
...
Failed to build fbprophet pystan
Installing collected packages: pystan, hijri-converter, convertdate, LunarCalendar, holidays, cmdstanpy, fbprophet
Running setup.py install for pystan: started
Running setup.py install for pystan: finished with status 'error'
...
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multi_student_t_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multi_student_t_lpdf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multi_student_t_rng.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multinomial_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multinomial_lpmf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\multinomial_rng.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\neg_binomial_2_log_glm_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\neg_binomial_2_log_glm_lpmf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\normal_id_glm_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\normal_id_glm_lpdf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\ordered_logistic_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\ordered_logistic_lpmf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\ordered_logistic_rng.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\ordered_probit_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\ordered_probit_lpmf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\ordered_probit_rng.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\poisson_log_glm_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\poisson_log_glm_lpmf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\wishart_log.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\wishart_lpdf.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
copying pystan\stan\lib\stan_math\stan\math\prim\mat\prob\wishart_rng.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\prob
creating build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\vectorize
copying pystan\stan\lib\stan_math\stan\math\prim\mat\vectorize\apply_scalar_unary.hpp -> build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\mat\vectorize
creating build\lib.win-amd64-cpython-310\pystan\stan\lib\stan_math\stan\math\prim\scal
...
╰─> pystan
note: This is an issue with the package mentioned above, not pip.
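One note that may help: fbprophet 0.7.1 is the last release published under the fbprophet name, and the project is now distributed on PyPI as prophet. On newer Python versions a commonly suggested workaround is therefore to try the renamed package instead of fbprophet:
# commonly suggested alternative to building fbprophet 0.7.1 from source
pip install prophet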

Related

ERROR: Failed building wheel for tokenizers

I am using Windows 11 x64 with Python 3.11.0. When I run
pip install transformers
I get this error:
Microsoft Windows [Version 10.0.22621.674]
(c) Microsoft Corporation. All rights reserved.
C:\Users\donhu>python
Python 3.11.0 (main, Oct 24 2022, 18:26:48) [MSC v.1933 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> exit()
C:\Users\donhu>conda install -c huggingface transformers
'conda' is not recognized as an internal or external command,
operable program or batch file.
C:\Users\donhu>pip install transformers
Collecting transformers
Downloading transformers-4.24.0-py3-none-any.whl (5.5 MB)
---------------------------------------- 5.5/5.5 MB 12.5 MB/s eta 0:00:00
Collecting filelock
Using cached filelock-3.8.0-py3-none-any.whl (10 kB)
Collecting huggingface-hub<1.0,>=0.10.0
Using cached huggingface_hub-0.10.1-py3-none-any.whl (163 kB)
Requirement already satisfied: numpy>=1.17 in c:\users\donhu\appdata\local\programs\python\python311\lib\site-packages (from transformers) (1.23.4)
Requirement already satisfied: packaging>=20.0 in c:\users\donhu\appdata\local\programs\python\python311\lib\site-packages (from transformers) (21.3)
Collecting pyyaml>=5.1
Using cached PyYAML-6.0-cp311-cp311-win_amd64.whl (143 kB)
Collecting regex!=2019.12.17
Using cached regex-2022.10.31-cp311-cp311-win_amd64.whl (267 kB)
Requirement already satisfied: requests in c:\users\donhu\appdata\local\programs\python\python311\lib\site-packages (from transformers) (2.28.1)
Collecting tokenizers!=0.11.3,<0.14,>=0.11.1
Downloading tokenizers-0.13.1.tar.gz (358 kB)
---------------------------------------- 358.7/358.7 kB 21.8 MB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting tqdm>=4.27
Using cached tqdm-4.64.1-py2.py3-none-any.whl (78 kB)
Collecting typing-extensions>=3.7.4.3
Using cached typing_extensions-4.4.0-py3-none-any.whl (26 kB)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in c:\users\donhu\appdata\local\programs\python\python311\lib\site-packages (from packaging>=20.0->transformers) (3.0.9)
Requirement already satisfied: colorama in c:\users\donhu\appdata\local\programs\python\python311\lib\site-packages (from tqdm>=4.27->transformers) (0.4.6)
Requirement already satisfied: charset-normalizer<3,>=2 in c:\users\donhu\appdata\local\programs\python\python311\lib\site-packages (from requests->transformers) (2.1.1)
Requirement already satisfied: idna<4,>=2.5 in c:\users\donhu\appdata\local\programs\python\python311\lib\site-packages (from requests->transformers) (3.4)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in c:\users\donhu\appdata\local\programs\python\python311\lib\site-packages (from requests->transformers) (1.26.12)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\donhu\appdata\local\programs\python\python311\lib\site-packages (from requests->transformers) (2022.9.24)
Building wheels for collected packages: tokenizers
Building wheel for tokenizers (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for tokenizers (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [51 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-cpython-311
creating build\lib.win-amd64-cpython-311\tokenizers
copying py_src\tokenizers\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers
creating build\lib.win-amd64-cpython-311\tokenizers\models
copying py_src\tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\models
creating build\lib.win-amd64-cpython-311\tokenizers\decoders
copying py_src\tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\decoders
creating build\lib.win-amd64-cpython-311\tokenizers\normalizers
copying py_src\tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\normalizers
creating build\lib.win-amd64-cpython-311\tokenizers\pre_tokenizers
copying py_src\tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\pre_tokenizers
creating build\lib.win-amd64-cpython-311\tokenizers\processors
copying py_src\tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\processors
creating build\lib.win-amd64-cpython-311\tokenizers\trainers
copying py_src\tokenizers\trainers\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\trainers
creating build\lib.win-amd64-cpython-311\tokenizers\implementations
copying py_src\tokenizers\implementations\base_tokenizer.py -> build\lib.win-amd64-cpython-311\tokenizers\implementations
copying py_src\tokenizers\implementations\bert_wordpiece.py -> build\lib.win-amd64-cpython-311\tokenizers\implementations
copying py_src\tokenizers\implementations\byte_level_bpe.py -> build\lib.win-amd64-cpython-311\tokenizers\implementations
copying py_src\tokenizers\implementations\char_level_bpe.py -> build\lib.win-amd64-cpython-311\tokenizers\implementations
copying py_src\tokenizers\implementations\sentencepiece_bpe.py -> build\lib.win-amd64-cpython-311\tokenizers\implementations
copying py_src\tokenizers\implementations\sentencepiece_unigram.py -> build\lib.win-amd64-cpython-311\tokenizers\implementations
copying py_src\tokenizers\implementations\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\implementations
creating build\lib.win-amd64-cpython-311\tokenizers\tools
copying py_src\tokenizers\tools\visualizer.py -> build\lib.win-amd64-cpython-311\tokenizers\tools
copying py_src\tokenizers\tools\__init__.py -> build\lib.win-amd64-cpython-311\tokenizers\tools
copying py_src\tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-311\tokenizers
copying py_src\tokenizers\models\__init__.pyi -> build\lib.win-amd64-cpython-311\tokenizers\models
copying py_src\tokenizers\decoders\__init__.pyi -> build\lib.win-amd64-cpython-311\tokenizers\decoders
copying py_src\tokenizers\normalizers\__init__.pyi -> build\lib.win-amd64-cpython-311\tokenizers\normalizers
copying py_src\tokenizers\pre_tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-311\tokenizers\pre_tokenizers
copying py_src\tokenizers\processors\__init__.pyi -> build\lib.win-amd64-cpython-311\tokenizers\processors
copying py_src\tokenizers\trainers\__init__.pyi -> build\lib.win-amd64-cpython-311\tokenizers\trainers
copying py_src\tokenizers\tools\visualizer-styles.css -> build\lib.win-amd64-cpython-311\tokenizers\tools
running build_ext
running build_rust
error: can't find Rust compiler
If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
To update pip, run:
pip install --upgrade pip
and then retry package installation.
If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
C:\Users\donhu>
How to fix it?
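The error output itself names two remedies: upgrading pip, in case a prebuilt wheel exists for this Python version, and installing a Rust toolchain so tokenizers can be built from source. Roughly:
# remedy 1: upgrade pip and retry, in case a prebuilt wheel can then be used
pip install --upgrade pip
pip install transformers
# remedy 2: if a source build is still required, install Rust first
# (on Windows, download and run rustup-init.exe from https://rustup.rs,
# then reopen the terminal so cargo and rustc are on PATH and retry)
Which remedy applies depends on whether a tokenizers wheel is available for Python 3.11 at the time of installation.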

Running setup.py install for h5py ... error

I'm trying to run this repository on a TX2, but I'm stuck on some errors. The repository requires flashing JetPack 4.2, but I couldn't flash it because JetPack 4.2 isn't available, so I installed JetPack 4.5 from this webpage instead.
After that, I tried to install the wheel file, but I get errors with the commands below.
sudo apt update
sudo apt upgrade
sudo apt-get install python-setuptools
sudo apt-get install python-pip
cd ~/Downloads
sudo pip install tensorflow-1.12.1-cp27-cp27mu-linux_aarch64.whl
The directory '/home/nvidia/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
The directory '/home/nvidia/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Processing ./tensorflow-1.12.1-cp27-cp27mu-linux_aarch64.whl
Requirement already satisfied: astor>=0.6.0 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: termcolor>=1.1.0 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: numpy>=1.13.3 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: tensorboard<1.13.0,>=1.12.0 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: six>=1.10.0 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: backports.weakref>=1.0rc1 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: absl-py>=0.1.6 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: wheel in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: gast>=0.2.0 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: keras-preprocessing>=1.0.5 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: protobuf>=3.6.1 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: enum34>=1.1.6 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Collecting keras-applications>=1.0.6 (from tensorflow==1.12.1)
Collecting mock>=2.0.0 (from tensorflow==1.12.1)
Downloading https://files.pythonhosted.org/packages/05/d2/f94e68be6b17f46d2c353564da56e6fb89ef09faeeff3313a046cb810ca9/mock-3.0.5-py2.py3-none-any.whl
Requirement already satisfied: grpcio>=1.8.6 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorflow==1.12.1)
Requirement already satisfied: markdown>=2.6.8 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorboard<1.13.0,>=1.12.0->tensorflow==1.12.1)
Requirement already satisfied: werkzeug>=0.11.10 in /home/nvidia/.local/lib/python2.7/site-packages (from tensorboard<1.13.0,>=1.12.0->tensorflow==1.12.1)
Requirement already satisfied: futures>=3.1.1; python_version < "3" in /home/nvidia/.local/lib/python2.7/site-packages (from tensorboard<1.13.0,>=1.12.0->tensorflow==1.12.1)
Collecting h5py (from keras-applications>=1.0.6->tensorflow==1.12.1)
Downloading https://files.pythonhosted.org/packages/5f/97/a58afbcf40e8abecededd9512978b4e4915374e5b80049af082f49cebe9a/h5py-2.10.0.tar.gz (301kB)
100% |████████████████████████████████| 307kB 1.9MB/s
Requirement already satisfied: funcsigs>=1; python_version < "3.3" in /usr/lib/python2.7/dist-packages (from mock>=2.0.0->tensorflow==1.12.1)
Requirement already satisfied: setuptools>=36 in /home/nvidia/.local/lib/python2.7/site-packages (from markdown>=2.6.8->tensorboard<1.13.0,>=1.12.0->tensorflow==1.12.1)
Installing collected packages: h5py, keras-applications, mock, tensorflow
Running setup.py install for h5py ... error
Complete output from command /usr/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-6EHebF/h5py/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-4T2QcD-record/install-record.txt --single-version-externally-managed --compile:
running install
running build
running build_py
creating build
creating build/lib.linux-aarch64-2.7
creating build/lib.linux-aarch64-2.7/h5py
copying h5py/ipy_completer.py -> build/lib.linux-aarch64-2.7/h5py
copying h5py/version.py -> build/lib.linux-aarch64-2.7/h5py
copying h5py/__init__.py -> build/lib.linux-aarch64-2.7/h5py
copying h5py/h5py_warnings.py -> build/lib.linux-aarch64-2.7/h5py
copying h5py/highlevel.py -> build/lib.linux-aarch64-2.7/h5py
creating build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/vds.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/group.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/selections.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/dataset.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/filters.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/selections2.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/__init__.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/dims.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/files.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/attrs.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/compat.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/datatype.py -> build/lib.linux-aarch64-2.7/h5py/_hl
copying h5py/_hl/base.py -> build/lib.linux-aarch64-2.7/h5py/_hl
creating build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_filters.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_file2.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_attrs.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_h5t.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/common.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_slicing.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_h5pl.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/__init__.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_attrs_data.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_attribute_create.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_h5d_direct_chunk.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_dataset_getitem.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_h5f.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_group.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_h5p.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_selections.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_datatype.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_dimension_scales.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_file.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_dataset.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_base.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_deprecation.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_file_image.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_dims_dimensionproxy.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_h5.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_objects.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_threads.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_completions.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_dtype.py -> build/lib.linux-aarch64-2.7/h5py/tests
copying h5py/tests/test_dataset_swmr.py -> build/lib.linux-aarch64-2.7/h5py/tests
creating build/lib.linux-aarch64-2.7/h5py/tests/test_vds
copying h5py/tests/test_vds/test_virtual_source.py -> build/lib.linux-aarch64-2.7/h5py/tests/test_vds
copying h5py/tests/test_vds/__init__.py -> build/lib.linux-aarch64-2.7/h5py/tests/test_vds
copying h5py/tests/test_vds/test_highlevel_vds.py -> build/lib.linux-aarch64-2.7/h5py/tests/test_vds
copying h5py/tests/test_vds/test_lowlevel_vds.py -> build/lib.linux-aarch64-2.7/h5py/tests/test_vds
running build_ext
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-build-6EHebF/h5py/setup.py", line 159, in <module>
cmdclass = CMDCLASS,
File "/home/nvidia/.local/lib/python2.7/site-packages/setuptools/__init__.py", line 162, in setup
return distutils.core.setup(**attrs)
File "/usr/lib/python2.7/distutils/core.py", line 151, in setup
dist.run_commands()
File "/usr/lib/python2.7/distutils/dist.py", line 953, in run_commands
self.run_command(cmd)
File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/home/nvidia/.local/lib/python2.7/site-packages/setuptools/command/install.py", line 61, in run
return orig.install.run(self)
File "/usr/lib/python2.7/distutils/command/install.py", line 601, in run
self.run_command('build')
File "/usr/lib/python2.7/distutils/cmd.py", line 326, in run_command
self.distribution.run_command(command)
File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/usr/lib/python2.7/distutils/command/build.py", line 128, in run
self.run_command(cmd_name)
File "/usr/lib/python2.7/distutils/cmd.py", line 326, in run_command
self.distribution.run_command(command)
File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "setup_build.py", line 166, in run
config.run()
File "setup_configure.py", line 160, in run
self.hdf5_version = autodetect_version(self.hdf5)
File "setup_configure.py", line 196, in autodetect_version
import pkgconfig
File "/tmp/pip-build-6EHebF/h5py/.eggs/pkgconfig-1.5.3-py2.7.egg/pkgconfig/__init__.py", line 1, in <module>
from .pkgconfig import *
File "/tmp/pip-build-6EHebF/h5py/.eggs/pkgconfig-1.5.3-py2.7.egg/pkgconfig/pkgconfig.py", line 281
flags = _query(packages, *os_opts, *_build_options(option, static=static))
^
SyntaxError: invalid syntax
----------------------------------------
Command "/usr/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-6EHebF/h5py/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-4T2QcD-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-6EHebF/h5py/
Please help me and feel free to ask anything about it. Thank you for reading it.
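As an aside, the two cache warnings near the top of the output suggest passing sudo's -H flag when running pip under sudo, roughly like this (same wheel file as above):
# addresses only the pip cache-permission warnings, not the h5py build failure
sudo -H pip install tensorflow-1.12.1-cp27-cp27mu-linux_aarch64.whl
The h5py failure itself comes from the downloaded pkgconfig 1.5.3 egg raising a SyntaxError under Python 2.7, as the traceback shows.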

How to run Faust from Docker - ERROR: Failed building wheel for python-rocksdb

I'm trying to run Python Faust from Docker.
Based on this documentation: https://faust.readthedocs.io/en/latest/userguide/installation.html
I created a simple Dockerfile:
FROM python:3
ADD ./app/app.py /
RUN pip3 install --upgrade pip
RUN pip install -U faust
RUN pip install "faust[rocksdb]"
RUN pip install "faust[rocksdb,uvloop,fast,redis]"
CMD ["python", "./app.py"]
When I build the image, I receive an error at the fifth step (Step 5/7 : RUN pip install "faust[rocksdb]"):
---> Running in 1e42a5e50cbe
Requirement already satisfied: faust[rocksdb] in /usr/local/lib/python3.10/site-packages (1.10.4)
Requirement already satisfied: terminaltables<4.0,>=3.1 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (3.1.10)
Requirement already satisfied: click<8.0,>=6.7 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (7.1.2)
Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (1.7.2)
Requirement already satisfied: aiohttp-cors<2.0,>=0.7 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (0.7.0)
Requirement already satisfied: mypy-extensions in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (0.4.3)
Requirement already satisfied: colorclass<3.0,>=2.2 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (2.2.2)
Requirement already satisfied: opentracing<2.0.0,>=1.3.0 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (1.3.0)
Requirement already satisfied: mode<4.4,>=4.3.2 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (4.3.2)
Requirement already satisfied: venusian<2.0,>=1.1 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (1.2.0)
Requirement already satisfied: aiohttp<4.0,>=3.5.2 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (3.8.1)
Requirement already satisfied: robinhood-aiokafka<1.2,>=1.1.6 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (1.1.6)
Requirement already satisfied: croniter>=0.3.16 in /usr/local/lib/python3.10/site-packages (from faust[rocksdb]) (1.1.0)
Collecting python-rocksdb>=0.6.7
Downloading python-rocksdb-0.7.0.tar.gz (219 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.10/site-packages (from aiohttp<4.0,>=3.5.2->faust[rocksdb]) (1.2.0)
Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.10/site-packages (from aiohttp<4.0,>=3.5.2->faust[rocksdb]) (21.2.0)
Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.10/site-packages (from aiohttp<4.0,>=3.5.2->faust[rocksdb]) (1.2.0)
Requirement already satisfied: charset-normalizer<3.0,>=2.0 in /usr/local/lib/python3.10/site-packages (from aiohttp<4.0,>=3.5.2->faust[rocksdb]) (2.0.9)
Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.10/site-packages (from aiohttp<4.0,>=3.5.2->faust[rocksdb]) (5.2.0)
Requirement already satisfied: async-timeout<5.0,>=4.0.0a3 in /usr/local/lib/python3.10/site-packages (from aiohttp<4.0,>=3.5.2->faust[rocksdb]) (4.0.2)
Requirement already satisfied: python-dateutil in /usr/local/lib/python3.10/site-packages (from croniter>=0.3.16->faust[rocksdb]) (2.8.2)
Requirement already satisfied: colorlog>=2.9.0 in /usr/local/lib/python3.10/site-packages (from mode<4.4,>=4.3.2->faust[rocksdb]) (6.6.0)
Requirement already satisfied: setuptools>=25 in /usr/local/lib/python3.10/site-packages (from python-rocksdb>=0.6.7->faust[rocksdb]) (57.5.0)
Requirement already satisfied: kafka-python<1.5,>=1.4.6 in /usr/local/lib/python3.10/site-packages (from robinhood-aiokafka<1.2,>=1.1.6->faust[rocksdb]) (1.4.7)
Requirement already satisfied: idna>=2.0 in /usr/local/lib/python3.10/site-packages (from yarl<2.0,>=1.0->faust[rocksdb]) (3.3)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/site-packages (from python-dateutil->croniter>=0.3.16->faust[rocksdb]) (1.16.0)
And the error part:
Building wheels for collected packages: python-rocksdb
Building wheel for python-rocksdb (setup.py): started
ERROR: Command errored out with exit status 1:
command: /usr/local/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-b8y7g4hs/python-rocksdb_b1c08993fd134ac4bc59e6f5d18bcd91/setup.py'"'"'; __file__='"'"'/tmp/pip-install-b8y7g4hs/python-rocksdb_b1c08993fd134ac4bc59e6f5d18bcd91/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-9_o4ek6z
cwd: /tmp/pip-install-b8y7g4hs/python-rocksdb_b1c08993fd134ac4bc59e6f5d18bcd91/
Complete output (64 lines):
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.10
creating build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/interfaces.py -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/errors.py -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/merge_operators.py -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/__init__.py -> build/lib.linux-x86_64-3.10/rocksdb
creating build/lib.linux-x86_64-3.10/rocksdb/tests
copying rocksdb/tests/test_memtable.py -> build/lib.linux-x86_64-3.10/rocksdb/tests
copying rocksdb/tests/test_db.py -> build/lib.linux-x86_64-3.10/rocksdb/tests
copying rocksdb/tests/__init__.py -> build/lib.linux-x86_64-3.10/rocksdb/tests
copying rocksdb/tests/test_options.py -> build/lib.linux-x86_64-3.10/rocksdb/tests
running egg_info
writing python_rocksdb.egg-info/PKG-INFO
writing dependency_links to python_rocksdb.egg-info/dependency_links.txt
writing requirements to python_rocksdb.egg-info/requires.txt
writing top-level names to python_rocksdb.egg-info/top_level.txt
reading manifest file 'python_rocksdb.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'python_rocksdb.egg-info/SOURCES.txt'
copying rocksdb/_rocksdb.cpp -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/_rocksdb.pyx -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/backup.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/cache.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/comparator.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/db.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/env.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/filter_policy.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/iterator.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/logger.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/memtablerep.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/merge_operator.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/options.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/slice.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/slice_transform.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/snapshot.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/status.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/std_memory.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/table_factory.pxd -> build/lib.linux-x86_64-3.10/rocksdb
copying rocksdb/universal_compaction.pxd -> build/lib.linux-x86_64-3.10/rocksdb
creating build/lib.linux-x86_64-3.10/rocksdb/cpp
copying rocksdb/cpp/comparator_wrapper.hpp -> build/lib.linux-x86_64-3.10/rocksdb/cpp
copying rocksdb/cpp/filter_policy_wrapper.hpp -> build/lib.linux-x86_64-3.10/rocksdb/cpp
copying rocksdb/cpp/memtable_factories.hpp -> build/lib.linux-x86_64-3.10/rocksdb/cpp
copying rocksdb/cpp/merge_operator_wrapper.hpp -> build/lib.linux-x86_64-3.10/rocksdb/cpp
copying rocksdb/cpp/slice_transform_wrapper.hpp -> build/lib.linux-x86_64-3.10/rocksdb/cpp
copying rocksdb/cpp/utils.hpp -> build/lib.linux-x86_64-3.10/rocksdb/cpp
copying rocksdb/cpp/write_batch_iter_helper.hpp -> build/lib.linux-x86_64-3.10/rocksdb/cpp
running build_ext
cythoning rocksdb/_rocksdb.pyx to rocksdb/_rocksdb.cpp
/tmp/pip-install-b8y7g4hs/python-rocksdb_b1c08993fd134ac4bc59e6f5d18bcd91/.eggs/Cython-0.29.26-py3.10-linux-x86_64.egg/Cython/Compiler/Main.py:369: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /tmp/pip-install-b8y7g4hs/python-rocksdb_b1c08993fd134ac4bc59e6f5d18bcd91/rocksdb/_rocksdb.pyx
tree = Parsing.p_module(s, pxd, full_module_name)
building 'rocksdb._rocksdb' extension
creating build/temp.linux-x86_64-3.10
creating build/temp.linux-x86_64-3.10/rocksdb
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -I/usr/local/include/python3.10 -c rocksdb/_rocksdb.cpp -o build/temp.linux-x86_64-3.10/rocksdb/_rocksdb.o -std=c++11 -O3 -Wall -Wextra -Wconversion -fno-strict-aliasing -fno-rtti
rocksdb/_rocksdb.cpp:705:10: fatal error: rocksdb/slice.h: No such file or directory
705 | #include "rocksdb/slice.h"
| ^~~~~~~~~~~~~~~~~
compilation terminated.
error: command '/usr/bin/gcc' failed with exit code 1
----------------------------------------
Building wheel for python-rocksdb (setup.py): finished with status 'error'
ERROR: Failed building wheel for python-rocksdb
Can anyone help me to move on with this? I'd like to use Faust from Docker on Kubernetes.
Read the error message, where it is clearly stated you are missing a header file:
fatal error: rocksdb/slice.h: No such file or directory
705 | #include "rocksdb/slice.h"
| ^~~~~~~~~~~~~~~~~
compilation terminated.
error: command '/usr/bin/gcc' failed with exit code 1
Accordingly, you'll need to build and install RocksDB. This is separate from the installation of faust[rocksdb] with pip. That simply installs python-rocksdb, the Python interface to the underlying libraries.
There is even a (third-party) RocksDB docker image based on Python 3.7 Slim.
You could use that directly or take some tricks from the Dockerfile for that image.
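For example, a minimal sketch of the adjusted Dockerfile, assuming the Debian-based python:3 image and that Debian's librocksdb-dev package is recent enough for python-rocksdb (if it is not, RocksDB would have to be built from source instead):
FROM python:3
# install the RocksDB headers and libraries (plus build tools and compression deps)
# before the pip steps, so rocksdb/slice.h is found when python-rocksdb compiles
RUN apt-get update && apt-get install -y --no-install-recommends \
    build-essential librocksdb-dev libsnappy-dev zlib1g-dev libbz2-dev liblz4-dev \
    && rm -rf /var/lib/apt/lists/*
ADD ./app/app.py /
RUN pip3 install --upgrade pip
RUN pip install -U faust
RUN pip install "faust[rocksdb]"
RUN pip install "faust[rocksdb,uvloop,fast,redis]"
CMD ["python", "./app.py"]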

How to fix the error “ERROR: Command errored out with exit status 1: python.” when running py -m pip install google-assistant-sdk[samples]

This is what I got when I ran the command py -m pip install google-assistant-sdk[samples] in cmd with administrator rights.
C:\WINDOWS\system32>py -m pip install google-assistant-sdk[samples]
Collecting google-assistant-sdk[samples]
Using cached
https://files.pythonhosted.org/packages/47/26/b405a0236ea5dd128f4b9c00806f4c457904309e1a6c60ec590e46cc19c4/google_assistant_sdk-0.5.1-py2.py3-none-any.whl
Requirement already satisfied: google-auth-oauthlib[tool]>=0.1.0 in c:\program files (x86)\python38-32\lib\site-packages (from google-assistant-sdk[samples]) (0.4.1)
Requirement already satisfied: sounddevice<0.4,>=0.3.7; extra == "samples" in c:\program files (x86)\python38-32\lib\site-packages (from google-assistant-sdk[samples]) (0.3.14)
Requirement already satisfied: futures<4,>=3.1.1; extra == "samples" in c:\program files (x86)\python38-32\lib\site-packages (from google-assistant-sdk[samples]) (3.1.1)
Collecting google-assistant-grpc==0.2.1; extra == "samples"
Using cached https://files.pythonhosted.org/packages/4b/5d/50dbb8197961acf8a4339e8950e0110159456c4ef48234751d1b5f2e919b/google_assistant_grpc-0.2.1-py2.py3-none-any.whl
Requirement already satisfied: pathlib2<3,>=2.3.0; extra == "samples" in c:\program files (x86)\python38-32\lib\site-packages (from google-assistant-sdk[samples]) (2.3.5)
Requirement already satisfied: urllib3[secure]<2,>=1.21; extra == "samples" in c:\program files (x86)\python38-32\lib\site-packages (from google-assistant-sdk[samples]) (1.25.6)
Requirement already satisfied: tenacity<5,>=4.1.0; extra == "samples" in c:\program files (x86)\python38-32\lib\site-packages (from google-assistant-sdk[samples]) (4.12.0)
Requirement already satisfied: click<7,>=6.7; extra == "samples" in c:\program files (x86)\python38-32\lib\site-packages (from google-assistant-sdk[samples]) (6.7)
Requirement already satisfied: google-auth in c:\program files (x86)\python38-32\lib\site-packages (from google-auth-oauthlib[tool]>=0.1.0->google-assistant-sdk[samples]) (1.6.3)
Requirement already satisfied: requests-oauthlib>=0.7.0 in c:\program files (x86)\python38-32\lib\site-packages (from google-auth-oauthlib[tool]>=0.1.0->google-assistant-sdk[samples]) (1.2.0)
Requirement already satisfied: CFFI>=1.0 in c:\program files (x86)\python38-32\lib\site-packages (from sounddevice<0.4,>=0.3.7; extra == "samples"->google-assistant-sdk[samples]) (1.13.1)
Collecting googleapis-common-protos>=1.5.2
Using cached https://files.pythonhosted.org/packages/eb/ee/e59e74ecac678a14d6abefb9054f0bbcb318a6452a30df3776f133886d7d/googleapis-common-protos-1.6.0.tar.gz
Collecting grpcio>=1.3.5
Using cached https://files.pythonhosted.org/packages/e4/60/40c4d2b61d9e4349bc89445deb8d04cc000b10a63446c42d311e0d21d127/grpcio-1.25.0.tar.gz
Requirement already satisfied: six in c:\program files (x86)\python38-32\lib\site-packages (from pathlib2<3,>=2.3.0; extra == "samples"->google-assistant-sdk[samples]) (1.12.0)
Requirement already satisfied: idna>=2.0.0; extra == "secure" in c:\program files (x86)\python38-32\lib\site-packages (from urllib3[secure]<2,>=1.21; extra == "samples"->google-assistant-sdk[samples]) (2.8)
Requirement already satisfied: pyOpenSSL>=0.14; extra == "secure" in c:\program files (x86)\python38-32\lib\site-packages (from urllib3[secure]<2,>=1.21; extra == "samples"->google-assistant-sdk[samples]) (19.0.0)
Requirement already satisfied: cryptography>=1.3.4; extra == "secure" in c:\program files (x86)\python38-32\lib\site-packages (from urllib3[secure]<2,>=1.21; extra == "samples"->google-assistant-sdk[samples]) (2.8)
Requirement already satisfied: certifi; extra == "secure" in c:\program files (x86)\python38-32\lib\site-packages (from urllib3[secure]<2,>=1.21; extra == "samples"->google-assistant-sdk[samples]) (2019.9.11)
Requirement already satisfied: cachetools>=2.0.0 in c:\program files (x86)\python38-32\lib\site-packages (from google-auth->google-auth-oauthlib[tool]>=0.1.0->google-assistant-sdk[samples]) (3.1.1)
Requirement already satisfied: rsa>=3.1.4 in c:\program files (x86)\python38-32\lib\site-packages (from google-auth->google-auth-oauthlib[tool]>=0.1.0->google-assistant-sdk[samples]) (4.0)
Requirement already satisfied: pyasn1-modules>=0.2.1 in c:\program files (x86)\python38-32\lib\site-packages (from google-auth->google-auth-oauthlib[tool]>=0.1.0->google-assistant-sdk[samples]) (0.2.7)
Requirement already satisfied: requests>=2.0.0 in c:\program files (x86)\python38-32\lib\site-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib[tool]>=0.1.0->google-assistant-sdk[samples]) (2.22.0)
Requirement already satisfied: oauthlib>=3.0.0 in c:\program files (x86)\python38-32\lib\site-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib[tool]>=0.1.0->google-assistant-sdk[samples]) (3.1.0)
Requirement already satisfied: pycparser in c:\program files (x86)\python38-32\lib\site-packages (from CFFI>=1.0->sounddevice<0.4,>=0.3.7; extra == "samples"->google-assistant-sdk[samples]) (2.19)
Collecting protobuf>=3.6.0
Using cached https://files.pythonhosted.org/packages/70/81/6d2dfdc9e8a377e151b1a481293dda7149c44c77428029645c978df22bc0/protobuf-3.11.0-py2.py3-none-any.whl
Requirement already satisfied: pyasn1>=0.1.3 in c:\program files (x86)\python38-32\lib\site-packages (from rsa>=3.1.4->google-auth->google-auth-oauthlib[tool]>=0.1.0->google-assistant-sdk[samples]) (0.4.7)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in c:\program files (x86)\python38-32\lib\site-packages (from requests>=2.0.0->requests-oauthlib>=0.7.0->google-auth-oauthlib[tool]>=0.1.0->google-assistant-sdk[samples]) (3.0.4)
Requirement already satisfied: setuptools in c:\program files (x86)\python38-32\lib\site-packages (from protobuf>=3.6.0->googleapis-common-protos>=1.5.2->google-assistant-grpc==0.2.1; extra == "samples"->google-assistant-sdk[samples]) (41.2.0)
Installing collected packages: protobuf, googleapis-common-protos, grpcio, google-assistant-grpc, google-assistant-sdk
Running setup.py install for googleapis-common-protos ... done
Running setup.py install for grpcio ... error
ERROR: Command errored out with exit status 1:
command: 'C:\Program Files (x86)\Python38-32\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\91999\\AppData\\Local\\Temp\\pip-install-vnrp98y6\\grpcio\\setup.py'"'"'; __file__='"'"'C:\\Users\\91999\\AppData\\Local\\Temp\\pip-install-vnrp98y6\\grpcio\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record 'C:\Users\91999\AppData\Local\Temp\pip-record-q8x2ji7q\install-record.txt' --single-version-externally-managed --compile
cwd: C:\Users\91999\AppData\Local\Temp\pip-install-vnrp98y6\grpcio\
Complete output (66 lines):
Found cython-generated files...
running install
running build
running build_py
running build_project_metadata
creating python_build
creating python_build\lib.win32-3.8
creating python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\_auth.py -> python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\_channel.py -> python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\_common.py -> python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\_compression.py -> python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\_grpcio_metadata.py -> python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\_interceptor.py -> python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\_plugin_wrapping.py -> python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\_server.py -> python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\_utilities.py -> python_build\lib.win32-3.8\grpc
copying src\python\grpcio\grpc\__init__.py -> python_build\lib.win32-3.8\grpc
creating python_build\lib.win32-3.8\grpc\beta
copying src\python\grpcio\grpc\beta\implementations.py -> python_build\lib.win32-3.8\grpc\beta
copying src\python\grpcio\grpc\beta\interfaces.py -> python_build\lib.win32-3.8\grpc\beta
copying src\python\grpcio\grpc\beta\utilities.py -> python_build\lib.win32-3.8\grpc\beta
copying src\python\grpcio\grpc\beta\_client_adaptations.py -> python_build\lib.win32-3.8\grpc\beta
copying src\python\grpcio\grpc\beta\_metadata.py -> python_build\lib.win32-3.8\grpc\beta
copying src\python\grpcio\grpc\beta\_server_adaptations.py -> python_build\lib.win32-3.8\grpc\beta
copying src\python\grpcio\grpc\beta\__init__.py -> python_build\lib.win32-3.8\grpc\beta
creating python_build\lib.win32-3.8\grpc\experimental
copying src\python\grpcio\grpc\experimental\gevent.py -> python_build\lib.win32-3.8\grpc\experimental
copying src\python\grpcio\grpc\experimental\session_cache.py -> python_build\lib.win32-3.8\grpc\experimental
copying src\python\grpcio\grpc\experimental\__init__.py -> python_build\lib.win32-3.8\grpc\experimental
creating python_build\lib.win32-3.8\grpc\framework
copying src\python\grpcio\grpc\framework\__init__.py -> python_build\lib.win32-3.8\grpc\framework
creating python_build\lib.win32-3.8\grpc\_cython
copying src\python\grpcio\grpc\_cython\__init__.py -> python_build\lib.win32-3.8\grpc\_cython
creating python_build\lib.win32-3.8\grpc\experimental\aio
copying src\python\grpcio\grpc\experimental\aio\_channel.py -> python_build\lib.win32-3.8\grpc\experimental\aio
copying src\python\grpcio\grpc\experimental\aio\_server.py -> python_build\lib.win32-3.8\grpc\experimental\aio
copying src\python\grpcio\grpc\experimental\aio\__init__.py -> python_build\lib.win32-3.8\grpc\experimental\aio
creating python_build\lib.win32-3.8\grpc\framework\common
copying src\python\grpcio\grpc\framework\common\cardinality.py -> python_build\lib.win32-3.8\grpc\framework\common
copying src\python\grpcio\grpc\framework\common\style.py -> python_build\lib.win32-3.8\grpc\framework\common
copying src\python\grpcio\grpc\framework\common\__init__.py -> python_build\lib.win32-3.8\grpc\framework\common
creating python_build\lib.win32-3.8\grpc\framework\foundation
copying src\python\grpcio\grpc\framework\foundation\abandonment.py -> python_build\lib.win32-3.8\grpc\framework\foundation
copying src\python\grpcio\grpc\framework\foundation\callable_util.py -> python_build\lib.win32-3.8\grpc\framework\foundation
copying src\python\grpcio\grpc\framework\foundation\future.py -> python_build\lib.win32-3.8\grpc\framework\foundation
copying src\python\grpcio\grpc\framework\foundation\logging_pool.py -> python_build\lib.win32-3.8\grpc\framework\foundation
copying src\python\grpcio\grpc\framework\foundation\stream.py -> python_build\lib.win32-3.8\grpc\framework\foundation
copying src\python\grpcio\grpc\framework\foundation\stream_util.py -> python_build\lib.win32-3.8\grpc\framework\foundation
copying src\python\grpcio\grpc\framework\foundation\__init__.py -> python_build\lib.win32-3.8\grpc\framework\foundation
creating python_build\lib.win32-3.8\grpc\framework\interfaces
copying src\python\grpcio\grpc\framework\interfaces\__init__.py -> python_build\lib.win32-3.8\grpc\framework\interfaces
creating python_build\lib.win32-3.8\grpc\framework\interfaces\base
copying src\python\grpcio\grpc\framework\interfaces\base\base.py -> python_build\lib.win32-3.8\grpc\framework\interfaces\base
copying src\python\grpcio\grpc\framework\interfaces\base\utilities.py -> python_build\lib.win32-3.8\grpc\framework\interfaces\base
copying src\python\grpcio\grpc\framework\interfaces\base\__init__.py -> python_build\lib.win32-3.8\grpc\framework\interfaces\base
creating python_build\lib.win32-3.8\grpc\framework\interfaces\face
copying src\python\grpcio\grpc\framework\interfaces\face\face.py -> python_build\lib.win32-3.8\grpc\framework\interfaces\face
copying src\python\grpcio\grpc\framework\interfaces\face\utilities.py -> python_build\lib.win32-3.8\grpc\framework\interfaces\face
copying src\python\grpcio\grpc\framework\interfaces\face\__init__.py -> python_build\lib.win32-3.8\grpc\framework\interfaces\face
creating python_build\lib.win32-3.8\grpc\_cython\_cygrpc
copying src\python\grpcio\grpc\_cython\_cygrpc\__init__.py -> python_build\lib.win32-3.8\grpc\_cython\_cygrpc
creating python_build\lib.win32-3.8\grpc\_cython\_credentials
copying src\python\grpcio\grpc\_cython\_credentials\roots.pem -> python_build\lib.win32-3.8\grpc\_cython\_credentials
running build_ext
error: [WinError 2] The system cannot find the file specified
----------------------------------------
ERROR: Command errored out with exit status 1: 'C:\Program Files (x86)\Python38-32\python.exe' -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\91999\\AppData\\Local\\Temp\\pip-install-vnrp98y6\\grpcio\\setup.py'"'"'; __file__='"'"'C:\\Users\\91999\\AppData\\Local\\Temp\\pip-install-vnrp98y6\\grpcio\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record 'C:\Users\91999\AppData\Local\Temp\pip-record-q8x2ji7q\install-record.txt' --single-version-externally-managed --compile Check the logs for full command output.
Please help.
This problem should be fixed now. Try installing grpcio 1.26.0rc1.
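A minimal sketch of that suggestion (pinning the exact pre-release version lets pip select the release candidate without needing the --pre flag):
py -m pip install grpcio==1.26.0rc1
py -m pip install google-assistant-sdk[samples]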

python.exe - Entry point not found (mkl_dnn_BatchNormalizationCreateBackward_v2_F32)

I'm referring to this specific Python library: https://github.com/antoinecarme/pyaf
My machine runs Anaconda3-4.4.0 with a separate environment on Python 3.5, since the pyAF library requires Python 3.5 to work.
I have installed this library on my laptop using the recommended options
on the above Github page, based on the following commands:
pip install scipy pandas sklearn matplotlib pydot dill pathos sqlalchemy
pip install --upgrade git+git://github.com/antoinecarme/pyaf.git
To execute the above, I switch the environment in my Git Bash to Python 3.5 and observe the following installation results:
**Dinesh#DESKTOP-O5O752M MINGW64 ~**
$ source activate python35
(python35)
**Dinesh#DESKTOP-O5O752M MINGW64 ~**
$ pip install scipy pandas sklearn matplotlib pydot dill pathos sqlalchemy
Requirement already satisfied: scipy in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Requirement already satisfied: pandas in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Requirement already satisfied: sklearn in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Requirement already satisfied: matplotlib in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Requirement already satisfied: pydot in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Requirement already satisfied: dill in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Requirement already satisfied: pathos in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Requirement already satisfied: sqlalchemy in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Requirement already satisfied: numpy>=1.8.2 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from scipy)
Requirement already satisfied: python-dateutil>=2 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pandas)
Requirement already satisfied: pytz>=2011k in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pandas)
Requirement already satisfied: scikit-learn in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from sklearn)
Requirement already satisfied: six>=1.10 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from matplotlib)
Requirement already satisfied: cycler>=0.10 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from matplotlib)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=1.5.6 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from matplotlib)
Requirement already satisfied: pyreadline>=1.7.1 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from dill)
Requirement already satisfied: pox>=0.2.3 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pathos)
Requirement already satisfied: ppft>=1.6.4.7 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pathos)
Requirement already satisfied: multiprocess>=0.70.5 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pathos)
(python35)
**Dinesh#DESKTOP-O5O752M MINGW64 ~**
$ pip install --upgrade git+git://github.com/antoinecarme/pyaf.git
Collecting git+git://github.com/antoinecarme/pyaf.git
Cloning git://github.com/antoinecarme/pyaf.git to c:\users\dines\appdata\local\temp\pip-_upr_c25-build
Requirement already up-to-date: scipy in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pyaf==1.0)
Collecting pandas (from pyaf==1.0)
Using cached pandas-0.20.3-cp35-cp35m-win_amd64.whl
Requirement already up-to-date: sklearn in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pyaf==1.0)
Requirement already up-to-date: matplotlib in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pyaf==1.0)
Requirement already up-to-date: pydot in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pyaf==1.0)
Requirement already up-to-date: dill in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pyaf==1.0)
Requirement already up-to-date: pathos in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pyaf==1.0)
Collecting sqlalchemy (from pyaf==1.0)
Requirement already up-to-date: numpy>=1.8.2 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from scipy->pyaf==1.0)
Collecting python-dateutil>=2 (from pandas->pyaf==1.0)
Using cached python_dateutil-2.6.1-py2.py3-none-any.whl
Requirement already up-to-date: pytz>=2011k in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pandas->pyaf==1.0)
Collecting scikit-learn (from sklearn->pyaf==1.0)
Using cached scikit_learn-0.19.0-cp35-cp35m-win_amd64.whl
Requirement already up-to-date: six>=1.10 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from matplotlib->pyaf==1.0)
Requirement already up-to-date: cycler>=0.10 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from matplotlib->pyaf==1.0)
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=1.5.6 (from matplotlib->pyaf==1.0)
Using cached pyparsing-2.2.0-py2.py3-none-any.whl
Requirement already up-to-date: pyreadline>=1.7.1 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from dill->pyaf==1.0)
Requirement already up-to-date: ppft>=1.6.4.7 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pathos->pyaf==1.0)
Requirement already up-to-date: pox>=0.2.3 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pathos->pyaf==1.0)
Requirement already up-to-date: multiprocess>=0.70.5 in c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages (from pathos->pyaf==1.0)
Installing collected packages: python-dateutil, pandas, sqlalchemy, pyaf, scikit-learn, pyparsing
Found existing installation: python-dateutil 2.6.0
Uninstalling python-dateutil-2.6.0:
Successfully uninstalled python-dateutil-2.6.0
Found existing installation: pandas 0.20.1
Uninstalling pandas-0.20.1:
Successfully uninstalled pandas-0.20.1
Found existing installation: SQLAlchemy 1.1.9
Uninstalling SQLAlchemy-1.1.9:
Successfully uninstalled SQLAlchemy-1.1.9
Found existing installation: pyaf 1.0
Uninstalling pyaf-1.0:
Successfully uninstalled pyaf-1.0
Running setup.py install for pyaf: started
Running setup.py install for pyaf: finished with status 'done'
Found existing installation: scikit-learn 0.18.1
DEPRECATION: Uninstalling a distutils installed project (scikit-learn) has been deprecated and will be removed in a future version. This is due to the fact that uninstalling a distutils project will only partially uninstall the project.
Uninstalling scikit-learn-0.18.1:
Successfully uninstalled scikit-learn-0.18.1
Found existing installation: pyparsing 2.1.4
Uninstalling pyparsing-2.1.4:
Successfully uninstalled pyparsing-2.1.4
Successfully installed pandas-0.20.3 pyaf-1.0 pyparsing-2.2.0 python-dateutil-2.6.1 scikit-learn-0.19.0 sqlalchemy-1.1.13
Traceback (most recent call last):
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\Scripts\pip-script.py", line 5, in <module>
sys.exit(pip.main())
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\__init__.py", line 249, in main
return command.main(cmd_args)
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\basecommand.py", line 252, in main
pip_version_check(session)
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\utils\outdated.py", line 102, in pip_version_check
installed_version = get_installed_version("pip")
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\utils\__init__.py", line 838, in get_installed_version
working_set = pkg_resources.WorkingSet()
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 644, in __init__
self.add_entry(entry)
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 700, in add_entry
for dist in find_distributions(entry, True):
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1949, in find_eggs_in_zip
if metadata.has_metadata('PKG-INFO'):
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1463, in has_metadata
return self.egg_info and self._has(self._fn(self.egg_info, name))
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1823, in _has
return zip_path in self.zipinfo or zip_path in self._index()
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1703, in zipinfo
return self._zip_manifests.load(self.loader.archive)
File "C:\Toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1643, in load
mtime = os.stat(path).st_mtime
FileNotFoundError: [WinError 2] The system cannot find the file specified: 'C:\\Toolkits\\anaconda3-4.4.0\\envs\\python35\\lib\\site-packages\\pyaf-1.0-py3.5.egg'
(python35)
With this first method, I can see that there are errors at the end of the output. Hence, I downloaded the entire pyAF package from GitHub to my desktop and attempted to install the library as follows:
**(python35) C:\Users\dines\Desktop>** cd pyaf-master
**(python35) C:\Users\dines\Desktop\pyaf-master>** python setup.py install
running install
running bdist_egg
running egg_info
writing pyaf.egg-info\PKG-INFO
writing dependency_links to pyaf.egg-info\dependency_links.txt
writing requirements to pyaf.egg-info\requires.txt
writing top-level names to pyaf.egg-info\top_level.txt
package init file 'pyaf\__init__.py' not found (or not a regular file)
package init file 'pyaf\TS\__init__.py' not found (or not a regular file)
package init file 'pyaf\CodeGen\__init__.py' not found (or not a regular file)
package init file 'pyaf\Bench\__init__.py' not found (or not a regular file)
reading manifest file 'pyaf.egg-info\SOURCES.txt'
writing manifest file 'pyaf.egg-info\SOURCES.txt'
installing library code to build\bdist.win-amd64\egg
running install_lib
running build_py
copying pyaf\ForecastEngine.py -> build\lib\pyaf
copying pyaf\HierarchicalForecastEngine.py -> build\lib\pyaf
copying pyaf\TS\Exogenous.py -> build\lib\pyaf\TS
copying pyaf\TS\Keras_Models.py -> build\lib\pyaf\TS
copying pyaf\TS\Options.py -> build\lib\pyaf\TS
copying pyaf\TS\Perf.py -> build\lib\pyaf\TS
copying pyaf\TS\Plots.py -> build\lib\pyaf\TS
copying pyaf\TS\PredictionIntervals.py -> build\lib\pyaf\TS
copying pyaf\TS\Scikit_Models.py -> build\lib\pyaf\TS
copying pyaf\TS\SignalDecomposition.py -> build\lib\pyaf\TS
copying pyaf\TS\SignalDecomposition_AR.py -> build\lib\pyaf\TS
copying pyaf\TS\SignalDecomposition_Cycle.py -> build\lib\pyaf\TS
copying pyaf\TS\SignalDecomposition_Quant.py -> build\lib\pyaf\TS
copying pyaf\TS\SignalDecomposition_Trend.py -> build\lib\pyaf\TS
copying pyaf\TS\SignalHierarchy.py -> build\lib\pyaf\TS
copying pyaf\TS\Signal_Grouping.py -> build\lib\pyaf\TS
copying pyaf\TS\Signal_Transformation.py -> build\lib\pyaf\TS
copying pyaf\TS\Time.py -> build\lib\pyaf\TS
copying pyaf\TS\TimeSeriesModel.py -> build\lib\pyaf\TS
copying pyaf\TS\Utils.py -> build\lib\pyaf\TS
copying pyaf\CodeGen\TS_CodeGenerator.py -> build\lib\pyaf\CodeGen
copying pyaf\CodeGen\TS_CodeGen_Objects.py -> build\lib\pyaf\CodeGen
copying pyaf\Bench\Artificial.py -> build\lib\pyaf\Bench
copying pyaf\Bench\download_all_stock_prices.py -> build\lib\pyaf\Bench
copying pyaf\Bench\GenericBenchmark.py -> build\lib\pyaf\Bench
copying pyaf\Bench\MComp.py -> build\lib\pyaf\Bench
copying pyaf\Bench\NN3.py -> build\lib\pyaf\Bench
copying pyaf\Bench\stocks_symbol_list.py -> build\lib\pyaf\Bench
copying pyaf\Bench\TS_datasets.py -> build\lib\pyaf\Bench
copying pyaf\Bench\YahooStocks.py -> build\lib\pyaf\Bench
creating build\bdist.win-amd64\egg
creating build\bdist.win-amd64\egg\pyaf
creating build\bdist.win-amd64\egg\pyaf\Bench
copying build\lib\pyaf\Bench\Artificial.py -> build\bdist.win-amd64\egg\pyaf\Bench
copying build\lib\pyaf\Bench\download_all_stock_prices.py -> build\bdist.win-amd64\egg\pyaf\Bench
copying build\lib\pyaf\Bench\GenericBenchmark.py -> build\bdist.win-amd64\egg\pyaf\Bench
copying build\lib\pyaf\Bench\MComp.py -> build\bdist.win-amd64\egg\pyaf\Bench
copying build\lib\pyaf\Bench\NN3.py -> build\bdist.win-amd64\egg\pyaf\Bench
copying build\lib\pyaf\Bench\stocks_symbol_list.py -> build\bdist.win-amd64\egg\pyaf\Bench
copying build\lib\pyaf\Bench\TS_datasets.py -> build\bdist.win-amd64\egg\pyaf\Bench
copying build\lib\pyaf\Bench\YahooStocks.py -> build\bdist.win-amd64\egg\pyaf\Bench
creating build\bdist.win-amd64\egg\pyaf\CodeGen
copying build\lib\pyaf\CodeGen\TS_CodeGenerator.py -> build\bdist.win-amd64\egg\pyaf\CodeGen
copying build\lib\pyaf\CodeGen\TS_CodeGen_Objects.py -> build\bdist.win-amd64\egg\pyaf\CodeGen
copying build\lib\pyaf\ForecastEngine.py -> build\bdist.win-amd64\egg\pyaf
copying build\lib\pyaf\HierarchicalForecastEngine.py -> build\bdist.win-amd64\egg\pyaf
creating build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Exogenous.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Keras_Models.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Options.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Perf.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Plots.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\PredictionIntervals.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Scikit_Models.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\SignalDecomposition.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\SignalDecomposition_AR.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\SignalDecomposition_Cycle.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\SignalDecomposition_Quant.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\SignalDecomposition_Trend.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\SignalHierarchy.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Signal_Grouping.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Signal_Transformation.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Time.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\TimeSeriesModel.py -> build\bdist.win-amd64\egg\pyaf\TS
copying build\lib\pyaf\TS\Utils.py -> build\bdist.win-amd64\egg\pyaf\TS
byte-compiling build\bdist.win-amd64\egg\pyaf\Bench\Artificial.py to Artificial.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\Bench\download_all_stock_prices.py to download_all_stock_prices.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\Bench\GenericBenchmark.py to GenericBenchmark.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\Bench\MComp.py to MComp.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\Bench\NN3.py to NN3.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\Bench\stocks_symbol_list.py to stocks_symbol_list.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\Bench\TS_datasets.py to TS_datasets.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\Bench\YahooStocks.py to YahooStocks.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\CodeGen\TS_CodeGenerator.py to TS_CodeGenerator.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\CodeGen\TS_CodeGen_Objects.py to TS_CodeGen_Objects.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\ForecastEngine.py to ForecastEngine.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\HierarchicalForecastEngine.py to HierarchicalForecastEngine.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Exogenous.py to Exogenous.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Keras_Models.py to Keras_Models.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Options.py to Options.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Perf.py to Perf.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Plots.py to Plots.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\PredictionIntervals.py to PredictionIntervals.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Scikit_Models.py to Scikit_Models.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\SignalDecomposition.py to SignalDecomposition.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\SignalDecomposition_AR.py to SignalDecomposition_AR.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\SignalDecomposition_Cycle.py to SignalDecomposition_Cycle.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\SignalDecomposition_Quant.py to SignalDecomposition_Quant.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\SignalDecomposition_Trend.py to SignalDecomposition_Trend.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\SignalHierarchy.py to SignalHierarchy.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Signal_Grouping.py to Signal_Grouping.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Signal_Transformation.py to Signal_Transformation.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Time.py to Time.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\TimeSeriesModel.py to TimeSeriesModel.cpython-35.pyc
byte-compiling build\bdist.win-amd64\egg\pyaf\TS\Utils.py to Utils.cpython-35.pyc
creating build\bdist.win-amd64\egg\EGG-INFO
copying pyaf.egg-info\PKG-INFO -> build\bdist.win-amd64\egg\EGG-INFO
copying pyaf.egg-info\SOURCES.txt -> build\bdist.win-amd64\egg\EGG-INFO
copying pyaf.egg-info\dependency_links.txt -> build\bdist.win-amd64\egg\EGG-INFO
copying pyaf.egg-info\requires.txt -> build\bdist.win-amd64\egg\EGG-INFO
copying pyaf.egg-info\top_level.txt -> build\bdist.win-amd64\egg\EGG-INFO
zip_safe flag not set; analyzing archive contents...
creating 'dist\pyaf-1.0-py3.5.egg' and adding 'build\bdist.win-amd64\egg' to it
removing 'build\bdist.win-amd64\egg' (and everything under it)
Processing pyaf-1.0-py3.5.egg
Copying pyaf-1.0-py3.5.egg to c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Adding pyaf 1.0 to easy-install.pth file
Installed c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages\pyaf-1.0-py3.5.egg
Processing dependencies for pyaf==1.0
Searching for SQLAlchemy==1.1.13
Best match: SQLAlchemy 1.1.13
Adding SQLAlchemy 1.1.13 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for pathos==0.2.1
Best match: pathos 0.2.1
Adding pathos 0.2.1 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for dill==0.2.7.1
Best match: dill 0.2.7.1
Adding dill 0.2.7.1 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for pydot==1.2.3
Best match: pydot 1.2.3
Adding pydot 1.2.3 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for matplotlib==2.0.2
Best match: matplotlib 2.0.2
Adding matplotlib 2.0.2 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for sklearn==0.0
Best match: sklearn 0.0
Adding sklearn 0.0 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for pandas==0.20.3
Best match: pandas 0.20.3
Adding pandas 0.20.3 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for scipy==0.19.1
Best match: scipy 0.19.1
Adding scipy 0.19.1 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for ppft==1.6.4.7.1
Best match: ppft 1.6.4.7.1
Adding ppft 1.6.4.7.1 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for pox==0.2.3
Best match: pox 0.2.3
Adding pox 0.2.3 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for multiprocess==0.70.5
Best match: multiprocess 0.70.5
Adding multiprocess 0.70.5 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for pyreadline==2.1
Best match: pyreadline 2.1
Adding pyreadline 2.1 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for pyparsing==2.2.0
Best match: pyparsing 2.2.0
Adding pyparsing 2.2.0 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for cycler==0.10.0
Best match: cycler 0.10.0
Adding cycler 0.10.0 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for pytz==2017.2
Best match: pytz 2017.2
Adding pytz 2017.2 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for python-dateutil==2.6.1
Best match: python-dateutil 2.6.1
Adding python-dateutil 2.6.1 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for six==1.10.0
Best match: six 1.10.0
Adding six 1.10.0 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for numpy==1.13.1+mkl
Best match: numpy 1.13.1+mkl
Adding numpy 1.13.1+mkl to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Searching for scikit-learn==0.19.0
Best match: scikit-learn 0.19.0
Adding scikit-learn 0.19.0 to easy-install.pth file
Using c:\toolkits\anaconda3-4.4.0\envs\python35\lib\site-packages
Finished processing dependencies for pyaf==1.0
(python35) C:\Users\dines\Desktop\pyaf-master>
I think the pyAF library has been installed successfully. Now I try to use it in a Jupyter notebook with the following code:
import datetime
import pandas as pd

# Load the monthly ozone dataset
csvfile_link = "https://raw.githubusercontent.com/antoinecarme/TimeSeriesData/master/ozone-la.csv"
ozone_dataframe = pd.read_csv(csvfile_link)

# Parse the 'Month' column into datetime objects
ozone_dataframe['Month'] = ozone_dataframe['Month'].apply(lambda x: datetime.datetime.strptime(x, "%Y-%m"))
ozone_dataframe.head()

%matplotlib inline
ozone_dataframe.plot.line('Month', ['Ozone'], grid=True, figsize=(12, 8))
Then, when I import the library and run the following:
import pyaf.ForecastEngine as autof

# Train a forecast engine on the ozone series with a 12-step horizon
lEngine = autof.cForecastEngine()
lEngine.train(ozone_dataframe, 'Month', 'Ozone', 12)
I get an error pop-up saying:
The procedure entry point mkl_dnn_BatchNormalizationCreateBackward_v2_F32
could not be located in the dynamic link library C:\Toolkits\anaconda3-
4.4.0\envs\python35\lib\site-packages\numpy\core\mkl_intel_thread.dll
This error pop-up only disappears after I click the OK button three times.
Could anyone please help me figure out what I need to do to run this library successfully? I'm not even sure what the root cause of this issue is. Please help.
Executing import numpy as np and np.show_config() gives the following:
(python35) C:\Users\dines>python
Python 3.5.3 |Anaconda custom (64-bit)| (default, May 15 2017, 10:43:23) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy as np
>>> np.show_config()
lapack_mkl_info:
include_dirs = ['C:/Program Files (x86)/IntelSWTools/compilers_and_libraries_2017/windows/mkl/include']
define_macros = [('SCIPY_MKL_H', None), ('HAVE_CBLAS', None)]
library_dirs = ['C:/Program Files (x86)/IntelSWTools/compilers_and_libraries_2017/windows/mkl/lib/intel64_win']
libraries = ['mkl_lapack95_lp64', 'mkl_blas95_lp64', 'mkl_rt']
blas_mkl_info:
include_dirs = ['C:/Program Files (x86)/IntelSWTools/compilers_and_libraries_2017/windows/mkl/include']
define_macros = [('SCIPY_MKL_H', None), ('HAVE_CBLAS', None)]
library_dirs = ['C:/Program Files (x86)/IntelSWTools/compilers_and_libraries_2017/windows/mkl/lib/intel64_win']
libraries = ['mkl_lapack95_lp64', 'mkl_blas95_lp64', 'mkl_rt']
blas_opt_info:
include_dirs = ['C:/Program Files (x86)/IntelSWTools/compilers_and_libraries_2017/windows/mkl/include']
define_macros = [('SCIPY_MKL_H', None), ('HAVE_CBLAS', None)]
library_dirs = ['C:/Program Files (x86)/IntelSWTools/compilers_and_libraries_2017/windows/mkl/lib/intel64_win']
libraries = ['mkl_lapack95_lp64', 'mkl_blas95_lp64', 'mkl_rt']
lapack_opt_info:
include_dirs = ['C:/Program Files (x86)/IntelSWTools/compilers_and_libraries_2017/windows/mkl/include']
define_macros = [('SCIPY_MKL_H', None), ('HAVE_CBLAS', None)]
library_dirs = ['C:/Program Files (x86)/IntelSWTools/compilers_and_libraries_2017/windows/mkl/lib/intel64_win']
libraries = ['mkl_lapack95_lp64', 'mkl_blas95_lp64', 'mkl_rt']
>>>
(Posting my comments as an answer since this is a confirmed solution.)
The problem is that the locally installed MKL package is not compatible with the pip-installed numpy. The solution is to install numpy with conda install numpy instead of pip install numpy.
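For reference, a minimal sketch of the reinstall inside the activated environment (assuming numpy was originally installed with pip, as in the question; the exact versions conda pulls in may differ):
pip uninstall numpy
conda install numpy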
To find out which MKL library numpy is linked against, run the following Python snippet in the activated conda environment:
import numpy as np
np.show_config()
I have a working conda environment with numpy installed. Notice that the results show the MKL library is part of the conda installation, which guarantees compatibility:
blas_mkl_info:
libraries = ['mkl_intel_lp64', 'mkl_intel_thread', 'mkl_core', 'iomp5', 'pthread']
library_dirs = ['/Users/neurite/Applications/miniconda3/envs/temp/lib']
define_macros = [('SCIPY_MKL_H', None), ('HAVE_CBLAS', None)]
include_dirs = ['/Users/neurite/Applications/miniconda3/envs/temp/include']
...
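As an optional check, conda list numpy should also show where the installed numpy comes from; pip-installed packages typically appear with pypi (or <pip> in older conda versions) in the channel column:
# list numpy and its source channel in the active environment
conda list numpy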
