During a package installation on Windows, an error occurred during which conda (most probably) deleted itself. The conda command is now missing from my machine.
The main error is:
ERROR conda.core.link:_execute(733):
An error occurred while installing package 'defaults::conda-22.11.0-py39haa95532_1'
Question: How can I restore the conda command without deleting and reinstalling all of Anaconda?
I do not want to lose all my installed packages, environments, etc. and reinstall everything. Is there a solution to this?
Details:
The following output was captured with an OCR tool (I could not copy-paste it), so some characters may be mangled:
C:\WINDOWS\system32> conda install -c conda-forge textblob
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: C:\Users\user\anaconda3
added / updated specs:
- textblob
The following packages will be downloaded:
package | build
----------------------------|---------------
conda-22.11.0 | py39haa95532_1 932 KB
openssl-1.1.1s | hcfcfb64_1 5.1 MB conda-forge
ruamel.yaml-0.16.12 | py39h2bbff1b_3 173 KB
ruamel.yaml.clib-0.2.7 | py39ha55989b_0 111 KB conda-forge
------------------------------------------------------
Total: 6.3 MB
The following NEW packages will be INSTALLED:
ruamel.yaml pkgs/main/win-64::ruamel.yaml-0.16.12-py39h2bbff1b_3 None
ruamel.yaml.clib conda-forge/win-64::ruamel.yaml.clib-0.2.7-py39ha55989b_0 None
textblob conda-forge/noarch::textblob-0.15.3-py_0 None
The following packages will be UPDATED:
conda conda-forge::conda-22.9.0-py39hcbf530~ --> pkgs/main::conda-22.11.0-py39haa95532_1 None
openssl 1.1.1s-hcfcfb64_0 --> 1.1.1s-hcfcfb64_1 None
Proceed ([y]/n)? y
Downloading and Extracting Packages
openssl-1.1.1s | 5.1 MB |
ruamel.yaml-0.16.12 | 173 KB
conda-22.11.0 | 932 KB
ruamel.yaml.clib-0.2 | 111 KB |
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
ERROR conda.core.link:_execute(733): An error occurred while installing package 'defaults::conda-22.11.0-py39haa95532_1'.
Rolling back transaction: done
CondaError: Cannot link a source that does not exist.
C:\Users\user\anaconda3\.condatmp\0d55d647-3842-4531-a301-4bed175b9998
Running 'conda clean --packages' may resolve your problem.
()
[Errno 2] No such file or directory: 'C:\\Users\\user\\anaconda3\\conda-meta\\openssl-1.1.1s-hcfcfb64_0.json'
[Errno 2] No such file or directory: 'C:\\Users\\user\\anaconda3\\conda-meta\\conda-22.9.0-py39hcbf5309_2.json'
The batch file cannot be found.
The batch file cannot be found.
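Those two missing JSON records are the core symptom: conda stores its view of an environment as one JSON file per package under `conda-meta`, and the rolled-back transaction left the base environment without records for `conda` and `openssl`. As a first diagnostic step, one can list which package records survive. This is only a hedged sketch (the prefix path is illustrative, and the assumption that each record exposes `name` and `version` keys is from typical installs, not from the question):

```python
import json
from pathlib import Path

def surviving_packages(prefix):
    """Return (name, version) for every package record left in an env's conda-meta."""
    records = []
    for meta in sorted(Path(prefix, "conda-meta").glob("*.json")):
        data = json.loads(meta.read_text())
        records.append((data.get("name"), data.get("version")))
    return records

# Example (adjust the prefix to your own install):
# for name, version in surviving_packages(r"C:\Users\user\anaconda3"):
#     print(name, version)
```

If the `conda` record is among the missing ones, the package manager itself is what the environment no longer knows about, which matches the disappearing `conda` command.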
I want to install Scrapy on Windows Server 2019, running in a Docker container (please see here and here for the history of my installation).
On my local Windows 10 machine I can run my Scrapy commands like so in Windows PowerShell (after simply starting Docker Desktop):
scrapy crawl myscraper -o allobjects.json in folder C:\scrapy\my1stscraper\
For Windows Server, as recommended here, I first installed Anaconda following these steps: https://docs.scrapy.org/en/latest/intro/install.html.
I then opened the Anaconda prompt and typed conda install -c conda-forge scrapy in D:\Programs
(base) PS D:\Programs> dir
Directory: D:\Programs
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 4/22/2021 10:52 AM Anaconda3
-a---- 4/22/2021 11:20 AM 0 conda
(base) PS D:\Programs> conda install -c conda-forge scrapy
Collecting package metadata (current_repodata.json): done
Solving environment: done
==> WARNING: A newer version of conda exists. <==
current version: 4.9.2
latest version: 4.10.1
Please update conda by running
$ conda update -n base -c defaults conda
## Package Plan ##
environment location: D:\Programs\Anaconda3
added / updated specs:
- scrapy
The following packages will be downloaded:
package | build
---------------------------|-----------------
automat-20.2.0 | py_0 30 KB conda-forge
conda-4.10.1 | py38haa244fe_0 3.1 MB conda-forge
constantly-15.1.0 | py_0 9 KB conda-forge
cssselect-1.1.0 | py_0 18 KB conda-forge
hyperlink-21.0.0 | pyhd3deb0d_0 71 KB conda-forge
incremental-17.5.0 | py_0 14 KB conda-forge
itemadapter-0.2.0 | pyhd8ed1ab_0 12 KB conda-forge
parsel-1.6.0 | py_0 15 KB conda-forge
pyasn1-0.4.8 | py_0 53 KB conda-forge
pyasn1-modules-0.2.7 | py_0 60 KB conda-forge
pydispatcher-2.0.5 | py_1 12 KB conda-forge
pyhamcrest-2.0.2 | py_0 29 KB conda-forge
python_abi-3.8 | 1_cp38 4 KB conda-forge
queuelib-1.6.1 | pyhd8ed1ab_0 14 KB conda-forge
scrapy-2.4.1 | py38haa95532_0 372 KB
service_identity-18.1.0 | py_0 12 KB conda-forge
twisted-21.2.0 | py38h294d835_0 5.1 MB conda-forge
twisted-iocpsupport-1.0.1 | py38h294d835_0 49 KB conda-forge
w3lib-1.22.0 | pyh9f0ad1d_0 21 KB conda-forge
------------------------------------------------------------
Total: 9.0 MB
The following NEW packages will be INSTALLED:
automat conda-forge/noarch::automat-20.2.0-py_0
constantly conda-forge/noarch::constantly-15.1.0-py_0
cssselect conda-forge/noarch::cssselect-1.1.0-py_0
hyperlink conda-forge/noarch::hyperlink-21.0.0-pyhd3deb0d_0
incremental conda-forge/noarch::incremental-17.5.0-py_0
itemadapter conda-forge/noarch::itemadapter-0.2.0-pyhd8ed1ab_0
parsel conda-forge/noarch::parsel-1.6.0-py_0
pyasn1 conda-forge/noarch::pyasn1-0.4.8-py_0
pyasn1-modules conda-forge/noarch::pyasn1-modules-0.2.7-py_0
pydispatcher conda-forge/noarch::pydispatcher-2.0.5-py_1
pyhamcrest conda-forge/noarch::pyhamcrest-2.0.2-py_0
python_abi conda-forge/win-64::python_abi-3.8-1_cp38
queuelib conda-forge/noarch::queuelib-1.6.1-pyhd8ed1ab_0
scrapy pkgs/main/win-64::scrapy-2.4.1-py38haa95532_0
service_identity conda-forge/noarch::service_identity-18.1.0-py_0
twisted conda-forge/win-64::twisted-21.2.0-py38h294d835_0
twisted-iocpsuppo~ conda-forge/win-64::twisted-iocpsupport-1.0.1-py38h294d835_0
w3lib conda-forge/noarch::w3lib-1.22.0-pyh9f0ad1d_0
The following packages will be UPDATED:
conda pkgs/main::conda-4.9.2-py38haa95532_0 --> conda-forge::conda-4.10.1-py38haa244fe_0
Proceed ([y]/n)? y
Downloading and Extracting Packages
constantly-15.1.0 | 9 KB | ############################################################################ | 100%
itemadapter-0.2.0 | 12 KB | ############################################################################ | 100%
twisted-21.2.0 | 5.1 MB | ############################################################################ | 100%
pydispatcher-2.0.5 | 12 KB | ############################################################################ | 100%
queuelib-1.6.1 | 14 KB | ############################################################################ | 100%
service_identity-18. | 12 KB | ############################################################################ | 100%
pyhamcrest-2.0.2 | 29 KB | ############################################################################ | 100%
cssselect-1.1.0 | 18 KB | ############################################################################ | 100%
automat-20.2.0 | 30 KB | ############################################################################ | 100%
pyasn1-0.4.8 | 53 KB | ############################################################################ | 100%
twisted-iocpsupport- | 49 KB | ############################################################################ | 100%
python_abi-3.8 | 4 KB | ############################################################################ | 100%
hyperlink-21.0.0 | 71 KB | ############################################################################ | 100%
conda-4.10.1 | 3.1 MB | ############################################################################ | 100%
scrapy-2.4.1 | 372 KB | ############################################################################ | 100%
incremental-17.5.0 | 14 KB | ############################################################################ | 100%
w3lib-1.22.0 | 21 KB | ############################################################################ | 100%
pyasn1-modules-0.2.7 | 60 KB | ############################################################################ | 100%
parsel-1.6.0 | 15 KB | ############################################################################ | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
(base) PS D:\Programs>
In PowerShell on my VPS I then tried to run scrapy via D:\Programs\Anaconda3\Scripts\scrapy.exe
I want to run the spider I have stored in folder D:\scrapy\my1stscraper, see:
The Docker Engine service is running as a Windows Service (presuming I don't need to explicitly start a container when running my scrapy command; if I do, I would not know how):
I tried starting my scraper like so D:\Programs\Anaconda3\Scripts\scrapy.exe crawl D:\scrapy\my1stscraper\spiders\my1stscraper -o allobjects.json, resulting in errors:
Traceback (most recent call last):
File "D:\Programs\Anaconda3\Scripts\scrapy-script.py", line 6, in <module>
from scrapy.cmdline import execute
File "D:\Programs\Anaconda3\lib\site-packages\scrapy\__init__.py", line 12, in <module>
from scrapy.spiders import Spider
File "D:\Programs\Anaconda3\lib\site-packages\scrapy\spiders\__init__.py", line 11, in <module>
from scrapy.http import Request
File "D:\Programs\Anaconda3\lib\site-packages\scrapy\http\__init__.py", line 11, in <module>
from scrapy.http.request.form import FormRequest
File "D:\Programs\Anaconda3\lib\site-packages\scrapy\http\request\form.py", line 10, in <module>
import lxml.html
File "D:\Programs\Anaconda3\lib\site-packages\lxml\html\__init__.py", line 53, in <module>
from .. import etree
ImportError: DLL load failed while importing etree: The specified module could not be found.
I checked here:
from lxml import etree ImportError: DLL load failed: The specified module could not be found
This talks about pip, which I did not use, but to be sure I did install the C++ build tools:
I still get the same error. How can I run my Scrapy crawler in the Docker container?
UPDATE 1
My VPS is my only environment so not sure how to test in a virtual environment.
What I did now:
Uninstall Anaconda
Install Miniconda with Python 3.8 (https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe); I did not add it to PATH and used Miniconda as the system's Python 3.8
Looking at your recommendations:
Get steps to manually install the app on Windows Server - ideally test in a virtualised environment so you can reset it cleanly
When you say app, what do you mean? Scrapy? Conda?
Convert all steps to a fully automatic PowerShell script (e.g. for conda: download the installer via wget, execute the installer, etc.)
I now installed conda on the host OS, since I thought that would give me the least overhead. Or would you install it in the image directly, and if so, how do I avoid having to install it each time?
Lastly, just to check to be sure, I want to run multiple Scrapy scrapers, but I want to do this with as little overhead as possible.
I should just repeat the RUN command in the SAME docker container for each scraper I want to execute, correct?
UPDATE 2
whoami indeed returns user manager\containeradministrator
scrapy benchmark returns
Scrapy 2.4.1 - no active project
Unknown command: benchmark
Use "scrapy" to see available commands
I have the scrapy project I want to run in folder D:\scrapy\my1stscraper, how can I run that project, since D:\ drive is not available within my container?
UPDATE 3
A few months after we discussed this, when I run your proposed Dockerfile it breaks, and I now get this output:
PS D:\Programs> docker build . -t scrapy
Sending build context to Docker daemon 1.644GB
Step 1/9 : FROM mcr.microsoft.com/windows/servercore:ltsc2019
---> d1724c2d9a84
Step 2/9 : SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]
---> Running in 5f79f1bf9b62
Removing intermediate container 5f79f1bf9b62
---> 8bb2a477eaca
Step 3/9 : RUN setx /M PATH $('C:\Users\ContainerAdministrator\miniconda3\Library\bin;C:\Users\ContainerAdministrator\miniconda3\Scripts;C:\Users\ContainerAdministrator\miniconda3;' + $Env:PATH)
---> Running in f3869c4f64d5
SUCCESS: Specified value was saved.
Removing intermediate container f3869c4f64d5
---> 82a2fa969a88
Step 4/9 : RUN Invoke-WebRequest "https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe" -OutFile miniconda3.exe -UseBasicParsing; Start-Process -FilePath 'miniconda3.exe' -Wait -ArgumentList '/S', '/D=C:\Users\ContainerAdministrator\miniconda3'; Remove-Item .\miniconda3.exe; conda install -y -c conda-forge scrapy;
---> Running in 3eb8b7bfe878
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Solving environment: ...working... failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed
UnsatisfiableError: The following specifications were found to be incompatible with the existing python installation in your environment:
Specifications:
- scrapy -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0|3.4.*']
Your python: python=3.9
If python is on the left-most side of the chain, that's the version you've asked for.
When python appears to the right, that indicates that the thing on the left is somehow
not available for the python version you are constrained to. Note that conda will not
change your python version to a different minor version unless you explicitly specify
that.
Not sure if I'm reading this correctly, but it seems as if Scrapy does not support Python 3.9; yet here I see "Scrapy requires Python 3.6+": https://docs.scrapy.org/en/latest/intro/install.html
Do you know what's causing this issue? I also checked here but no answer yet either.
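For what it's worth, the solver message can be read mechanically: the spec line lists every Python range for which a scrapy build exists on the channels, and 3.9 falls in none of them (the latest Miniconda installer had moved to Python 3.9, but the available scrapy builds stopped at 3.8). Below is a rough, simplified sketch of how such a spec string is matched; it is an illustration I wrote for this answer, not conda's real matcher (in particular, pre-release tags like `0a0` are handled crudely):

```python
import re

def vt(v):
    """Crude version tuple: keep the leading digits of each dotted part, pad to 3."""
    nums = [int(m.group()) for p in v.split(".") if (m := re.match(r"\d+", p))]
    return tuple((nums + [0, 0, 0])[:3])

def matches(pyver, spec):
    """Very rough sketch of conda's version-spec matching ('|' = OR, ',' = AND)."""
    for alt in spec.split("|"):
        ok = True
        for clause in alt.split(","):
            if clause.endswith(".*"):
                ok = ok and vt(pyver)[:2] == vt(clause[:-2])[:2]
            elif clause.startswith(">="):
                ok = ok and vt(pyver) >= vt(clause[2:])
            elif clause.startswith("<"):
                ok = ok and vt(pyver) < vt(clause[1:])
            else:
                ok = ok and vt(pyver) == vt(clause)
        if ok:
            return True
    return False

SPEC = ("2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0"
        "|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0|3.4.*")
print(matches("3.8", SPEC))  # True: a scrapy build exists for Python 3.8
print(matches("3.9", SPEC))  # False: no build for 3.9 -> UnsatisfiableError
```

This is why pinning the installer to a Python 3.8 Miniconda build resolves the conflict.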
To run a containerised app, it must be installed in a container image first - you don't want to install any software on the host machine.
For Linux there are off-the-shelf container images for everything, which is probably what your Docker Desktop environment was using; I see 1051 results in a Docker Hub search for scrapy, but none of them are Windows containers.
The full process of creating a windows container from scratch for an app is:
Get steps to manually install the app (scrapy and its dependencies) on Windows Server - ideally test in a virtualised environment so you can reset it cleanly
Convert all steps to a fully automatic PowerShell script (e.g. for conda: download the installer via wget, execute the installer, etc.)
Optionally, test the PowerShell steps in an interactive container
docker run -it --isolation=process mcr.microsoft.com/windows/servercore:ltsc2019 powershell
This runs a windows container and gives you a shell to verify that your install script works
When you exit the shell the container is stopped
Create a Dockerfile
Use mcr.microsoft.com/windows/servercore:ltsc2019 as the base image via FROM
Use the RUN command for each line of your powershell script
I tried installing Scrapy in an existing Windows Dockerfile that used conda / Python 3.6; it threw the error SettingsFrame has no attribute 'ENABLE_CONNECT_PROTOCOL' at a similar stage.
However, I tried again with Miniconda and Python 3.8 and was able to get Scrapy running. Here's the Dockerfile:
FROM mcr.microsoft.com/windows/servercore:ltsc2019
SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]
RUN setx /M PATH $('C:\Users\ContainerAdministrator\miniconda3\Library\bin;C:\Users\ContainerAdministrator\miniconda3\Scripts;C:\Users\ContainerAdministrator\miniconda3;' + $Env:PATH)
RUN Invoke-WebRequest "https://repo.anaconda.com/miniconda/Miniconda3-py38_4.10.3-Windows-x86_64.exe" -OutFile miniconda3.exe -UseBasicParsing; \
Start-Process -FilePath 'miniconda3.exe' -Wait -ArgumentList '/S', '/D=C:\Users\ContainerAdministrator\miniconda3'; \
Remove-Item .\miniconda3.exe; \
conda install -y -c conda-forge scrapy;
Build it with docker build . -t scrapy and run with docker run -it scrapy.
To verify you are running a shell inside the container run whoami - should return user manager\containeradministrator.
Then the scrapy bench command should be able to run and dump some stats.
The container will stop when you close the shell.
I am trying to install the arch package https://pypi.org/project/arch/ using Anaconda.
The suggested install runs fine
(base) C:\Users\john>conda install arch-py -c conda-forge
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: C:\Users\john\anaconda3
added / updated specs:
- arch-py
The following NEW packages will be INSTALLED:
arch-py conda-forge/win-64::arch-py-4.18-py38h294d835_0
cython conda-forge/win-64::cython-0.29.22-py38h885f38d_0
icc_rt pkgs/main/win-64::icc_rt-2019.0.0-h0cc432a_1
patsy conda-forge/noarch::patsy-0.5.1-py_0
property-cached conda-forge/noarch::property-cached-1.6.4-py_0
scipy pkgs/main/win-64::scipy-1.6.1-py38h14eb087_0
statsmodels conda-forge/win-64::statsmodels-0.12.2-py38h347fdf6_0
The following packages will be UPDATED:
certifi pkgs/main::certifi-2020.12.5-py38haa9~ --> conda-forge::certifi-2020.12.5-py38haa244fe_1
The following packages will be SUPERSEDED by a higher-priority channel:
ca-certificates pkgs/main::ca-certificates-2021.1.19-~ --> conda-forge::ca-certificates-2020.12.5-h5b45459_0
conda pkgs/main::conda-4.9.2-py38haa95532_0 --> conda-forge::conda-4.9.2-py38haa244fe_0
openssl pkgs/main::openssl-1.1.1j-h2bbff1b_0 --> conda-forge::openssl-1.1.1j-h8ffe710_0
Proceed ([y]/n)? y
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
(base) C:\Users\john>spyder
Unfortunately, I cannot import the package correctly when I start Spyder.
from arch import arch_model
ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject
What should I do?
Thanks!
The package requires the most recent version of numpy. I tried to remove numpy and reinstall version 1.20.0 (the version needed) without success; Anaconda would stick to 1.19.
Ultimately, I did what I should have done a long time ago: download Miniconda (not Anaconda) and install only the packages I need. That way, there are no annoying conflicts when updating packages with conda!
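The underlying mechanics: arch ships compiled extensions built against a given numpy C-ABI, and importing them under an older numpy raises exactly this ValueError. A trivial sketch of the version comparison involved (the 1.19/1.20 numbers come from the question; the helper names are my own invention):

```python
def version_tuple(v):
    """'1.19.2' -> (1, 19); enough for a major.minor comparison."""
    return tuple(int(p) for p in v.split(".")[:2])

def numpy_too_old(installed, required):
    return version_tuple(installed) < version_tuple(required)

# Per the question: arch needed numpy >= 1.20, but Anaconda held it at 1.19,
# so the compiled extension's struct layout no longer matched at import time.
print(numpy_too_old("1.19.2", "1.20.0"))  # True -> binary-incompatibility error
```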
The default SageMaker Python environments hosted in my work environment have an outdated pandas, and therefore their conda environment must be updated. However, this is incredibly slow (15-30 mins), and I would like to find a faster way to get a working environment.
I update with the following:
!conda update pandas fsspec --yes
This gives the following output; the key problem is an inconsistent starting environment (how?), as shown by
failed with repodata from current_repodata.json, will retry with next repodata source. Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source. Collecting package metadata (repodata.json): done
output:
Collecting package metadata (current_repodata.json): done
Solving environment: /
The environment is inconsistent, please check the package plan carefully
The following packages are causing the inconsistency:
- defaults/linux-64::pandas==1.0.1=py36h0573a6f_0
- defaults/noarch::jupyterlab==1.2.6=pyhf63ae98_0
- defaults/linux-64::scikit-learn==0.22.1=py36hd81dba3_0
- defaults/linux-64::python-language-server==0.31.7=py36_0
- defaults/linux-64::bkcharts==0.2=py36_0
- defaults/linux-64::nb_conda==2.2.1=py36_0
- defaults/noarch::numpydoc==0.9.2=py_0
- defaults/linux-64::pytest-arraydiff==0.3=py36h39e3cac_0
- defaults/linux-64::bottleneck==1.3.2=py36heb32a55_0
- defaults/linux-64::pywavelets==1.1.1=py36h7b6447c_0
- defaults/noarch::pytest-astropy==0.8.0=py_0
- defaults/linux-64::numexpr==2.7.1=py36h423224d_0
- defaults/noarch::anaconda-project==0.8.4=py_0
- defaults/noarch::boto3==1.9.162=py_0
- defaults/linux-64::s3transfer==0.2.1=py36_0
- defaults/linux-64::nbconvert==5.6.1=py36_0
- defaults/linux-64::h5py==2.10.0=py36h7918eee_0
- defaults/linux-64::bokeh==1.4.0=py36_0
- defaults/noarch::jupyterlab_server==1.0.6=py_0
- defaults/linux-64::numpy-base==1.18.1=py36hde5b4d6_1
- defaults/noarch::botocore==1.12.189=py_0
- defaults/linux-64::jupyter==1.0.0=py36_7
- defaults/linux-64::astropy==4.0=py36h7b6447c_0
- defaults/linux-64::patsy==0.5.1=py36_0
- defaults/linux-64::scikit-image==0.16.2=py36h0573a6f_0
- defaults/linux-64::matplotlib-base==3.1.3=py36hef1b27d_0
- defaults/linux-64::imageio==2.6.1=py36_0
- defaults/linux-64::pytables==3.6.1=py36h71ec239_0
- defaults/linux-64::nb_conda_kernels==2.2.4=py36_0
- defaults/linux-64::mkl_fft==1.0.15=py36ha843d7b_0
- defaults/linux-64::statsmodels==0.11.0=py36h7b6447c_0
- defaults/linux-64::spyder==4.0.1=py36_0
- defaults/noarch::seaborn==0.10.0=py_0
- defaults/linux-64::requests==2.22.0=py36_1
- defaults/linux-64::numba==0.48.0=py36h0573a6f_0
- defaults/linux-64::scipy==1.4.1=py36h0b6359f_0
- defaults/noarch::pytest-doctestplus==0.5.0=py_0
- defaults/linux-64::mkl_random==1.1.0=py36hd6b4f25_0
- defaults/noarch::dask==2.11.0=py_0
- defaults/noarch::ipywidgets==7.5.1=py_0
- defaults/linux-64::widgetsnbextension==3.5.1=py36_0
- defaults/noarch::s3fs==0.4.2=py_0
- defaults/linux-64::notebook==6.0.3=py36_0
- defaults/linux-64::matplotlib==3.1.3=py36_0
- defaults/linux-64::anaconda-client==1.7.2=py36_0
- defaults/linux-64::numpy==1.18.1=py36h4f9e942_0
failed with repodata from current_repodata.json, will retry with next repodata source.
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: |
The environment is inconsistent, please check the package plan carefully
The following packages are causing the inconsistency:
- defaults/noarch::jupyterlab==1.2.6=pyhf63ae98_0
- defaults/linux-64::python-language-server==0.31.7=py36_0
- defaults/linux-64::nb_conda==2.2.1=py36_0
- defaults/noarch::numpydoc==0.9.2=py_0
- defaults/noarch::anaconda-project==0.8.4=py_0
- defaults/noarch::boto3==1.9.162=py_0
- defaults/linux-64::s3transfer==0.2.1=py36_0
- defaults/linux-64::nbconvert==5.6.1=py36_0
- defaults/linux-64::bokeh==1.4.0=py36_0
- defaults/noarch::jupyterlab_server==1.0.6=py_0
- defaults/noarch::botocore==1.12.189=py_0
- defaults/linux-64::jupyter==1.0.0=py36_7
- defaults/linux-64::scikit-image==0.16.2=py36h0573a6f_0
- defaults/linux-64::imageio==2.6.1=py36_0
- defaults/linux-64::nb_conda_kernels==2.2.4=py36_0
- defaults/linux-64::spyder==4.0.1=py36_0
- defaults/linux-64::requests==2.22.0=py36_1
- defaults/noarch::dask==2.11.0=py_0
- defaults/noarch::ipywidgets==7.5.1=py_0
- defaults/linux-64::widgetsnbextension==3.5.1=py36_0
- defaults/noarch::s3fs==0.4.2=py_0
- defaults/linux-64::notebook==6.0.3=py36_0
- defaults/linux-64::anaconda-client==1.7.2=py36_0
done
==> WARNING: A newer version of conda exists. <==
current version: 4.8.4
latest version: 4.9.2
Please update conda by running
$ conda update -n base conda
## Package Plan ##
environment location: /home/ec2-user/anaconda3/envs/python3
added / updated specs:
- fsspec
- pandas
- s3fs
The following packages will be downloaded:
package | build
---------------------------|-----------------
astroid-2.4.2 | py36h9f0ad1d_1 297 KB conda-forge
certifi-2020.12.5 | py36h5fab9bb_1 143 KB conda-forge
docutils-0.16 | py36h5fab9bb_3 738 KB conda-forge
pandas-1.1.4 | py36hd87012b_0 10.5 MB conda-forge
pillow-7.1.2 | py36hb39fc2d_0 604 KB
pylint-2.6.0 | py36h9f0ad1d_1 446 KB conda-forge
sphinx-3.4.3 | pyhd8ed1ab_0 1.5 MB conda-forge
toml-0.10.2 | pyhd8ed1ab_0 18 KB conda-forge
urllib3-1.25.11 | py_0 93 KB conda-forge
------------------------------------------------------------
Total: 14.3 MB
The following NEW packages will be INSTALLED:
astroid conda-forge/linux-64::astroid-2.4.2-py36h9f0ad1d_1
bleach conda-forge/noarch::bleach-3.2.1-pyh9f0ad1d_0
brotlipy conda-forge/linux-64::brotlipy-0.7.0-py36he6145b8_1001
docutils conda-forge/linux-64::docutils-0.16-py36h5fab9bb_3
pillow pkgs/main/linux-64::pillow-7.1.2-py36hb39fc2d_0
pylint conda-forge/linux-64::pylint-2.6.0-py36h9f0ad1d_1
sphinx conda-forge/noarch::sphinx-3.4.3-pyhd8ed1ab_0
toml conda-forge/noarch::toml-0.10.2-pyhd8ed1ab_0
urllib3 conda-forge/noarch::urllib3-1.25.11-py_0
The following packages will be UPDATED:
ca-certificates 2020.11.8-ha878542_0 --> 2020.12.5-ha878542_0
certifi 2020.11.8-py36h5fab9bb_0 --> 2020.12.5-py36h5fab9bb_1
fsspec pkgs/main::fsspec-0.6.2-py_0 --> conda-forge::fsspec-0.8.5-pyhd8ed1ab_0
pandas pkgs/main::pandas-1.0.1-py36h0573a6f_0 --> conda-forge::pandas-1.1.4-py36hd87012b_0
Downloading and Extracting Packages
pillow-7.1.2 | 604 KB | ##################################### | 100%
astroid-2.4.2 | 297 KB | ##################################### | 100%
pylint-2.6.0 | 446 KB | ##################################### | 100%
sphinx-3.4.3 | 1.5 MB | ##################################### | 100%
pandas-1.1.4 | 10.5 MB | ##################################### | 100%
docutils-0.16 | 738 KB | ##################################### | 100%
urllib3-1.25.11 | 93 KB | ##################################### | 100%
certifi-2020.12.5 | 143 KB | ##################################### | 100%
toml-0.10.2 | 18 KB | ##################################### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Happy to take any suggestions for how to get a Python notebook up in SageMaker as quickly as possible with modern packages.
Other attempted solutions:
a fast pip install -U doesn't work due to dependency issues -- the local environment in the notebook will try to point pandas to outdated fsspec and it will crash
Following AWS documentation for adding my conda requests to the startup script doesn't work because there is a timeout on the startup script (10 mins I think?) so a 15+ minute conda update process just ensures the sagemaker instance cannot start
The reason for this issue is that conda does dependency checks: it tries to find a version of the package that is compatible with all installed packages, while pip installs the required package and its dependencies directly, which might result in inconsistency. [1]
There are two workarounds for this issue:
Create a custom environment with the required packages and create a kernel to be used from the SageMaker notebook.
Use the --no-deps option: pip install pandas==<version> --no-deps. You might also need the -U option.
To recap, I would suggest either creating a custom environment or using pip with --no-deps to install the package without touching its dependencies. You might need to try both approaches while the notebook is running and then apply the working one to the lifecycle configuration script.
Installing packages just to start running some code is perhaps the hardest part of my job.
Anyway, I tried installing OpenCV for use in an Anaconda Python 3.6 environment, and I get this error:
conda install -c conda-forge opencv
Fetching package metadata ...........
Solving package specifications: ..........
Package plan for installation in environment C:\Program Files\Anaconda3\envs\py36:
The following packages will be downloaded:
package | build
---------------------------|-----------------
libwebp-0.5.2 | vc14_7 1.1 MB conda-forge
opencv-3.2.0 | np112py36_204 92.0 MB conda-forge
------------------------------------------------------------
Total: 93.1 MB
The following NEW packages will be INSTALLED:
libwebp: 0.5.2-vc14_7 conda-forge [vc14]
opencv: 3.2.0-np112py36_204 conda-forge
Proceed ([y]/n)? y
Fetching packages ...
libwebp-0.5.2- 100% |###############################| Time: 0:00:05 213.41 kB/s
opencv-3.2.0-n 100% |###############################| Time: 0:00:48 1.97 MB/s
Extracting packages ...
[ COMPLETE ]|##################################################| 100%
Linking packages ...
PaddingError: Placeholder of length '34' too short in package conda-forge::opencv-3.2.0-np112py36_204.
The package must be rebuilt with conda-build > 2.0.
I am on a Windows system. I do not understand the error, and searching isn't helping.
Any comments or suggestions to resolve the error are welcome.
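For background, the PaddingError comes from how conda relocates packages: binaries are built with a long dummy prefix baked in, and at link time conda overwrites that placeholder with the real install path, padding with NUL bytes so the byte length of the file is unchanged. If the real prefix (here C:\Program Files\Anaconda3\envs\py36) is longer than the placeholder recorded in an old package (34 characters, per the error), the patch cannot fit. A toy illustration follows; the placeholder string and function are invented for the example and are not conda's actual code:

```python
def patch_prefix(data: bytes, placeholder: str, prefix: str) -> bytes:
    """Toy version of conda's build-prefix replacement in binaries."""
    if len(prefix) > len(placeholder):
        # this is the situation conda reports as a PaddingError
        raise ValueError("placeholder too short for the new prefix")
    # pad with NUL bytes so the patched binary keeps its exact byte length
    padded = prefix + "\0" * (len(placeholder) - len(prefix))
    return data.replace(placeholder.encode(), padded.encode())

# demo: a short env prefix fits inside the recorded placeholder
blob = b"prefix=/opt/anaconda_build_placeholder\0more-bytes"
patched = patch_prefix(blob, "/opt/anaconda_build_placeholder", "/opt/py36")
assert len(patched) == len(blob)
```

Packages rebuilt with newer conda-build record a much longer placeholder, which is why the message suggests rebuilding with conda-build > 2.0; installing Anaconda to a shorter path also sidesteps the problem.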
For the record, OpenCV installs fine with pip.
Tested on Windows 10 with Miniconda and Python 3.6:
> pip search opencv
...
opencv-python
...
> pip install opencv-python
Tells me Requirement already satisfied.
To make sure it was correctly installed, run:
> python
>>> import cv2
>>>
Go to the root conda environment and run conda update conda.
Then just import cv2 and use it.