I want to create a Python 3.6 environment. Below are the commands I executed and their output, but as you can see, the Python version reported at the end is 2.7. Why is this? I exited the terminal and opened it again, and it is still version 2.7.
(base) :~/workspace/gem5$ conda create -n rcnn python=3.6
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: /home/cuiyujie/anaconda3/envs/rcnn
added / updated specs:
- python=3.6
The following packages will be downloaded:
package | build
---------------------------|-----------------
certifi-2016.2.28 | py36_0 214 KB http://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
pip-9.0.1 | py36_1 1.7 MB http://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
python-3.6.2 | 0 31.5 MB http://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
setuptools-36.4.0 | py36_1 534 KB http://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
wheel-0.29.0 | py36_0 129 KB http://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
wincertstore-0.2 | py36_0 14 KB http://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
------------------------------------------------------------
Total: 34.1 MB
The following NEW packages will be INSTALLED:
certifi anaconda/pkgs/free/win-64::certifi-2016.2.28-py36_0
pip anaconda/pkgs/free/win-64::pip-9.0.1-py36_1
python anaconda/pkgs/free/win-64::python-3.6.2-0
setuptools anaconda/pkgs/free/win-64::setuptools-36.4.0-py36_1
vc anaconda/pkgs/free/win-64::vc-14-0
vs2015_runtime anaconda/pkgs/free/win-64::vs2015_runtime-14.0.25420-0
wheel anaconda/pkgs/free/win-64::wheel-0.29.0-py36_0
wincertstore anaconda/pkgs/free/win-64::wincertstore-0.2-py36_0
Proceed ([y]/n)? y
Downloading and Extracting Packages
python-3.6.2 | 31.5 MB | ################################################################################################################################ | 100%
wheel-0.29.0 | 129 KB | ################################################################################################################################ | 100%
setuptools-36.4.0 | 534 KB | ################################################################################################################################ | 100%
certifi-2016.2.28 | 214 KB | ################################################################################################################################ | 100%
wincertstore-0.2 | 14 KB | ################################################################################################################################ | 100%
pip-9.0.1 | 1.7 MB | ################################################################################################################################ | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
# $ conda activate rcnn
#
# To deactivate an active environment, use
#
# $ conda deactivate
(base) :~/workspace/gem5$ conda activate rcnn
(rcnn) :~/workspace/gem5$ python --version
Python 2.7.12
(rcnn) :~/workspace/gem5$
At the end of my .bashrc I have export PATH="/home/cuiyujie/anaconda3/bin:$PATH", which should put the Anaconda path at the front. But when I run echo $PATH, the output is /home/cuiyujie/bin:/home/cuiyujie/.local/bin:/home/cuiyujie/anaconda3/bin:/home/cuiyujie/anaconda3/condabin:/usr/local/sbin:. Why does the Anaconda directory end up behind the other entries again?
When I run export PATH="/home/cuiyujie/anaconda3/bin:$PATH" directly in the terminal, echo $PATH gives:
/home/cuiyujie/anaconda3/bin:/home/cuiyujie/bin:/home/cuiyujie/.local/bin:/home/cuiyujie/anaconda3/bin:/home/cuiyujie/anaconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/local/arm_4.4.3/bin
I tried the conda command again and checked the python version
(base) :~$ conda activate rcnn
(rcnn) :~$ python --version
Python 3.8.5
The version is 3.8.5, which is the same as the base installation's Python, not the 3.6 of the environment I created.
(rcnn) :~$ which python
/home/cuiyujie/anaconda3/bin/python
(rcnn) :~$ echo $PATH
/home/cuiyujie/anaconda3/envs/rcnn/bin:/home/cuiyujie/bin:/home/cuiyujie/.local/bin:/home/cuiyujie/anaconda3/bin:/home/cuiyujie/anaconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/local/arm_4.4.3/bin
I am using the Linux version. I am sure I downloaded the Linux installer, Anaconda3-2020.07-Linux-x86_64.sh. But when I looked inside the environment I created, it contained .dll and .exe files, which is very strange. I uninstalled and reinstalled several times, but the result is the same: Anaconda itself is the Linux version, yet the environment it creates is the Windows version.
Did you set alias python=/path/to/python2.7?
Or maybe you can try:
conda deactivate
conda activate rcnn
Check which python you are using by:
which python
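If that still isn't conclusive, here is a slightly fuller check, as a sketch assuming a bash shell (the env name rcnn comes from the question):
type -a python        # reveals aliases and shell functions as well as every python binary found
alias | grep python   # no output means no python alias is defined
which -a python       # all python executables on PATH, in resolution order
python --version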
Anaconda Navigator is for setting up an environment or installing/updating packages in a specific environment of your choice. If you are setting up a new environment, say you previously had an existing 3.5 and are now setting up 3.7, then you will have to install the Spyder kernel. Don't worry: when you start, Spyder will flag this and you just have to install it through Navigator.
[Spyder > Tools > Preferences][1]
[1]: https://i.stack.imgur.com/qoSDV.png
Here in Spyder, your editor, you would typically have the default Python interpreter; change it to the environment of your choice.
[Preferences > Set new Python executable from a specific environment][2]
[2]: https://i.stack.imgur.com/XiQro.png
So when you click Use the following interpreter and open the file browser, you need to browse to the Anaconda environment and select the respective version.
[Select Python Version][3]
[3]: https://i.stack.imgur.com/u4vQm.png
Now click Apply and OK.
You need to restart Spyder for this to take effect the first time.
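If you are unsure which interpreter path to point Spyder at, you can look it up from a terminal first. A sketch (conda run needs a reasonably recent conda; rcnn is the environment name from the question above):
conda env list                                                   # lists environments and their locations
conda run -n rcnn python -c "import sys; print(sys.executable)"  # prints that env's interpreter path
On Linux this will be something like .../envs/rcnn/bin/python, on Windows ...\envs\rcnn\python.exe; that is the path to paste into the Spyder preference dialog.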
It's very simple using Anaconda Navigator. Just choose the right environment and select the Python version from the list. Check the screenshots for more clarity.
I want to install Scrapy on Windows Server 2019, running in a Docker container (please see here and here for the history of my installation).
On my local Windows 10 machine I can run my Scrapy commands like so in Windows PowerShell (after simply starting Docker Desktop):
scrapy crawl myscraper -o allobjects.json in folder C:\scrapy\my1stscraper\
For Windows Server as recommended here I first installed Anaconda following these steps: https://docs.scrapy.org/en/latest/intro/install.html.
I then opened the Anaconda prompt and typed conda install -c conda-forge scrapy in D:\Programs
(base) PS D:\Programs> dir
Directory: D:\Programs
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 4/22/2021 10:52 AM Anaconda3
-a---- 4/22/2021 11:20 AM 0 conda
(base) PS D:\Programs> conda install -c conda-forge scrapy
Collecting package metadata (current_repodata.json): done
Solving environment: done
==> WARNING: A newer version of conda exists. <==
current version: 4.9.2
latest version: 4.10.1
Please update conda by running
$ conda update -n base -c defaults conda
## Package Plan ##
environment location: D:\Programs\Anaconda3
added / updated specs:
- scrapy
The following packages will be downloaded:
package | build
---------------------------|-----------------
automat-20.2.0 | py_0 30 KB conda-forge
conda-4.10.1 | py38haa244fe_0 3.1 MB conda-forge
constantly-15.1.0 | py_0 9 KB conda-forge
cssselect-1.1.0 | py_0 18 KB conda-forge
hyperlink-21.0.0 | pyhd3deb0d_0 71 KB conda-forge
incremental-17.5.0 | py_0 14 KB conda-forge
itemadapter-0.2.0 | pyhd8ed1ab_0 12 KB conda-forge
parsel-1.6.0 | py_0 15 KB conda-forge
pyasn1-0.4.8 | py_0 53 KB conda-forge
pyasn1-modules-0.2.7 | py_0 60 KB conda-forge
pydispatcher-2.0.5 | py_1 12 KB conda-forge
pyhamcrest-2.0.2 | py_0 29 KB conda-forge
python_abi-3.8 | 1_cp38 4 KB conda-forge
queuelib-1.6.1 | pyhd8ed1ab_0 14 KB conda-forge
scrapy-2.4.1 | py38haa95532_0 372 KB
service_identity-18.1.0 | py_0 12 KB conda-forge
twisted-21.2.0 | py38h294d835_0 5.1 MB conda-forge
twisted-iocpsupport-1.0.1 | py38h294d835_0 49 KB conda-forge
w3lib-1.22.0 | pyh9f0ad1d_0 21 KB conda-forge
------------------------------------------------------------
Total: 9.0 MB
The following NEW packages will be INSTALLED:
automat conda-forge/noarch::automat-20.2.0-py_0
constantly conda-forge/noarch::constantly-15.1.0-py_0
cssselect conda-forge/noarch::cssselect-1.1.0-py_0
hyperlink conda-forge/noarch::hyperlink-21.0.0-pyhd3deb0d_0
incremental conda-forge/noarch::incremental-17.5.0-py_0
itemadapter conda-forge/noarch::itemadapter-0.2.0-pyhd8ed1ab_0
parsel conda-forge/noarch::parsel-1.6.0-py_0
pyasn1 conda-forge/noarch::pyasn1-0.4.8-py_0
pyasn1-modules conda-forge/noarch::pyasn1-modules-0.2.7-py_0
pydispatcher conda-forge/noarch::pydispatcher-2.0.5-py_1
pyhamcrest conda-forge/noarch::pyhamcrest-2.0.2-py_0
python_abi conda-forge/win-64::python_abi-3.8-1_cp38
queuelib conda-forge/noarch::queuelib-1.6.1-pyhd8ed1ab_0
scrapy pkgs/main/win-64::scrapy-2.4.1-py38haa95532_0
service_identity conda-forge/noarch::service_identity-18.1.0-py_0
twisted conda-forge/win-64::twisted-21.2.0-py38h294d835_0
twisted-iocpsuppo~ conda-forge/win-64::twisted-iocpsupport-1.0.1-py38h294d835_0
w3lib conda-forge/noarch::w3lib-1.22.0-pyh9f0ad1d_0
The following packages will be UPDATED:
conda pkgs/main::conda-4.9.2-py38haa95532_0 --> conda-forge::conda-4.10.1-py38haa244fe_0
Proceed ([y]/n)? y
Downloading and Extracting Packages
constantly-15.1.0 | 9 KB | ############################################################################ | 100%
itemadapter-0.2.0 | 12 KB | ############################################################################ | 100%
twisted-21.2.0 | 5.1 MB | ############################################################################ | 100%
pydispatcher-2.0.5 | 12 KB | ############################################################################ | 100%
queuelib-1.6.1 | 14 KB | ############################################################################ | 100%
service_identity-18. | 12 KB | ############################################################################ | 100%
pyhamcrest-2.0.2 | 29 KB | ############################################################################ | 100%
cssselect-1.1.0 | 18 KB | ############################################################################ | 100%
automat-20.2.0 | 30 KB | ############################################################################ | 100%
pyasn1-0.4.8 | 53 KB | ############################################################################ | 100%
twisted-iocpsupport- | 49 KB | ############################################################################ | 100%
python_abi-3.8 | 4 KB | ############################################################################ | 100%
hyperlink-21.0.0 | 71 KB | ############################################################################ | 100%
conda-4.10.1 | 3.1 MB | ############################################################################ | 100%
scrapy-2.4.1 | 372 KB | ############################################################################ | 100%
incremental-17.5.0 | 14 KB | ############################################################################ | 100%
w3lib-1.22.0 | 21 KB | ############################################################################ | 100%
pyasn1-modules-0.2.7 | 60 KB | ############################################################################ | 100%
parsel-1.6.0 | 15 KB | ############################################################################ | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
(base) PS D:\Programs>
In PowerShell on my VPS I then tried to run scrapy via D:\Programs\Anaconda3\Scripts\scrapy.exe
I want to run the spider I have stored in the folder D:\scrapy\my1stscraper.
The Docker Engine service is running as a Windows Service (presuming I don't need to explicitly start a container when running my scrapy command; if I do, I would not know how).
I tried starting my scraper like so D:\Programs\Anaconda3\Scripts\scrapy.exe crawl D:\scrapy\my1stscraper\spiders\my1stscraper -o allobjects.json, resulting in errors:
Traceback (most recent call last):
File "D:\Programs\Anaconda3\Scripts\scrapy-script.py", line 6, in <module>
from scrapy.cmdline import execute
File "D:\Programs\Anaconda3\lib\site-packages\scrapy\__init__.py", line 12, in <module>
from scrapy.spiders import Spider
File "D:\Programs\Anaconda3\lib\site-packages\scrapy\spiders\__init__.py", line 11, in <module>
from scrapy.http import Request
File "D:\Programs\Anaconda3\lib\site-packages\scrapy\http\__init__.py", line 11, in <module>
from scrapy.http.request.form import FormRequest
File "D:\Programs\Anaconda3\lib\site-packages\scrapy\http\request\form.py", line 10, in <module>
import lxml.html
File "D:\Programs\Anaconda3\lib\site-packages\lxml\html\__init__.py", line 53, in <module>
from .. import etree
ImportError: DLL load failed while importing etree: The specified module could not be found.
I checked here:
from lxml import etree ImportError: DLL load failed: The specified module could not be found
This talks about pip, which I did not use, but to be sure I did install the C++ build tools.
I still get the same error. How can I run my Scrapy crawler in the Docker container?
UPDATE 1
My VPS is my only environment so not sure how to test in a virtual environment.
What I did now:
Uninstall Anaconda
Install Miniconda with Python 3.8 (https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe); I did not add it to PATH and used Miniconda as the system's Python 3.8
Looking at your recommendations:
Get steps to manually install the app on Windows Server - ideally test in a virtualised environment so you can reset it cleanly
When you say app, what do you mean? Scrapy? Conda?
Convert all steps to a fully automatic PowerShell script (e.g. for conda, you need to download the installer via wget, execute the installer, etc.)
I now installed Conda on the host OS, since I thought that would allow me to have the least amount of overhead. Or would you install it in the image directly and if so, how do I not have to install it each time?
Lastly, just to be sure: I want to run multiple Scrapy scrapers, but I want to do this with as little overhead as possible.
I should just repeat the RUN command in the SAME docker container for each scraper I want to execute, correct?
UPDATE 2
whoami indeed returns user manager\containeradministrator
scrapy benchmark returns
Scrapy 2.4.1 - no active project
Unknown command: benchmark
Use "scrapy" to see available commands
I have the scrapy project I want to run in the folder D:\scrapy\my1stscraper; how can I run that project, since the D:\ drive is not available within my container?
UPDATE 3
A few months after we discussed this, when I run your proposed Dockerfile it breaks, and I now get this output:
PS D:\Programs> docker build . -t scrapy
Sending build context to Docker daemon 1.644GB
Step 1/9 : FROM mcr.microsoft.com/windows/servercore:ltsc2019
---> d1724c2d9a84
Step 2/9 : SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]
---> Running in 5f79f1bf9b62
Removing intermediate container 5f79f1bf9b62
---> 8bb2a477eaca
Step 3/9 : RUN setx /M PATH $('C:\Users\ContainerAdministrator\miniconda3\Library\bin;C:\Users\ContainerAdministrator\miniconda3\Scripts;C:\Users\ContainerAdministrator\miniconda3;' + $Env:PATH)
---> Running in f3869c4f64d5
SUCCESS: Specified value was saved.
Removing intermediate container f3869c4f64d5
---> 82a2fa969a88
Step 4/9 : RUN Invoke-WebRequest "https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe" -OutFile miniconda3.exe -UseBasicParsing; Start-Process -FilePath 'miniconda3.exe' -Wait -ArgumentList '/S', '/D=C:\Users\ContainerAdministrator\miniconda3'; Remove-Item .\miniconda3.exe; conda install -y -c conda-forge scrapy;
---> Running in 3eb8b7bfe878
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Solving environment: ...working... failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed
UnsatisfiableError: The following specifications were found to be incompatible with the existing python installation in your environment:
Specifications:
- scrapy -> python[version='2.7.*|3.5.*|3.6.*|>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|>=3.8,<3.9.0a0|>=3.5,<3.6.0a0|3.4.*']
Your python: python=3.9
If python is on the left-most side of the chain, that's the version you've asked for.
When python appears to the right, that indicates that the thing on the left is somehow
not available for the python version you are constrained to. Note that conda will not
change your python version to a different minor version unless you explicitly specify
that.
I'm not sure if I'm reading this correctly, but it seems as if Scrapy does not support Python 3.9, yet here I see "Scrapy requires Python 3.6+": https://docs.scrapy.org/en/latest/intro/install.html
Do you know what's causing this issue? I also checked here, but there is no answer yet either.
To run a containerised app, it must be installed in a container image first - you don't want to install any software on the host machine.
For Linux there are off-the-shelf container images for everything, which is probably what your Docker Desktop environment was using; I see 1051 results on a Docker Hub search for scrapy, but none of them are Windows containers.
The full process of creating a Windows container from scratch for an app is:
Get steps to manually install the app (scrapy and its dependencies) on Windows Server - ideally test in a virtualised environment so you can reset it cleanly
Convert all steps to a fully automatic PowerShell script (e.g. for conda, you need to download the installer via wget, execute the installer, etc.)
Optionally, test the PowerShell steps in an interactive container
docker run -it --isolation=process mcr.microsoft.com/windows/servercore:ltsc2019 powershell
This runs a Windows container and gives you a shell to verify that your install script works.
When you exit the shell, the container is stopped.
Create a Dockerfile
Use mcr.microsoft.com/windows/servercore:ltsc2019 as the base image via FROM
Use the RUN command for each line of your powershell script
I tried installing scrapy with an existing Windows Dockerfile that used conda / Python 3.6; it threw the error SettingsFrame has no attribute 'ENABLE_CONNECT_PROTOCOL' at a similar stage.
However, I tried again with Miniconda and Python 3.8 and was able to get scrapy running; here's the Dockerfile:
FROM mcr.microsoft.com/windows/servercore:ltsc2019
SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]
RUN setx /M PATH $('C:\Users\ContainerAdministrator\miniconda3\Library\bin;C:\Users\ContainerAdministrator\miniconda3\Scripts;C:\Users\ContainerAdministrator\miniconda3;' + $Env:PATH)
RUN Invoke-WebRequest "https://repo.anaconda.com/miniconda/Miniconda3-py38_4.10.3-Windows-x86_64.exe" -OutFile miniconda3.exe -UseBasicParsing; \
Start-Process -FilePath 'miniconda3.exe' -Wait -ArgumentList '/S', '/D=C:\Users\ContainerAdministrator\miniconda3'; \
Remove-Item .\miniconda3.exe; \
conda install -y -c conda-forge scrapy;
Build it with docker build . -t scrapy and run with docker run -it scrapy.
To verify you are running a shell inside the container, run whoami; it should return user manager\containeradministrator.
Then the scrapy bench command should be able to run and dump some stats.
The container will stop when you close the shell.
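Regarding running an actual project (UPDATE 2): one option, sketched here and not tested, is to bind-mount the host folder into the container when you start it and run the spider from there; the paths and the spider name my1stscraper are taken from the question:
docker run -it --isolation=process -v D:\scrapy\my1stscraper:C:\my1stscraper scrapy powershell
# then, inside the container:
cd C:\my1stscraper
scrapy crawl my1stscraper -o allobjects.json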
Windows 10
conda 4.9.2 (via miniconda)
I installed a single package that did not require any other dependencies to be installed anew or upgraded. Once I realised that I had installed an unsuitable version of the package, I went to remove it, and this is the screen I was presented with:
(pydata) PS C:\Users\Navneeth> conda remove xlrd
Collecting package metadata (repodata.json): done
Solving environment: |
Warning: 2 possible package resolutions (only showing differing packages):
- defaults/win-64::libtiff-4.1.0-h56a325e_1, defaults/win-64::zstd-1.4.9-h19a0ad4_0
- defaults/win-64::libtiff-4.2.0-hd0e1b90_0, defaults/win-64::zstd-1.4.5-h04227a9done
## Package Plan ##
environment location: C:\Users\Navneeth\Miniconda3\envs\pydata
removed specs:
- xlrd
The following packages will be downloaded:
package | build
---------------------------|-----------------
decorator-5.0.3 | pyhd3eb1b0_0 12 KB
importlib-metadata-3.7.3 | py38haa95532_1 31 KB
importlib_metadata-3.7.3 | hd3eb1b0_1 11 KB
ipython-7.22.0 | py38hd4e2768_0 998 KB
jupyter_client-6.1.12 | pyhd3eb1b0_0 88 KB
libtiff-4.1.0 | h56a325e_1 739 KB
nbformat-5.1.3 | pyhd3eb1b0_0 44 KB
notebook-6.3.0 | py38haa95532_0 4.4 MB
pandoc-2.12 | haa95532_0 13.2 MB
parso-0.8.2 | pyhd3eb1b0_0 69 KB
pillow-8.2.0 | py38h4fa10fc_0 671 KB
prometheus_client-0.10.0 | pyhd3eb1b0_0 46 KB
prompt-toolkit-3.0.17 | pyh06a4308_0 256 KB
terminado-0.9.4 | py38haa95532_0 26 KB
zipp-3.4.1 | pyhd3eb1b0_0 15 KB
zstd-1.4.9 | h19a0ad4_0 478 KB
------------------------------------------------------------
Total: 21.0 MB
The following packages will be REMOVED:
xlrd-2.0.1-pyhd3eb1b0_0
The following packages will be UPDATED:
decorator 4.4.2-pyhd3eb1b0_0 --> 5.0.3-pyhd3eb1b0_0
importlib-metadata pkgs/main/noarch::importlib-metadata-~ --> pkgs/main/win-64::importlib-metadata-3.7.3-py38haa95532_1
importlib_metadata 2.0.0-1 --> 3.7.3-hd3eb1b0_1
ipython 7.21.0-py38hd4e2768_0 --> 7.22.0-py38hd4e2768_0
jupyter_client 6.1.7-py_0 --> 6.1.12-pyhd3eb1b0_0
nbformat 5.1.2-pyhd3eb1b0_1 --> 5.1.3-pyhd3eb1b0_0
notebook 6.2.0-py38haa95532_0 --> 6.3.0-py38haa95532_0
pandoc 2.11-h9490d1a_0 --> 2.12-haa95532_0
parso 0.8.1-pyhd3eb1b0_0 --> 0.8.2-pyhd3eb1b0_0
pillow 8.1.2-py38h4fa10fc_0 --> 8.2.0-py38h4fa10fc_0
prometheus_client 0.9.0-pyhd3eb1b0_0 --> 0.10.0-pyhd3eb1b0_0
prompt-toolkit 3.0.8-py_0 --> 3.0.17-pyh06a4308_0
sqlite 3.33.0-h2a8f88b_0 --> 3.35.3-h2bbff1b_0
terminado 0.9.2-py38haa95532_0 --> 0.9.4-py38haa95532_0
zipp 3.4.0-pyhd3eb1b0_0 --> 3.4.1-pyhd3eb1b0_0
zstd 1.4.5-h04227a9_0 --> 1.4.9-h19a0ad4_0
The following packages will be DOWNGRADED:
libtiff 4.2.0-he0120a3_0 --> 4.1.0-h56a325e_1
Proceed ([y]/n)?
Why does conda want to update or downgrade all these other packages when the opposite wasn't done when I installed xlrd? Is there a way that I can safely remove just xlrd? (I hear using --force is risky.)
Asymmetry
Conda re-solves when removing. When installing, Conda first attempts a frozen solve, which amounts to keeping all installed packages fixed and just searching for a version of the requested package(s) that is compatible. In this specific case, xlrd (v2.1.0) is a noarch package with only a python>=3.6 constraint, so it installs during this frozen solve pass.
The constraint xlrd will also be added to the explicit specifications.[1]
When removing, Conda first removes the constraint and then re-solves the environment with the new set of explicit specifications. It is in this solve that Conda identifies that newer versions of packages are available and proposes updating them.
So, the asymmetry is that the frozen solve explicitly avoids checking for any new packages, but the removal will trigger such a check. There is not currently a way to avoid this without bypassing dependency checking.
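For reference, the bypass looks like the sketch below; it skips the solver entirely, so no other packages are touched, but Conda will not warn you if something still depends on the package being removed:
conda remove --force xlrd    # run inside the activated environment; no dependency checking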
Mamba
Actually, mamba, a compiled (fast!) drop-in replacement for conda, will remove only the specified package if it doesn't have anything depending on it. That is its default behavior in my testing.
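A sketch of that route, installing mamba into the base environment and then using it against the environment from the question:
conda install -n base -c conda-forge mamba
conda activate pydata
mamba remove xlrd            # in the testing described above, only xlrd is removed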
Addendum: Still Some Unexplained Behavior
I replicated your experience by first creating an environment with two specs:
name: foo
channels:
- conda-forge
dependencies:
- python=3.8.0
- pip=20
To simulate this being an old environment, I went into envs/foo/conda-meta/history and changed[2] the line
# update specs: ['pip=20', 'python=3.8.0']
to
# update specs: ['python=3.8']
Subsequently, running conda install xlrd behaves as expected. Then conda remove xlrd gives a somewhat odd result:
## Package Plan ##
environment location: /opt/conda/envs/foo
removed specs:
- xlrd
The following packages will be downloaded:
package | build
---------------------------|-----------------
pip-21.1.1 | pyhd8ed1ab_0 1.1 MB conda-forge
------------------------------------------------------------
Total: 1.1 MB
The following packages will be REMOVED:
xlrd-2.0.1-pyhd8ed1ab_3
The following packages will be UPDATED:
pip 20.3.4-pyhd8ed1ab_0 --> 21.1.1-pyhd8ed1ab_0
Proceed ([y]/n)?
This effectively replicates the OP's result; however, the additional oddity here is that the python package is not suggested for updating, even though I had intentionally loosened its constraint from 3.8.0 to 3.8. It appears that only packages not in the explicit specifications are subject to updating during package removal.
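To see which specs Conda currently treats as explicit, the command from footnote [1] can be used; for the foo environment above, just after installing xlrd, the output would look roughly like this (illustrative):
$ conda env export --from-history
name: foo
channels:
  - conda-forge
dependencies:
  - python=3.8
  - xlrd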
[1] The explicit specifications are the internally maintained records that Conda keeps of every constraint a user has explicitly specified. One can view the current explicit specifications of an environment with conda env export --from-history. The raw internal records can be found at yourenv/conda-meta/history.
[2] Not a recommended practice!
I'd like to use PyTorch in a Python program. The instructions for installing it require conda. After installing Conda I ran:
>conda install -c pytorch pytorch (as instructed on the PyTorch page)
It looked promising -- until the end.
Solving environment: done
## Package Plan ##
environment location: C:\ProgramData\Miniconda3
added / updated specs:
- pytorch
The following packages will be downloaded:
package | build
---------------------------|-----------------
icc_rt-2017.0.4 | h97af966_0 8.0 MB
vs2015_runtime-15.5.2 | 3 2.2 MB
pytorch-0.4.0 |py36_cuda80_cudnn7he774522_1 529.2 MB pytorch
mkl-2018.0.3 | 1 178.1 MB
numpy-1.14.5 | py36h9fa60d3_4 35 KB
intel-openmp-2018.0.3 | 0 1.7 MB
numpy-base-1.14.5 | py36h5c71026_4 3.8 MB
vc-14.1 | h0510ff6_3 5 KB
blas-1.0 | mkl 6 KB
conda-4.5.8 | py36_0 1.0 MB
mkl_fft-1.0.2 | py36hb217b18_0 113 KB
mkl_random-1.0.1 | py36h77b88f5_1 268 KB
------------------------------------------------------------
Total: 724.4 MB
The following NEW packages will be INSTALLED:
blas: 1.0-mkl
icc_rt: 2017.0.4-h97af966_0
intel-openmp: 2018.0.3-0
mkl: 2018.0.3-1
mkl_fft: 1.0.2-py36hb217b18_0
mkl_random: 1.0.1-py36h77b88f5_1
numpy: 1.14.5-py36h9fa60d3_4
numpy-base: 1.14.5-py36h5c71026_4
pytorch: 0.4.0-py36_cuda80_cudnn7he774522_1 pytorch
The following packages will be UPDATED:
conda: 4.5.4-py36_0 --> 4.5.8-py36_0
vc: 14-h0510ff6_3 --> 14.1-h0510ff6_3
vs2015_runtime: 14.0.25123-3 --> 15.5.2-3
Proceed ([y]/n)? y
Downloading and Extracting Packages
icc_rt-2017.0.4 | 8.0 MB | ############################################################################## | 100%
vs2015_runtime-15.5. | 2.2 MB | ############################################################################## | 100%
pytorch-0.4.0 | 529.2 MB | ############################################################################# | 100%
mkl-2018.0.3 | 178.1 MB | ############################################################################# | 100%
numpy-1.14.5 | 35 KB | ############################################################################## | 100%
intel-openmp-2018.0. | 1.7 MB | ############################################################################## | 100%
numpy-base-1.14.5 | 3.8 MB | ############################################################################## | 100%
vc-14.1 | 5 KB | ############################################################################## | 100%
blas-1.0 | 6 KB | ############################################################################## | 100%
conda-4.5.8 | 1.0 MB | ############################################################################## | 100%
mkl_fft-1.0.2 | 113 KB | ############################################################################## | 100%
mkl_random-1.0.1 | 268 KB | ############################################################################## | 100%
Preparing transaction: done
Verifying transaction: done
But then this.
Executing transaction: failed
ERROR conda.core.link:_execute(502): An error occurred while uninstalling package 'defaults::conda-4.5.4-py36_0'.
PermissionError(13, 'Access is denied')
Attempting to roll back.
Rolling back transaction: done
PermissionError(13, 'Access is denied')
Apparently it was at least partly installed because PyCharm was able to see it. But when I asked PyCharm to install it in an environment, I got this error.
RuntimeError: PyTorch does not currently provide packages for PyPI (see status at https://github.com/pytorch/pytorch/issues/566).
Please follow the instructions at http://pytorch.org/ to install with miniconda instead.
It suggests an alternative way to install PyTorch. So I tried that.
>conda install pytorch torchvision -c pytorch
Solving environment: failed
PackagesNotFoundError: The following packages are not available from current channels:
- torchvision
Current channels:
- https://conda.anaconda.org/pytorch/win-64
- https://conda.anaconda.org/pytorch/noarch
- https://repo.anaconda.com/pkgs/main/win-64
- https://repo.anaconda.com/pkgs/main/noarch
- https://repo.anaconda.com/pkgs/free/win-64
- https://repo.anaconda.com/pkgs/free/noarch
- https://repo.anaconda.com/pkgs/r/win-64
- https://repo.anaconda.com/pkgs/r/noarch
- https://repo.anaconda.com/pkgs/pro/win-64
- https://repo.anaconda.com/pkgs/pro/noarch
- https://repo.anaconda.com/pkgs/msys2/win-64
- https://repo.anaconda.com/pkgs/msys2/noarch
To search for alternate channels that may provide the conda package you're
looking for, navigate to
https://anaconda.org
and use the search bar at the top of the page.
But when I do that and search for PyTorch, I eventually find myself back at the original instructions.
When I search for Torchvision, no Windows versions are listed.
Try the following steps in Windows:
Create a virtual environment using the command:
conda create -n py_env python=3.5
conda activate py_env
conda install pytorch-cpu -c pytorch
pip install torchvision
Note: You can use any name instead of py_env
What is your platform?
For your first installation method, the error message says that you don't have permission. I encountered that error before on a Linux system. The reason was that Anaconda was installed by another user. I had configured the path to point python to that installation so that I could run python without installing my own copy of Anaconda. However, it didn't permit me to install new packages, and I got the same error message.
Solution: I installed my own copy of Anaconda and everything worked.
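For what it's worth, a per-user install on Linux can be scripted like this (a sketch using Miniconda; -b makes the installer non-interactive and -p picks a prefix inside your home directory, so no admin rights are needed):
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash Miniconda3-latest-Linux-x86_64.sh -b -p "$HOME/miniconda3"
"$HOME/miniconda3/bin/conda" init bash    # then open a new shell
conda install -c pytorch pytorch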
just run:
pip install torch torchvision
An alternative way to install PyTorch is with the following steps:
conda create -n pytorch_env python=3
source activate pytorch_env
conda install pytorch-cpu torchvision -c pytorch
Go to the Python shell and import it using the command
import torch
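A quick sanity check once the import works (a sketch; cuda.is_available() is expected to print False for the CPU-only build installed above):
python -c "import torch; print(torch.__version__); print(torch.cuda.is_available())"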
Open the terminal in administrator mode, and if you are on Linux try
sudo pip install "your package name"
Installing packages to start running some code is perhaps the hardest part of my job.
Anyway, I tried installing OpenCV for use in an Anaconda Python 3.6 environment, and I get the error below:
conda install -c conda-forge opencv
Fetching package metadata ...........
Solving package specifications: ..........
Package plan for installation in environment C:\Program Files\Anaconda3\envs\py36:
The following packages will be downloaded:
package | build
---------------------------|-----------------
libwebp-0.5.2 | vc14_7 1.1 MB conda-forge
opencv-3.2.0 | np112py36_204 92.0 MB conda-forge
------------------------------------------------------------
Total: 93.1 MB
The following NEW packages will be INSTALLED:
libwebp: 0.5.2-vc14_7 conda-forge [vc14]
opencv: 3.2.0-np112py36_204 conda-forge
Proceed ([y]/n)? y
Fetching packages ...
libwebp-0.5.2- 100% |###############################| Time: 0:00:05 213.41 kB/s
opencv-3.2.0-n 100% |###############################| Time: 0:00:48 1.97 MB/s
Extracting packages ...
[ COMPLETE ]|##################################################| 100%
Linking packages ...
PaddingError: Placeholder of length '34' too short in package conda-forge::opencv-3.2.0-np112py36_204.
The package must be rebuilt with conda-build > 2.0.
I am on a Windows system. I do not understand the error, and searching isn't helping.
Any comments or suggestions to resolve the error are welcome.
For the record, OpenCV installs fine with pip.
Tested on Windows 10 with Miniconda and Python 3.6:
> pip search opencv
...
opencv-python
...
> pip install opencv-python
Tells me Requirement already satisfied.
To make sure it was correctly installed, run:
> python
>>> import cv2
>>>
Go to the root conda environment and run conda update conda. Then just import cv2 and use it.
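In command form that amounts to something like the following sketch (on very old conda versions the base environment is activated with activate root instead):
conda activate base
conda update conda
conda install -c conda-forge opencv    # then retry the original install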
Using conda to update conda, and then anaconda, seems to operate on two different installs of Python, and the two therefore seem to work against one another. For example:
My-MacBook-Pro:~ me$ conda update conda
Error: unknown host: http://repo.continuum.io/pkgs/pro/osx-64/
Package plan for installation in environment /Users/myname/anaconda:
The following packages will be UN-linked:
package | build
---------------------------|-----------------
python-2.7.5 | 3
readline-6.2 | 1
The following packages will be linked:
package | build
---------------------------|-----------------
python-2.7.6 | 1 hard-link
readline-6.2 | 2 hard-link
Proceed ([y]/n)? y
Unlinking packages ...
[ COMPLETE ] |##################################################| 100%
Linking packages ...
[ COMPLETE ] |##################################################| 100%
My-MacBook-Pro:~ me$ conda update anaconda
Package plan for installation in environment /Users/myname/anaconda:
The following packages will be UN-linked:
package | build
---------------------------|-----------------
python-2.7.6 | 1
readline-6.2 | 2
The following packages will be linked:
package | build
---------------------------|-----------------
python-2.7.5 | 3 hard-link
readline-6.2 | 1 hard-link
Proceed ([y]/n)? y
Unlinking packages ...
[ COMPLETE ] |##################################################| 100%
Linking packages ...
[ COMPLETE ] |##################################################| 100%
My-MacBook-Pro:~ me$ conda update conda
Package plan for installation in environment /Users/myname/anaconda:
The following packages will be UN-linked:
package | build
---------------------------|-----------------
python-2.7.5 | 3
readline-6.2 | 1
The following packages will be linked:
package | build
---------------------------|-----------------
python-2.7.6 | 1 hard-link
readline-6.2 | 2 hard-link
Proceed ([y]/n)?
Any recommendations for how to deal with this? It's not a huge issue at the moment, functionality-wise.
This happens because the version of Python in the latest version of Anaconda is 2.7.5. When a new version of Anaconda is released (which should happen this week), this problem will go away.
The anaconda package is a stable set of package versions that have been tested against one another. Hence, installing or updating anaconda may downgrade Python, because that is the version that is in Anaconda.
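One way to see that pinning for yourself is to inspect the metapackage's dependency list, for example (a sketch; the version number is only an illustration, use whatever conda list reports):
conda list anaconda                    # shows which anaconda metapackage release is installed
conda search --info "anaconda=1.8.0"   # example version; lists the exact package versions it pins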