I use Miniconda to manage my Python environments on Windows 10. Additionally, I use software called ESRI ArcGIS Pro that comes bundled with its own versions of conda and Python, which are somewhat modified to work with their software. I must use ESRI's conda to manage environments that interact with this application.
I have this same setup on both my laptop and desktop, and until recently had no issues. However, ESRI's conda recently stopped working on my laptop. Any conda command (e.g. conda list, conda info --envs, conda create -n myenv, even just conda by itself) produces no output whatsoever. At first I suspected that PATH was set incorrectly, but I've checked that this is not the case (even calling ESRI's conda.exe with a full path still does not work). I then suspected that the conda.exe file itself was corrupted, but this also is not the case (I copied it to my desktop and it works fine there).
I suspect it may have something to do with my separate Miniconda installation. It doesn't seem to be an issue of environment variables being set incorrectly (again, checked against the working system), but I'm wondering if there is any possibility that Registry entries (perhaps set by my Miniconda install) could be causing this issue?
Any thoughts on why this might be the case? Or advice on how to proceed with diagnosing the issue?
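In case it helps anyone reproduce the check, these are the Registry locations I intend to look at from a cmd prompt (just the usual keys that Python installers write to; I am not certain conda uses any others):
reg query "HKCU\Software\Python" /s
reg query "HKLM\SOFTWARE\Python" /s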
EDIT:
Per merv's request, my conda environment variables:
CONDA_DEFAULT_ENV=C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3
CONDA_PREFIX=C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3
CONDA_PS1_BACKUP=$P$G
Clearly these paths are different than normal due to the custom distribution.
To answer your other questions: no other conda commands generate any output whatsoever. As for activate, I don't have any other environments to try activating (the arcgispro-py3 env you see above is the name of the 'base' env that ships with the software), but deactivate seems to work. Another slight difference to mention is that conda activate ... is not a command in this special conda; you have to use activate by itself, which AFAICT calls a shell script.
TL/DR: Is there really no way to just tell jupyter console to run in some conda environment, without first unnecessarily installing (and hence depending on) Jupyter in that environment?
I really did try to make this not appear entirely like a rant... I hope you see there is an actual question here.
It seems like getting Jupyter to work with conda environments requires either
Installing a new Jupyter in every conda environment you want to use, or
Installing ipykernel in every conda environment you want to use (which depends on the jupyter package...), and creating a new kernel from within the environment (sketched just below).
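For reference, option 2 boils down to something like this (myenv is just a placeholder name):
conda activate myenv
conda install ipykernel
python -m ipykernel install --user --name myenv --display-name "Python (myenv)"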
I find this a bit astonishing, since I do not think of Jupyter as a requirement of the project, but rather as just another editor/IDE-like thing, making use of environments. Conda's purpose is to manage reproducible dependencies; Jupyter's should be to interpret code within the environment I tell it. Since I'd like to store the environment.yml in git and share it with others, I see no purpose in also requiring them to install Jupyter; they might not even use it.
Yet, it seems not to work that way at all. It feels as if, to use Emacs with an environment, I'd have to install an "emacskernel" package into every environment. That's not how it works.
What I would like is to have one globally installed Jupyter, which can just be pointed to different environments -- similar to how the Julia REPL works with julia --project=... (yeah, I know that conda is not a built-into-the-language package manager, but you should get the analogy...). (This would kinda work if conda environments "inherited", i.e., fell back to the "global environment" for dependencies they don't have, and you could just use the global Jupyter from within each one; but as I understand it, they don't?)
Is this possible at all? What am I missing? Are there any better alternatives providing global Jupyter + local environments? (I must admit I have never used virtualenv or the like...)
(This older question seems to cover the same topic for pipenv, but there's no real answer there... neither a definite NO nor an explanation why.)
I want to install Eric with pip into a Linux environment set up with pyenv. The program works, but I cannot see my packages, and I cannot figure out what the problem is.
Here is what I did
I installed Eric into a virtualenv "eric6" using pip install eric-ide. That worked fine.
I run Eric from a direct link to the executable in the .pyenv folder.
I add a new virtualenv
pyenv virtualenv 3.6.9 default
pyenv activate default
pip install numpy
Now start Eric, run a program containing the line import numpy, and you will get a ModuleNotFoundError. Programs with no external modules work fine.
The PyPI window does not show the same list for "default" as pip list does on the command line.
What could be the problem?
Eric-IDE is a great and fully open-source environment for Python. It has stunningly comfortable and widely configurable features and offers a perfect workflow while being rather intuitive to use. Really a full-blown IDE. Although I am not using Qt (where it offers even more integration features), I am really glad to have resolved the major showstopper I came across, namely issues with pyenv. I found the solution to my problems and can now answer my own question. Maybe it is useful for others.
Eric-IDE can be used well with pyenv in Linux.
Install Eric into its own virtualenv, and your programs into others. At least there is no need to install Eric into the OS's system Python.
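Roughly, with pyenv-virtualenv that looks like this (the Python version is just the one from my question; pick whatever you need):
pyenv virtualenv 3.6.9 eric6
pyenv activate eric6
pip install eric-ide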
Eric has two dialogs in the "extra" menu dealing with virtual envs, and one PyPI window where you can list the installed packages per virtual env.
The "Manager"
Eric only recognizes one virtualenv automatically: the one where Eric itself was installed.
It has a default name, but you may change it; I change it to "eric" in my installations to avoid confusion. There is one setting in the related Edit dialog saying "Global Environment". For the virtualenv where Eric is installed, this is checked and it works, even though it is actually not the global (= system) environment. Maybe they will change that in versions after 20.5, because it would be more logical if that checkbox were unchecked.
Use the Add button to enter the data of other virtual environments you have on your system. It matters which folders you enter, as Eric might not see your packages if you enter the wrong folder (e.g. the wrong folder for the "Interpreter"). The following is what is correct for a standard pyenv virtualenv created with "pyenv virtualenv 2.7.18 pythonLegacy" on Ubuntu 18.04. If you point the Interpreter path to the actual Python installation, Eric will not see your packages. It only works if you point to the symlink as in the example.
Logical Name: pythonLegacy
Directory: /home/user/.pyenv/versions/pythonLegacy/lib/python2.7/site-packages
Python Interpreter: /home/user/.pyenv/versions/pythonLegacy/bin/python2.7
All options: unchecked
PATH Prefix: <empty>
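Before entering the values, it is worth confirming that both paths actually exist (pythonLegacy as in the example above):
ls -ld ~/.pyenv/versions/pythonLegacy/bin/python2.7
ls -ld ~/.pyenv/versions/pythonLegacy/lib/python2.7/site-packages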
If you add the OS system environment, you must check the "Global Environment" checkbox. It will then work as expected.
The "Configuration"
The name is misleading, because this dialog actually allows you to create a new virtual env from within the Eric IDE.
Please be careful what you enter, because it will write (or overwrite) data in your pyenv folder. As there is no documentation for this dialog, it is a good idea to back up your ~/.pyenv (or wherever your pyenv home folder is) before testing which settings it needs.
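For the backup, a plain copy is enough, e.g.:
cp -a ~/.pyenv ~/.pyenv.backup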
PS: I would hope more people start using this, so the default Ubuntu repositories would start upgrading the package. Currently it is not among the maintained ones.
TLDR: How should miniconda environments be added to the environment variables such that multiple conda environments will run with minimal fuss?
Long Story/Background
I'm on Windows 10 and got fed up with trying to use Python directly, so I decided to give Miniconda a go. I'm running Python 3.8 with the main installed package being NumPy. Everything was fine in the console, but PyCharm threw the classic "Importing the numpy c-extensions failed" error. After trying reinstallation, I found another question whose author had gotten it to work by adding more folders to the system path. This worked only when the additional library paths
C:\Users\USERNAME\.conda\envs\num38
C:\Users\USERNAME\.conda\envs\num38\DLLs
C:\Users\USERNAME\.conda\envs\num38\Lib
C:\Users\USERNAME\.conda\envs\num38\Library
C:\Users\USERNAME\.conda\envs\num38\Library\bin
C:\Users\USERNAME\.conda\envs\num38\Scripts
were added directly to the system path, not via a secondary path variable, i.e. %num38_path%. I also tried the secondary path as a runtime environment variable for the run configuration in PyCharm, but that didn't work either.
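As a quick sanity check outside PyCharm, something like this in a plain cmd window shows whether the DLLs are found (paths as listed above; USERNAME is a placeholder):
set PATH=C:\Users\USERNAME\.conda\envs\num38;C:\Users\USERNAME\.conda\envs\num38\Library\bin;%PATH%
python -c "import numpy; print(numpy.__version__)"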
Why doesn't this secondary path method work?
I'm currently only using this one virtual environment, but if, in the future, I want to have another conda environment, would these paths being in system path be an issue?
Is it correct to expect a Conda environment to provide complete isolation and containment for pip/pipenv usage?
Let's say I create and activate a Conda environment and name it "pip-pip", then proceed with my project, which uses pipenv, while completely ignoring the fact that this is happening with a Conda environment activated.
Will all traces of that pipenv project be contained in "pip-pip", or is there a possibility of a spillover?
Will the fact that pip/pipenv is used from within "pip-pip" negatively affect the experience in any way?
This arrangement should work fine, as long as your shell and environment variables are configured correctly.
If you try to activate the Pipenv-managed environment without the "pip-pip" Conda environment active, you might see breakage or other unpredictable behavior, since Pipenv was installed with one Python and is being run with another. The extent of the breakage depends on the implementation details of Pipenv.
As a general rule, it should be possible to nest such "environment" programs arbitrarily, as long as they are well-designed, and as long as you activate the chain of environments in the order that they were originally installed. Whether this negatively affects your experience depends on your tolerance for annoyance.
However, Pipenv by default creates virtual environments in a global location. I'm not sure what that location is, but it's possible that you could end up with Pipenv environments installed alongside each other that depend on different Python versions. This, I think, might constitute "spillover" in the sense of your question.
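If that global location is a concern, Pipenv can be told to keep its virtualenv inside the project folder instead; a sketch, assuming the pip-pip environment from the question (PIPENV_VENV_IN_PROJECT is a standard Pipenv setting):
conda activate pip-pip
export PIPENV_VENV_IN_PROJECT=1          # Pipenv then creates ./.venv inside the project
pipenv install --python "$(which python)"   # pin Pipenv to the active Conda env's Python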
I have a project called ABC, and I have a conda env just for it in the folder ~/anaconda/envs/ABC. I believe it is a venv, and I want to use some specific packages from the global site-packages.
For a normal Python installation this can be done by removing no-global-site-packages.txt from the venv folder, or by setting the venv to use global site-packages, but I didn't find any equivalent approach for Anaconda. The online documentation does not have an answer either.
How to do this for Anaconda?
You cannot do this explicitly in conda, where the principle is that envs are entirely separate.
But the current default behavior of conda is to allow all global user site-packages to be seen from within environments, as mentioned in this question. So the default behavior will let you do what you want, though there is no way to allow only "some specific" global packages as requested.
This behavior has caused one or two issues. To avoid it, export PYTHONNOUSERSITE=1 before source activate <your env>. Note that the devs are planning to change the default behavior to set PYTHONNOUSERSITE=1 in 4.4.0 (per the second issue linked).
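A quick way to confirm whether the user site-packages are visible from inside an env (myenv is a placeholder):
export PYTHONNOUSERSITE=1
source activate myenv
python -c "import site; print(site.ENABLE_USER_SITE)"   # prints False when user site-packages are disabled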
In case anyone is coming back to this now, for conda 4.7.12, entering export PYTHONNOUSERSITE=True before the conda activate call successfully isolated the conda environment from global/user site packages for me.
On the other hand, unsetting the variable (unset PYTHONNOUSERSITE) lets the conda environment see the global/user site packages again.
Note: Any non-empty value disables the user site directory, so export PYTHONNOUSERSITE=0 does not re-enable it; PYTHONNOUSERSITE=True behaves the same as the previously suggested export PYTHONNOUSERSITE=1.
You could use the PYTHONPATH environment variable. For example
export PYTHONPATH="/Users/me/anaconda/lib/python2.7/site-packages:$PYTHONPATH"
would give every environment access to all the libraries in the Anaconda distribution. Sort of defeats the purpose of environments, though. And if you then want access to a library you installed with Homebrew as well, you would add
export PYTHONPATH=/usr/local/Cellar/another_package/lib/python2.7/site-packages:$PYTHONPATH