Lasagne package issue - python

In the Conda prompt it shows "collecting package metadata", then spends a very long time on "solving environment", and finally reports an error that the initial frozen solve failed and that it is retrying with a flexible solve, which also never completes.
How can I install the lasagne package in Anaconda?

This is most likely a Python version issue. The Lasagne package has not been updated in years, so the only Python versions it is compatible with are 2.7, 3.4, and 3.5 (see the available files on Anaconda Cloud). Any relatively recent Anaconda installation will not be using one of these versions, and changing the Python version in your base env is not recommended, which is likely why the solve fails. Instead, create a new environment specifically for Lasagne, e.g.,
conda create -n my_lasagne_env lasagne
Then you can use this env with conda activate my_lasagne_env.
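If the solver still struggles, it can also help to pin one of the supported Python versions explicitly when creating the environment. A minimal sketch, where the env name and the 3.5 pin are illustrative choices:
# pin a Python version that Lasagne still supports
conda create -n my_lasagne_env python=3.5 lasagne
# quick sanity check inside the new env
conda activate my_lasagne_env
python -c "import lasagne; print(lasagne.__version__)"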
Generally, the best practice in Conda is to create a new environment for each project or project type, and to minimize updating or adding packages once you have the packages the project requires. This includes avoiding changes to your base env, aside from the occasional conda update conda to keep the package manager itself up to date.

To install this package with conda, run:
conda install -c toli lasagne
If the "solving environment" step still hangs, see https://github.com/conda/conda/issues/7690 for possible workarounds.

Related

why does pip export with a different version on subsequent conda env exports?

I have been trying to establish a pre-commit git hook to detect environment changes and create a new env.yml export automatically ... similar to the ones described here
Where I am having trouble is that the git hook is detecting an environment change with the pip package on every run of the pre-commit file. Is this possibly related to some scripts using different versions of pip?
If so, I don't understand why the same version isn't being exported every time I run conda env export > env.yml. It almost seems like it is randomly toggling between versions ... but I know there must be some rationale
conda and pip each keep their own record of every package installed (provided you have installed a given package with both). Anaconda (if that's what you're using) is also known for causing plenty of headaches, even in simple cases, when you pip install something instead of conda install and start mixing dependencies installed by both. The general advice is to be consistent within each environment. In my personal experience, Anaconda tends to impose itself by breaking dependencies managed by pip. In short, if you are using a conda env, make sure you are using dependencies installed by conda and conda only.
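If the pip entry keeps changing in the export, one way to keep the generated env.yml stable for a pre-commit hook (a sketch, assuming conda >= 4.7.12) is to export only the packages you explicitly requested, which omits the auto-resolved pip/setuptools pins that tend to toggle:
# export only explicitly requested packages; resolved build pins (incl. pip) are omitted
conda env export --from-history > env.yml
# keep a fully pinned snapshot separately if you need exact reproducibility
conda env export > env.full.yml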

How to create a Python environment in conda where the only difference is the Python version?

I currently have Python 3.7 installed via Anaconda on my machine. I want to create an environment with a lower Python version, say 3.6, for compatibility reasons. Following the documentation, I create the conda environment with conda create -n py36 python=3.6. However, this gives me a clean Python where many additional packages like numpy and scipy are missing, even though they are already installed for Python 3.7. So what is the best way to create not just the new Python but also migrate all the other packages from the previous (3.7) installation?
I understand the dependencies may differ, since some packages are not compatible with an older Python, but I still want to migrate as many packages as possible and let conda decide the dependency tree. Currently all I can do is create a clean environment and manually conda install numpy and so on, which is definitely not a good idea.
# Save all the info about the previous env in a requirements file
conda list -e > requirement.txt
then change the python version in the created 'requirement.txt' file
# then create the new env from the requirements file:
conda create -n py36 --file requirement.txt
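An alternative sketch, assuming the Python 3.7 packages live in your base env (the file and env names are illustrative): export the old environment to YAML without build strings, edit the python entry, and let conda re-resolve the rest:
# export the existing 3.7 environment without build pins
conda env export -n base --no-builds > py37_env.yml
# edit the "python=3.7" line to "python=3.6" (and the "name:" line), then:
conda env create -f py37_env.yml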

Scikit-learn - installing development version (0.20)

I currently have scikit-learn 0.19 installed. I'd like to test my code using the latest development version as there seems to be a fix for Incremental PCA.
How do I go about installing this new version if I've previously installed scikit-learn using anaconda?
Also, how would I revert back to the stable release in the event that 0.20 does not solve my problem?
I am in need of some hand holding here, as I've read the docs on the website and not sure I completely understand the process (especially being able to revert back to the stable version if needed).
The whole point of the Anaconda Python distribution (apart from the convenience of having a bunch of useful packages included) is that you get the conda environment manager, which exists to meet exactly this sort of requirement.
What you want to do is to create a new conda environment by launching the Anaconda prompt and typing
conda create -n myenv scikit-learn other-package other-package2 etc
where myenv is the name you want to give the new environment and other-package other-package2 etc are the names of any other packages you will want to use (import) in your code. conda will figure out any dependencies of these packages and show you a list of what is going to be installed before it proceeds.
If you want to specify that a package should be a particular version, add that to the package name e.g. other-package=1.1.0, otherwise conda will install the latest versions of each package that are mutually compatible. You can also specify a particular version of Python by including it in the package list, e.g. python=3.4. You can check what versions of a package are available with conda search package-name (where package-name is the name of the package you want, obviously).
To run your code in the newly created environment, first activate the environment at the Anaconda prompt. If you use the Spyder IDE, launch it after activating the correct environment, or use the start menu shortcut specific to that environment if you have one. Other IDEs may have their own method of selecting a specific environment to work in.
To revert to the version(s) you were using before, activate the environment containing those versions - if you've never created a new environment before, that'll be root (called base in current conda versions).
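Putting that together, a minimal sketch might look like this (the env name and package list are assumptions; adjust them to your project):
# new env where conda installs the newest mutually compatible versions
conda create -n sklearn-test scikit-learn numpy scipy
conda activate sklearn-test      # "source activate sklearn-test" on older conda
# ... run your code here ...
# your previous versions are untouched; switch back when done
conda activate base              # "root" on older conda versions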
Just in case someone comes here looking for a solution without conda:
The website recommends that you download the latest code via
git clone git://github.com/scikit-learn/scikit-learn.git
and then include it in pip via (after changing to the directory)
pip install --editable .
You can also add the --user flag to have pip install to a local directory. Then, uninstalling should be as easy as pip uninstall scikit-learn.
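To go back to the stable release after such a source install, a rough sketch (assuming your previous scikit-learn came from conda) would be:
# remove the editable development install
pip uninstall scikit-learn
# reinstall the stable release you had before, e.g. 0.19
conda install scikit-learn=0.19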

python libraries managed by conda and native python

On my Ubuntu 14.04 machine I have two Python installations: the native Python that ships with the operating system, and the conda version installed along with the conda package manager.
If I launch the python command, the default python is the conda version.
Using conda brings a lot of advantages for package management. But before I installed conda, I had already installed some Python modules into the native Python with pip install. Those modules, however, are not reachable from the conda Python. So here is my question: how can I set up conda so that it can use the packages managed by the native Python?
While asking this, I cannot help asking two more questions:
Is it good practice to mix packages managed by conda and the native Python? Is there any practice I can follow?
How can I switch back to the native Python?
It is not a good practice to mix packages managed by conda and native python. You can still, however, install Python modules using pip into Anaconda. I would recommend strictly using Anaconda (and use conda virtual environments as well as the conda package manager), and not using native python any longer.
Your best bet is to use strictly Anaconda moving forward. I would reinstall the packages into a conda virtual environment.
conda create --name NAME_HERE
or
conda create --name NAME_HERE --clone root if you want to include all packages that come with Anaconda by default.
Then switch to your new environment with source activate NAME_HERE (Linux, macOS) or activate NAME_HERE (Windows). Then you can install packages with both the conda package manager and pip.
See the conda docs on managing conda virtual environments for details.
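If you want to carry the modules you had installed with the native Python's pip over into the new conda env, one hedged sketch is to dump them with the native interpreter and reinstall them inside the activated env (the interpreter path and file name are assumptions; adjust for your system):
# list what the native (system) Python has installed
/usr/bin/python -m pip freeze > native-packages.txt
# with the new conda env activated, reinstall them there
source activate NAME_HERE        # or "conda activate NAME_HERE" on newer conda
pip install -r native-packages.txt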
Kind user #MikhailKnyazev has pointed out that this is how you would use the packages managed by native python. It is still not recommended.
Although it is certainly not a good practice, it is useful to know
that you can add system-side packages inside virtual environment by
symlinking them like this: ln -s /usr/lib/<PYTHON_VER>/dist-packages/<PACKAGE> <virtualenv_path>/lib/<PYTHON_VER>/site-packages/

anaconda update all possible packages?

I tried conda search --outdated; there are lots of outdated packages, for example scipy is 0.17.1 but the latest is 0.18.0. However, when I do conda update --all, it does not update any packages.
update 1
conda update --all --alt-hint
Fetching package metadata .......
Solving package specifications: ..........
# All requested packages already installed.
# packages in environment at /home/user/opt/anaconda2:
#
update 2
I can update those packages separately, e.g. conda update scipy. But why can I not update all of them in one go?
TL;DR: dependency conflicts: updating one package can require (because of its dependencies) downgrading another.
You are right:
conda update --all
is actually the way to go1. Conda always tries to upgrade the packages to the newest version in the series (say Python 2.x or 3.x).
Dependency conflicts
But it is possible that there are dependency conflicts (which prevent a further upgrade). Conda usually warns very explicitly if they occur.
e.g. X requires Y <5.0, so Y will never be >= 5.0
That's why you 'cannot' upgrade them all.
Resolving
Update 1: for a while now, mamba has proven to be an extremely powerful drop-in replacement for conda in terms of dependency resolution and (in my experience) finds solutions to problems where conda fails. A way to invoke it without installing mamba is via the --solver=libmamba flag (requires conda-libmamba-solver), as pointed out by matteo in the comments.
To add: it might be that an upgrade could work in principle, but a newer version of X that works with Y > 5.0 is simply not available on conda. It is possible to install it with pip, since more packages are available on PyPI. But be aware that pip will also install packages when dependency conflicts exist, and that this usually breaks your conda environment in the sense that you can no longer reliably install with conda. If you do that, do it as a last resort and only after all other packages have been installed with conda. It's rather a hack.
A safer approach you can try is to add conda-forge as a channel when upgrading (add -c conda-forge as a flag), or any other channel that contains the package you really need in a newer version. This way conda also searches these places for available packages.
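Combining those two hints, such an attempt could look like this (a sketch; --solver=libmamba needs a fairly recent conda with the conda-libmamba-solver plugin installed):
# let the libmamba solver and an extra channel take a shot at the upgrade
conda update --all -c conda-forge --solver=libmamba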
Considering your update: you can upgrade each of them separately, but doing so will include not only an upgrade but also a downgrade of another package. Say, to add to the example above:
X > 2.0 requires Y < 5.0, X < 2.0 requires Y > 5.0
So upgrading Y > 5.0 implies downgrading X to < 2.0 and vice versa.
(this is a pedagogical example, of course, but it's the same in reality, usually just with more complicated dependencies and sub-dependencies)
So you still cannot upgrade them all by doing the upgrades separately; the dependencies are simply not satisfiable, so sooner or later an upgrade will downgrade an already upgraded package again. Or it will break the compatibility of the packages (which you usually don't want!), which is only possible by explicitly invoking an ignore-dependencies or force command. But that is only for hacking your way around issues, definitely not the normal-user case!
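To preview what the solver would actually do to your environment (including any forced downgrades) before committing to anything, a harmless sketch is a dry run:
# show the solver's plan without changing the environment
conda update --all --dry-run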
1 If you actually want to update the packages of your installation, which you usually don't. The command run in the base environment will update the packages in this, but usually you should work with virtual environments (conda create -n myenv and then conda activate myenv). Executing conda update --all inside such an environment will update the packages inside this environment. However, since the base environment is also an environment, the answer applies to both cases in the same way.
To answer more precisely to the question:
conda (and it is the same conda whether it comes with Miniconda or Anaconda) updates everything, but ONLY within a specific version series of a package (major and minor). That's the paradigm.
In the documentation you will find: "NOTE: Conda updates to the highest version in its series, so Python 2.7 updates to the highest available in the 2.x series and 3.6 updates to the highest available in the 3.x series." (see the conda docs)
If Wang does not give a reproducible example, one can only assist so much. For example: is it really the virtual environment he wants to update, or could Wang get what he/she wants with
conda update -n ENVIRONMENT --all
*PLEASE read the docs before executing "update --all"!
This does not lead to an update of all packages by nature. Because conda tries to resolve the relationship of dependencies between all packages in your environment, this can lead to DOWNGRADED packages without warnings.
If you only want to update almost all, you can create a pin file
echo "conda ==4.0.0" >> ~/miniconda3/envs/py35/conda-meta/pinned
echo "numpy 1.7.*" >> ~/miniconda3/envs/py35/conda-meta/pinned
before running the update (see also the related conda issue about packages not being pinned).
If later on you want to ignore the file in your env for an update, you can do:
conda update --all --no-pin
You should not run update --all. If you need it nevertheless, you are safer testing it in a cloned environment.
First step should always be to backup your current specification:
conda list -n py35 --explicit > spec-file.txt
(but even so there is not always a link to the source available - like for jupyterlab extensions)
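If the cloned-and-updated env turns out broken, that spec file is your way back; a sketch (the file name is whatever you redirected the listing to, and the env name is illustrative):
# recreate the original environment exactly from the explicit spec
conda create -n py35_restored --file spec-file.txt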
Next you can clone and update:
conda create -n py356 --clone py35
conda activate py356
conda config --set pip_interop_enabled True # for conda>=4.6
conda update --all
(see the conda config documentation for pip_interop_enabled)
update:
Currently I would use mamba (or micromamba) as a replacement for the conda package manager.
update:
Because the idea of conda is nice but does not work out very well for complex environments, I personally prefer the combination of nix-shell (or lorri) and poetry [as a superior pip/conda :-)] (intro: poetry2nix).
Alternatively you can use nix and mach-nix (where you only need your requirements file); it resolves and builds environments best.
On Linux / macOS you could use nix like
nix-env -iA nixpkgs.python37
to enter an environment that has, in this case, Python 3.7 (you can of course change the version)
or, for a very good (advanced) Python environment, you can use mach-nix (with nix) like
mach-nix env ./env -r requirements.txt
(which even supports conda [but currently in beta])
or via api like
nix-shell -p nixFlakes --run "nix run github:davhau/mach-nix#with.ipython.pandas.seaborn.bokeh.scikit-learn "
Finally, if you really need to work with packages that are not compatible due to their dependencies, it is possible with technologies like NixOS/nixpkgs.
Imagine the dependency graph of packages: when the number of packages grows large, the chance of encountering a conflict when upgrading or adding packages is much higher. To avoid this, simply create a new environment in Anaconda.
Be frugal, install only what you need. For me, I installed the following packages in my new environment:
pandas
scikit-learn
matplotlib
notebook
keras
And I have 84 packages in total.
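Creating such a lean environment is a single command; a sketch with an illustrative env name and the packages listed above:
# one frugal environment per project, containing only what the project needs
conda create -n my_project_env pandas scikit-learn matplotlib notebook keras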
I agree with Mayou36.
For example, I made the mistake of installing new packages in the base environment, using conda for some packages and pip for others.
Why is this bad?
1. None of this is going to help with updating packages that have been installed from PyPI via pip, or any packages installed using python setup.py install. conda list will give you some hints about the pip-based Python packages you have in an environment, but it won't do anything special to update them.
And I had all my projects in that same one environment! And I used update --all, which is bad and did not update everything.
So, the best thing to do is to create a new environment for each project. Why?
2. A Conda environment is a directory that contains a specific collection of Conda packages that you have installed. For example, you may be working on a research project that requires NumPy 1.18 and its dependencies, while another environment associated with a finished project has NumPy 1.12 (perhaps because version 1.12 was the most current version of NumPy at the time the project finished). If you change one environment, your other environments are not affected. You can easily activate or deactivate environments, which is how you switch between them.
So, to wrap it up:
Create a new environment for each project
Be aware of the differences between conda and pip
3. Only include the packages that you will actually need, and update them properly only if necessary.
If working in MS Windows, you can use Anaconda Navigator: click on the environment; in the drop-down box it shows "Installed" by default. You can select "Updatable" and start from there.
To update all possible packages I used conda update --update-all
It works!
I solved this problem with conda and pip.
First, I ran:
conda uninstall qt, conda uninstall matplotlib and conda uninstall PyQt5
After that, I opened cmd and ran:
pip uninstall qt, pip uninstall matplotlib, pip uninstall PyQt5
Lastly, reinstall matplotlib with pip: pip install matplotlib
