Install Test Python Package in a Different Directory Using pip - python

I have a python distutils module that I use in production. I have that production module installed in a virtual environment. However, I'd like to be able to test upgrades before installing in the production environment. I'm also trying to avoid creating a second virtual environment. Therefore I tried the following:
# inside my virtualenv
# checkout master of my repo
git clone git://github.com/myrepo
cd myrepo
# create directory where my testing install will live
mkdir testinstall
# prepend my testing install to the PYTHONPATH to over-ride the
# production install of my repo
export PYTHONPATH=$PWD/testinstall/lib/python2.7/site-packages:$PYTHONPATH
# install my local package with pip into the test area
pip install --prefix=$PWD/testinstall .
In this case I get the error from pip:
Requirement already satisfied from /path/to/production/myrepo
If I use
pip install --upgrade --prefix=$PWD/testinstall .
pip proceeds to uninstall the production version from /path/to/production/myrepo and install my test version in the testinstall area.
Any idea how to force pip to install in this way?

Try doing it this way instead:
pip install --target=d:\somewhere\other\than\the\default package_name

I'd just use a new venv, but if you really can't, use the -t (target) option:
pip3 install --upgrade -t my_new_directory package_name
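For the original setup, pip's --ignore-installed flag is also worth a try: it forces a fresh install into the prefix without first uninstalling the production copy. A minimal sketch, assuming the Python 2.7 layout from the question:
# force a fresh copy into the test prefix, leaving the production install alone
pip install --ignore-installed --prefix=$PWD/testinstall .
# with --target, packages instead land directly in the named directory,
# so PYTHONPATH must point at it rather than at lib/python2.7/site-packages
pip install --target=$PWD/testinstall .
export PYTHONPATH=$PWD/testinstall:$PYTHONPATH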

Related

Create standalone isolated python

I'm about to deploy Python into a production system, and the Python script I have has a number of modules associated with it.
Is there a way to install Python with only a specific list of modules? A bit like generating a jar: you can have a folder with all the dependency jars in it, which is nice and clean. I don't want to compile the Python code, so I want something similar.
(Note: I also don't want to create a virtual environment - I want the default environment like this)
You can either use virtualenv, which basically is what the name suggests, or you can use Docker, which I personally prefer.
If you don't want to do what Amir suggests above, two other options are available:
Copy those modules and place them in the same folder where your script is installed
Create a requirements.txt file with the name and version of each module, then run "pip install -r requirements.txt" to install them into your site-packages folder (a sketch combining both follows below)
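A minimal sketch of that approach, vendoring the dependencies next to the script rather than into site-packages (myscript.py and the vendor folder name are placeholders):
# install the listed modules into a local folder instead of site-packages
pip install -r requirements.txt --target=./vendor
# run the script with the vendored copies first on the import path
PYTHONPATH=./vendor python myscript.py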
To manage your Python packages you can use the virtualenv tool. It is simple and works well on Linux/macOS/Windows. Any package installed in an activated virtualenv is available only in that virtualenv, so you can have, for example, 3 different versions of the "Django" package on your machine and work with them using different virtual environments:
Install virtualenv:
$ pip3 install virtualenv
Create your virtualenv:
$ virtualenv -p python3 my_virtualenv_name
Activate your virtualenv:
$ . my_virtualenv_name/bin/activate
Check what packages have been installed:
$ pip freeze
Install any package for example "Django":
$ pip install Django
Confirm installation:
$ pip freeze | grep Django
Uninstall any package from your virtual environment:
$ pip uninstall Django -y
Uninstall all packages from your virtual environment:
$ pip freeze | xargs pip uninstall -y
Deactivate your virtualenv:
$ deactivate
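Putting it together, here are two side-by-side environments holding different Django versions (the version numbers are illustrative):
$ virtualenv -p python3 proj_a
$ . proj_a/bin/activate
(proj_a)$ pip install Django==2.2
(proj_a)$ deactivate
$ virtualenv -p python3 proj_b
$ . proj_b/bin/activate
(proj_b)$ pip install Django==3.2
(proj_b)$ deactivate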
More info in the official documentation: https://virtualenv.pypa.io/en/latest/

Equivalent in python of package.json and "npm install --save" command to easily save every new package [duplicate]

I've been looking around for a package manager that can be used with python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, where you can run bundle install.
How can I achieve this in Python?
The pip tool has become the standard equivalent of Ruby's gems.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
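Combined, a typical round trip looks like this (env is just a directory name):
virtualenv env
source env/bin/activate
pip install -r requirements.txt   # recreate the environment from the file
pip freeze > requirements.txt     # or snapshot the current one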
Pipenv
(I know it's an old question and it already has an answer, but for anyone coming here looking for a different answer, like me.)
I've found a very good equivalent for npm. It's called pipenv. It handles both virtualenv and pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
then you can make a new virtualenv using Python 3, as well as a Pipfile that will be filled with your project's requirements and other things:
pipenv install --three
using your created virtualenv:
pipenv shell
installing a new python package:
pipenv install requests
running your .py file is like:
pipenv run python somefile.py
You can find its docs here: https://pipenv.pypa.io/en/latest/
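Roughly, the pipenv commands map onto their npm counterparts like this (requests is just an example package):
pipenv install            # npm install
pipenv install requests   # npm install --save requests
pipenv uninstall requests # npm uninstall --save requests
pipenv lock               # regenerate the lock file, like package-lock.json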
Python uses pip for a package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from NodeJS's npm or PHP's composer, where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up, just cd to your project's root and:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
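For example, to confirm a dependency really lands inside the project (the package and version are illustrative):
pip install requests==2.31.0
python -c "import requests; print(requests.__file__)"
# prints a path under ./env/lib/python3.x/site-packages/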
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv, there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also links the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
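For a flavour of the workflow, poetry's basic commands map closely onto npm's (the project and script names are placeholders):
poetry new myproject       # scaffold a project with a pyproject.toml
poetry add requests        # like npm install --save: records and installs
poetry install             # recreate the environment from the lock file
poetry run python app.py   # run a script inside the managed environment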

python: how to install and use setuptools in virtual python

I do not have root privileges on a Linux server, so I want to create a virtual Python according to creating a "virtual" python.
After I run virtual-python.py, I do have python in ~/bin/python.
Then, according to the setuptools PyPI page, I download ez_setup.py and run ~/bin/python ez_setup.py. An error occurs.
What should I do?
Looking at the linked website, it looks outdated. Use pip, not easy_install.
For installing development packages, I always take the following rules in account:
The system package manager is responsible for system-wide packages, so never use sudo pip. This isn't just relevant to this question; it's always a good idea.
The package manager packages are probably outdated. You'll want an up-to-date version for development tools.
I recommend the following way to install local development tools.
$ # Install pip and setuptools on a user level
$ curl https://bootstrap.pypa.io/get-pip.py | python - --user
$ # Add the executables to your path. Add this to your `.bashrc` or `.profile` as well
$ export PATH=$PATH:$HOME/.local/bin
At this point pip should be accessible from the command line and usable without sudo. Use this to install virtualenv, the most widely used tool for setting up virtual environments.
$ pip install virtualenv --user
Now simply use virtualenv to set up an environment to run your application in:
$ virtualenv myapp
Now activate the virtual environment and do whatever you would like to do with it. Note that after activating the virtual environment, pip refers to pip installed inside of the virtualenv, not the one installed on a user level.
$ source myapp/bin/activate
(myapp)$ pip install -r requirements.txt # This is just an example
You'll want to create a new virtual environment for each application you run on the server, so the dependencies can't conflict.

Installing Python Package from Github Using PIP

I've seen it documented that you can install a GitHub-hosted Python package using pip via:
sudo pip install -e git+git://github.com/myuser/myproject.git#egg=myproject
However, this appears to install the package to the current working directory, which is almost never where it should be.
How do you instruct pip to install it into the standard Python package directory (e.g. on Ubuntu this is /usr/local/lib/python2.6/dist-packages)?
The -e flag tells pip to install it as "editable", i.e. keep the source around. Drop the -e flag and it should do about what you expect.
sudo pip install git+git://github.com/myuser/myproject.git#egg=myproject
If that doesn't work, try using https instead of git:
sudo pip install git+https://github.com/myuser/myproject.git#egg=myproject
For Python 3, make sure you have python3-pip installed (and of course git installed).
The syntax just changes to:
sudo pip3 install git+git://github.com/someuser/someproject.git
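pip's VCS URL syntax also lets you pin to a branch, tag, or commit with an @ suffix (the ref names below are placeholders):
sudo pip install git+https://github.com/myuser/myproject.git@main
sudo pip install git+https://github.com/myuser/myproject.git@v1.2.0
sudo pip install git+https://github.com/myuser/myproject.git@0123456abcdef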

Bypass confirmation prompt for pip uninstall

I'm trying to uninstall all django packages in my superuser environment to ensure that all my webapp dependencies are installed to my virtualenv.
sudo su
sudo pip freeze | grep -E '^django-' | xargs pip -q uninstall
But pip wants to confirm every package uninstall, and there doesn't seem to be a -y option for pip. Is there a better way to uninstall a batch of python modules? Is rm -rf .../site-packages/ a proper way to go? Is there an easy_install alternative?
Alternatively, would it be better to force pip to install all dependencies to the virtualenv rather than relying on the system python modules to meet those dependencies, e.g. pip --upgrade install, but forcing even equally old versions to be installed to override any system modules. I tried activating my virtualenv and then pip install --upgrade -r requirements.txt and that does seem to install the dependencies, even those existing in my system path, but I can't be sure if that's because my system modules were old. And man pip doesn't seem to guarantee this behavior (i.e. installing the same version of a package that already exists in the system site-packages).
Starting with pip version 7.1.2 you can run pip uninstall -y <python package(s)>:
pip uninstall -y package1 package2 package3
or from file
pip uninstall -y -r requirements.txt
Pip does NOT include a --yes option (as of pip version 1.3.1).
WORKAROUND: pipe yes to it!
$ sudo ls # enter pw so not prompted again
$ /usr/bin/yes | sudo pip uninstall pymongo
If you want to uninstall every package from requirements.txt:
pip uninstall -y -r requirements.txt
On www.saturncloud.io, in Jupyter notebooks, you can use it like this:
!yes | pip uninstall tensorflow
!yes | pip uninstall gast
!yes | pip uninstall tensorflow-probability
Alternatively, would it be better to force pip to install all dependencies to the virtualenv rather than relying on the system python modules to meet those dependencies,
Yes. Don't mess too much with the system-installed packages. Many system tools, particularly on OS X (and even on Debian and its derivatives), depend on them.
pip --upgrade install, but forcing even equally old versions to be installed to override any system modules.
It should not be a big deal if a few more packages are installed within the venv than are already there in the system, particularly if they are of different versions. That's the whole point of virtualenv.
I tried activating my virtualenv and then pip install --upgrade -r requirements.txt and that does seem to install the dependencies, even those existing in my system path, but I can't be sure if that's because my system modules were old. And man pip doesn't seem to guarantee this behavior (i.e. installing the same version of a package that already exists in the system site-packages).
No, it doesn't install the packages already present in the main installation, unless you used the --no-site-packages flag to create the virtualenv, or the required and present versions differ.
Lakshman Prasad was right: pip install --upgrade and/or virtualenv --no-site-packages is the way to go. Uninstalling the system-wide Python modules is bad.
The --upgrade option to pip does install required modules in the virtual env, even if they already exist in the system environment, and even if the required version or latest available version is the same as the system version.
pip install --upgrade
And using the --no-site-packages option when creating the virtual environment ensures that missing dependencies can't be masked by modules present in the system path. This helps expose problems during the migration of a module from one package to another, e.g. pinax.apps.groups -> django-groups, especially when the problem involves load templatetags statements in Django, which search all available modules for templatetags directories and the tag definitions within.
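A sketch of that setup (note that --no-site-packages has been the default behaviour since virtualenv 1.7, so the flag only matters on very old versions):
virtualenv --no-site-packages myenv
source myenv/bin/activate
pip install --upgrade -r requirements.txt   # everything lands inside myenv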
pip install -U xxxx
can bypass the confirmation prompt
