I have 2-3 dozen Python projects on my local hard drive, and each one has its own virtualenv. The problem is that this adds up to a lot of space, and there are a lot of duplicated files, since most of my projects have similar dependencies.
Is there a way to configure virtualenv or pip to install packages into a common directory, with each package namespaced by the package version and Python version the same way Wheels are?
For example:
~/.cache/pip/common-install/django_celery-3.1.16-py2-none-any/django_celery/
~/.cache/pip/common-install/django_celery-3.1.17-py2-none-any/django_celery/
Then any virtualenv that needs django-celery can just symlink to the version it needs?
The whole point of virtualenv is to isolate and compartmentalize dependencies. What you are describing directly contradicts its use case. You could go into each individual project and modify the environment variables, but that's a hackish solution.
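For illustration, that environment-variable hack would amount to something like this per project, pointing at the shared directory from the question (again, not recommended):
export PYTHONPATH=~/.cache/pip/common-install/django_celery-3.1.17-py2-none-any:$PYTHONPATH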
Related
I want to download Python libraries like NumPy, SciPy, etc. into a separate folder. I want to include that folder in the Python project so that whenever I switch to another laptop, I don't need to install the libraries again; instead, I can import them from that folder. Is there any way to do this?
You can easily install Python's virtualenv.
Your libraries will be installed in the directory created by virtualenv.
https://pypi.org/project/virtualenv/
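For example (the env name venv is just a convention):
pip install virtualenv
virtualenv venv
source venv/bin/activate
pip install numpy scipy
On Windows the activation step is venv\Scripts\activate instead.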
As another option, you can also use Docker.
I suggest using a virtual environment in this case. You could use pipenv so that the project has exactly the libraries it needs to run.
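A minimal pipenv workflow, run from the project directory (your_script.py is a placeholder):
pip install pipenv
pipenv install numpy scipy
pipenv run python your_script.py
pipenv records the installed libraries in a Pipfile, so the same environment can be recreated on another machine by running pipenv install there.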
You can do it: you can download the files statically from PyPI.
For the NumPy library, for example: https://pypi.org/project/numpy/#files
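pip can also do this download step for you; a sketch of that workflow (the ./libs folder name is just an example):
pip download numpy scipy -d ./libs
pip install --no-index --find-links=./libs numpy scipy
The first command fetches the package files into a local folder you can carry with the project; the second installs from that folder without touching the network.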
That said, I would not recommend this approach, for several reasons.
These libraries have dependencies of their own, so you would have to keep those dependencies along with the NumPy package.
These libraries are updated over time with new functionality and bug fixes, so other libraries might eventually become incompatible with the version you bundled.
Recommended way:
Just create a requirements.txt file that contains all the dependencies with their version numbers.
Whenever you want to use your project elsewhere, just install all these libraries with the command below.
pip install -r requirements.txt
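For the libraries mentioned in the question, the file might look like this (the version numbers here are only examples):
numpy==1.21.0
scipy==1.7.0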
There are two major ways you can install Python libs into a separate folder: a virtual environment or a container.
A virtual environment (venv, pipenv, etc.) is good because it is the simplest way to give your project its own set of libraries without impacting any other Python script on your system. The downside is that you really have to set up the environment (including installing the libs) on every computer you move your script to. This can and should be automated, of course, but it has to be done either way.
A container, on the other hand, requires additional resources to handle and build, but it is exactly the box with a specific version of your script along with all the libs and binaries it requires. There is no need to reinstall libs when moving to a new laptop/desktop/server/cloud/whatever. For this case I would recommend Docker/Kubernetes, though it's better to start with Docker.
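A minimal sketch of the container option (the image tag and the main.py entry point are placeholders; this assumes your deps are listed in requirements.txt). A Dockerfile:
FROM python:3.9
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "main.py"]
Then on any machine with Docker:
docker build -t myscript .
docker run myscript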
I have a requirements.txt file for a Python code base. The file has everything specified:
pytz==2017.2
requests==2.18.4
six==1.11.0
I am adding a new package. Should I list its version? If yes, how do I pick a version to specify?
Check out the pip docs for more info, but basically you do not need to specify a version. Doing so can avoid headaches, though, as pinning a version lets you guarantee you do not end up in dependency hell.
Note that if you are creating a package to be deployed and pip-installed, you should use the install_requires metadata instead of relying on requirements.txt.
Also, it's a good idea to get into the habit of using virtual environments to avoid dependency issues, especially when developing your own stuff. Anaconda offers a simple solution with the conda create command, and virtualenv works great with virtualenvwrapper for a lighter-weight solution. Another solution, pipenv, is quite popular.
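For example, with conda (the environment name is arbitrary):
conda create -n myproject python=3.9
conda activate myproject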
Specifying a version is not a requirement, though it does help a lot in the future. Some versions of packages will not work well with other packages and their respective versions. It is hard to predict how future changes will affect these interrelationships. This is why it is very beneficial to create a snapshot in time (in your requirements.txt) showing which version combinations are known to work.
To create a requirements.txt file that includes the versions of the packages you're using, do the following: in your console/terminal, cd into the location where you would like your requirements.txt to be and enter:
pip freeze > requirements.txt
This will automatically generate a requirements.txt file listing the packages you have installed, with their respective versions.
A tip: you should aim to use a virtual environment for each individual project you work on. This creates a 'bubble' for you to work within and to install specific package versions in, without affecting your other projects. It will save you a lot of headaches in the future, as your packages and versions will be kept project-specific. I suggest using Anaconda's virtual environments.
No, there is no need to specify a version. It's probably a good idea to specify one, though.
If you want to specify a version but you don't know which version to specify, try using pip freeze, which will dump out a list of all the packages you currently have installed and what their versions are.
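Its output uses the same pinned format as requirements.txt, so you can copy the line for your new package straight into the file. For the file in the question it would print something like:
pytz==2017.2
requests==2.18.4
six==1.11.0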
I have a Python program that has dependencies on other Python libraries. I have used virtualenv and pip to get requirements.txt for all the libs needed to run the app, and thus keeping the environment clean of unnecessary libs. Things work great and I can make progress in developing the app.
This works on my machine, but the issue is that I need to package the app and deploy/distribute to an environment where the requirements.txt and pip cannot be used to simply download the dependencies. The target environment needs a fully functional application.
I'm a bit confused with all these tools offered by Python, such as setuptools and distutils, since none of them seem to offer this (at least easily).
I'm used to the Java way, with Maven/Gradle etc., where one simply states dependencies and they are added to the distributable jar/war unless explicitly stated otherwise.
The dependencies are installed inside my virtual environment, under the scripts dir. Is there some easy way to get the dependencies bundled within my app with standard tools, or do I need to roll my own for this?
I have a Python package that is one of a collection of company Python packages. When I run
python setup.py install
I want the package to be installed to a common company directory, along with other company packages. I want this directory to be relative to the default Python install directory, e.g.,
/usr/lib/python2.7/site-packages/<company_name>/<python_package_name>
That is, I want to insert <company_name> into the installation path at install time.
I've seen ways to prefix this path, but can't seem to work out how to do what I've described.
Unfortunately, Python packaging doesn't work like that. You could probably bend it to work that way, but that would be quite an effort for a person without experience in Python packaging, and even for experienced people the effort-to-payoff tradeoff would not make sense. You do not mention any motive for doing this besides personal preference.
Instead, to keep a well-managed and human-navigable package installation folder, I recommend you study the following resources:
PEP 0382 - Namespace Packages: how to create packages like companyname.foobar, companyname.moomoo (a minimal sketch follows this list)
Installing packages into a virtualenv - Python packaging installation guide (official)
Scrambler: Symlink namespaced Python packages to a single folder
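A rough sketch of the namespace-package approach with setuptools (the companyname.foobar name comes from the PEP link above; find_namespace_packages assumes a reasonably recent setuptools, and the code is assumed to live in companyname/foobar/):
from setuptools import setup, find_namespace_packages

setup(
    name='companyname-foobar',
    version='0.1.0',
    # finds companyname/foobar/ and installs it as the dotted
    # package companyname.foobar under site-packages/companyname/
    packages=find_namespace_packages(include=['companyname.*']),
)
Installed this way, the package ends up under site-packages/companyname/, which is essentially the layout you describe.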
I have a Django project set up in a virtual environment. I want to turn it into a package (maybe a tar), which I will make available for download so that anyone can just download, extract, and run the project without any hassle of installing dependencies.
No, you shouldn't simply make a tar of your Django project and distribute it.
The official Django documentation has instructions on how reusable apps can be packaged and distributed. A full explanation would go beyond the scope of an answer, so please check the relevant part of the documentation:
https://docs.djangoproject.com/en/1.8/intro/reusable-apps/
You can provide the dependencies for your package, which will then be installed automatically by pip.
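For illustration, a setup.py fragment showing that (the project name is a placeholder; Django 1.8 matches the docs linked above):
from setuptools import setup, find_packages

setup(
    name='django-myproject',
    version='0.1',
    packages=find_packages(),
    # pip will install these automatically when your package is installed
    install_requires=['Django>=1.8'],
)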