I have written a Python utility script that relies on the Debian package python3-apt:
import apt
...
def get_packages():
    cache = apt.Cache()
    for pkg in cache:
        if pkg.installed and pkg.name in PACKAGE_LIST:
            yield pkg.name
I am now expanding the script into a project, with the eventual intent of making it available on PyPI and/or as a Debian package itself.
I use virtualenvs to isolate my Python development environments. What package name (or path) do I need to add to my virtualenv so that I can call import apt from within that environment?
So far I have tried:
apt on PyPI. Strange old release.
vext. Does not currently support apt.
other things on PyPI that start with "apt". None of them are a simple intermediary for python3-apt.
You can achieve this with pipenv as follows (similar instructions should work for other venv managers):
pipenv --site-packages # see note 1
PIP_IGNORE_INSTALLED=1 pipenv install # see note 2
You are more likely to run this as:
pipenv --site-packages
PIP_IGNORE_INSTALLED=1 pipenv install -e . --dev
# treats codebase as a package, also installs dev dependencies
Note 1: We must have access to the system packages (the system site-packages) so that we can import apt.
Note 2: ...but we prefer virtualenv packages to system packages. See
https://pipenv.pypa.io/en/latest/advanced/#working-with-platform-provided-python-components for details.
Comments:
This means that all other system packages not defined in your Pipfile are also available in your venv. You must remember that they aren't necessarily available to other developers using the same codebase. If you have a basic CI environment, it should catch this.
This approach will work for other packages not supported by vext.
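To confirm that the system-provided module is actually visible from inside the environment, here is a minimal check (a sketch only, assuming the environment was created as above):
import apt  # provided by the Debian package python3-apt, visible via --site-packages
print(apt.__file__)  # should point at the system copy, e.g. under /usr/lib/python3/dist-packages
You can run this with pipenv run python -c "import apt; print(apt.__file__)" or from a pipenv shell session.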
Related
Running this on the command line:
virtualenv --system-site-packages venv
I'm expecting the venv folder venv\Lib\site-packages to contain all the necessary libraries from the projects that are located in:
C:\Users\XXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\
But that's not the case; only a few are installed.
For example, my program currently uses pdfminer, which is in
C:\Users\XXXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\
I want it to be included in venv\Lib\site-packages, but it is not copied.
Any advice?
--system-site-packages doesn't copy packages; it just allows the Python in the virtualenv to access packages in C:\Users\XXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\.
There is no way to copy packages, because they could depend on their installation directory. If you want these packages in the virtualenv, don't use --system-site-packages; install all the packages into the virtualenv itself instead.
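A quick way to see what --system-site-packages actually does is to run the virtualenv's interpreter and check where a package is imported from; a minimal sketch, assuming pdfminer is installed in the global Python as described above:
# Run with the virtualenv's interpreter, e.g. venv\Scripts\python.exe on Windows.
import pdfminer
print(pdfminer.__file__)
# With --system-site-packages this prints a path under the global
# ...\Python36\Lib\site-packages\ directory; nothing is copied into venv\Lib\site-packages.
Without --system-site-packages, the same import fails until you pip install pdfminer inside the virtualenv.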
A virtualenv environment is much the same as if you had just installed a fresh copy of Python. It has no packages other than the standard library that ships with Python. If you want other packages, you have to install them with pip, or however you'd do it with the native Python version you are using.
So in general, just do pip install <packagename>.
If you find yourself often wanting to create virtualenvs with a standard set of base packages, then put together a requirements.txt file listing all of the packages you want to install as a base, and do pip install -r requirements.txt inside a new virtualenv, right after you create it.
One nice thing about a virtualenv is that it's all yours. Your user owns 100% of it, unlike the base Python version that is owned by the system. To install new packages into the base Python version, you often have to have root access (sudo privileges). With virtualenvs, you don't need special permissions (in fact, you'll get all screwed up if you use sudo in a virtualenv) to install all the packages you want. Everything you do lives within your own home directory. Another neat thing is that when you are done with a virtualenv, you just throw away the root directory that contains it.
If it's not mandatory to use virtualenv, I would suggest going with Anaconda. That should address your concern.
Conda, as a package manager, helps you find and install packages. Quite a few packages are installed by default, to set you up quickly for your project. To see which packages conda has installed, type conda list in a terminal.
If you need a package that requires a different version of Python, you do not need to switch to a different environment manager, because conda is also an environment manager.
With just a few commands, you can set up a totally separate environment to run that different version of Python, while continuing to run your usual version of Python in your normal environment.
I am trying to install flask and a few other modules for Python 2.
When I try to install them using the command pip install flask, it installs them for Python 3.
This has created major issues because things like django are not compatible with Python 3.
When I want to run a program using Python 2, I can't use any of these modules.
Question
How do I use pip to install modules into a specified version of Python?
Try:
python2.7 -m pip install flask
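If you want to double-check which interpreter a module ended up in, here is a small sketch you can run with the interpreter you installed into (here python2.7), assuming the install above succeeded:
import sys
import flask            # should now import without error
print(sys.version)      # confirms which Python you are running
print(flask.__file__)   # confirms where flask was installed
More generally, python<version> -m pip install <package> installs a package for that specific interpreter.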
Python “Virtual Environments” allow Python packages to be installed in an isolated location for a particular application, rather than being installed globally.
Imagine you have an application that needs version 1 of LibFoo, but another application requires version 2. How can you use both these applications? If you install everything into /usr/lib/python2.7/site-packages (or whatever your platform’s standard location is), it’s easy to end up in a situation where you unintentionally upgrade an application that shouldn’t be upgraded.
Or more generally, what if you want to install an application and leave it be? If an application works, any change in its libraries or the versions of those libraries can break the application.
Also, what if you can’t install packages into the global site-packages directory? For instance, on a shared host.
In all these cases, virtual environments can help you. They have their own installation directories and they don’t share libraries with other virtual environments.
Currently, there are two common tools for creating Python virtual environments:
venv is available by default in Python 3.3 and later, and installs pip and setuptools into created virtual environments in Python 3.4 and later.
virtualenv needs to be installed separately, but supports Python 2.6+ and Python 3.3+, and pip, setuptools and wheel are always installed into created virtual environments by default (regardless of Python version).
The basic usage is like so:
Using virtualenv:
virtualenv <DIR>
source <DIR>/bin/activate
Using venv:
python3 -m venv <DIR>
source <DIR>/bin/activate
For more information, see the virtualenv docs or the venv docs.
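Once an environment is activated, a quick way to confirm from Python that you are actually inside it (a minimal sketch; the base_prefix check applies to venv on Python 3.3+, while older virtualenv releases set sys.real_prefix instead):
import sys
in_venv = sys.prefix != getattr(sys, "base_prefix", sys.prefix) or hasattr(sys, "real_prefix")
print("interpreter:", sys.executable)
print("inside a virtual environment:", in_venv)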
Managing multiple virtual environments directly can become tedious, so the dependency management tutorial introduces a higher level tool, Pipenv, that automatically manages a separate virtual environment for each project and application that you work on.
https://packaging.python.org/tutorials/installing-packages/#creating-virtual-environments
I am a Ruby programmer trying to learn Python. I am pretty familiar with pyenv, since it is more or less a copy of rbenv. pyenv lets you have more than one version of Python on a system and isolates Python installations without touching sensitive parts of the system.
I suppose every Python installation comes with pip. What I still don't understand is that many good Python libraries out there suggest using virtualenv or Anaconda, and I can even find a virtualenv plugin for pyenv.
Now I am getting confused about the purpose of these two, pyenv and virtualenv.
Worse, inside pyenv there is a virtualenv plugin.
My questions are:
What is the difference between pyenv and virtualenv?
Is there any difference in using the pip command inside both pyenv and virtualenv?
What does this pyenv virtualenv do?
An explanation with examples would be highly appreciated.
Edit: It's worth mentioning pip here as well, as conda and pip have similarities and differences that are relevant to this topic.
pip: the Python Package Manager.
You might think of pip as the python equivalent of the ruby gem command
pip is not always included with Python by default; recent Python 3 installers bundle it (via ensurepip), but many system Pythons still ship without it.
You may install Python using homebrew, which will install pip automatically: brew install python
The version of Python that ships with OS X does not include pip by default. To add pip to your Mac's system Python, you can run sudo easy_install pip
You can find and publish python packages using PyPI: The Python Package Index
The requirements.txt file is comparable to a Ruby Gemfile
To create a requirements text file, pip freeze > requirements.txt
Note, at this point, we have python installed on our system, and we have created a requirements.txt file that outlines all of the python packages that have been installed on your system.
pyenv: Python Version Manager
From the docs: pyenv lets you easily switch between multiple versions of Python. It's simple, unobtrusive, and follows the UNIX tradition of single-purpose tools that do one thing well. This project was forked from rbenv and ruby-build, and modified for Python.
Many folks hesitate to use python3.
If you need to use different versions of python, pyenv lets you manage this easily.
virtualenv: Python Environment Manager.
From the docs: The basic problem being addressed is one of dependencies and versions, and indirectly permissions. Imagine you have an application that needs version 1 of LibFoo, but another application requires version 2. How can you use both these applications? If you install everything into /usr/lib/python2.7/site-packages (or whatever your platform’s standard location is), it’s easy to end up in a situation where you unintentionally upgrade an application that shouldn’t be upgraded.
To create a virtualenv, simply invoke virtualenv ENV, where ENV is a directory in which to place the new virtual environment.
To start using the virtualenv, you need to source ENV/bin/activate. To stop using it, simply run deactivate.
Once you activate the virtualenv, you can install all of a workspace's package requirements by running pip install -r against the project's requirements.txt file.
Anaconda: Package Manager + Environment Manager + Additional Scientific Libraries.
Note that Anaconda is a commercial distribution of Python bundling the most popular Python libraries; its licence terms do not permit free use of Anaconda in an organisation with more than 200 employees.
From the docs: Anaconda 4.2.0 includes an easy installation of Python (2.7.12, 3.4.5, and/or 3.5.2) and updates of over 100 pre-built and tested scientific and analytic Python packages that include NumPy, Pandas, SciPy, Matplotlib, and IPython, with over 620 more packages available via a simple conda install <packagename>
As a web developer, I haven't used Anaconda. It's ~3GB including all the packages.
There is a slimmed-down Miniconda version, which seems like it could be a simpler option than using pip + virtualenv, although I don't have personal experience with it.
While conda allows you to install packages, these packages are separate from PyPI packages, so you may still need to use pip as well, depending on the types of packages you need to install.
See also:
conda vs pip vs virtualenv (section in documentation from anaconda)
the difference between pip and conda (stackoverflow)
the relationship between virtualenv and pyenv (stackoverflow)
Simple analogy:
pyenv ~ rbenv
pip ~ bundler
virtualenv ~ gemset in RVM. This can be managed by Bundler directly, without gemsets.
Since I use Python 3, I prefer the built-in virtual environment module, venv. venv is simple and easy to use. I would recommend reading its official docs; they are short and concise.
In Ruby, we don't really need a virtual environment because Bundler takes care of it. Both virtualenv and Bundler are great; however, they are different solutions to the same problem.
Simple explanation: https://docs.conda.io/projects/conda/en/latest/commands.html#conda-vs-pip-vs-virtualenv-commands
If you have used pip and virtualenv in the past, you can use conda to perform all of the same operations.
Pip is a package manager
virtualenv is an environment manager
Conda is both
import pkgutil

def get_pkgs():
    # walk every importable module/package on sys.path
    pkgs = []
    for importer, modname, ispkg in \
            pkgutil.walk_packages(path=None, onerror=lambda x: None):
        pkgs.append(modname)
    return pkgs
The code snippet above gives me all Python packages in the distribution. Can someone suggest a way to find out which of these packages are actually used by a Python application?
Snakefood is nice, as Sven Marnach has already answered, but it reports all file-level dependencies, and I'm not sure whether it will tell you just which packages are used rather than every single file dependency.
For packages available in the Python Package Index, you could use virtualenv (and pip, which comes with it) to get a simple list of required/used packages.
For example, assuming you also have the excellent virtualenvwrapper tools installed (highly recommended), below is a sequence that shows what the package requirements are for pylint:
$ mkvirtualenv pylint_dep_check --no-site-packages
New python executable in pylint_dep_check/bin/python
Installing setuptools............done.
$ pip freeze #Note the wsgiref 'bug' where it always shows up
wsgiref==0.1.2
$ workon pylint_dep_check
(pylint_dep_check) $ pip install pylint
(... snipped lengthy install text ...)
(pylint_dep_check) $ pip freeze
logilab-astng==0.21.1
logilab-common==0.55.0
pylint==0.23.0
unittest2==0.5.1
wsgiref==0.1.2
Note the all-important use of the --no-site-packages virtualenv creation option, which (surprise!) ensures that your virtualenv is completely fresh and has none of the site-packages from your distribution installed. This way it is clear what is needed for the app you installed.
If this is an application that you have developed, a nice way to keep track of your dependencies (and an excellent/clean way to work) is to set up the application in a clean virtualenv (created using the --no-site-packages option again) and then again use pip freeze to figure out what packages you've installed to make it work.
The ability to start a "fresh" python installation with the --no-site-packages option is extremely useful. I do it for all applications and to test out packages I'm interested in without cluttering my workspace(s).
If you aren't using virtualenv and pip yet, get on it already. Here is a good introduction:
http://mathematism.com/2009/07/30/presentation-pip-and-virtualenv/
The simplest way is to use the Python dependency analysis tool snakefood.
Being new to the Python game, I seem to have missed some knowledge about how you can develop on a program while also keeping it in your live environment.
Programs like gpodder can be run directly from the source checkout, which is really handy; however, others want to be "installed" to run.
A lot of programs are distributed with a setup.py and instructions to run "python ./setup.py install" as root, which will put stuff somewhere in your file system. There are even install commands like "develop" that seem to hold the promise of what I want. So I tried:
export PYTHONPATH=/home/alex/python
python ./setup.py develop --install-dir=/home/alex/python
This downloaded a bunch of stuff locally and seems to magically ensure that the application I'm hacking on is still run out of the src tree. So I guess my roundabout question is: is this the correct way of developing Python code? How do things like easy_install and pip factor into this?
So I tried the following:
python /usr/share/pyshared/virtualenv.py /home/alex/src/goobook
cd /home/alex/src/goobook/googbook.git
/home/alex/src/goobook/bin/python ./setup.py develop
And finally linked the program in question to my ~/bin
cd /home/alex/src/goobook
linkbin.pl bin/goobook
However, invocation throws up a load of extra chatter, which seems to imply it's wrong:
17:17 alex#socrates/i686 [goobook] >goobook --help
/home/alex/bin/goobook:5: UserWarning: Module pkg_resources was already imported from /home/alex/src/goobook/lib/python2.5/site-packages/setuptools-0.6c8-py2.5.egg/pkg_resources.py, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
/home/alex/bin/goobook:5: UserWarning: Module site was already imported from /home/alex/src/goobook/lib/python2.5/site.pyc, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
Install:
http://pypi.python.org/pypi/virtualenv
to set up a localized virtual environment for your libraries, and:
http://pypi.python.org/pypi/setuptools
i.e. "easy_install" to install new things.
Virtualenv allows you to work in completely independent and isolated Python environments. It will let you easily create multiple environments which have different Python packages installed or different versions of a same package. Virtualenv also lets you easily switch between your different environments.
As of 2012, the de facto preferred tool for package management in Python is pip rather than setuptools. Pip is able to handle dependencies and to install/uninstall globally or inside a virtual environment. Pip even comes out-of-the-box with virtualenv.
Python 3
Also worth mentioning is the fact that virtual environments are becoming a part of Python itself in release 3.3, with the implementation of PEP 405.
The Python Packaging User Guide, which "aims to be the authoritative resource on how to package, publish and install Python distributions using current tools", recommends using pip to install in "development mode":
pip install -e <path>
Thus in the root directory of your package you can simply
pip install -e .
See installing from a local source tree.
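For pip install -e . to have something to work with, the project root needs packaging metadata. Below is a minimal sketch of a setup.py; all names are placeholders, not taken from the question:
from setuptools import setup, find_packages
setup(
    name="myapp",                      # placeholder project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[],               # runtime dependencies go here
    entry_points={
        "console_scripts": [
            "myapp = myapp.cli:main",  # placeholder console-script entry point
        ],
    },
)
After pip install -e ., changes in the source tree are picked up immediately, because only a link to the tree is placed in site-packages rather than a copy.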
The best way to develop Python apps with dependencies is to:
Download the desired version of the Python interpreter.
Install and use buildout (http://www.buildout.org/).
Buildout is something like Maven for Java (will fetch all needed packages automatically).
This way your Python interpreter will not be polluted by third-party packages (this is important if you will be running the developed application on other machines). Additionally, you can integrate buildout with the virtualenv package (this lets you create a virtual Python interpreter for each project).