I'm trying to maintain dependencies using pip install -r requirements.txt. However, some of the required packages do not support Python 3 directly, but can be converted manually using 2to3.
Is there a way to force pip to run 2to3 on those packages automagically when doing pip install -r requirements.txt?
No, it needs to be part of the package setup configuration instead. See Supporting both Python 2 and 3 with Distribute.
You add metadata to your package installer:
setup(
    name='your.module',
    version='1.0',
    description='This is your awesome module',
    author='You',
    author_email='your@email',
    package_dir={'': 'src'},
    packages=['your', 'your.module'],
    test_suite='your.module.tests',
    use_2to3=True,
    convert_2to3_doctests=['src/your/module/README.txt'],
    use_2to3_fixers=['your.fixers'],
    use_2to3_exclude_fixers=['lib2to3.fixes.fix_import'],
)
Such a package would then automatically run 2to3 on installation into a Python 3 system.
2to3 is a tool, not a magic bullet; you cannot apply it to an arbitrary package pip downloads from PyPI. The package needs to support it in the way it is coded. Thus, running it automatically from pip is not going to work; the responsibility lies with the package maintainer.
Note that just because 2to3 runs successfully on a package, it does not necessarily follow that the package will work in Python 3. Assumptions about bytes vs. unicode usually crop up when you actually start using the package.
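For illustration, here is a contrived sketch of the sort of mismatch 2to3 leaves untouched, because the code is already syntactically valid Python 3:
data = open('payload.bin', 'rb').read()  # bytes on Python 3, str on Python 2
text = 'header: '
combined = text + data  # works on Python 2, raises TypeError on Python 3; 2to3 does not rewrite it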
Contact the maintainers of the packages you are interested in and ask what the status is for that package for Python 3. Supplying patches to them usually helps. If such requests and offers for help fall on deaf ears, for Open Source packages you can always fork them and apply the necessary changes yourself.
Related
I'm working on a Python program that a few other people are going to be using. They all have Python installed; however, there are several libraries that this program imports from.
Ideally, I would like to be able to simply send them the Python files and have them just run the program, instead of having to tell them which libraries they have to install and asking them to get each one manually using pip.
Is there any way I can include all the libraries that my project uses with my Python files (or perhaps set up an installer that installs them for them)? Or do I just have to give them a full list of Python libraries they must install and get them to do each one manually?
That's the topic of the Python packaging tutorial.
In brief, you pass them as the install_requires parameter to setuptools.setup().
You can then generate a package in many formats including wheel, egg, and even a Windows installer package.
Using the standard packaging infrastructure will also give you the benefit of easy version management.
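For example, building the formats mentioned above is a couple of commands (a sketch, assuming setuptools and the wheel package are installed in your build environment):
python setup.py sdist bdist_wheel
python setup.py bdist_wininst
The first command produces a source archive and a wheel; the second produces a Windows installer, if you need one.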
You can build a package and specify dependencies in your setup.py. This is the right way:
http://python-packaging.readthedocs.io/en/latest/dependencies.html
from setuptools import setup

setup(name='funniest',
      version='0.1',
      description='The funniest joke in the world',
      url='http://github.com/storborg/funniest',
      author='Flying Circus',
      author_email='flyingcircus@example.com',
      license='MIT',
      packages=['funniest'],
      install_requires=[
          'markdown',
      ],
      zip_safe=False)
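With a setup.py like the one above in place, installing the project also pulls in its declared dependencies; for example (assuming you run it from the project directory):
pip install .
This installs funniest and, because of install_requires, markdown as well.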
The "need it now" hack option is to use a library that lets your script interact with the shell. pexpect is my favorite for automating shell interactions.
https://pexpect.readthedocs.io/en/stable/
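As a rough sketch of that hack (not a recommendation over proper packaging; the package name is only an example), your script could shell out through pexpect and install a missing dependency on the fly:
import pexpect

# Run pip in a child process and wait for it to finish.
# Assumes pip is on the PATH of whoever runs the script.
child = pexpect.spawn('pip install markdown')
child.expect(pexpect.EOF, timeout=300)
child.close()
print('pip exited with status %s' % child.exitstatus)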
You should use virtualenv to manage your project's packages. If you do, you can then run pip freeze > requirements.txt to save all the dependencies of the project. After this, requirements.txt will contain all the requirements necessary to run your app. Add this file to your repository. All packages from requirements.txt can then be installed with
pip install -r requirements.txt
The other option is to create a PyPI package. You can find a tutorial on this topic here.
I've created a new RPM using python setup.py bdist_rpm. Normally python setup.py install would install Python dependencies like websocket-client or any other package, but the RPM just refuses to install anything.
Apparently the suggestion from various other posts seems to be along the lines of just requiring them in setup.cfg as RPM packages. This doesn't make sense to me, since most of those RPM packages seem to be at really old versions and I can't possibly create RPMs for all the Python packages I require. I need much more recent versions, and it doesn't make sense that the yum installs don't actually install the packages.
What is the right (cleanest and easiest) way to do it? I believe if a setup.py has something like
install_requires=[
    "validictory",
    "requests",
    "netlogger>=4.3.0",
    "netifaces",
    "pyzmq",
    "psutil",
    "docopt"
],
then it should either try to include them in the RPM or try to install them.
I am testing on a clean CentOS VM using Vagrant, which I keep destroying and recreating before installing the RPM.
Well, the super-hack way I used was to just add a post-install script that installs all the requirements with easy_install (instead of pip, because older systems may not have pip, and even after installing pip the approach failed on systems with Python 2.6):
# Add this in setup.py
options={'bdist_rpm': {'post_install': 'scripts/rpm_postinstall.sh'}},
Then the script is as follows:
easy_install -U <pkgnames>
Of course, a post_uninstall script can also be added if you want to clean up, which I wouldn't, because you have no clue what else is using the packages installed apart from this app.
The logic of the RPM approach seems to be made for this, but it's honestly over-engineering, and I'd rather package all the modules with the RPM to ensure it always works. ** Screaming out for a cleaner solution **
I'm working on creating a Python package that is somewhat modular in nature and I was wondering what's the best way to go about handling multiple installation configurations?
Take the following setup.py for the package of a simple document parser:
setup(
    name = 'my_document_parser',
    ...
    entry_points = {
        'my_document_parser.parsers': [
            'markdown = my_document_parser.parsers.misaka:Parser [misaka]',
            'textile = my_document_parser.parsers.textile:Parser [textile]'
        ]
    },
    install_requires = [
        'some_required_package'
    ],
    extras_require = {
        'misaka': ['misaka'],
        'textile': ['textile']
    },
    ...
)
I'd like to make something like the following available:
A default install
python setup.py install or pip install my_document_parser
installs my_document_parser, some_required_package, misaka
A bare install
something like python setup.py install --bare
installs my_document_parser, some_required_package
A way to enable other dependencies
something like python setup.py install --bare --with-textile
installs my_document_parser, some_required_package, textile
This is my first time messing around with Python packages from a developer standpoint, so I'm open to anything anyone has to offer if I'm going about things the wrong way or such.
Thanks.
The default installation will install everything listed in your install_requires and packages sections. Packages under extras_require will never be installed by your setup.py script unless you create some logic for that kind of installation.
By default, setuptools and distutils have no way to explicitly say "install using these sections of my setup() call"; you would need to implement that yourself.
pip always runs setup.py with no parameters other than --single-version-externally-managed, which tells setuptools never to create a .egg file.
Conclusion: if you want people to install your package by using pip, forget --bare and --with-textile. The best you can do is a full installation including misaka and textile as dependencies.
Maybe you should read The Hitchhiker's Guide To Packaging, specifically the Creating a Package section, and setuptools' specification -- it is a pretty good one and covers everything setuptools can do.
A tweet reads:
Don't use easy_install, unless you
like stabbing yourself in the face.
Use pip.
Why use pip over easy_install? Doesn't the fault lie mostly with PyPI and package authors? If an author uploads a crap source tarball (e.g. missing files, no setup.py) to PyPI, then both pip and easy_install will fail. Other than cosmetic differences, why do Python people (like in the above tweet) seem to strongly favor pip over easy_install?
(Let's assume that we're talking about easy_install from the Distribute package, which is maintained by the community.)
From Ian Bicking's own introduction to pip:
pip was originally written to improve on easy_install in the following ways
All packages are downloaded before installation. Partially-completed installation doesn’t occur as a result.
Care is taken to present useful output on the console.
The reasons for actions are kept track of. For instance, if a package is being installed, pip keeps track of why that package was required.
Error messages should be useful.
The code is relatively concise and cohesive, making it easier to use programmatically.
Packages don’t have to be installed as egg archives, they can be installed flat (while keeping the egg metadata).
Native support for other version control systems (Git, Mercurial and Bazaar)
Uninstallation of packages.
Simple to define fixed sets of requirements and reliably reproduce a set of packages.
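To illustrate that last point, a requirements file is just a plain-text list of packages, optionally pinned to exact versions (the names and versions here are only examples):
Flask==0.10.1
requests==2.7.0
Running pip install -r requirements.txt then reliably reproduces exactly that set of packages.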
Many of the answers here are out of date for 2015 (although the initially accepted one from Daniel Roseman is not). Here's the current state of things:
Binary packages are now distributed as wheels (.whl files)—not just on PyPI, but in third-party repositories like Christoph Gohlke's Extension Packages for Windows. pip can handle wheels; easy_install cannot.
Virtual environments (which come built-in with 3.4, or can be added to 2.6+/3.1+ with virtualenv) have become a very important and prominent tool (and recommended in the official docs); they include pip out of the box, but don't even work properly with easy_install.
The distribute package that included easy_install is no longer maintained. Its improvements over setuptools got merged back into setuptools. Trying to install distribute will just install setuptools instead.
easy_install itself is only quasi-maintained.
All of the cases where pip used to be inferior to easy_install—installing from an unpacked source tree, from a DVCS repo, etc.—are long-gone; you can pip install ., pip install git+https://.
pip comes with the official Python 2.7 and 3.4+ packages from python.org, and a pip bootstrap is included by default if you build from source.
The various incomplete bits of documentation on installing, using, and building packages have been replaced by the Python Packaging User Guide. Python's own documentation on Installing Python Modules now defers to this user guide, and explicitly calls out pip as "the preferred installer program".
Other new features have been added to pip over the years that will never be in easy_install. For example, pip makes it easy to clone your site-packages by building a requirements file and then installing it with a single command on each side. Or to convert your requirements file to a local repo to use for in-house development. And so on.
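As a rough illustration of that last point (exact flags can vary between pip versions), you can collect wheels for everything in a requirements file into a local directory and later install from it without touching PyPI:
pip wheel -r requirements.txt -w wheelhouse
pip install --no-index --find-links=wheelhouse -r requirements.txt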
The only good reason that I know of to use easy_install in 2015 is the special case of using Apple's pre-installed Python versions with OS X 10.5-10.8. Since 10.5, Apple has included easy_install, but as of 10.10 they still don't include pip. With 10.9+, you should still just use get-pip.py, but for 10.5-10.8, this has some problems, so it's easier to sudo easy_install pip. (In general, easy_install pip is a bad idea; it's only for OS X 10.5-10.8 that you want to do this.) Also, 10.5-10.8 include readline in a way that easy_install knows how to kludge around but pip doesn't, so you also want to sudo easy_install readline if you want to upgrade that.
Another, as yet unmentioned, reason for favoring pip is that it is the new hotness and will continue to be used in the future.
The infographic below, from the Current State of Packaging section in The Hitchhiker's Guide to Packaging v1.0, shows that setuptools/easy_install will go away in the future.
Here's another infographic from distribute's documentation showing that Setuptools and easy_install will be replaced by the new hotness—distribute and pip. While pip is still the new hotness, Distribute merged with Setuptools in 2013 with the release of Setuptools v0.7.
Two reasons; there may be more:
pip provides an uninstall command
if an installation fails in the middle, pip will leave you in a clean state.
REQUIREMENTS files.
Seriously, I use this in conjunction with virtualenv every day.
QUICK DEPENDENCY MANAGEMENT TUTORIAL, FOLKS
Requirements files allow you to create a snapshot of all packages that have been installed through pip. By encapsulating those packages in a virtual environment, you can have your codebase work off a very specific set of packages and share that codebase with others.
From Heroku's documentation https://devcenter.heroku.com/articles/python
You create a virtual environment, and set your shell to use it. (bash/*nix instructions)
virtualenv env
source env/bin/activate
Now all Python scripts run from this shell will use this environment's packages and configuration. Now you can install a package locally for this environment without needing to install it globally on your machine.
pip install flask
Now you can dump the info about which packages are installed with
pip freeze > requirements.txt
If you check that file into version control, then when someone else gets your code, they can set up their own virtual environment and install all the dependencies with:
pip install -r requirements.txt
Any time you can automate tedium like this is awesome.
pip won't install binary packages and isn't well tested on Windows.
As Windows doesn't come with a compiler by default, pip often can't be used there. easy_install can install binary packages for Windows.
UPDATE: setuptools has absorbed distribute as opposed to the other way around, as some thought. setuptools is up-to-date with the latest distutils changes and the wheel format. Hence, easy_install and pip are more or less on equal footing now.
Source: http://pythonhosted.org/setuptools/merge-faq.html#why-setuptools-and-not-distribute-or-another-name
As an addition to fuzzyman's reply:
pip won't install binary packages and isn't well tested on Windows.
As Windows doesn't come with a compiler by default, pip often can't be
used there. easy_install can install binary packages for Windows.
Here is a trick on Windows:
you can use easy_install <package> to install binary packages to avoid building a binary
you can use pip uninstall <package> even if you used easy_install.
This is just a workaround that works for me on Windows.
Actually I always use pip if no binaries are involved.
See the current pip docs: http://www.pip-installer.org/en/latest/other-tools.html#pip-compared-to-easy-install
I will ask on the mailing list what is planned for that.
Here is the latest update:
The new supported way to install binaries is going to be wheel!
It is not yet in the standard, but almost. The current version is still an alpha: 1.0.0a1
https://pypi.python.org/pypi/wheel
http://wheel.readthedocs.org/en/latest/
I will test wheel by creating an OS X installer for PySide using wheel instead of eggs. Will get back and report about this.
cheers - Chris
A quick update:
The transition to wheel is almost over. Most packages support wheel now.
I promised to build wheels for PySide, and I did that last summer. Works great!
HINT:
A few developers have so far failed to support the wheel format, simply because they forgot to replace distutils with setuptools.
Often, it is easy to convert such packages by replacing this single word in setup.py.
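In practice that often means a one-line change in setup.py, roughly like this (assuming the wheel package is installed so the bdist_wheel command is available):
# from distutils.core import setup  # old import: no bdist_wheel command
from setuptools import setup  # setuptools (plus wheel) provides bdist_wheel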
I just met one special case where I had to use easy_install instead of pip, or else pull the source code directly.
For the package GitPython, the version available via pip is too old (0.1.7), while the one from easy_install is the latest (0.3.2.rc1).
I'm using Python 2.7.8. I'm not sure about the underlying mechanisms of easy_install and pip, but at least the versions of some packages may differ between them, and sometimes easy_install is the one with the newer version.
easy_install GitPython
This is somewhat related to this question. Let's say I have a package that I want to deploy via rpm because I need to do some file copying on post-install and I have some non-python dependencies I want to declare. But let's also say I have some python dependencies that are easily available in PyPI. It seems like if I just package as an egg, an unzip followed by python setup.py install will automatically take care of my python dependencies, at the expense of losing any post-install functionality and non-python dependencies.
Is there any recommended way of doing this? I suppose I could specify this in a pre-install script, but then I'm getting into information duplication and not really using setuptools for much of anything.
(My current setup involves passing install_requires = ['dependency_name'] to setup, which works for python setup.py bdist_egg and unzip my_package.egg; python my_package/setup.py install, but not for python setup.py bdist_rpm --post-install post-install.sh and rpm --install my_package.rpm.)
I think it would be best if your python dependencies were available as RPMs also, and declared as dependencies in the RPM. If they aren't available elsewhere, create them yourself, and put them in your yum repository.
Running PyPI installations as a side effect of RPM installation is evil, as it won't support proper uninstallation (i.e. uninstalling your RPM will remove your package, but leave the dependencies behind, with no proper removal procedure).
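For completeness, the bdist_rpm command does let you declare RPM-level dependencies, so a rough sketch of that approach (the RPM package names below are hypothetical and depend on your distribution's naming) would be:
# setup.cfg
[bdist_rpm]
requires = python-requests python-psutil
An RPM built with python setup.py bdist_rpm then carries those names in its Requires header, so yum/rpm resolves them at install time instead of pip.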