Creating a Python module with its dependencies self-contained - python

When creating a Python module, you can specify your module's dependencies using the install_requires list.
Let's look at this basic example:
from setuptools import setup, find_packages

setup(name='some_module',
      version='0.0.1',
      packages=find_packages(),
      install_requires=[
          'requests==2.21.0'
      ])
I package my module with python3 setup.py sdist and upload it to a package repository.
But when I go to install it with pip3 install some_module==0.0.1, it installs requests==2.21.0 globally into my python3 site-packages/.
My question: how do I get functionality similar to npm's nested node_modules/, where my Python module would have its own site-packages/ and reference its local copy of requests instead of overwriting my global version?
Thanks!

I would consider using PyInstaller for this task!
Directly from PyInstaller's documentation:
PyInstaller bundles a Python application and all its dependencies into a single package. The user can run the packaged app without installing a Python interpreter or any modules. PyInstaller supports Python 2.7 and Python 3.4+, and correctly bundles the major Python packages such as numpy, PyQt, Django, wxPython, and others.
You can install it with a simple pip install:
pip install PyInstaller
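Once installed, bundling is usually a single command; as a rough sketch (the entry-point script name here is just a placeholder):
pyinstaller --onefile your_script.py
This produces a self-contained executable under dist/ that carries its own copies of the dependencies, so nothing touches your global site-packages/.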
For a more detailed tutorial, check out this link.
Hopefully that helps!

Related

How to include library dependencies with a python project?

I'm working on a program, made in Python, which a few other people are going to be using. They all have Python installed; however, there are several libraries that this program imports from.
I would ideally like to be able to simply send them the Python files and have them be able to just run it, instead of having to tell them each of the libraries they have to install and ask them to get each one manually using pip.
Is there any way I can include all the libraries that my project uses with my Python files (or perhaps set up an installer that installs them for them)? Or do I just have to give them a full list of Python libraries they must install and get them to do each one manually?
That's the topic of the Python packaging tutorial.
In brief, you pass them as install_requires parameter to setuptools.setup().
You can then generate a package in many formats including wheel, egg, and even a Windows installer package.
Using the standard packaging infrastructure will also give you the benefit of easy version management.
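For example, once install_requires is filled in, building the distributable artifacts is roughly (a sketch; bdist_wheel needs the wheel package installed, and twine is the usual upload tool):
pip install wheel twine
python setup.py sdist bdist_wheel
twine upload dist/*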
You can build a package and specify dependencies in your setup.py. This is the right way:
http://python-packaging.readthedocs.io/en/latest/dependencies.html
from setuptools import setup

setup(name='funniest',
      version='0.1',
      description='The funniest joke in the world',
      url='http://github.com/storborg/funniest',
      author='Flying Circus',
      author_email='flyingcircus@example.com',
      license='MIT',
      packages=['funniest'],
      install_requires=[
          'markdown',
      ],
      zip_safe=False)
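With install_requires declared like that, installing the package (whether from PyPI or from a local sdist/wheel) should pull in markdown automatically, e.g.:
pip install funniest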
The "need it now" hack option is a library that lets your script interact with the shell; pexpect is my favorite for automating shell interactions.
https://pexpect.readthedocs.io/en/stable/
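A minimal, hypothetical sketch of that "install it at runtime" hack with pexpect (the package name and timeout are placeholders, not something this answer prescribes):

import pexpect

# Drive "pip install" from inside the script and wait for it to finish.
child = pexpect.spawn('pip install requests', timeout=120)
child.expect(pexpect.EOF)      # block until pip exits
print(child.before.decode())   # show pip's output

For plain non-interactive commands like this, subprocess from the standard library works just as well; pexpect really shines when the command prompts for input.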
You should use virtualenv to manage your project's packages. If you do, you can then run pip freeze > requirements.txt to save all of the project's dependencies. After this, requirements.txt will contain everything necessary to run your app. Add this file to your repository. All packages from requirements.txt can be installed with
pip install -r requirements.txt
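For completeness, a rough sketch of the whole workflow (the venv directory name and the packages installed are just placeholders):
python3 -m venv venv          # or: virtualenv venv
source venv/bin/activate
pip install requests flask    # whatever the project actually needs
pip freeze > requirements.txt
On another machine (or in a fresh virtualenv), the pip install -r command above recreates exactly those versions.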
The other option is to create a PyPI package. You can find a tutorial on this topic here.

Does building pure python modules w/ conda require setuptools?

This weekend I've been reading up on conda and the python packaging user guide because I have a simple pure python project that depends on numpy. It seemed to me that distributing/installing this project via conda was better than pip due to this dependency.
One thing on which I'm still not clear: conda will install a python package from a recipe in build.sh, but it seems like build.sh just ends up calling python setup.py install for most python packages.
So even if I want to distribute/install my python package with conda, I still end up depending on setuptools (or distutils) for the actual installation, correct? I was unable to find a conda utility analogous to setuptools; am I missing something?
FWIW, I posted this question on the conda issue tracker.
Thanks!
Typically you will still be using distutils (or setuptools, if the library requires it) to install things, yes. It is not technically required; the build.sh can be anything. If you wanted to, you could just copy the code into site-packages. Using setup.py install is recommended, though: libraries will already have a working setup.py, it installs metadata that can be read by pip, and it compiles any extension modules and installs any data files.
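For a pure-Python recipe, build.sh is therefore often a one-liner along these lines (a sketch; conda-build exposes the build environment's interpreter as $PYTHON):
$PYTHON setup.py install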

Python Windows Installer with all dependencies?

I have a package in the PyPI repository. I include a Windows installer by running the following command when uploading a new version (specifically, the 'bdist_wininst' part):
python3 setup.py register sdist bdist_wininst upload
However, when a user runs the associated .exe file, it does not install Python 3 itself. Furthermore, even if Python 3 is installed, it will not install any associated dependencies.
What is the best way to create a Windows installer that will install Python 3 if it is not installed, along with my package and its dependencies?
If that is not possible, what is the best way to create a Windows installer that will install my package and its dependencies, assuming Python 3 is installed?
I'm on Ubuntu 12.04. If it's of any assistance, here is my setup.py:
from distutils.core import setup
import codecs

try:
    codecs.lookup('mbcs')
except LookupError:
    ascii = codecs.lookup('ascii')
    func = lambda name, enc=ascii: {True: enc}.get(name == 'mbcs')
    codecs.register(func)

setup(
    name='SIGACTor',
    version='0.1.14dev',
    description=open('README.txt').read(),
    url='http://bitbucket.org/davidystephenson/sigactor',
    author='David Y. Stephenson',
    author_email='david@davidystephenson.com',
    packages=['sigactor'],
    license='Proprietary',
    long_description=open('README.txt').read(),
    install_requires=[
        'beautifulsoup4',
        'feedparser',
        'python-dateutil',
        'pyyaml'
    ],
)
You should definitely try out pynsist, which can bundle Python with your packages and is based on the well-established NSIS open-source installer:
https://pypi.python.org/pypi/pynsist
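pynsist is driven by a small installer.cfg; a hedged sketch with placeholder names and versions (check the pynsist docs for the exact keys available in your version):

[Application]
name=My App
version=1.0
entry_point=myapp:main

[Python]
version=3.6.3

[Include]
packages = requests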
The Anaconda team provides Constructor, which is again based on conda and NSIS:
https://github.com/conda/constructor
Finally, there is this approach using WinPython and the very stable InnoSetup installer:
http://cyrille.rossant.net/create-a-standalone-windows-installer-for-your-python-application/
But if your package is not a library but an application, then you can bundle (freeze) it with Python and all its dependencies, and even compress it, using PyInstaller:
http://www.pyinstaller.org
This is what I use for all of my apps, even with crazy interop dependencies!
Bonus: an auto-update tool for PyInstaller:
https://github.com/JMSwag/PyUpdater

Using pip to install single-file python modules

I'm wondering if there's a way to "install" single-file python modules using pip (i.e. just have pip download the specified version of the file and copy it to site-packages).
I have a Django project that uses several 3rd-party modules which aren't proper distributions (django-thumbs and a couple others) and I want to pip freeze everything so the project can be easily installed elsewhere. I've tried just doing
pip install git+https://github.com/path/to/file.git
(and tried with the -e flag too) but pip complains that there's no setup.py file.
Edit: I should have mentioned - the reason I want to do this is so I can include the required module in a requirements.txt file, to make setting up the project on a new machine or new virtualenv easier.
pip requires a valid setup.py to install a Python package. By definition every Python package has a setup.py... What you are trying to install isn't a package but rather a single-file module... what's wrong with doing something like:
git clone https://github.com/path/to/file.git /path/to/python/install/lib
I don't quite understand the logic behind wanting to install something that isn't a package with a package manager...
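That said, one way to make a single-file module pip-installable anyway (and therefore usable in requirements.txt) is to add a minimal setup.py to a fork of the repository, using py_modules; a sketch with placeholder names:

from setuptools import setup

# Packages a single thumbs.py sitting next to this setup.py.
# Name and version are placeholders.
setup(
    name='django-thumbs-fork',
    version='0.1',
    py_modules=['thumbs'],
)

With that committed, pip install git+https://github.com/you/your-fork.git works and the line can go straight into requirements.txt.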

Including Bash autocompletion with setuptools

I've got a couple packages in PyPI, and I'd like to include autocompletion features with both of them. How would you check that Bash autocompletion should be installed at all (check for /etc/bash_completion, maybe?), and how would you install it with setup.py (preferably using setuptools)?
If you're going to require OS-level packages (e.g. bash-completion), then you should distribute your library as an OS-level package. That is, in .deb, .rpm, etc. Some tips here:
Debian New Maintainer's Guide
Fedora Packaging Guidelines
As part of the package generation, you can call your setuptools script to install the Python code. To ensure bash-completion is installed, you can specify it as a required package.
You can use the data_files option:
from setuptools import setup

setup(
    ...
    data_files=[
        ('/etc/bash_completion.d/', ['extra/some_completion_script']),
    ]
)
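Note that with an absolute target path like /etc/bash_completion.d/ the install generally needs root privileges; after installing, you can load the script in the current shell to try it out, e.g.:
source /etc/bash_completion.d/some_completion_script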
