Is it possible to create a fully self-contained Python package?

The question
Ansible is a Python module, installable via pip. It relies on several dependencies, which are also pip packages. Is it possible to "roll up" all of those dependencies and Ansible itself into some sort of single package that can be installed offline, without root? It's highly preferable not to need pip for the install, although it will be available for package creation.
Extra background
I'm trying to install Ansible on one of our servers. The server does not have access to the internet, and there is no root access. Pip is not installed, but Python is. It is possible to get pip installed there, but it might be complicated. The only way to get anything onto the server is via an internal tar.gz package-sharing solution.
I've tried fiddling around with rpm, saving dependencies, but the absence of root access put an end to that.

Use pip on an internet-connected machine to download all the dependencies to a local directory with pip download -r requirements.txt (older pip versions used pip install --download), then drop that directory onto the disconnected machine (which does need pip installed) and install using --no-index and --find-links=<archive dir>.
See https://pip.pypa.io/en/latest/user_guide/#fast-local-installs
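A minimal sketch of that workflow (the directory name and requirements file are just examples):

# On a machine with internet access:
pip download -r requirements.txt -d ./offline-packages
# Copy ./offline-packages to the disconnected machine, then run there:
pip install --no-index --find-links=./offline-packages -r requirements.txt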

Related

Is there a way to automatically add dependencies to requirements.txt as they are installed?

Similar to how Node.js automatically adds dependencies to package-lock.json, is there a way I can automatically add requirements to my requirements.txt file for Python?
Since you mentioned Node.js specifically, the Python project that comes closest to what you're looking for is probably Pipenv.
Blurb from the Pipenv documentation:
Pipenv is a dependency manager for Python projects. If you're familiar with Node.js's npm or Ruby’s bundler, it is similar in spirit to those tools. While pip can install Python packages, Pipenv is recommended as it’s a higher-level tool that simplifies dependency management for common use cases.
It's quite a popular package among developers as the many stars on GitHub attest.
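For example, a single Pipenv command installs a package and records it (assuming Pipenv is already installed and you are inside your project directory):

pipenv install requests

This adds requests to the Pipfile and pins the resolved version in Pipfile.lock, which plays the role that package-lock.json plays for npm.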
Alternatively, you can use a "virtual environment" in which you only install the external dependencies that your project needs. You can either use the venv module from the standard library or the Virtualenv package from PyPI, which offers certain additional features (that you may or may not need). With either of those, you can then use Python's (standard) package manager Pip to update the requirements file:
pip freeze >requirements.txt
This is the "semi-automatic" way, so to speak. Personally, I prefer to do this manually. That's because in a typical development environment ("virtual" or not), you also install packages that are only required for development tasks, such as running tests or building the documentation. They don't need to be installed along with your package on end-user machines, so shouldn't be in requirements.txt. Popular packaging tools such as Flit and Poetry manage these "extra dependencies" separately, as does Pip.
If you are using Linux you can create an alias like this:
alias req='pip3 freeze > ~/requirements.txt'
And then, when you want to install a new package, chain it with the alias:
pip3 install <package> && req
I think, to-requirements.txt is what you need:
pip install to-requirements.txt
requirements-txt setup
After that, any package you install will be appended to requirements.txt, and any package you uninstall will be removed from it.
It might require root access if you install it into the system-wide Python interpreter; add sudo if it fails.

Keeping a private python package installed and up to date

This question concerns installing a Python package from a private Git repository.
I have an init.py file that is run whenever a user logs in to my service. This script is responsible for installing required packages, among others a Python package (with a setup.py) from a private repository.
I am looking for ways to:
Install the latest version of the package if not currently installed.
Update the package, if the current installed version is not the latest.
Perform no action, if the latest version of the package already is installed.
I have tried the following:
Using pip install --upgrade git+ssh://...; however, this always performs a clean install of the package.
Using pip install git+ssh://...; however, this will not update the package if the installed version is not the latest.
I am currently looking into ways of doing this manually (sketched below the list) by:
Git cloning the repository if it does not exist locally; then,
Calling python setup.py develop to install the package in develop mode; then finally,
Doing a git stash; git pull to discard any changes to the working directory and automatically pull the latest changes.
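Roughly, that would look like the following (the repository URL and paths are placeholders):

# Clone the repository if it does not exist locally, otherwise update it.
if [ ! -d ~/pkgs/mypackage ]; then
    git clone ssh://git@example.com/team/mypackage.git ~/pkgs/mypackage
    (cd ~/pkgs/mypackage && python setup.py develop)
else
    (cd ~/pkgs/mypackage && git stash && git pull)
fi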
However, I feel this approach is prone to users messing up.
I'd love if someone could provide some insight into this issue.
Thanks in advance!

Why use Pip or PyPI when you can just run the setup.py file

I am a novice Python user, not a computer scientist by training, but I have successfully managed to download several packages (as eggs, wheels, tar and gz files)
and get them installed by using 7-Zip to expose the setup.py file, then navigating to the setup.py in the command shell and running it to install the package.
I see lots of posts and videos about using PyPI or pip to handle the installation process but wonder why should I bother if I have a method that works.
What are the advantages to using PyPI or pip other than saving a few keystrokes?
Keep in mind that I do work behind a digital curtain imposed by the IT staff, I do not have admin privileges on my computer, and little or no access to the system settings.
I'm on Windows 7 and I'm using Python 2.7
pip is a package management system used to install and manage software packages written in Python.
It is not just about saving a few keystrokes: it installs, updates, and manages the packages in your environment, including their dependencies.
If you are new to it, try some commands like:
pip freeze : this will list all the installed packages
pip install PackageNameHere --upgrade : to upgrade an existing package
There is a lot more; for example, you can list every outdated package with pip list --outdated and then upgrade them.
These are just examples. And, as you have surely heard:
when in Rome, do as the Romans do.
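To make the advantage concrete, here is a hedged example (requests is just an illustrative package name): a single command fetches the package together with everything it depends on, and the --user flag avoids needing admin privileges:

pip install --user requests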

Using pip to install single-file python modules

I'm wondering if there's a way to "install" single-file python modules using pip (i.e. just have pip download the specified version of the file and copy it to site-packages).
I have a Django project that uses several 3rd-party modules which aren't proper distributions (django-thumbs and a couple others) and I want to pip freeze everything so the project can be easily installed elsewhere. I've tried just doing
pip install git+https://github.com/path/to/file.git
(and tried with the -e tag too) but pip complains that there's no setup.py file.
Edit: I should have mentioned - the reason I want to do this is so I can include the required module in a requirements.txt file, to make setting up the project on a new machine or new virtualenv easier.
pip requires a valid setup.py to install a python package. By definition every python package has a setup.py... What you are trying to install isn't a package but rather a single file module... what's wrong with doing something like:
git clone https://github.com/path/to/file.git /path/to/python/install/lib
I don't quite understand the logic behind wanting to install something that isn't a package with a package manager...

Python package install using pip or easy_install from repos

The simplest way I have found to deal with Python package installations, so far, has been to check out the source from the version control system and then add a symbolic link in the Python dist-packages folder.
Clearly, since source control provides complete control to downgrade or upgrade to any branch or tag, it works very well.
Is there a way, using one of the package installers (easy_install, pip, or another), to achieve the same?
easy_install obtains the tar.gz and installs it using setup.py install, which installs into the dist-packages folder in Python 2.6. Is there a way to configure it, or pip, to use a source version control system (SVN/Git/Hg/Bzr) instead?
Using pip this is quite easy. For instance:
pip install -e hg+http://bitbucket.org/andrewgodwin/south/#egg=South
Pip will automatically clone the source repo and run "setup.py develop" for you to install it into your environment (which hopefully is a virtualenv). Git, Subversion, Bazaar and Mercurial are all supported.
You can also then run "pip freeze" and it will output a list of your currently-installed packages with their exact versions (including, for develop-installs, the exact revision from the VCS). You can put this straight into a requirements file and later run
pip install -r requirements.txt
to install that same set of packages at the exact same versions.
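As a hedged illustration, for the editable VCS install above, the frozen requirement pins the exact revision and looks roughly like this (the revision hash here is made up):

-e hg+http://bitbucket.org/andrewgodwin/south@1a2b3c4d5e6f#egg=South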
If you download or check out the source distribution of a package — the one that has its "setup.py" inside of it — then, if the package is based on "setuptools" (which also powers easy_install), you can move into that directory and say:
$ python setup.py develop
and it will set up links in dist-packages so that the .py files in the source distribution are the ones that get imported, rather than copies installed separately (which is what "setup.py install" would do — create separate copies that don't change immediately when you edit the source code to try a change).
As the other response indicates, you should try reading the "setuptools" documentation to learn more. "setup.py develop" is a really useful feature! Try using it in combination with a virtualenv, and you can "setup.py develop" painlessly and without messing up your system-wide Python with packages you are only developing on temporarily:
http://pypi.python.org/pypi/virtualenv
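A rough sketch of that combination (the directory names are illustrative):

virtualenv env
source env/bin/activate
cd /path/to/source/checkout
python setup.py develop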
easy_install has support for downloading specific versions. For example:
easy_install python-dateutil==1.4.0
will install v1.4.0, while the latest version, 1.4.1, would be picked if no version were specified.
There is also support for svn checkouts, but using that doesn't give you many benefits over your manual approach. See the easy_install manual for more information.
Being able to switch to specific branches is rarely useful unless you are developing the packages in question, and then it's typically not a good idea to install them in site-packages anyway.
easy_install accepts a URL for the source tree too. Works at least when the sources are in Subversion.
