I have a Python package that is one of a collection of company Python packages. When I run
python setup.py install
I want the package to be installed to a common company directory, along with other company packages. I want this directory to be relative to the default Python install directory, e.g.,
/usr/lib/python2.7/site-packages/<company_name>/<python_package_name>
That is, I want to insert <company_name> into the installation path at install time.
I've seen ways to prefix this path, but can't seem to work out how to do what I've described.
Unfortunately, Python packaging doesn't work like that. You could probably bend it to behave that way, but it would be quite an effort for someone without experience in Python packaging, and even for experienced people the effort/payoff tradeoff would not make sense. You don't mention any motive for this besides personal preference.
Instead, to keep your installed packages well-managed and human-navigable, I recommend studying the following resources:
PEP 0382 - Namespace Packages: how to create packages like companyname.foobar and companyname.moomoo (see the sketch after this list)
Installing packages into a virtualenv - Python packaging installation guide (official)
Scrambler: Symlink namespaced Python packages to a single folder
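For example, here is a minimal sketch of a namespace package in the pkg_resources style that setuptools supported around PEP 382; companyname and foobar are placeholder names:

companyname-foobar/
    setup.py
    companyname/
        __init__.py
        foobar/
            __init__.py

# companyname/__init__.py -- the only content is the namespace declaration
__import__('pkg_resources').declare_namespace(__name__)

# setup.py
from setuptools import setup, find_packages

setup(
    name='companyname.foobar',
    version='0.1',
    packages=find_packages(),
    namespace_packages=['companyname'],
)

Every company package repeats the same one-line companyname/__init__.py, and they all install under a single companyname/ folder inside site-packages.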
Related
I have a Python project that uses two R packages. I have to use these packages because they don't exist for Python as of today. While my project works great, one obstacle is that users have to install these two packages using R (or RStudio) on their local systems. I was wondering if it is possible to add these package names to the Python project's requirements.txt file so that they get installed along with the other Python packages. Any leads on this are helpful... just trying to make it easy for the users of my project.
As essentially answered in the comments, Python and R have completely different packaging systems. It is not possible to add R packages to requirements.txt because it is used only for Python packages.
However, you could have your Python package's setup code install the R packages when your Python code is installed, or install them at runtime. In that case the R packages are installed using R's own packaging system, and nothing prevents you from listing them in a flat file (for example one called requirements_r.txt).
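A minimal sketch of the runtime approach, assuming Rscript is on the PATH and requirements_r.txt lists one CRAN package name per line (the helper name is hypothetical):

# install_r_deps.py -- hypothetical helper; requires R's Rscript on the PATH
import subprocess

def install_r_packages(path='requirements_r.txt'):
    with open(path) as f:
        packages = [line.strip() for line in f if line.strip()]
    for pkg in packages:
        # delegate installation to R's own packaging system
        subprocess.check_call([
            'Rscript', '-e',
            "install.packages('%s', repos='https://cloud.r-project.org')" % pkg,
        ])

if __name__ == '__main__':
    install_r_packages()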
A word of caution though. Installing a Python package that has the side effect of changing the directory of available R packages might be frowned upon by some.
I have an installable Python package (mypackage) and it needs to use specific versions of a number of dependencies. At the moment I have some .sh scripts that just pip-install these into an internal package folder (e.g. C:\Python27\Lib\site-packages\mypackage\site-packages). When mypackage executes, it adds this internal folder to the beginning of the Python path so that it overrides any other versions of the required dependencies elsewhere on the Python path.
I know this will only work if the user doesn't import the dependencies prior to importing mypackage, but I will document this.
I want to remove the .sh scripts and integrate the above into either the distutils install or the standard pip installation process. What is the best way to do this? I know about install_requires, but it does not seem to allow specifying a location.
I eventually found the virtualenv tool which solved the above problem much more elegantly than the solution I'd put in place:
https://docs.python.org/3/tutorial/venv.html
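A minimal sketch of that workflow, assuming your specific versions are pinned in a requirements.txt:

python -m venv env                  # create an isolated environment (see the linked tutorial)
env\Scripts\activate                # Windows; on Unix: source env/bin/activate
pip install -r requirements.txt     # pinned versions, e.g. somepackage==1.2.3

Each project then resolves its dependencies from its own env folder, so there is no need to manipulate the Python path at import time.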
I have 2-3 dozen Python projects on my local hard drive, and each one has its own virtualenv. The problem is that this adds up to a lot of space, and there are a lot of duplicated files, since most of my projects have similar dependencies.
Is there a way to configure virtualenv or pip to install packages into a common directory, with each package namespaced by the package version and Python version the same way Wheels are?
For example:
~/.cache/pip/common-install/django_celery-3.1.16-py2-none-any/django_celery/
~/.cache/pip/common-install/django_celery-3.1.17-py2-none-any/django_celery/
Then any virtualenv that needs django-celery can just symlink to the version it needs?
The whole point of virtualenv is to isolate and compartmentalize dependencies. What you are describing directly contradicts its use case. You could go into each individual project and modify the environment variables, but that's a hackish solution.
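For completeness, that environment-variable hack would look something like this, reusing the shared layout from the question (not recommended, per the above):

export PYTHONPATH=~/.cache/pip/common-install/django_celery-3.1.17-py2-none-any:$PYTHONPATH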
We are shipping our product to customer locations that may or may not have Python and other libraries installed. Can we reduce our Python script to an independent executable with Python and the other required libraries included? Are there other ideas?
You can use py2exe; it does exactly what you need, and it's very easy to use. I have used it on one of my projects, which is online and used daily.
http://www.py2exe.org/
and here is their tutorial:
http://www.py2exe.org/index.cgi/Tutorial
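A minimal sketch of the setup.py the tutorial walks you through, assuming your entry point is a script called myscript.py (a placeholder name):

# setup.py -- build a standalone Windows executable with py2exe
from distutils.core import setup
import py2exe  # importing py2exe registers the 'py2exe' command

setup(console=['myscript.py'])

Run python setup.py py2exe and the resulting executable appears in the dist/ folder.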
You can deliver a package with Python and then apply one of these two methods:
Package With python + virtualenv
There are many solutions for that. One I like is virtualenv, which lets you deploy a specific configuration of a Python project (with its dependencies) on other machines.
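A minimal sketch (Unix-style activation shown):

virtualenv env                # create an isolated environment in ./env
. env/bin/activate            # activate it; on Windows: env\Scripts\activate
pip install django==1.5.4     # installs into ./env only, with a pinned version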
Package With python + pip
Another way is to use pip and write a requirements.txt file at the root of your project, which contains every dependency (1 per line), for example:
django>=1.5.4
pillow
markdown
django-compressor
By running pip install -r requirements.txt in the root directory, pip will install the packages needed.
See also:
How do you use pip, virtualenv and Fabric to handle deployment?
Pip installer documentation
Virtualenv documentation
I want to use the default (no site packages) of virtualenv.
But some modules are difficult to install in a virtualenv (for example gtk). By "difficult" I mean that you need a lot of C header files installed, and a lot of stuff needs to be compiled.
I know that I can solve this by not installing these packages with pip, but instead creating symlinks to make some modules available from the global site-packages directory.
But is this the right direction?
Is there a way to create the symlinks with pip or virtualenv?
Update
In 2013 I wanted some modules like psycopg2, gtk, and python-ldap, which are installed on my Linux server via rpm/dpkg, to be available in the virtualenv.
Symlinking and other workarounds made things more complicated, not simpler. We use this option today (2017):
--system-site-packages
Give the virtual environment access to the global site-packages.
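For example:

virtualenv --system-site-packages env
# or, with the built-in venv module:
python3 -m venv --system-site-packages env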
I'd say yeah, that's the right direction.
Your question sounds similar to something I dealt with: installing OpenCV into a virtualenv. My problem was that OpenCV wasn't available via pip (the Python Package Index). What I ended up doing was querying the system-wide global Python installation for the module in question, and then copying the .so into my virtualenv.
The whole process, including the boilerplate Makefile I used, are captured here: https://stackoverflow.com/a/19213369/1510289
You could do something similar by symlinking instead of copying. The reason I ended up copying the library was that I use Make, and Make doesn't handle dependencies for symbolic links the way I needed (as explained in the URL above).
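A rough sketch of the copy approach, with illustrative paths (your module, paths, and Python version will differ):

# find the module in the system-wide Python installation
/usr/bin/python -c 'import cv2; print(cv2.__file__)'
# prints e.g. /usr/lib/python2.7/dist-packages/cv2.so

# copy (or symlink) it into the active virtualenv
cp /usr/lib/python2.7/dist-packages/cv2.so $VIRTUAL_ENV/lib/python2.7/site-packages/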
Hope this helps...
How are you compiling each of these 'hard' packages from scratch?
Are you doing something like:
python setup.py install
If so, replace that with:
python setup.py bdist_wheel
Then look in the ./dist directory for a .whl file. Take whatever that file is called and, after activating the environment, run:
pip install ./dist/whateverTheFileIsCalled.whl