Background
Windows does not include a compiler by default, and installing a compiler (and perhaps configuring Python to use it) is a complicated enough task that many developers avoid it. For this reason, many packages that have binary dependencies are available as precompiled Windows executables that bundle the compiled files. psycopg2 is one example.
The executable file is an installer. When executed, it provides a graphical interface that locates an installed version of Python via the registry, and it installs the Python library and the included binary dependencies in the global Python installation.
However, this is not always desirable. Particularly when using virtualenv, developers don't want to install the library globally. They want the library installed in the virtual environment. Since this environment is not represented in the registry, the graphical installer cannot locate it. Luckily, a command similar to the following can be used to install the library to the virtual environment:
C:\> C:\virtualenv\Scripts\activate.bat
(virtualenv) C:\> easy_install psycopg2-2.5.win32-py2.7-pg9.2.4-release.exe
Note that this works regardless of whether easy_install comes from setuptools or distribute.
The actual question
Why does this command work? What is it about the exe that allows easy_install to process it?
I have noticed that the exe seems to be some kind of zip file. 7-Zip is able to open it for browsing, and the exe files that easy_install can process seem to share a common layout: a top-level directory named PLATLIB containing an egg-info file or folder and one or more additional folders. Are these exes just Python eggs with some kind of executable wrapper around them? How might I produce one myself? (Or, to word it differently, is there some standard way of producing exes like this?)
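(From what I can tell, these may be what distutils' bdist_wininst command produces, i.e. something like:

C:\> python setup.py bdist_wininst

run against the package's setup.py, which seems to emit a zip archive of PLATLIB/SCRIPTS/DATA directories with a graphical installer stub prepended. I'd appreciate confirmation.)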
Edit
Bonus question: Why doesn't pip work with these files?
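(I have seen that the wheel project can repack such an installer into something pip accepts, e.g.:

C:\> pip install wheel
C:\> wheel convert psycopg2-2.5.win32-py2.7-pg9.2.4-release.exe
C:\> pip install <resulting .whl>

but that sidesteps the question rather than answering it.)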
Related
My goal is to embed the python interpreter using PyBind11, but to use the interpreter from a virtual env, such that installing dependencies using pip does not clutter the system paths.
There is not much information online about this topic. The closest I found, "Embedding python with pybind11. Virtual environment doesn't work", hardcodes the venv at compile time. That is insufficient here, since the venv does not exist at compile time; it is only created when the scripting engine is enabled at runtime.
The plan for now is to pip install --target into a cache directory, add that directory to sys.path, and use the system interpreter. This is "okayish", but not depending on the system interpreter would be preferable.
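For concreteness, the plan looks roughly like this (the cache path is a hypothetical placeholder):

C:\> python -m pip install --target C:\app\pkg-cache <dependencies>

and then, inside the embedded interpreter:

import sys
sys.path.insert(0, r"C:\app\pkg-cache")  # hypothetical cache dir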
You do not need a virtual environment to ship isolated dependencies with your software, or to use them locally; you simply need what you already have: an isolated lib folder.
pybind11 does not have its own interpreter (AFAIK); it embeds the one from the Python build you're using when compiling, so you should already have a matching version of the Python interpreter you're embedding (this is the executable you pass to PYTHON_EXECUTABLE if you're using CMake).
The only thing you should change is to not manipulate sys.path directly but rather call site.addsitedir, since appending to sys.path will not handle everything (e.g., .pth files), while site.addsitedir will. So, simply:
py::module_::import("site").attr("addsitedir")(/* whatever */);
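In plain Python terms, that call is equivalent to something like the following (the venv path here is a hypothetical example; in your case it would be resolved at runtime):

import site
site.addsitedir(r"C:\app\venv\Lib\site-packages")  # also processes .pth files in this directory

which is exactly why .pth-based packages keep working with this approach.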
I have a Jupyter notebook script that will be used to teach others how to use python.
Instead of asking each participant to install the required packages, I would like to provide a folder with the environment ready from the start.
How can I do this?
What is the easiest way to teach python without running into technical problems with packages/environments etc.?
If you just need to install Python dependencies, you can use @Aero Blue's solution. However, the users would probably need to create a virtual environment, so that they don't mess with other environments, versions, etc.
However, if they also need some Linux packages, that would not be enough. In that case, I would suggest using Docker. You would provide them with a Dockerfile that you set up to install any dependencies (whether Python or Linux), and they would just need to use the docker build and docker run commands (a minimal sketch follows).
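A minimal sketch of such a Dockerfile, assuming the dependencies are listed in a requirements.txt next to the notebook (the image tag and port below are just examples):

FROM python:3.9
WORKDIR /course
COPY requirements.txt .
RUN pip install --no-cache-dir jupyter -r requirements.txt
COPY . .
EXPOSE 8888
CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--port=8888", "--allow-root", "--no-browser"]

The participants would then run docker build -t python-course . followed by docker run -p 8888:8888 python-course and open the printed URL in a browser.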
The easiest way I have found to package Python files is to use pyinstaller, which packages your Python file into an executable file.
If it's a single file, I usually run pyinstaller main.py --onefile
Another option is to have a requirements file.
This reduces installing all the packages to one command: pip install -r requirements.txt
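A requirements.txt is just a newline-separated list of package specifiers, optionally pinned to versions, e.g. (the package names and versions here are only illustrative):

jupyter
numpy==1.21.0
pandas>=1.3

It can be generated from a working environment with pip freeze > requirements.txt.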
You would need to use a program such as py2exe, pyinstaller, or cx_freeze to package the file, its modules, and a lightweight interpreter together. The result will be an executable which does not require the user to have any modules, or even Python itself, installed in order to run it; however, because of the built-in interpreter, it can get quite large (which is why Python is not commonly used to make executables).
Have you considered using Azure Notebooks or another Jupyter hosting service? Most of these have a special syntax you can use to perform pip installs. For Azure it is !pip install
https://notebooks.azure.com
When installing packages with sudo apt-get install, or when building libraries from source, inside a python virtual environment (I am not talking about pip install), does doing so isolate the applications being installed? I mean, do they exist only inside the virtual environment?
Things that a virtual environment gives you an isolated version of:
You get a separate PATH entry, so unqualified command-line references to python, pip, etc., will refer to the selected Python distribution. This can be convenient if you have many copies of Python installed on the system (common on developer workstations). This means that a shebang line like #!/usr/bin/env python will "do the right thing" inside of a virtualenv (on a Unix or Unix-like system, at least).
You get a separate site-packages directory, so Python packages (installed using pip or built locally inside this environment using e.g. setup.py build) are installed locally to the virtualenv and not in a system-wide location. This is especially useful on systems where the core Python interpreter is installed in a place where unprivileged users are not allowed to write files, as it allows each user to have their own private virtualenvs with third-party packages installed, without needing to use sudo or equivalent to install those third-party packages system-wide.
... and that's about it.
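Both effects are easy to verify from a shell (a minimal sketch on a Unix-like system; the environment name is arbitrary):

$ python3 -m venv myenv
$ source myenv/bin/activate
(myenv) $ which python          # now resolves inside myenv/bin
(myenv) $ pip install requests  # lands in myenv/lib/pythonX.X/site-packages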
A virtual environment will not isolate you from:
Your operating system (Linux, Windows) or machine architecture (x86).
Scripts that reference a particular Python interpreter directly (e.g. #!/usr/bin/python).
Non-Python things on your system PATH (e.g. third party programs or utilities installed via your operating system's package manager).
Non-Python libraries or headers that are installed into an operating-system-specific location (e.g. /usr/lib, /usr/include, /usr/local/lib, /usr/local/include).
Python packages that are installed using the operating system's package manager (e.g. apt) rather than a Python package manager (pip) might not be visible from the virtualenv's site-packages folder, but the "native" parts of such packages (in e.g. /usr/lib) will (probably) still be visible.
As per the comment by @deceze, virtual environments have no influence over apt operations.
When building from source, any compiled binaries will be linked against the Python binaries of that environment. So if your virtualenv's Python version differs from the system version, and you end up using the system Python (usually due to path problems), you can encounter runtime linking errors.
As for isolation, this same property (binary compatibility) isolates you from system upgrades which might change your system Python binaries. Binary compatibility is generally stable within the 2.x and 3.x series, so this isn't likely to happen; but it has, and it can.
And of course, when building from source inside a virtualenv, installed packages are stashed in that virtualenv; no other python binary will have access to those packages, unless you are manipulating your path or PYTHONPATH in strange ways.
I have created a simple debian package for my python program using this post.
I am also using a postinst script to set up and populate MySQL tables. The package gets installed with the following command.
sudo apt install mypackage.deb
I now want to add an uninstall script so that if the package is removed, uninstall script gets called to cleanup the environment.
How can I incorporate uninstall script with the debian package?
You probably need to write a postrm script too, in the same way as you wrote the postinst script. See the maintainer scripts flowcharts to understand how these scripts work.
A quote from the same article:
"It is possible to supply scripts as part of a package which the package management system will run for you when your package is installed, upgraded or removed.
These scripts are the control information files preinst, postinst, prerm and postrm. They must be proper executable files; if they are scripts (which is recommended), they must start with the usual #! convention. They should be readable and executable by anyone, and must not be world-writable."
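A minimal sketch of such a postrm, placed next to postinst in the package's DEBIAN directory and made executable (the database name is hypothetical; the cleanup should mirror whatever your postinst set up):

#!/bin/sh
set -e
case "$1" in
    remove|purge)
        # undo what postinst did, e.g. drop the database it created
        mysql -e 'DROP DATABASE IF EXISTS mypackagedb'
        ;;
esac
exit 0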
I have just started to use Python (within Windows, 64-bit) and I have a basic question on how to install external packages within the Anaconda / Spyder environment. I understand that for most packages one can simply use “conda install bunnies”. However, certain packages are not in the Anaconda repository and might have to be installed externally (e.g. from GitHub). For those packages, in order to have Spyder recognize the package, does one only have to update the PYTHONPATH manager in Spyder to include the directory (e.g. c:\users\bunnies) into which one has downloaded the package? Or should one take additional steps / is there a faster way?
You have several options to use packages that are not (yet) available via conda install:
1.) If the respective package is on PyPI, you can build it yourself as described in the manual.
2.) If building from scratch doesn't work and the package is on PyPI, you can also try an installation via pip. Note that you have to use the pip of your Anaconda distribution and not the one of your system's Python installation (a command-line example follows after this list).
3.) If you want to include external packages or local folders that contain Python scripts, you can do the following.
3.1.) Use the sys module and append the required package/folder to the path:
import sys
sys.path.append(r'/path/to/my/package')
3.2) Or put the modules into site-packages, i.e. the directory $HOME/path/to/anaconda/lib/pythonX.X/site-packages which is always on sys.path. (Source)
3.3) Or add a .pth file to the directory $HOME/path/to/anaconda/lib/pythonX.X/site-packages. This can be named anything (it just must end with .pth). A .pth file is just a newline-separated listing of the full path-names of directories that will be added to your path on Python startup. (Source)
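For example, a hypothetical extra-paths.pth dropped into that directory could simply contain:

c:\users\bunnies
c:\users\some-other-package

And for option 2.), assuming Anaconda is installed under C:\Anaconda (adjust to your actual install path), calling its pip explicitly looks like:

C:\> C:\Anaconda\Scripts\pip.exe install bunnies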
Good luck!