Install modules in Python locally to script - python

I have the following problem:
I have to run a test/diagnostic Python script on a Windows system. Due to an explicit requirement, the system has no default system-wide Python installation, but there are two different Python instances installed, used locally by applications running on the system. However, both of these instances lack some basic modules my script uses (such as logging, urllib, configparser, etc.).
I want to run %PYTHONPATH%\python.exe myscript.py, where %PYTHONPATH% points to one of the installed Python instances, but install the required additional modules "somewhere" outside %PYTHONPATH% (preferably in the same directory where my script is installed) so that my script can use them.
As my script is a test tool, it must not modify the OS or installed software, so the Python installation under %PYTHONPATH% should not be changed in any way.
It is also expected that the installation can be fully automated, i.e. the best approach would be to simply ship the modules in the same .zip file as my script, which is unpacked onto the target path.
It is also important that the system has no Internet access, so I have to download the required files on another machine and copy them to the target system.
Can you guide me how to do it?

I found an answer myself - it is quite simple:
Obtain the zip file containing the standard modules from the matching Python version's distribution (in my case it was the file python38.zip; it is inside the main zip file downloadable from the Python site).
Unpack the contents of this file to c:\mydir\Python38\site-packages, where c:\mydir is the directory containing my script.
Set the environment variable PYTHONUSERBASE=c:\mydir before running my script.
Now I can run the script and it finds all the "missing" standard modules in c:\mydir\Python38\site-packages.
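A quick sanity check (just a sketch, assuming the layout above and that PYTHONUSERBASE=c:\mydir is set before launch) can be put at the top of the script:
import sys, logging
print(sys.path)           # should list c:\mydir\Python38\site-packages
print(logging.__file__)   # should point under c:\mydir if the bundled copy is being used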

I think what you are looking for is a Python virtual environment.
(internet needed)
Check:
https://docs.python.org/fr/3/library/venv.html
Then, for the installation, you can create a .exe file containing all dependencies.
(no internet needed)
Check:
https://www.pyinstaller.org/
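As a minimal sketch of the first step (assuming a Python 3.3+ interpreter is available; downloading packages from PyPI into the environment is the part that needs internet access):
import venv
venv.create('myenv', with_pip=True)
# afterwards: myenv\Scripts\pip install <your dependencies>   (Windows layout)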

Related

How can we build and distribute python scripts in a windows environment

My team is enjoying using python to solve problems for our business. We write many small independent scripty applications.
However, we have to have a central windows box that runs these along with legacy applications.
Our challenge is going through a build and deploy process.
We want to have Bamboo check the script out of git, install requirements and run tests, then if all is green, just deploy to our production box.
We'd like libraries to be isolated from script to script so we don't have dependency issues.
We've tried to get virtualenvs to be portable but that seems to be a no-go.
Pex looked promising, but it doesn't work on Windows.
Ideally you'd see a folder like so:
AppOne
  /Script.py
  /Libs
    /bar.egg
    /foo.egg
AppTwo
  /Script2.py
  /Libs
    /fnord.egg
    /fleebly.py
Are we thinking about this wrong? What's the pythonic way to distribute scripts within an enterprise?
You may be able to do that with a neat if relatively unknown feature that was sneaked into Python 2.6 without much ado: executing zip files as Python applications. It got a bit (just a bit) more publicity after PEP 441 (which is what PEX is based on), although I think most people are still unaware of it. The idea is that you create a zip file (the recommended extension is .pyz, or .pyzw for windowed applications, but that's obviously not important) with all the code and modules that you want, and then you simply run it with Python. The interpreter will add the contents of the zip file to sys.path, look for a top-level module named __main__, and run it. Python 3.5 even introduced the convenience module zipapp to create such packaged applications, but there is really no magic in it and you may as well create it by hand or by script.
In your case, I guess Bamboo could do the check out, dependency install and tests in virtualenvs and then package the application along with the environment libraries. It's not a one-click solution but it may do the trick without additional tools.
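For the packaging step itself, a rough sketch with the stdlib zipapp module (Python 3.5+, assuming AppOne already contains a __main__.py and its libraries):
import zipapp
zipapp.create_archive('AppOne', target='AppOne.pyz')
# run with: python AppOne.pyz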
TL;DR:
Use Docker
A short story long:
You can use Docker to create an independent image for every script that you want to deploy.
You can use a Python image (slim is the lightest) as a base environment for each script or group of scripts/applications, and use it like a "virtualenv" in which you can install all the dependencies for that script.
There is also an integration for Bamboo and Docker which you may find useful.
Here is the Docker documentation for reference.
You can test each script individually in a separated container and if it passes then you can use the same container to deploy it in your main server.
It is not exactly what you are asking, but you can use this solution on every platform (Windows, Linux, etc.), you can deploy all your scripts to the enterprise server (or anywhere for that matter), and use them across your company.
Disclaimer: This is not THE solution, it is a solution that I am aware of which applies to the time of this answer (2017)
Another possibility is PyInstaller. It creates an executable that can be deployed. Python is not even required to be installed on the deployed production box. It is harder to debug problems that occur only on the deployed box. You also can't modify the scripts on the deployed box, which, depending on your trust of the owners of the machine, is either a positive or a negative. See http://www.pyinstaller.org/
As I understand it, you want to create self-contained application directories on a build server, then copy them over to a production server and run scripts directly from them. In particular, you want all dependencies (your own and external packages) installed within a Libs subdirectory in each application directory. Here's a fairly robust way to do that:
Create the top-level application directory (AppOne) and the Libs subdirectory inside it.
Use pip install --ignore-installed --target=Libs package_name to install dependencies into the Libs subdirectory.
Copy your own packages and modules into the Libs subdirectory (or install them there with pip).
Copy Script.py into the top-level directory.
Include code at the top of Script.py to add the Libs directory to sys.path:
import os, sys
app_path = os.path.dirname(__file__)
lib_path = os.path.abspath(os.path.join(app_path, 'Libs'))
sys.path.insert(0, lib_path)
This will make packages like Libs\bar.egg and modules like Libs\fleebly.py available to your script via import bar or import fleebly. Without code like this, there is no way for your script to find those packages and modules.
If you want to streamline this part of your script, there are a couple of options: (1) Put these lines in a separate fix_path.py module in the top-level directory and just call import fix_path at the start of your script. (2) Create a Libs\__init__.py file with the line sys.path.insert(0, os.path.dirname(__file__)), and then call import Libs from your script. After that, Libs\x can be imported via import x. This is neat, but it's a nonstandard use of the package and path mechanisms (it uses Libs as both a library directory and a package), so it could create some confusion about how importing works.
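For option (2), the complete Libs\__init__.py would just be (a minimal sketch; the one-liner above plus the imports it needs):
import os, sys
sys.path.insert(0, os.path.dirname(__file__))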
Once these directories and files are in place, you can copy this whole structure over to any Windows system with Python installed, and then run it using cd AppOne; python Script.py or python AppOne\Script.py. If you name your top-level script __main__.py instead of Script.py, then you can run your app just by executing python AppOne.
Further, as @jdehesa pointed out, if your script is named __main__.py, you can compress the contents of the AppOne directory (but not the AppOne directory itself) into a file called AppOne.zip, and then copy that to your production server and run it by calling python AppOne.zip. (On Python 3.5 or later, you can also create the zip file via python -m zipapp AppOne if your script is called __main__.py. You may also be able to use python -m zipapp AppOne -m Script if your script is called Script.py. See https://docs.python.org/3/library/zipapp.html.)
This kind of thing can be easily dealt with using python setup.py
Sample setup.py
from setuptools import setup

setup(
    name='name_for_distribution',
    version='version_number',
    py_modules=['pythonfiles'],
    install_requires=[
        # python packages that need to be installed
    ],
)
Create a virtual environment, activate it, and run:
python setup.py install
I feel this is the most pythonic way to distribute and package your project.
Reading links:
https://pythonhosted.org/an_example_pypi_project/setuptools.html
https://docs.python.org/2/distutils/setupscript.html

Virtualenv within single executable

I currently have an executable file that is running Python code inside a zipfile following this: https://blogs.gnome.org/jamesh/2012/05/21/python-zip-files/
The nice thing about this is that I release a single file containing the app. The problems arise in the dependencies. I have attempted to install packages using pip into custom locations, and when I embed them in the zip I always run into import issues, or issues that end up depending on host packages.
I then started looking into virtual environments as a way to ensure package dependencies. However, it seems that the typical workflow on the target machine is to source the activation script and run the code within the virtualenv. What I would like to do is have a single file containing a Python script and all its dependencies and for the user to just execute the file. Is this possible given that the Python interpreter is actually packaged with the virtualenv? Is it possible to invoke the Python interpreter from within the zip file? What is the recommended approach for this from a Python point of view?
You can create a bash script that creates the virtual env and runs the python scripts as well.
#!/bin/bash
virtualenv .venv
.venv/bin/pip install <python packages>
.venv/bin/python script

Getting Enthought Python Distribution to Cooperate with COPASI and libsbml

I've been using the Enthought Python Distribution (the Academic version) for some time now, but have been trying to install some extra packages for a project, and running into problems born of my relative inexperience with the command line.
These are:
COPASI
libSBML
StochPy
The last one went well, a simple python setup.py install and everything seems to be running fine. But neither COPASI nor libSBML seem to be working.
Importing either one of them is netting "ImportError: No module named COPASI/libsbml".
I installed libSBML according to the directions here, and used this suggested workaround to get it working with Enthought, to no avail. For COPASI, I installed COPASI as directed, and followed these directions for the Python bindings:
Once you have downloaded the binary package for the Python bindings you have to unpack it. It will be unpacked to a directory called copasi_python. This directory contains the native library, a Python file, a documentation file and the license file. It also contains a directory called unittests with lots of unit tests that can be used to check if the bindings are working.
To run the unittests, you first have to set the PYTHONPATH environment variable to the directory where the native library and the COPASI.py file are located. Once you changed into the unittests directory you find a file called runTests.py which runs all the unittests in the directory.
My edited .bash_profile file now looks like this:
export PATH="/Library/Frameworks/EPD64.framework/Versions/Current/bin:${PATH}"
export PYTHONPATH=/usr/local/lib/python2.7/site-packages
export PYTHONPATH=/Applications/COPASI/copasi35_python27_macosx107_x64:$PYTHONPATH
The first line is to make EPD the default python interpreter, the second is the result of the suggested workaround above, and the third is per the instructions for COPASI. Any idea what I'm doing wrong?
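(For reference, a quick check of which interpreter and search path are actually in effect, as a sketch run from the EPD python:)
import sys
print(sys.executable)   # should be the EPD interpreter from the first export line
for p in sys.path:
    print(p)            # should include the copasi_python and site-packages paths above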

How to build a single python file from multiple scripts?

I have a simple python script, which imports various other modules I've written (and so on). Due to my environment, my PYTHONPATH is quite long. I'm also using Python 2.4.
What I need to do is somehow package up my script and all the dependencies that aren't part of the standard python, so that I can email a single file to another system where I want to execute it. I know the target version of python is the same, but it's on linux where I'm on Windows. Otherwise I'd just use py2exe.
Ideally I'd like to send a .py file that somehow embeds all the required modules, but I'd settle for automatically building a zip I can just unzip, with the required modules all in a single directory.
I've had a look at various packaging solutions, but I can't seem to find a suitable way of doing this. Have I missed something?
[edit] I appear to have been quite unclear about what I'm after. I'm basically looking for something like py2exe that will produce a single file (or two files) from a given Python script, automatically including all the imported modules.
For example, if I have the following two files:
[\foo\module.py]
def example():
    print "Hello"
[\bar\program.py]
import module
module.example()
And I run:
cd \bar
set PYTHONPATH=\foo
program.py
Then it will work. What I want is to be able to say:
magic program.py
and end up with a single file, or possibly a file and a zip, that I can then copy to linux and run. I don't want to be installing my modules on the target linux system.
I found this useful:
http://blog.ablepear.com/2012/10/bundling-python-files-into-stand-alone.html
In short, you can .zip your modules and include a __main__.py file inside, which will enable you to run it like so:
python3 app.zip
Since my app is small I made a link from my main script to __main__.py.
Addendum:
You can also make the zip self-executable on UNIX-like systems by adding a single line at the top of the file. This may be important for scripts using Python3.
echo '#!/usr/bin/env python3' | cat - app.zip > app
chmod a+x app
The file can now be executed without specifying python:
./app
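If you prefer to script the bundling step rather than zipping by hand, here is a minimal sketch (assuming your script and its modules live in a src/ directory with __main__.py at its root; the directory name is illustrative):
import os, zipfile

with zipfile.ZipFile('app.zip', 'w', zipfile.ZIP_DEFLATED) as zf:
    for root, _dirs, files in os.walk('src'):
        for name in files:
            path = os.path.join(root, name)
            zf.write(path, os.path.relpath(path, 'src'))
# run with: python3 app.zip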
Use stickytape module
stickytape scripts/blah --add-python-path . > /tmp/blah-standalone
This will result in a functioning script, but not necessarily a human-readable one.
You can try converting the script into an executable file.
First, use:
pip install pyinstaller
After installation, run (make sure you are in the directory of the file of interest):
pyinstaller --onefile --windowed filename.py
This will create an executable version of your script containing all the necessary modules. You can then transfer (copy and paste) this executable to the PC or machine you want to run your script.
I hope this helps.
You should create an egg file. This is an archive of python files.
See this question for guidance: How to create Python egg file
Update: Consider wheels in 2019
The only way to send a single .py is if the code from all of the various modules were moved into the single script, and then you'd have to redo everything to reference the new locations.
A better way of doing it would be to move the modules in question into subdirectories under the same directory as your command. You can then make sure that the subdirectory containing the module has a __init__.py that imports the primary module file. At that point you can then reference things through it.
For example:
App Directory: /test
Module Directory: /test/hello
/test/hello/__init__.py contents:
import sayhello
/test/hello/sayhello.py contents:
def print_hello():
    print 'hello!'
/test/test.py contents:
#!/usr/bin/python2.7
import hello
hello.sayhello.print_hello()
If you run /test/test.py you will see that it runs the print_hello function from the module directory under the existing directory, no changes to your PYTHONPATH required.
If you want to package your script with all its dependencies into a single file (it won't be a .py file) you should look into virtualenv. This is a tool that lets you build a sandbox environment to install Python packages into, and manages all the PATH, PYTHONPATH, and LD_LIBRARY_PATH issues to make sure that the sandbox is completely self-contained.
If you start with a virgin Python with no additional libraries installed, then easy_install your dependencies into the virtual environment, you will end up with a built project in the virtualenv that requires only Python to run.
The sandbox is a directory tree, not a single file, but for distribution you can tar/zip it. I have never tried distributing the env so there may be path dependencies, I'm not sure.
You may need to, instead, distribute a build script that builds out a virtual environment on the target machine. zc.buildout is a tool that helps automate that process, sort of like a "make install" that is tightly integrated with the Python package system and PyPI.
I've come up with a solution involving modulefinder, the compiler, and the zip function that works well. Unfortunately I can't paste a working program here as it's intermingled with other irrelevant code, but here are some snippets:
from modulefinder import ModuleFinder
from zipfile import ZipFile, ZIP_DEFLATED
import os, subprocess, sys

# dest_dir, zip_name, source_name, python_exe and dest_path are defined elsewhere
zipfile = ZipFile(os.path.join(dest_dir, zip_name), 'w', ZIP_DEFLATED)
sys.path.insert(0, '.')
finder = ModuleFinder()
finder.run_script(source_name)
for name, mod in finder.modules.iteritems():
    filename = mod.__file__
    if filename is None:
        continue
    if "python" in filename.lower():
        continue
    subprocess.call('"%s" -OO -m py_compile "%s"' % (python_exe, filename))
    zipfile.write(filename, dest_path)
Have you taken into consideration the automatic script creation feature of distribute, the official packaging solution?
What you do is create a setup.py for your program and provide entry points that will be turned into executables that you will be able to run. This way you don't have to change your source layout while still having the possibility to easily distribute and run your program.
You will find an example on a real app of this system in gunicorn's setup.py
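A minimal sketch of such a setup.py (using setuptools, into which distribute was later merged; the names myapp, myapp.cli and main are placeholders, not taken from the question):
from setuptools import setup, find_packages

setup(
    name='myapp',
    version='0.1',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            # creates a `myapp` executable that calls main() in myapp/cli.py
            'myapp = myapp.cli:main',
        ],
    },
)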

How to deploy Python to Windows users?

I'm soon to launch a beta app, and it has the option to create custom integration scripts in Python.
The app will target Mac OS X and Windows, and my problem is with Windows, where Python is normally not present.
My current approach is to silently run the Python 2.6 installer. However, I face the problem that it is not activated by default and the PATH is not set when using the command-line options. And I fear that if Python was installed before and I upgrade it to a new version, this could break something else...
So, I wonder how this can be done cleanly. Is it OK if I copy the whole Python 2.6 directory, put it in a sub-directory of my app, and install everything there? Or, with virtualenv, is it possible to run different versions of Python (if Python is already installed on the machine)?
I also experimented before with embedding Python through a DLL, and found it easy, but I lost the ability to debug, so I switched to command-line plug-ins.
I execute the plug-ins from the command line and read the STDOUT and STDERR output. The app is made with Delphi/Lazarus. I install other modules like JSON and RPC clients, Win32com, ORM, etc. I create the installer with BitRock.
UPDATE: The end-users are small business owners, and the Python scripts are made by developers. I want to avoid any additional step in the deployment, so I want a fully integrated setup.
Copy a Portable Python folder out of your installer, into the same folder as your Delphi/Lazarus app. Set all paths appropriately for that.
You might try using py2exe. It creates a .exe file with Python already included!
Integrate the python interpreter into your Delphi app with P4D. These components actually work, and in both directions too (Delphi classes exposed to Python as binary extensions, and Python interpreter inside Delphi). I also saw a patch for Lazarus compatibility on the Google Code "issues" page, but it seems there might be some unresolved issues there.
I think there's no problem combining .EXE packaging with a tool like PyInstaller or py2exe and Python-written plugins. The created .EXE can easily detect where it's installed and the code inside can then simply import files from some pre-determined plugin directory. Don't forget that once you package a Python script into an executable, it also packages the Python interpreter inside, so there you have it - a full Python environment customized with your own code.
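A minimal sketch of that detection step (the plugin directory name is illustrative; sys.frozen is set by PyInstaller/py2exe builds):
import os, sys

if getattr(sys, 'frozen', False):
    app_dir = os.path.dirname(sys.executable)            # running from the packaged .EXE
else:
    app_dir = os.path.dirname(os.path.abspath(__file__))

plugin_dir = os.path.join(app_dir, 'plugins')            # pre-determined plugin directory
sys.path.insert(0, plugin_dir)                           # plugins can now simply be imported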
