Python modules on different devices - python

So I have been taking a few classes on Python and the whole time I was wondering about modules. I can install them and run my programs with Eclipse, but if I compile a program into an executable (with an 'exe' extension), how would the module behave on a computer that doesn't have it installed?
Example:
If I made some random little thing with something like pygame: I installed the pygame module on my computer, made an application with it, and compiled it into an executable. What happens on another computer that I run that file on? Or does it not work at all?

Python modules are already executable - you don't compile them. If you want to run them on another computer, you can install Python and any other dependent modules such as pygame on that computer, copy the scripts over and run them.
Python has many ways to wrap scripts up into an installer that does the work for you. It's common to use distutils to write a setup.py file which handles the install. From there you can use setup.py to bundle your scripts into zip files, tarballs, executables, RPMs, etc. for other machines. You can document what the user needs to make your program run, or you can use something like pip or setuptools to declare dependencies so that pygame (and so on) gets installed automatically.
There are many ways to handle this and it's not particularly easy the first time round. For starters, read up on distutils in the standard Python docs and then look up the pip installer.
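For a rough idea, a minimal setup.py using distutils might look like this (the script name mygame.py and the metadata are placeholders, not anything from the question):

from distutils.core import setup

# minimal distutils setup script; distutils reads this to build archives and installers
setup(
    name="mygame",
    version="0.1",
    description="Small pygame example",
    scripts=["mygame.py"],
)

Running python setup.py sdist then produces a source archive you can hand to other machines, and python setup.py bdist_rpm builds an RPM where that is supported; the target machine still needs Python and pygame installed unless you declare them as dependencies.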

Related

Portable Python Script with Module

I'm new to using python modules.
I'm currently working on a python 2.7 script that will be deployed to many remote computers (which have python 2.7 on them). The problem is that the script needs to use a module, which I am not allowed to install on those computers.
I'm wondering if it is possible to include the module files in the same package as my script (possibly have them compiled first), and then have the script import the library from that local folder, thus achieving a "portable" script.
If that is possible, how would I go about doing that?
Specifics: I'm running 2.7.11 on Windows needing to use Paramiko.
I'm asking this question because the similar questions I can find either do not answer mine, or expect familiarity with core Python internals that I don't have. I also DON'T want to include the entirety of Python and then install the module onto that, something I see often called Portable Python. I just want to send my script and the module and nothing more.
Many thanks!
To install modules in a specific directory, you can try pip install module --target=.
By default Python searches for modules in the same directory as the script first; only if they are not found there does it fall back to the installed library directories.
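As a rough sketch of the "portable" approach, assuming you ran pip install paramiko --target=libs and ship the resulting libs folder next to your script (the folder name libs is just a placeholder):

import os
import sys

# put the bundled "libs" folder at the front of the import path
here = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(here, "libs"))

import paramiko  # now resolved from the local libs folder

Keep in mind that Paramiko pulls in compiled dependencies, so the bundled copies have to match the target machines' platform and Python version.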

How to distribute my Unix application?

I created a command-line python tool for Unix systems.
I want my script to be executed from anywhere just like any Unix command. One solution is to make my script executable and move it to /usr/bin.
But the script works with external files, and I guess moving all these files along with my script into /usr/bin is bad practice, as it would be hard to delete them one by one later.
Where should I put my application's directory? How do I add the main script to the PATH so it can be executed from anywhere?
I would then like to create a package that puts my files in the right place, and removes them if the user wants to uninstall my application.
I don't know how to do this.
Thank you.
There are many ways of distributing a Linux application.
It will depend on the distribution you are using, since they do not all use the same package manager. For example, you would create a .deb package for Debian, Ubuntu and their derivatives, an Arch package for Arch Linux, etc...
You could then share the package with anyone to let them install your tool.
However, since your tool is in written in Python, you can also make a python package. You could then upload it to the Python Package Index to let anyone install it using python's pip package manager.
To create a Python package, you will need to create a file called setup.py and, from it, call the setup function from the setuptools package.
You will probably want to read the python documentation about writing such a script: https://setuptools.readthedocs.io/en/latest/setuptools.html
You may be especially interested in these sections:
Including Data Files
Automatic Script Creation
If you do things correctly, setuptools will take care of installing your script and its files somewhere in the PATH so it can be executed from the command line.
You should use setuptools to distribute your application, creating a setup.py file to configure its setup and installation.
Command-line scripts can be delivered using the console_scripts entry point, while data files can be delivered using either package_data or data_files.
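As a hedged sketch, a setup.py combining both options could look roughly like this (the package and command names are placeholders):

from setuptools import setup, find_packages

setup(
    name="mytool",
    version="0.1",
    packages=find_packages(),
    # ship non-code files that live inside the mytool package
    package_data={"mytool": ["data/*.txt"]},
    # create a "mytool" command on the user's PATH at install time
    entry_points={
        "console_scripts": ["mytool = mytool.cli:main"],
    },
)

After installing the package with pip, the mytool command is available from anywhere without copying anything into /usr/bin by hand.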

Create a Python script that doesn't depend on modules

I've made a Python script that depends on numpy, cv2 and some other modules to run, and I need to run it on a Linux server where I'm not allowed to install anything.
Is there a way to join all that stuff into a single executable that runs without installing anything?
It sounds like you're looking for PyInstaller, which bundles all the modules your script depends on into a single program. It even includes Python itself. There are a few alternatives, some of which are listed on this page.
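For example, assuming your script is called script.py, something along these lines usually produces a single self-contained executable in a dist/ folder:

pyinstaller --onefile script.py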
You could create a standalone executable using Nuitka. Assuming you have all required packages on your development machine you can run
nuitka --python-version=2.7 --standalone foobar
Just make sure you run it with the standalone flag and correct python version.

Compile python virtual environment

I created a Python script for a freelance job and I can't figure out how to compile/build/package it for easy sharing. The person I created it for is not technical, so I can't explain to him how to activate a virtualenv, install requirements and so on.
What is the easiest way for him to run the project right after downloading it?
Can the whole virtualenv be compiled into an .exe? If yes, can this be done inside a macOS system?
Yes, you can package your Python programs and their dependencies with
cx_Freeze
There are other Python modules that do the same; personally I prefer cx_Freeze because it's cross-platform and was the only one that worked out of the box for me.
By default cx_Freeze automatically discovers modules and adds them to the generated exe, but sometimes this doesn't work and you might have to add them yourself.
To create a simple executable from a single file, you can just use the bundled cxfreeze script:
cxfreeze hello.py --target-dir dist
but for more complex applications that consist of several files you'll have to create a distutils setup script.
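Such a setup script might look roughly like this (the names are placeholders and the exact options depend on your cx_Freeze version):

from cx_Freeze import setup, Executable

setup(
    name="hello",
    version="0.1",
    description="Example frozen application",
    # if cx_Freeze misses a module, list it explicitly, e.g.
    # options={"build_exe": {"packages": ["numpy"]}},
    executables=[Executable("hello.py")],
)

python setup.py build then creates the frozen application in a build/ subdirectory.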
Note that cx_Freeze doesn't compile your code like a traditional compiler. It simply zips your Python files and all their dependencies together (as bytecode) and includes the Python interpreter to make your program run. So your code can be disassembled (if anyone wants to), and you'll also notice that your program ends up larger than it was initially, because of the extra files included.
I ended up using PyInstaller as this worked out of the box for me.

Python compile all modules into a single python file

I am writing a program in Python to be sent to other people, who are running the same Python version; however, there are some 3rd-party modules that need to be installed to use it.
Is there a way to compile it into a .pyc (I only say pyc because it's a compiled Python file) that has all the dependent modules inside it as well?
So they can run the programme without needing to install the modules separately?
Edit:
Sorry if it wasn't clear, but I am aware of things such as cx_Freeze etc.; what I'm trying to get is just a single Python file.
So they can just type "python myapp.py" and it will run. No installation of anything, as if all the module code were in my .py file.
If you are on Python 2.3 or later and your dependencies are pure Python:
If you don't want to go the setuptools or distutils route, you can provide a zip file with the .pyc files for your code and all of its dependencies. You will have to do a little work to make any complex pathing inside the zip file available (if the dependencies are just lying around at the root of the zip, this is not necessary). Then just add the zip location to your path and it should work just as if the dependency files had been installed.
If your dependencies include .pyd files or other binary dependencies, you'll probably have to fall back on distutils.
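A minimal sketch of the zip approach, assuming the dependencies sit at the root of a deps.zip shipped next to the main script (the archive and module names are placeholders):

import os
import sys

# deps.zip holds the .pyc files of the third-party packages at its root,
# so adding the archive itself to sys.path is enough
here = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(here, "deps.zip"))

import some_dependency  # imported straight out of the zip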
You can simply include .pyc files for the libraries required, but no - a .pyc cannot work as a container for multiple files (unless you collect all the source into one .py file and then compile it).
It sounds like what you're after is the ability for your end users to run one command, e.g. install my_custom_package_and_all_required_dependencies, and have it assemble everything it needs.
This is a perfect use case for distutils, with which you can make manifests for your own code that link out to external dependencies. If your 3rd party modules are available publicly in a standard format (they should be, and if they're not, it's pretty easy to package them yourself), then this approach has the benefit of allowing you to very easily change what versions of 3rd party libraries your code runs against (see this section of the above linked doc). If you're dead set on packaging others' code with your own, you can always include the required files in the .egg you create with distutils.
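For illustration, with setuptools (the usual extension of distutils) the dependency declaration might look roughly like this; the package names and version ranges are placeholders:

from setuptools import setup, find_packages

setup(
    name="myapp",
    version="1.0",
    packages=find_packages(),
    # declare third-party dependencies instead of bundling them;
    # pip resolves and installs matching versions at install time
    install_requires=[
        "requests>=2.0,<3.0",
    ],
)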
Two options:
Build a package that will install the dependencies for them (I don't recommend this if the only dependencies are Python packages that can be installed with pip)
Use virtual environments. You use the existing Python on their system, but the Python modules are installed into the virtualenv (see the example commands below).
or I suppose you could just punt, and create a shell script that installs them, and tell them to run it once before they run your stuff.
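For the virtualenv option, the one-time setup on their machine would be something like this, assuming you ship a requirements.txt listing the dependencies and that myapp.py is your entry script:

virtualenv venv
venv/bin/pip install -r requirements.txt
venv/bin/python myapp.py

(On Windows the scripts live in venv\Scripts\ instead of venv/bin/.)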
