Multi-platform portable Python - python

I want to install Python on a flash drive in a virtual environment so that I can develop code wherever I am. Is this possible to do in such a way that I can use my flash drive on Windows/Mac/Linux computers?

For Windows, head to Portable Python (http://PortablePython.com) to see the various options you have.
For Linux and Mac you don't need to install Python on the USB drive, as those systems usually come with Python pre-installed. If you need specific packages for those systems, bring them on the USB drive together with a command-line script that can load them into a virtualenv with one call, and you are good to go!
Be aware that this is never 100% bulletproof, since you are depending on the Python version you are using/bringing packages for.
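A minimal sketch of such a loader script, assuming the target machine's Python has pip available and that the drive carries a packages/ folder of pre-downloaded distributions plus a requirements.txt (all names here are illustrative):

# load_packages.py - hypothetical script kept on the USB drive
import os
import subprocess
import sys

HERE = os.path.dirname(os.path.abspath(__file__))
PKG_DIR = os.path.join(HERE, "packages")  # assumed folder of downloaded packages

# Install from the local folder only, never from the network.
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "--no-index", "--find-links", PKG_DIR,
    "-r", os.path.join(PKG_DIR, "requirements.txt"),
])

Run it with the virtualenv's interpreter (for example ./env/bin/python load_packages.py) so the packages land in the virtualenv rather than the system site-packages.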

You could try setting up virtualenv-based environments, with the various Python versions installed on your machines.
I'm not sure how you'd get around the different paths on the different operating systems, though.
Virtualenv: http://pypi.python.org/pypi/virtualenv

As @millimoose pointed out, you could install three different versions of Python.
For each Python package you are working on, you can create a .pth file in the site-packages directory of each Python version that you would like to use the package from.
Note that, as described here:
If you put a .pth file in the site-packages directory containing a path, python searches this path for imports.
For example, if you have a package named my_package that you are working on that resides at the path C:\Users\Me\Documents\dev_packages\my_package, you can add a file with extension .pth (note that the name doesn't matter, specifically it doesn't have to have any relation to the package name), with the contents:
C:\Users\Me\Documents\dev_packages
This will add C:\Users\Me\Documents\dev_packages to the Python import search-path, causing the my_package package to be discovered. By placing this .pth file in the site-packages directory of each Python version, my_package will be available in all corresponding versions of Python.
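As a sketch of automating this step, the snippet below writes such a .pth file into the site-packages directory of whichever interpreter runs it (the directory and file names are simply those from the example above):

# add_dev_path.py - run once with each Python version (sketch)
import os
from distutils.sysconfig import get_python_lib

dev_dir = r"C:\Users\Me\Documents\dev_packages"
site_packages = get_python_lib()  # site-packages of the running interpreter

with open(os.path.join(site_packages, "dev_packages.pth"), "w") as f:
    f.write(dev_dir + "\n")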

Python, site-packages partially on network file system?

CentOS 7 system, Python 2.7. The OS's installed python has directory
/usr/lib/python2.7/site-packages
and that is where a
python setup.py install
command would install a package. On a computing cluster I would like to install some packages so that they are referenced from that directory but actually reside in an NFS-served directory here:
/usr/common/lib/python2.7/site-packages
That is, I do not want to have to run setup.py on each of the cluster nodes to do a local install on each, duplicating the package on every machine. The packages already installed locally must not be affected; some of those are used by the OS's commands. Also, the local ones must work even if the network is down for some reason. I am not trying to set up a virtual environment; I am only trying to place a common set of packages in a different directory in such a way that the OS-supplied python sees them.
It isn't clear to me what the best way to do this is. It seems like such a common problem that there must be a standard or preferred way of doing it, and if possible, that is the method I would like to use.
This command
/usr/bin/python setup.py install --prefix=/usr/common
would probably install into the target directory. However the "python" command on the cluster nodes will not know this package is present, and there is no "network" python program that corresponds to the shared site-packages.
After the network install one could make symlinks from the local to the shared directory for each of the files created. That would be acceptable, assuming that is sufficient.
It looks like the PYTHONPATH environment variable might also work here, although I'm unclear about what it expects for "path" (the full path to site-packages, or just the /usr/common part).
EDIT: This does seem to work as needed, at least for the test case. The software package in question was installed using --prefix, as above. PYTHONPATH was not previously defined.
export PYTHONPATH=/usr/common/lib/python2.7/site-packages
python $PATH_TO_START_SCRIPT/start.py
ran correctly.
Thanks.
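An alternative that avoids having every user set PYTHONPATH is the .pth mechanism described in an earlier answer: drop a one-line file into the local site-packages on each node (the file name below is arbitrary; lines starting with # in a .pth file are comments):

# /usr/lib/python2.7/site-packages/usrcommon.pth
/usr/common/lib/python2.7/site-packages

The listed path is appended to sys.path, so locally installed packages keep priority, and if the NFS mount is unavailable only the shared packages are lost.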

Why is python27.dll not in the Python install folder but in the Windows system folder

As described in http://bugs.python.org/issue22139, python27.dll is installed in the Windows system folder (in my case C:\Windows\System32).
But I would like to know why. Why is it not installed next to python.exe, for example in C:\Python27\?
Reason I ask: I've made a Mercurial hook in Python that our developers need to use to check if the commit message is valid. Among other things, it checks for a valid JIRA issue number. To avoid all our developers having to install Python themselves and install the required modules manually (a lot of work and error-prone), I zipped the Python installation and asked the developers to unzip it locally. But they can't run it, because python27.dll is missing, or worse, they already have another minor version of Python installed, and the hook will fail due to the wrong python27.dll being used. Confusing.
If I just add the python27.dll (the correct version) to the zip file, it all seems to work great. So, why is it not installed in that location in the first place? What is the advantage of installing it in C:\Windows\System32?
Hope someone can explain this to me!
Thanks in advance,
Tallandtree.
I use the Anaconda Python distribution from http://continuum.io. They put python27.dll into c:\anaconda right next to its python.exe. This distribution is also superior in that you can have multiple python environments with precisely the packages you need and switch between them easily (http://conda.pydata.org/docs/using/envs.html). You can also get the package list of one of your environments and distribute it to others.
I recommend this Python distribution over the one from python.org and Enthought, because of this issue.
DLLs are quite Windows-specific files. I imagine you will have shared object (.so) files for Linux/Unix-specific Python stuff? You said your developers couldn't run it because they didn't have the correct DLL (i.e. the one relevant to their Python installation).
Also, the advantage of installing it to System32 is that it's on the default PATH. Additionally, if any other application internally uses Python, requires access to the .dll file, and does NOT reference your Python directory, it will probably look in a location that "actually" exists (I wanted to say is guaranteed to exist, but... never mind). That location would be C:\Windows\System32.
I found it to work just fine to put python27.dll in the Python directory (c:\Python27 or wherever). As long as it's in the PATH, it seems to work. I did this for a "relocatable" installation of Python. I can copy the installation directory to a Windows machine that has no Python installed, set the PATH to include that directory, and run Python, including all the libraries I had installed with pip install on the original machine.
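If you go this route, a quick way to check which copy of the DLL a machine will actually pick up is ctypes.util.find_library, which on Windows searches the directories on PATH (this is just a diagnostic sketch):

import ctypes.util

# Prints the full path of the python27.dll Windows would load, or None if not found.
print(ctypes.util.find_library("python27"))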

Use Selenium with Python Portable

My question seems somewhat inane, but I cannot seem to find any resources for what I need to do.
Essentially I'm using my work computer to write python applications in my spare time. I'm using Python Portable (syntax version 3.2) because I do not have administrative access and can't do things with path variables etc.
How (if possible) do I install or import selenium so I can use it in Python Portable?
Thanks all!
Based on the answers found at "Importing modules on portable python" and "How to install external libraries with Portable Python?":
Check what import sys; print(sys.path) says.
It displays the list of directories and zipfiles where Portable Python looks for modules to import. Just copy your modules into one of those directories or zipfiles, or sys.path.append('/whatever/dir') if you have your modules in /whatever/dir and want to keep them there (the latter approach will last only for the current session, be it interactive or a script's execution).
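For example, assuming you've copied the unpacked selenium package folder onto the drive (the directory name here is made up):

import sys
sys.path.append(r"E:\portable_modules")  # hypothetical folder containing the selenium package directory
import selenium  # now importable from the appended path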
Also, from their FAQ:
You don’t have package I need, can I add it?
For simpler packages you can use easy install or even extract them in the site-packages folder of the Portable Python distribution. However some packages are installing additional dependencies in Windows system folders - in this case your Portable Python distribution will not work once you move it to some other workstation. Make sure to do proper testing!

Python compile all modules into a single python file

I am writing a program in Python to be sent to other people who are running the same Python version; however, there are some 3rd-party modules that need to be installed to use it.
Is there a way to compile it into a .pyc (I only say .pyc because it's a Python compiled file) that has all the dependent modules inside it as well?
So they can run the program without needing to install the modules separately?
Edit:
Sorry if it wasn't clear, but I am aware of things such as cx_freeze etc.; what I'm trying to get is just a single Python file.
So they can just type "python myapp.py" and it will run. No installation of anything. As if all the module code were in my .py file.
If you are on Python 2.3 or later and your dependencies are pure Python:
If you don't want to go the setuptools or distutils routes, you can provide a zip file with the .pycs for your code and all of its dependencies. You will have to do a little work to make any complex pathing inside the zip file available (if the dependencies are just lying around at the root of the zip, this is not necessary). Then just add the zip location to your path and it should work just as if the dependency files had been installed.
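A minimal sketch of the consuming side, assuming the dependencies sit at the root of a deps.zip shipped next to the script (the module name is a placeholder):

import os
import sys

HERE = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(HERE, "deps.zip"))  # zipimport takes it from here

import some_dependency  # hypothetical module bundled inside deps.zip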
If your dependencies include .pyds or other binary dependencies you'll probably have to fall back on distutils.
You can simply include .pyc files for the libraries required, but no - a .pyc cannot work as a container for multiple files (unless you collect all the source into one .py file and then compile it).
It sounds like what you're after is the ability for your end users to run one command, e.g. install my_custom_package_and_all_required_dependencies, and have it assemble everything it needs.
This is a perfect use case for distutils, with which you can make manifests for your own code that link out to external dependencies. If your 3rd party modules are available publicly in a standard format (they should be, and if they're not, it's pretty easy to package them yourself), then this approach has the benefit of allowing you to very easily change what versions of 3rd party libraries your code runs against (see this section of the above linked doc). If you're dead set on packaging others' code with your own, you can always include the required files in the .egg you create with distutils.
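A minimal sketch of such a manifest, here using setuptools' install_requires to declare the 3rd-party dependencies (every name and version below is a placeholder):

# setup.py (sketch)
from setuptools import setup

setup(
    name="my_custom_package",
    version="0.1",
    py_modules=["myapp"],
    install_requires=["some_third_party_lib>=1.0"],  # pulled in at install time
)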
Two options:
1. Build a package that will install the dependencies for them (I don't recommend this if the only dependencies are Python packages that can be installed with pip).
2. Use virtual environments: you use an existing Python on their system, but your Python modules are installed into the virtualenv.
Or I suppose you could just punt, and create a shell script that installs them, and tell them to run it once before they run your stuff.

Creating a Portable Python (local install) for Linux

I'm looking to create the following:
A portable version of Python that can be run on any system (with any previous version of Python, or no Python, installed) and have it pre-configured with various Python packages (e.g. django, lxml, pysqlite, etc.)
The closest I've found to the above is virtualenv, but it only goes so far.
If I package up a nice virtualenv for Python on one machine, it contains symlinks to a lot of the libraries it needs. I can convert those symlinks to their actual files, but if I try to move the entire directory to another machine, I get segfault after segfault.
To launch python on a different machine, I'm using:
LD_LIBRARY_PATH=lib/ ./bin/python
and in lib/ I have all of the shared libraries I copied from the original machine. The problem here is that these shared libraries might rely on other shared libraries that I'm not including, so executing this on other Linux distros does not work, probably due to it falling back on older shared libraries installed on the system that do not work with what I copied over.
Anyone have an idea on how to get this working? Is this even possible?
EDIT:
To clarify, the desired outcome is to create a tar.gz of a Python binary and associated packages (django, lxml, pysqlite, etc.) that can be extracted and run on any Linux-based system, e.g. Ubuntu 8.04, Red Hat 5, SUSE 11, etc., all 32-bit distros, where the locally installed version of Python doesn't impact what's in the tar.gz.
I just tested this and it works great.
Get the copy of Python you want to install, untar it, and cd into the untarred folder first.
Also get a copy of setuptools and untar that.
/opt/portapy used below is of course just the name I came up with for this post; it could be any path. The full path should be tarred up and the same path should be used on any systems you put this on, due to absolute-path linking.
mkdir /opt/portapy
cd <python source dir>
./configure --prefix=/opt/portapy && make && make install
cd <setuptools source dir>
/opt/portapy/bin/python ./setup.py install
Make the virtual env folder inside the portapy folder.
mkdir /opt/portapy/virtenv
/opt/portapy/bin/virtualenv /opt/portapy/virtenv
cd /opt/portapy/virtenv
source bin/activate
Done. You are ready to install all of your libraries here and have the option of creating multiple virtual envs this way.
You can then tar up the whole /opt/portapy folder and transport it to any Linux system of the same arch, within reason I suspect.
I compiled 2.7.5 on CentOS 5.8 64-bit and moved the folder to a CentOS 6.9 system, and it runs perfectly.
I don't know how this is even possible. If it were, they wouldn't need to distribute binary packages of Python for different platforms. You can't simply distribute Python that will run on any platform. It has to be built from source for that arch. Virtualenv will expect you to tell it which system Python to use (using links).
This pretty much goes for almost any binary package that links against system libs. Again, if it were possible, we wouldn't need any platform specific binary distributions.
You can, however, achieve part of what you want. That is, running Python on another machine that doesn't have Python installed, as long as it's the same arch. This is the same concept behind freezing, or py2exe/py2app/pyinstaller. An interpreter is bundled into a standalone environment. So the app can run on any similar platform.
Edit
I just realized that while your question speaks about "system" agnostically, your title contains the reference "linux". There are different flavors of linux, so in order for it to work you would have to build it fat for multiple archs and also completely contain the standalone links. You might try building a package with pyinstaller and using that to include in your project.
You can try just building python from source, in your virtualenv:
$ ./configure --prefix=/path/to/virtualenv && make && make install
If you still have problems with the links to libs, you can also investigate building it statically.
I'm not sure that working solely in Python is the way to go here. You might have better luck with Puppet or Chef, which are configuration tools that can be used to create a local environment. There is plenty of code out there to install virtualenv and Python on just about any Linux, plus OS X (probably not Windows though).
Your workflow would be to install Chef or Puppet (your choice), run a script to install the Python you want, then enter a virtualenv and pip install any packages you might need.
Sorry this isn't as easy as virtualenv alone, but it is much more robust.
Well, since I rarely accept "can't be done", there is a way to do it. Warning: it isn't pretty and you should probably look into a different scenario.
What you will need to do is determine a standard location for this top-level directory. Second, using that directory as your root, you will need to compile Python on each Linux distribution you want to run this on. For this you would use something like "/usr/local/myappname/platform/" to configure and compile Python to live in. In each case substitute "platform" with the name of the platform, such as "/usr/local/rhel/". If memory serves, the configure option you are looking for here is --prefix.
Once you have each distribution compiled you will need a script to determine which one to use and either set environment variables or have it create symlinks to the appropriate "installation" of python. I would then use virtualenv and bootstrap in that tree to keep the "in-use" python libraries even more specific.
I can't think of a common Linux distribution that doesn't have Python by default. As such, you could use setup.py and/or basic Python scripts to script this out, since you should be able to rely on Python being present - even if it's ye olde version as in RHEL installs. Personally I find the above method overly complicated, but it would meet your stated requirements, with the allowance for a final script. Of course, you could use shar (SHell ARchive) to tar all of this into a runnable shell script to do the installation and avoid the need for secondary scripts. If you gzip the resulting shell archive then you can decompress it on target systems and execute it to set everything up.
All that said, I would not recommend this. I would recommend determining the minimum Python version you can run on and ensuring that it is installed by the distribution whenever possible, and if need be, pulling it down from a repo and installing it. Then, use virtualenv and bootstrap with a requirements.txt to install the necessary Python libraries and apps into the virtualenv. For that, see this documentation.
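For instance, a requirements.txt covering the packages mentioned in the question might look like the following (the version pins are placeholders), installed with pip install -r requirements.txt inside the activated virtualenv:

# requirements.txt (example; pin whatever versions you actually need)
django==1.4
lxml==2.3
pysqlite==2.6.3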
I faced the same problem, so I created PortableVirtualenv. Your question is exactly what it was made to solve.
I use it as a base for a commercial multi-platform app I develop. (But PortableVirtualenv is public domain - use it freely.)
If needed, you can pip-install any package and zip the whole directory to also distribute the packages you need.
One nice option is to make a "snap" portable Linux application. Snaps have a Python mode which lets you specify exactly what modules you need. From https://snapcraft.io/first-snap#python :
Snaps let you distribute a dependency-isolated Python app in an app store experience for end users.
Another option is to containerize your application with something like Docker. Then instead of executing your script directly, the user is actually running a small OS with just your application and its dependencies. https://www.infoq.com/articles/docker-executable-images/ has more about executable containers.
Container images can also be used for short lived processes: a containerized executable meant to be run on your computer. These containers execute a single task, are short lived and can generally be removed after use. We call these executable images. Examples are compilers (Golang) or build tools (Maven), presentation software (I love to hack a simple presentation in Markdown format and let a RevealJS Docker image serve that) and browsers (a fresh contained browser to follow that fishy link). A real evangelist for executable images is Docker's own Jessie Frazelle. To get some great inspiration be sure to read her blog about them or check out this presentation at DockerCon 2015.
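As a sketch of the executable-image idea, a minimal Dockerfile for a single-script Python app might look like this (the base image tag and file names are illustrative):

# Dockerfile (sketch)
FROM python:3
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY myapp.py .
ENTRYPOINT ["python", "myapp.py"]

Built with docker build -t myapp . and run with docker run --rm myapp, the user needs Docker installed but nothing Python-specific.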
