How to install wxPython using virtualenv - python

I am starting a new Python GUI application and have decided to use wxPython for the GUI because I want a cross-platform one.
The problem is that I want to use virtualenv (with virtualenvwrapper) to isolate the environment and be able to reproduce it on the other machines where I will work, but I cannot install wxPython.
I have it installed on my Ubuntu machine via apt-get, but that is not enough.
I have searched the web for a solution and have found the following:
This page, http://batok.github.com/virtualenvwxp/, which explains a way to hack the virtualenv environment so that it uses the local installation of wxPython. Not the best solution, but it would be an acceptable workaround. The problem is that it is explained for Mac, and I couldn't make it work on my Ubuntu machine.
I also found this page, Installing wxPython in virtualenv under Linux, where someone asks something similar. I have tried to build wxPython that way with no success.
Any help would be appreciated.
In the end, I chose wxPython because it is multi-platform and I can use it without license problems, but as I have not started yet, I can change my mind if there is another framework that is easier to install.
Thanks in advance.
20110925: Sorry for the delay and thanks for the answers.
I have just tried to install wxPython using buildout and the links given here, but I still have the same problem. It seems as if I need the libgtk2.0-dev package to be able to compile wxPython...
So is there any way to install this package locally in the buildout environment?
Thanks again.

In the end I could not resolve this problem.
I want to create a reproducible Python environment with all the requirements inside, using buildout and/or virtualenv, so that I could work on any Linux system with just virtualenv, Python and a C++ compiler installed.
It seems that the only way to do this is to use buildout cmmi recipes to download and build wxPython and ALL of its dependencies. This is a really painful approach, and I have no time for it now.
I have decided to use a workaround: I am going to work on my Ubuntu laptop most of the time, so I have installed wxPython from the repositories and use a wx.pth file to make it available to the virtual environment.
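For anyone unfamiliar with the mechanism: a .pth file is just a plain-text file in site-packages whose lines are appended to sys.path. A minimal wx.pth for this kind of workaround (the exact paths are illustrative and depend on the Python version and distro) would live at <virtual_env_path>/lib/python2.7/site-packages/wx.pth and contain a single line:

/usr/lib/python2.7/dist-packages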
This is not a good solution, but it seems the best one so far... so if someone knows a better solution, please let me know.
When my Python project is more mature, I will come back to this problem and will probably try the hard way...
Thanks for all your answers and comments.

The solution I ended up using was to install wxPython into my main system Python first.
Then make a symbolic link from the wx files in my system Python to my virtual environment:
ln -s /usr/lib/python2.7/dist-packages/wxversion.py <virtual_env_path>/lib/python2.7/site-packages/wxversion.py
where <virtual_env_path> in my case is the path to a virtual environment named "fibersim", for example:
/home/adam/anaconda/envs/fibersim
After that, import wx worked.
Got this from: http://qopml.org/wp-content/uploads/2013/01/README.txt
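To check that the link is actually picked up from inside the activated virtual environment, a quick sketch (wx.version() is wxPython's own version helper) is:

import wx
print(wx.version())

If the import succeeds and prints the version of the system-wide wxPython, the workaround is in place.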

Buildout allows you to install different parts, where each part's recipe code determines how that part is built. There are cmmi recipes for building stuff with Configure/Make/Make-Install (CMMI). You can use this to build wxPython locally to the buildout and then create a Python interpreter that has that build of wxPython and your own eggs on its path.
See this blog post and this answer for details.
Keep in mind that zc.recipe.egg will also install any setuptools/distribute console_scripts in the buildout's bin directory. See also mr.developer for automatically checking out multiple packages from VCS and working on them in the same buildout.
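As a rough, untested sketch of the shape such a configuration takes (the part names, URL and egg name are illustrative, not taken from the question), a buildout.cfg might look something like:

[buildout]
parts = wxpython py

[wxpython]
recipe = zc.recipe.cmmi
url = http://downloads.sourceforge.net/wxpython/wxPython-src-2.8.12.1.tar.bz2

[py]
recipe = zc.recipe.egg
interpreter = python
eggs = mypackage

In practice wxPython's Python bindings are built with setup.py on top of the wxWidgets C++ build, so a real configuration will need more steps than a plain CMMI part.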

Related

How to check whether libraries are installed on the system in Python

Is there a way to create a function that checks whether the Python modules needed by the main program are installed on the system and, if not, installs them automatically?
I have searched a lot for something like this but didn't find anything useful.
Thanks, and sorry for the bad English.
The tool you are likely looking for is pip; it is not run as part of a script but rather is used to install a script.
https://pip.pypa.io/en/stable/reference/
In addition, if you wish to develop a script that installs your script, you can find documentation here:
http://marthall.github.io/blog/how-to-package-a-python-app/
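That said, if you really do want a script that checks for its own dependencies at run time and installs any that are missing, a minimal sketch is shown below. The ensure() helper and the package names are purely illustrative; it assumes pip is available for the interpreter running the script.

import importlib
import subprocess
import sys

def ensure(package, module_name=None):
    # Try to import the module; if it is missing, install the package with
    # the pip belonging to the running interpreter, then import again.
    name = module_name or package
    try:
        return importlib.import_module(name)
    except ImportError:
        subprocess.check_call([sys.executable, "-m", "pip", "install", package])
        return importlib.import_module(name)

ensure("requests")               # module name matches the package name
ensure("beautifulsoup4", "bs4")  # package and importable module names differ

Installing into the system Python this way usually needs sufficient privileges, so running it inside a virtualenv is the safer option.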

Is there a way to "version" my python distribution?

I'm working by myself right now, but am looking at ways to scale my operation.
I'd like to find an easy way to version my Python distribution, so that I can recreate it very easily. Is there a tool to do this? Or can I add /usr/local/lib/python2.7/site-packages/ (or whatever) to an svn repo? This doesn't solve the problems with PATHs, but I can always write a script to alter the path. Ideally, the solution would be to build my Python env in a VM, and then hand copies of the VM out.
How have other people solved this?
virtualenv + requirements.txt are your friends.
You can create several virtual Python installs for your projects, each containing exactly the library versions you need (tip: pip freeze spits out a requirements.txt with the exact library versions).
Find a good reference to virtualenv here: http://simononsoftware.com/virtualenv-tutorial/ (it's from this question Comprehensive beginner's virtualenv tutorial?).
Alternatively, if you just want to distribute your code together with libraries, PyInstaller is worth a try. You can package everything together in a static executable - you don't even have to install the software afterwards.
You want to use virtualenv. It lets you create an application-specific directory for installed packages. You can also use pip to generate a requirements.txt and install from it.
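To make the workflow concrete (the environment name myenv is just a placeholder), the usual round trip looks something like this.
On the machine whose set-up you want to capture:

pip freeze > requirements.txt

On the machine where you want to recreate it:

virtualenv myenv
source myenv/bin/activate
pip install -r requirements.txt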
With the same goal, i.e. having exactly the same Python distribution as my colleagues, I tried to create a virtual environment on a network drive, so that all of us would be able to use it without anybody having to make a local copy.
The idea was to share the same packages installed in a shared folder.
Outcome: Python ran so unbearably slowly that it could not be used. Installing a package was also very, very sluggish.
So it looks like there is no other way than using virtualenv and a requirements file. (Even if, unfortunately, it does not always work smoothly on Windows and requires manual installation of some packages and dependencies, at least at the time of writing.)

setup.py, makefile... What do I need?

I've been reading a lot these days, and I'm not sure about the specific use of each. I have to ask, because I cannot find anyone who can explain it to me. Now I'm lost...
The main problem is that I need to install my app (Python + Glade) in "/usr/share/name_app" and a ".desktop" file in "/usr/share/applications" on Ubuntu.
The solution that I've found is to create a ".deb" file, because then the installation is perfect. On Ubuntu I can launch the app with Unity by clicking on the launcher, i.e. the ".desktop" file.
(Probably I'll upload it to "Ubuntu Software Center").
For Windows I could use "py2exe" or something similar, and another tool for Mac.
But since the code is on GitHub, it should have a "setup" or a "makefile" so that people can install it.
After reading and reading (and reading), I think that "setup.py" is only for installing a module so that it can then be imported from Python.
However, if I have to install an app, how can I distribute it with a "setup.py" or a "makefile"? Which is better for installing an app? What is the difference? Which one should I use?
Thanks :)
setup.py is used to deploy Python applications and modules, often together with virtualenv (http://www.virtualenv.org/en/latest/index.html). setup.py is mostly useful for application developers - you can run
python setup.py develop
within a virtualenv to set up your development workspace with its Python dependencies.
For each platform distribution (Windows, OS X, Linux), use that platform's distribution tools, as you are currently doing.
You can also use setuptools-based tools to roll out platform packages from setup.py, e.g. creating a .deb from setup.py:
http://pypi.python.org/pypi/stdeb/
More info about setup.py
http://packages.python.org/distribute/setuptools.html
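For illustration only, a minimal setup.py for an app like the one in the question might look roughly like the sketch below. The module layout, entry point and data file path are assumptions, not something taken from the question; "name_app" just echoes the directory name mentioned there.

from setuptools import setup, find_packages

setup(
    name="name_app",
    version="0.1",
    packages=find_packages(),
    # installs a launcher command for the GUI into the user's PATH
    entry_points={
        "gui_scripts": ["name_app = name_app.main:main"],
    },
    # prefix-relative path: ends up under /usr/share/applications when
    # installed with --prefix=/usr (e.g. by a Debian package)
    data_files=[("share/applications", ["data/name_app.desktop"])],
)

A setup.py of this shape is also what stdeb consumes to build a .deb, and py2exe/py2app builds for Windows and Mac are driven from the same file.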
To grasp the general concepts you could read the stdlib docs: An Introduction to Distutils.
The first several search results for the keywords "python packaging" point to the Python Packaging User Guide, which should show what setup.py is and how to use it.

Distributing python code with virtualenv?

I want to distribute some Python code, with a few external dependencies, to machines with only core Python installed (and users who are unfamiliar with easy_install etc.).
I was wondering if perhaps virtualenv can be used for this purpose? I should be able to write some bash scripts that trigger the virtualenv (with the suitable packages) and then run my code... but this seems somewhat messy, and I'm wondering if I'm reinventing the wheel.
Are there any simple solutions to distributing python code with dependencies, that ideally doesn't require sudo on client machines?
Buildout - http://pypi.python.org/pypi/zc.buildout
As a sample, look at my clean project: http://hg.jackleo.info/hyde-0.5.3-buildout-enviroment/src. It's only 2 files that do the magic; moreover, the Makefile is optional, but without it you'll need bootstrap.py (the Makefile downloads it, but it runs only on Linux). buildout.cfg is the main file where you write the dependencies and the configuration for how the project is laid out.
To get bootstrap.py, just download it from http://svn.zope.org/repos/main/zc.buildout/trunk/bootstrap/bootstrap.py
Then run python bootstrap.py and bin/buildout. I do not recommend installing buildout yourself, although it is possible; just use the one that bootstrap downloads.
I must admit that buildout is not the easiest solution, but it's really powerful, so learning it is worth the time.
UPDATE 2014-05-30
Since this was recently upvoted and is (probably) being used as an answer, I want to point out a few changes.
First of all, bootstrap.py is now downloaded from GitHub: https://raw.githubusercontent.com/buildout/buildout/master/bootstrap/bootstrap.py
That hyde project would probably fail now due to breaking changes in buildout 2.
You can find better samples here: http://www.buildout.org/en/latest/docs/index.html. I also want to suggest looking at the "collection of links related to Buildout" part; it might contain info for your project.
Secondly, I am personally more in favor of a setup.py script that can be installed using Python. More about the egg structure can be found here: http://peak.telecommunity.com/DevCenter/PythonEggs, and if that looks too scary, search Google for "python egg". In my opinion it's actually simpler than buildout (and definitely easier to debug), and it is probably more useful, since it can be distributed more easily and installed anywhere with the help of virtualenv or globally, whereas with buildout you have to ship all of the build scripts with the source all of the time.
You can use a tool like PyInstaller for this purpose. Your application will appear as a single executable on all platforms, and include dependencies. The user doesn't even need Python installed!
As an example, see my logview package, which depends on PyQt4 and ZeroMQ and includes distributions for Linux, Mac OS X and Windows, all created using PyInstaller.
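For reference, with a recent PyInstaller the basic invocation (the script name here is a placeholder) is along the lines of:

pyinstaller --onefile yourscript.py

which drops a self-contained executable into the dist/ directory.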
You don't want to distribute your virtualenv, if that's what you're asking. But you can use pip to create a requirements file - typically called requirements.txt - and tell your users to create a virtualenv then run pip install -r requirements.txt, which will install all the dependencies for them.
See the pip docs for a description of the requirements file format, and the Pinax project for an example of a project that does this very well.

Deploying python app to Mac and Windows users

I've written an app in Python that depends on wxPython and some other Python libraries. I know about py2exe for making Python scripts executable on Windows, but what would be the easiest way to share this with my Mac-using friends, who wouldn't know how to install the required dependencies? One option would be to bundle my dependencies in the same package, but that seems kind of clunky. How do people usually deploy such apps? For once I miss Java...
You could check out py2app, which is similar to py2exe.
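As a rough sketch (the script name is a placeholder), the minimal setup.py that py2app expects looks something like this, and the bundle is then built with python setup.py py2app:

from setuptools import setup

setup(
    app=["myapp.py"],           # the application's main script (placeholder name)
    setup_requires=["py2app"],  # makes the py2app command available to setup.py
)

The result is a .app bundle in the dist/ directory that your Mac-using friends can run without installing the Python dependencies themselves.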
How do people usually deploy such apps?
2 choices.
With instructions.
All bundled up.
You write simple instructions like this. Folks can follow them pretty reliably, unless they don't have enough privileges. Sometimes they need sudo in Linux environments.
Download easy_install (or pip)
easy_install this, easy_install that (or pip this, pip that)
easy_install whatever package you wrote.
It works really well. If you download some Python packages you'll see this in action.
Sphinx requires docutils. Django requires docutils and PIL. It works out really well to simply document the dependencies. Other folks seem to do it without serious problems. Follow their lead.
Bundling things up means you have to
(a) provide the entire original distribution (as required by most open source licenses)
(b) provide an open source license compatible with the licenses of the things you bundled. This can be easy if you depend on things that all have the same license. Otherwise, you basically can't redistribute them and have to resort to installation instructions.
