I have created a little Python egg (with setuptools) that I want to install on other machines on my LAN. I have even set up a server for the eggs and all (and the egg is properly downloaded and installed with easy_install -f http://myserver/eggrepository ) :-)
I would like to know if there's a way of running a script (bash or Python) when installing the egg with easy_install (version 0.6c11, Python 2.6).
I have added a bash script to the package, and I'd like to be able to run it automatically when the egg is installed (mainly to register services in the rcX.d runlevels so they run at startup, etc.). Right now I have to go to /usr/local/lib/python2.6/dist-packages, find the folder where my egg was installed, and run the bash script inside it... But that solution is not very robust, and I'm sure it will give me problems if I change versions, paths, etc.
I've been reading around and found some posts saying it wasn't possible, but they are a bit old and maybe there's a way now... I also found others saying it was possible with distutils (which means that setuptools can probably do it too), but I haven't been able to find a working solution using setuptools.
Thank you in advance
Related:
How can I add post install scripts...
How to extend distutils with a simple post install script
Ok... I found a workaround...
python-packaging-custom-scripts
It's not as straight-forward as I would have liked, but well...
I can put the installation process in a shell script, and then, since the egg puts a Python script on the user's path, I can call that script from the bash script that installs the package...
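For reference, the distutils-style approach those older posts hint at is to subclass the install command in setup.py and run extra steps after the normal installation. A minimal sketch follows; the scripts/post_install.sh path is hypothetical, and be aware that easy_install may skip this hook when it installs a pre-built egg instead of running setup.py:

from setuptools import setup
from setuptools.command.install import install
import subprocess

class CustomInstall(install):
    """Run the standard install, then a post-install step."""
    def run(self):
        install.run(self)
        # Hypothetical hook: adjust the path to point at your script.
        subprocess.call(['/bin/bash', 'scripts/post_install.sh'])

setup(
    name='myegg',
    version='0.1',
    packages=['myegg'],
    cmdclass={'install': CustomInstall},
)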
Here is my situation:
I work on a Python project in which we work within a particular, project-oriented and customized virtual environment.
However, at some point I need to introduce 3rd-party modules into this venv, concretely speaking pycurl and openpyxl. At present I have to run the Python scripts that depend on these two modules from Anaconda's venv, which includes both of them.
I don't feel like switching back and forth between these two venvs.
We have a corporate firewall blocking direct access to outside code repositories. I managed, however, to download pycurl from https://dl.bintray.com/pycurl/pycurl/, and after unzipping it (BTW, I use Windows) I noticed these two entries:
site-packages\curl
site-packages\pycurl-7.43.0-py3.5.egg-info
In my project's directory tree, I found bunches of modules pretty much in line with the naming conventions as above, they are all under:
my_project\PythonVenv\Lib\site-packages
Then I copied curl and pycurl-7.43.0-py3.5.egg-info there, re-activated the project's venv, and tried running the script, but it still complains:
ModuleNotFoundError: No module named 'pycurl'
Maybe simply copying doesn't work? Do I need to use something like "python setup.py" or "pip install"?
First, I didn't see a setup.py come with the pycurl package; secondly, "pip install" doesn't work due to the corp firewall.
Can anyone help? Thanks.
On Windows you should be using the Windows binary packages. It sounds like you are trying to unpack the source distribution.
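If the download site offers Windows wheel (.whl) files, pip can install one straight from the local file with no network access, which sidesteps the firewall. The exact file name below is only illustrative, matching the CPython 3.5 build implied by the egg-info folder in the question:

pip install C:\Downloads\pycurl-7.43.0-cp35-cp35m-win_amd64.whl

Run that from inside the activated project venv so it lands in the right site-packages.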
I've recently started writing Python scripts and I'm still a newbie to the language.
I'm stuck with a problem: my script requires the 'requests' library (and the other packages that come with it when using pip), plus some folders like 'database', where I store a sqlite3 file. I need to install the script on a lot of machines with different Ubuntu versions, and therefore different Python versions, and I want my script to run standalone, without having to install/update Python, pip and the 'requests' package every time I set up the script on a new machine. I'm developing in a virtualenv on my machine, which is currently set up with all the packages necessary to run the script.
Can I make a 'copy' of my virtualenv that can be moved with my Python script (including my database folder) to other computers, so that I can use this standalone version of Python instead of installing/updating Python and pip on every machine? All the machines are Linux.
I already tried copying my virtualenv into my project folder, but the virtualenv broke when I ran my script with the interpreter inside it in the shebang line, even when using the --relocatable argument, so I guess that's not the answer.
I've also tried using PyInstaller, no success.
Welcome to the world of deployment! The answer you seek is far from trivial.
First off, Python is an interpreted language that isn't really supposed to be distributed as a desktop application. If you would like to create executables, then there are some libraries for that, such as py2exe. However, these are ad-hoc solutions at best. They "freeze" the whole of Python along with your code, and then you ship everything together.
The best-practice way to declare your dependencies is in a requirements.txt file. You can create one with this command:
pip freeze > requirements.txt
What this does is check all the libraries currently installed in whatever env you're working in and save them to a file called requirements.txt. That file will then list all of your required libraries, and anyone who receives your code can just run
pip install -r requirements.txt
and it will install all the dependencies.
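The file is just a plain list of package names with pinned versions. For the question above it might look something like this; the packages are requests plus the libraries it pulls in, but the version numbers are only illustrative:

requests==2.18.4
certifi==2017.7.27.1
chardet==3.0.4
idna==2.6
urllib3==1.22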
However, that only takes care of library dependencies. What about the version of Python itself, the OS environment, etc.? This is where you may need to start looking at solutions like Docker. With Docker, you can specify the full environment in a Dockerfile. Then anyone on another machine can run the Docker image, with all of the dependencies included. This is fast becoming the de facto way of shipping code (in all languages, but it is especially useful in Python).
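A minimal sketch of such a Dockerfile, assuming the entry point is a script named main.py (that file name, and the python:3 base image tag, are assumptions to adapt):

# Hypothetical Dockerfile: base image and script name are placeholders.
FROM python:3
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "main.py"]

Building this with docker build and running the resulting image gives every machine the same Python, libraries and filesystem layout.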
I'm a Windows guy who just compiled my first Python application; can anyone tell me where the compiled output would end up?
I just ran sudo make install to install PyOpenCL, which is a dependency.
Now I'm trying to install and run phoenix2 and I ran the following:
sudo python ./setup.py install
and now I'm not sure where to look for and execute the file as described here. Any assistance would be appreciated (I'm a bit of a n00b here, overwhelmed by all the documentation)
When you run make install, the make application looks in the project's Makefile to find out where it should put executables, as well as any other files the application needs to run. This, of course, assumes that the project even has executables (a library might not, for example).
Look in the project base directory (the dir you ran make install from) for a file named Makefile. It should have a variable called BIN_DIR or similar that tells you where it wants final binaries to go.
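For the setup.py half of the question, distutils can record exactly where it puts everything: re-run the install with the --record flag and it writes the full list of installed paths to a file (the file name here is arbitrary):

sudo python ./setup.py install --record installed_files.txt
cat installed_files.txt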
I downloaded the tarball of Python 2.7.2 to install on a SUSE Linux server (the server comes with 2.6 and 3.1).
Untarred it (I know--wrong lingo, sorry) to a directory.
When trying to run ./configure, which should create a valid makefile, I can't get past step one: the script reports that it can't find a compiler on the path.
But when I run the shell in the same directory and type "make", make runs.
I am really unfamiliar with Linux, but this just seems so basic that I can't even begin to see what's wrong.
I also downloaded what appears to be an RPM file for Python 2.7.2 for SUSE Linux, but I can't for the life of me figure out how to "import" this package into YaST2 or "Install Software". These two tools seem impenetrable and hostile to packages saved in the file system rather than accessed from specific distribution web sites.
Really, this should be just trivial but it is not.
SUSE uses GNOME, and GNOME seems to have its own view of what the directory structure should be for desktop end-user-y kinds of files. That is where I put my downloaded tar file. Might I do better if I put it somewhere under /usr?
Sorry to be so much more clueless than most stackoverflow participants but I am just not a Linux guy.
Sounds like you simply don't have the compiler installed. Do:
sudo zypper install gcc
If a ./configure fails, there's no point in running make.
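Once gcc is in place, the usual from-source sequence applies. Using make altinstall rather than make install installs the interpreter as python2.7 and avoids overwriting the distribution's own python binary, which the system tools depend on:

./configure --prefix=/usr/local
make
sudo make altinstall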
SUSE has a package manager called YaST. It would do your installation with no fuss.
I want to distribute some Python code, with a few external dependencies, to machines with only core Python installed (and users that are unfamiliar with easy_install etc.).
I was wondering if perhaps virtualenv could be used for this purpose? I should be able to write some bash scripts that activate the virtualenv (with the suitable packages) and then run my code... but this seems somewhat messy, and I'm wondering if I'm re-inventing the wheel.
Are there any simple solutions to distributing python code with dependencies, that ideally doesn't require sudo on client machines?
Buildout - http://pypi.python.org/pypi/zc.buildout
As a sample, look at my clean project: http://hg.jackleo.info/hyde-0.5.3-buildout-enviroment/src - it's only 2 files that do the magic. Moreover, the Makefile is optional, but without it you'll need bootstrap.py yourself (the Makefile downloads it, but the Makefile runs only on Linux). buildout.cfg is the main file where you write the dependencies and the configuration for how the project is laid out.
To get bootstrap.py, just download it from http://svn.zope.org/repos/main/zc.buildout/trunk/bootstrap/bootstrap.py
Then run python bootstrap.py and then bin/buildout. I do not recommend installing buildout locally, although it is possible; just use the one bootstrap downloads.
I must admit that buildout is not the easiest solution, but it's really powerful, so the learning time is worth it.
UPDATE 2014-05-30
Since it was recently up-voted and (probably) used as an answer, I want to flag a few changes.
First of all, bootstrap.py is now downloaded from GitHub: https://raw.githubusercontent.com/buildout/buildout/master/bootstrap/bootstrap.py
That hyde project would now probably fail due to buildout 2's breaking changes.
You can find better samples at http://www.buildout.org/en/latest/docs/index.html. I also suggest looking at the "collection of links related to Buildout" section; it might contain info for your project.
Secondly, I am personally now more in favor of a setup.py script that can be installed using Python. More about the egg structure can be found at http://peak.telecommunity.com/DevCenter/PythonEggs, and if that looks too scary, search Google for "python egg". In my opinion it's actually simpler than buildout (and definitely easier to debug), and probably more useful, since it can be distributed more easily and installed anywhere with the help of virtualenv, or globally, whereas with buildout you have to ship all of the build scripts with the source all of the time.
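A minimal setup.py along those lines might look like this; the project name, version and dependency are placeholders:

from setuptools import setup, find_packages

setup(
    name='myproject',          # placeholder project name
    version='0.1',
    packages=find_packages(),  # pick up all packages in the source tree
    install_requires=[
        'requests',            # example third-party dependency
    ],
)

With that in place, pip install . (or python setup.py install) resolves install_requires automatically, inside a virtualenv or globally.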
You can use a tool like PyInstaller for this purpose. Your application will appear as a single executable on all platforms, and include dependencies. The user doesn't even need Python installed!
See as an example my logview package, which has dependencies on PyQt4 and ZeroMQ and includes distributions for Linux, Mac OSX and Windows all created using PyInstaller.
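In its simplest form a one-file build is a single command; main.py stands in for whatever your entry script is called:

pyinstaller --onefile main.py

The self-contained executable ends up in the dist/ subdirectory.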
You don't want to distribute your virtualenv, if that's what you're asking. But you can use pip to create a requirements file - typically called requirements.txt - and tell your users to create a virtualenv and then run pip install -r requirements.txt, which will install all the dependencies for them.
See the pip docs for a description of the requirements file format, and the Pinax project for an example of a project that does this very well.
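On a user's machine the whole sequence is just a few commands (the directory name venv is arbitrary):

virtualenv venv
source venv/bin/activate
pip install -r requirements.txt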