Here is my situation:
I work on a Python project that uses its own customized, project-specific virtual environment.
At some point, however, I need to introduce third-party modules into this venv, specifically pycurl and openpyxl. At present I have to run the scripts that depend on these two modules from Anaconda's venv, which includes both.
I don't want to keep switching back and forth between the two venvs.
Our corporate firewall blocks direct access to outside code repositories. I did manage to download pycurl from https://dl.bintray.com/pycurl/pycurl/. After unzipping it (I'm on Windows, by the way), I noticed these two directories:
site-packages\curl
site-packages\pycurl-7.43.0-py3.5.egg-info
In my project's directory tree I found plenty of modules following much the same naming conventions, all under:
my_project\PythonVenv\Lib\site-packages
I then copied curl and pycurl-7.43.0-py3.5.egg-info there, re-activated the project's venv, and tried running the script, but it still complains:
ModuleNotFoundError: No module named 'pycurl'
Maybe simply copying doesn't work? Do I need to use something like "python setup.py" or "pip install"?
First, no setup.py came with the pycurl package; second, "pip install" doesn't work because of the corporate firewall.
Can anyone help? Thanks.
On Windows you should be using the Windows binary packages. It sounds like you are trying to unpack the source distribution.
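If what you downloaded is (or can be re-downloaded as) a binary wheel, pip can install it offline straight from the local file, with the project venv activated. A sketch; the wheel filename below is an assumption and must match your Python version and architecture:
pip install pycurl-7.43.0-cp35-cp35m-win_amd64.whl   # hypothetical filename; no network access needed
This puts the package into my_project\PythonVenv\Lib\site-packages with the correct metadata, which plain file copying does not.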
Related
I would like to pack a python program and ship it in a deb package.
For reasons of my own (I know it's bad practice in 99% of cases) I want to ship the program in a Python virtual environment within a Debian package.
I know I can do this using dh-virtualenv. This works great - generally no problem.
But the problem arises when I want to upload this to Launchpad. Uploading to Launchpad means uploading a source package, and in terms of dh-virtualenv a source package is just the package description; the virtualenv has not been created yet.
When I upload this to Launchpad, the package does not build: dh-virtualenv, executed during the build on Launchpad, tries to install the Python modules into the virtualenv, which means fetching them from PyPI, and that fails because Launchpad does not allow external network access.
So basically there are two possible solutions:
Approach A
Prepare the virtualenv and somehow incorporate it into the source package and having the dh build process simply "move" this prepared virtualenv to its end location. This could work with virtualenv --relocatable. BUT the relocation strips the utf-8 marker at the beginning of all python scripts, rendering all python scripts in the virtualenv broken.
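For reference, the relocation step mentioned above is a single command (the flag existed in older virtualenv releases; the path is a placeholder):
virtualenv --relocatable path/to/venv   # rewrites the env's scripts so the directory can be moved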
Approach B
Somehow cache all necessary python packages in the source package and have dh_virtualenv install from the cache instead of from PyPI.
This seems doable with pip2pi, but my experiments show that it will not install the packages, even though they are present in the local package index.
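As a rough sketch of the caching idea using plain pip rather than pip2pi (paths are placeholders): pre-fetch the packages on a machine with network access, ship the cache directory inside the source package, and point pip at it during the build:
# on a machine with PyPI access: pre-fetch everything into a cache directory
pip download -r requirements.txt -d ./pip-cache
# during the offline build: install from the cache, never touching the network
pip install --no-index --find-links=./pip-cache -r requirements.txt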
Both approaches seem a bit clumsy and prone to errors.
What do you think of this?
What are your experiences?
What would you recommend?
I've recently started writing Python scripts and I'm still a newbie to the language.
I'm stuck with a problem: my script requires the 'requests' library (and the other packages pip installs with it), plus some folders such as 'database', where I store a sqlite3 file. I need to install the script on a lot of machines that run different Ubuntu versions, and therefore different Python versions, and I want the script to run standalone, without having to install or update Python, pip, and the 'requests' package every time I set it up on a new machine. I'm developing in a virtualenv on my machine, which is currently set up with all the packages the script needs.
Can I make a 'copy' of my virtualenv so it can be moved with my Python script (including my database folder) to other computers, so that I don't have to install/update Python and pip on every machine and can use this standalone Python instead? All the machines are Linux.
I already tried copying my virtualenv into my project folder, but the virtualenv broke when I ran my script with the interpreter inside it in the shebang line, even when using the --relocatable argument, so I guess that's not the way.
I've also tried using PyInstaller, no success.
Welcome to the world of deployment! The answer you seek is far from trivial.
First off, Python is an interpreted language that isn't really supposed to be distributed as a desktop application. If you would like to create executables, then there are some libraries for that, such as py2exe. However, these are ad-hoc solutions at best. They "freeze" the whole of Python along with your code, and then you ship everything together.
The best-practice way to declare your dependencies is a requirements.txt file. You can create one with this command:
pip freeze > requirements.txt
This checks all the libraries currently installed in whatever env you're working in and saves them to a file called requirements.txt. That file will then list all of your required libraries, and anyone who receives your code can just run
pip install -r requirements.txt
and it will install all the dependencies.
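For illustration, a requirements.txt for a project using requests might look something like this (the pinned versions here are hypothetical):
requests==2.18.4
certifi==2017.7.27.1
chardet==3.0.4
idna==2.6
urllib3==1.22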
However, that just takes care of library dependencies. What about the version of Python itself, the OS environment, etc.? This is where you may need to start looking at solutions like Docker. With Docker, you can specify the full environment in a Dockerfile. Then anyone on another machine can run the Docker image with all of the dependencies included. This is fast becoming the de facto way of shipping code (in all languages, but it is especially useful in Python).
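A minimal Dockerfile sketch of that idea; the base image tag and the script name app.py are assumptions:
# Dockerfile -- minimal sketch, names are placeholders
FROM python:3.5-slim
WORKDIR /app
COPY requirements.txt .
# install the pinned dependencies at image build time
RUN pip install -r requirements.txt
COPY . .
# placeholder entry point
CMD ["python", "app.py"]
Building the image once and running it anywhere Docker is installed sidesteps the per-machine Python/pip setup entirely.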
I'm new to Python, so I think my question is very fundamental and has been asked a few times before, but I cannot really find anything (maybe because I don't really know how to search for this problem).
I installed a module in Python (reportlab). Now I wanted to modify a Python script in that module, but it seems the interpreter does not notice the updates to the script. Ironically, the import succeeds even though Python shouldn't be able to find the package at all, because I deleted it beforehand. Does Python use something like a cache or some other storage for modules? How can I edit modules and use the updated scripts?
From what you are saying, you downloaded a package and installed it using either a local pip or setup.py. When you do so, all the files are copied into your Python package directory. So after an install you can delete the source folder, because Python is not looking there.
If you want to be able to modify the source, edit something, and see the changes, you have to install the package in editable mode. Inside its main folder, run:
python setup.py develop
or
pip install -e .
This creates a symbolic link from your Python package directory back to the source tree, so you will be able to modify the sources in place.
Careful: for the changes to take effect, you have to restart your Python interpreter. You cannot just import the module again.
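The reason is Python's module cache: a second import statement just returns the object already stored in sys.modules. A small illustration, using reportlab as in the question:
import importlib
import reportlab             # first import: executes the module and caches it in sys.modules
import reportlab             # second import: a no-op, Python returns the cached module
importlib.reload(reportlab)  # re-executes the module's code in place; fine for simple cases,
                             # but restarting the interpreter is the reliable option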
I'm working by myself right now, but am looking at ways to scale my operation.
I'd like to find an easy way to version my Python distribution, so that I can recreate it very easily. Is there a tool to do this? Or can I add /usr/local/lib/python2.7/site-packages/ (or whatever) to an svn repo? This doesn't solve the problems with PATHs, but I can always write a script to alter the path. Ideally, the solution would be to build my Python env in a VM, and then hand copies of the VM out.
How have other people solved this?
virtualenv + requirements.txt are your friend.
You can create several virtual Python installs for your projects, each containing exactly the library versions you need (tip: pip freeze spits out a requirements.txt with the exact library versions).
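The typical round trip looks like this (the env name is a placeholder):
# on the machine that already works: record the exact versions
pip freeze > requirements.txt
# on any other machine: create a fresh env and reproduce the set
virtualenv myenv
source myenv/bin/activate
pip install -r requirements.txt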
Find a good reference to virtualenv here: http://simononsoftware.com/virtualenv-tutorial/ (it's from this question Comprehensive beginner's virtualenv tutorial?).
Alternatively, if you just want to distribute your code together with libraries, PyInstaller is worth a try. You can package everything together in a static executable - you don't even have to install the software afterwards.
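A typical PyInstaller invocation, for what it's worth (the script name is a placeholder):
pyinstaller --onefile yourscript.py   # bundles the interpreter and libraries into one executable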
You want to use virtualenv. It lets you create application-specific directories for installed packages. You can also use pip to generate a requirements.txt and install from it.
Aiming at the same goal, i.e. having exactly the same Python distribution as my colleagues, I tried to create a virtual environment on a network drive, so that all of us could use it without anyone making a local copy.
The idea was to share the same packages installed in a shared folder.
Outcome: Python ran so unbearably slowly that it could not be used, and installing a package was extremely sluggish.
So it looks like there is no way around virtualenv and a requirements file. (Unfortunately this does not always work smoothly on Windows and may require manual installation of some packages and dependencies, at least at the time of writing.)
I have created a little Python egg (with setuptools) that I want to install on other machines of my LAN. I have even set up a server for the eggs and all (and the egg is properly downloaded and installed with easy_install -f http://myserver/eggrepository ) :-)
I would like to know if there's a way of running a script (bash or Python) when installing the egg with easy_install (version 0.6c11, Python 2.6).
I have added a bash script to the package, and I'd like to run it automatically when the egg is installed (mainly to hook some functionality into the rcX.d levels, start things at boot, etc.). Right now I have to go to /usr/local/lib/python2.6/dist-packages, find the folder where my egg was installed, and run the bash script inside it... But that solution is not very robust, and I'm sure it will give me problems if versions, paths, etc. change.
I've been reading around and found some posts saying it isn't possible, but they are a bit old and maybe there's a way now... I also found others saying it is possible with distutils (which means setuptools can probably do it too), but I haven't been able to find a suitable solution using setuptools.
Thank you in advance
Related:
How can I add post install scripts...
How to extend distutils with a simple post install script
Ok... I found a workaround...
python-packaging-custom-scripts
It's not as straight-forward as I would have liked, but well...
I can put the installation steps in an sh file and then, since the install puts a Python script on the user's PATH, call that script from the bash script that installs the package...
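A minimal sketch of that workaround, with placeholder names: expose the post-install steps as a console_scripts entry point, so the wrapper sh script can call it by name instead of hunting through dist-packages:
# setup.py -- names are placeholders
from setuptools import setup, find_packages

setup(
    name='myegg',
    version='0.1',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            # after installation, 'myegg-postinstall' is on the PATH,
            # so the installing sh script can simply run it
            'myegg-postinstall = myegg.postinstall:main',
        ],
    },
)
Here myegg/postinstall.py would define a main() function containing whatever setup work (rcX.d hooks, etc.) the bash script used to do.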