I run Vagrant on Mac OS X. I am coding inside a virtual machine with CentOS 6, and I have the same versions of Python and Ruby in my development and production environment. I have these restrictions:
I cannot install software manually; everything must come through RPM.
I cannot use pip install or gem install to install the libraries I want, as the system is managed through Puppet and everything I add will be removed.
yum has old packages. I usually cannot find the latest versions of the libraries.
I would like to put my libraries locally in a lib directory near my scripts, and create an RPM that includes those frozen versions of the dependencies. I cannot find an easy way to bundle my libraries with my scripts and push everything to my production server. I would like to know the easiest way to gather my dependencies in Python and Ruby.
I tried:
virtualenv (with --relocatable option)
PYTHONPATH
sys.path.append("lib path")
I don't know which is the right way to go. Also, for Ruby, is there any way to solve my problem with Bundler? I see that Bundler is for Rails. Does it work for small custom scripts?
I like the approach in Node.js and npm: all packages are stored locally in node_modules. I have the Node.js RPM installed, and I deploy a folder with my application onto the production server. I would like to do the same in Ruby and Python.
I don't know Node, but what you describe for NPM seems to be exactly what a virtualenv is. Once the virtualenv is activated, pip installs only within that virtualenv - so puppet won't interfere. You can write out your current list of packages to a requirements.txt file with pip freeze, and recreate the whole thing again with pip install -r requirements.txt. Ideally you would then deploy with puppet, and the deploy step would involve creating or updating the virtualenv, activating it, then running that pip command.
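For example, a deploy step along those lines might look like this (a minimal sketch; /srv/myapp is a placeholder path):
pip freeze > requirements.txt      # on the dev machine: record exact versions
virtualenv /srv/myapp/env          # on the server: create (or recreate) the env
. /srv/myapp/env/bin/activate      # activate it
pip install -r requirements.txt    # install the same pinned packages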
Maybe take a look at Docker?
With Docker you could create an image of your specific environment and deploy that.
https://www.docker.com/whatisdocker/
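A minimal Dockerfile sketch of that idea (the base image and file names here are assumptions, not a definitive setup):
FROM python:2.7                    # assumed base image with pip preinstalled
COPY requirements.txt /app/        # your pinned dependencies
RUN pip install -r /app/requirements.txt
COPY . /app                        # your scripts plus their local lib directory
CMD ["python", "/app/main.py"]     # hypothetical entry point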
Related
I'm an experienced developer, but not very familiar with Python and Pyramid.
I'm trying to follow some (a bit old and unclear) instructions on deploying a Pyramid web application. My OS is Ubuntu 16.04.
The instructions say to install virtualenv and Pyramid - I do so with apt install virtualenv and apt install python-pyramid. Then they say I should run the app in a virtual environment, so I build that with virtualenv . -ppython3, and activate it with source bin/activate. I install the application from a ready-to-run buildout from GitHub. The buildout includes a "production.ini" file with parameters to pserve.
But Pyramid is not included in the virtual environment built with virtualenv. (There is no "pserve" in the bin directory, for example.) So I can't run the application with bin/pserve etc/production.ini, as the instructions say. And if I try with just "pserve", I get errors when trying to access files like "var/waitress-%(process_num)s.sock" — files that the app expects to find in the virtual environment.
I've looked for flags to tell virtualenv to include Pyramid, but couldn't find any. Am I overlooking something? I'd be most grateful for some help! :-)
/Anders from Sweden
Perhaps you might want to try installing Pyramid in your virtual environment using pip: apt-installed libraries go into the system Python's dist-packages, which an isolated virtualenv deliberately doesn't see. In the guide, it seems like you want Pyramid installed inside the virtual environment so that it can be used by your program, so I think you'd be best off using pip rather than apt. A quick search suggests pyramid on PyPI is the package you need. All you'd have to do is enter the virtual environment and run pip install pyramid. This way, it will be available only within the virtual environment as well!
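Concretely, from the directory where you built the environment, a minimal sequence would be (assuming the layout from the question):
source bin/activate
pip install pyramid                # also installs pserve into the env's bin
bin/pserve etc/production.ini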
You mentioned it's using buildout - I assume this is zc.buildout. buildout usually manages its own isolated environment and handles installing all of the necessary dependencies. It really depends on how that buildout is configured, as there's no standard for what to do or how to run your app. I would normally expect pserve to be exposed in the bin folder, but maybe another app-specific script is exposed instead.
What is the purpose of virtualenv inside a Docker Django application? Python and the other dependencies are already installed in the image, but at the same time it's necessary to install lots of packages using pip, so it's still unclear to me whether the two conflict.
Could you please explain the concept?
EDIT: Also, for example: I created a virtualenv inside my Docker Django app, installed djangorestframework with pip (it shows up in pip freeze), and added it to INSTALLED_APPS in settings.py, but docker-compose up raises the error "No module named rest_framework". I checked, and everything looks correct. Could it be a Docker/virtualenv conflict?
Docker and containerization might inspire the illusion that you do not need a virtual environment. Glyph makes a very compelling argument against this misconception in his PyCon talk.
The same fundamental aspects of virtualenv advantages apply for a container as they do for a non-containerized application, because fundamentally you're still running a linux distribution.
Debian and Red Hat are fantastically complex engineering projects, integrating billions of lines of C code. For example, you can just apt install libavcodec, or yum install ffmpeg. Writing a working build system for one of those things is a PhD thesis. They integrate thousands of Python packages simultaneously into one working environment. They don't always tell you whether their tools use Python or not. And so, you might want to docker exec some tools inside a container; they might be written in Python; and if you sudo pip install your application in there, now it's all broken. So even in containers, isolate your application code from the system's Python.
Regardless of whether you're using Docker or not, you should always run your application in a virtual environment.
Now, in Docker in particular, using a virtualenv is a little trickier than it should be. Inside Docker each RUN command runs in isolation, and no state other than file-system changes is kept from line to line. To install into a virtualenv you have to prepend the activation command on every line:
# assumes requirements.txt has already been COPY'd into the image
RUN apt-get install -y python-virtualenv
RUN virtualenv /appenv
RUN . /appenv/bin/activate; \
    pip install -r requirements.txt
ENTRYPOINT . /appenv/bin/activate; \
           run-the-app
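An alternative sketch that avoids repeating the activation on every line is to put the virtualenv's bin directory on the PATH, which is most of what activate does anyway (same /appenv path as above; run-the-app is still a placeholder):
RUN apt-get install -y python-virtualenv
RUN virtualenv /appenv
ENV PATH="/appenv/bin:$PATH"       # later RUN/ENTRYPOINT lines now use the env's python and pip
RUN pip install -r requirements.txt
ENTRYPOINT ["run-the-app"]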
A virtualenv is there for isolating packages to a specific environment. Docker is also there to isolate settings to a specific environment. So, in essence, if you use Docker there isn't much benefit to using virtualenv too.
Just pip install things into the Docker environment directly; it'll do no harm. To pip install the requirements, use the Dockerfile, where you can execute commands.
You can find a pseudo-code example below.
FROM /path/to/used/docker/image
COPY requirements.txt .            # the file must be in the image before pip can read it
RUN pip install -r requirements.txt
I'm new to virtualenv, but I'm writing a Django app and eventually I will have to deploy it somehow.
So let's assume I have my app working on my local virtualenv, where I installed all the required libraries. What I want to do now is run some kind of script that will take my virtualenv, check what's installed inside, and produce a script that will install all these libraries on a fresh virtualenv on another machine. How can this be done? Please help.
You don't copy-paste your virtualenv. You export the list of all the installed packages, like so -
pip freeze > requirements.txt
Then push the requirements.txt file to wherever you want to deploy the code, and then just do what you did on the dev machine -
$ virtualenv <env_name>
$ source <env_name>/bin/activate
(<env_name>)$ pip install -r path/to/requirements.txt
And there you have all your packages installed with the exact version.
You can also look into Fabric to automate this task, with a function like this -
from fabric.api import cd, env, prefix, run

def pip_install():
    with cd(env.path):
        with prefix('source venv/bin/activate'):
            run('pip install -r requirements.txt')
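Assuming env.path is set in your fabfile, you'd then run it against the target machine with something like:
fab -H user@yourserver pip_install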
You can install virtualenvwrapper and try cpvirtualenv, but the developers advise caution here:
Warning
Copying virtual environments is not well supported. Each virtualenv
has path information hard-coded into it, and there may be cases where
the copy code does not know it needs to update a particular file. Use
with caution.
If it is going to live at the same path, you can tar it and extract it on another machine. It will work as long as all the same dependencies, libraries, etc. are available on the target machine.
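A rough sketch of that approach (paths are examples only; the environment must end up at the identical absolute path):
tar -czf env.tar.gz -C / path/to/env           # archive the environment
scp env.tar.gz user@target:                    # copy it to the other machine
ssh user@target 'tar -xzf env.tar.gz -C /'     # extract it at the same path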
What tools or best practices are available for tracking and managing dependencies of the software I'm developing? I'm using Python / Django, and to date all my software requirements are open source.
I'm developing a web application that, while modest, has a number of dependencies. At a minimum, I'd like to track the software and version number for these. I suppose I'd also like to track the configurations of the required software, and possibly some system-level stuff (userid, if any, of the process of the instance required software, and required permissions thereof).
(Better yet would be something that would help me set up a server for the application when I'm ready to deploy. Better still would be something that lets me track the HTTP and DNS name servers used to support the app. But rumor has it that Puppet is a tool for that sort of thing.)
Use pip and virtualenv. With virtualenv, you can create a "virtual environment" which has all your Python packages installed into a local directory. With pip install -r, you can install all packages listed in a specific requirements file.
Rough example:
virtualenv /path/to/env --no-site-packages --unzip-setuptools # create virtual environment
source /path/to/env/bin/activate # activate environment
easy_install pip # install pip into environment
source /path/to/env/bin/activate # reload to get access to pip
pip install -r requirements.txt
Where requirements.txt contains lines like this:
django==1.3
The great thing about this is that requirements.txt serves both as documentation and as part of the installation procedure, so there's no need to synchronize the two.
Being new to the Python game, I seem to have missed out on some knowledge of how you can develop on a program but also keep it usable in your live environment.
Programs like gpodder can be run directly from a source checkout, which is really handy; however, others want to be "installed" to run.
A lot of programs are distributed with a setup.py and instructions to run "python ./setup.py install" as root, which will put stuff somewhere in your file system. There are even install commands like "develop" which seem to hold the promise of what I want. So I tried:
export PYTHONPATH=/home/alex/python
python ./setup.py develop --install-dir=/home/alex/python
Which downloaded a bunch of stuff locally and seems to magically ensure the application I'm hacking on is still being run out of the source tree. So I guess my roundabout question is: is this the correct way of developing Python code? How do things like easy_install and pip factor into this?
So I tried the following:
python /usr/share/pyshared/virtualenv.py /home/alex/src/goobook
cd /home/alex/src/goobook/googbook.git
/home/alex/src/goobook/bin/python ./setup.py develop
And finally linked the program in question to my ~/bin
cd /home/alex/src/goobook
linkbin.pl bin/goobook
However, invocation throws up a load of extra chatter, which seems to imply it's wrong:
17:17 alex#socrates/i686 [goobook] >goobook --help
/home/alex/bin/goobook:5: UserWarning: Module pkg_resources was already imported from /home/alex/src/goobook/lib/python2.5/site-packages/setuptools-0.6c8-py2.5.egg/pkg_resources.py, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
/home/alex/bin/goobook:5: UserWarning: Module site was already imported from /home/alex/src/goobook/lib/python2.5/site.pyc, but /home/alex/src/goobook/lib/python2.5/site-packages/distribute-0.6.10-py2.5.egg is being added to sys.path
from pkg_resources import load_entry_point
Install virtualenv (http://pypi.python.org/pypi/virtualenv) to set up a localized virtual environment for your libraries, and setuptools (http://pypi.python.org/pypi/setuptools), i.e. "easy_install", to install new things.
Virtualenv allows you to work in completely independent and isolated Python environments. It lets you easily create multiple environments which have different Python packages installed, or different versions of the same package. Virtualenv also lets you easily switch between your different environments.
As of 2012, the de facto preferred tool for package management in Python is pip rather than setuptools. pip is able to handle dependencies and to install/uninstall packages globally or inside a virtual environment. pip even comes out of the box with virtualenv.
Python 3
Also worth mentioning is the fact that virtual environments are becoming a part of Python itself in release 3.3, with the implementation of PEP 405.
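A quick sketch of what that looks like with the standard library alone (the path is a placeholder):
python3 -m venv /path/to/env        # available since 3.3; pip is bundled from 3.4 on
source /path/to/env/bin/activate
pip install somepackage             # installs only into this environment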
The Python Packaging User Guide, which "aims to be the authoritative resource on how to package, publish and install Python distributions using current tools", recommends using pip to install in "development mode":
pip install -e <path>
Thus in the root directory of your package you can simply
pip install -e .
See installing from a local source tree.
The best way to develop Python apps with dependencies is to:
Download the desired version of the Python interpreter.
Install and use buildout (http://www.buildout.org/).
Buildout is something like Maven for Java (it will fetch all needed packages automatically).
This way your Python interpreter will not be polluted by third-party packages (this is important if you will be running the developed application on other machines). Additionally, you can integrate buildout with the virtualenv package (this allows you to create a virtual Python interpreter for each project).
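A minimal buildout.cfg sketch (the egg name myapp is a placeholder for your project):
[buildout]
parts = app

[app]
recipe = zc.recipe.egg
eggs = myapp
After bootstrapping, running bin/buildout fetches the dependencies declared by the egg and generates wrapper scripts in bin/ that use them.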