Deploy a Python venv as a snap

We have a Python 3 venv with a REST API. In the same venv we have also installed and configured uWSGI, the application server the API runs on. We are looking for a clean way to deploy it.
Until now we have deployed with a simple script that creates the venv, installs the dependencies from pip, and then runs the app.
I am wondering whether snap could be used to package everything nicely and simply install it on a server.
Is that possible, and is it good practice to deploy a web API this way?
Thanks for your time.

Yes, you can do this with snaps. You can use the python plugin with snapcraft, which extends your Python path with a virtualenv.
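A minimal snapcraft.yaml sketch for that setup (the name, source path, and uwsgi command are placeholders, and the exact keywords vary between snapcraft bases):

```yaml
# snapcraft.yaml - sketch only; names and paths are placeholders
name: my-api
version: '0.1'
summary: REST API packaged as a snap
description: Bundles the venv-based API together with uWSGI.
base: core20
confinement: strict

parts:
  my-api:
    plugin: python
    source: .
    requirements: [requirements.txt]   # your existing pip dependencies
    python-packages: [uwsgi]

apps:
  my-api:
    command: bin/uwsgi --ini $SNAP/uwsgi.ini
    daemon: simple
    plugs: [network-bind]
```

snapcraft then builds a .snap you can install on the server with snap install.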
Also, it's pretty easy to do this with a deb.
dh-virtualenv lets you use standard debian packaging tools. https://labs.spotify.com/2013/10/10/packaging-in-your-packaging-dh-virtualenv/
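For the dh-virtualenv route, the heart of it is a one-line debian/rules file (sketch; assumes dh-virtualenv is declared as a build dependency in debian/control):

```make
#!/usr/bin/make -f
%:
	dh $@ --with python-virtualenv
```

dpkg-buildpackage then produces a .deb containing the whole virtualenv.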
It is your choice.

Related

Pyramid not included in Python virtualenv

I'm an experienced developer, but not very familiar with Python and Pyramid.
I'm trying to follow some (a bit old and unclear) instructions on deploying a Pyramid web application. My OS is Ubuntu 16.04.
The instructions say to install virtualenv and Pyramid - I do so with apt install virtualenv and apt install python-pyramid. Then they say I should run the app in a virtual environment, so I build that with virtualenv . -ppython3, and activate it with source bin/activate. I install the application from a ready-to-run buildout from GitHub. The buildout includes a "production.ini" file with parameters to pserve.
But Pyramid is not included in the virtual environment built with virtualenv. (There is no "pserve" in the bin directory, for example.) So I can't run the application with bin/pserve etc/production.ini, as the instructions say. And if I try with only "pserve", I get errors when trying to access files like "var/waitress-%(process_num)s.sock", files that the app expects to find in the virtual environment.
I've looked for flags to tell virtualenv to include Pyramid, but couldn't find any. Am I overlooking something? I'd be most grateful for some help! :-)
/Anders from Sweden
Perhaps you might want to try installing Pyramid into your virtual environment using pip, since apt-installed libraries go into the system Python's packages and are not visible inside a virtualenv created without system site-packages. The guide seems to intend for Pyramid to be installed through the virtual environment so that your program can use it, so pip is a better fit than apt-get here. Once you've activated the virtual environment, just run pip install pyramid; that way the package is available only within that environment, as you'd want.
You mentioned it's using buildout - I assume this is zc.buildout. buildout usually manages its own environment and handles installing all of the necessary dependencies. It really depends on how that buildout is configured, as there's no standard for what to do or how to run your app. I would normally expect pserve to be exposed in the bin folder, but maybe another app-specific script is exposed instead.
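For reference, a minimal zc.buildout config that would expose pserve in bin/ looks roughly like this (a sketch; the buildout in the question may be organized quite differently):

```ini
[buildout]
parts = app

[app]
recipe = zc.recipe.egg
eggs =
    pyramid
    waitress
```

Running bin/buildout against such a config generates the console scripts of the listed eggs, pserve among them, inside bin/.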

Run python script from another computer without installing packages/setting up environment?

I have a Jupyter notebook script that will be used to teach others how to use python.
Instead of asking each participant to install the required packages, I would like to provide a folder with the environment ready from the start.
How can I do this?
What is the easiest way to teach python without running into technical problems with packages/environments etc.?
If you just need to install Python dependencies, you can use @Aero Blue's solution. However, the users would probably need to make a virtual environment, so they don't mess with other environments, versions, etc.
However, if they also need some Linux packages, that would not be enough. In that case I would suggest using Docker. You would provide them with a Dockerfile that installs any dependencies (Python and Linux alike), and they would just need to use the docker build and docker run commands.
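A minimal Dockerfile sketch along those lines (the base image tag and package names are illustrative, not prescriptive):

```dockerfile
FROM python:3.10-slim

# Example of a Linux-level dependency that pip cannot provide
RUN apt-get update && apt-get install -y --no-install-recommends graphviz \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /course
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt   # e.g. jupyter, numpy, pandas

COPY . .
EXPOSE 8888
CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--no-browser", "--allow-root"]
```

Participants would then run docker build -t course . once, and docker run -p 8888:8888 course for each session.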
The easiest way I have found to package Python files is to use pyinstaller, which packages your Python file into an executable file.
If it's a single file I usually run pyinstaller main.py --onefile
Another option is to ship a requirements file.
That reduces installing all the packages to one command: pip install -r requirements.txt
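Sketched as two commands, one run on your machine and one on each participant's:

```shell
pip freeze > requirements.txt    # you: record the exact versions in use
pip install -r requirements.txt  # participants: install that same set
```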
You would need to use a program such as py2exe, pyinstaller, or cx_freeze to package the file, the modules, and a lightweight interpreter. The result is an executable which does not require the user to have any modules, or even Python itself, installed; however, because of the bundled interpreter it can get quite large (which is why Python is not commonly used to make executables).
Have you considered using Azure Notebooks or another Jupyter hosting service? Most of these have a special syntax you can use to perform pip installs; for Azure it is !pip install.
https://notebooks.azure.com

Can I use flask without virtual env?

I am having issues with Flask, and now I am wondering if there's a way to use Flask without a virtual environment in Python. Why would we need a virtual environment with Flask?
$ sudo pip install virtualenv
$ sudo apt-get install python-virtualenv   # or install it from apt instead
$ virtualenv venv
$ . venv/bin/activate      # Linux/macOS
$ venv\Scripts\activate    # Windows
I was searching on Google and couldn't find a good answer for that! If there's a way to use Flask without a virtual environment, could you please show me how?
Yes, you can. You can use any python library without virtualenv. What virtualenv does is create a sandbox environment for you so you can install whatever python libraries you want without affecting anything else on your computer. When you delete that virtual environment, all those libraries go away like it never happened.
That way you can have one project that uses version 1 of Flask and another project that uses version 2, and they won't step on each other in any way. It lets you segregate Python projects so you don't have to worry about them interfering with each other.
It's generally recommended that you use it. In addition to the benefits already mentioned, it helps eliminate environment differences between development and other environments like production. Otherwise you can get into a situation where things work fine on your machine but break when you deploy, usually because you were unknowingly using the wrong version of a library. Virtualenv prevents that by making sure your app only has access to the versions of the libraries you want it to, and by letting you recreate exactly the same environment in production, so it's very likely to behave the way it did on your dev box.
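As a side note, on Python 3 the stdlib venv module gives you the same sandbox without installing anything extra; a minimal sketch:

```shell
python3 -m venv venv      # create the sandbox
. venv/bin/activate       # activate it (Linux/macOS)
pip install flask         # lands inside venv/ only
deactivate                # leave the sandbox
```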

How to manage libraries in deployment

I run Vagrant on Mac OS X. I am coding inside a virtual machine with CentOS 6, and I have the same versions of Python and Ruby in my development and production environment. I have these restrictions:
I cannot manually install. Everything must come through RPM.
I cannot use pip install and gem install to install the libraries I want as the system is managed through Puppet, and everything I add will be removed.
yum has old packages. I usually cannot find the latest versions of the libraries.
I would like to put my libraries locally in a lib directory near my scripts, and create an RPM that includes those frozen versions of dependencies. I cannot find an easy way to bundle my libraries for my scripts and push everything into my production server. I would like to know the easiest way to gather my dependencies in Python and Ruby.
I tried:
virtualenv (with --relocatable option)
PYTHONPATH
sys.path.append("lib path")
I don't know which is the right way to go. Also, for Ruby, is there any way to solve my problem with Bundler? I see that Bundler is mostly used with Rails. Does it work for small custom scripts?
I like the approach of Node.js and npm: all packages are stored locally in node_modules. I have the nodejs RPM installed, and I deploy a folder with my application to the production server. I would like to do the same in Ruby and Python.
I don't know Node, but what you describe for NPM seems to be exactly what a virtualenv is. Once the virtualenv is activated, pip installs only within that virtualenv - so puppet won't interfere. You can write out your current list of packages to a requirements.txt file with pip freeze, and recreate the whole thing again with pip install -r requirements.txt. Ideally you would then deploy with puppet, and the deploy step would involve creating or updating the virtualenv, activating it, then running that pip command.
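Concretely, the cycle just described might look like this (paths and names are placeholders):

```shell
# dev box: capture the exact dependency set
python3 -m venv appenv
. appenv/bin/activate
pip freeze > requirements.txt
deactivate

# production box, e.g. from the puppet-driven deploy step:
python3 -m venv appenv
. appenv/bin/activate
pip install -r requirements.txt
```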
Maybe take a look at Docker?
With Docker you could create an image of your specific environment and deploy that.
https://www.docker.com/whatisdocker/

Where and How do I install django framework on a server?

I'm new to Django and still learning.
Currently I have server set up with the follow folders:
'www'
'db'
'private'
Where and How do I install django framework on a server?
It can be accessed with ftp and ssh.
Currently, I just know that PHP works and loads within the www folder.
Do I have to install python?
Thanks everybody in advance.
It is good practice to use a virtual environment such as virtualenv. After you install and activate virtualenv, you can install Django via pip install django, which installs it into your virtualenv. An easy way to organize the package requirements is to put the packages you will use in a requirements.txt file, which looks something like this:
Django==1.4.1
Mako==0.7.0
MarkupSafe==0.15
You can then install all the required packages with pip install -r /path/to/requirements.txt
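Put together as a shell session (mysite is a placeholder project name, and current Django versions are far newer than the 1.4 pinned above):

```shell
$ virtualenv venv                    # or: python3 -m venv venv
$ . venv/bin/activate
$ pip install -r requirements.txt    # pulls in Django and friends
$ django-admin startproject mysite
$ cd mysite
$ python manage.py runserver         # dev smoke test; use a real app server in production
```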
Setting up Django for deployment:
Cleanest & Fastest server setup for Django
official Django docs 1 and 2
Basic Django deployment with virtualenv, fabric, pip and rsync
