We have a Python application that runs on a single server (the master) and is copied to multiple other servers (the nodes) via rsync, in an all-Ubuntu environment. Currently, whenever a code change pulls in a new library, we have to log in to each node via SSH and run python3.7 -m pip install for it. What is the pythonic way to do this, so that a Python application, including any new libraries, can be cloned efficiently?
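For context, the usual convention is a pinned requirements file that travels with the code, so each node can install everything in one step after the rsync. A minimal sketch, assuming the listed libraries are placeholders:

    # requirements.txt, pinned so every node gets identical versions
    requests==2.31.0
    pyserial==3.5

    # on each node, after the rsync:
    $ python3.7 -m pip install -r requirements.txt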
I have a Python application that I've turned into an executable using PyInstaller. The entire Python interpreter is packaged into the executable along with all its pip dependencies.
So now my application can run in environments where Python or Python modules may not be installed, but there are still some external dependencies:
1) MongoDB - the database my application uses; it needs to be installed on the target system for the application to work, of course.
2) Mosquitto - this service is required because the application uses MQTT to send and receive commands.
My current method of handling this is a shell script that installs MongoDB and Mosquitto the first time my application is deployed somewhere. I just discovered Docker, and I was wondering whether it is capable of packaging these 'external' dependencies into a Docker image.
Is it possible for me to have one standalone "thing" that will run in any environment regardless of whether MongoDB or Mosquitto are installed there?
And how exactly would I go about doing this?
(Unrelated, but this application is meant to run on a Raspberry Pi.)
If you adopted Docker here:
You'd still have to run the external services separately; they can't be packaged into a single artifact per se. There is, however, a standard tool called Docker Compose that provides this capability: you'd generally distribute a docker-compose.yml file that describes how to run the set of related containers.
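A minimal docker-compose.yml sketch for this setup; the application image name is a placeholder, while mongo and eclipse-mosquitto are the official images on Docker Hub:

    version: "3"
    services:
      app:
        image: myuser/myapp:latest   # your application's image (placeholder name)
        depends_on:
          - mongo
          - mosquitto
      mongo:
        image: mongo:4.4             # official MongoDB image
        volumes:
          - mongo-data:/data/db      # keep the database outside the container lifecycle
      mosquitto:
        image: eclipse-mosquitto:2   # official Mosquitto MQTT broker image
    volumes:
      mongo-data:

Running docker-compose up -d then starts all three containers together.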
It's unusual to distribute a Docker image as a file; instead you push your built image to a registry (Docker Hub is the best-known, the major public-cloud providers offer hosted registries, there are a couple of independent services, and you can also run your own). Docker then retrieves the image over HTTP.
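The workflow looks roughly like this; the repository and tag names are placeholders:

    $ docker build -t myuser/myapp:1.0 .   # build the image from your Dockerfile
    $ docker push myuser/myapp:1.0         # upload it to the registry (Docker Hub here)

    # and on the target machine:
    $ docker pull myuser/myapp:1.0         # fetch the image from the registry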
Docker containers can only be launched by root-equivalent users: anyone who can run any Docker command has unrestricted root-level access on the host. Since you're already installing databases as part of your bring-up process this probably isn't a concern for you, but note that a plain-Python or PyInstaller-built application could run as an ordinary user.
I'm working on creating my first "real" web app using Django.
Yesterday I learned that I should be using a web server like Nginx to serve static files and pass requests for dynamic content on to my web app. I also learned that I need something like Gunicorn as the intermediary between the web server (Nginx) and my Django app.
My question is about virtualenv. It makes sense that we would contain app-related software in its own separate environment. What should I install in the virtualenv, and what gets installed system-wide? For example, in this guide we seem to install Python, Nginx and the database system-wide (because they're installed before virtualenv is installed), while Django and Gunicorn are installed in the virtualenv. It makes sense that Gunicorn would have to go in the virtualenv, since it's importing our Python app, as explained here. Are the other things required to be installed system-wide? Or can I pick either way? Is one way preferred over another?
Thanks!
Virtualenv is for managing Python libraries. It is not for managing Python itself, or external services such as databases; it does, however, manage the Python libraries you use to access the database.
There's no room for confusion here, because there's simply no way to install Python itself or a database within a virtualenv.
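Concretely, the split in the guide you linked would look something like this sketch (package names are illustrative, assuming a Postgres-backed site):

    $ sudo apt-get install nginx postgresql          # services: system-wide
    $ virtualenv ~/envs/myproject                    # the per-project environment
    $ ~/envs/myproject/bin/pip install django gunicorn psycopg2-binary
    $ ~/envs/myproject/bin/gunicorn myproject.wsgi   # Gunicorn runs from inside the venv, so it can import your app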
At the moment I have a Python web application running on uWSGI, with a frontend created in EmberJS. There is also a small Python script running that controls I/O and serial ports connected to the BeagleBone Black.
The system runs on Debian; packages are managed and installed via Ansible, and the applications are also updated via some Ansible scripts. In other words, updates are currently done manually by launching the Ansible scripts over SSH.
I'm now looking for a strategy/method to update my Python applications in an easy way, one that can also be used by our clients (e.g. via a web interface). A good example is a router firmware update; I'm wondering how I can apply a similar strategy to my Python applications.
I looked at Yocto, which I can use to build my own Linux image, but I don't see how to include my applications in those builds, and I don't want to rebuild a complete image for hotfixes.
Does anyone with a similar project have some useful information to share on upgrade strategies/methods?
A natural strategy would be to use the package manager that already manages the rest of the system. The various package managers of Linux distributions are not closed systems: you can create your own package repository containing just your application/scripts and add it as a package source on your target, as sketched below. Your "updater" would then work on top of that.
This is also a route you can take when using Yocto.
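For example, on a Debian target you might publish your application as a .deb in your own repository and point apt at it; the repository URL and package name here are hypothetical:

    # one-time setup: register your repository as a package source
    $ echo "deb https://packages.example.com/debian stable main" | sudo tee /etc/apt/sources.list.d/myapp.list
    $ sudo apt-get update

    # each update - triggered from a web interface, a cron job, etc. - is then just:
    $ sudo apt-get install --only-upgrade myapp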
I am attempting to make a dynamic website for a school project. The problem is that it has to be on the school server and I can't use any web frameworks. I have searched through Google and Stack Overflow but I can't seem to get an answer.
I have tried the code that was provided here:
How to implement a minimal server for AJAX in Python?
It worked on the local server, but how can I change it so that it runs on the school server? When I used that code, the page wouldn't load or an internal error was shown. Can someone point me in the right direction?
Using a web framework in Python does not necessarily require a system-wide package installation (like running sudo apt-get install python-something).
In the end, Python frameworks are just files like those in your project: you can install them system-wide (as in the apt-get example) or ship them within your project (probably what you want). Take a look at virtualenv for creating a self-contained environment, and at setuptools for packaging the application and its dependencies.
For implementing an AJAX server directly in Python without a WSGI container (Apache, Nginx, etc.), I recommend Flask. It is very simple and very powerful.
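A minimal sketch of a Flask AJAX endpoint; the route and port are arbitrary:

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/api/echo", methods=["POST"])
    def echo():
        # Parse the JSON body sent by the AJAX call and echo it back.
        data = request.get_json(force=True)
        return jsonify({"received": data})

    if __name__ == "__main__":
        # 0.0.0.0 makes the development server reachable from other machines.
        app.run(host="0.0.0.0", port=8000)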
I have a server that I'd like to use to maintain persistent connections with a set of devices, just so that they can pass simple messages back and forth. It's a trivial task, but selecting a server-side platform has been surprisingly difficult (especially since I have no administrative privileges - it's a dedicated commercial server).
My best idea so far is to write a TCP server in Python. The Twisted platform seems suitable for the task, and has a lot of good reviews. However, my server has Python 2.7 but not Twisted, and the admins have been reluctant to install it for me.
Is there any way that I can just upload a Twisted package to the server and reference it in my libraries without installing it as a framework?
I'm not sure what you mean by "installing it as a framework". If you are using an OS X server hosting environment, then maybe you're talking about Framework with a capital F. However, OS X server hosting isn't a very common environment, so I'm guessing that's not it.
If you just want to know how to install a Python library in your home directory, then the general answer is:
$ python setup.py install --user
This Just Works™ on Python 2.7, assuming the package uses distutils (which Twisted does) and that you unpack the source .tar.gz and change your working directory to the root of its contents, so you should be done after that.
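So the full sequence would look something like this; the version number is a placeholder:

    $ tar xzf Twisted-<version>.tar.gz
    $ cd Twisted-<version>
    $ python setup.py install --user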
Use virtualenv to create your own private installation of Python libraries.
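A sketch, assuming pip is available on the host so virtualenv can be installed without root; paths are illustrative:

    $ python -m pip install --user virtualenv   # if virtualenv isn't already available
    $ python -m virtualenv ~/twisted-env        # create the private environment
    $ ~/twisted-env/bin/pip install Twisted     # installs into your home dir, no root needed
    $ ~/twisted-env/bin/python my_server.py     # run your code with that interpreter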