Scheduling a Python script on Azure

So I was looking into scheduling a Python script on a daily basis and, rather than using Task Scheduler on my own machine, I was wondering if it is possible to do so using an Azure cloud account.

For your needs, I suggest you use WebJobs in the Azure Web Apps service.
There are two types of Azure WebJobs to choose from: continuous and triggered.
For your needs, a triggered WebJob should be adopted, since it can run on a schedule (for example, once a day).
You could refer to the document here for more details. In addition, here you can see how to run tasks in WebJobs.
You could follow the steps below to create your WebJob.
Step 1: Use virtualenv to create an isolated Python runtime environment on your machine. Install it first with pip install virtualenv if you don't already have it.
Once it is installed, you will find it in your python/Scripts folder.
Step 2: Run the command to create the isolated Python runtime environment.
Step 3: Go into the created directory's Scripts folder and activate the environment (this step is important, don't miss it).
Keep this command window open and use pip install <your library name> in it to download the external libraries you need.
Step 4: Compress your webjob.py (your own business code) together with the library packages from the environment's Lib/site-packages folder that it relies on into a single zip file.
Step 5: Create the WebJob in the Web App service and upload the zip file; then you can run your WebJob and check the log.
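If you go this route, here is a minimal sketch of how webjob.py could pick up the bundled libraries. It assumes the zip contains a site-packages folder next to the script and that requests is one of the bundled libraries; adjust both to match how you actually packaged things.

import os
import sys

# Assumed layout: webjob.py sits next to a "site-packages" folder copied from
# the virtualenv's Lib/site-packages. Prepend it so the bundled libraries are found.
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "site-packages"))

import requests  # example of a bundled third-party library

def main():
    # your own business logic goes here
    response = requests.get("https://example.com")
    print("WebJob ran, status:", response.status_code)

if __name__ == "__main__":
    main()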
You could also refer to these SO threads:
1. Options for running Python scripts in Azure
2. Python libraries on Web Job
BTW, you need to create an Azure Web App first, because a WebJob runs inside an Azure Web App.
Hope it helps you.

Related

How do I deploy a Python Dash app locally without the desired users and the server having all associated dependencies installed?

Forgive me, I'm new to all this. It might not even be possible?
I have a Dash app that does a number of calculations, and I need to deploy it locally somehow.
I need all users in our company to be able to view it, but without them having to install the package dependencies themselves. I cannot use any web-based method (Heroku, Git, etc.) as the data is commercially sensitive and must remain on-site only.
I can successfully run it through waitress-serve on my machine and it can be viewed on other computers, but I'd rather it run from the server and be accessible by anyone that wants to use it.
What's the solution? Is it possible to have a folder on the server that holds all the associated files and dependencies, and then a batch file (or similar, which is what I use now to launch mine) to launch the app on a WSGI server? Or would our network still have to have the Python dependencies installed?
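For reference, the waitress-serve setup mentioned above boils down to something like the sketch below (the module name app, the app object name, and port 8050 are assumptions; adjust them to your project):

# serve_app.py - run this on the server instead of the Dash dev server
from waitress import serve

from app import app  # the Dash app object defined in app.py (assumed names)

# Dash wraps a Flask server; waitress serves that underlying WSGI app.
# host="0.0.0.0" makes it reachable from other machines on the network.
if __name__ == "__main__":
    serve(app.server, host="0.0.0.0", port=8050)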

Python: Question about packaging applications, Docker vs PyInstaller

I have a Python application that I've created an executable of using PyInstaller. The entire Python interpreter is packaged into the executable with all its pip dependencies.
So now my application can run in environments where Python or Python modules may not be installed, but there are still some dependencies:
1) MongoDB - This is the database my application uses, and it needs to be installed on a system for it to work of course.
2) Mosquitto - This service is required because the application uses MQTT to receive/send commands.
My current method of handling this is to use a shell script which installs MongoDB and Mosquitto the first time my application is deployed somewhere. I just discovered Docker, and I was wondering if it is capable of packaging these 'external' dependencies into a Docker image?
Is it possible for me to have one standalone "thing" which will run in any environment regardless of whether MongoDB or Mosquitto are installed there?
And how exactly would I go about doing this?
(Unrelated, but this application is meant to run on a Raspberry Pi.)
If you adopted Docker here:
You'd still have to "separately" run the external services; they couldn't be packaged into a single artifact per se. There's a standard tool called Docker Compose that provides this capability, though, and you'd generally distribute a docker-compose.yml file that describes how to run the set of related containers.
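To illustrate how the application code would then reach those services, here is a sketch that assumes a hypothetical docker-compose.yml defining services named mongo and mosquitto (pymongo and the paho-mqtt 1.x client shown as examples):

import paho.mqtt.client as mqtt
from pymongo import MongoClient

# Inside the compose network each service is reachable by its service name,
# so the app no longer cares whether MongoDB/Mosquitto are installed on the host.
db = MongoClient("mongodb://mongo:27017/")["mydb"]  # "mongo" = assumed service name

mqtt_client = mqtt.Client()              # paho-mqtt 1.x style constructor
mqtt_client.connect("mosquitto", 1883)   # "mosquitto" = assumed service name
mqtt_client.loop_start()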
It's unusual to distribute a Docker image as files; instead you'd push your built image to a registry (like Docker Hub, but the major public-cloud providers offer this as a hosted service, there are a couple of independent services, or you can run your own). Docker can then retrieve the image via HTTP.
Docker containers can only be run by root-equivalent users. Since you're talking about installing databases as part of your bringup process this probably isn't a concern for you, but you could run a plain-Python or pyinstallered application as an ordinary user. Anyone who can run any Docker command has unrestricted root-level access on the host.

Simple Google Cloud deployment: Copy Python files from Google Cloud repository to app engine

I'm implementing continuous integration and continuous delivery for a large enterprise data warehouse project.
All the code resides in a Google Cloud repository, and I'm able to set up a Google Cloud Build trigger so that every time code of a specific file type (Python scripts) is pushed to the master branch, a Google Cloud build starts.
The Python scripts don't make up an app. They contain an ODBC connection string and code to extract data from a source and store it as a CSV file. The Python scripts are to be executed on a Google Compute Engine VM instance with Airflow installed.
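For context, the scripts are roughly of this shape (the connection string, query and output path here are just illustrative placeholders, not the real ones):

import csv

import pyodbc

# Placeholder DSN and query; each real script carries its own connection string.
conn = pyodbc.connect("DSN=my_source;UID=user;PWD=secret")
cursor = conn.cursor()
cursor.execute("SELECT * FROM some_table")

with open("some_table.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cursor.description])  # header row
    writer.writerows(cursor.fetchall())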
So the deployment of the Python scripts is as simple as can be: the .py files only have to be copied from the Google Cloud repository folder to a specific folder on the Google VM instance. There is not really a traditional build to run, as all the Python files are separate from each other and not part of an application.
I thought this would be really easy, but I have now spent several days trying to figure this out with no luck.
Google Cloud Platform provides several Cloud Builders, but as far as I can see none of them can do this simple task. Using gcloud also does not work: it can copy files, but only from a local PC to the VM, not from the source repository to the VM.
What I'm looking for is a YAML or JSON build config file to copy those Python files from source repository to Google Compute Engine VM Instance.
Hoping for some help here.
The files/folders in the Google Cloud repository aren't directly accessible (it's like a bare Git repository); you need to first clone the repo and then copy the desired files/folders from the cloned repo to their destinations.
It might be possible to use a standard Fetching dependencies build step to clone the repo, but I'm not 100% certain of it in your case, since you're not actually doing a build:
steps:
- name: gcr.io/cloud-builders/git
  args: ['clone', 'https://github.com/GoogleCloudPlatform/cloud-builders']
If not, you may need one (or more) custom build steps. From Creating Custom Build Steps:
A custom build step is a container image that the Cloud Build worker VM pulls and runs with your source volume-mounted to /workspace. Your custom build step can execute any script or binary inside the container; as such, it can do anything a container can do.
Custom build steps are useful for:
Downloading source code or packages from external locations
...

What are the ways to deploy Python code on AWS EC2?

I have a Python project and I want to deploy it on an AWS EC2 instance. My project has dependencies on other Python libraries and uses programs installed on my machine. What are the alternatives for deploying my project on an AWS EC2 instance?
Further details: my project consists of a Celery periodic task that uses ffmpeg and Blender to create short videos.
I have checked Elastic Beanstalk, but it seems it is tailored for web apps. I don't know if containerizing my project via Docker is a good idea...
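For context, the periodic task looks roughly like the sketch below (the broker URL, schedule and ffmpeg arguments are illustrative placeholders, not the real ones):

import subprocess

from celery import Celery
from celery.schedules import crontab

app = Celery("videos", broker="redis://localhost:6379/0")  # placeholder broker

app.conf.beat_schedule = {
    "render-short-video": {
        "task": "tasks.render_video",
        "schedule": crontab(hour=2, minute=0),  # every day at 02:00
    },
}

@app.task(name="tasks.render_video")
def render_video():
    # calls out to programs installed on the machine, e.g. ffmpeg / blender
    subprocess.run(["ffmpeg", "-i", "input.mp4", "-t", "15", "output.mp4"], check=True)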
The manual way, and the cheapest way, to do it would be:
1- Launch a spot instance
2- git clone the project
3- Install the libraries via pip
4- Install all dependent programs
5- Launch the periodic task
I am looking for a more automatic way to do it.
Thanks.
Beanstalk is certainly an option. You don't necessarily have to use it for web apps and you can configure all of the dependencies needed via .ebextensions.
Containerization is usually my go-to strategy now. If you get it working within Docker locally then you have several deployment options, and the whole thing gets much easier since you don't have to worry about setting up all the dependencies within the AWS instance.
Once you have it running in Docker you could use Beanstalk, ECS or CodeDeploy.

Run a custom script when a deployed Azure VM boots

I'm using the Azure Python SDK to deploy a Linux VM in the cloud. This VM has a public IP and SSH enabled. I need to have this VM run a custom script immediately after it boots. This script would install pip, Python, Docker, etc., and start a Docker container.
How could I pass this script when deploying the VM? / How could I instruct the VM to run this script after it has started?
Cheers,
Steve
According to your scenario, you could use the Azure Custom Script Extension.
The Custom Script Extension downloads and executes scripts on Azure virtual machines. This extension is useful for post-deployment configuration, software installation, or any other configuration/management task. Scripts can be downloaded from Azure storage or another accessible internet location, or provided to the extension runtime.
If you want to use Python to do this, please refer to this Python SDK documentation.
Please also refer to this similar question.
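As a rough sketch of what attaching the extension looks like with the Python SDK (azure-mgmt-compute): the resource group, VM name, location and script URL below are placeholders, and the exact field names can vary between SDK/API versions, so treat this as an outline rather than copy-paste code.

from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<subscription-id>"  # placeholder
compute_client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# Attach the Linux Custom Script Extension to an existing VM; it downloads
# bootstrap.sh and runs it once the VM is up (URL and names are placeholders).
poller = compute_client.virtual_machine_extensions.begin_create_or_update(
    "my-resource-group",
    "my-linux-vm",
    "bootstrap",
    {
        "location": "westeurope",
        "publisher": "Microsoft.Azure.Extensions",
        "type_properties_type": "CustomScript",
        "type_handler_version": "2.1",
        "settings": {
            "fileUris": ["https://example.com/bootstrap.sh"],
            "commandToExecute": "bash bootstrap.sh",
        },
    },
)
poller.result()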
