How can we deploy Python code from the master to a minion using SaltStack? I am working on localhost and have already done the installation.
I am not trying to understand it completely; I just want to know how the deployment could work.
You can use the cmd state module to execute a script from the master on a minion:
run_python:
  cmd.script:
    - name: script_optional_name
    - source: salt://test/script.py
The script will be executed on the minion.
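For reference, here is a minimal sketch of applying that state from the master, assuming the state above is saved as /srv/salt/test/deploy.sls (the file name and minion ID are placeholders):

salt '<minion-id>' state.apply test.deploy

The cmd.script execution module can also push and run the script as a one-off, without a state file:

salt '<minion-id>' cmd.script salt://test/script.py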
I'm trying to debug any Python script that an interpreter runs, as long as I have a reference to that script. I.e., if my connected interpreter runs a script called abc.py, and in my script directory I have abc.py with breakpoints attached, the IDE will automatically stop execution at those breakpoints.
I'm using PyCharm, but I'd like to understand the theory here, so that if I ever want to connect VS Code I'd be able to do that as well. Additionally, I'm currently connecting to a Docker container running Airflow.
Given the above, I'm assuming that the goal is to do a "remote" debug.
Also, since a Python script is run by an interpreter, I am assuming that if I can hook into the interpreter, and PyCharm can match the file being run by the interpreter, then it should be able to pause execution.
I am additionally assuming that the interpreter can run in "normal" mode, not in a debug mode as we have in Java.
I have read about three approaches:
ssh interpreter to my Docker container - seems most promising for my current goal, but unsure if it'll work
using the Python debug server (Debugging Airflow Tasks with IDE tools?) - still requires manual changes in the specific scripts (see the sketch after the questions below)
using a Docker interpreter (https://medium.com/@andrewhharmon/apache-airflow-using-pycharm-and-docker-for-remote-debugging-b2d1edf83d9d) - still requires an individual debug config for executing a single DAG / script
Is debugging any file executed by a Python interpreter possible, at least in theory?
Is it possible remotely?
Is it possible using Airflow at all?
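For reference, the debug-server approach (option 2 above) boils down to a small stanza like the following at the top of each script; this is a hedged sketch, where the host name and port are assumptions that depend on your Docker networking:

import pydevd_pycharm  # from the pydevd-pycharm package matching your PyCharm version

# Connect back to the PyCharm debug server listening on the IDE host.
# 'host.docker.internal' resolves to the host on Docker Desktop; on Linux
# you may need the bridge IP or an extra_hosts entry instead.
pydevd_pycharm.settrace(
    'host.docker.internal',
    port=5678,
    stdoutToServer=True,
    stderrToServer=True,
)

Once settrace connects, breakpoints set in the IDE for matching files take effect, which is why this approach still requires touching each script you want to stop in.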
There is a Python script in my repo that I would like to run whenever I call an API. This Python script merely transfers data from one database to another. The Jenkins server for the project is currently used for builds/pipelines/running tests, and I was wondering if I could use this Jenkins service to run the script when I call an API, since I found that Jenkins allows you to trigger builds remotely via REST.
The Python script is built using a Python image in the Dockerfile, so Docker sets up the dependencies and the Python needed to run the script; the commands Jenkins would run are something like docker build and docker run.
Yes, you can.
Just set up a pipeline that:
Runs in Docker (with your image). Have a look at this
Does a git clone of your repository
Runs your Python script with something like: sh "python <your script>"
A minimal sketch of such a pipeline follows.
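This is a hedged sketch of those three steps as a declarative pipeline; the image name, repository URL, and script name are placeholders for your own values:

// Jenkinsfile (declarative pipeline)
pipeline {
    agent {
        docker { image 'python:3.11' }   // assumption: a stock Python image is enough
    }
    stages {
        stage('Transfer data') {
            steps {
                git url: 'https://your.git.server/your-repo.git'  // placeholder repo
                sh 'python your_script.py'                        // placeholder script
            }
        }
    }
}

With "Trigger builds remotely" enabled on the job, a POST to JENKINS_URL/job/<job-name>/build?token=<token> then kicks it off via REST.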
I want to run my Python application (through Bamboo) using a Script task which runs the application.
I tried (as a task):
python myprogram.py
But it blocks the deployment process, and its status remains InProgress.
How can I run the Python application from within Bamboo (deployment phase) successfully?
Thanks
I think there is no problem with the way you are trying to run your Python script from within your deployment. Can you confirm that you are not using an inline script with the Script task to run your Python script? Also, have you run your Python script locally, just to be sure there is no issue within your code?
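One common cause of a Script task staying InProgress is that the application runs in the foreground and never exits. If that is the case here, a hedged sketch is to detach it so the task can finish (file names are placeholders):

#!/bin/bash
# Launch the app in the background, detached from the Bamboo task,
# so the Script task returns instead of staying InProgress.
nohup python myprogram.py > myprogram.log 2>&1 &
echo $! > myprogram.pid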
We have a project on nginx/Django, using VirtualBox.
When we try to run the command VBoxManage list runningvms from nginx, we get the following error:
Failed to initialize COM because the global settings directory '/.config/VirtualBox' is not accessible!
If we run this command in a console, it works fine.
What can we do to make it work properly under nginx?
Other details:
nginx is run by the user "www-data"; the console, by another user (Administrator).
We have fixed the issue.
The HOME environment variable (os.environ['HOME']) was wrong. We changed it, and the problem was gone.
Using the Python API for VirtualBox instead of ssh can really help with that problem, as RegularlyScheduledProgramming suggested; we added the Python API too.
Thanks!
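For anyone hitting the same error, a minimal sketch of the fix in Python, assuming /home/vboxuser is a placeholder home directory that actually contains a .config/VirtualBox settings directory:

import os
import subprocess

# Point HOME at a user whose VirtualBox settings directory exists,
# since nginx/www-data has no usable HOME of its own.
env = dict(os.environ, HOME='/home/vboxuser')  # placeholder path

result = subprocess.run(
    ['VBoxManage', 'list', 'runningvms'],
    env=env,
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)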
Is it possible to run a Python script through the AWS CLI (user-data)? I tried, but it didn't run, and I have the following in my logs:
boot.log:2015-08-07 10:08:30,660 - __init__.py[WARNING]: Unhandled non-multipart (text/x-not-multipart) userdata: './step-1
cloud-init.log:2015-08-07 10:08:30,660 - __init__.py[WARNING]: Unhandled non-multipart (text/x-not-multipart) userdata: './step-1'
cloud-init-output.log:2015-08-07 10:08:30,660 - __init__.py[WARNING]: Unhandled non-multipart (text/x-not-multipart) userdata: './step-1'
Note: step-1 is the script I am trying to pass as user-data. Also, my script is present in the same directory from which I am running the command, so it should be picked up.
The default interpreter seems to be Python. So if you simply want to execute a shell script you'll need to start with a hash-bang, for example:
#!/bin/bash
/yourpath/step-1
Please note, in order to debug this, try: cat /var/log/cloud-init-output.log
And see: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/user-data.html#user-data-shell-scripts
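For completeness, a hedged sketch of passing the script by contents rather than by path: the file:// prefix makes the AWS CLI read the file and send its contents as user-data, instead of sending the literal string './step-1' (AMI id, instance type, and key name are placeholders):

aws ec2 run-instances \
    --image-id ami-0123456789abcdef0 \
    --instance-type t3.micro \
    --key-name my-key \
    --user-data file://step-1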
You can run any command under user-data. I have used user-data to bootstrap Windows instances with a Domain Controller setup or a domain join using PowerShell; on EC2 the mechanism works whether you are running Unix-based or Windows-based instances.
Since you specified Python, please ensure the following:
Python is already installed; then take an image and use that image to bootstrap
You enable user-data and pass the user-data commands at launch time
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/user-data.html
The AWS documentation says that only shell scripts and cloud-init directives are supported.
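Note that the "shell script" form is just a file that starts with a shebang, so pointing the shebang at Python should also work, assuming the interpreter exists at that path on the AMI. A minimal sketch:

#!/usr/bin/python
# cloud-init saves this user-data to disk and executes it; the kernel
# honors the shebang, so the body runs under Python.
with open('/tmp/user-data-ran.txt', 'w') as f:
    f.write('hello from user-data\n')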