I want to run my Python application from Bamboo, using a Script task that runs the application.
I tried (as a task) with
python myprogram.py
But it blocks the deployment, and its status remains InProgress.
How can I run the Python application successfully from within Bamboo's deployment phase?
Thanks
I don't think there is a problem with the way you are trying to run your Python script from within your deployment. Can you confirm that you are not using an inline script in the Script task to run it? Also, have you run your Python script locally, just to be sure there is no issue within your code?
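One guess, and it is only an assumption since the task itself isn't posted: if myprogram.py is a long-running process such as a server, the Script task will wait for it to exit, which would keep the deployment InProgress. A minimal sketch of detaching it, with placeholder paths:

cd /path/to/deployed/app
# Detach so the Script task can finish while the app keeps running
nohup python myprogram.py > app.log 2>&1 &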
I'm trying to debug any Python script that an interpreter runs, so long as I have a copy of that script. I.e., if my connected interpreter runs a script called abc.py, and in my script directory I have abc.py with breakpoints attached, the IDE will automatically stop execution at those breakpoints.
I'm using PyCharm, but I'd like to know the theory here, so that if I ever wanted to connect VS Code I'd be able to do that as well. Additionally, I'm currently connecting to a Docker container running Airflow.
Given the above, I'm assuming that the goal is to do a "remote" debug.
Also, since Python is a scripting language run by an interpreter, I am assuming that if I can hook into the interpreter, and if PyCharm can match the file being run by the interpreter, then it should be able to pause execution.
I am additionally assuming that the interpreter can run in "normal" mode, not in a special debug mode as we have in Java.
I have read about three approaches:
an SSH interpreter pointing at my Docker container - seems most promising for my current goal, but I'm unsure whether it will work
using the Python debug server (Debugging Airflow Tasks with IDE tools?) - still requires manual changes in the specific scripts (see the sketch after this list)
using a Docker interpreter (https://medium.com/@andrewhharmon/apache-airflow-using-pycharm-and-docker-for-remote-debugging-b2d1edf83d9d) - still requires an individual debug config for executing a single DAG / script
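To make the "manual changes" in option 2 concrete, here is a minimal sketch of the snippet PyCharm's debug server expects at the top of the script under debug. It assumes the pydevd-pycharm package is installed in the container, and the host/port below are placeholders that must match a running Python Debug Server run configuration:

import pydevd_pycharm

# Connect back to the IDE; execution pauses here and breakpoints set in
# PyCharm are honored from this point on.
pydevd_pycharm.settrace(
    "host.docker.internal",  # assumed address of the machine running PyCharm
    port=5678,               # placeholder; must match the run configuration
    stdoutToServer=True,
    stderrToServer=True,
)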
Is debugging any file executed by a python interpreter possible, at least in theory?
Is it possible remotely?
Is it possible using airflow at all?
I want to know if there's a way to have Windows Server 2019 automatically launch Django's development server. I also want the launch to be performed at startup and by SYSTEM.
I tried using batch scripts that launch manage.py from the venv's Python interpreter. When I launch the batch file manually (i.e., double-click), it works fine and dandy. But it appears that SYSTEM fails to run the script correctly when the task is scheduled.
I made SYSTEM launch another script at startup (a simple Python script that creates a txt file from within its own venv), and it works.
If the Django launch script is launched by USER, then it works.
The problem is with launching Django as SYSTEM. I've also tried Streamlit, and the result is the same.
Do you have any ideas?
Sample batch script:
rem /d also switches drives if needed when run as SYSTEM
cd /d path\of\managepyfile\
C:\path_to_venv\Scripts\python manage.py runserver
We run a similar application (not Python), but one that uses a web server.
We have it set up as a task in Task Scheduler so that when the server starts up, it runs a PowerShell script that executes a command to start the web server.
Link to setup
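For reference, a rough sketch of registering such a task from an elevated prompt; the task name and batch-file path are placeholders, not from the posts above:

rem Run the batch file as SYSTEM at machine startup
schtasks /Create /TN "DjangoServer" /SC ONSTART /RU SYSTEM /TR "C:\path\to\start_django.bat"

Note that tasks run as SYSTEM typically start in C:\Windows\System32 with no user profile loaded, so absolute paths inside the batch file help.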
However, you could use a web server like IIS, deploy the files to the wwwroot folder on the C: drive, and run the site under IIS.
Setting it up on IIS can be a little tricky if you've never used IIS before. Happy to help out, as we have deployed the test access tool for one of our apps this way.
There is a Python script in my repo that I would like to run whenever I call an API. The script merely transfers data from one database to another. The Jenkins server for the project is currently used for builds/pipelines/running tests; I was wondering if I could use this Jenkins service to run the script when I call an API, since I found that Jenkins allows you to remotely trigger jobs via REST.
In other words, could I use Jenkins's remote-trigger feature to run this Python script in my repo when I need to? The script is built using a Python image in the Dockerfile, so Docker sets up the dependencies/Python needed to run it; the commands run by Jenkins would be something like docker build and docker run.
Yes, you can.
Just set up a pipeline that:
Runs in Docker (with your image). Have a look at this
Does a git clone of your repository
Runs your Python script with something like: sh "python <your script>"
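Put together, a minimal declarative Jenkinsfile sketch of the above; the image name, repo URL, and script name are placeholders, not taken from your project:

// Check out the repo and run the script inside a Python container
pipeline {
    agent { docker { image 'python:3.11' } }
    stages {
        stage('Run script') {
            steps {
                git 'https://example.com/your/repo.git'  // placeholder URL
                sh 'python your_script.py'               // placeholder name
            }
        }
    }
}

With "Trigger builds remotely" enabled on the job, you can then kick it off via REST with something like curl -X POST "$JENKINS_URL/job/<job-name>/build?token=<token>".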
I'm running Flask on an AWS instance. My goal is to be able to have Flask running on its own, without me having to ssh into it and run
python app.py
Is there a way to have this command run every time the AWS instance itself reboots?
Yes, there is a way to start the Python script on reboot.
On Linux you will find the /etc/init.d directory. You will need to write your own init.d script and put it inside /etc/init.d, and that will indeed start your Python script. Ahh! Wait, it's not going to be magic. Don't worry, there is a fixed format for an init.d script. The script contains some basic functions like start(), stop(), reload(), etc. Just add the code that you want to run on start in the start() block.
Some reference link: https://bash.cyberciti.biz/guide//etc/init.d
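A stripped-down sketch of such a script (paths are placeholders, and a real script should also carry the LSB header comment block):

#!/bin/sh
# /etc/init.d/flaskapp - start/stop the Flask app
case "$1" in
  start)
    # Launch detached and remember the PID so stop can find it
    nohup python /path/to/app.py > /var/log/flaskapp.log 2>&1 &
    echo $! > /var/run/flaskapp.pid
    ;;
  stop)
    kill "$(cat /var/run/flaskapp.pid)"
    ;;
  *)
    echo "Usage: $0 {start|stop}"
    exit 1
    ;;
esac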
Try this:
(crontab -l 2>/dev/null; echo '@reboot python /path/to/app.py') | crontab -
I have coded a Python script for Twitter automation using Tweepy. When I run it on my own Linux machine as python file.py, it runs successfully and keeps on running, because I have specified repeated tasks inside the script, and I don't want to stop the script either. But as it is on my local machine, the script might get stopped when my internet connection is off or at night, so I can't keep the script running all day on my PC.
So is there any way, website, or method where I could deploy my script and have it execute forever? I have heard about cron jobs in cPanel, which can help with repeated tasks, but in my case I want to keep my script running on the machine until I close it myself.
Are there any such solutions? Most Twitter bots I see run forever, meaning their script is being executed somewhere 24x7. This is what I want to know: how is that possible?
As mentioned by Jon and Vincent, it's better to run the code from a cloud service. But either way, I think what you're looking for is what to put into the terminal so the code keeps running even after you close the terminal. This is what worked for me:
nohup python code.py &
You can add a systemd .service file, which can have added benefits such as:
logging (compressed logs in a central place, or sent over the network to a log server)
disallowing access to /tmp and the /home directories
restarting the service if it fails
starting the service at boot
setting capabilities (ref setcap/getcap), or disallowing file access if the process only needs network access, for instance
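A minimal sketch of such a unit, say /etc/systemd/system/twitterbot.service; the description, paths, and names are placeholders, not from the question:

[Unit]
Description=Tweepy automation script
After=network-online.target

[Service]
# Restart the service if it fails
Restart=on-failure
ExecStart=/usr/bin/python3 /opt/bot/file.py
# Hide the real /tmp and the /home directories from the process
PrivateTmp=yes
ProtectHome=yes

[Install]
WantedBy=multi-user.target

After a systemctl daemon-reload, systemctl enable --now twitterbot starts it and gives you the start-at-boot behavior from the list above; journalctl -u twitterbot shows its logs.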