I have an Amazon EC2 Ubuntu instance which I stop and start (not terminate). I was wondering if it is possible to run a script on start and stop of the server. Specifically, I am looking at writing a Python boto script to take my RDS instance offline when the EC2 server is not running.
Can anyone tell me if this is possible please?
It is possible. You just have to write an init script and set up the proper symbolic links in the /etc/rc#.d directories. It will be invoked with a start or stop argument depending on whether the machine is starting up or shutting down.
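As a sketch of the script that such an init script could invoke with the same start/stop argument (assuming the newer boto3 client; the identifier my-rds-instance is a placeholder):

    #!/usr/bin/env python3
    # Sketch: start or stop an RDS instance depending on the argument
    # passed through by the init script ("start" or "stop").
    import sys
    import boto3

    rds = boto3.client("rds")   # assumes AWS credentials are configured on the EC2 host
    action = sys.argv[1] if len(sys.argv) > 1 else "start"

    if action == "start":
        rds.start_db_instance(DBInstanceIdentifier="my-rds-instance")   # placeholder identifier
    elif action == "stop":
        rds.stop_db_instance(DBInstanceIdentifier="my-rds-instance")

The init script itself only needs to forward its start/stop argument to this script.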
I have a Selenium Python (3.9) renderer script that performs a bunch of fetching tasks, and I'm trying to use an AWS EC2 virtual machine as its runtime in the cloud. I run it by SSHing into the VM, adding the script and its dependencies, and then launching it with python3 <script-name>.py.
But I need help keeping the process alive until my script completes (or indefinitely, until I manually delete the instance). Currently, the script seems tied to my local CLI session, and when I leave it alone for a while or shut my laptop lid, the run on the AWS VM quits with the error:
client_loop: send disconnect: Broken pipe
How can I keep the process running indefinitely, or until the end of the script, and untie it from any local runtimes? Apologies for any idiosyncrasies; I'm new to DevOps and to deploying things outside of local runtimes.
I have set ClientAliveInterval=30, ServerAliveInterval=30, ClientAliveCountMax=(arbitrarily high value), and ServerAliveCountMax=(arbitrarily high value). I tried using nohup inside the VM, but it did not prevent the process from ending. I have observed that the script runs only while ps -a shows my SSH session; once the session is gone, so is the script.
I am using an M1 Mac on Ventura. The AMI I am using to create the VM is ami-08e9419448399d936 (Selenium-Webdriver-on-Headless-Ubuntu, based on Ubuntu 20.04).
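A common way to decouple a long run like this from the SSH session is a terminal multiplexer such as tmux (a sketch, assuming tmux is available or installable on the AMI; the session name renderer is arbitrary):

    sudo apt-get install -y tmux     # if not already on the image
    tmux new -s renderer             # start a named session on the VM
    python3 <script-name>.py         # run the script inside that session
    # detach with Ctrl-b then d; the script keeps running after you disconnect
    tmux attach -t renderer          # reattach later to check progress

Unlike a bare SSH session, the tmux session lives on the VM itself, so closing the laptop lid or losing the connection does not terminate the script.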
I regularly have Python scripts that take 8+ hours to complete that I want to run on a remote server. However, I don't want to go through the hassle of setting up a server, setting up an environment, running the script, and shutting down the server after the script is done every time.
Ideally, I'm looking for a CLI product like Heroku that spins up a server, runs the script in an environment and shuts down the server after the script is done.
AWS Lambda functions sound close to what I'm looking for, but they have a runtime limit (currently 15 minutes). Are there other solutions that would fit these criteria?
Thanks!
I have a project in which one of the tests consists of running a process indefinitely in order to collect data on the program execution.
It's a Python script that runs locally on a Linux machine, but I'd like other people on my team to have access to it as well, because there are specific moments when the process needs to be restarted.
Is there a way to set up a workflow on this machine that, when dispatched, stops and restarts the process?
You can execute commands on your Linux host via GH Actions and SSH. Take a look at this action.
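A rough sketch of such a workflow_dispatch workflow (appleboy/ssh-action is just one widely used SSH action, not necessarily the one linked above; the secret names and my-collector.service unit are placeholders):

    name: restart-collector
    on:
      workflow_dispatch:
    jobs:
      restart:
        runs-on: ubuntu-latest
        steps:
          - name: Restart the process over SSH
            uses: appleboy/ssh-action@master
            with:
              host: ${{ secrets.SSH_HOST }}
              username: ${{ secrets.SSH_USER }}
              key: ${{ secrets.SSH_KEY }}
              script: |
                sudo systemctl restart my-collector.service

Anyone on the team with access to the repository can then trigger the restart from the Actions tab.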
I have coded a Python script for Twitter automation using Tweepy. When I run it on my own Linux machine as python file.py, it runs successfully and keeps on running, because I have specified repeated tasks inside the script and I don't want to stop it either. But as it is on my local machine, the script can get stopped when my internet connection drops or at night, so I can't keep the script running on my PC the whole day.
So is there any way, website, or method where I could deploy my script and make it execute there forever? I have heard about cron jobs in cPanel, which can help with repeated tasks, but in my case I want to keep my script running on the machine until I close it myself.
Are there any such solutions? Most of the Twitter bots I see run forever, meaning their script is getting executed somewhere 24x7. This is what I want to know: how is that possible?
As mentioned by Jon and Vincent, it's better to run the code from a cloud service. But either way, I think what you're looking for is what to put into the terminal so the code keeps running even after you close the terminal. This is what worked for me:
nohup python code.py &
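By default nohup sends the script's output to nohup.out; redirecting it explicitly and checking on the process afterwards can be useful (bot.log is just an example file name):

    nohup python code.py > bot.log 2>&1 &
    ps aux | grep code.py     # confirm the bot is still running after you log out and back in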
You can add a systemd .service file (a minimal sketch follows the list below), which has the added benefits of:
logging (compressed logs at a central place, or over network to a log server)
disallowing access to /tmp and /home directories
restarting the service if it fails
starting the service at boot
setting capabilities (see setcap/getcap), for instance disallowing file access if the process only needs network access
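A minimal unit sketch covering these points, assuming the script lives at /opt/bot/code.py and runs as an unprivileged user (all names and paths are placeholders):

    [Unit]
    Description=Example long-running Python bot
    After=network-online.target

    [Service]
    Type=simple
    User=botuser
    ExecStart=/usr/bin/python3 /opt/bot/code.py
    Restart=on-failure
    PrivateTmp=true
    ProtectHome=true
    # capabilities can be restricted further with CapabilityBoundingSet= / AmbientCapabilities=

    [Install]
    WantedBy=multi-user.target

Save it as /etc/systemd/system/bot.service, then:

    sudo systemctl daemon-reload
    sudo systemctl enable --now bot.service
    journalctl -u bot.service        # compressed, centralised logs via journald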
So I am trying to set up a Continuous integration environment using Jenkins.
One of the build steps requires a series of mouse actions/movements to accomplish a task in Excel. I have already written a Python script using the ctypes library to do this.
The script works perfectly fine if I run it either through Jenkins or on the server itself when I am actively logged in to the server over a Remote Desktop connection, but as soon as I minimize/close the connection and then run the script from Jenkins, it seems the mouse events never get executed. Is there something I can add to the script to make this work? Thanks for any help you can provide.