Running a Python script from systemctl at launch - python

I have 3 systemd services I created that run a Python script and pass a key argument, because the scripts download datasets that require that key. When I enable the service it works just fine, but if I reboot the EC2 instance it's on, it fails at launch with:
(code=exited, status=255)
I want it to run on launch because I have a workflow that uses EventBridge to trigger a Lambda function that turns on the instance at a specific time to download the dataset to S3 and begin the ETL process. Why would the service run as intended with sudo systemctl start service-name.service but fail on startup?

Hmmm, this will depend on how you're running things on the EC2 instance. Basically, there are two ways.
Via Cloud-init / EC2 userdata
You can specify whether the script will be executed on the first boot (when the EC2 instance is created), on every (re)boot, etc.
You can check the official docs for that:
Cloud-init: Event and Updates
AWS - Run commands on your Linux instance at launch
How can I utilize user data to automatically run a script with every restart of my Amazon EC2 Linux instance?
Via Linux systemd
You can use the example below (just remove the comments, and add the Requires= and After= lines if needed):
## doc here: https://man7.org/linux/man-pages/man5/systemd.unit.5.html#[UNIT]_SECTION_OPTIONS
[Unit]
Description=Startup script
# Requires=my-other-service.target
# After=network.target my-other-service.target
## doc here: https://man7.org/linux/man-pages/man5/systemd.service.5.html#OPTIONS
[Service]
Type=oneshot
ExecStart=/opt/myscripts/startup.sh
## doc here: https://man7.org/linux/man-pages/man5/systemd.unit.5.html#[INSTALL]_SECTION_OPTIONS
[Install]
WantedBy=multi-user.target
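Given the symptoms in the question, one likely cause of status=255 at reboot is that the service starts before networking is up, so the dataset download fails. A hedged variant of the unit above that waits for the network and passes the key (the script path, service name, and --key flag are illustrative placeholders, not the asker's actual setup):
[Unit]
Description=Download dataset at boot
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
# the key could also live in an EnvironmentFile instead of the command line
ExecStart=/usr/bin/python3 /opt/myscripts/download.py --key YOUR_KEY

[Install]
WantedBy=multi-user.target
Note that network-online.target only actually delays startup if a wait-online service (e.g. systemd-networkd-wait-online or NetworkManager-wait-online) is enabled for your network manager.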

Related

How to schedule a shell command to run in VM instance on GCP?

I want to schedule a shell command on a VM instance to run weekly.
How it would work:
Once a week, Cloud Scheduler triggers a Pub/Sub message
Pub/Sub then pushes the message to the VM instance's HTTP endpoint
This in turn causes the shell command to run
I have no problem with steps one and two, but I am struggling with how to get the shell command to execute.
One thing I have considered is installing Python on the VM instance and then creating a Python script that runs an OS system command:
import os

cmd = "some command"  # placeholder for the shell command to run
os.system(cmd)
But again, my problem is: how do I get the HTTP POST request to cause the Python script to run?
I would do it differently:
Cloud Scheduler calls a Cloud Function (or Cloud Run)
The Cloud Function starts an instance with a startup script that runs the batch process and then shuts the instance down.
If you need to pass arguments to the script, you can do it using instance metadata when you create the instance (or while it is already running).
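As a sketch of the metadata part: the startup script can read custom metadata back from the GCE metadata server. The attribute name batch-args below is hypothetical:
import requests

# custom instance metadata is exposed under instance/attributes/
URL = ("http://metadata.google.internal/computeMetadata/v1/"
       "instance/attributes/batch-args")

resp = requests.get(URL, headers={"Metadata-Flavor": "Google"})
resp.raise_for_status()
print(resp.text)  # the value set when the instance was created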

Can you remotely trigger scripts in your repo with Jenkins? (The script is not related to builds)

There is a Python script in my repo that I would like to run whenever I call an API. This Python script merely transfers data from one database to another. The Jenkins server for the project is currently used for builds/pipelines/running tests, and I was wondering if I could use this Jenkins service to run the script when I call an API, since I found that Jenkins allows you to remotely trigger scripts via REST.
The Python script is built from a Python image in the Dockerfile, so Docker sets up Python and the dependencies needed to run the script; the commands Jenkins would run are something like docker build and docker run.
Yes, you can.
Just set up a pipeline that:
Runs in Docker (with your image). Have a look at this
Does a git clone of your repository
Runs your Python script with something like: sh "python <your script>"
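To call that job from your own code, Jenkins's "Trigger builds remotely" option exposes a simple REST endpoint; a rough Python sketch (the URL, job name, and credentials are placeholders):
import requests

JENKINS_URL = "https://jenkins.example.com"  # placeholder
JOB_NAME = "transfer-data"                   # placeholder job name

# requires "Trigger builds remotely" enabled on the job,
# with MY_TRIGGER_TOKEN configured as the job's token
resp = requests.post(
    f"{JENKINS_URL}/job/{JOB_NAME}/build",
    params={"token": "MY_TRIGGER_TOKEN"},
    auth=("my-user", "my-api-token"),        # Jenkins user + API token
)
resp.raise_for_status()
print(resp.status_code)  # 201 means the build was queued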

Trigger a Python script on an EC2 instance via a Lambda function?

I currently have an EC2 instance set up that runs a script when booted up (I do this by calling the script in the user data field for the EC2 instance). Then I just send a command via Lambda to start the EC2 instance, and that causes the script to run. I would now like to run multiple scripts: ideally I'd have something like a Lambda that starts the EC2 instance, then sends a notification to a second Lambda when it is up and running to run various scripts on there before shutting it back down. How can I trigger a Python script on a running EC2 instance via Lambda?
Thanks
EDIT:
I believe I've found a quick solution. In the user data I point to a script like "startup.py".
In this script I can just import whatever series of scripts I want to execute. I just have to figure out the paths, as the user data script executes in a different directory than /home/ec2-user/.
To run commands on EC2 instances from outside of them, you should consider using AWS Systems Manager Run Command. This allows you to run multiple, arbitrary commands across a number of instances that you choose (by instance ID, by resource group, or by tags). You can do this from the AWS console, CLI, or SDK.
Note that to use this, your instances need to be configured correctly.
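A minimal boto3 sketch of that approach (the instance ID and script path are placeholders; the instance needs the SSM agent running and an instance profile that allows Systems Manager):
import boto3

ssm = boto3.client('ssm')

# AWS-RunShellScript is a built-in SSM document that runs shell commands
response = ssm.send_command(
    InstanceIds=['i-0123456789abcdef0'],  # placeholder
    DocumentName='AWS-RunShellScript',
    Parameters={'commands': ['python /home/ec2-user/startup.py']},
)
print(response['Command']['CommandId'])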
You can also trigger a Python script on an EC2 instance by scheduling SSH jobs.
This is the link that can help you.

.py file on an EC2 instance to execute on an event from an S3 bucket

I have a .py file that's on an EC2 instance. I'm trying to have the .py file run when an event (a file uploaded to an S3 bucket) occurs.
I currently have an event notification that is sent to an AWS Lambda function that starts the EC2 instance; here is that code from the AWS console:
import boto3

instance_ids = ['<ec2-ID>']  # placeholder for the instance ID

def lambda_handler(event, context):
    ec2 = boto3.client('ec2')
    ec2.start_instances(InstanceIds=instance_ids)
I can manually go into PuTTY and type "python test.py" to run my program, and it works, but I want to get rid of the "having to do it manually" part and have it just run itself whenever there is an event.
I am stumped as to how to progress.
I thought that by "starting" my EC2 instance it would run that .py file and get to work processing what's in the S3 bucket.
There are no error messages; it just doesn't do anything at all. It's supposed to work like this: once a file is uploaded to the S3 bucket, a notification is sent to the Lambda to have the EC2 instance start processing the file with the .py file that is on it.
Kind regards
This is a nice trick you can try - https://aws.amazon.com/premiumsupport/knowledge-center/execute-user-data-ec2/
This overrides the fact that user data is normally executed only when the instance is first created; this method allows you to execute user data scripts on every boot. Just update the bash from:
/bin/echo "Hello World" >> /tmp/testfile.txt
to:
python /file_path/python_file.py &
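If I recall the linked article correctly, the trick is a multipart user data whose cloud-config part tells cloud-init to run user scripts on every boot, roughly like this (check the article for the exact MIME wrapper):
#cloud-config
cloud_final_modules:
- [scripts-user, always]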
Take a look at AWS Systems Manager Run Command as a way to run arbitrary scripts on EC2. You can do that from your boto3 client, but you'll probably have to use a boto3 waiter to wait for the EC2 instance to restart.
Note that if you're only starting the EC2 instance and running this script infrequently then it might be more cost-effective to simply launch a new EC2 instance, run your script, then terminate EC2. While the EC2 instance is stopped, you are charged for EBS storage associated with the instance and any unused Elastic IP addresses.
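A sketch of the start-and-wait part with boto3 (the instance ID is a placeholder):
import boto3

ec2 = boto3.client('ec2')
instance_ids = ['i-0123456789abcdef0']  # placeholder

ec2.start_instances(InstanceIds=instance_ids)

# block until the instance passes its status checks
ec2.get_waiter('instance_status_ok').wait(InstanceIds=instance_ids)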
Use Cron:
$ sudo apt-get install cron
$ crontab -e
# choose option 3 (vim) if asked to pick an editor
# type "i" to insert text, then add the line below
@reboot python /path_directory/python_test.py &
# type ":wq" to save and exit
To find the .py file, run:
sudo find / -type f -iname "python_test.py"
Then add the path to Cron.
If all you need is to run some Python code and the main limitation is running time, it might be a better idea to use Lambda to listen to the S3 event and Fargate to execute the task. The main advantage is that you don't have to worry about starting/stopping your instance, and scaling out is easier.
There is a nice write-up of a working use case at the serverless blog.
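A rough sketch of the Lambda side of that architecture with boto3 (the cluster, task definition, and subnet are placeholders):
import boto3

ecs = boto3.client('ecs')

def lambda_handler(event, context):
    # launch the containerized job on Fargate; all names are placeholders
    ecs.run_task(
        cluster='etl-cluster',
        launchType='FARGATE',
        taskDefinition='etl-task:1',
        networkConfiguration={
            'awsvpcConfiguration': {
                'subnets': ['subnet-0123456789abcdef0'],
                'assignPublicIp': 'ENABLED',
            }
        },
    )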

How to run a command (Python file) on boot on an AWS EC2 server

I'm having some problems making a Python file run every time the AWS server boots.
I am trying to run a Python file to start a web server on an Amazon Web Services EC2 server.
But I have only limited permissions to edit the systemd folder and other folders such as init.d.
Is there anything wrong?
Sorry, I don't really understand EC2's OS; it seems a lot of methods don't work on it.
What I usually do via ssh to start my server is:
python hello.py
Can anyone tell me how to run this file automatically every time the system reboots?
It depends on your Linux OS, but you are on the right track (init.d). This is exactly where you'd want to run arbitrary shell scripts at startup.
Here is a great HOWTO and explanation:
https://www.tldp.org/HOWTO/HighQuality-Apps-HOWTO/boot.html
and another Stack Overflow question specific to running a Python script:
Run Python script at startup in Ubuntu
If you share your Linux OS, I can be more specific.
EDIT: This may help; it looks like they have some sort of launch wizard:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/user-data.html
When you launch an instance in Amazon EC2, you have the option of passing user data to the instance that can be used to perform common automated configuration tasks and even run scripts after the instance starts. You can pass two types of user data to Amazon EC2: shell scripts and cloud-init directives. You can also pass this data into the launch wizard as plain text, as a file (this is useful for launching instances using the command line tools), or as base64-encoded text (for API calls).
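As a minimal example of the shell-script flavor of user data (the path to hello.py is a placeholder):
#!/bin/bash
# runs as root on first boot; output is logged to /var/log/cloud-init-output.log
python /home/ec2-user/hello.py &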
