I'm looking to create a script in Python that initiates an SSH session with a server. I know it should be a simple process; I'm just not sure where to start. My ultimate plan is to automate this script to run on startup. I'm not even sure Python is the best way to go; I just know it comes preloaded on Raspbian for the Pi.
A simple bash script would be better suited to the task. It is possible with Python, but there's no obvious reason to make it harder than necessary.
From "write a shell script to ssh to a remote machine and execute commands":
#!/bin/bash
USERNAME=someUser
HOSTS="host1 host2 host3"
SCRIPT="pwd; ls"
for HOSTNAME in ${HOSTS} ; do
ssh -l ${USERNAME} ${HOSTNAME} "${SCRIPT}"
done
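If you do want to stay with Python, a rough equivalent of the loop above can be written with the third-party paramiko library. This is only a sketch; it assumes paramiko is installed and that key-based authentication is already set up, and the user and host names are the same placeholders as in the bash example:
import paramiko

username = "someUser"
hosts = ["host1", "host2", "host3"]
commands = "pwd; ls"

for hostname in hosts:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # don't fail on unknown host keys
    client.connect(hostname, username=username)                   # assumes key-based auth
    stdin, stdout, stderr = client.exec_command(commands)
    print(stdout.read().decode())
    client.close()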
From "How do I run a script at start up?" (Ask Ubuntu):
You will need root privileges for any of the following. To get root, open
a terminal and run the command
sudo su
and the command prompt will change to '#' indicating that the terminal
session has root privileges.
Alternative #1. Add an initscript.
Create a new script in /etc/init.d/myscript.
vi /etc/init.d/myscript
(Obviously it doesn't have to be called "myscript".) In this script,
do whatever you want to do. Perhaps just run the script you mentioned.
#!/bin/sh
/path/to/my/script.sh
Make it executable.
chmod ugo+x /etc/init.d/myscript
Configure the init system to run this script at startup.
update-rc.d myscript defaults
Alternative #2. Add commands to /etc/rc.local
vi /etc/rc.local
with content like the following.
# This script is executed at the end of each multiuser runlevel
/path/to/my/script.sh || exit 1 # Added by me
exit 0
Alternative #3. Add an Upstart job.
Create /etc/init/myjob.conf
vi /etc/init/myjob.conf
with content like the following
description "my job"
start on startup
task
exec /path/to/my/script.sh
Depending on what you do with the SSH connection, if it needs to stay open for the entire uptime of the device you will need some more trickery, however, since SSH connections are automatically closed after a period of inactivity.
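With the plain ssh client, the usual way around that is the ServerAliveInterval option (e.g. ssh -o ServerAliveInterval=60 ...). If you went the paramiko route from the sketch above instead, the rough equivalent is to ask the transport to send keepalive packets (client being the SSHClient from that sketch):
transport = client.get_transport()
transport.set_keepalive(30)  # send a keepalive packet every 30 seconds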
I need to write a script in Python which stops a service running on an EC2 instance just before it reboots.
What should the approach be here?
Suppose the service name is abc.
To manually stop this service, I execute:
service abc stop
I want to automate this before the instance goes for a reboot.
Help me with this.
The cron scheduler allows you to use @reboot in place of a schedule string. So, write your script, then run crontab -e and add the following:
@reboot /path/to/python /path/to/reboot_script.py
For the reboot script, Python isn't really your best choice. I would write it as a bash script, looking something like this:
#!/bin/bash
service abc stop
If you must use Python, you could use invoke:
from invoke import run
run('service abc stop', hide=True, warn=True)
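If you would rather not pull in an extra dependency, the standard library's subprocess module can do the same thing (a minimal sketch, for Python 3.5+):
import subprocess

# Stop the service; check=True raises CalledProcessError if the command fails.
subprocess.run(["service", "abc", "stop"], check=True)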
Edit:
To run this BEFORE reboot, make a shell script called K99kill_service and add it to your /etc/rc6.d directory. These scripts are executed before reboot or shutdown. It's important that it is named as above to make sure it runs at the right time and doesn't interfere with other shutdown scripts.
Courtesy: https://opensource.com/life/16/11/running-commands-shutdown-linux
See the link; it explains in detail how to shut down services in Linux before shutdown, for both SysVinit and systemd.
The Python script script.py is located in /usr/bin/monitor/scripts and its main function is to use subprocess.check_call() and subprocess.check_output() to call various administrative tools (both C programs located in /usr/bin/monitor/, created specifically for the machine, and Linux executables in /sbin like fdisk -l and df -h). It was written to run as root and print the output from these programs in a useful way on the command line.
My project is to make the output from this script viewable through a webpage. I'm on a Beaglebone Black using Apache2, which executes files as user www-data from its DocumentRoot, /var/www/html/. The webpage is set up like this:
index.html uses an iframe to display the output of a python CGI script which is also located in /var/www/html/
script.cgi attempts to call/display output from script.py output using the subprocess module
The problem is that script.py is being called just fine, but each of the calls within script.py fails and returns script.py's error messages, presumably because they need to be run as root while Apache is running them as user www-data.
To try to get around this, I created a new group called bbb, added www-data to the group, then ran chown :bbb script.py to change its group to bbb. Unfortunately it was still causing the same problems, so I tried changing permissions from 755 to 775, which didn't work either. I tried running chown :bbb * on the files/programs that script.py uses, also to no avail. Also, some of the executables script.py uses are in /sbin, and I am wary of just giving blanket access to directories like that.
Since my attempts at fixing ownership issues felt a bit like 1000-monkey coding, I created a new version of the script that builds a list of HTML output: after each print statement in the original code, I append the same line of text, as a string with HTML tags, to the HTML output list. At the end of the script (in whatami) it creates and writes a .txt file in /var/www/html/ and calls os.chmod("/var/www/html/data.txt", 0o755) to give Apache access. The CGI then calls subprocess.check_call() on script.py, then opens, reads, and prints each line with HTML formatting to the iframe in the webpage. This attempt at least resulted in accurate output, but it only updates when the script is run in a terminal as root, rather than re-running script.py every time the page is refreshed, which rather undermines the point of the webpage. I assume this means the subprocess check_call in the CGI script is not working correctly, but the subprocess call itself doesn't throw any errors or other indication of failure; the text file simply comes back not updated. Even with the subprocess call in a try block followed by a print('call successful'), it prints the success message and the text file is still not updated.
I'm a bit at a loss trying to figure out how to just force the script to run and do its thing in the background so that the file will update, without just giving Apache root access. I've read a few things about either wrapping the Python script in a shell that causes it to be run as root, or changing sudoers to give www-data sudo privileges, but I do not want to introduce security issues or make what was intended to be a simple script exposing output to a webpage more convoluted than it already is. Any advice or direction would be greatly appreciated.
Best way IMO would be to "decouple" execution, by creating a localhost-only service which you "call" from the apache process by connecting via a local socket.
E.g. if using systemd:
Create: /etc/systemd/system/my-svc.socket
[Unit]
Description=My svc socket
[Socket]
ListenStream=127.0.0.1:1234
Accept=yes
[Install]
WantedBy=sockets.target
Create: /etc/systemd/system/my-svc@.service
[Unit]
Description=My Service
Requires=my-svc.socket
[Service]
Type=simple
ExecStart=/opt/my-service/script.sh %i
StandardInput=socket
StandardError=journal
TimeoutStopSec=5
[Install]
WantedBy=multi-user.target
Create /opt/my-service/script.sh:
#!/bin/sh
echo "id=$(id)"
echo "args=$*"
Finish setup with:
$ sudo chmod +x /opt/my-service/script.sh
$ sudo systemctl daemon-reload
$ sudo systemctl enable --now my-svc.socket
Try it out:
$ nc 127.0.0.1 1234
id=uid=0(root) gid=0(root) groups=0(root)
args=55-127.0.0.1:1234-127.0.0.1:32938
Then, from your CGI, you'll need to do the equivalent of the nc command above (just a TCP connection).
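For example, a minimal Python 3 sketch of that connection (the address matches the ListenStream= line above; formatting the output for the page is up to you):
import socket

# Connect to the socket-activated service and read everything it writes back.
with socket.create_connection(("127.0.0.1", 1234)) as conn:
    output = conn.makefile().read()
print(output)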
--jjo
I'm running Flask on an AWS instance. My goal is to be able to have Flask running on its own, without me having to ssh into it and run
python app.py
Is there a way to have this command run every time the AWS instance itself reboots?
Yes, there is a way to start the Python script on reboot.
On Linux you will find the /etc/init.d directory. You will need to write your own init.d script and put it inside /etc/init.d, and that script will start your Python script. It isn't magic, though; there is a fixed format for init.d scripts. The script contains some basic functions like start(), stop(), reload(), etc. Just add the code that you want to run at startup to the start() block.
Reference link: https://bash.cyberciti.biz/guide//etc/init.d
Try this:
(crontab -l 2>/dev/null; echo '@reboot python /path/to/app.py') | crontab -
I have 3-4 Python scripts which I want to run continuously on a remote server (which I am accessing through an SSH login). The scripts might crash due to exceptions (handling all the exceptions is not practical right now), and in such cases I want to restart the script immediately.
What I have done so far:
I have written a .conf file in /etc/init:
chdir /home/user/Desktop/myFolder
exec python myFile.py
respawn
This seemed to work fine for about 4 hours, then it stopped working and I could not start the .conf file.
Please suggest changes to this; I am also open to a new approach.
The easiest way to do it is to run an infinite bash loop inside screen. It's also the worst way to do it:
screen -S sessionName bash -c 'while true; do python myFile.py; done'
You can also use http://supervisord.org/ or run it as a daemon by writing an init.d script for it: http://www.linux.com/learn/tutorials/442412-managing-linux-daemons-with-init-scripts
If your script is running on an Ubuntu machine you have the very convenient Upstart, http://upstart.ubuntu.com/
Upstart does a great job running services on boot and respawning a process that died.
I need to keep my Python program running in the background on my Raspberry Pi after I close the SSH connection, because I need to save Phidget information to a SQL DB.
I tried to do this with nohup, but it seems that the Python program isn't even executed, because when I look into the MySQL DB after doing the below, nothing is inserted.
I type in:
pi@raspi ~/MyProjekt $ sudo nohup python sensorReader.py &
[1] 8580
and when I try to check whether this process exists with:
ps -A| grep 8580
it returns nothing.
So am I doing something wrong?
How can I keep the Python program running after closing the SSH connection?
I would recommend running your Python program from a cron @reboot job.
To edit root's cron jobs, use
sudo crontab -e
And add the line
@reboot sudo python full_path/MyProjekt/sensorReader.py
Then reboot your pi with:
sudo reboot
And then confirm that your process is running:
ps -aux | grep python
I don't think this is an SSH connection issue; from what you say, the program seems to execute and exit. Does your .py run in an infinite loop? If not, you shouldn't expect it to stay alive.
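For example, a program that is supposed to keep logging sensor readings needs its own loop, roughly like this (read_phidget and save_to_db are placeholders standing in for whatever your script actually does):
import time

def read_phidget():
    return 0  # placeholder: read the Phidget sensor here

def save_to_db(reading):
    pass      # placeholder: insert the reading into the MySQL DB here

while True:
    save_to_db(read_phidget())
    time.sleep(5)  # poll every few seconds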
As for keeping a process alive after its parent (the shell, in your case) has terminated, nohup is the answer: it means "ignore HUP signals" (those sent by a terminating parent process).
The '&' just means 'execute in background'.
The cron solution is good if your program is meant to do something periodically, but if it should stay alive waiting for some event (like listening to a socket), I would prefer to create an init script, so that the program is run as a daemon at boot time and only in the desired runlevels.