I need to execute a Python script on a remote server (accessed through PuTTY), but I don't have a stable Internet connection, and every time I run the script I get problems after several minutes because my Internet disconnects.
How do I execute the script remotely without staying connected to the server?
(e.g. I connect to the server, run the script, and can log out while it is executing)
You can use GNU Screen: it opens a background terminal and keeps your shell alive even through network disruptions.
Start it by typing screen in your terminal and run your script there; even if you lose the connection, it won't kill the process.
Here you will find a well-explained how-to for this program. I use it every day for my work on remote machines.
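For example, a minimal session might look like this (the session and script names are placeholders):
screen -S mysession          # start a named screen session
python3 myscript.py          # run your script inside it
# detach with Ctrl+A then D; the script keeps running on the server
screen -r mysession          # reattach later to check on it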
Try this:
nohup your_script >/dev/null 2>&1 &
The program will keep running in the background.
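If you'd rather keep the output than discard it, a variant like this (the log name is a placeholder) lets you check progress later:
nohup your_script > script.log 2>&1 &
tail -f script.log   # follow the output; Ctrl+C here stops only the tail, not your script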
My Setup:
I have a Python script that I'd like to run on a remote host. I'm running a BASH script on my local machine that SSH's into my remote server, runs yet another BASH script, which then kicks off the Python script:
Local BASH script --> SSH --> Remote BASH script --> Remote Python script
The Python script configures a device (a DAQ) connected to the remote server and starts a while(True) loop of sampling and signal generation. When developing this script locally, I had relied on using Ctrl+C and a KeyboardInterrupt exception to interrupt the infinite loop and (most importantly) safely close the device sessions.
After exiting the Python script, I have my BASH script do a few additional chores while still SSH'd into the remote server.
Examples of my various scripts...
local-script.sh:
ssh user@remotehost "remote-script.sh"
remote-script.sh:
python3 infinite-loop.py
infinite-loop.py:
while True:
# do stuff...
My Issue(s):
Now that I've migrated this script to my remote server and am running it via SSH, I can no longer use the KeyboardInterrupt to safely exit my Python script. In fact, when I do, I notice that the device being controlled by the Python script is still running (the output signals from my DAQ keep changing as though the script were still active), and when I manually SSH back into the remote server, I find the persisting Python process and must kill it from there (otherwise I get two instances of the script running on top of one another the next time I run it). This leads me to believe that I'm actually exiting only the SSH session that my local script kicked off, leaving my remote BASH and Python scripts off wandering on their own... (updated, following the investigation outlined in the Edit 1 section)
In summary, using Ctrl+C while in the remote Python script results in:
Remote Python Script = Still Running
Remote BASH Script = Still Running
Remote SSH Session = Closed
Local BASH Script = Active ([Ctrl]+[C] lands me here)
My Ask:
How can I asynchronously interrupt (but not fully exit) a Python script that was kicked off over an SSH session via a BASH script? Bonus points if we can work within my BASH --> SSH --> BASH --> Python framework... whack as it may be. If we can do it with as few extra pip modules as possible, you just might become my favorite person!
Edit 1:
Per @dan's recommendation, I started exploring trap statements in BASH scripts. I have yet to implement this successfully, but as a way to test its effectiveness, I decided to monitor the process list at different stages of execution... It seems that, once started, I can see the processes for my SSH session, my remote BASH script, and the remote Python script it spawns. But when I use Ctrl+C to exit, I'm kicked back into the top-level "Local" BASH script, and when I check the process list on my remote server, I see both the remote BASH script and the remote Python script still running... so my remote BASH script is not stopping... I am, in fact, ONLY ending my SSH session...
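For reference, a minimal sketch of the kind of trap I was experimenting with (untested in my setup; it forwards the signal to the Python child so its KeyboardInterrupt cleanup can run):
#!/bin/bash
# hypothetical remote-script.sh: run the Python script as a child
# and forward SIGINT/SIGTERM to it so it can clean up
python3 infinite-loop.py &
child=$!
trap 'kill -INT "$child"' INT TERM
wait "$child"   # returns early when a trapped signal arrives
wait "$child"   # wait again so the Python cleanup can finish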
In combining the suggestions from the comments (and lots of help from a buddy), I've got something that works for me:
Solution 1:
In summary, I made my remote BASH script record its process group ID (the "GPID" in my scripts, which is also assigned to the Python script spawned by the remote BASH script) to a file, and then had the local BASH script read that file and kill the process group remotely.
Now my scripts look like:
local-script.sh
ssh user@remotehost "remote-script.sh"
remotegpid=`ssh user@ip "cat ~/Desktop/gpid_file"`
ssh user@ip "kill -SIGTERM -- -$remotegpid && rm ~/Desktop/gpid_file"
# ^ After the SSH closes, this goes back in to grab the GPID from the file and then kills it
remote-script.sh
ps -o pgid= $$ | xargs > ~/Desktop/gpid_file
# ^ This gets the BASH script's GPID and writes it to a file without whitespace
python3 infinite-loop.py
infinite-loop.py (unchanged)
while True:
# do stuff...
This solves most of the problem. Originally I had set out to be able to do things in my Python script after it was interrupted and before exiting into my BASH scripts, but it turned out I had a bigger problem to catch (what with the scripts continuing to run even after closing my SSH session)...
I want to execute a Python script when an alarm is created (McAfee ESM virtual machine - ESM 11.5.4 20220106). When creating the alarm, I configured "Execute remote command". The alarm is being triggered, but the script isn't getting executed. I tried as shown in the image below, but it didn't work.
I found a way to do this. You can't execute a script located on the ESM itself (in the image above, the IP is that of the ESM machine). You have to spin up another VM with the required ports open; refer to the image in the question. So the IP address, credentials, and script path have to be those of that VM.
Basically, you should have a separate command server for script execution and similar tasks.
Also, in my case I noted that only bash scripts were getting executed; Python scripts failed. The workaround is to point the ESM at a bash script and, inside that bash script, write the command that executes the Python script.
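For example, a wrapper along these lines can be registered in the ESM (the interpreter and script paths are assumptions for illustration):
#!/bin/bash
# hypothetical wrapper: the ESM runs this, which in turn runs the Python script
/usr/bin/python3 /home/user/scripts/alarm_handler.py "$@"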
Hope it helps!
I've coded a stock trading bot in Python 3. I have it hosted on a server (Ubuntu 18.10) that I SSH into using iTerm. I'm wondering how to keep the script actively running so that when I exit my session it won't kill the active process.
Basically, I want to SSH into my server, start the script, then close out and come back when the day is over to stop the process.
You could use nohup and add & at the end of your command to safely exit your session without killing the original process. For example, if your script's name is script.py:
nohup python3 script.py &
Normally, when running a command using & and exiting the shell afterwards, the shell will terminate the sub-command with the hangup signal (kill -SIGHUP <pid>). This can be prevented using nohup, as it catches the signal and ignores it so that it never reaches the actual application.
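After logging back in later, you can verify that the process survived, for instance (assuming the script name above):
pgrep -af "python3 script.py"   # lists the matching process and its full command line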
You can use screen
sudo apt-get install screen
screen
./run-my-script
Ctrl-A then D to get out of your screen
From there you will be able to close out your ssh terminal. Come back later and run
screen -ls
screen -r $screen_running
The screen to reattach is usually identified by the first five digits you see after you've listed all the screens. You can check whether your script is still running, or, if you've added logging, see where in the process you are.
Using tmux is a good option. Alternatively, you could run the command with an & at the end, which lets it run in the background.
https://tmuxcheatsheet.com/
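A minimal tmux session for this might look like the following (the session name is a placeholder):
tmux new -s trading        # start a named session
python3 script.py          # run the bot inside it
# detach with Ctrl+B then D; reattach later with: tmux attach -t trading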
I came here looking for nohup python3 script.py &.
The right solution for this thread is screen or tmux. :)
I am using Python with wxPython for the GUI. I am trying to connect over an SSH tunnel. After connecting to SSH, I want a new terminal to open so I can continue my work on the local machine. How do I achieve this?
I tried subprocess, pexpect, and paramiko; all of them can connect to SSH, but none opens the new terminal.
Below is the code I tried with pexpect:
import time
import sys
import pexpect
c = pexpect.spawn("ssh -Y -L xxxx:localhost:xxxx user@host.com")
time.sleep(0.1)
c.expect("[pP]assword")
c.sendline("xxxxxx")
time.sleep(0.2)
c.interact()
c.expect(r"\[user@host\.com ~\]\$")
# here, after it connects to SSH, the command below is never executed
c.sendline("xfce4-terminal")
Update (24/04/2013):
I am able to open the new terminal now, but when it opens, control from the GUI doesn't go to it. Any help?
Opening a new local terminal and connecting an existing process to it is a little complicated. There are at least three approaches:
Open the terminal before you start connecting, and run all the code that tries to establish the connection from within it. This is simplest. The main drawback is that the terminal will appear even if the connection fails, which might be what you want to avoid.
Run the connection attempt within a session of tmux or screen, and if you detect that it succeeded, reattach that session in a new terminal (see the sketch after this list).
Make your Python program provide a pty that the terminal can attach to; your program will need to hang around, passing input and output between the remote connection and the pty.
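For the second approach, a rough sketch (assuming tmux and xfce4-terminal are available; the session name is a placeholder and the ports are elided as in the question):
tmux new-session -d -s sshconn 'ssh -Y -L xxxx:localhost:xxxx user@host.com'
# once you detect that the connection succeeded, attach the session in a fresh terminal
xfce4-terminal -e 'tmux attach -t sshconn'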
I've set up an Amazon EC2 server. I have a Python script that is supposed to download large amounts of data from the web onto the server. I can run the script from the terminal through SSH; however, very often I lose the SSH connection, and when I do, the script stops.
Is there a method where I tell the script to run from the terminal, and when I disconnect, the script is still running on the server?
You have a few options.
You can add your script to cron to be run regularly (see the crontab sketch at the end of this answer).
You can run your script manually, and detach+background it using nohup.
You can run a tool such as GNU Screen, and detach your terminal and log out, only to continue where you left off later. I use this a lot.
For example:
Log in to your machine, run: screen.
Start your script and either just close your terminal or properly detach your session with: Ctrl+A, D, D.
Disconnect from your terminal.
Reconnect at some later time, and run screen -rD. You should see your stuff just as you left it.
You can also add your script to /etc/rc.d/ to be invoked on boot and always be running.
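For the cron option mentioned above, a minimal crontab entry might look like this (edit with crontab -e; the script and log paths are assumptions), here restarting the downloader at every reboot:
@reboot /usr/bin/python3 /home/ubuntu/download.py >> /home/ubuntu/download.log 2>&1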
You can also use nohup to make your script run in the background or when you have disconnected from your session:
nohup python3 script.py &
The & at the end tells the shell to run the command in the background; it is the shell, not nohup, that does the backgrounding, while nohup keeps the script alive after you disconnect.
If it's just a utility you run ad hoc, not a service daemon of some kind, I would just run it in screen. Then you can disconnect if you want and open the terminal back up later, or reconnect if you get disconnected. It should be in your Linux distro's package manager; just search for screen.
http://www.gnu.org/software/screen/
nohup runs the given command with hangup signals ignored, so that the command can continue running in the background after you log out.
Syntax:
nohup Command [Arg]...
Example:
nohup example.py
nohup rasa run
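Note that without &, nohup keeps the command in the foreground and ties up your shell; to get the prompt back and log out while it keeps running, background it as well, e.g.:
nohup python3 example.py &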
Also, you can run scripts on a schedule using cron.
For more:
https://ss64.com/bash/nohup.html
https://opensource.com/article/17/11/how-use-cron-linux