I have a server that I connect to via ssh. I would like to run a command, then disconnect from the server knowing that the process keeps running and will finish in a few hours.
So I came up with
python script.py > output.txt 2>&1 &
It works! ... well, sometimes ... and sometimes (with exactly the same Python script) it fails without leaving any error message in output.txt. Does anyone know why? And what should I run to prevent it from failing?
I doubt it has anything to do with the Python script itself.
You're probably looking for nohup.
nohup python script.py > output.txt 2>&1 &
It allows you to logout with the script still running.
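For example, a typical session might look like this (a sketch; output.txt is just the log file name used in the command above):

nohup python script.py > output.txt 2>&1 &
exit                # log out; nohup keeps the script running
# ... after reconnecting later ...
tail output.txt     # check on its progress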
It won't leave any error message in output.txt. Try this:
python script.py 1> output.txt 2> error.txt &
If your script fails, the reason should be logged in error.txt.
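If you also want to check later whether the script is still alive, one option is to record its PID when you start it (a sketch; script.pid is just an assumed file name, and $! is the shell variable holding the PID of the last background job):

python script.py 1> output.txt 2> error.txt &
echo $! > script.pid
# later, after reconnecting:
kill -0 "$(cat script.pid)" && echo "still running" || echo "finished or failed"

kill -0 sends no signal; it only tests whether the process still exists.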
Related
My question seems quite easy, but for some reason I didn't find a quick answer to it. I have a Python script that I want to run from the terminal command line (Ubuntu Linux server), and that part works for me. But then I can't use the command line until the script ends. The script takes a long time to run, and I would like to continue using the command line to perform other tasks. How can I run the script without its progress being shown on the command line, while keeping it running? And how can I see the active processes running on the server, to check whether a given process is still running?
Run the script command:
python script.py
and add & echo "123" after it, as shown below:
The script takes a long time to run, and I would like to continue
using the command line to perform other tasks.
It seems that you want to run the process in the background; please try the following:
python script.py &
echo "123"
It should start your script.py and then output 123 immediately (without waiting for script.py to end).
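Since a long-running script would otherwise keep printing into the terminal you are trying to reuse, you may also want to redirect its output into a file when you background it (a sketch; script.log is just an assumed name):

python script.py > script.log 2>&1 &
echo "123"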
how can I see the active processes that are running on the server to
see if a process is running?
Using ps command
ps -ef
will list all processes, which you will probably want to filter to find the one you're interested in.
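For example, you can pipe the listing through grep, or use pgrep to match on the command line (a sketch; script.py stands in for whatever your script is called):

ps -ef | grep "[s]cript.py"   # the [s] keeps the grep process itself out of the results
pgrep -af "script.py"         # -a prints the full command line next to each PID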
I have ten Python scripts in the same directory. How can I run all of them from the command line so that they keep working in the background?
I use an SSH terminal to connect to a CentOS server and run a Python script as:
python index.py
But when I close the SSH client terminal, the process dies.
You can use & to make things run in the background, and nohup so they continue after logout, for example:
nohup python index.py &
If you want to run multiple things this way, it's probably easiest to just make a script to start them all (with a shell of your choice):
#!/bin/bash
nohup python index1.py &
nohup python index2.py &
...
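If the scripts all sit in one directory, a loop variant of the same starter script works too (a sketch, assuming the scripts match *.py in the current directory and that one log file per script is acceptable):

#!/bin/bash
# Start every Python script in this directory in the background, surviving logout.
for script in ./*.py; do
    nohup python "$script" > "${script%.py}.log" 2>&1 &
done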
As long as you don't need to interact with the scripts once they are started (and don't need any stdout printing), this can be automated fairly easily with another Python script using the subprocess module:
import subprocess

for script in listofscripts:
    # use subprocess.run() for Python 3.x (this blocks until each script terminates)
    subprocess.call(["python", script], *args)  # use Popen if you want non-blocking
*args is a link (its coloring got overwritten by the code highlighting).
Also of note: stdout/stderr printing is possible, it's just more work.
I just set up my first AWS server for a personal project I'm doing. I'm running Ubuntu Linux, and I have a Python script that accesses an SQLite database file in order to send email. I have these same files on my own Ubuntu machine and the script works fine there. I'm having trouble, however, figuring out how to run my script from the terminal in my AWS VM. Normally I use IDLE to run my Python scripts on my Linux machine, so I'm trying to figure out how to run one from the terminal, and it's giving me some trouble.
I tried
python script.py
which did nothing, so I converted it to an executable, ran it, and got
./script.py: line 1: import: command not found
...and so on
I realized that I had to add
#!/usr/bin/env python3
to my script, so I did that, made it executable again, and ran it by entering
./script.py
which also did nothing. If the program had run, it would have delivered an email to my inbox. Is there any way I can tell if it's actually trying to run my script? Or am I trying to run it incorrectly?
You can modify the script to add verbose output that prints the status of the script to the console. Or, if you just want to know whether your script is running in the background, you can check whether the process is active using ps (the process name will be the name of the script):
ps aux | grep "script.py"
Anyway, the former is better practice, since then you know exactly how far your script has gotten.
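If you prefer not to modify the script, another way to see whether it is actually doing anything is to capture its output in a log file and follow that log (a sketch; script.log is just an assumed name, and -u disables Python's output buffering so prints appear immediately):

python3 -u script.py > script.log 2>&1 &
tail -f script.log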
I am a newbie with Fabric and want to run one command in the background. It is written in a shell script, and I have to run that command via Fabric, so let's assume I have a shell script like this:
#!/bin/bash
java &
Consider this to be a file named myfile.sh.
Now in Fabric I am using this code to run my script as:
put('myfile.sh', '/root/temp/')
sudo('sh /root/temp/myfile.sh')
Now this should start the Java process in the background, but when I log in to the machine and check with the jobs command, nothing shows up.
Where is the problem? Please shed some light.
Use it with
run('nohup PATH_TO_JMETER/Jmetercommand & sleep 5; exit 0')
Maybe the process exits before you return. When you type java on its own, it normally just prints a help message and exits. Try a sleep statement or something that lingers. And if you want to run it in the background, you could also append & to the sudo call.
I use run("screen -d -m sh /root/temp/myfile.sh",pty=False). This starts a new screen session in detached mode, which will continue running after the connection is lost. I use the pty=False option because I found that when connecting to several hosts, the process would not be started in all of them without this option.
On the remote server, I have a script test.sh like this:
#!/bin/bash
echo "I'm here!"
nohup sleep 100&
From my local machine, I run 'fab runtest' to call the remote test.sh.
def runtest():
    run('xxxx/test.sh')
I can get the output "I'm here!", but I cannot find the sleep process on the remote server.
What did I miss?
Thanks!
Is it possible to run nohup inside the script on the remote machine?
I checked the answer here and the Fabric FAQ, and also picked up hints from "fabric appears to start apache2 but doesn't"; combining them works for me.
You can keep your test.sh unchanged and add pty=False along with the related shell redirections:
from fabric.api import *
def runtest():
    run("nohup /tmp/test.sh >& /dev/null < /dev/null &", pty=False)
At least, it works for me.
According to the Fabric FAQ, you can no longer do this reliably. Instead you should use tmux, screen, or dtach, or better yet the Python daemon package:
import daemon
from spam import do_main_program
with daemon.DaemonContext():
    do_main_program()
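If you would rather stay on the shell side, the tmux route mentioned above looks roughly like this (a sketch; the session name longjob and the script path are placeholders):

tmux new-session -d -s longjob 'python script.py > output.txt 2>&1'
tmux ls                   # confirm the session is still there after reconnecting
tmux attach -t longjob    # reattach if you want to watch it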
We ran into this problem and found that you can use nohup in a command, but not in the script itself.
For example, run('nohup xxxx/test.sh') works.