Problems with nohup of a Python script which calls another nohup?

The problem is as follows.
I have a Python script, "test.py", which runs a shell command:
os.system('nohup some_command > b.txt &')
print(some_results)
This works well: the output of "some_command" is redirected to "b.txt", and only "some_results" is printed to the terminal.
I want to run the script in the background (so it keeps running after I exit the terminal), so I use this command:
nohup python test.py > a.txt &
Now everything, including "some_results" and the output of "some_command", is redirected into "a.txt", which prevents the program from running properly.
Is there a way to redirect only "some_results" to "a.txt", instead of everything? What should I do?
PS: I don't know which keywords to search for; searching for "nohup in nohup" has not turned up a relevant solution.
============================= Some unsuccessful attempts =========================
After reading a recommended question: Redirect stdout to a file in Python?
I had an idea:
import os
import sys

sys.stdout = open('c.txt', 'w')
os.system('nohup some_command > b.txt &')
print(some_results)
But the problem is still not solved.
nohup python test.py > a.txt & redirects the Python output to "c.txt" (as expected), but everything else to "a.txt" instead of "b.txt", and the program fails.
python test.py > a.txt & works temporarily: it redirects the Python output to "c.txt", the command output to "b.txt", and leaves "a.txt" blank, as expected (sort of).
However, the terminal is flooded with "nohup: redirecting stderr to stdout" messages each time the os.system command is called. After restarting the terminal the messages no longer appear and the program keeps running, but the redirection becomes the same as with nohup python test.py > a.txt &.
============== Some additional information =====================
The os.system(blabla) is executed multiple times.
The "some_command" is actually "pocketsphinx", which outputs a lot of logs including the time alignment of phonemes, and finally a line describing the phoneme sequence without time alignment. What I need is the "time alignment" section.
In normal conditions, the last line always follows the info section, no matter where they are printed.
In my problem, the last line is always in the "b.txt" correctly. The info (including the time alignments which I want them to be in "b.txt") are redirected to somewhere else.

In your script, just redirect both stdout and stderr of the command to a file; no nohup, no background. The outer nohup already detaches the whole script, so the inner nohup and & are unnecessary, and 2>&1 sends the command's stderr (where the nohup notice and, most likely, the pocketsphinx logs end up) into "b.txt" instead of "a.txt":
os.system('some_command > b.txt 2>&1')
print(some_results)
In the terminal:
nohup python my_script.py > a.txt &
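If you would rather not embed the redirection in a shell string at all, here is a minimal sketch using subprocess instead of os.system (assuming Python 3.5+ for subprocess.run; 'some_argument' is a placeholder):
import subprocess

# Send both the command's stdout and stderr to b.txt. The script's own
# print() output still goes wherever the script's stdout points, i.e. to
# a.txt when started with `nohup python test.py > a.txt &`.
with open('b.txt', 'w') as log:
    subprocess.run(['some_command', 'some_argument'],
                   stdout=log, stderr=subprocess.STDOUT)

print(some_results)  # some_results comes from elsewhere in test.py, as in the question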

Related

Call python script as module with input from bash script

From a bash function, I want to call a Python script that prompts for input, and I need to run that script as a module using python -m.
Here is select_pod.py:
# above this will be a print out of pods
pod = input('Pick pod')
print(pod)
Here is the bash function:
function foo() {
    POD=$(python3 -m select_pod)
    kubectl exec $POD --stdin --tty bash
}
I can't get the input to work, i.e. "Pick pod" is not printed to the terminal.
When you do POD=$(python3 -m select_pod), the POD=$(...) construct captures anything printed to stdout inside the parentheses in the POD variable instead of letting it reach the screen. Simply echoing POD afterwards is no good either, since that only happens once the Python script has finished.
What you need to do is to duplicate the output of the Python program. Assuming Linux/Posix, this can be done using e.g.
POD=$(python3 -m select_pod | tee /dev/stderr)
Because your terminal shows both stdout and stderr, duplicating the output from stdout to stderr makes the text show up.
Hijacking the error channel for this might not be ideal, e.g. if you want to later sort the error messages using something like 2> .... A different solution is to just duplicate it directly to the tty:
POD=$(python3 -m select_pod | tee /dev/tty)
You can change sys.stdout before calling input:
import sys

# Point sys.stdout at stderr so the 'Pick pod' prompt reaches the terminal
# instead of being captured by POD=$(...).
save_sys_stdout = sys.stdout
sys.stdout = sys.stderr
pod = input('Pick pod')
sys.stdout = save_sys_stdout
print(pod)  # only this line goes to the captured stdout
That way POD=$(python3 -m select_pod) works as intended, and you don't need to split the captured output afterwards.
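Along the same lines, a minimal sketch that writes the prompt straight to stderr, so no sys.stdout swap is needed (the 'Pick pod' prompt is taken from the question; the rest is illustrative):
import sys

# select_pod.py: the prompt goes to stderr, so it reaches the terminal even
# while POD=$(python3 -m select_pod) is capturing stdout.
sys.stderr.write('Pick pod: ')
sys.stderr.flush()
pod = sys.stdin.readline().strip()
print(pod)  # only this line ends up in $POD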

Save command output to file and see it on the terminal

I have a command that slowly outputs a list. I want this list to be saved to a file and also to watch it slowly being generated on the terminal.
python script.py 2>&1 | tee File.txt
This does not work for me. While the output is saved, I don't see the list of websites appearing on the terminal.
By default stdout is line buffered when going to a terminal, but uses a larger buffer when being redirected, hence tee and the terminal don't see the output until later.
For ways to stop script.py from buffering its output, see the answers to this question: Disable output buffering
For example if script.py is:
#!/usr/bin/python3
import time

for i in range(5):
    print('This is line', i, flush=True)
    time.sleep(1)
Running ./script.py | tee File.txt will print each line to the terminal as it is produced, one second apart.
If you remove flush=True then the entire output is buffered, and nothing appears until the script finishes about 5 seconds later, when everything is printed at once.
2>&1 redirects stderr to stdout, so you may need to apply the same buffering to stderr as well as stdout.
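If adding flush=True to every print call is awkward, a sketch of another option (requires Python 3.7+, where sys.stdout.reconfigure is available) is to make stdout line buffered once at the top of the script:
#!/usr/bin/python3
import sys
import time

# Make stdout line buffered even when it is a pipe (e.g. piped into tee).
sys.stdout.reconfigure(line_buffering=True)

for i in range(5):
    print('This is line', i)
    time.sleep(1)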
Per the Linux Documentation Project (TLDP),
2>&1
# Redirects stderr to stdout.
# Error messages get sent to same place as standard output.
And,
&>filename
# Redirect both stdout and stderr to file "filename."
So to pipe both stdout and stderr through tee and into the file,
Command 2>&1 | tee File.txt
Or just stdout,
Command | tee File.txt
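To check which redirection actually captures which stream, a tiny probe script helps (probe.py and its messages are made up for illustration):
# probe.py: writes one line to each stream.
import sys

print('this went to stdout')
print('this went to stderr', file=sys.stderr)
Running python3 probe.py 2>&1 | tee File.txt shows and saves both lines; python3 probe.py | tee File.txt still shows both on the terminal, but only the stdout line lands in File.txt.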

Freeze stdin when in the background, unfreeze it when in the foreground

I am trying to run a script in the background:
nohup script.py > out 2> err < /dev/null &
The script (Python 3.4) does at some point:
answer = input('? ')
(it has a menu running in one of the threads)
And the nohup call is crashing with:
EOFError: EOF when reading a line
Because of the /dev/null redirection of stdin I imagine. If I run it without stdin redirection:
nohup script.py > out 2> err &
It crashes with:
OSError: [Errno 9] Bad file descriptor
If I run it with:
script.py > out 2> err
It works, but blocks my terminal (it is in the foreground)
If I run it with:
script.py > out 2> err &
It runs in the background alright, but it gets stopped as soon as the input call is reached.
What I would like is:
be able to redirect stdout and stderr to the filesystem
be able to put the script in the background
be able to move it to the foreground and interact with the menu normally (so stdin must be enabled somehow). stdout and stderr would still be redirected to the filesystem, but stdin would behave normally.
the script must run fine in the background and in the foreground (of course, the menu is not working in the background, because stdin is "frozen")
Basically, what I would like is that when it is in the background, stdin is kind of "frozen", and whenever it comes to the foreground it works again normally.
Is this possible? The solution does not need to involve nohup.
Given what you want, and given how input works under Python (it fails with EOFError at end-of-file), an interactive menu means you cannot safely pass a file such as /dev/null as stdin when invoking your program. That means your only option is to invoke it like so:
$ script.py > out 2> err &
As a demonstration, this is my script:
from time import sleep
import sys

c = 0
while True:
    sleep(0.001)
    c += 1
    if c % 1000 == 0:
        print(c, flush=True)
    if c % 2000 == 0:
        print(c, file=sys.stderr, flush=True)
    if c % 10000 == 0:
        answer = input('? ')
        print('The answer is %s' % answer, flush=True)
Essentially, every second it writes to stdout, every two seconds it writes to stderr, and every ten seconds it waits for input. If I run this, wait a couple of seconds (to allow the output to be flushed to disk), and chain it together like so:
$ python script.py > out 2> err & sleep 2.5; cat out err
[1] 32123
1000
2000
2000
$
Wait at least 10 seconds and try cat out err again:
$ cat out err
1000
2000
3000
4000
5000
6000
7000
8000
9000
10000
? 2000
4000
6000
8000
10000
[1]+ Stopped python script.py > out 2> err
$
Note that the prompt generated by input is also written to stdout, and the program effectively kept running up to the point where it expects stdin to give it data. You simply have to bring the process back into the foreground with %, start feeding it the required data, then suspend it with ^Z (Ctrl-Z) and send it back to the background with %&. Example:
$ %
python script.py > out 2> err
Test input
^Z
[1]+ Stopped python script.py > out 2> err
$ %&
[1]+ python script.py > out 2> err &
$
Now cat out again, after waiting another ten seconds:
$ cat out
1000
...
10000
? The answer is Test input
11000
...
20000
?
[1]+ Stopped python script.py > out 2> err
$
This is essentially a basic crash course in how standard processes typically function in both the foreground and the background; things simply work as intended if the code handles standard I/O correctly.
Lastly, you can't really have it both ways. If the application expects stdin and none is provided, the clear outcome is failure. If stdin is provided but the application is sent to the background and keeps running, it will be Stopped as soon as it expects further input. If that stopped behaviour is unwanted, the application is at fault; nothing can be done short of changing the application so it does not raise an error on EOF when executed with /dev/null as its stdin. If you want to keep stdin as it is, with the application somehow able to keep running in the background, you cannot use the input function as is, because it blocks when stdin is empty (resulting in the process being stopped).
Now that you have clarified via the comment below that your "interactive prompt" runs in a thread, and since input reads directly from stdin and you seem unwilling to modify your program (you are asking for the general case) but expect a utility to do this for you, the simple solution is to run it inside a tmux or screen session. These fully implement a pseudo-tty independent of whichever console started them (so you can disconnect, send the session to the background, or start other virtual sessions; see their manual pages), which provides the stdio the program expects.
Finally, if you actually want your application to support this natively, you cannot simply use input as is. You should either check whether input can safely be called (perhaps by making use of select), or check whether the process is currently in the foreground or the background (How to detect if python script is being run as a background process is a starting point, though you might want to check via sys.stdin instead) to decide whether input can safely be called (even then, if the user suspends the task just as input arrives, it will still hang while input waits), or use Unix sockets for communication.
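As a rough illustration of the foreground check mentioned above, here is a sketch (POSIX only; in_foreground is a hypothetical helper, not something the question's script already has):
import os
import sys

def in_foreground():
    # We are in the foreground when our process group is the terminal's
    # foreground process group. This raises when there is no controlling
    # terminal, e.g. when stdin is /dev/null or a pipe.
    try:
        return os.getpgrp() == os.tcgetpgrp(sys.stdin.fileno())
    except (OSError, ValueError):
        return False

if in_foreground():
    answer = input('? ')
else:
    answer = None  # skip the prompt while running in the background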

getting python script to print to terminal without returning as part of stdout

I'm trying to write a Python script that returns a value which I can then pick up in a bash script. The thing is that I want a single value returned in bash, but I want a few things printed to the terminal along the way.
Here is an example script. Let's call it return5.py:
#! /usr/bin/env python
import sys

print "hi"
sys.stdout.write(str(5))
What I want is for it to behave this way when I run it from the command line:
~:five=`./return5.py`
hi
~:echo $five
5
but what I get is:
~:five=`./return5.py`
~:echo $five
hi 5
In other words, I don't know how to have a Python script print things along the way without them ending up in the value that gets assigned in bash.
Not sure why #yorodm suggests not to use stderr. That's the best option I can think of in this case.
Notice that print will add a newline automatically, but when you use sys.stderr.write, you need to include one yourself with a "\n".
#! /usr/bin/env python
import sys
sys.stderr.write("This is an important message,")
sys.stderr.write(" but I dont want it to be considered")
sys.stderr.write(" part of the output. \n")
sys.stderr.write("It will be printed to the screen.\n")
# The following will be output.
print 5
Using this script looks like this:
bash$ five=`./return5.py`
This is an important message, but I dont want it to be considered part of the output.
It will be printed to the screen.
bash$ echo $five
5
This works because the terminal is really showing you three streams of information : stdout, stdin and stderr. The `cmd` syntax says "capture the stdout from this process", but it doesn't affect what happens to stderr. This was designed exactly for the purpose you're using it for -- communicating information about errors, warnings or what's going on inside the process.
You may not have realized that stdin is also displayed in the terminal, because it's just what shows up when you type. But it wouldn't have to be that way. You could imagine typing into the terminal and having nothing show up. In fact, this is exactly what happens when you type in a password. You're still sending data to stdin, but the terminal is not displaying it.
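For completeness, the same stderr idea in Python 3 syntax (the snippets above use Python 2 print statements):
#!/usr/bin/env python3
# return5.py, Python 3 flavour: chatter goes to stderr, only the value goes
# to stdout, so five=$(./return5.py) captures just the 5.
import sys

print('hi', file=sys.stderr)
print(5)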
From my comment:
#!/usr/bin/env python
#foo.py
import sys
print "hi"
sys.exit(5)
then the output
[~] ./foo.py
hi
[~] FIVE=$?
[~] echo $FIVE
5
You can use stdout to output your messages and stderr to capture the value in bash. Unfortunately this is somewhat weird behaviour, since stderr is intended for programs to communicate error messages, so I strongly advise you against it.
OTOH you can always process your script's output in bash.

how can I silence all of the output from a particular python command?

Autodesk Maya 2012 provides "mayapy" - a modded build of python filled with the necessary packages to load Maya files and act as a headless 3D editor for batch work. I'm calling it from a bash script. If that script opens a scene file in it with cmds.file(filepath, open=True), it spews pages of warnings, errors, and other info I don't want. I want to turn all of that off only while the cmds.file command is running.
I've tried redirecting from inside of the Python commands I'm sending into mayapy inside the shell script, but that doesn't work. I can silence everything by redirecting stdout/err to /dev/null in the call to the bash script. Is there any way to silence it in the call to the shell, but still allow my passed-in command inside the script to print out information?
test.sh:
#!/bin/bash
/usr/autodesk/maya/bin/mayapy -c "
cmds.file('filepath', open=True);
print 'hello'
"
calling it:
$ ./test.sh # spews info, then prints 'hello'
$ ./test.sh > /dev/null 2>&1 # completely silent
Basically, I think the best way to solve this is to implement a wrapper that will execute test.sh and sanitize the output to the shell. To sanitize the output, I would simply prepend some string to notify your wrapper that this text is good for output. My inspiration for the wrapper file came from this: https://stackoverflow.com/a/4760274/2030274
The contents are as follows:
import subprocess

def runProcess(exe):
    p = subprocess.Popen(exe, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    while True:
        retcode = p.poll()  # returns None while the subprocess is running
        line = p.stdout.readline()
        yield line
        if retcode is not None:
            break

for line in runProcess(['./test.sh']):
    if line.startswith('GARYFIXLER:'):
        print line,
Now you could imagine test.sh being something along the lines of
#!/bin/bash
/usr/autodesk/maya/bin/mayapy -c "
cmds.file('filepath', open=True);
print 'GARYFIXLER:hello'
"
and this will only print the hello line. Since we are wrapping the Python call in a subprocess, all output that would normally be displayed in the shell gets captured, and the wrapper intercepts the lines you don't want.
Of course, to call test.sh from a python script, you need to make sure you have the correct permissions.
I knew I was just getting twisted around with pipes. Maya is indeed sending all of its batch output to stderr. This frees stdout entirely once you properly pipe stderr away. Here's an all-bash one-liner that works.
# load file in batch; divert Maya's output to /dev/null
# then print listing of things in file with cmds.ls()
/usr/autodesk/maya/bin/mayapy -c "import maya.standalone;maya.standalone.initialize(name='python');cmds.file('mayafile.ma', open=True);print cmds.ls()" 2>/dev/null
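If you ever need to silence only the cmds.file call from inside Python rather than in the shell, a hedged sketch is below. Unlike reassigning sys.stdout, it duplicates file descriptors 1 and 2 at the OS level, so it also catches output written from C code; call_silenced is a hypothetical helper, and the cmds.file usage is carried over from the question rather than tested against mayapy:
import os

def call_silenced(fn, *args, **kwargs):
    # Temporarily point file descriptors 1 (stdout) and 2 (stderr) at
    # /dev/null around a single call, then restore them.
    devnull = os.open(os.devnull, os.O_WRONLY)
    saved_out, saved_err = os.dup(1), os.dup(2)
    try:
        os.dup2(devnull, 1)
        os.dup2(devnull, 2)
        return fn(*args, **kwargs)
    finally:
        os.dup2(saved_out, 1)
        os.dup2(saved_err, 2)
        for fd in (devnull, saved_out, saved_err):
            os.close(fd)

# e.g. call_silenced(cmds.file, 'filepath', open=True)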
