Python redirect output after os._exit

I have a thread that is supposed to close the whole application, so I use os._exit(1). I also want to redirect my program's output to a file, but the output file ends up empty. Simple example:
import os
print('something')
os._exit(1)
Running program with:
python myprogram.py > output.txt
Is there any way to do this?
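A minimal sketch of one way around this: os._exit() terminates the process immediately, without the normal interpreter cleanup, so output still sitting in sys.stdout's buffer never reaches the redirected file. Flushing explicitly before calling os._exit() should preserve it (assuming the exiting thread has access to sys.stdout):
import os
import sys

print('something')
sys.stdout.flush()   # push buffered output to the redirected file before the hard exit
os._exit(1)          # exits immediately, skipping the cleanup that would normally flush for you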

Related

How to exit out of a Python Script while the script invoked by it still continues to run

I have two Python scripts. One of them does some processing on the data in an Excel file. The second script is used to update the data whenever the user updates the Excel file: after the user modifies the xlsx file, he/she can run the second script and it reloads the first one. The problem is that the first script, which does the actual processing, runs an infinite while loop, so when I execute it from the second script the second script never exits. I just want the second script to execute the first Python file once and then exit, while the first script keeps running indefinitely. Here is the code:
import notification
import importlib
import subprocess
import os
import sys

importlib.reload(notification)
if os.name == 'nt':
    subprocess.run(["python", "notification.py"], capture_output=True)
else:
    subprocess.run(["python3", "notification.py"], capture_output=True)
sys.exit(0)
Here the name of my first file is notification.py.
You need to use subprocess.Popen instead of subprocess.run, because run() waits for the child process to finish, while Popen() returns immediately and leaves it running.
Reference:
Python spawn off a child subprocess, detach, and exit
https://docs.python.org/3/library/subprocess.html#popen-objects
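A minimal sketch of that approach, assuming notification.py lives in the current directory and should simply be left running:
import os
import subprocess
import sys

# Popen starts the child and returns immediately (unlike run(), which waits),
# so this script can exit while notification.py keeps running.
python = "python" if os.name == 'nt' else "python3"
subprocess.Popen([python, "notification.py"])
sys.exit(0)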

Python subprocess.run C Program not working

I am trying to write the code to run a C executable using Python.
The C program can be run in the terminal just by calling ./myprogram, and it presents a selection menu, as shown below:
1. Login
2. Register
Now, using Python and subprocess, I write the following code:
import subprocess
subprocess.run(["./myprogram"])
The Python program runs but shows nothing (no errors either!). Any ideas why this is happening?
When I tried:
import subprocess
subprocess.run(["ls"])
All the files in that particular directory are listed, so I assume this part is right.
You have to open the subprocess like this:
import subprocess
cmd = subprocess.Popen(['./myprogram'], stdin=subprocess.PIPE, text=True)
This means that cmd will have a .stdin you can write to (text=True makes it accept str rather than bytes); print by default sends output to your Python script's stdout, which has no connection to the subprocess' stdin. So do this:
cmd.stdin.write('1\n') # tell myprogram to select 1
and then quite probably you should:
cmd.stdin.flush() # don't let your input sit in in-memory buffers
or
cmd.stdin.close() # if you're done with writing to the subprocess.
PS If your Python script is a long-running process on a *nix system and you notice your subprocess has ended but is still displayed as a Z (zombie) process, please check that answer.
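Putting those pieces together, a minimal sketch (assuming myprogram only needs the menu choice on stdin):
import subprocess

cmd = subprocess.Popen(['./myprogram'], stdin=subprocess.PIPE, text=True)
cmd.stdin.write('1\n')   # tell myprogram to select 1 (Login)
cmd.stdin.flush()        # make sure the input actually reaches the program
# ... write any further input the program asks for ...
cmd.stdin.close()        # signal EOF once you are done writing
cmd.wait()               # reap the child so it does not linger as a zombie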
Maybe flush stdout?
print("", flush=True,end="")

How do I terminate a shell program through python

Let's say I do something like this:
import os
os.system('java some_program.jar')
Is there a way to stop the execution of that program through python?
My situation:
I have a program in Java that does some work and inserts data into a .csv file. I need to run it through Python (because I'm using Python to handle the data in the .csv file), but the program doesn't stop by itself, so I need a way to stop it manually once it has inserted the data into the .csv file.
Don't use os.system.
Instead, use p = subprocess.Popen(...). Then simply call p.kill().
Also, your Java program should be updated to exit when it sees EOF on stdin.
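A minimal sketch, assuming the same command line as in the question and that you have some way of telling from the .csv file that the data has arrived:
import subprocess

p = subprocess.Popen(['java', 'some_program.jar'])
# ... wait until the .csv file contains the data you need ...
p.kill()    # or p.terminate() for a gentler shutdown
p.wait()    # reap the process so it does not linger as a zombie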
You could try having the Java program echo to the console (or similar) when it is finished writing to the CSV file, using the subprocess library and its check_output function. When that is done, use something like os.system("taskkill /im some_program.jar") to kill off the program.

Python - stdin - how to recognize the source of the input?

I am looking for a way to determine whether stdin input comes from another application via a pipe or not.
Let's say I have a program that either accepts input data from piped stdin (when you pipe it from another application - grep, tail, ...) or uses a default data file. I don't want the user to be prompted to type the data manually when nothing was piped to stdin.
My simple code example looks like this:
from sys import stdin

for line in stdin:
    print line
When I run the script using this:
echo "data" | python example.py
I get
data
and the script ends.
While if I run the script in the following way,
python example.py
it prompts user to fill in the input and it waits.
Therefore I am looking for something like the following to avoid the prompt when no data is piped.
from sys import stdin

if is_stdin_piped():
    for line in stdin:
        print line
else:
    print "default"
Is something like this possible? Thanks
If you use input redirection, the standard input is not connected to a terminal like it usually is. You can check if a file descriptor is connected to a terminal or not with the isatty function:
import os

def is_stdin_piped():
    return not os.isatty(0)
For extra safety use sys.stdin.fileno() instead of 0.
Update: To check if stdin is redirected from a file (rather than another program, such as an IDE or a shell pipeline) you can use fstat:
import os, stat

def is_stdin_a_file():
    status = os.fstat(0)
    return stat.S_ISREG(status.st_mode)
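For completeness, a Python 3 sketch of the skeleton from the question using the isatty check (with sys.stdin.fileno() instead of the hard-coded 0, as suggested above):
import os
import sys

def is_stdin_piped():
    # stdin not attached to a terminal means it was piped or redirected
    return not os.isatty(sys.stdin.fileno())

if is_stdin_piped():
    for line in sys.stdin:
        print(line, end='')
else:
    print("default")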

Start job from python, redirect output to file

I have a script, update_file, that I typically run like so:
sudo update_file (file) > ./logs/(file) &
I was wondering what the proper syntax is (and whether it is even possible) to call this script from within a Python script, still redirect the output of update_file to a file, and have it run as a background job.
EDIT: I should note, I run this against multiple (file)s so I would like to pass that as a variable.
import subprocess
subprocess.call(["sudo", "update_file", file], stdout=open("logs/" + file, "w"))
maybe?
First, the subprocess module is how you execute programs from Python. The section Replacing Older Functions with the subprocess Module in the documentation shows you how to transform typical shell functionality into Python.
Because you're using & to background the task, you'll want to create a Popen, and do the job-handling later. So:
import os
import subprocess

jobs = []
# ... each time you want to run it on a file ...
log = open(os.path.join('logs', file), 'w')
jobs.append((subprocess.Popen(['sudo', 'update_file', file], stdout=log), log))
# ... at exit time ...
for job, log in jobs:
    job.wait()
    log.close()
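Since the question mentions running this against multiple files, here is a hypothetical usage sketch of the same idea in a loop (the filenames list is made up for illustration):
import os
import subprocess

filenames = ['a.dat', 'b.dat']   # hypothetical list of files to process
jobs = []
for name in filenames:
    log = open(os.path.join('logs', name), 'w')
    jobs.append((subprocess.Popen(['sudo', 'update_file', name], stdout=log), log))

# at exit time, wait for every background job and close its log file
for job, log in jobs:
    job.wait()
    log.close()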
