I have a script that makes a request to an API, gets data, and creates a pandas DataFrame; if a certain condition is fulfilled, it sends another request to the API and prints the result.
A simplified version looks like this:
request = api_request(data)
table = json_normalize(request)
table = table[table['field'] > 1]
if not table.empty:
    var1, var2, var3 = table[['var1', 'var2', 'var3']].iloc[0]
    another_request = api_request2(var1, var2, var3)
    print var1, var2, var3
threading.Timer(1, main).start()
It all works fine, but when I run it as a process under supervisord, it stops logging and sending requests to the API after about 12 hours. It is clearly an output-buffering problem, because if I restart the process it starts working again.
I have already tried all the solutions for output buffering in Python that I could find:
sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)
sys.stdout.flush()
Running the script with the -u option
Running the script through stdbuf
My feeling is that it has something to do with supervisord's buffering rather than with the script itself, but I can't figure out how to turn off supervisord's buffer.
Can you advise me please?
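For reference, this is roughly what my supervisord program section looks like (the program name and paths are placeholders; environment=PYTHONUNBUFFERED=1 is one setting that is sometimes suggested for this):

; a minimal sketch -- program name and paths are placeholders
[program:myscript]
command=python -u /path/to/script.py
environment=PYTHONUNBUFFERED=1
redirect_stderr=true
stdout_logfile=/var/log/myscript.out.log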
I'm trying to make a Python script that clones one USB stick to another as a raw (ISO) image using dd if=/dev/sda of=/dev/sdb
Here's my problem:
I want to create a progress bar showing how much has been done.
I tried:
Looking at the used storage space on the second USB stick, but this doesn't work because the image copy also includes unused space
Adding status=progress to the dd command gives me progress output in the terminal, but I can't figure out how to access it from Python. I tried subprocess.Popen and subprocess.run with stdout=PIPE, with and without shell=True,
and reading process.stdout with .read(), .read(1), .readline() or communicate(). Nothing worked for me. (https://www.endpointdev.com/blog/2015/01/getting-realtime-output-using-python/)
I can see the progress going on in the Python shell, but the .read() call always gets stuck.
The part of the code I am concerned about:
import shlex
import subprocess
from subprocess import PIPE

comm = 'sudo dd if=/dev/sda of=/dev/sdb'
cloning = subprocess.Popen(shlex.split(comm), stdout=PIPE, text=True)
while True:
    print(cloning.stdout.read())
I want something that would work like:
while True:
    progress = cloning.stdout.read()
    update_bar(progress)
I'm using Python 3.7 on a Raspberry Pi.
Thanks for the help!
You were on the right track with status=progress, but dd writes it to stderr, not stdout. If you pass stderr=PIPE and then read from cloning.stderr instead of cloning.stdout, it will work.
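A minimal sketch of that fix, assuming the same dd command as in the question (update_bar is the question's placeholder; dd redraws its progress line with carriage returns, and text mode may translate those to newlines, so the reader treats both as line terminators):

import shlex
import subprocess

comm = 'sudo dd if=/dev/sda of=/dev/sdb status=progress'
cloning = subprocess.Popen(
    shlex.split(comm),
    stderr=subprocess.PIPE,  # status=progress writes here, not to stdout
    text=True,
)

buf = ''
while True:
    ch = cloning.stderr.read(1)                # one character at a time
    if ch == '' and cloning.poll() is not None:
        break                                  # dd exited and the pipe is drained
    if ch in ('\r', '\n'):                     # end of a redrawn progress line
        if buf:
            update_bar(buf)                    # hand the latest progress line to the bar
        buf = ''
    else:
        buf += ch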
I have been reading through the available questions regarding real-time processing of a subprocess in Python 3, though none of them address the exact issue I am experiencing. I am working on a script to parse relevant data and format the output of a Source server, specifically for Day of Infamy.
When the script starts, it launches the Day of Infamy server (./doi.sh), which prints server start-up information such as the map, playlist, BattlEye server data, etc. At a certain point the server script just hangs, waiting for an event such as a connection or a kill to take place. At that point the server logs the event to stdout, but my Python program stops outputting any data unless I press the Enter key. After pressing Enter, the most recent event data is printed.
What could be causing the need to press Enter for more output? Could something be timing out while the doi.sh server script is waiting for an event?
My current code for running and reading from the subprocess is:
import subprocess

cmd = './doi.sh'
data = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, bufsize=1)
while True:
    output = data.stdout.readline()
    if output == b'' and data.poll() is not None:
        break
    if output:
        output = output.decode('utf-8')
        print(output)
I have functions built to parse the info from the decoded string; these work on the lines output by the subprocess, but having to press Enter to update the console log defeats the purpose of the whole program.
This is almost certainly an issue with the buffering of stdout. Try replacing the command like this:
stdbuf -o0 ./doi.sh
This will force the stdout of doi.sh to be unbuffered. You will also need to set the keyword argument shell=True in subprocess.Popen(), since the command is now a single string.
Here is a good explanation of this: https://unix.stackexchange.com/questions/25372/turn-off-buffering-in-pipe
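A sketch of how that fits into the reading loop from the question (the loop is unchanged apart from the command):

import subprocess

cmd = 'stdbuf -o0 ./doi.sh'
data = subprocess.Popen(cmd, shell=True,   # single command string, hence shell=True
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
while True:
    output = data.stdout.readline()        # now returns as soon as doi.sh prints a line
    if output == b'' and data.poll() is not None:
        break
    if output:
        print(output.decode('utf-8'))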
I'm new to Fabric and want to run a long-running script on a remote computer. So far, I have been using something like this:
import fabric
c = fabric.Connection("192.168.8.16") # blocking
result = c.run("long-running-script-outputing-state-information-into-stdout.py")
Is there a way to read stdout as it arrives, asynchronously, instead of using the result object, which is only available after the command has finished?
If you want to use Fabric to do things remotely, you first of all have to follow this structure to make a connection:
from fabric import task, Connection

@task(hosts=["servername"])
def do_things(c):
    with Connection(host=host, user=user) as c:
        c.run("long-running-script-outputing-state-information-into-stdout.py")
This will print the whole output, regardless of what you are doing!
You have to use with Connection(host=host, user=user) as c: to ensure that everything you run runs within that connection context.
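If the goal is also to react to the output as it arrives rather than just see it echoed, one possibility (a sketch, not guaranteed for every Fabric version) relies on the fact that Connection.run() is built on invoke, whose run() accepts an out_stream argument; any file-like object then receives the remote stdout incrementally:

import io
from fabric import Connection

class Progress(io.StringIO):
    # invoke calls write() with each chunk of remote stdout as it arrives
    def write(self, data):
        print('chunk:', data, end='')   # replace with real handling
        return super().write(data)

with Connection('192.168.8.16') as c:
    c.run('long-running-script-outputing-state-information-into-stdout.py',
          out_stream=Progress())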
First of all, I don't know the best way to achieve the functionality I would like to have in the end.
My code will do the following:
@celery.task
def updateServerByID(serverID):
    # run the update command
    os.system("sample command to update server by id ...")
    # check whether the console output contains "Success!"; if yes, end the job with a return statement
    # return
These are the two ways I can think of to get this working:
Redirecting the output of the console command to a file, and using Python to "monitor" this file for changes and read its content each time it changes
Checking the output of the console command for "Success!"
All in all I think way 2 would be the most efficient, but how do I read the whole console output in Python? And is there any way to prevent the Celery task itself from printing this content?
This has nothing to do with Celery; it is really a question of how to get the output of os.system.
Just capture the output inside celery.task.updateServerByID, as described in "Python: How to get stdout after running os.system?"
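A minimal sketch of option 2 using subprocess.run instead of os.system ("sample-update-command" is a placeholder, and the @celery.task decorator is taken from the question):

import subprocess

@celery.task
def updateServerByID(serverID):
    result = subprocess.run(
        ['sample-update-command', str(serverID)],
        capture_output=True,   # collect stdout/stderr instead of printing them
        text=True,
    )
    if 'Success!' in result.stdout:
        return                 # update finished successfully

Because the output is captured rather than inherited, the task itself no longer prints the command's output.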
Hello, I'm really new to the Python programming language and I have encountered a problem while writing a script. I want to save the stdout output that I obtain when I run a tcpdump command into a variable in the script, but I want the tcpdump command to run continuously, because I want to gather the length of every packet that passes the filter I wrote.
I tried :
fin, fout = os.popen4(comand)
result = fout.read()
return result
But it just hangs.
I'm guessing that it hangs because fout.read() doesn't return until the child process exits and closes the pipe, and tcpdump runs forever. You should be using subprocess.Popen instead.
import subprocess
import shlex  # just so you don't need to break "comand" into a list yourself ;)

p = subprocess.Popen(shlex.split(comand), stdout=subprocess.PIPE)
first_line_of_output = p.stdout.readline()
second_line_of_output = p.stdout.readline()
...
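To keep gathering packet lengths while tcpdump runs, something along these lines might work (the filter is a placeholder, and the "length <n>" parsing assumes tcpdump's usual line format; -l makes tcpdump line-buffer its output):

import shlex
import subprocess

comand = 'tcpdump -l -n tcp'   # placeholder filter
p = subprocess.Popen(shlex.split(comand), stdout=subprocess.PIPE, text=True)

total = 0
for line in p.stdout:          # yields each line as tcpdump prints it
    if ' length ' in line:
        # a typical line ends with '..., length <n>'
        total += int(line.rsplit(' length ', 1)[1].split()[0].rstrip(':'))
        print(total)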