Print output from Python to a C# application

I have made a C# app which calls a Python script.
The C# app uses a Process object to launch the Python script, and I have also redirected the subprocess's standard output so I can process the output from the Python script.
But here is the problem:
the output (via the print function) from Python always arrives all at once, when the script terminates.
I want the output to arrive in real time while the script is running.
I have tried almost every method I could find on Google: adding sys.stdout.flush() calls, redirecting stdout in Python, event-driven message receiving in C#, polling for messages in a while loop, and so on.
What puzzles me is that IDEs like PyCharm run Python scripts internally and still show the output line by line, without hacking the original script. How do they do that?
The Python version is 2.7.
Hope someone can advise.
Thank you!

In the end I used a crude but working method to resolve it:
a thread that periodically flushes sys.stdout. The code looks like this:
import sys
import threading
import time

run_thread = False

def flush_print():
    while run_thread:
        # print 'something'
        sys.stdout.flush()
        time.sleep(1)
And in the main block:

if __name__ == '__main__':
    thread = threading.Thread(target=flush_print)
    run_thread = True
    thread.start()
    # my big function with its prints runs here; it blocks until completed
    run_thread = False
    thread.join()
Obviously this is ugly, but I haven't found a better way to get the job done.
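A cleaner alternative is to make stdout unbuffered instead of flushing it from a thread. This is only a sketch of the idea for Python 2.7: either have the C# side start the interpreter with the -u flag (python -u script.py), or reopen stdout unbuffered at the top of the script:

import os
import sys

# Reopen stdout with buffer size 0 so every print reaches the
# redirected pipe immediately (Python 2.x; on Python 3 use the
# -u flag or print(..., flush=True) instead).
sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)

print 'this line should arrive in the C# app right away'

This is most likely also the answer to the IDE question: stdout is block-buffered only because it goes to a pipe rather than a terminal, and IDEs typically run scripts unbuffered (or through a pseudo-terminal), so prints appear line by line without modifying the script.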

Can't get python subprocess.Popen() to start another python script in the background

I am in a bit of a pickle here. I have a Python script (gather.py) that gathers information from an .xml file and uploads it into a database in an infinite loop that sleeps for 60 seconds; all of this is local, by the way. I am using Flask to run a webpage that will later pull information from the database, but at the moment all it does is display a sample page (main.py). I want main.py to start gather.py as a background process that won't prevent Flask from starting. I tried importing gather.py, but it blocks indefinitely and Flask never starts.

After Googling for a while, it seems the recommended option is a task queue (Celery) and a message broker (RabbitMQ) to take care of this. That would be fine if the application had to do a lot of work in the background, but I only need it to do one or two things. So I dug further and found posts stating that subprocess.Popen() could do the job. I tried it and I don't think it failed, since it raised no errors, but the database stays empty. I have confirmed that both gather.py and main.py work independently. I tried running the following code in IDLE:
subprocess.Popen([sys.executable, 'path\to\gather.py'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
and got this in return:
<subprocess.Popen object at 0x049A1CF0>
Now, I don't know what this means. I tried using .value and .attrib, but understandably I get this:
AttributeError: 'Popen' object has no attribute 'value'
and
AttributeError: 'Popen' object has no attribute 'attrib'
Then I read in a StackOverflow post that stdout=subprocess.PIPE could cause the program to hang, so, in a 'just in case' moment, I ran:
subprocess.Popen([sys.executable, 'path\to\gather.py'], stdout=subprocess.DEVNULL, stderr=subprocess.STDOUT)
and got this in return:
<subprocess.Popen object at 0x034A77D0>
Throughout all this, the database tables have remained empty. I am new to the subprocess module, but all of this seems to check out, and I can't figure out why it isn't running gather.py. Is it because gather.py has an infinite loop? If there is a better option, please let me know.
Python version: 3.4.4
PS. I don't know if it matters, but I am running a portable version of Python (PortableApps) on a Windows 10 PC. This is why I included sys.executable inside subprocess.Popen().
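Before reaching for bigger tools, one way to see what the child process is actually doing (a diagnostic sketch; the path is the placeholder from the question, written as a raw string because the '\t' in 'path\to' would otherwise be interpreted as a tab character):

import subprocess
import sys

proc = subprocess.Popen(
    [sys.executable, r'path\to\gather.py'],
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
try:
    # If gather.py is alive and looping, this raises TimeoutExpired;
    # if it crashed on startup, we get its traceback and return code.
    out, _ = proc.communicate(timeout=10)
    print('child exited with', proc.returncode)
    print(out.decode(errors='replace'))
except subprocess.TimeoutExpired:
    print('gather.py is still running')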
Solution 1 (all in the Python script):
Use a Thread and a Queue.
I do this:
from flask import Flask
from queue import Queue  # on Python 2 this module is named Queue
from threading import Thread
import time

def task(q):
    q.put(0)
    t = time.time()
    while True:
        time.sleep(1)
        q.put(time.time() - t)

queue = Queue()
worker = Thread(target=task, args=(queue,))
worker.start()

app = Flask(__name__)

@app.route('/')
def message_from_queue():
    msg = "Running: calculated %f seconds" % queue.get()
    return msg

if __name__ == '__main__':
    app.run(host='0.0.0.0')
If you run this code, each access to '/' gets a value calculated by task in the background. You may need to block until the task puts a value, but there isn't enough information in the question to tell. Of course, you will need to refactor your gather.py to accept a queue like this.
Solution 2 (using a system script):
For Windows, create a .bat file and run both scripts from there:
@echo off
start python "path\to\gather.py"
set FLASK_APP=app.py
flask run
This will run gather.py and then start the Flask server. If you use start /min python "path\to\gather.py", gather.py will run in a minimized window.
Note that subprocess.Popen will not launch a .py file directly on Windows, because the script is a plain file rather than an executable; Popen can only start actual executables, so you have to pass the interpreter explicitly (as the question already does with sys.executable).
You can use:
os.system('python_file_path.py')
but it won't be a background process; os.system blocks until the command finishes (unless the script detaches itself).
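For completeness, a minimal sketch of the Popen approach that avoids the two most likely failure modes above (the backslash-escape problem in the path and an unread pipe); the absolute-path handling is an assumption about the project layout:

import os
import subprocess
import sys

# Build an absolute path so it works no matter where main.py is
# started from, and avoid backslash escapes like '\t' entirely.
script = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                      'gather.py')

# DEVNULL means the child can never block on a full, unread pipe.
subprocess.Popen([sys.executable, script],
                 stdout=subprocess.DEVNULL,
                 stderr=subprocess.STDOUT)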

How to display two different outputs in python console

Is there a way to split the output console?
I would like to display one section on top (the main program) and have the bottom part display, for example, a progress bar (excuse my horrible design skills).
Any ideas will be greatly appreciated :)
If there is one Python app producing the output: use the curses library, as @Rawing suggested: https://docs.python.org/3.5/howto/curses.html. It is part of the standard library and ready at hand.
If there are more apps producing output, there are several ways to do it. First, you can use byobu or the like and have a split terminal with the outputs of the different apps visible on the same screen. Second, you can have a broadcaster app that collects data from worker apps (or threads) and displays it with curses (see above).
Alternatively, you can dump the data to a file and then use the Linux watch command to show its contents at regular intervals:
watch cat file
There are lots of other methods too.
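To make the curses suggestion above concrete, here is a minimal sketch with a scrolling log on top and a progress bar pinned to the bottom; the window sizes and the fake work loop are made up for illustration:

import curses
import time

def main(stdscr):
    curses.curs_set(0)  # hide the cursor
    height, width = stdscr.getmaxyx()

    # Top window scrolls like a normal console; bottom 3 rows hold the bar.
    log_win = curses.newwin(height - 3, width, 0, 0)
    bar_win = curses.newwin(3, width, height - 3, 0)
    log_win.scrollok(True)

    for i in range(101):
        log_win.addstr("working on step %d\n" % i)
        log_win.refresh()

        bar_win.erase()
        bar_win.border()
        bar_win.addstr(1, 2, "#" * ((width - 4) * i // 100))
        bar_win.refresh()
        time.sleep(0.05)

curses.wrapper(main)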
If you need two or more consoles for the output of your Python script, and you are on Windows, you can use the win32console module to open a second console for your thread or subprocess output.
Here is a sample:
import win32console
import multiprocessing

def subprocess(queue):
    win32console.FreeConsole()   # frees the subprocess from using the main console
    win32console.AllocConsole()  # creates a new console; all I/O of the subprocess goes there
    while True:
        # prints any output produced by the main script, passed in via the queue
        print(queue.get())

if __name__ == "__main__":
    queue = multiprocessing.Queue()
    multiprocessing.Process(target=subprocess, args=[queue]).start()
    while True:
        print("Hello World in main console")
        # send a string to the subprocess; it prints it in its own console
        queue.put("Hello work in sub process console")
        # ...and whatever else you want to do in your main process
You can also do this with threading, but then you have to use the queue module for the queue functionality, since the threading module doesn't provide one.
Here is the win32console module documentation

Get a return value from a daemon process in Python

I have written a python daemon process that can be started and stopped using the following commands
/usr/local/bin/daemon.py start
/usr/local/bin/daemon.py stop
I can achieve the same results by calling these commands from a python script
os.system('/usr/local/bin/daemon.py start')
os.system('/usr/local/bin/daemon.py stop')
this works perfectly fine, but now I wish to add functionality to the daemon process such that when I run the command
os.system('/usr/local/bin/daemon.py foo')
the daemon returns a Python object. So, something like :
foobar = os.system('/usr/local/bin/daemon.py foo')
just to be clear, I have all the logic ready in the daemon to return a Python object, I just can't figure out how to pass this object to the calling python script. Is there some way?
Don't you mean that you want to implement simple serialization and deserialization?
In that case I'd propose looking at pickle (https://docs.python.org/2/library/pickle.html) to transform your data into a generic text format on the daemon side and transform it back into Python objects on the client side.
I think marshalling is what you need: https://docs.python.org/2.7/library/marshal.html and https://docs.python.org/2/library/pickle.html
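A minimal sketch of the pickle route for Python 2.7 (the 'foo' handler and the result object are assumptions): the daemon writes the pickled object to stdout, and the caller captures it with subprocess instead of os.system, since os.system only returns the exit status:

# daemon side, inside the handler for the 'foo' command
import pickle
import sys

result = {'status': 'ok', 'items': [1, 2, 3]}  # whatever object you built
sys.stdout.write(pickle.dumps(result))

# caller side
import pickle
import subprocess

raw = subprocess.check_output(['/usr/local/bin/daemon.py', 'foo'])
foobar = pickle.loads(raw)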

How do I start a subprocess in python and not wait for it to return

I'm building a site in Django that interfaces with a large program written in R, and I would like a button on the site that runs the R program. I have that working using subprocess.call(), but, as expected, the server does not continue rendering the view until subprocess.call() returns. As this program could take several hours to run, that's not really an option.
Is there any way to run the R program and keep executing the Python code?
I've searched around and looked into subprocess.Popen(), but I couldn't get that to work.
Here's the generic code I'm using in the view:
if 'button' in request.POST:
    subprocess.call('R CMD BATCH /path/to/script.R', shell=True)
    return HttpResponseRedirect('')
Hopefully I've just overlooked something simple.
Thank you.
subprocess.Popen(['R', 'CMD', 'BATCH', '/path/to/script.R'])
The process will be started asynchronously.
Example:
$ cat 1.py
import time
import subprocess
print time.time()
subprocess.Popen(['sleep', '1000'])
print time.time()
$ python 1.py
1340698384.08
1340698384.08
Note that the child process will keep running even after the main process exits.
You can also use a wrapper around subprocess.call(); the wrapper would have its own thread, within which it makes the blocking subprocess.call().
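A minimal sketch of that wrapper, reusing the view code from the question (the view name is hypothetical; everything else is standard Django):

import subprocess
import threading

from django.http import HttpResponseRedirect

def run_r_script():
    # blocks for hours, but only inside this worker thread
    subprocess.call('R CMD BATCH /path/to/script.R', shell=True)

def my_view(request):  # hypothetical view name
    if 'button' in request.POST:
        threading.Thread(target=run_r_script).start()
        return HttpResponseRedirect('')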

Unable to open a Python subprocess in Web2py (SIGABRT)

I've got an Apache2/web2py server running using the wsgi handler functionality. Within one of the controllers, I am trying to run an external executable to perform some processing on 2 files.
My approach to this is to use the subprocess module to kick off the executable. I have simplified the code to a bare-bones implementation with little success.
from subprocess import *
p = Popen(("echo", "Hello"), shell=False)
ret = p.wait()
print "Process ended with status %s" % ret
When running the above code on its own (create new file and running via python command line), it works exactly as expected.
However, as soon as I place the exact same code into my web2py controller, the external process stops working. Instead of the process returning with code 0 as is expected in the above example, it always returns -6 and "Hello" is not printed to stdout.
After doing some digging, I found that a negative result from p.wait() implies that a signal caused the process to end abnormally. And, according to some docs I found, -6 corresponds to the SIGABRT signal.
I would have expected this signal to be a result of some poorly executed code in my child process. However, since this is only running echo (and since it works outside of web2py) I have my doubts that the child process is signalling itself.
Is there some web2py limitation/configuration that causes Popen() requests to always fail? If so, how can I modify my logic so that the controller (or whatever) is actually able to spawn this external process?
** EDIT: It looks like web2py applications may not play well with the subprocess module. According to a reply in the web2py email group:
"You should not use subprocess in a web2py application (if you really need too, look into the admin/controllers/shell.py) but you can use it in a web2py program running from shell (web2py.py -R myprogram.py)."
I will be checking out some options based on the note here and see if any solution presents itself.
In the end, the best I was able to come up with involved setting up a simple XML RPC server and call the functions from that:
my_server.py
#my_server.py
from SimpleXMLRPCServer import SimpleXMLRPCServer, SimpleXMLRPCRequestHandler
from subprocess import Popen

def echo_fn():
    p = Popen(("echo", "hello"), shell=False)
    ret = p.wait()
    print "Process ended with status %s" % ret
    return True  # the RPC server doesn't like to return None

def main():
    server = SimpleXMLRPCServer(("localhost", 12345), SimpleXMLRPCRequestHandler)
    server.register_function(echo_fn, "echo_fn")
    while True:
        server.handle_request()

if __name__ == "__main__":
    main()
web2py_controller.py
#web2py_controller.py
import xmlrpclib

def run_echo():
    proc_srvr = xmlrpclib.ServerProxy("http://localhost:12345")
    proc_srvr.echo_fn()
I'll be honest: I'm not a Python or SimpleXMLRPCServer guru, so the overall code may not be up to best-practice standards. However, going this route did allow me, in effect, to call a subprocess from a controller in web2py.
(Note: this is a quick-and-dirty simplification of the code in my project. I have not validated that it is in a working state, so it may require some tweaks.)
