How to disable print statements conveniently so that pythonw can run? - python

I have a Python script (.py) which runs normally in the console. When I try running it with pythonw by changing the extension to .pyw, it stops working. My guess, based on search results, is that the print statements are causing pythonw to die.
Is there a convenient way to disable the print statements so that pythonw can run? The script has too many of them to change one by one, particularly if I want to switch back to running it under normal python.
Thanks.

Create your own print function, say out, and change all prints to out. The out function can then be changed in one place to write to the console, to a file, or to nothing at all.
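A minimal sketch of that idea; the name out and the ENABLED flag are illustrative choices, not from the original answer:

```python
# A drop-in replacement for print() that can be silenced in one place.
# The name 'out' and the ENABLED flag are illustrative, not prescribed.
ENABLED = True   # set to False when running under pythonw

def out(*args, **kwargs):
    """Forward to print() only while output is enabled."""
    if ENABLED:
        print(*args, **kwargs)

out('visible message')   # printed only while ENABLED is True
```

A project-wide search-and-replace of print with out then gives you a single switch instead of hundreds of edits.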

I found an answer in another StackOverflow thread that worked well for me. The credit is not mine; if you want to upvote, follow the link below and upvote there.
Can I get the output of "print" statement in pythonw?
You can globally redirect stdout by assigning to sys.stdout:
import sys
sys.stdout = open("mylog.txt", "w")
Then the rest of your program's stdout, including print statements, will go to mylog.txt.

Just redirect all output to /dev/null. Python has a platform-agnostic way to do this.
import os
import sys
# Only redirect for pythonw, not regular python.
if os.path.splitext(os.path.basename(sys.executable))[0] == 'pythonw':
    sys.stdout = open(os.devnull, 'w')
    sys.stderr = open(os.devnull, 'w')

Related

Blocking sys.stdout and stderr does not prevent C code from printing

I am including in my Python code a function compiled in C via a Cython wrapper. I have to take that function as given and cannot change it. Unfortunately, when I run that function, I see output that is bothering me.
I have tried a lot of tricks that are supposed to get rid of it, all of which play with sys.stdout or sys.stderr -- most notably, the new contextlib.redirect_stdout. However, nothing I tried managed to silence the output.
At the most basic level, I simply tried setting
sys.stdout = open(os.devnull, 'w')
sys.stderr = open(os.devnull, 'w')
This is not a safe or practical way of doing it, but it should shut the function up. Unfortunately, I can still see the output. What am I missing? Is there perhaps another "output type" besides stdout that this function might be using?
If it helps, I am inside a Pycharm debugging session and see this output in my debugging console.
Updated question to reflect that changing stderr did not help
A C function prints to a file descriptor (1 for stdout, 2 for stderr). If you want to prevent the printing, redirect that FD; this can also be done temporarily. Here is a little demo:
import os
STDOUT = 1
saved_fd = os.dup(STDOUT)
null_fd = os.open(os.devnull, os.O_WRONLY)
os.dup2(null_fd, STDOUT)
os.system('echo TEST 1') # redirected to /dev/null
os.dup2(saved_fd, STDOUT)
os.system('echo TEST 2') # normal
# note: close the null_fd, saved_fd when no longer needed
If the C code opens the terminal device itself, there is very little you can do to prevent it. But that would be very unusual (I would even say a bug) unless there is a specific reason to do so.
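The dup2 dance from the demo can be wrapped in a context manager so the descriptor is restored even if an exception occurs. A sketch under that assumption; the helper name suppress_fd is mine, not from the answer:

```python
import os
from contextlib import contextmanager

@contextmanager
def suppress_fd(fd=1, target=os.devnull):
    # suppress_fd is a hypothetical helper; fd 1 is stdout, fd 2 is stderr.
    saved = os.dup(fd)                            # remember where fd pointed
    repl = os.open(target, os.O_WRONLY | os.O_CREAT)
    try:
        os.dup2(repl, fd)                         # fd now points at target
        yield
    finally:
        os.dup2(saved, fd)                        # restore the original fd
        os.close(repl)
        os.close(saved)

# Anything written to fd 1 inside the block is discarded, C output included.
with suppress_fd():
    os.system('echo this line is hidden')
```

Because it operates on the raw descriptor rather than sys.stdout, this also silences output from C extensions, which contextlib.redirect_stdout cannot do.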
Is there perhaps another "output type" besides stdout that this function might be using?
Yes, there is stderr, which is unaffected by a stdout redirect. As a simple example, let printer.py contain:
import sys
sys.stderr.write("printing to stderr")
Then running in a terminal
python printer.py > output.txt
still displays
printing to stderr
because > output.txt redirects only stdout.

Python subprocess.run C Program not working

I am trying to write code to run a C executable using Python.
The C program can be run in the terminal just by calling ./myprogram and it will prompt a selection menu, as shown below:
1. Login
2. Register
Now, using Python and subprocess, I write the following code:
import subprocess
subprocess.run(["./myprogram"])
The Python program runs but shows nothing (no errors either!). Any ideas why this is happening?
When I tried:
import subprocess
subprocess.run(["ls"])
All the files in that directory are shown, so I assume the setup is right.
You have to open the subprocess like this:
import subprocess
# text=True makes the pipe accept str; without it, write() needs bytes
cmd = subprocess.Popen(['./myprogram'], stdin=subprocess.PIPE, text=True)
This means that cmd will have a .stdin you can write to; print by default sends output to your Python script's stdout, which has no connection with the subprocess' stdin. So do that:
cmd.stdin.write('1\n') # tell myprogram to select 1
and then quite probably you should:
cmd.stdin.flush() # don't let your input stay in in-memory-buffers
or
cmd.stdin.close() # if you're done with writing to the subprocess.
PS If your Python script is a long-running process on a *nix system and you notice your subprocess has ended but is still displayed as a Z (zombie) process, please check that answer.
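For a one-shot interaction, subprocess.run() can feed the menu choice without managing the pipe by hand. A sketch of that approach; 'cat', which just echoes its stdin, stands in for ./myprogram so the snippet is runnable:

```python
import subprocess

# 'cat' echoes its stdin back; it stands in here for ['./myprogram'].
result = subprocess.run(
    ['cat'],              # replace with ['./myprogram']
    input='1\n',          # the line the program reads as its menu choice
    capture_output=True,  # collect stdout/stderr instead of inheriting them
    text=True,            # exchange str rather than bytes
)
print(result.stdout, end='')
```

run() handles writing, flushing, closing, and waiting in one call, which avoids the zombie-process issue mentioned above.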
Maybe flush stdout?
print("", flush=True, end="")

Using subprocess to call R from Python, want to keep STDOUT and ignore STDERR

So this code in Python that I have currently works in returning my STDOUT in the variable "run":
run = subprocess.check_output(['Rscript','runData.R',meth,expr,norm])
But it still prints to the screen all this ugly text from having to install a package in R, etc. That output goes to STDERR, and I would like it to be ignored. Is there any way to do this? This is what I'm currently working on, but it doesn't seem to work. Again, I just want it to ignore everything it prints to the screen except the results: keep STDOUT, ignore STDERR. Thank you!
run = subprocess.Popen(['Rscript','runData.R',meth,expr,norm],shell=False, stdout=subprocess.PIPE,stderr=devnull)
To avoid piping stderr entirely you may redirect it to os.devnull:
os.devnull
The file path of the null device. For example: '/dev/null' for POSIX, 'nul' for Windows. Also available via os.path.
import os
import subprocess
with open(os.devnull, 'w') as devnull:
    subprocess.Popen([cmd, arg], stdout=subprocess.PIPE, stderr=devnull)
Note that os.devnull must be opened for writing ('w'); the default mode is read-only.
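On Python 3.3+, subprocess.DEVNULL does the same without opening the file yourself. A sketch, with a small shell command standing in for the Rscript invocation:

```python
import subprocess

# The sh command writes to both streams; it stands in for the Rscript call.
result = subprocess.run(
    ['sh', '-c', 'echo result; echo noise >&2'],
    stdout=subprocess.PIPE,     # keep stdout
    stderr=subprocess.DEVNULL,  # discard stderr entirely
    text=True,
)
print(result.stdout, end='')
```

Unlike piping stderr and never reading it, DEVNULL cannot deadlock when the child produces a lot of error output.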
I actually solved my problem as soon as I posted this! My apologies! This is how it worked:
output = subprocess.Popen(['Rscript', 'runData.R', meth, expr, norm],
                          shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
final = output.stdout.read()
This ignored the messy stuff from the command line and saved my results into final.
Thank you for everyone's quick replies!

What's the best way to handle output redirection?

I want my program to write to stdout by default, but to offer the option of writing to a file instead. Should I create my own print function that tests whether an output file was given, or is there a better way? That seems inefficient to me, since every way I can think of adds an if test to every print call. I know this probably doesn't matter in the long run, at least for this script, but I'm just trying to learn good habits.
Just write to standard out using print. If the user wants to redirect the output to a file they can do that:
python foo.py > output.txt
Write to a file object, and when the program starts either have that object point to sys.stdout or to a file specified by the user.
Mark Byers' answer is more unix-like, where most command line tools just use stdin and stdout and let the user do redirection as they see fit.
No, you don't need to create a separate print function. In Python 2.6 you have this syntax:
# suppose f is an open file
print >> f, "hello"
# sys.stdout is a file object too
print >> sys.stdout, "hello"
In Python 3.x:
print("hello", file=f)
# or
print("hello", file=sys.stdout)
So you really don't have to differentiate files and stdout. They are the same.
A toy example, which outputs "hello" the way you want:
#!/usr/bin/env python3
import sys

def produce_output(fobj):
    print("hello", file=fobj)
    # this could also be:
    # fobj.write("hello\n")

if __name__ == "__main__":
    if len(sys.argv) > 2:
        print("Too many arguments", file=sys.stderr)
        sys.exit(1)
    f = open(sys.argv[1], "a") if len(sys.argv) == 2 else sys.stdout
    produce_output(f)
Note that the printing procedure is abstracted away from whether it is working with stdout or a file.
I recommend using the logging module and logging.handlers... streams, output files, etc.
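A minimal sketch of that suggestion, choosing the handler once at setup time rather than testing on every print; the function name make_logger is illustrative:

```python
import logging
import sys

def make_logger(name, logfile=None):
    # make_logger is a hypothetical helper: stdout by default, a file
    # when a path is supplied; the decision happens once, not per message.
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    handler = (logging.FileHandler(logfile) if logfile
               else logging.StreamHandler(sys.stdout))
    handler.setFormatter(logging.Formatter('%(message)s'))
    logger.addHandler(handler)
    return logger

log = make_logger('console')
log.info('hello')            # goes to stdout
```

The per-call if test the questioner worried about disappears: the destination is baked into the handler when the logger is built.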
If you are using the subprocess module, then based on an option taken from your command line, you can point the stdout option at an open file object. This way, you can redirect to a file from within the program:
import subprocess

with open('somefile', 'w') as f:
    proc = subprocess.Popen(['myprog'], stdout=f, stderr=subprocess.PIPE)
    out, err = proc.communicate()
print('output redirected to somefile')
My reaction would be to output to a temp file, then either dump that to stdout or move it to where they requested.

How do you make Python wait so that you can read the output?

I've always been a heavy user of Notepad2, as it is fast, feature-rich, and supports syntax highlighting. Recently I've been using it for Python.
My problem: when I finish editing some Python source code and launch it, the window disappears before I can read the output. Is there any way to make the results wait so that I can read them, short of using an input() or time-delay function? Otherwise I'd have to use IDLE, which keeps the output on screen for you.
(My apologies if this question is a silly one, but I'm very new at Python and programming in general.)
If you don't want to use raw_input() or input() you could log your output (stdout, stderr) to a file or files.
You could either use the logging module, or just redirect sys.stdout and sys.stderr.
I would suggest using a combination of the logging and traceback modules if you want to log errors with their stack trace.
Something like this maybe:
import logging, traceback

logging.basicConfig(filename=r'C:\Temp\log.txt', level=logging.DEBUG)
try:
    # do some stuff
    logging.debug('I did some stuff!')
except SomeException:
    logging.error(traceback.format_exc())
Here's an example of redirecting stdout and stderr:
import sys

if __name__ == '__main__':
    save_out = sys.stdout  # save the original stdout so you can put it back later
    out_file = open(r'C:\Temp\out.txt', 'w')
    sys.stdout = out_file
    save_err = sys.stderr
    err_file = open(r'C:\Temp\err.txt', 'w')
    sys.stderr = err_file
    main()  # call your main function
    sys.stdout = save_out  # set stdout back to its original object
    sys.stderr = save_err
    out_file.close()
    err_file.close()
I'm going to point out that this is not the easiest or most straightforward way to go.
This is a "problem" with Notepad2, not Python itself.
Unless you want to use input()/sleep (or any other blocking function) in your scripts, I think you have to turn to the settings in Notepad2 and see what that has to offer.
You could start it in a command window, e.g.:
c:\tmp\python>main.py
Adding raw_input() (or input() in Python 3) at the end of your script will freeze it until Enter is pressed, but it's not a good thing to do.
You can add a call to raw_input() to the end of your script in order to make it wait until you press Enter.
