I upgraded my Ubuntu installation from 18.04 to 22.04 yesterday. Now I notice that my Python virtual environment is no longer working as expected.
I use Python to run a lot of tools, so I depend heavily on the subprocess library.
However, to my "horror" I noticed that its behaviour changed quite a lot, even though I keep using Python 3.8. Mostly I notice I can no longer interact with the subprocess, and its output is no longer piped to the shell that executes the Python script.
import subprocess

def main():
    proc = subprocess.run(["echo", "test"], check=True, stdout=subprocess.PIPE)
    print('-------')
    print(proc.stdout)
    print("FINISH")

if __name__ == "__main__":
    main()
If I call this with python3.8 test.py, I notice that the output isn't displayed in the shell while the command runs. It is only displayed when the print calls happen.
What changed, and how do I fix this so the output of the subprocess is piped to stdout and can be seen as it is produced?
Especially since a lot of the tools just run Docker containers (which in turn run git/JavaScript programs), and having output while the process is busy is kind of useful.
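A hedged sketch of the kind of live streaming being asked about (this is an illustration, not part of the original question; the echo command is just a stand-in for the real tools):

```python
import subprocess

# Sketch: stream a child's output line by line while also keeping a copy.
# The ["echo", "test"] command is a stand-in; substitute the real tool invocation.
proc = subprocess.Popen(
    ["echo", "test"],
    stdout=subprocess.PIPE,
    text=True,            # decode bytes to str as they arrive
)
captured = []
for line in proc.stdout:  # yields lines as the child produces them
    print(line, end="")   # forward to our own stdout immediately
    captured.append(line)
proc.wait()
```

Alternatively, dropping stdout=subprocess.PIPE entirely lets the child inherit the parent's stdout, so its output appears in the shell as it runs (at the cost of not capturing it).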
Related
I am trying to launch an android emulator from Python. I have tried the following code:
os.system('C:\\Nox\\bin\\Nox.exe -clone:Nox')
subprocess.Popen('C:\\Nox\\bin\\Nox.exe -clone:Nox')
The emulator launched by either call closes as soon as the Python code terminates. However, when I run the command ('C:\\Nox\\bin\\Nox.exe -clone:Nox') in a Win10 terminal, the emulator doesn't close when the terminal is closed.
How can I keep the emulator running after the Python code terminates? I do not want to keep the Python code running.
I don't have a Windows machine to try this on, but in Ubuntu the following did it for me:
import subprocess
subprocess.Popen('<your command string>', shell=True)
So in your case:
import subprocess
subprocess.Popen('C:\\Nox\\bin\\Nox.exe -clone:Nox', shell=True)
Note there is a parameter creationflags with values that seem of interest (https://docs.python.org/3/library/subprocess.html#windows-constants); however, hopefully shell=True will suffice.
Do note the strong warnings in the documentation around opening a process with shell=True where the process being run depends upon some user input!
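If shell=True alone doesn't detach the child on Windows, one hedged sketch using those creation flags (my illustration, not from the original answer) would be:

```python
import subprocess
import sys

# Sketch: fully detach a child process so it outlives the parent.
# DETACHED_PROCESS / CREATE_NEW_PROCESS_GROUP exist only on Windows,
# so a POSIX fallback uses start_new_session instead.
if sys.platform == "win32":
    flags = subprocess.DETACHED_PROCESS | subprocess.CREATE_NEW_PROCESS_GROUP
    child = subprocess.Popen(['C:\\Nox\\bin\\Nox.exe', '-clone:Nox'],
                             creationflags=flags)
else:
    child = subprocess.Popen([sys.executable, '-c', 'print("detached child")'],
                             start_new_session=True)
child.wait()  # for demonstration only; a detached child need not be waited on
```

A detached child gets no console of its own and keeps running after the parent exits, which matches what the question asks for.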
I'm trying to run this script:
import subprocess
import sys

file = open("console-output.txt", "w")
task = subprocess.Popen(sys.executable + " \"main.py\"", shell=True,
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in iter(task.stdout.readline, ''):
    print("Got data")
    file.write(line)
    file.flush()
file.close()
It works fine and writes the program output to the console-output.txt file. However, it outputs all the text at once, at the end of the program. I would like live output to my file so that I can follow long-running programs. Am I doing anything wrong, or is this a bug? I am on Ubuntu 17.10 with Python 3.6.3 64-bit.
It seems to me like task.stdout.readline blocks until the program has completely finished.
After a lot more research using different search terms, I found out that the C runtime (and therefore many interpreters and programs) detects whether its output is a console or a pipe. When writing to a pipe, it buffers output until the buffer fills up or is explicitly flushed. To force unbuffered behaviour from Python, you just need to pass -u to the target Python interpreter. If the child is not Python, you may want to try the stdbuf command, pre-installed on almost all common Linux platforms and available via the coreutils package on Mac OS X (where you have to call gstdbuf instead of stdbuf). The only equivalent for stdbuf on Windows that I could find is the stdbuf.exe shipped with git-scm for Windows; however, I have not tested it yet.
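As a hedged sketch of the -u approach (my illustration, not part of the original answer), launching the child interpreter unbuffered makes each line arrive as soon as it is printed; the inline child script below stands in for the asker's main.py:

```python
import subprocess
import sys

# Sketch: run a child Python unbuffered (-u) so its prints arrive immediately
# instead of being held in the child's stdio buffer until exit.
child_code = 'print("line one"); print("line two")'
task = subprocess.Popen([sys.executable, "-u", "-c", child_code],
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                        text=True)
lines = []
for line in task.stdout:      # iterating a pipe yields lines as they appear
    lines.append(line.rstrip("\n"))
task.wait()
```

The same effect can be had by setting the environment variable PYTHONUNBUFFERED=1 for the child instead of passing -u.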
Before I start, some notes on my environment: python 2.7.14 installed through miniconda, macOS 10.13.3.
The Problem
I'm trying to write a data processing pipeline (call it analyze.py) in Python which calls several different programs. One of those programs, EMAN2, uses its own Python environment to run (located at, say, ~/EMAN2/bin/python). From the command line, a call to EMAN2 might look something like this:
~/EMAN2/bin/e2buildstacks.py <args>
where e2buildstacks.py specifies its own Python environment using a standard shebang line (#!/Users/<username>/EMAN2/bin/python) at the top of the file (note, there are a lot of these different .py files to run; I'm just using one name as an example).
In python, I'm using a subprocess.Popen construction for these calls. A very simple example would be:
import subprocess
stacks_cmd = '/Users/<username>/EMAN2/bin/e2buildstacks.py <args>'
process = subprocess.Popen(stacks_cmd, shell=True, stdout=subprocess.PIPE)
stacks_output, stacks_error = process.communicate()
Where I've replaced the actual args and username, for simplicity.
If this is all I'm doing, it works fine. I can run python analyze.py from the command line and it runs EMAN2 without problem.
However, this pipeline is getting wrapped up in a GUI (wxPython-based), so I have to use pythonw to run my program. When I try to do:
pythonw analyze.py
It can't run the EMAN2 scripts correctly, I get an error:
Traceback (most recent call last):
File "/Users/<username>/EMAN2/bin/e2buildstacks.py", line 34, in <module>
from EMAN2 import *
ImportError: No module named EMAN2
This indicates to me that it's using my miniconda python rather than the ~/EMAN2/bin/python to run the script.
(Note: if anyone uses EMAN2, I can provide you with the full argument list and the input files. But rest assured that's not the issue, I can run the command I'm using just fine from the command line)
What I've tried
I've tried several different things to fix this problem. The simplest was specifying the python to be used in the command:
import subprocess
stacks_cmd = '/Users/<username>/EMAN2/bin/python /Users/<username>/EMAN2/bin/e2buildstacks.py <args>'
process = subprocess.Popen(stacks_cmd, shell=True, stdout=subprocess.PIPE)
stacks_output, stacks_error = process.communicate()
That didn't work, and fails with the same error.
I've tried using Popen in shell=False mode:
import subprocess
stacks_cmd = ['/Users/<username>/EMAN2/bin/python', '/Users/<username>/EMAN2/bin/e2buildstacks.py', <args>]
process = subprocess.Popen(stacks_cmd, shell=False, stdout=subprocess.PIPE)
stacks_output, stacks_error = process.communicate()
or
import subprocess
stacks_cmd = ['/Users/<username>/EMAN2/bin/e2buildstacks.py', <args>]
process = subprocess.Popen(stacks_cmd, shell=False, stdout=subprocess.PIPE)
stacks_output, stacks_error = process.communicate()
Both of those fail with the same error.
I've tried specifying the executable:
import subprocess
stacks_cmd = ['/Users/<username>/EMAN2/bin/e2buildstacks.py', <args>]
process = subprocess.Popen(stacks_cmd, shell=False, stdout=subprocess.PIPE, executable='/Users/<username>/EMAN2/bin/python')
stacks_output, stacks_error = process.communicate()
This one gives a different error:
Unknown option: --
usage: /Users/<username>/EMAN2/bin/e2buildstacks.py [option] ... [-c cmd | -m mod | file | -] [arg] ...
Try `python -h' for more information.
Which makes me think that somehow the arguments aren't getting passed correctly to the script, but are rather being passed to python itself (I probably don't really understand what the executable setting does here).
I've tried setting the environment variable:
import os
my_env = os.environ.copy()
my_env['PATH'] = '/Users/<username>/EMAN2/bin:'+my_env['PATH']
my_env['PYTHONPATH'] = '/Users/<username>/EMAN2/bin'
And then passing that to subprocess.Popen in any of the above commands as env=my_env. This fails with the same errors.
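For illustration (my hedged sketch, not the asker's code), a combined env= call would look like the following; the /tmp path is a stand-in for the real EMAN2 location, and the child just echoes the variable so the effect is visible:

```python
import os
import subprocess
import sys

# Sketch: pass a modified environment to a child with env=.
# '/tmp/eman2-example' is a hypothetical stand-in for the EMAN2 install path.
my_env = os.environ.copy()
my_env['PYTHONPATH'] = '/tmp/eman2-example'

result = subprocess.run(
    [sys.executable, '-c', 'import os; print(os.environ["PYTHONPATH"])'],
    env=my_env, stdout=subprocess.PIPE, text=True,
)
print(result.stdout.strip())
```

Note that env= replaces the child's entire environment, which is why the sketch starts from os.environ.copy() rather than a fresh dict.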
At this point, I'm pretty much out of ideas and hoping someone can help. Again, this is only happening with pythonw, not python.
Things I've read looking for solutions
I couldn't find anything on Stack Overflow that quite matched this problem. Things I've looked at:
subprocess a program using another version of python
Never answered successfully.
Module cannot be found when using "pythonw" (instead of "python") to run an application
The problem is not which python/pythonw I'm using, both are coming from miniconda.
Python subprocess is running a different version of Python
This is kind of the exact opposite of my problem, so not much use.
https://github.com/ContinuumIO/anaconda-issues/issues/199
This suggests that the miniconda pythonw is a bit of a hack that can cause various problems, but doesn't directly offer a solution (let's say that using another version of Python is strongly discouraged).
Python subprocess/Popen with a modified environment
Python Script not running in crontab calling pysaunter
These led me to trying to modify the environment variables.
Final note
Changing the python distribution being used is possible, but very much not the ideal solution. I'm writing something that will be run in several different environments (including linux and windows, which I haven't tested on yet) and by people who aren't me, so I need a more bulletproof solution.
Any help folks can provide is much appreciated.
I'd like to call a separate non-child python program from a python script and have it run externally in a new shell instance. The original python script doesn't need to be aware of the instance it launches, it shouldn't block when the launched process is running and shouldn't care if it dies. This is what I have tried which returns no error but seems to do nothing...
import subprocess
python_path = '/usr/bin/python'
args = [python_path, '&']
p = subprocess.Popen(args, shell=True)
What should I be doing differently?
EDIT
The reason for doing this is that I have an application with a built-in version of Python, and I have written some Python tools that should run separately alongside this application; there is no assurance that the user will have Python installed on their system outside the application's built-in version. Because of this, I can get the Python binary path from the built-in version programmatically, and I'd like to launch an external instance of that built-in Python. This eliminates the need for the user to install Python themselves. So in essence I need a simple way to call an external Python script using my currently running version of Python, programmatically.
I don't need to catch any output into the original program, in fact once launched I'd like it to have nothing to do with the original program
EDIT 2
So it seems that my original question was very unclear, so here are more details; I think I was trying to oversimplify the question:
I'm running OS X, but the code should also work on Windows machines.
The main application that has a built-in version of CPython is a compiled C++ application that ships with a Python framework it uses at runtime. You can launch that embedded version of Python by running this in a Terminal window on OS X:
/my_main_app/Contents/Frameworks/Python.framework/Versions/2.7/bin/python
From my main application I'd like to be able to run a command in the embedded Python that launches an external copy of a Python script using the above Python version, just as I would with the following command in a Terminal window. The newly launched orphan process should have its own Terminal window so the user can interact with it.
/my_main_app/Contents/Frameworks/Python.framework/Versions/2.7/bin/python my_python_script
I would like the child Python instance not to block the main application, and I'd like it to have its own Terminal window so the user can interact with it. The main application doesn't need to be aware of the child in any way once it's launched. The only reason I'm doing this is to automate launching an external application in a Terminal for the user.
If you're trying to launch a new terminal window to run a new Python in (which isn't what your question asks for, but from a comment it sounds like it's what you actually want):
You can't. At least not in a general-purpose, cross-platform way.
Python is just a command-line program that runs with whatever stdin/stdout/stderr it's given. If those happen to be from a terminal, then it's running in a terminal. It doesn't know anything about the terminal beyond that.
If you need to do this for some specific platform and some specific terminal program—e.g., Terminal.app on OS X, iTerm on OS X, the "DOS prompt" on Windows, gnome-terminal on any X11 system, etc.—that's generally doable, but the way to do it is by launching or scripting the terminal program and telling it to open a new window and run Python in that window. And, needless to say, they all have completely different ways of doing that.
And even then, it's not going to be possible in all cases. For example, if you ssh in to a remote machine and run Python on that machine, there is no way it can reach back to your machine and open a new terminal window.
On most platforms that have multiple possible terminals, you can write some heuristic code that figures out which terminal you're currently running under by just walking os.getppid() until you find something that looks like a terminal you know how to deal with (and if you get to init/launchd/etc. without finding one, then you weren't running in a terminal).
The problem is that you're running Python with the argument &. Python has no idea what to do with that. It's like typing this at the shell:
/usr/bin/python '&'
In fact, if you pay attention, you're almost certainly getting something like this through your stderr:
python: can't open file '&': [Errno 2] No such file or directory
… which is exactly what you'd get from doing the equivalent at the shell.
What you presumably wanted was the equivalent of this shell command:
/usr/bin/python &
But the & there isn't an argument at all, it's part of sh syntax. The subprocess module doesn't know anything about sh syntax, and you're telling it not to use a shell, so there's nobody to interpret that &.
You could tell subprocess to use a shell, so it can do this for you:
cmdline = '{} &'.format(python_path)
p = subprocess.Popen(cmdline, shell=True)
But really, there's no good reason to. Just opening a subprocess and not calling communicate or wait on it already effectively "puts it in the background", just like & does on the shell. So:
args = [python_path]
p = subprocess.Popen(args)
This will start a new Python interpreter that sits there running in the background, trying to use the same stdin/stdout/stderr as your parent. I'm not sure why you want that, but it's the same thing that using & in the shell would have done.
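To make that point concrete (a hedged sketch of my own, not from the original answer), a child started with Popen keeps running while the parent continues, without any shell '&' involved:

```python
import subprocess
import sys

# Sketch: Popen returns immediately; the child runs in the background.
# The inline sleep-and-print child is a stand-in for a real script.
child = subprocess.Popen(
    [sys.executable, '-c', 'import time; time.sleep(1); print("child done")'])

still_running = child.poll() is None   # parent is free to do other work here
child.wait()                           # only to keep this demonstration tidy
```

If the child must also survive the parent exiting, start_new_session=True (POSIX) or the Windows creation flags mentioned in the answer above are the usual additions.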
Actually, I think there might be a solution to your problem; I found a useful solution at another question here.
This way subprocess.Popen starts a new Python shell instance and runs the second script from there. It worked perfectly for me on Windows 10.
You can try using the screen command. It creates a new shell instance, and whatever runs inside it keeps running in the background after you detach:
# screen python script1.py
After running the above command, detach with Ctrl-A d; you'll be back at a shell prompt where you can run another script while script1.py keeps running in the background.
Hope it helps.
When I launch a PowerShell script from Python, the delay seems to be approximately 45s, and I cannot figure out why.
I'm trying to run a PowerShell script (accessing some APIs only available to PowerShell) from a Python script.
I've tried a lot of permutations, and all incur a ~45 second delay compared to just running the script from a command prompt with an identical command line.
For example - sample.ps1 might say:
echo foo
And runner.py might say:
import subprocess

POWERSHELL = 'powershell.exe'  # or the full path to powershell.exe
p = subprocess.Popen([POWERSHELL, '-File', 'sample.ps1'], stdout=subprocess.PIPE)
d = p.stdout.read()
Running the .ps1 script directly is fast; running it via runner.py (Python 2.7, 32-bit on a 64-bit machine) incurs the 45-second delay.
The exact same thing occurs if I use os.system, or Twisted's built-in process tools. So I suspect it's some subtle interaction between the Python interpreter and the PowerShell interpreter, possibly related to creation of console windows, or handling of stdin/out/err streams (which I know don't "really exist" in the same way on Windows).
I do not see any such delays; it is pretty snappy (though that will also depend on what your script actually does). Try using call:
from subprocess import call
call(["powershell", "sample.ps1"])
PowerShell loads your user's profile by default. Use the -NoProfile argument to turn that behavior off:
import subprocess

p = subprocess.Popen([POWERSHELL, '-NoProfile', '-File', 'sample.ps1'],
                     stdout=subprocess.PIPE)
d = p.stdout.read()