I am trying to run the following simple command in Python, which prints the directory associated with the Node.js npm binaries:
import subprocess
import shutil
cmd = [shutil.which("npm"), 'bin']
subprocess.run(cmd, stdout=subprocess.PIPE)
I've tested this on two different systems: on my Mac it runs almost instantly, whereas on my Windows 10 system it hangs for a long time, even though running npm bin in the terminal completes immediately. Other commands, say cmd = [shutil.which("conda"), "info"], work fine. If I don't specify the stdout argument it also works OK. I would like to figure out why it is hanging and how to fix it.
UPDATE
Also tried
p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
p.communicate()
On Windows 10 under Python 3.6.10, same result as running subprocess.run().
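One way to narrow this down, assuming only the standard library: give the child an explicit stdin and a timeout, so the hang surfaces as a catchable exception instead of blocking forever. The DEVNULL stdin and the 30-second timeout below are illustrative assumptions, not a confirmed fix.
import shutil
import subprocess

cmd = [shutil.which("npm"), "bin"]
try:
    # stdin=DEVNULL is an assumption: it rules out the child waiting on input
    result = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                            stdin=subprocess.DEVNULL, timeout=30)
    print(result.stdout.decode())
except subprocess.TimeoutExpired:
    print("npm bin timed out instead of hanging indefinitely")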
Related
I am trying to launch a Python subprocess from Excel using PyXLL, but it seems to have trouble launching the cmd window and running commands.
Below is a sample of what I am trying to run:
import subprocess
from pyxll import xl_macro, xlcAlert

@xl_macro()
def test():
    if 1 == 1:
        xlcAlert("Next line nothing happens")  # Popup appears
        p = subprocess.Popen(r'start cmd /k', shell=True,
                             creationflags=subprocess.CREATE_NEW_CONSOLE,
                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
        xlcAlert("{}".format(p.pid))  # p was never launched
I am trying to capture values from Excel and pass them to a subprocess. This works when executing in my IDE: data is read from Excel and then the subprocess launches a window. However, once I add the decorator to run it as a macro in Excel, the script just stops when the subprocess.Popen line is reached. Is there any way to launch a subprocess from PyXLL?
After investigation, and thanks to Charles Duffy: Microsoft Office sandboxing kills the shell subprocess. This has been implemented for security reasons in recent versions.
The simple solution is to run the subprocess with shell=False and pass the args as a list:
p1 = subprocess.Popen(cmdlist, shell=False)
The sandboxing will not terminate the process; a Python console window will open while the script is running.
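For instance, a minimal sketch of that list form (the command below is a placeholder, not the asker's actual program):
import subprocess

# Hypothetical command; replace with the real program and its arguments
cmdlist = ['notepad.exe']
p1 = subprocess.Popen(cmdlist, shell=False)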
Before I start, some notes on my environment: Python 2.7.14 installed through Miniconda, macOS 10.13.3.
The Problem
I'm trying to write a data processing pipeline (call it analyze.py) in Python which calls several different programs. One of those programs, EMAN2, uses its own Python environment to run (located at, say, ~/EMAN2/bin/python). From the command line, a call to EMAN2 might look something like this:
~/EMAN2/bin/e2buildstacks.py <args>
where e2buildstacks.py specifies its own Python environment via the standard #!/Users/<username>/EMAN2/bin/python shebang at the top of the file (note: there are a lot of these different .py files to run; I'm just using one name as an example).
In python, I'm using a subprocess.Popen construction for these calls. A very simple example would be:
import subprocess
stacks_cmd = '/Users/<username>/EMAN2/bin/e2buildstacks.py <args>'
process = subprocess.Popen(stacks_cmd, shell=True, stdout=subprocess.PIPE)
stacks_output, stacks_error = process.communicate()
Where I've replaced the actual args and username, for simplicity.
If this is all I'm doing, it works fine. I can run python analyze.py from the command line and it runs EMAN2 without problem.
However, this pipeline is getting wrapped up in a GUI (wxPython based), so I have to use pythonw to run my program. When I try to do:
pythonw analyze.py
It can't run the EMAN2 scripts correctly, I get an error:
Traceback (most recent call last):
  File "/Users/<username>/EMAN2/bin/e2buildstacks.py", line 34, in <module>
    from EMAN2 import *
ImportError: No module named EMAN2
This indicates to me that it's using my Miniconda Python rather than ~/EMAN2/bin/python to run the script.
(Note: if anyone uses EMAN2, I can provide you with the full argument list and the input files. But rest assured that's not the issue, I can run the command I'm using just fine from the command line)
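One quick check worth adding here (a sketch, not something from the original post): ask the EMAN2 interpreter directly which executable it is, from inside the pythonw-launched program.
import subprocess

# Sketch: confirm which interpreter actually runs under pythonw
check = subprocess.Popen(
    ['/Users/<username>/EMAN2/bin/python', '-c', 'import sys; print(sys.executable)'],
    stdout=subprocess.PIPE)
print(check.communicate()[0])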
What I've tried
I've tried several different things to fix this problem. The simplest was specifying the python to be used in the command:
import subprocess
stacks_cmd = '/Users/<username>/EMAN2/bin/python /Users/<username>/EMAN2/bin/e2buildstacks.py <args>'
process = subprocess.Popen(stacks_cmd, shell=True, stdout=subprocess.PIPE)
stacks_output, stacks_error = process.communicate()
That didn't work, and fails with the same error.
I've tried using Popen with shell=False:
import subprocess
stacks_cmd = ['/Users/<username>/EMAN2/bin/python', '/Users/<username>/EMAN2/bin/e2buildstacks.py', <args>]
process = subprocess.Popen(stacks_cmd, shell=False, stdout=subprocess.PIPE)
stacks_output, stacks_error = process.communicate()
or
import subprocess
stacks_cmd = ['/Users/<username>/EMAN2/bin/e2buildstacks.py', <args>]
process = subprocess.Popen(stacks_cmd, shell=False, stdout=subprocess.PIPE)
stacks_output, stacks_error = process.communicate()
Both of those fail with the same error.
I've tried specifying the executable:
import subprocess
stacks_cmd = ['/Users/<username>/EMAN2/bin/e2buildstacks.py', <args>]
process = subprocess.Popen(stacks_cmd, shell=False, stdout=subprocess.PIPE, executable='/Users/<username>/EMAN2/bin/python')
stacks_output, stacks_error = process.communicate()
This one gives a different error:
Unknown option: --
usage: /Users/<username>/EMAN2/bin/e2buildstacks.py [option] ... [-c cmd | -m mod | file | -] [arg] ...
Try `python -h' for more information.
Which makes me think the arguments aren't getting passed to the script but are instead being parsed by Python itself. (With shell=False, the executable argument replaces which program is run, while args is still passed as the full argv, so Python sees the script path as argv[0] and tries to interpret the remaining arguments as its own options.)
I've tried setting the environment variable:
import os
my_env = os.environ.copy()
my_env['PATH'] = '/Users/<username>/EMAN2/bin:'+my_env['PATH']
my_env['PYTHONPATH'] = '/Users/<username>/EMAN2/bin'
And then passing that to subprocess.Popen in any of the above commands as env=my_env. This fails with the same errors.
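Spelled out, that attempt looked roughly like this (a sketch; <args> stays elided as in the rest of the post):
import os
import subprocess

my_env = os.environ.copy()
my_env['PATH'] = '/Users/<username>/EMAN2/bin:' + my_env['PATH']
my_env['PYTHONPATH'] = '/Users/<username>/EMAN2/bin'

stacks_cmd = ['/Users/<username>/EMAN2/bin/python',
              '/Users/<username>/EMAN2/bin/e2buildstacks.py']  # plus <args>
process = subprocess.Popen(stacks_cmd, shell=False, stdout=subprocess.PIPE,
                           env=my_env)
stacks_output, stacks_error = process.communicate()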
At this point, I'm pretty much out of ideas and hoping someone can help. Again, this is only happening with pythonw, not python.
Things I've read looking for solutions
I couldn't find anything on Stack Overflow that quite matched this problem. Things I've looked at:
subprocess a program using another version of python
Never answered successfully.
Module cannot be found when using "pythonw" (instead of "python") to run an application
The problem is not which python/pythonw I'm using; both come from Miniconda.
Python subprocess is running a different version of Python
This is kind of the exact opposite of my problem, so not much use.
https://github.com/ContinuumIO/anaconda-issues/issues/199
This suggests that the Miniconda pythonw is a bit of a hack that can cause various problems, but it doesn't directly offer a solution (let's just say that using another version of Python is strongly discouraged).
Python subprocess/Popen with a modified environment
Python Script not running in crontab calling pysaunter
These led me to trying to modify the environment variables.
Final note
Changing the Python distribution being used is possible, but very much not the ideal solution. I'm writing something that will be run in several different environments (including Linux and Windows, which I haven't tested on yet) and by people who aren't me, so I need a more bulletproof solution.
Any help folks can provide is much appreciated.
I am pulling my hair out here. I am spawning a process which I need the feedback from in Python.
When I run the command in the cmd window it runs fine, but when I try to run it via Python the terminal hangs.
p = subprocess.Popen(startcmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(out, err) = p.communicate()
Where startcmd is a string which when printed in the Python console looks like this:
"C:/Program Files/GRASS GIS 7.2.1/grass72.bat" --version
If I copy and paste this into a Windows cmd, it shows the version information and returns control to the command prompt about a second later, but in Python it freezes up.
I should point out, if I replace the startcmd string with something like "dir" or even "python --version", it works fine!
Additional: I have tried shell=True, this has the same result.
Additional: I have tried sending the cmd and arguments through as an array as suggested in an answer below given that shell=False, but this also hangs the same.
Additional: I have added the GRASS path to the system PATH, so that now I can simply call grass72 --version in the cmd window to get a result, however this also still freezes in Python but works fine in cmd.
Additional: I have created a basic .bat file to test if .bat files run ok via Python, here is what I created:
@echo off
title Test Batch Script
echo I should see this message
This runs fine both in cmd, and in Python.
Problem found but not solved!
So, I'm running the script which spawns the process using subprocess.Popen under Python 3.6. The .bat file which is spawned launches a Python script using a version of Python (based on 2.7) that ships with GRASS:
%GRASS_PYTHON% "\BLAH\BLAH\grass72.py"
What is interesting is that if I launch the subprocess.Popen script with Python 2.7, it works fine. Aha, you may think: solved! But this doesn't solve my problem, because I really need Python 3.6 to be launching the process. Also, why does it matter which version of Python launches the batch file? The new Python script which is spawned is launched with Python 2.7 anyway.
Since I started redirecting stdout, I can see that there is an error when I use Python 3.6 to launch the process:
File "C:\ProgramData\Anaconda3\lib\site.py", line 177
file=sys.stderr)
^
SyntaxError: invalid syntax
Notice it's reverting to Anaconda3, even though it is launched using python.exe from 2.7!
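A hedged guess at a workaround (untested against GRASS specifically): since the traceback points at Anaconda3's site.py, the bundled Python 2.7 is probably inheriting PYTHONHOME or PYTHONPATH from the Anaconda environment, so scrubbing those variables before spawning may help.
import os
import subprocess

# Assumption: the contamination comes from inherited PYTHONHOME/PYTHONPATH
child_env = os.environ.copy()
for var in ('PYTHONHOME', 'PYTHONPATH'):
    child_env.pop(var, None)

p = subprocess.Popen(startcmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                     env=child_env)
(out, err) = p.communicate()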
I experienced the same issue with Python 3.6 and 3.7 on Windows hanging for subprocess calls:
p = subprocess.Popen(startcmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(out, err) = p.communicate()
Upon closer investigation I noticed this occurs only if the process writes more than about 4 KB (4096 bytes) of output, which might explain why your short script does not reproduce this.
A workaround I found is using tempfile in the standard library:
import subprocess
import tempfile
import time

# Write to temporary files because pipe redirection seems broken
with tempfile.NamedTemporaryFile(mode="w+") as tmp_out, \
        tempfile.NamedTemporaryFile(mode="w+") as tmp_err:
    p = subprocess.Popen(startcmd, stdout=tmp_out, stderr=tmp_err,
                         universal_newlines=True)
    # `run` waits for the command to complete; `Popen` continues the Python program
    while p.poll() is None:
        time.sleep(.1)
    # Cursor is after the last write call; reset it to read the output
    tmp_out.seek(0)
    tmp_err.seek(0)
    out = tmp_out.read()
    err = tmp_err.read()
You don't specify shell=True in your arguments to Popen. The recommended usage in that case is to pass a sequence of arguments rather than a string, so you should set startcmd to ["C:/Program Files/GRASS GIS 7.2.1/grass72.bat", "--version"].
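In full, that would look something like this (paths taken from the question):
import subprocess

startcmd = ["C:/Program Files/GRASS GIS 7.2.1/grass72.bat", "--version"]
p = subprocess.Popen(startcmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(out, err) = p.communicate()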
Try this:
p = subprocess.Popen(startcmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
I have a very basic subprocess.Popen command like:
cmd = ['docker', 'run', '--name=test','server']
p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
When I run this in a Python script from the terminal, it correctly launches the Docker container, which keeps running until I docker stop it.
However, when I run the same code from PyCharm 5, the container is stopped as soon as the above line completes.
I have verified this in the debugger (running docker ps -a shows the container dies immediately when launched from PyCharm).
What do I have to do in order to keep my subprocess opened when running with PyCharm? Not being able to run the debugger is quite annoying.
Are you printing out the output from the Popen call (stdout and stderr)? What does it say?
Launching subprocesses from an IDE can be tricky, because certain environment variables (like PATH) may not be set in the same way as your shell.
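Something like this sketch would surface the container's error output; note that communicate() blocks until the child exits, so use it here only to inspect the failing PyCharm case:
import subprocess

cmd = ['docker', 'run', '--name=test', 'server']
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(out, err) = p.communicate()  # waits for the child and drains both pipes
print('stdout:', out.decode())
print('stderr:', err.decode())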
I'm executing a set of commands that first require me to call bash. I am trying to automate these commands by writing a Python script to do this. My first command obviously needs to be bash, so I run
p = subprocess.call(['bash'])
and it launches the bash shell no problem.
Where I then run into problems is executing the remaining commands in that bash environment. I thought perhaps there was a need for inter-process communication, i.e. redirecting stdout, as in:
p0 = subprocess.Popen(cmd, stdout=subprocess.PIPE)
p1 = subprocess.Popen(['bash'], stdin=p0.stdout)
p1.communicate()
but the piping doesn't seem to solve my problem.
How can I write this script so that it mimics the following sequential Linux commands?
$ bash
$ cmd1
$ cmd2
...
I'm working with Ubuntu 14.04 and Python 2.7.6.
Thanks in advance for the guidance!
import subprocess

def bash_command(cmd):
    subprocess.Popen(cmd, shell=True, executable='/bin/bash')

bash_command('[your_command]')
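Note that each call to bash_command spawns a fresh bash, so state such as cd or exported variables does not carry over between calls; commands that depend on one another need to be combined into a single string.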
You don't need to run bash separately. You can run the commands directly, for example:
p1 = subprocess.call(['cmd1'])
p2 = subprocess.call(['cmd2'])
If you must run bash for some reason (for example, the commands contain bash-specific syntax), you can run bash -c "cmd1; cmd2" via subprocess.call(), as sketched below.
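For example (the commands here are placeholders):
import subprocess

# Two hypothetical commands chained in a single bash invocation
subprocess.call(['bash', '-c', 'cd /tmp && ls -la'])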
Edit: As Busturdust pointed out, you can also try setting shell=True, but that uses sh, not bash. But that may be enough for you.