I am trying to capture the output of a shell command (npm --version), but only the first line is read and the process never ends.
import subprocess
proc = subprocess.Popen(['npm', '--version'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
proc.wait()
for line in proc.stdout:
    print(line.decode("utf-8").strip())
print("does not get here?!")
Any idea how I could detect the end of this process?
If I open a cmd window and run 'npm --version' manually, it exits as expected, so I do not know why the same command run from the code above does not end.
Some extra information that may be of use:
npm is installed via nvm
nvm manages node installs via symlinks
from what I can see, npm is a .cmd file that executes node
Running this in the Python interactive prompt:
>>> import subprocess
>>> proc = subprocess.Popen(['npm', '--version'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
>>> proc.wait()
0
>>> proc.stdout.readline()
'6.10.3\n'
>>> proc.stdout.readline()
''
Notice that the second .readline() takes a very long time to complete!
Using stdout=PIPE and/or stderr=PIPE together with Popen.wait() can deadlock: if the child fills the OS pipe buffer, it blocks waiting for the pipe to be read while wait() blocks waiting for the child to exit. Use communicate() to avoid that.
See this documentation on how to use communicate():
https://docs.python.org/2/library/subprocess.html
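For example, a minimal sketch of the same call using communicate() (assuming Python 3 and keeping your shell=True invocation):
import subprocess

proc = subprocess.Popen(['npm', '--version'], stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, shell=True)
# communicate() reads until EOF and then waits for the process,
# so the pipe buffer cannot fill up while we are blocked in wait().
out, _ = proc.communicate()
print(out.decode("utf-8").strip())
print("exit code:", proc.returncode)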
Hope I could help!
Can you please share the console output when you run the command manually at the command prompt?
The code you have shared works on my machine, so I assume it may have something to do with the way npm is installed. In any case, can you share the output from the command console?
Thanks
Pushpa
Related
I'm trying to use Popen to start a shell process, execute commands, print the commands to output, and print the output of the commands (if any). Writing to stdin is working fine, but trying to read from stdout causes the program to freeze.
Here's my program:
with Popen(["/bin/sh"], stdin=PIPE, stdout=PIPE, stderr=STDOUT, text=True, bufsize=0) as proc:
with open("script.txt") as scriptFile:
for line in scriptFile.readlines():
proc.stdin.write(line)
print(f"$ {line.strip()}")
# Reading from proc.stdout locks the program.
for outputLine in proc.stdout.readlines():
print(outputLine)
And here's a simplified script.txt. (The real one creates a git repository and illustrates the use of various git commands.)
mkdir project
cd project
echo "This is line 1.\nThis is line 2." > text1.txt
cat text1.txt
If I don't try to read from stdout, all of the commands in script.txt (and in my real version with multiple git commands) work fine.
Is there any way to get the output of commands from stdout interspersed with writing them to stdin?
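One sketch that at least avoids the hang (with the caveat that the output is no longer interleaved per command): write the whole script first, close stdin so the shell sees end-of-file and exits, and only then read stdout.
from subprocess import Popen, PIPE, STDOUT

with Popen(["/bin/sh"], stdin=PIPE, stdout=PIPE, stderr=STDOUT, text=True) as proc:
    with open("script.txt") as scriptFile:
        for line in scriptFile:
            print(f"$ {line.strip()}")
            proc.stdin.write(line)
    proc.stdin.close()            # EOF lets the shell finish and exit
    for outputLine in proc.stdout:
        print(outputLine, end="")
Interleaving per command is harder, because the shell gives no indication of where one command's output ends and the next one's begins.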
This seems to be a duplicate of "Using Python to open a shell environment, run a command and exit environment". I want to run the ulimit command in a shell environment on Red Hat. Procedure: open a shell environment, run the ulimit commands in the shell, get the results, and exit the shell environment. Referencing the above solution, I tried:
from subprocess import Popen, PIPE

def read_limit():
    p = Popen('sh', stdin=PIPE)
    file_size = p.communicate('ulimit -n')
    open_files = p.communicate('ulimit -f')
    file_locks = p.communicate('ulimit -x')
    return file_size, open_files, file_locks
But I got error: ValueError: I/O operation on closed file.
The documentation for communicate() says:
Send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for process to terminate.
After that, the pipe will be closed.
You can use
p.stdin.write("something\n")
p.stdin.flush()
result = p.stdout.readline()
for each of your three commands (note the trailing newline, and that the Popen call also needs stdout=PIPE for readline() to work), and then
p.stdin.close()
p.wait()
at the end to terminate it.
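Putting that together, a minimal runnable sketch (assuming Python 3, and that each ulimit command prints exactly one line):
from subprocess import Popen, PIPE

def read_limit():
    # stdout=PIPE is required so we can read the results back
    p = Popen('sh', stdin=PIPE, stdout=PIPE, text=True)
    results = []
    for cmd in ('ulimit -n', 'ulimit -f', 'ulimit -x'):
        p.stdin.write(cmd + '\n')   # the shell only runs a complete line
        p.stdin.flush()
        results.append(p.stdout.readline().strip())
    p.stdin.close()                 # EOF makes sh exit
    p.wait()
    return results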
I have created a Python script and compiled it into an exe file with PyInstaller. In the process, I have specified the -w option to get an app that doesn't have any console.
Everything works fine except the execution of commands using Popen:
mout = subprocess.Popen(['ls', 'C:\\'])
This line generates an exception [Error 6] The handle is invalid.
I have tried adding the parameters stdout=subprocess.PIPE, stderr=subprocess.PIPE, but it still does not work. I think that is because the main process doesn't have any console assigned. I want to execute the command without opening any console window; it has to be transparent to the user.
Is there any option?
This should solve your issue.
proc = subprocess.Popen(['ls', 'C:\\'], shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE, creationflags=subprocess.CREATE_NO_WINDOW)
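A note on that line: redirecting stdin as well as stdout/stderr matters here, because a windowed (console-less) app has no valid standard handles for the child to inherit, which is what triggers [Error 6]. Also, subprocess.CREATE_NO_WINDOW only exists as a named constant in Python 3.7+; on older versions you can pass the raw Win32 value yourself. A sketch:
import subprocess

CREATE_NO_WINDOW = 0x08000000   # same value as subprocess.CREATE_NO_WINDOW in 3.7+

proc = subprocess.Popen(['ls', 'C:\\'], shell=True,
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        creationflags=CREATE_NO_WINDOW)
out, err = proc.communicate()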
I can't figure out how to close a bash shell that was started via Popen. I'm on windows, and trying to automate some ssh stuff. This is much easier to do via the bash shell that comes with git, and so I'm invoking it via Popen in the following manner:
p = Popen('"my/windows/path/to/bash.exe" | git clone or other commands')
p.wait()
The problem is that after bash runs the commands I pipe into it, it doesn't close. It stays open causing my wait to block indefinitely.
I've tried stringing an "exit" command at the end, but it doesn't work.
p = Popen('"my/windows/path/to/bash.exe" | git clone or other commands && exit')
p.wait()
But still, the wait blocks indefinitely. After it finishes its task, bash just sits at a prompt doing nothing. How do I force it to close?
Try Popen.terminate(); this might help kill your process. If you only have synchronously executing commands, try running them directly with subprocess.call().
For example:
import subprocess
subprocess.call(["c:\\program files (x86)\\git\\bin\\git.exe",
                 "clone",
                 "repository",
                 "c:\\repository"])
0
The following is an example of using a pipe, but it is a little overcomplicated for most use cases and makes sense only if you are talking to a process that needs interaction (at least in my opinion).
p = subprocess.Popen(["c:\\program files (x86)\\git\\bin\\git.exe",
"clone",
"repository",
"c:\\repository"],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE
)
print p.stderr.read()
fatal: destination path 'c:\repository' already exists and is not an empty directory.
print p.wait(
128
This can be applied to ssh as well.
To kill the process tree, you could use taskkill command on Windows:
Popen("TASKKILL /F /PID {pid} /T".format(pid=p.pid))
As @Charles Duffy said, your bash usage is incorrect.
To run a command using bash, use the -c parameter:
p = Popen([r'c:\path\to\bash.exe', '-c', 'git clone repo'])
In simple cases, you could use subprocess.check_call instead of Popen().wait():
import subprocess
subprocess.check_call([r'c:\path\to\bash.exe', '-c', 'git clone repo'])
The latter raises an exception if the bash process returns a non-zero status (which indicates an error).
I am writing a small Python script that needs to execute git commands from inside a given directory.
The code is as follows:
import subprocess, os
pr = subprocess.Popen(['/usr/bin/git', 'status'],
                      cwd=os.path.dirname('/path/to/dir/'),
                      stdout=subprocess.PIPE,
                      stderr=subprocess.PIPE,
                      shell=True)
(out, error) = pr.communicate()
print out
But it shows git usage as the output.
If the command doesn't involve git, e.g. ['ls'], then it shows the correct output.
Is there anything I am missing?
python version - 2.6.6
Thanks.
From the subprocess.Popen documentation:
On Unix, with shell=True: […] If args is a sequence, the first item specifies the command string, and any additional items will be treated as additional arguments to the shell itself.
You don't want shell=True and also a list of arguments. Set shell=False.
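Applied to the snippet above, a corrected sketch (keeping the placeholder path and the question's Python 2 style) would look like:
import subprocess, os

pr = subprocess.Popen(['/usr/bin/git', 'status'],
                      cwd=os.path.dirname('/path/to/dir/'),
                      stdout=subprocess.PIPE,
                      stderr=subprocess.PIPE)   # shell=False is the default
(out, error) = pr.communicate()
print out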