I am using a subprocess call in Python to print a file's contents with cat. The file name is a variable that I generate in the Python code itself. This is my code:
pid = str(os.getpid())
tmp_file_path = "/tmp/" + pid + "/data_to_synnet"
synnet_output = subprocess.check_output(["cat echo '%s'"%tmp_file_path], shell=True)
The above code throws an error saying cat: echo: No such file or directory.
However, when I use only subprocess.check_output(["echo '%s'"%tmp_file_path], shell=True), the variable name is printed correctly.
Also, I tried doing this (cat echo $tmp_file_name) in the command line and it works. Can someone please tell me what is wrong?
The command you want is this:
"cat '%s'"%tmp_file_path
Just get rid of the word "echo": the shell splits your original string into cat, echo, and the path, so cat tries to open a file literally named echo.
Alternatively,
synnet_output = subprocess.check_output(["cat", tmp_file_path], shell=False)
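As a side note, the list form also copes with paths the shell would mangle. A minimal sketch (assuming a Unix cat, with a throwaway temp file standing in for your /tmp/<pid>/data_to_synnet path):

```python
import os
import subprocess
import tempfile

# a path containing a space would break naive string interpolation under a shell
tmp_dir = tempfile.mkdtemp()
tmp_file_path = os.path.join(tmp_dir, "data to synnet")
with open(tmp_file_path, "w") as f:
    f.write("hello\n")

# shell=False (the default): the path is passed as a single argument, no quoting needed
synnet_output = subprocess.check_output(["cat", tmp_file_path])
print(synnet_output)
```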
Related
I am trying to run, through subprocess, a command-line tool that receives files as arguments. However, these files might have characters like "&" in their names, and those can be interpreted as CMD commands if they are not between quotes (").
It usually works when I pass the command broken into a list.
Example:
from subprocess import run
file = r'broken&difficult.txt'
command = ['convert', file]
run(command)
However, it returns a StdErr:
StdErr: 'difficult.txt' is not recognized as an internal or external command, operable program or batch file
The returncode is 1.
I have tried to change the file name variable to:
file =r'"broken&difficult.txt"'
The result is that it is not able to find any file, with a returncode of 0.
You need to use the CMD escape character - the caret ^ - before the ampersand.
Try:
import subprocess
file = 'broken^&difficult.txt'
command = ['convert', file]
subprocess.run(command, shell=True)
Example of how this works:
import subprocess
# create a file
with open('broken&difficult.txt', 'w') as fp:
    fp.write('hello\nworld')
# use `more` to have OS read contents
subprocess.run(['more', 'broken^&difficult.txt'], shell=True)
# prints:
hello
world
# returns:
CompletedProcess(args=['more', 'broken^&difficult.txt'], returncode=0)
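If the program you are calling is a real executable rather than a cmd built-in like more, another option is to keep shell=False: the argument list then goes straight to the program and no ^ escaping is needed. A portable sketch (using the Python interpreter itself as a hypothetical stand-in for the target executable):

```python
import os
import subprocess
import sys
import tempfile

tmp_dir = tempfile.mkdtemp()
path = os.path.join(tmp_dir, "broken&difficult.txt")
with open(path, "w") as fp:
    fp.write("hello\nworld")

# shell=False (the default): the filename is never seen by cmd.exe,
# so the & needs no escaping
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(open(sys.argv[1]).read())", path],
    capture_output=True, text=True)
print(result.stdout)
```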
I'm converting some of my bash scripts to Python; the scripts were mostly used to run command-line tools.
I realise Popen is the recommended way to go, but I get errors if I try the method described below, so could someone please explain it to me on this simple example?
bash code:
varA=50
command1 > file.foo
echo $varA | command2 file.foo
python code:
varA=50
com1 = Popen('command1',stdin=PIPE, stdout=PIPE)
com1out = com1.stdout #NOTE com1out is of 'file' type
com2 = Popen('command2 %s' % com1out),stdin=varA, stdout=PIPE)
The shell code
command1 > file.foo
echo $varA | command2 file.foo
uses a regular file file.foo, and command2 expects the name of such a file as an argument, so the straight translation to Python is:
from subprocess import *
call('command1', stdout=open('file.foo', 'w'))
Popen(['command2', 'file.foo'], stdin=PIPE).communicate(str(varA)+'\n')
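The same pattern can be exercised end to end with stand-in commands (echo and cat here are hypothetical substitutes for command1 and command2, which the question does not define):

```python
import os
import tempfile
from subprocess import PIPE, Popen, call

varA = 50
file_foo = os.path.join(tempfile.mkdtemp(), "file.foo")

# command1 > file.foo  -- 'echo' stands in for command1
with open(file_foo, "w") as out:
    call(["echo", "file contents"], stdout=out)

# echo $varA | command2 file.foo  -- 'cat' stands in for command2;
# it reads the named file and ignores the value fed on stdin
proc = Popen(["cat", file_foo], stdin=PIPE, stdout=PIPE)
stdout, _ = proc.communicate((str(varA) + "\n").encode())
print(stdout)
```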
I am trying to get the stdout of a python script to be shell-piped in as stdin to another python script like so:
find ~/test -name "*.txt" | python my_multitail.py | python line_parser.py
It should print output, but nothing comes out.
Please note that this works:
find ~/test -name "*.txt" | python my_multitail.py | cat
And this works too:
echo "bla" | python line_parser.py
my_multitail.py prints out the new content of the .txt files:
from multitail import multitail
import sys
filenames = sys.stdin.readlines()
# we get rid of the trailing '\n'
for index, filename in enumerate(filenames):
    filenames[index] = filename.rstrip('\n')
for fn, line in multitail(filenames):
    print '%s: %s' % (fn, line),
    sys.stdout.flush()
When a new line is added to the .txt file ("hehe") then my_multitail.py prints:
/home/me/test2.txt: hehe
line_parser.py simply prints out what it gets on stdin:
import sys
for line in sys.stdin:
    print "line=", line
There is something I must be missing. Please community help me :)
There's a hint if you run your line_parser.py interactively:
$ python line_parser.py
a
b
c
line= a
line= b
line= c
Note that I hit ctrl+D to provoke an EOF after entering the 'c'. You can see that it's slurping up all the input before it starts iterating over the lines. Since this is a pipeline and you're continuously sending output through to it, this doesn't happen and it never starts processing. You'll need to choose a different way of iterating over stdin, for example:
import sys
line = sys.stdin.readline()
while line:
    print "line=", line
    line = sys.stdin.readline()
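A more compact spelling of the same readline loop is iter() with a sentinel. Driving such a child script through a pipe (a small self-contained harness, not part of the original question) shows each line being processed as it arrives:

```python
import subprocess
import sys
import textwrap

# child script: iterate with readline so lines are handled as they arrive,
# instead of waiting for EOF the way "for line in sys.stdin" does on Python 2
child = textwrap.dedent("""
    import sys
    for line in iter(sys.stdin.readline, ''):
        sys.stdout.write('line= ' + line)
        sys.stdout.flush()
""")

proc = subprocess.run([sys.executable, "-c", child],
                      input="a\nb\nc\n", capture_output=True, text=True)
print(proc.stdout)
```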
I'm using Python's subprocess module to run a DXL script. My problem is that when I try to catch the output of my DXL script (in this example a print statement or an error message), it is shown in the command prompt, but when I try to capture it with stdout=subprocess.PIPE or subprocess.check_output it always returns an empty string. Is there a way to catch the output, or how could I get the error messages from DOORS?
It's important that you don't see the GUI of DOORS.
Here is my quick example that shows my problem:
test.dxl
print "Hello World"
test.py
import subprocess
doorsPath = "C:\\Program Files (x86)\\IBM\\Rational\\DOORS\\9.5\\bin\\doors.exe"
userInfo = ' -user dude -password 1234 -d 127.0.0.1 -batch ".\\test.dxl"'
dxl = " -W"
output = subprocess.check_output(doorsPath+dxl+userInfo)
print(output)
Edit: Using Windows 7 , DOORS 9.5 and Python 2.7
I know this post is pretty old, but the solution to the problem is to use
cout << ... instead of print. You can redirect print output as shown here:
DOORS Print Redirect Tutorial for print, cout and logfiles
I'm feeling lucky here:
change print "Hello World" to cout << "Hello World"
and userInfo = ' -user dude -password 1234 -d 127.0.0.1 -batch ".\\test.dxl > D:\output.txt"', since from the cmd prompt the text can be redirected directly to a text file.
Your script has several errors; have a look at the subprocess examples in the documentation, and try this:
import subprocess

path = "C:\\Program Files (x86)\\IBM\\Rational\\DOORS\\9.5\\bin\\doors.exe"
args = ["-W", "-user", "dude", "-password", "1234", "-d", "127.0.0.1", "-batch", ".\\test.dxl"]
proc = subprocess.Popen([path] + args, stdout=subprocess.PIPE)
out, _ = proc.communicate()
I hope it works on your system!
My goal is to compare two sets of data, one from a text file and one from a directory, and after comparing them, to notify or display in the console the data that is not found, for example:
ls: /var/patchbundle/rpms/:squid-2.6.STABLE21-7.el5_10.x86_64.rpm NOT FOUND!
ls: /var/patchbundle/rpms/:tzdata-2014j-1.el5.x86_64.rpm
ls: /var/patchbundle/rpms/:tzdata-java-2014j-1.el5.x86_64.rpm
ls: /var/patchbundle/rpms/:wireshark-1.0.15-7.el5_11.x86_64.rpm
ls: /var/patchbundle/rpms/:wireshark-gnome-1.0.15-7.el5_11.x86_64.rpm
ls: /var/patchbundle/rpms/:yum-updatesd-0.9-6.el5_10.noarch.rpm NOT FOUND
The output must look like that. So here's my Python code.
import package, sys, os, subprocess
path = '/var/tools/tools/newrpms.txt'
newrpms = open(path, "r")
fds = newrpms.readline()
def checkrc(rc):
    if(rc != 0):
        sys.exit(rc)
cmd = package.Errata()
for i in newrpms:
    rc = cmd.execute("ls /var/patchbundle/rpms/ | grep %newrpms ")
    if ( != 0):
        cmd.logprint ("%s not found !" % i)
    checkrc(rc)
sys.exit(0)
newrpms.close
Please see the shell script below. That script executes fine, but I want to use another language, which is why I'm trying Python.
retval=0
for i in $(cat /var/tools/tools/newrpms.txt)
do
    ls /var/patchbundle/rpms/ | grep $i
    if [ $? != 0 ]
    then
        echo "$i NOT FOUND!"
        retval=255
    fi
done
exit $retval
Please see my Python code. What is wrong? It is not executing the way the shell script does.
You don't say what the content of "newrpms.txt" is; you say the script is not executing how you want, but you don't say what it is doing. I don't know what package or package.Errata are, so I'm playing guess-the-problem, but lots of things are wrong.
if ( != 0): is a syntax error. If {empty space} is not equal to zero?
cmd.execute("ls /var/patchbundle/rpms/ | grep %newrpms ") is probably not doing what you want. You can't put a variable in a string in Python like that, and if you could newrpms is the file handle not the current line. That should probably be ...grep %s" % (i,)) ?
The control flow is doing:
Look in this folder, try to find files
Call checkrc()
Only quit with an error if the last file was not found
newrpms.close isn't doing anything, it would need to be newrpms.close() to call the close method.
You're writing shell-script-in-Python. How about:
import os, sys
retval = 0
for line in open('/var/tools/tools/newrpms.txt'):
    rpm_path = '/var/patchbundle/rpms/' + line.strip()
    if not os.path.exists(rpm_path):
        print rpm_path, "NOT FOUND"
        retval = 255
    else:
        print rpm_path
sys.exit(retval)
Edited code slightly, and an explanation:
The code is almost a direct copy of the shell script into Python. It loops over every line in the text file, and calls line.strip() to get rid of the newline character at the end. It builds rpm_path which will be something like "/var/patchbundle/rpms/:tzdata-2014j-1.el5.x86_64.rpm".
Then it uses os.path.exists(), which tests whether a file exists and returns True if it does and False if it does not, and uses that test to set the error value and print the results the way the shell script prints them. This replaces the "ls ... | grep" part of your code for checking if a file exists.
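To see the pattern in isolation, here is a self-contained run against a temporary directory (the .rpm names are taken from your example output; only the first is actually created, so the second comes up missing). sys.exit(retval) is replaced by a print so the demo can run inline:

```python
import os
import tempfile

rpm_dir = tempfile.mkdtemp()
expected = ["tzdata-2014j-1.el5.x86_64.rpm",
            "squid-2.6.STABLE21-7.el5_10.x86_64.rpm"]

# create only the first expected file, so the second is reported missing
open(os.path.join(rpm_dir, expected[0]), "w").close()

retval = 0
for name in expected:
    rpm_path = os.path.join(rpm_dir, name)
    if not os.path.exists(rpm_path):
        print(rpm_path, "NOT FOUND")
        retval = 255
    else:
        print(rpm_path)
print("exit status:", retval)
```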