Python subprocess module: what does the code do? - python

What is this piece of code doing?
with open(temp_path) as f:
    command = "xdg-open"
    subprocess.Popen(
        ["im=$(cat);" + command + " $im; rm -f $im"], shell=True, stdin=f
    )
I'm confused by the subprocess part...
What does the shell script do?

im=$(cat)
uses cat to read the standard input, and assigns the result to the variable im. Since you use stdin=f, that reads the contents of temp_path.
command + " $im;"
executes the command xdg-open with $im as its argument. So this uses the contents of the file as the argument to xdg-open, which opens the file in its default application. Since the argument should be a filename, this implies that temp_path contains a filename.
rm -f $im
deletes the file that was opened.
This seems like a silly way to do this. A better way to write it would be:
import os
import subprocess

with open(temp_path) as f:
    filename = f.read().strip()
    command = "xdg-open"
    subprocess.Popen([command, filename])
    os.remove(filename)
Although I haven't seen the rest of the script, I suspect the temp path is also unnecessary when doing it this way -- it seems like it was just using that as a way to get the filename into the shell variable.
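If the rest of the script already has the filename available in a Python variable (an assumption, since that part isn't shown), a sketch of the same idea without the temp file would be:
import os
import subprocess

# filename is assumed to be set elsewhere in the script
subprocess.call(["xdg-open", filename])  # call() waits for xdg-open to return
os.remove(filename)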

Related

Dos2unix not working when trying to silence command

I was calling dos2unix from within Python this way:
call("dos2unix " + file1, shell=True, stdout=PIPE)
However to silence the Unix output, I did this:
f_null = open(os.devnull, 'w')
call("dos2unix " + file1, shell=True, stdout=f_null, stderr=subprocess.STDOUT)
This doesn't seem to work. The command no longer appears to be run: when I diff file1 against file2 (diff -y file1 file2 | cat -t), I can see the line endings haven't changed.
file2 is the file I am comparing file1 against. It has Unix line endings as it is generated on the box. However, there is a chance that file1 doesn't.
I'm not sure why, but I would try getting rid of the "noise" around your command and checking the return code:
check_call(["dos2unix", file1], stdout=f_null, stderr=subprocess.STDOUT)
pass the arguments as a list, not as a single command line (so file names with spaces in them still work)
remove shell=True as dos2unix isn't a built-in shell command
use check_call so it raises an exception instead of failing silently
At any rate, it is possible that dos2unix detects that its output isn't a tty anymore and decides to write the converted text to standard output instead (dos2unix can read from standard input and write to standard output). I'd go with that explanation. You could check it by redirecting to a real file instead of os.devnull and seeing whether the converted text ends up there.
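A quick sketch of that check (the output file name here is arbitrary, and file1 is assumed to be defined as in the question): redirect stdout to a real file and inspect what lands in it.
import subprocess

with open("dos2unix_check.txt", "wb") as out:
    subprocess.check_call(["dos2unix", file1], stdout=out, stderr=subprocess.STDOUT)

# If dos2unix dumped the converted text to stdout instead of converting
# file1 in place, it will show up in dos2unix_check.txt.
with open("dos2unix_check.txt", "rb") as out:
    print(out.read())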
But I would do a pure Python solution instead (with a backup for safety), which is portable and doesn't need the dos2unix command (so it works on Windows as well):
with open(file1, "rb") as f:
    contents = f.read().replace(b"\r\n", b"\n")
with open(file1 + ".bak", "wb") as f:
    f.write(contents)
os.remove(file1)
os.rename(file1 + ".bak", file1)
Reading the file fully is fast, but could choke on really big files. A line-by-line solution is also possible (still using binary mode):
with open(file1, "rb") as fr, open(file1 + ".bak", "wb") as fw:
    for l in fr:
        fw.write(l.replace(b"\r\n", b"\n"))
os.remove(file1)
os.rename(file1 + ".bak", file1)

Python writes 0 to the file

import os
fileHandle = open('booksNames.txt', 'r+')
def getData():
    data = os.system('dir /b /a /s *.pdf *.epub *.mobi')
    fileHandle.writelines(str(data))
    fileHandle.close()
I'm trying to write the data returned by the os.system function to a file. But the only thing that gets written in file is 0. Here are some other variations that I tried as well.
import os
fileHandle = open('booksNames.txt', 'r+')
getData = lambda:os.system('dir /b /a /s *.pdf *.epub *.mobi')
data = getData()
fileHandle.writelines(str(data))
fileHandle.close()
In the output window it shows the expected output, but when writing to the text file it writes zero. I've also tried using return, but it makes no difference. Please help.
Use the subprocess module. There are a number of methods, but the simplest is:
>>> import subprocess
>>> with open('out.txt','w') as f:
...     subprocess.call(['dir', '/b', '/a', '/s', '*.pdf', '*.epub', '*.mobi'], stdout=f, stderr=f, shell=True)
...
0
Zero is the exit code, but the content will be in out.txt.
For Windows (I assume you are using Windows, since you are using the 'dir' command and not the Unix/Linux 'ls'):
simply let the command do the work.
os.system('dir /b /a /s *.pdf *.epub *.mobi >> booksNames.txt')
Using '>>' will append to any existing file. Just use '>' to write a new file.
I liked the other solution using subprocess, but since this is OS-specific anyway, I think this is simpler.

python subprocess module can't parse filename with special characters "("

I have a Python program that reads files and then tars them into tar balls of a certain size.
One of my files not only has spaces in it but also contains parentheses. I have the following code:
cmd = "/bin/tar -cvf " + tmpname + " '" + filename + "'"
NOTE: those are single quotes inside the double quotes, placed around the filename variable. It's a little difficult to see.
Where tmpname and filename are variables in a for-loop that are subject to change each iteration (irrelevant).
As you can see, the filename I'm tarballing has single quotes around it so that the shell (bash) interprets it literally, as is, and doesn't try to do variable substitution (which double quotes would allow) or command substitution (which backticks do).
As far as I can see, the cmd variable contains the exact syntax for the shell to interpret the command as I want it to. However when I run the following subprocess command substituting the cmd variable:
cmdobj = call(cmd, shell=True)
I get the following output/error:
/bin/tar: 237-r Property Transport Request (PTR) 012314.pdf: Cannot stat: No such file or directory
/bin/tar: Exiting with failure status due to previous errors
unable to tar: 237-r Property Transport Request (PTR) 012314.pdf
I even print the command out to the console before running the subprocess call, to see what it will look like when run in the shell, and it's:
cmd: /bin/tar -cvf tempname0.tar '237-r Property Transport Request (PTR) 012314.pdf'
When I run the above command in the shell as is it works just fine. Not really sure what's going on here. Help please!
Pass a list of args without shell=True and the full path to the file if running from a different directory:
from subprocess import check_call
check_call(["tar", "-cvf", tmpname, "Property Transport Request (PTR) 012314.pdf"])
Also use tar, not '/bin/tar'. check_call will raise a CalledProcessError if the command returns a non-zero exit status.
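The same call written with the question's own loop variables (a sketch; tmpname and filename are assumed to hold the archive name and the file name from the for-loop):
from subprocess import check_call

# each list element is passed to tar as-is, so spaces and parentheses
# in filename need no quoting or escaping
check_call(["tar", "-cvf", tmpname, filename])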
The call method in the subprocess module should be passed a list of strings.
On the command line you would call
tar -cvf "file folder with space/"
The following is equivalent in python
call(["tar", "-cvf", "file folder with space/"])
If instead you passed the whole thing as a single string without shell=True, e.g.
call("tar -cvf 'file folder with space/'")
then Python would look for a program whose exact name is tar -cvf 'file folder with space/'.
Passing a list also avoids string concatenation, which makes for cleaner code.

os.system: saving shell variables with multiple commands in one method

I am having a problem using my command/commands with one instance of os.system.
Unfortunately I have to use os.system, as I have no control over this; I just send the string to the os.system method. I know I should really use the subprocess module for this, but that isn't an option.
So here is what I am trying to do.
I have a string like below:
cmd = "export BASE_PATH=`pwd`; export fileList=`python OutputString.py`; ./myscript --files ${fileList}; cp outputfile $BASE_PATH/.;"
This command then gets sent to the os.system module like so
os.system(cmd)
Unfortunately, when I consult my log file I get something that looks like this:
os.system(r"""export BASE_PATH=/tmp/bla/bla; export fileList=; ./myscript --files ; cp outputfile /.;""")
As you can see, BASE_PATH seems to be set, but when I use it in cp outputfile $BASE_PATH/. it comes out as an empty string.
My fileList is also empty, even though fileList=`python OutputString.py` should put a list of files into that variable.
My thoughts:
Are these bugs due to a new process being started for each command, so that I lose the BASE_PATH variable by the time the next command runs?
I am also not sure why fileList is empty.
Is there a solution to my above problem using os.system and my command string?
Please Note I have to use os.system module. This is out of my control.
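For what it's worth, a minimal sketch (not the asker's actual script) showing the process behaviour in question: variables survive between ;-separated commands inside a single os.system() call, because they all run in the same shell, but not between separate calls.
import os

# Separate calls: each one starts a fresh shell, so BASE_PATH is gone.
os.system("export BASE_PATH=`pwd`")
os.system("echo BASE_PATH is: $BASE_PATH")   # prints an empty value

# One call, several commands: the variable is still visible.
os.system("export BASE_PATH=`pwd`; echo BASE_PATH is: $BASE_PATH")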

writing command line output to file

I am writing a script to clean up my desktop, moving files based on file type. The first step, it would seem, is to ls -1 /Users/user/Desktop (I'm on Mac OSX). So, using Python, how would I run a command, then write the output to a file in a specific directory? Since this will be undocumented, and I'll be the only user, I don't mind (prefer?) if it uses os.system().
You can redirect standard output to any file using > in the command.
$ ls /Users/user/Desktop > out.txt
Using python,
os.system('ls /Users/user/Desktop > out.txt')
However, since you are using Python, instead of the ls command you can use os.listdir to list all the files in the directory.
import os

path = '/Users/user/Desktop'
files = os.listdir(path)
print(files)
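And since the original goal was to get that listing into a file, a small sketch (out.txt is an arbitrary name) that does so without shelling out at all:
import os

path = '/Users/user/Desktop'
files = os.listdir(path)              # list the directory first
with open('out.txt', 'w') as f:
    f.write('\n'.join(files))         # one name per line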
Skimming the Python documentation: to run a shell command and obtain its output you can use the subprocess module with the check_output method.
After that you can simply write the output to a file with the standard Python IO functions: File IO in python.
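A minimal sketch of that approach (the output file name is just an example):
import subprocess

# check_output returns the command's output as bytes
listing = subprocess.check_output(["ls", "-1", "/Users/user/Desktop"])
with open("desktop_listing.txt", "wb") as f:
    f.write(listing)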
To open a file, you can use f = open('/Path/To/File', 'type'), where 'type' is r for reading, w for writing, and a for appending. The commands to read and write are f.read() and f.write('content_to_write'). To get the output from a command-line command, you have to use popen or subprocess instead of os.system(), because os.system() only returns the command's exit status, not its output. You can read more on popen here.
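For example, a short sketch using os.popen, which (unlike os.system) hands you the command's output as a file-like object:
import os

p = os.popen('ls -1 /Users/user/Desktop')
listing = p.read()     # the command's standard output as a string
p.close()

with open('out.txt', 'w') as f:
    f.write(listing)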
