I am trying to tail a file's contents from Python. The code I am using is below:
#! /usr/bin/python
import subprocess
import os.path
# Get the file path
filepath = os.path.join(baseDir,"filename.*" + uniqueId)
# Call subprocess and get last 10 lines from file
spTailFile = subprocess.Popen(["tail", "-10", filepath ], stdout=subprocess.PIPE)
tailOutput = spTailFile.communicate()[0]
print tailOutput
The above code throws an error as below:
tail: cannot open `/hostname/user/app/filename.*39102'
I see output if I execute the tail command with the filepath directly in bash.
tail -10 /hostname/user/app/filename.*39102
Why is subprocess passing an extra backtick (`) when executing the tail command?
Update:
I ended up using glob to find the file, as @cdarke had suggested, and then passing the result to the Popen command.
Bash expands the '*'; Popen does not.
Two possibilities:
1. Expand the wildcard within your script (e.g. with the glob module) and pass a filename without '*'.
2. Create a Bash script and call this from python.
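A minimal sketch of option 1, expanding the wildcard with glob before calling tail; the directory, file contents, and uniqueId here are made up for the demonstration:

```python
import glob
import os
import subprocess
import tempfile

# Made-up stand-ins for the question's baseDir and uniqueId.
baseDir = tempfile.mkdtemp()
uniqueId = "39102"
with open(os.path.join(baseDir, "filename.log" + uniqueId), "w") as f:
    f.write("".join("line %d\n" % i for i in range(1, 21)))

# Expand the wildcard in Python; Popen passes arguments to tail verbatim.
matches = glob.glob(os.path.join(baseDir, "filename.*" + uniqueId))
filepath = matches[0]

spTailFile = subprocess.Popen(["tail", "-10", filepath],
                              stdout=subprocess.PIPE)
tailOutput = spTailFile.communicate()[0]
print(tailOutput.decode())
```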
What is this piece of code doing?
with open(temp_path) as f:
    command = "xdg-open"
    subprocess.Popen(
        ["im=$(cat);" + command + " $im; rm -f $im"], shell=True, stdin=f
    )
I'm confused by the subprocess part...
What does the shell script do?
im=$(cat)
uses cat to read the standard input, and assigns the result to the variable im. Since you use stdin=f, that reads the contents of temp_path.
command + " $im;"
executes the command xdg-open with $im as its argument. So this uses the contents of the file as the argument to xdg-open, which opens the file in its default application. Since the argument should be a filename, this implies that temp_path contains a filename.
rm -f $im
deletes the file that was opened.
This seems like a silly way to do this. A better way to write it would be:
import os
import subprocess

with open(temp_path) as f:
    filename = f.read().strip()
command = "xdg-open"
subprocess.call([command, filename])  # call() waits, like the shell's ';' sequencing
os.remove(filename)
Although I haven't seen the rest of the script, I suspect the temp path is also unnecessary when doing it this way -- it seems like it was just using that as a way to get the filename into the shell variable.
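For what it's worth, the original one-liner can be exercised safely by swapping xdg-open for echo (a stand-in, so it runs without a desktop session); temp_path holds the name of the file to open:

```python
import os
import subprocess
import tempfile

# echo stands in for xdg-open so this runs headlessly.
d = tempfile.mkdtemp()
target = os.path.join(d, "photo.png")
open(target, "w").close()
temp_path = os.path.join(d, "temp.txt")
with open(temp_path, "w") as f:
    f.write(target)  # temp_path contains a filename, as the answer deduces

command = "echo"
with open(temp_path) as f:
    p = subprocess.Popen(
        ['im=$(cat); ' + command + ' "$im"; rm -f "$im"'],
        shell=True, stdin=f, stdout=subprocess.PIPE,
    )
out = p.communicate()[0]
print(out.decode())
```

The shell reads the filename from stdin, "opens" it (echoes it here), then deletes it.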
I am trying to find the path to a file named 'config.txt' on a USB flash drive plugged into a Raspberry Pi. The physical drive that is used may not always be the same, so the path may not always be the same. So I use
find /media/pi/*/config.txt
to locate the path, and it works just fine in the terminal.
Now I go to use check_output and get a giant string of paths.
from subprocess import check_output
cmd = ['find', '/media/pi/*/config.txt']
out = check_output(cmd,shell=True)
I set shell to True to allow wildcards, according to https://docs.python.org/2/library/subprocess.html
Results for out are:
'.\n./.Xauthority\n./.xsession-errors\n./Public\n./.dmrc\n./Downloads\n./test.sh\n./.idlerc\n./.idlerc/recent-files.lst\n./.idlerc/breakpoints.lst\n./.asoundrc\n./.bash_logout\n./.profile\n./Templates\n./Music\n./.bash_history\n./Videos\n./.local\n./.local/share\n./.local/share/gvfs-metadata\n./.local/share/gvfs-metadata/home\n./.local/share/gvfs-metadata/home-d6050e94.log\n./.local/share/applications\n./.local/share/recently-used.xbel\n./.local/share/Trash\n.....
And it continues on for a while.
I tried looking at several other similar questions including the link below, but no luck.
Store output of subprocess.Popen call in a string
You would need to pass a single string exactly as you would from your shell if you want to use a wildcard:
from subprocess import check_output
cmd = 'find /media/pi/*/config.txt'
out = check_output(cmd,shell=True)
You don't actually need subprocess at all, glob will do what you want:
from glob import glob
files = glob('/media/pi/*/config.txt')
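A quick check of what glob returns, using a throwaway directory standing in for /media/pi:

```python
import os
import tempfile
from glob import glob

# Throwaway tree standing in for /media/pi/<drive>/config.txt.
root = tempfile.mkdtemp()
drive = os.path.join(root, "MYUSB")
os.makedirs(drive)
open(os.path.join(drive, "config.txt"), "w").close()

# glob expands the wildcard itself; no shell is involved.
files = glob(os.path.join(root, "*", "config.txt"))
print(files)
```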
Since you are using shell=True, you can also use the following:
from subprocess import check_output
cmd = 'cd /media/pi && find . -iname config.txt'
out = check_output(cmd, shell=True)
Try to avoid wildcards if possible; just change your current working directory before searching for your target file.
I have a Python program that reads files and then tars them into tar balls of a certain size.
One of my files not only has spaces in it but also contains parentheses. I have the following code:
cmd = "/bin/tar -cvf " + tmpname + " '" + filename + "'"
NOTE: Those are single quotes, inside the double-quoted strings, surrounding the filename variable. It's a little difficult to see.
Where tmpname and filename are variables in a for-loop that are subject to change each iteration (irrelevant).
As you can see, I put single quotes around the file name so that the shell (bash) interprets it literally and doesn't attempt variable substitution (which double quotes would allow) or command substitution (which backticks trigger).
As far as I can see, the cmd variable contains the exact syntax for the shell to interpret the command as I want it to. However when I run the following subprocess command substituting the cmd variable:
cmdobj = call(cmd, shell=True)
I get the following output/error:
/bin/tar: 237-r Property Transport Request (PTR) 012314.pdf: Cannot stat: No such file or directory
/bin/tar: Exiting with failure status due to previous errors
unable to tar: 237-r Property Transport Request (PTR) 012314.pdf
I even print the command out to the console before running the subprocess command to see what it will look like when run in the shell, and it's:
cmd: /bin/tar -cvf tempname0.tar '237-r Property Transport Request (PTR) 012314.pdf'
When I run the above command in the shell as is it works just fine. Not really sure what's going on here. Help please!
Pass a list of args without shell=True and the full path to the file if running from a different directory:
from subprocess import check_call
check_call(["tar", "-cvf", tmpname, "Property Transport Request (PTR) 012314.pdf"])
Also use tar, not '/bin/tar'. check_call will raise a CalledProcessError if the command returns a non-zero exit status.
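A self-contained version of the list-of-args approach, with throwaway paths standing in for tmpname and the awkwardly named PDF:

```python
import os
import subprocess
import tempfile

# Throwaway stand-ins for the question's tmpname and filename.
workdir = tempfile.mkdtemp()
filename = "237-r Property Transport Request (PTR) 012314.pdf"
open(os.path.join(workdir, filename), "w").close()
tmpname = os.path.join(workdir, "tempname0.tar")

# One list element per argument: no shell, so no quoting headaches
# from the spaces and parentheses in the filename.
subprocess.check_call(["tar", "-cvf", tmpname, filename], cwd=workdir)

# List the archive to confirm the file made it in.
listing = subprocess.check_output(["tar", "-tf", tmpname])
print(listing.decode())
```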
The call function in the subprocess module should be passed a list of strings.
On the command line you would call
tar -cvf archive.tar "file folder with space/"
The following is equivalent in Python:
call(["tar", "-cvf", "archive.tar", "file folder with space/"])
You are making this call with a single string:
"tar -cvf archive.tar 'file folder with space/'"
If that string is passed without shell=True, Popen looks for a program whose exact name is tar -cvf archive.tar 'file folder with space/'.
This avoids string concatenation, which makes for cleaner code.
I would like to start by saying any help is greatly appreciated. I'm new to Python and scripting in general. I am trying to use a program called samtools view to convert a file from .sam to .bam. I need to do in Python what this Bash command is doing:
samtools view -bS aln.sam > aln.bam
I understand that Bash operators like |, > and < are handled via the subprocess module's stdin, stdout and stderr arguments in Python. I have tried a few different methods and still can't get my Bash command converted correctly. I have tried:
cmd = subprocess.call(["samtools view","-bS"], stdin=open(aln.sam,'r'), stdout=open(aln.bam,'w'), shell=True)
and
from subprocess import Popen
with open(SAMPLE + "." + TARGET + ".sam", 'wb', 0) as input_file:
    with open(SAMPLE + "." + TARGET + ".bam", 'wb', 0) as output_file:
        cmd = Popen([Dir + "samtools-1.1/samtools view", '-bS'],
                    stdin=(input_file), stdout=(output_file), shell=True)
in Python and am still not getting samtools to convert a .sam to a .bam file. What am I doing wrong?
Abukamel is right, but in case you (or others) are wondering about your specific examples....
You're not too far off with your first attempt, just a few minor items:
Filenames should be in quotes
samtools reads from a named input file, not from stdin
You don't need "shell=True" since you're not using shell tricks like redirection
So you can do:
import subprocess
subprocess.call(["samtools", "view", "-bS", "aln.sam"],
                stdout=open('aln.bam', 'w'))
Your second example has more or less the same issues, so would need to be changed to something like:
from subprocess import Popen
with open('aln.bam', 'wb', 0) as output_file:
    cmd = Popen(["samtools", "view", '-bS', 'aln.sam'],
                stdout=output_file)
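The same pattern, demonstrated with sort standing in for samtools (which may not be installed), so the redirect-without-shell idea can actually be run:

```python
import os
import subprocess
import tempfile

# sort stands in for samtools; the point is redirecting stdout
# through a file handle instead of the shell's '>'.
d = tempfile.mkdtemp()
inpath = os.path.join(d, "aln.sam")
outpath = os.path.join(d, "aln.bam")
with open(inpath, "w") as f:
    f.write("b\na\nc\n")

with open(outpath, "w") as output_file:
    subprocess.check_call(["sort", inpath], stdout=output_file)

with open(outpath) as f:
    result = f.read()
print(result)
```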
You can pass execution to the shell with the kwarg shell=True:
subprocess.call('samtools view -bS aln.sam > aln.bam', shell=True)
I am writing a script to clean up my desktop, moving files based on file type. The first step, it would seem, is to ls -1 /Users/user/Desktop (I'm on Mac OSX). So, using Python, how would I run a command, then write the output to a file in a specific directory? Since this will be undocumented, and I'll be the only user, I don't mind (prefer?) if it uses os.system().
You can redirect standard output to any file using > in the command:
$ ls /Users/user/Desktop > out.txt
Using python,
os.system('ls /Users/user/Desktop > out.txt')
However, since you are using Python, instead of the ls command you can use os.listdir to list all the files in the directory:
path = '/Users/user/Desktop'
files = os.listdir(path)
print files
After skimming the Python documentation: to run a shell command and capture its output, you can use the subprocess module's check_output function.
After that you can simply write that output to a file with the standard Python I/O functions: File IO in python.
To open a file, you can use f = open('/Path/To/File', 'type'), where 'type' is r for reading, w for writing, and a for appending. The methods to read and write are f.read() and f.write('content_to_write'). To get the output of a command-line command, you have to use subprocess (e.g. check_output) or popen instead of os.system(): os.system() returns the command's exit status, not its output. You can read more on popen here.
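Putting those pieces together, a sketch that captures a listing and writes it to a file; a temporary directory stands in for the Desktop path:

```python
import os
import subprocess
import tempfile

# A temp directory stands in for /Users/user/Desktop.
desktop = tempfile.mkdtemp()
open(os.path.join(desktop, "notes.txt"), "w").close()

# check_output returns the command's output, unlike os.system.
listing = subprocess.check_output(["ls", "-1", desktop])

# Write the captured listing to a file with plain file I/O.
outfile = os.path.join(desktop, "listing.txt")
with open(outfile, "w") as f:
    f.write(listing.decode())
```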