I am working on Debian Stable Linux, which is otherwise working very well. I have the following code in a Python script file named "myrev", which reverses the order of lines in a given text file:
#! /usr/bin/python3
import sys

if len(sys.argv) < 2:
    print("Usage: myrev infile")
    sys.exit()

try:
    with open(sys.argv[1], "r") as f:
        lines = f.read().split("\n")
except:
    print("Unable to read infile.")
    sys.exit()

lines.reverse()
print("\n".join(lines))
It works properly and prints out the lines in reverse order if I use the following Linux command:
./myrev infile
However, if I try to redirect the output to the original file with the following command, a blank file is generated:
./myrev infile > infile
After the above command, infile becomes an empty file.
Why can't I redirect output to original file and how can this be solved?
Using > opens the target file for writing, the same as opening it via fopen with mode "w". That immediately truncates the file before your program even launches (it happens while the shell is setting up redirections, before it execs your program). You can't use this approach to read from a file and replace it in a single step. The best you can do is something like:
./myrev infile > infile.tmp && mv -f infile.tmp infile
where, if the command succeeds, you complete the work by replacing the original file with the contents of the new file.
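If you would rather handle this inside the script itself, a minimal sketch (reusing the myrev logic above) is to read the whole file into memory first and only then reopen the same path for writing:

#! /usr/bin/python3
# Sketch: read everything into memory, then reopen the same file for writing.
# This is only safe because the full content is already held in `lines`.
import sys

with open(sys.argv[1], "r") as f:
    lines = f.read().split("\n")

lines.reverse()

with open(sys.argv[1], "w") as f:
    f.write("\n".join(lines))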
I am trying to check for information on a Linux server, specifically whether it has certain disks named 'ocr'.
So I have run a shell command and captured the output in a text file.
Then I will search for the string in the file. But the script below doesn't work.
import os
myCmd = os.popen('ls /dev/asm|grep ocr').read()
print(myCmd)
with open('myCmd') as f:
    if 'ocr' in f.read():
        print("RAC server")
"and capture the output in a text file"
No, you've saved it in a string variable, and then you're trying to open and read a file named myCmd. These likely have different content, because it's not clear that you've actually written any file at all.
You don't need a file for the logic of that code:
if 'ocr' in myCmd:
    print("RAC server")
Also, you really shouldn't be using shell commands if you don't have to:
for f in os.listdir("/dev/asm"):
    if "ocr" in f:
        print("RAC server")
        break
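If all you need is a yes/no check, a small sketch using glob can replace both the shell pipeline and the loop (the /dev/asm path comes from the question):

import glob

# glob does the filtering that `ls /dev/asm | grep ocr` did
if glob.glob("/dev/asm/*ocr*"):
    print("RAC server")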
The problem: I want to iterate over a folder in search of a certain file type, execute each match with a program (passing name.ext as the argument), and then run a Python script that changes the output name used by the first program.
I know there is probably a better way to do the above, but the way I thought of was this:
[BAT]
for /R "C:\..\folder" %%a IN (*.extension) do ( SET name=%%a "C:\...\first_program.exe" "%%a" "C:\...\script.py" "%name%" )
[PY]
import io
import sys

def rename(i):
    name = i
    with open('my_file.txt', 'r') as file:
        data = file.readlines()
    data[40] = '"C:\\\\Users\\\\UserName\\\\Desktop\\\\folder\\\\folder\\\\' + name + '"\n'
    with open('my_file.txt', 'w') as file:
        file.writelines(data)

if __name__ == "__main__":
    rename(sys.argv[1])
Expected result: I want the Python script to change the name each time, but after it is set once in the console it seems to stay the same; the BAT file does not change it, and that bothers me.
PS. If there is a better way, I'll be glad to learn it.
This is the Linux bash version; I am sure you can change the loop etc. to make it work as a batch file. Instead of your *.exe, I use cat as a generic input/output example:
#! /bin/sh
for f in *.txt
do
    suffix=".txt"
    name=${f%$suffix}
    cat $f > tmp.dat
    awk -v myName=$f '{if(NR==5) print $0 myName; else print $0 }' tmp.dat > $name.dat
done
This produces "unique" output *.dat files named after the input *.txt files. The files are treated by cat (virtually your *.exe) and the output is put into a temorary file. Eventually, this is handled by awk changing line 5 here. with the output placed in the unique file, as mentioned above.
I'm trying to automate research on a list of domains I have (the list is a .txt file of about 350-400 lines).
I need to run the same command (which uses a Python script) for each line in the txt file. Something like this:
import os
with open('/home/dogher/Desktop/copia.txt') as f:
    for line in f:
        process(line)
        os.system("/home/dogher/Desktop/theHarvester-master/theHarvester.py -d "(line)" -l 300 -b google -f "(line)".html")
I know the syntax with "os.system" is wrong, but I don't know how to insert the text of the line into the command.
Thanks so much, and sorry for my bad English.
import os
with open('data.txt') as f:
    for line in f:
        os.system('python other.py ' + line)
If the contents of other.py are as follows:
import sys
print sys.argv[1]
then the output of the first code snippet would be the contents of your data.txt.
I hope this is what you wanted. Instead of simply printing with print, you can process your line too.
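For example, other.py could transform the argument instead of echoing it; the uppercasing below is just a placeholder for whatever processing you actually need:

import sys

name = sys.argv[1]
print(name.upper())   # placeholder processing; works under Python 2 and 3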
Due to the Linux tag, I suggest a way to do what you want using bash.
process_file.sh:
#!/bin/bash
#your input file
input=my_file.txt
#your python script
py_script=script.py
# use each line of the file `input` as argument of your script
while read line
do
    python $py_script $line
done < "$input"
You can access the passed line in Python as follows:
script.py:
import sys
print sys.argv[1]
Hope the solution below is helpful for you:
import subprocess

with open('name.txt') as fp:
    for line in fp:
        subprocess.check_output('python name.py {}'.format(line), shell=True)
Sample files I have used:
name.py
import sys
name = sys.argv[1]
print name
name.txt:
harry
kat
patrick
Your approach subjects each line of your file to evaluation by the shell, which will break when (not if) it comes across a line with any of the characters with special meaning to the shell: spaces, quotes, parentheses, ampersands, semicolons, etc. Even if today's input file doesn't contain any such character, your next project will. So learn to do this correctly today:
for line in openfile:
    line = line.rstrip("\n")
    subprocess.call(["/home/dogher/Desktop/theHarvester-master/theHarvester.py",
                     "-d", line, "-l", "300", "-b", "google", "-f", line + ".html"])
Since the command line arguments do not need to be parsed, subprocess will execute your command without involving a shell.
I am a novice to Python scripting. I have a script that I am hoping to run on all files in a directory. I found very helpful advice in this thread. However, I am having difficulty determining how to write the script so that it retrieves, from the command prompt, the name of the file I want to run it on, i.e. "python script.py filename.*". I've tried my best at looking through the Python documentation and the forums on this site and have come up empty (I probably just don't know what keywords I should be searching for).
I am currently able to run my script on one file at a time, and output it with a new file extension, using the following code. I'd like to be able to iterate over the whole directory using 'GENE.*':
InFileName = 'GENE.303'
InFile = open(InFileName, 'r') #opens a pipeline to the file to be read line by line
OutFileName = InFileName + '.phy'
OutFile = open(OutFileName, 'w')
What can I do to the code to allow myself to use an iteration through the directory similar to what is done in this case? Thank you!
You are looking for:
import sys
InFileName = sys.argv[1]
See the documentation.
For something more sophisticated, take a look at the optparse and argparse modules (the latter is preferable but is only available in newer versions of Python).
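For example, a minimal argparse version of the same idea could look like this (a sketch, not tied to the original script):

import argparse

# Take the input file name as a positional argument instead of raw sys.argv
parser = argparse.ArgumentParser()
parser.add_argument("infile", help="file to process")
args = parser.parse_args()

InFileName = args.infile   # plays the same role as sys.argv[1] above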
You have quite a few options to process a list of files using Python:
You can use the shell expansion facilities of your command line to pass more filenames to your script and then iterate the command line arguments:
import sys

def process_file(fname):
    with open(fname) as f:
        for line in f:
            # TODO: implement
            print line

for fname in sys.argv[1:]:
    process_file(fname)
and call it like:
python my_script.py * # expands to all files in the directory
You can also use the glob module to do this expansion:
import glob

for fname in glob.glob('*'):
    process_file(fname)
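Applied to the question, the glob pattern can be narrowed to the GENE files while keeping the original output-naming scheme (a sketch; the per-line work is left as a placeholder):

import glob

for InFileName in glob.glob('GENE.*'):
    OutFileName = InFileName + '.phy'
    with open(InFileName, 'r') as InFile, open(OutFileName, 'w') as OutFile:
        for line in InFile:
            OutFile.write(line)   # placeholder: apply the real conversion here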
I am running 32-bit Windows 7 and Python 2.7.
I am trying to write a command line Python script that can run from CMD. I am trying to assign a value to sys.argv[1]. The aim of my script is to calculate the MD5 hash value of a file. This file will be inputted when the script is invoked in the command line and so, sys.argv[1] should represent the file to be hashed.
Here's my code below:
import sys
import hashlib
filename = sys.argv[1]
def md5Checksum(filePath):
    fh = open(filePath, 'rb')
    m = hashlib.md5()
    while True:
        data = fh.read(8192)
        if not data:
            break
        m.update(data)
    return m.hexdigest()

# print len(sys.argv)
print 'The MD5 checksum of text.txt is', md5Checksum(filename)
Whenever I run this script, I receive an error:
filename = sys.argv[1]
IndexError: list index out of range
To call my script, I have been writing "script.py test.txt" for example. Both the script and the source file are in the same directory. I have tested len(sys.argv) and it only comes back as containing one value, that being the python script name.
Any suggestions? I can only assume it is how I am invoking the code through CMD
You should check that in your registry the way you have associated the files is correct, for example:
[HKEY_CLASSES_ROOT\Applications\python.exe\shell\open\command]
#="\"C:\\Python27\\python.exe\" \"%1\" %*"
The problem is in the registry. Calling python script.py test.txt works, but this is not the solution, especially if you decide to add the script to your PATH and want to use it from other directories as well.
Open RegEdit and navigate to HKEY_CLASSES_ROOT\Applications\python.exe\shell\open\command. Right click on name (Default) and Modify. Enter:
"C:\Python27\python.exe" "%1" %*
Click OK, restart your CMD and try again.
Try running the script using python script.py test.txt; you might have a broken association of the interpreter with the .py extension.
Did you try sys.argv[0]? If len(sys.argv) is 1, then sys.argv[1] would try to access a second, nonexistent item.
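Whatever the cause, a small guard at the top of the script gives a readable message instead of an IndexError when no file name is passed (a sketch; works under Python 2 and 3):

import sys

# bail out early with a usage message instead of crashing on sys.argv[1]
if len(sys.argv) < 2:
    sys.exit('Usage: script.py <file>')

filename = sys.argv[1]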