I have a Python script that writes its output file to a new directory called test, using these two lines:
self.mkdir_p("test") # create directory named "test"
file_out = open("test/"+input,"w")
and the mkdir_p function is as follows:
def mkdir_p(self, path):
    try:
        os.makedirs(path)
    except OSError as exc:
        if exc.errno == errno.EEXIST:
            pass
        else:
            raise
Now, all the files I would like my script to run on are stored in a directory called storage. My question is: how can I run the script from my home directory (where the Python script is located) on every file in storage, and have all the output saved to the test directory as coded in my script?
I tried a naive approach in bash:
#!/bin/bash
# get the directory name where the files are stored (storage)
in=$1
# for files in (storage) directory
for f in $1/*
do
    echo "Processing $f file..."
    ./my_python_script.py $f
done
but it didn't work and threw IOError: No such file or directory: 'test/storage/inputfile.txt'.
I hope I explained my problem clearly enough.
Thanks in advance
$f is storage/inputfile.txt, and your Python script prepends test/ to it, then fails because test/storage does not exist. Either create that directory first, or strip the directory part from the input name before building the output file name.
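For example, a minimal sketch of the second option, reusing the two lines from the question (the variable name input and the mkdir_p helper come from the question's code, and os is assumed to be imported there already): os.path.basename() drops the storage/ prefix so only the bare file name is appended to test/:

self.mkdir_p("test")  # create the "test" directory as before
# "storage/inputfile.txt" -> "inputfile.txt", so the output lands in test/inputfile.txt
file_out = open(os.path.join("test", os.path.basename(input)), "w")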
TL;DR: I want to change the working directory of my script to the script file's location (instead of System32) when I run it by double-clicking on it.
I have a really annoying problem I couldn't solve. I am building a Python script that takes four text files as input and creates some graphs and an Excel sheet from them. I am going to pass the script to a friend, who will copy it into different folders and execute it in those folders just by double-clicking on it. The problem is that when I run the code from cmd everything works fine, but when I double-click it, the directory my code works in changes automatically and the program can't find the four required text files. I am attaching the relevant parts of my code below, along with some screenshots.
(screenshots ss1 and ss2 omitted)
def fileOpenCheckLoad(fname):
    pl = list()
    try:
        fh = open(fname)
    except:
        print("ERROR:" + fname + " is missing. Execution will terminate.")
        x = input("Press enter to quit.")
        quit()
    test1 = fh.readline()
    test2 = fh.readline()
    if test1[6] != fname[5] and test2 != 't x y\n':
        print("ERROR: Check the contents of:" + fname)
        x = input("Press enter to quit.")
        quit()
    count = 0
    for lines in fh:
        count = count + 1
        if count > 2:
            nums = lines.split()
            pl.append((float(nums[2]), float(nums[2])))
    tbr = (pl, count - 2)
    return tbr
# main starts here.
cwd = os.getcwd()
print(cwd)
# In this part we open and load files into the memory. If there is an error we terminate.
(pl1, count1)=fileOpenCheckLoad('point1.txt')
(pl2, count2)=fileOpenCheckLoad('point2.txt')
(pl3, count3)=fileOpenCheckLoad('point3.txt')
(pl4, count4)=fileOpenCheckLoad('point4.txt')
Before calling os.getcwd(), insert this line:
os.chdir(os.path.dirname(os.path.abspath(__file__)))
Explanation
__file__ is a special variable in Python; as described here, "__file__ is the pathname of the file from which the module was loaded"
os.path.abspath returns the absolute path to the input file or directory (I included this because depending on how you load up a Python file, sometimes __file__ will be stored as a relative path, and absolute paths tend to be safer to work with)
os.path.dirname returns the name of the directory which contains the input file (chdir raises an error if we give it the name of a file, so we need to pass it the directory which contains the file)
os.chdir changes the working directory
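Putting it together, a rough sketch of how the top of the main section could look with that line added (everything else exactly as in the question):

import os

# change to the folder containing this script so the four point*.txt files
# are found even when the script is started by double-clicking
os.chdir(os.path.dirname(os.path.abspath(__file__)))

cwd = os.getcwd()
print(cwd)  # now prints the script's folder instead of C:\Windows\System32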
In my code, I create a directory like so:
try:
    os.makedirs(playlist_name)
except OSError as e:
    if e.errno != errno.EEXIST:
        raise
which creates a directory in the place where I run my Python script.
Then I wish to copy three files from the original directory into the newly created directory, like so:
# Copy FFMPEG files into that folder so youtube dl can download the videos as audio tracks
# Tried using os.getcwd() to get full path, same error
shutil.copyfile(os.getcwd() + '\\ffmpeg.exe', os.getcwd() + "\\" + playlist_name)
shutil.copyfile('ffplay.exe', "/" + playlist_name + "/")
shutil.copyfile('ffprobe.exe', "/" + playlist_name + "/")
However, trying to copy those files throws this error:
PermissionError: [Errno 13] Permission denied: 'C:\\Users\\ME\\Documents\\python\\DIRECTORY\\PLAYLIST_NAME_HERE'
I have tried various shutil copy methods with the same error.
EDIT: This is running on windows
Per the copyfile docs:
dst must be the complete target file name; look at shutil.copy() for a copy that accepts a target directory path.
You can't use it the way you would in the shell, naming a source file and a target directory and having it deduce that the file should go into that directory under its original name. You have to name the target file explicitly; otherwise it thinks you're trying to copy onto the directory itself, and unlike a file, a directory can't be replaced by a file without renaming it or deleting the whole directory tree first. To fix it, just repeat the file name in both source and destination:
for filename in ('ffmpeg.exe', 'ffplay.exe', 'ffprobe.exe'):
    shutil.copyfile(filename, os.path.join(playlist_name, filename))
The problem would be more obvious on a UNIX-like system: those systems reject the operation with EISDIR, which Python raises as IsADirectoryError. Windows for some reason uses more general error codes associated with permission/access problems (EACCES and related Windows-specific codes), which Python translates to PermissionError. Windows just isn't telling Python the real problem, and it would introduce all sorts of race conditions if Python tried to check whether the real problem was using a directory as a file just to fix the exception type.
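Alternatively, since the copyfile docs point to shutil.copy(), here is a small sketch of that variant: shutil.copy() does accept a directory as the destination and keeps the source file's base name (this assumes, as in the question, that the exe files sit in the current working directory):

import shutil

for filename in ('ffmpeg.exe', 'ffplay.exe', 'ffprobe.exe'):
    shutil.copy(filename, playlist_name)  # playlist_name is the target directory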
I think this should be easy, but my script doesn't work. It will be easier if I show you what I want. I want a Python script that does the following:
I have a directory like:
boite_noire/
....helloworld/
....test1.txt
....test2.txt
And after running the script I would like something like:
boite_noire/
helloworld/
....test1/
........test1_date.txt
....test2/
........test2_date.txt
and if I add another test1.txt, like:
boite_noire/
helloworld/
....test1/
........test1_date.txt
....test2/
........test2_date.txt
....test1.txt
The next time I run the script:
boite_noire/
helloworld/
....test1/
........test1_date.txt
........test1_date.txt
....test2/
........test2_date.txt
I wrote this script:
But os.walk reads the files inside the directories and then creates a directory named after each file, and I don't want that :(
Can someone help me please?
You could loop through each file and move it into the correct directory. This will work on a Linux system (not sure about Windows - it may be better to use the shutil.move command; a sketch of that follows the explanation below).
import os
import time

d = 'www/boite_noire'
date = time.strftime('%Y_%m_%d_%H_%M_%S')

filesAll = os.listdir(d)
filesValid = [i for i in filesAll if i[-4:] == '.txt']

for f in filesValid:
    newName = f[:-4] + '_' + date + '.txt'
    try:
        os.mkdir('{0}/{1}'.format(d, f[:-4]))
    except:
        print 'Directory {0}/{1} already exists'.format(d, f[:-4])
    os.system('mv {0}/{1} {0}/{2}/{3}'.format(d, f, f[:-4], newName))
This is what the code does:
Find all files in the specified directory
Check that the extension is .txt
For each valid file:
Create a new name by appending the date/time
Create the directory if it does not already exist
Move the file into the directory (renaming it as it is moved)
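If Windows support matters, here is a rough sketch of the same idea using shutil.move and os.makedirs instead of shelling out to mv (Python 3 syntax; the directory name www/boite_noire is reused from the code above):

import os
import shutil
import time

d = 'www/boite_noire'
date = time.strftime('%Y_%m_%d_%H_%M_%S')

for f in os.listdir(d):
    if not f.endswith('.txt'):
        continue
    stem = f[:-4]                           # "test1" from "test1.txt"
    target_dir = os.path.join(d, stem)
    os.makedirs(target_dir, exist_ok=True)  # create test1/ only if it is not there yet
    new_name = '{0}_{1}.txt'.format(stem, date)
    shutil.move(os.path.join(d, f), os.path.join(target_dir, new_name))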
So, I'm writing a Python script that will open a tar file, and if there is a directory in it, the script will open that directory and check for files...
import tarfile

E = raw_input("Enter the tar file name: ")  # take the tar file name from the user
tf = tarfile.open(E)  # open the tar file
Now, what's the best way to check whether 'tf' contains a directory or not? Rather than going to my terminal and running ls there, I want to do something in the same Python script that checks whether there is a directory after extracting the tar.
In Python you can check whether a path exists by using the os.path.exists(f) function, where f is a string holding the path to the file:
import os

path = 'path/filename'
if os.path.exists(path):
    print('it exists')
EDIT:
The tarfile object has a method "getnames()" which gives the paths of all the objects in the tar file.
paths = tf.getnames() #returns list of pathnames
f = paths[1] #say the 2nd element in the list is the file you want
tf.extractfile(f)
Say there's a file named "file1" in directory "S3". Then one of the elements of tf.getnames() will be 'S3/file1'. Then you can extract it.
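To answer the original question about directories more directly, here is a short sketch (assuming tf is the open TarFile from the question): getmembers() returns TarInfo objects, and each TarInfo has an isdir() method:

# collect every member of the archive that is a directory
dirs = [m.name for m in tf.getmembers() if m.isdir()]
if dirs:
    print("the tar contains directories: " + ", ".join(dirs))
else:
    print("no directories in this tar")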
I'm trying to write a function that backs up a directory containing files with different permissions to an archive on Windows XP. I'm using the tarfile module to tar the directory. Currently, as soon as the program encounters a file that does not have read permission, it stops with the error IOError: [Errno 13] Permission denied: 'path to file'. I would like it to instead skip over the files it cannot read rather than end the tar operation. This is the code I am using now:
import os
import tarfile

def compressTar():
    """Build and gzip the tar archive."""
    folder = 'C:\\Documents and Settings'
    tar = tarfile.open("C:\\WINDOWS\\Program\\archive.tar.gz", "w:gz")
    try:
        print "Attempting to build a backup archive"
        tar.add(folder)
    except:
        print "Permission denied attempting to create a backup archive"
        print "Building a limited archive containing files with read permissions."
        for root, dirs, files in os.walk(folder):
            for f in files:
                tar.add(os.path.join(root, f))
            for d in dirs:
                tar.add(os.path.join(root, d))
You should add more try statements:
for root, dirs, files in os.walk(folder):
    for f in files:
        try:
            tar.add(os.path.join(root, f))
        except IOError:
            pass
    for d in dirs:
        try:
            tar.add(os.path.join(root, d), recursive=False)
        except IOError:
            pass
[edit] As TarFile.add is recursive by default, I've added the recursive=False parameter when adding directories; otherwise each directory's contents would end up being added twice (once by the walk and once by the recursive add).
You will need the same try/except blocks in the fallback loop that adds the individual files and subdirectories; right now, if any of them are not readable, your program will crash.
Another option that isn't so reliant on try blocks is to check the permissions before trying to add the file/folder to your tarball. There is a whole question about how to best do this (and some pitfalls to avoid when using Windows): Python - Test directory permissions
The basic pseudo code would be something like:
if folder has read permissions:
    add folder to tarball
else:
    for each item in folder:
        if item has read permission:
            add item to tarball
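A rough translation of that pseudocode into Python, checking each file with os.access() before adding it (the paths are reused from the question, and the linked question's caveat applies: os.access can be unreliable on Windows):

import os
import tarfile

folder = 'C:\\Documents and Settings'
tar = tarfile.open("C:\\WINDOWS\\Program\\archive.tar.gz", "w:gz")

for root, dirs, files in os.walk(folder):
    for f in files:
        path = os.path.join(root, f)
        if os.access(path, os.R_OK):  # skip anything we cannot read
            tar.add(path)

tar.close()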
Just to add to what everyone else has said: Python's built-in hasattr() lets you ask an object whether it has a given attribute, e.g. hasattr(file_object, "read") or hasattr(file_object, "write") on an already opened file object. Note that it inspects the object (not the path string), so it tells you whether the object looks readable or writable, not whether you have permission to open the path in the first place.
hope this helps anyone else out there