check permissions of directories in python - python

I want a Python program that, given a directory, returns all directories within that directory that have 775 (rwxrwxr-x) permissions.
Thanks!

Neither answer recurses, though it's not entirely clear that that's what the OP wants. Here's a recursive approach (untested, but you get the idea):
import os
import stat
import sys

MODE = "775"

def mode_matches(mode, file):
    """Return True if 'file' matches 'mode'.

    'mode' should be an integer representing an octal mode (eg
    int("755", 8) -> 493).
    """
    # Extract the permission bits from the file's (or
    # directory's) stat info.
    filemode = stat.S_IMODE(os.stat(file).st_mode)
    return filemode == mode

try:
    top = sys.argv[1]
except IndexError:
    top = '.'

try:
    mode = sys.argv[2]
except IndexError:
    mode = MODE

# Convert the octal string (e.g. "775") to an integer.
mode = int(mode, 8)

for dirpath, dirnames, filenames in os.walk(top):
    dirs = [os.path.join(dirpath, x) for x in dirnames]
    for dirname in dirs:
        if mode_matches(mode, dirname):
            print(dirname)
Something similar is described in the stdlib documentation for stat.
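As an aside, the stat module also provides stat.filemode() (Python 3.3+), which renders the permission bits in ls -l style; a minimal sketch (the path is just a placeholder):
import os
import stat

mode = os.stat("/tmp").st_mode   # hypothetical path used only for illustration
print(stat.filemode(mode))       # e.g. 'drwxrwxrwt'
print(oct(stat.S_IMODE(mode)))   # e.g. '0o1777'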

Take a look at the os module, in particular os.stat, to examine the permission bits.
import os

for filename in os.listdir(dirname):
    path = os.path.join(dirname, filename)
    if os.path.isdir(path):
        if (os.stat(path).st_mode & 0o777) == 0o775:
            print(path)

Compact generator based on Brian's answer:
import os

(fpath for fpath
 in (os.path.join(dirname, fname) for fname in os.listdir(dirname))
 if os.path.isdir(fpath) and (os.stat(fpath).st_mode & 0o777) == 0o775)

Does it have to be Python?
You can also use find to do this:
find . -perm 775

You can check for 775 permissions on files and directories with the code below:
import os

path = "location of file or directory"
if oct(os.stat(path).st_mode)[-3:] == "775":
    print(path)
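If you prefer the find approach mentioned above but want to drive it from Python, here is a minimal sketch (assuming a POSIX system with find on the PATH):
import subprocess

# Directories under the current directory with exactly 775 permissions.
output = subprocess.check_output(["find", ".", "-type", "d", "-perm", "775"])
for line in output.decode().splitlines():
    print(line)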

Related

Edit identical line in several files [duplicate]

I need to iterate through all .asm files inside a given directory and do some actions on them.
How can this be done in an efficient way?
Python 3.6 version of the above answer, using os - assuming that you have the directory path as a str object in a variable called directory_in_str:
import os

directory = os.fsencode(directory_in_str)

for file in os.listdir(directory):
    filename = os.fsdecode(file)
    if filename.endswith(".asm") or filename.endswith(".py"):
        # print(os.path.join(directory, filename))
        continue
    else:
        continue
Or recursively, using pathlib:
from pathlib import Path

pathlist = Path(directory_in_str).glob('**/*.asm')
for path in pathlist:
    # because path is a Path object, not a string
    path_in_str = str(path)
    # print(path_in_str)
You can use rglob to drop the leading '**/': rglob('*.asm') is equivalent to glob('**/*.asm').
This is like calling Path.glob() with '**/' added in front of the given relative pattern:
from pathlib import Path

pathlist = Path(directory_in_str).rglob('*.asm')
for path in pathlist:
    # because path is a Path object, not a string
    path_in_str = str(path)
    # print(path_in_str)
Original answer:
import os

for filename in os.listdir("/path/to/dir/"):
    if filename.endswith(".asm") or filename.endswith(".py"):
        # print(os.path.join(directory, filename))
        continue
    else:
        continue
This will iterate over all descendant files, not just the immediate children of the directory:
import os

for subdir, dirs, files in os.walk(rootdir):
    for file in files:
        # print(os.path.join(subdir, file))
        filepath = subdir + os.sep + file
        if filepath.endswith(".asm"):
            print(filepath)
You can try using the glob module:
import glob

for filepath in glob.iglob('my_dir/*.asm'):
    print(filepath)
and since Python 3.5 you can search subdirectories as well:
glob.glob('**/*.txt', recursive=True) # => ['2.txt', 'sub/3.txt']
From the docs:
The glob module finds all the pathnames matching a specified pattern according to the rules used by the Unix shell, although results are returned in arbitrary order. No tilde expansion is done, but *, ?, and character ranges expressed with [] will be correctly matched.
Since Python 3.5, things are much easier with os.scandir() and 2-20x faster (source):
import os

with os.scandir(path) as it:
    for entry in it:
        if entry.name.endswith(".asm") and entry.is_file():
            print(entry.name, entry.path)
Using scandir() instead of listdir() can significantly increase the performance of code that also needs file type or file attribute information, because os.DirEntry objects expose this information if the operating system provides it when scanning a directory. All os.DirEntry methods may perform a system call, but is_dir() and is_file() usually only require a system call for symbolic links; os.DirEntry.stat() always requires a system call on Unix but only requires one for symbolic links on Windows.
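The scandir answer above only lists a single directory; here is a rough recursive sketch along the same lines (the walk_asm name and the .asm filter are illustrative assumptions, not part of the original answer):
import os

def walk_asm(path):
    """Yield paths of .asm files under 'path', recursing with os.scandir()."""
    with os.scandir(path) as it:
        for entry in it:
            if entry.is_dir(follow_symlinks=False):
                yield from walk_asm(entry.path)
            elif entry.is_file() and entry.name.endswith(".asm"):
                yield entry.path

for p in walk_asm("."):
    print(p)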
Python 3.4 and later offer pathlib in the standard library. You could do:
from pathlib import Path

asm_pths = [pth for pth in Path.cwd().iterdir()
            if pth.suffix == '.asm']
Or if you don't like list comprehensions:
asm_pths = []
for pth in Path.cwd().iterdir():
    if pth.suffix == '.asm':
        asm_pths.append(pth)
Path objects can easily be converted to strings.
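For example, a quick illustration (the path shown is just a placeholder):
import os
from pathlib import Path

p = Path("src/main.asm")  # hypothetical path
print(str(p))             # 'src/main.asm'
print(os.fspath(p))       # same string, via the os.PathLike protocol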
Here's how I iterate through files in Python:
import os

path = 'the/name/of/your/path'
folder = os.fsencode(path)

filenames = []

for file in os.listdir(folder):
    filename = os.fsdecode(file)
    if filename.endswith(('.jpeg', '.png', '.gif')):  # whatever file types you're using...
        filenames.append(filename)

filenames.sort()  # now you have the filenames and can do something with them
NONE OF THESE TECHNIQUES GUARANTEES ANY ITERATION ORDER.
Yup, super unpredictable. Notice that I sort the filenames, which is important if the order of the files matters, e.g. for video frames or time-dependent data collection. Be sure to put indices in your filenames, though!
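If the filenames carry numeric indices (say frame_2.png, frame_10.png), plain string sorting puts frame_10 before frame_2; here is a small sketch of sorting by the embedded index (the frame_N.png naming is only an assumption for illustration):
import re

filenames = ['frame_10.png', 'frame_2.png', 'frame_1.png']

def frame_index(name):
    # Pull the first run of digits out of the name; fall back to 0 if none.
    m = re.search(r'\d+', name)
    return int(m.group()) if m else 0

filenames.sort(key=frame_index)
print(filenames)  # ['frame_1.png', 'frame_2.png', 'frame_10.png']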
You can use glob to refer to the directory and list it:
import glob
import os

# to get the current working directory name
cwd = os.getcwd()

# Load the images from the images folder.
for f in glob.glob('images/*.jpg'):
    dir_name = get_dir_name(f)  # get_dir_name: the author's own helper, not shown here
    image_file_name = dir_name + '.jpg'
    # To print the file name with path (the path will be a string)
    print(image_file_name)
To get a list of all entries in a directory as an array, you can use os:
os.listdir(directory)
I'm not quite happy with this implementation yet; I wanted to have a custom constructor that does DirectoryIndex._make(next(os.walk(input_path))) so that you can just pass the path you want a file listing for. Edits welcome!
import collections
import os

DirectoryIndex = collections.namedtuple('DirectoryIndex', ['root', 'dirs', 'files'])

index = DirectoryIndex(*next(os.walk('.')))
for file_name in index.files:
    file_path = os.path.join(index.root, file_name)
I really like using the scandir function that is built into the os library. Here is a working example:
import os

i = 0
with os.scandir('/usr/local/bin') as root_dir:
    for path in root_dir:
        if path.is_file():
            i += 1
            print(f"Full path is: {path} and just the name is: {path.name}")
print(f"{i} files scanned successfully.")
Get all the .asm files in a directory by doing this.
import os

path = "path_to_file"
file_type = '.asm'

for filename in os.listdir(path=path):
    if filename.endswith(file_type):
        print(filename)
        print(f"{path}/{filename}")
        # do something below
I don't understand why some answers are complicated. This is how I would do it with Python 2.7. Replace DIRECTORY_TO_LOOP with the directory you want to use.
import os

DIRECTORY_TO_LOOP = '/var/www/files/'

for root, dirs, files in os.walk(DIRECTORY_TO_LOOP, topdown=False):
    for name in files:
        print(os.path.join(root, name))

files not deleted when using os module

I tried to make a program which deletes all of the empty files (whose size is zero). Then I ran the program by dragging the script file into the command prompt and running it.
However, no empty files were deleted (but I do have some of them).
Please help me find the error in my code.
import os

a = os.listdir('C:\\Python27')
for folder in a:
    sizes = os.stat('C:\\Python27')
    b = sizes.st_size
    s = folder
    if b == 0:
        remove('C:\\Python27\s')
You're assigning each value from the iterator os.listdir returns to folder, yet you aren't using it at all in os.stat or os.remove; instead you are passing them fixed values.
You should do something like this:
import os

dir = 'C:\\Python27'
for file_name in os.listdir(dir):
    file_path = os.path.join(dir, file_name)
    if os.stat(file_path).st_size == 0:
        os.remove(file_path)
You can delete the empty files with something like the following code, and you may want to add some exception handling. I have used a test folder name to demonstrate.
import os
import sys

dir = 'c:/temp/testfolder'

for root, dirs, files in os.walk(dir):
    for file in files:
        fname = os.path.join(root, file)
        try:
            if os.path.getsize(fname) == 0:
                print("Removing file %s" % (fname))
                os.remove(fname)
        except:
            print("error: unable to remove 0 byte file")
            raise
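For comparison, a pathlib-based sketch of the same idea, offered only as an alternative (the folder path is just a placeholder):
from pathlib import Path

folder = Path('c:/temp/testfolder')  # placeholder path
for p in folder.iterdir():
    if p.is_file() and p.stat().st_size == 0:
        p.unlink()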

How to find all files with a particular extension? [duplicate]

I am trying to find all the .c files in a directory using Python.
I wrote this, but it just returns all files - not just .c files:
import os
import re

results = []
for folder in gamefolders:
    for f in os.listdir(folder):
        if re.search('.c', f):
            results += [f]
print results
How can I just get the .c files?
Try changing the inner loop to something like this:
results += [each for each in os.listdir(folder) if each.endswith('.c')]
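In the context of the question's own loop, that would look roughly like this (a sketch that keeps the gamefolders variable from the question):
import os

results = []
for folder in gamefolders:
    results += [each for each in os.listdir(folder) if each.endswith('.c')]
print(results)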
Try "glob":
>>> import glob
>>> glob.glob('./[0-9].*')
['./1.gif', './2.txt']
>>> glob.glob('*.gif')
['1.gif', 'card.gif']
>>> glob.glob('?.gif')
['1.gif']
KISS
# KISS
import os

results = []
for folder in gamefolders:
    for f in os.listdir(folder):
        if f.endswith('.c'):
            results.append(f)
print(results)
There is a better solution than directly using regular expressions: the standard library's fnmatch module for dealing with file name patterns. (See also the glob module.)
Write a helper function:
import fnmatch
import os

def listdir(dirname, pattern="*"):
    return fnmatch.filter(os.listdir(dirname), pattern)
and use it as follows:
result = listdir("./sources", "*.c")
for _, _, filenames in os.walk(folder):
    for file in filenames:
        fileExt = os.path.splitext(file)[-1]
        if fileExt == '.c':
            results.append(file)
For another alternative, you could use fnmatch:
import fnmatch
import os

results = []
for root, dirs, files in os.walk(path):
    for _file in files:
        if fnmatch.fnmatch(_file, '*.c'):
            results.append(os.path.join(root, _file))
print(results)
or with a list comprehension:
for root, dirs, files in os.walk(path):
    [results.append(os.path.join(root, _file))
     for _file in files
     if fnmatch.fnmatch(_file, '*.c')]
or using filter:
for root, dirs, files in os.walk(path):
    [results.append(os.path.join(root, _file))
     for _file in fnmatch.filter(files, '*.c')]
Change the directory to the given path so that you can search for files within it. If you don't change the directory, this code will search for files in your current directory location:
import os    # importing the os library
import glob  # importing the glob library

path = raw_input()  # input from the user
os.chdir(path)
filedata = glob.glob('*.c')  # all files with the .c extension are stored in filedata
print filedata
import os
import re

cfile = re.compile(r"^.*?\.c$")
results = []

for name in os.listdir(directory):
    if cfile.match(name):
        results.append(name)
The implementation of shutil.copytree is in the docs. I modified it to take a list of extensions to INCLUDE.
import os
from shutil import copy2, copystat, Error

def my_copytree(src, dst, symlinks=False, *extentions):
    """I modified the 2.7 implementation of shutil.copytree
    to take a list of extensions to INCLUDE, instead of an ignore list.
    """
    names = os.listdir(src)
    os.makedirs(dst)
    errors = []
    for name in names:
        srcname = os.path.join(src, name)
        dstname = os.path.join(dst, name)
        try:
            if symlinks and os.path.islink(srcname):
                linkto = os.readlink(srcname)
                os.symlink(linkto, dstname)
            elif os.path.isdir(srcname):
                my_copytree(srcname, dstname, symlinks, *extentions)
            else:
                ext = os.path.splitext(srcname)[1]
                if ext not in extentions:
                    # skip the file
                    continue
                copy2(srcname, dstname)
            # XXX What about devices, sockets etc.?
        except (IOError, os.error), why:
            errors.append((srcname, dstname, str(why)))
        # catch the Error from the recursive copytree so that we can
        # continue with other files
        except Error, err:
            errors.extend(err.args[0])
    try:
        copystat(src, dst)
    # except WindowsError:  # can't copy file access times on Windows
    #     pass
    except OSError, why:
        errors.extend((src, dst, str(why)))
    if errors:
        raise Error(errors)
Usage: For example, to copy only .config and .bat files....
my_copytree(source, targ, '.config', '.bat')
This is pretty clean; the commands come from the os library.
This code will search through the current working directory and list only the specified file type. You can change this by replacing os.getcwd() with your target directory, and choose the file type by replacing '(ext)'. os.fsdecode is used so you don't get a bytewise error from .endswith(). It also sorts alphabetically; you can remove sorted() for the raw list.
import os

filenames = sorted([os.fsdecode(file) for file in os.listdir(os.getcwd())
                    if os.fsdecode(file).endswith(".(ext)")])
Here's yet another solution, using pathlib (and Python 3):
from pathlib import Path

gamefolder = "path/to/dir"
result = sorted(Path(gamefolder).glob("**/*.c"))
Notice the double asterisk (**) in the glob() argument. This will search the gamefolder as well as its subdirectories. If you only want to search the gamefolder, use a single * in the pattern: "*.c". For more details, see the documentation.
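As a small aside, Path.rglob gives the same recursive behaviour without spelling out the '**/' prefix:
from pathlib import Path

result = sorted(Path(gamefolder).rglob("*.c"))  # equivalent to glob("**/*.c")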
If you replace '.c' with '[.]c$', you're searching for files that have .c as the last two characters of the name, rather than all files that contain a c with at least one character before it.
Edit: Alternatively, match f[-2:] with '.c'; this MAY be computationally cheaper than pulling out a regexp match.
Just to be clear, if you wanted the dot character in your search term, you could've escaped it too: '.*\.c' would give you what you needed, plus you would need to use something like results.append(f) instead of what you had listed as results += [f].
This function returns a list of all file names with the specified extension that live in the specified directory:
import os

def listFiles(path, extension):
    return [f for f in os.listdir(path) if f.endswith(extension)]

print(listFiles('/Path/to/directory/with/files', '.txt'))
If you want to list all files with the specified extension in a certain directory and its subdirectories you could do:
import os

def filterFiles(path, extension):
    return [file for root, dirs, files in os.walk(path) for file in files if file.endswith(extension)]

print(filterFiles('/Path/to/directory/with/files', '.txt'))
You can actually do this with just os.listdir:
import os

results = [f for folder in gamefolders
           for f in os.listdir(folder) if f.endswith('.c')]

Find a file in python

I have a file that may be in a different place on each user's machine. Is there a way to implement a search for the file? A way that I can pass the file's name and the directory tree to search in?
os.walk is the answer; this will find the first match:
import os

def find(name, path):
    for root, dirs, files in os.walk(path):
        if name in files:
            return os.path.join(root, name)
And this will find all matches:
def find_all(name, path):
    result = []
    for root, dirs, files in os.walk(path):
        if name in files:
            result.append(os.path.join(root, name))
    return result
And this will match a pattern:
import os, fnmatch

def find(pattern, path):
    result = []
    for root, dirs, files in os.walk(path):
        for name in files:
            if fnmatch.fnmatch(name, pattern):
                result.append(os.path.join(root, name))
    return result

find('*.txt', '/path/to/dir')
In Python 3.4 or newer you can use pathlib to do recursive globbing:
>>> import pathlib
>>> sorted(pathlib.Path('.').glob('**/*.py'))
[PosixPath('build/lib/pathlib.py'),
 PosixPath('docs/conf.py'),
 PosixPath('pathlib.py'),
 PosixPath('setup.py'),
 PosixPath('test_pathlib.py')]
Reference: https://docs.python.org/3/library/pathlib.html#pathlib.Path.glob
In Python 3.5 or newer you can also do recursive globbing like this:
>>> import glob
>>> glob.glob('**/*.txt', recursive=True)
['2.txt', 'sub/3.txt']
Reference: https://docs.python.org/3/library/glob.html#glob.glob
I used a version of os.walk and, on a larger directory, got times around 3.5 seconds. I tried two random solutions with no great improvement, then just did:
import subprocess

paths = [line[2:] for line in subprocess.check_output("find . -iname '*.txt'", shell=True).splitlines()]
While it's POSIX-only, I got 0.25 seconds.
From this, I believe it's entirely possible to optimise the whole search a lot in a platform-independent way, but this is where I stopped the research.
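One platform-independent direction would be an os.scandir-based walk that avoids extra stat calls; a rough, untested sketch of the idea (fast_find is a hypothetical name, not from the original answer):
import os

def fast_find(suffix, top):
    """Yield paths under 'top' whose names end with 'suffix', using os.scandir()."""
    stack = [top]
    while stack:
        current = stack.pop()
        with os.scandir(current) as it:
            for entry in it:
                if entry.is_dir(follow_symlinks=False):
                    stack.append(entry.path)
                elif entry.name.endswith(suffix):
                    yield entry.path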
If you are using Python on Ubuntu and you only want it to work on Ubuntu, a substantially faster way is to use the terminal's locate program, like this:
import subprocess

def find_files(file_name):
    command = ['locate', file_name]
    output = subprocess.Popen(command, stdout=subprocess.PIPE).communicate()[0]
    output = output.decode()
    search_results = output.split('\n')
    return search_results
search_results is a list of the absolute file paths. This can be tens of thousands of times faster than the methods above, and for one search I've done it was ~72,000 times faster.
If you are working with Python 2, you have a problem with infinite recursion on Windows caused by self-referring symlinks.
This script will avoid following those. Note that this is Windows-specific!
import os
from scandir import scandir
import ctypes

def is_sym_link(path):
    # http://stackoverflow.com/a/35915819
    FILE_ATTRIBUTE_REPARSE_POINT = 0x0400
    return os.path.isdir(path) and (ctypes.windll.kernel32.GetFileAttributesW(unicode(path)) & FILE_ATTRIBUTE_REPARSE_POINT)

def find(base, filenames):
    hits = []

    def find_in_dir_subdir(direc):
        content = scandir(direc)
        for entry in content:
            if entry.name in filenames:
                hits.append(os.path.join(direc, entry.name))
            elif entry.is_dir() and not is_sym_link(os.path.join(direc, entry.name)):
                try:
                    find_in_dir_subdir(os.path.join(direc, entry.name))
                except UnicodeDecodeError:
                    print "Could not resolve " + os.path.join(direc, entry.name)
                    continue

    if not os.path.exists(base):
        return
    else:
        find_in_dir_subdir(base)

    return hits
It returns a list with all paths that point to files in the filenames list.
Usage:
find("C:\\", ["file1.abc", "file2.abc", "file3.abc", "file4.abc", "file5.abc"])
Below we use a boolean "first" argument to switch between first match and all matches (a default which is equivalent to "find . -name file"):
import os

def find(root, file, first=False):
    for d, subD, f in os.walk(root):
        if file in f:
            print("{0} : {1}".format(file, d))
            if first:
                break
This answer is very similar to existing ones, but slightly optimized.
You can find any files or folders by pattern:
import os

def iter_all(pattern, path):
    return (
        os.path.join(root, entry)
        for root, dirs, files in os.walk(path)
        for entry in dirs + files
        if pattern.match(entry)
    )
either by substring:
def iter_all(substring, path):
    return (
        os.path.join(root, entry)
        for root, dirs, files in os.walk(path)
        for entry in dirs + files
        if substring in entry
    )
or using a predicate:
def iter_all(predicate, path):
    return (
        os.path.join(root, entry)
        for root, dirs, files in os.walk(path)
        for entry in dirs + files
        if predicate(entry)
    )
To search only files or only folders, replace "dirs + files" with, for example, only "dirs" or only "files", depending on what you need.
Regards.
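A brief usage sketch for the pattern variant above (it assumes pattern is a compiled regular expression; the '.txt' filter is just an example):
import re

# Find anything whose name ends in '.txt' under the current directory.
for hit in iter_all(re.compile(r'.*\.txt$'), '.'):
    print(hit)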
SARose's answer worked for me until I updated from Ubuntu 20.04 LTS. The slight change I made to his code makes it work on the latest Ubuntu release.
import subprocess

def find_files(file_name):
    command = ['locate ' + file_name]
    output = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True).communicate()[0]
    output = output.decode()
    search_results = output.split('\n')
    return search_results
@F.M.F.'s answer has a few problems in this version, so I made a few adjustments to make it work.
import os
from os import scandir
import ctypes

def is_sym_link(path):
    # http://stackoverflow.com/a/35915819
    FILE_ATTRIBUTE_REPARSE_POINT = 0x0400
    return os.path.isdir(path) and (ctypes.windll.kernel32.GetFileAttributesW(str(path)) & FILE_ATTRIBUTE_REPARSE_POINT)

def find(base, filenames):
    hits = []

    def find_in_dir_subdir(direc):
        content = scandir(direc)
        for entry in content:
            if entry.name in filenames:
                hits.append(os.path.join(direc, entry.name))
            elif entry.is_dir() and not is_sym_link(os.path.join(direc, entry.name)):
                try:
                    find_in_dir_subdir(os.path.join(direc, entry.name))
                except UnicodeDecodeError:
                    print("Could not resolve " + os.path.join(direc, entry.name))
                    continue
                except PermissionError:
                    print("Skipped " + os.path.join(direc, entry.name) + ". I lacked permission to navigate")
                    continue

    if not os.path.exists(base):
        return
    else:
        find_in_dir_subdir(base)

    return hits
unicode() was changed to str() in Python 3, so I made that adjustment (line 8).
I also added (in line 25) an exception handler for PermissionError. This way, the program won't stop if it finds a folder it can't access.
Finally, I would like to give a little warning. When running the program, even if you are looking for a single file/directory, make sure you pass it as a list. Otherwise, you will get a lot of answers that don't necessarily match your search.
Example of use:
find("C:\\", ["Python", "Homework"])
or
find("C:\\", ["Homework"])
but, for example, find("C:\\", "Homework") will give unwanted answers.
I would be lying if I said I know why this happens. Again, this is not my code and I just made the adjustments I needed to make it work. All credit should go to @F.M.F.
