I would like to rename files in a directory using bash.
I am trying:
find . -type f -exec mv '{}' $(urlencode {}) \;
but urlencode encodes the literal {} instead of taking the result of find.
If I change it to:
find . -type f -exec mv '{}' $(echo {}) \;
then echo prints the result of find.
urlencode is an alias:
$ alias urlencode='python -c "import sys, urllib as ul; \
print ul.quote_plus(sys.argv[1])"'
Can anyone explain this behavior and suggest a solution?
Use bash -c command:
find . -type f -exec bash -c 'mv "$1" "$(urlencode "$1")"' _ {} \;
Here,
find . -type f -exec mv '{}' $(echo {}) \;
the command substitution is unquoted, so it gets expanded in the command line, before find sees it. The resulting command that runs is
find . -type f -exec mv '{}' {} \;
and then find replaces both copies of {} with the current file name. The same happens if the command substitution is double-quoted.
If it's single-quoted, however, then find will expand the {} within it and run commands like mv ./somefile $(echo ./somefile), where the $(echo ./somefile) part is passed to mv as a literal string (it never goes through a shell), so it will not work unless a directory literally named $(echo . exists.
The main point here is that find -exec doesn't run through a shell.
You need to ask for a shell there explicitly. Either once per file
find . -type f -exec sh -c 'mv "$1" "$(urlencode "$1")"' sh {} \;
or one shell for multiple files and a loop to deal with them all
find . -type f -exec sh -c 'for f; do mv "$f" "$(urlencode "$f")"; done' sh {} +
Of course, if urlencode is an alias, you'll have to jump through hoops to get it to work in a noninteractive shell. It's probably better to put it as a script in the PATH, or as an exported function (in which case, run bash -c instead of sh -c).
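For example, a minimal sketch of the exported-function route (the function body here uses Python 3's urllib.parse rather than the Python 2 urllib from the alias above, so treat it as illustrative):

urlencode() {
    # URL-encode the first argument; Python 3 equivalent of the alias above
    python3 -c 'import sys, urllib.parse as ul; print(ul.quote_plus(sys.argv[1]))' "$1"
}
export -f urlencode
find . -type f -exec bash -c 'mv "$1" "$(urlencode "$1")"' bash {} \;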
Related
I am trying to find the invalid JSON files in my directory tree. I have more than 100 JSON files, so I am trying to see if there is an easy way to run some Linux command that tells me which JSON files are invalid. I want to know all of those file names.
I tried this command, but it doesn't print anything to my console, even though I do have a bunch of invalid JSON files.
find . -name \*.json -exec echo {} \; -exec python -m json.tool "{}" \; 2>&1 | grep "No JSON" -B 1
I am trying to run it on my Mac.
In Python you could do:
#!/usr/bin/env python3
from pathlib import Path
import json
# scan subdirs from current directory
for jsonfile in Path(".").glob("**/*.json"):
    try:
        json.load(open(jsonfile))
        print(jsonfile, "success")
    except Exception as e:
        print(jsonfile, "fail", e)
In bash or zsh, using jq to validate JSON files:
find . -name "*.json" -print0 | while IFS= read -d '' -r filename; do
    if ! jq . "$filename" >/dev/null 2>&1; then
        echo "$filename is invalid"
    fi
done
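If you would rather stay with the python -m json.tool check from the question (no jq needed), a sketch of the same per-file test could look like this; it relies on json.tool exiting with a non-zero status for invalid JSON:

find . -name '*.json' -exec sh -c '
    for f; do
        # json.tool exits non-zero when the file is not valid JSON
        python -m json.tool "$f" >/dev/null 2>&1 || echo "$f is invalid"
    done' sh {} +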
I need to create a bash script that creates an individual zip file for each Python file in the same directory.
I found that the following command creates a zip that keeps the original file extension, e.g. it creates test.py.zip for test.py:
find . -name '*.py' -exec zip '{}.zip' '{}' \;
How can I update the command to get rid of the original file extension, e.g. test.zip instead of test.py.zip?
Thanks in advance.
You can strip an extension using basename <file> <extension>. There are a variety of ways you could arrange to do that.
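For example, basename drops both the leading directory and the suffix you pass it:

$ basename ./src/test.py .py
test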
Using a loop in bash
For example, loop through the results from find and then zip each file (note that this simple form breaks if any file name contains whitespace or glob characters):
for f in $(find . -name '*.py')
do
zip "$(basename "$f" .py).zip" "$f"
done
Using a subshell in find
Unfortunately we can't use $(...) in find ... -exec directly. However, we can always invoke a shell and do it there:
find . -name '*.py' -exec sh -c 'zip "$(basename "$0" .py)".zip "$0"' '{}' \;
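A variant of the same idea, sketched with the shell's ${f%.py} suffix removal instead of basename; unlike the basename version it keeps the directory part, so each zip is created next to its source file:

find . -name '*.py' -exec sh -c 'f=$1; zip "${f%.py}.zip" "$f"' sh {} \;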
You can't use command substitution directly inside find's -exec option. You can use bash's while loop and sed to process find's output instead:
find . -name '*.py' -print | while read -r FILE; do zip "$(echo "$FILE" | sed 's/\.py$//').zip" "$FILE"; done
I have a Python script that accepts a -f flag and appends each use of the flag to a list.
For example, if I run python myscript.py -f file1.txt -f file2.txt, I get a list of files, files=['file1.txt', 'file2.txt']. This works great, but I am wondering how I can automatically use the results of a find command to append as many -f flags as there are files.
I've tried:
find ./ -iname '*.txt' -print0 | xargs python myscript.py -f
But it only grabs the first file.
With the caveat that this will fail if there are more files than will fit on a single command line (whereas xargs would run myscript.py multiple times, each with a subset of the full list of arguments):
#!/usr/bin/env bash
args=( )
while IFS= read -r -d '' name; do
  args+=( -f "$name" )
done < <(find . -iname '*.txt' -print0)
python myscript.py "${args[@]}"
If you want to do this safely in a way that tolerates an arbitrary number of filenames, you're better off using a long-form option -- such as --file rather than -f -- with the = separator allowing the individual name to be passed as part of the same argv entry, thus preventing xargs from splitting a filename apart from the sigil that precedes it:
#!/usr/bin/env bash
# This requires -printf, a GNU find extension
find . -iname '*.txt' -printf '--file=%p\0' | xargs -0 python myscript.py
...or, more portably (running on MacOS, albeit still requiring a shell -- such as bash -- that can handle NUL-delimited reads):
#!/usr/bin/env bash
# requires find -print0 and xargs -0; these extensions are available on BSD as well as GNU
find . -iname '*.txt' -print0 |
  while IFS= read -r -d '' f; do printf '--file=%s\0' "$f"; done |
  xargs -0 python myscript.py
Your title seems to imply that you can modify the script. In that case, use the nargs (number of args) option to allow more arguments for the -f flag:
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--files', '-f', nargs='+')
args = parser.parse_args()
print(args.files)
Then you can use your find command easily:
15:44 $ find . -depth 1 | xargs python args.py -f
['./args.py', './for_clint', './install.sh', './sys_user.json']
Otherwise, if you can't modify the script, see @CharlesDuffy's answer.
I have a Python script that takes multiple input files and merges them into one single output file. I want to create a bash script that adds the input files automatically, without having to manually write infile1 infile2, etc. Below is what I came up with:
FILE= `find ~/Desktop/folder -name '*.tif'`
for i in $FILE
do
gdal_merge.py -o mosaic -of GTiff $i
done
But for some reason I am getting this error:
Syntax error: word unexpected (expecting ")")
This might work:
FILE=`find ~/Desktop/folder -name '*.tif'`
gdal_merge.py -o mosaic -of GTiff "$FILE"
You could try the -exec option to find:
find ~/Desktop/folder -name '*.tif' -exec gdal_merge.py -o mosaic -of GTiff {} +
My guess is that some of your files contain special characters like ( or ) or whitespace characters that would cause trouble. In general, the -x option will show you what's going on: either run bash -x my_script or add set -x at the beginning of the script.
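For example, a minimal sketch of both variants:

bash -x my_script     # trace an existing script without editing it
# or add this near the top of the script itself:
set -x                # prints each command (prefixed with +) before it runs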
As an alternative that's somewhat better at dealing with special characters, try this:
find ~/Desktop/folder -name '*.tif' -print0 | xargs -0 -n1 gdal_merge.py -o mosaic -of GTiff
I have a list of files that I can obtain using the UNIX 'find' command such as:
$ find . -name "*.txt"
foo/foo.txt
bar/bar.txt
How can I pass this output into a Python script like hello.py so I can parse it using Python's argparse library?
Thanks!
If you want just text output of find(1), then use a pipe:
~$ find . -name "*.txt" | python hello.py
If you are looking to pass list of files as arguments to the script, use xargs(1):
~$ find . -name "*.txt" -print0 | xargs -0 python hello.py
or use -exec option of find(1).
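For example, a sketch of the -exec form (hello.py is the script name from the question):

find . -name "*.txt" -exec python hello.py {} +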
Use xargs:
find . -name "*.txt" | xargs python -c 'import sys; print sys.argv[1:]'
From man find:
-exec command ;
       Execute command; true if 0 status is returned. All following
       arguments to find are taken to be arguments to the command until
       an argument consisting of `;' is encountered. The string `{}'
       is replaced by the current file name being processed everywhere
       it occurs in the arguments to the command, not just in arguments
       where it is alone, as in some versions of find. Both of these
       constructions might need to be escaped (with a `\') or quoted to
       protect them from expansion by the shell. See the EXAMPLES
       section for examples of the use of the -exec option. The
       specified command is run once for each matched file. The command
       is executed in the starting directory. There are unavoidable
       security problems surrounding use of the -exec action; you
       should use the -execdir option instead.

-exec command {} +
       This variant of the -exec action runs the specified command on
       the selected files, but the command line is built by appending
       each selected file name at the end; the total number of
       invocations of the command will be much less than the number of
       matched files. The command line is built in much the same way
       that xargs builds its command lines. Only one instance of `{}'
       is allowed within the command. The command is executed in the
       starting directory.
So you can do
find . -name "*.txt" -exec python myscript.py {} +
This helps if you need to pass additional arguments after the list of file names from the find output:
$ python hello.py `find . -name "*.txt"`
I used it to concatenate PDF files into a single one:
$ pdfunite `find . -name "*.pdf" | sort` all.pdf