How to edit argument format - Python

I have a Python script. It is called with quoted arguments, like this:
./script.py 'arg1' 'arg2 stillarg2'
Inside the Python script there are two arguments.
Now I want to call it from a bash script:
#!/bin/bash
./script.py ${1} ${2} ${3}
I want to call the bash script with this syntax:
./script.sh arg1 arg2 stillarg2
The bash script gets three arguments and no quotes.
So is there a way, using a bash script, to call the Python script with the right arguments, given three unquoted arguments?
I've tried:
#!/bin/bash
./script.py "'${1}'" "'${2} ${3}'"
The outcome is:
operation not support

You just want to combine $2 and $3 into the same argument, no?
#!/bin/bash
./script.py "${1}" "${2} ${3}"

Related

Python not getting arguments passed to it from shell script

I have a shell script that calls various Python scripts and passes along an argument it received. However, the Python scripts are currently not getting the argument; when I inspect sys.argv it shows something completely different. I have the code and the sys.argv output below:
Shell Script (called run_everything.sh) :
#!/bin/bash
source venv/bin/activate
python3 eval_otare_on_otare.py $1
Then I have a line in eval_otare_on_otare.py that prints the arguments passed:
print(sys.argv)
And I get the following list:
['eval_care_on_otare.py', 'run_everything.sh']
What can I do? sys.argv is clearly not getting what I want; I want it to return
['eval_care_on_otare.py', $1]
where $1 is the argument passed.
If your activate script messes up the value of $1, save it first:
#!/bin/bash
arg=$1
source ./venv/bin/activate
python3 eval_otare_on_otare.py "$arg"
Tangentially, see also When to wrap quotes around a shell variable?
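Those double quotes around $arg matter whenever the argument can contain spaces. A minimal demonstration, using a python3 one-liner as a stand-in script:
arg="two words"
python3 -c 'import sys; print(sys.argv[1:])' $arg
# -> ['two', 'words']   (unquoted: split into two arguments)
python3 -c 'import sys; print(sys.argv[1:])' "$arg"
# -> ['two words']      (quoted: passed through as one argument)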

How can I pass command line arguments containing braces using sun grid engine qsub?

I have a Python script that I would like to run on sun grid engine, and this script accepts a string command line argument that might contain braces. For instance, the script could be script.py:
import sys
print(sys.argv[1])
If I run python script.py aaa{ the output is aaa{, and if I run python script.py aaa{} the output is aaa{}. These are both the desired behavior.
However, if I run qsub -b y -cwd python script.py aaa{ the job fails with error Missing }., and if I run qsub -b y -cwd python script.py aaa{} the job succeeds but outputs aaa. This is not the desired behavior.
My hypothesis is that qsub does some preprocessing of the command line arguments to my script, but I don't want it to do this. Is there any way to make qsub pass command line arguments to my script as is, regardless of whether they contain braces or not?
The simplest solution would be to use
echo "python script.py aaa{}" | qsub -cwd
You could also create a submit file containing the following:
#!/bin/bash
#$ -cwd
python ./script.py "${input}"
Then, you can pass your input via qsub -v input=aaa{} script.submit
Both variants require omitting -b y.
I was able to solve my problem by running qsub -b y -cwd -shell no python script.py aaa{} instead of qsub -b y -cwd python script.py aaa{}. On my system, -shell yes seemed to be enabled by default, which initiated some preprocessing. Adding -shell no appears to fix this.
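The Missing }. message is the classic csh brace-expansion error, which hints that the shell started by -shell yes was (t)csh rather than bash on that system. A local reproduction sketch, assuming csh is installed:
csh -c 'echo aaa{'
# -> Missing }.   (csh rejects an unmatched brace, like the failed job)
bash -c 'echo aaa{'
# -> aaa{         (bash passes an unmatched brace through untouched)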

Execute a custom bash function within Python

This question is a simpler version of this question.
In simple terms, I have a custom bash function my_func defined in ~/.bash_profile, and it uses two more bash functions defined in the same environment.
my_func accepts two arguments, let's say a and b. What my_func does is connect to a remote server and send some files (determined by a and b).
If I type in the bash shell:
. my_func a b
everything works fine and I get some print statements on the screen.
However, if I include:
subprocess.call(['#!/bin/bash . my_func a b'], shell=True)
nothing seems to happen.
I tried to export all the bash functions used by my_func by including:
subprocess.call(['#!/bin/bash export -f my_func'], shell=True)
and I did the same for the rest of the functions that my_func uses.
EDIT:
If I use:
subprocess.call(['bash', '-c', 'my_func a b'], shell=True)
the bash shell changes into bash-3.2$
You need to export the function before you start the Python program:
export -f my_func
python foo.py
Well, the above example might not work if the system's default shell (/bin/sh) is not bash. To circumvent this, you can use subprocess.call like this:
$ function foo() { echo "bar" ; }
$ export -f foo
$ cat foo.py
import subprocess
subprocess.call(['bash', '-c', 'foo'])
$ python foo.py
bar
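If the function also needs arguments (as my_func a b does), bash -c can forward them: everything after the command string lands in $0, $1, and so on. A self-contained sketch with a hypothetical function body, run from bash:
my_func() { echo "got: $1 $2"; }
export -f my_func
python3 - <<'EOF'
import subprocess
# bash -c runs the string; '_' fills $0, then 'a' and 'b' become $1 and $2.
subprocess.call(['bash', '-c', 'my_func "$@"', '_', 'a', 'b'])
EOF
# Prints: got: a b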
Alternative:
I would put the function into a lib file, let's say:
# /usr/share/my_lib/my_lib.sh
function my_func() {
# do something
}
Then I would expose the function via a script in PATH:
#!/bin/bash
# /usr/local/bin/my_prog
source /usr/share/my_lib/my_lib.sh
my_func "$1" "$2"
In Python you would just:
subprocess.call(['/usr/local/bin/my_prog', 'a', 'b'])
Btw: if you don't need that function anywhere else, you can just put it directly into /usr/local/bin/my_prog.

How do I pass arguments to a curled script that has been piped into the Python process?

I have a Python script on a Github gist, which I can curl from my terminal with
curl -s https://gist.githubusercontent.com/.../script.py
The script has a main function that is executed, and I can pipe the output of the curl statement to Python, which executes the script.
curl -s https://gist.githubusercontent.com/.../script.py | python
The above statement works, but I want to provide some command line arguments to the script without having to download it to a file. The problem I am facing is that the python command treats any text following it as the file to execute, not as arguments for the piped script, so
curl -s https://gist.githubusercontent.com/.../script.py | python arg1 arg2
does not work, nor does
curl -s https://gist.githubusercontent.com/.../script.py arg1 arg2 | python
How can I pass in both arguments to the file, either as standard input or command line options that my script can read?
From Python's CLI help (python --help):
- : program read from stdin (default; interactive mode if a tty)
So what you want is:
curl -s https://gist.githubusercontent.com/.../script.py | python - arg1 arg2
Note, however, that - will also appear in sys.argv (in place of the script's filename):
$ echo "import sys; print(sys.argv)" | python - arg1 arg2
['-', 'arg1', 'arg2']
You could also save the script locally and then execute it:
curl -o some_filename.py some_link
python some_filename.py arg1 arg2
You can also save the curl output to a file with:
curl some_link > some_file.py
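If your shell is bash, process substitution avoids the temporary file altogether; this is a different trick from the - form above (a sketch, assuming a bash shell; the gist URL is elided as in the question):
python3 <(curl -s https://gist.githubusercontent.com/.../script.py) arg1 arg2
# The script receives arg1 and arg2 normally; sys.argv[0] becomes something like /dev/fd/63.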

Pass arguments to python based on a wildcard

I have a shell script test.sh as below:
#!/bin/sh
ARG1=/bin/file1.txt
ARG2=/bin/testfile.txt
ARG3=/bin/samplefile.txt
test.py $ARG1 $ARG2 $ARG3
The Python script reads the arguments and copies the files to another location. Instead of defining all the arguments separately as ARG1, ARG2, ARG3, I want to use a wildcard such as *.txt to define them and pass them to test.py.
I can't change the Python file; all I can change is the test.sh file. So basically: define the variables using *.txt and pass the arguments to test.py.
I'm not very familiar with shell scripting. Is there a way I can save these files in an array and then pass them to the Python script?
Just call
test.py /bin/*.txt
and the shell will expand this to
test.py /bin/file1.txt /bin/testfile.txt /bin/samplefile.txt
To test shell expansions, you can use echo:
echo /bin/*.txt
or
echo /bin/newfile /bin/*.txt
which will then echo the list of files.
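Since the question also mentions arrays: bash (not plain sh, so the shebang would need to change to #!/bin/bash) can capture the glob in an array first. A minimal sketch:
#!/bin/bash
files=(/bin/*.txt)      # the glob expands to one array element per file
test.py "${files[@]}"   # each element is passed as its own argument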
