Python not getting arguments passed to it from shell script

I have a shell script that calls various Python scripts and passes on an argument it received. However, the Python scripts are not receiving the argument; when I inspect sys.argv it shows something completely different. The code and the sys.argv output are below:
Shell Script (called run_everything.sh) :
#!/bin/bash
source venv/bin/activate
python3 eval_otare_on_otare.py $1
Then I have a line in eval_otare_on_otare.py that prints the arguments passed:
print(sys.argv)
And I get the following list:
['eval_care_on_otare.py', 'run_everything.sh']
What can I do? sys.argv is clearly not getting what I want; I want it to return
['eval_care_on_otare.py', $1]
where $1 is the argument passed.

If your activate script messes up the value of $1, save it first.
#!/bin/bash
arg=$1
source ./venv/bin/activate
python3 eval_otare_on_otare.py "$arg"
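With the argument saved before the activate script runs, the value passes through as expected; for example (assuming the print(sys.argv) line is still in the Python script):
$ ./run_everything.sh hello
['eval_otare_on_otare.py', 'hello']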
Tangentially, see also: When to wrap quotes around a shell variable?

Related

Run vulture on Python module with CLI arguments?

Context
Suppose one has a project structure with src/projectname/__main__.py, which can be executed as a module with the following command and accompanying arguments:
python -m src.projectname -e mdsa_size3_m1 -v -x
Question
How would one run vulture whilst passing cli arguments to the script on which vulture runs?
Approach I
When I run vulture with the same arguments:
vulture src/projectname -e mdsa_size3_m1 -v -x
it throws the following:
usage: vulture [options] [PATH ...]
vulture: error: unrecognized arguments: -e mdsa_size3_m1 -x
because vulture tries to parse the arguments intended for the script being run. (Note that -v is absent from the unrecognized list because vulture has a -v/--verbose flag of its own.)
Notes
I am aware normally one would expect to run vulture on the script and its entirety without narrowing down the scope with arguments. However, in this case the arguments are required to specify the number of runs/duration of the code execution.
One can hack around this issue by temporarily hard-coding the args, for example:
args = parse_cli_args()
args.experiment_settings_name = "mdsa_size3_m1"
args.export_images = True
process_args(args)
assuming one has such an args object. However, I thought perhaps this functionality could be realised from the CLI, without temporarily modifying the code.

Activating a Python virtual environment and calling python script inside another python script

I am using pipenv to manage my packages. I want to write a Python script that calls another Python script that uses a different virtual environment (VE).
How can I run Python script 1, which uses VE1, and have it call another Python script (script2), which uses VE2?
I found this code for the case where there is no need to change the virtual environment:
import os
os.system("python myOtherScript.py arg1 arg2 arg3")
The only idea I had was to navigate to the target project and activate its shell:
os.system("cd /home/mmoradi2/pgrastertime/")
os.system("pipenv shell")
os.system("python test.py")
but it says:
Shell for /home/..........-GdKCBK2j already activated.
No action taken to avoid nested environments.
What should I do now?
In fact, my own code needs VE1 and the subprocess (the second script) needs VE2. How can I call the second script from inside my code?
In addition, the second script is used as a command-line tool that accepts its inputs via flags:
python3 pgrastertime.py -s ./sql/postprocess.sql -t brasdor_c_07_0150
-p xml -f -r ../data/brasdor_c_07_0150.object.xml
How can I call it using the solution from @tzaman?
Each virtualenv has its own python executable, which you can use directly to execute the script; there is no need to activate the environment first. (This is also why the os.system attempt above fails: each os.system call spawns its own subshell, so the cd and pipenv shell have no effect on later calls.)
Using subprocess (more versatile than os.system):
import subprocess
venv_python = '/path/to/other/venv/bin/python'
args = [venv_python, 'my_script.py', 'arg1', 'arg2', 'arg3']
subprocess.run(args)
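Applied to the command-line tool from the question, the same pattern passes the flags straight through (a sketch; the interpreter path is a placeholder for wherever VE2 actually lives):
import subprocess

venv_python = '/path/to/VE2/bin/python3'  # python executable inside VE2
subprocess.run([
    venv_python, 'pgrastertime.py',
    '-s', './sql/postprocess.sql',
    '-t', 'brasdor_c_07_0150',
    '-p', 'xml',
    '-f',
    '-r', '../data/brasdor_c_07_0150.object.xml',
], cwd='/home/mmoradi2/pgrastertime/')  # run from the project directory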

Pass argument from shell script to Python script without having to specify argument on command line

I want to execute a shell script without having to specify any additional arguments on the command line itself. Instead I would like to hard code the arguments, e.g. input file name and file path, in the shell script.
Toy shell script:
#!/bin/bash
time python3 path/to/pyscript/graph.py \
--input-file-path=path/to/file/myfile.tsv
So, when I run $ ./script.sh, the script should pass the input file information to the py script.
Can this be done? I invariably get the error "no such directory or file" ...
Note, I will deal with the arguments on the python script side using argparse.
EDIT
It turns out the issue was caused by something I had omitted from my toy script above because I didn't think it could be the cause: a commented-out line sitting between the continuation backslash and the argument. It was this line that prevented the script from running.
Toy shell script, full version:
#!/bin/bash
time python3 path/to/pyscript/graph.py \
# this commented-out line prevents the script from running
--input-file-path=path/to/file/myfile.tsv
The trailing backslash continues the command onto the comment line, and the comment then ends the command there; the --input-file-path line is executed as a separate command, which is what produces the "no such file or directory" error.
I suspect your script is correct but the file path is wrong. Maybe you forgot a leading forward slash. Anyway, make sure that path/to/pyscript/graph.py and path/to/file/myfile.tsv are correct.
A dummy example of how to call a python script with hard-coded arguments from a BASH script:
$ cat dummy_script.py
import argparse
import os
import time

parser = argparse.ArgumentParser()
parser.add_argument("-i", "--input-file-path")
args = parser.parse_args()

if os.path.isfile(args.input_file_path):
    print(args.input_file_path, "is a file")
print("sleeping a second")
time.sleep(1)
$ cat time_python_script.sh
time python3 dummy_script.py --input-file-path=/etc/passwd
$ /bin/bash time_python_script.sh
/etc/passwd is a file
sleeping a second
real 0m1.047s
user 0m0.028s
sys 0m0.016s

Pass arguments to python based on wild character

I have a shell script test.sh as below:
#!/bin/sh
ARG1=/bin/file1.txt
ARG2=/bin/testfile.txt
ARG3=/bin/samplefile.txt
test.py $ARG1 $ARG2 $ARG3
The Python script reads the arguments and copies the files to another location. Instead of defining all the arguments separately as ARG1, ARG2, ARG3, I want to use a wildcard such as *.txt to match them and pass them to test.py.
I can't change the Python file; all I can change is the test.sh file. So, basically, define the variables using *.txt and pass the arguments to test.py.
I'm not very familiar with shell scripting. Is there a way I can save these files in an array and then pass them to the Python script?
Just call
test.py /bin/*.txt
and bash will expand this to
test.py /bin/file1.txt /bin/testfile.txt /bin/samplefile.txt
To test shell expansions, you can use echo:
echo /bin/*.txt
or
echo /bin/newfile /bin/*.txt
which will then echo the list of files.
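If you do want the intermediate variable the question asks about, a bash array preserves the expanded list (a minimal sketch; arrays need bash, so the shebang would have to be #!/bin/bash rather than #!/bin/sh):
#!/bin/bash
files=(/bin/*.txt)      # the glob expands into an array, one file per element
test.py "${files[@]}"   # each element is passed as a separate argument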

Why can't environmental variables set in python persist?

I was hoping to write a Python script to create some appropriate environment variables by running the script in whatever directory I'll be executing some simulation code in, but I've read that I can't write a script that makes these env vars persist in the macOS terminal. So two things:
Is this true?
and
It seems like it would be a useful thing to do; why isn't it possible in general?
You can't do it from python, but some clever bash tricks can do something similar. The basic reasoning is this: environment variables exist in a per-process memory space. When a new process is created with fork() it inherits its parent's environment variables. When you set an environment variable in your shell (e.g. bash) like this:
export VAR="foo"
What you're doing is telling bash to set the variable VAR in its process space to "foo". When you run a program, bash uses fork() and then exec() to run the program, so anything you run from bash inherits the bash environment variables.
Now, suppose you want to create a bash command that sets some environment variable DATA with content from a file in your current directory called ".data". First, you need to have a command to get the data out of the file:
cat .data
That prints the data. Now, we want to create a bash command to set that data in an environment variable:
export DATA=`cat .data`
That command takes the contents of .data and puts it in the environment variable DATA. Now, if you put that inside an alias command, you have a bash command that sets your environment variable:
alias set-data='export DATA=$(cat .data)'
(The single quotes matter here: with double quotes, the command substitution would run once when the alias is defined, not each time it is used.)
You can put that alias command inside the .bashrc or .bash_profile files in your home directory to have that command available in any new bash shell you start.
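For example (assuming a .data file exists in the current directory):
$ echo hello > .data
$ set-data
$ echo $DATA
hello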
One workaround is to output export commands and have the parent shell evaluate them:
thescript.py:
import pipes
import random
r = random.randint(1,100)
print("export BLAHBLAH=%s" % (pipes.quote(str(r))))
..and the bash alias (the same can be done in most shells.. even tcsh!):
alias setblahblahenv="eval $(python thescript.py)"
Usage:
$ echo $BLAHBLAH
$ setblahblahenv
$ echo $BLAHBLAH
72
You can output any arbitrary shell code, including multiple commands like:
export BLAHBLAH=23 SECONDENVVAR='something else' && echo 'everything worked'
Just remember to be careful about escaping any dynamically created output (the pipes.quote function is good for this).
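On newer Python 3 releases, shlex.quote is the replacement for pipes.quote (the pipes module was deprecated and has since been removed); a minimal sketch of the same script:
import random
import shlex

r = random.randint(1, 100)
# shlex.quote escapes the value so it is safe to paste into shell code
print("export BLAHBLAH=%s" % shlex.quote(str(r)))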
If you set environment variables within a python script (or any other script or program), it won't affect the parent shell.
Edit clarification:
So the answer to your question is yes, it is true.
You can, however, export from within a shell script and source it using the dot invocation:
in fooexport.sh
export FOO="bar"
at the command prompt
$ . ./fooexport.sh
$ echo $FOO
bar
It's not generally possible. The new process created for Python cannot affect its parent process's environment. Neither can the parent affect a child that is already running, but the parent does get to set up the child's environment as part of creating the new process.
Perhaps you can set them in .bashrc, .profile or the equivalent "runs on login" or "runs on every new terminal session" script in MacOS.
You can also have Python start the simulation program with the desired environment (use the env parameter to subprocess.Popen: http://docs.python.org/library/subprocess.html):
import subprocess, os
os.chdir('/home/you/desired/directory')
subprocess.Popen(['desired_program_cmd', 'args', ...], env=dict(SOMEVAR='a_value'))
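Note that env replaces the child's entire environment; to add a variable to the current environment rather than replace it, copy os.environ first (a minimal sketch using the same placeholder command):
import os
import subprocess

env = dict(os.environ, SOMEVAR='a_value')  # inherit the current env, then extend it
subprocess.Popen(['desired_program_cmd', 'args'], env=env)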
Or you could have python write out a shell script like this to a file with a .sh extension:
export SOMEVAR=a_value
cd /home/you/desired/directory
./desired_program_cmd
and then chmod +x it and run it from anywhere.
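For instance, a sketch of having Python write out that script (the file name run_sim.sh is an arbitrary choice):
import os
import stat

script = """#!/bin/bash
export SOMEVAR=a_value
cd /home/you/desired/directory
./desired_program_cmd
"""

with open('run_sim.sh', 'w') as f:
    f.write(script)

# the equivalent of chmod +x
st = os.stat('run_sim.sh')
os.chmod('run_sim.sh', st.st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)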
What I like to do is use /usr/bin/env in a shell script to "wrap" my command line when I find myself in similar situations:
#!/bin/bash
/usr/bin/env NAME1="VALUE1" NAME2="VALUE2" ${*}
So let's call this script "myappenv". I put it in my $HOME/bin directory which I have in my $PATH.
Now I can invoke any command using that environment by simply prepending "myappenv" as such:
myappenv dosometask -xyz
Other posted solutions work too, but this is my personal preference. One advantage is that the environment is transient, so if I'm working in the shell only the command I invoke is affected by the altered environment.
Modified version based on new comments
#!/bin/bash
/usr/bin/env G4WORKDIR=$PWD ${*}
You could wrap this all up in an alias too. I prefer the wrapper script approach since I tend to have other environment prep in there too, which makes it easier for me to maintain.
As answered by Benson, but the best hack-around is to create a simple bash function to preserve arguments:
upsert-env-var (){ eval $(python upsert_env_var.py $*); }
You can do whatever you want in your Python script with the arguments. To simply add a variable, use something like:
import os
import sys

var = sys.argv[1]
val = sys.argv[2]
if os.environ.get(var):
    print("export %s=%s:%s" % (var, val, os.environ[var]))
else:
    print("export %s=%s" % (var, val))
Usage:
upsert-env-var VAR VAL
As others have pointed out, the reason this doesn't work is that environment variables live in per-process memory space and thus die when the Python process exits.
They point out that a solution to this is to define an alias in .bashrc to do what you want such as this:
alias export_my_program="export MY_VAR=`my_program`"
However, there's another (a tad hacky) method which does not require you to modify .bashrc, nor requires you to have my_program in $PATH (or to specify the full path to it in the alias). The idea is to run the program in Python if it is invoked normally (./my_program), but in Bash if it is sourced (source my_program). (Using source on a script does not spawn a new process and thus does not kill environment variables created within it.) You can do that as follows:
my_program.py:
#!/usr/bin/env python3
_UNUSED_VAR=0
_UNUSED_VAR=0 \
<< _UNUSED_VAR
#=======================
# Bash code starts here
#=======================
'''
_UNUSED_VAR
export MY_VAR=`$(dirname $0)/my_program.py`
echo $MY_VAR
return
'''
#=========================
# Python code starts here
#=========================
print('Hello environment!')
Running this in Python (./my_program.py), the first 3 lines will not do anything useful and the triple-quotes will comment out the Bash code, allowing Python to run normally without any syntax errors from Bash.
Sourcing this in bash (source my_program.py), the heredoc (<< _UNUSED_VAR) is a hack used to "comment out" the first triple-quote, which would otherwise be a syntax error. The script returns before reaching the second triple-quote, avoiding another syntax error. The export assigns the result of running my_program.py in Python from the correct directory (given by $(dirname $0)) to the environment variable MY_VAR. echo $MY_VAR prints the result on the command line.
Example usage:
$ source my_program.py
Hello environment!
$ echo $MY_VAR
Hello environment!
However, if run normally, the script will still do everything it did before except exporting the environment variable:
$ ./my_program.py
Hello environment!
$ echo $MY_VAR
<-- Empty line
As noted by other authors, the environment is thrown away when the Python process exits. But while the Python process is running, you can edit its environment, and child processes will inherit it. For example:
>>> os.environ["foo"] = "bar"
>>> import subprocess
>>> subprocess.call(["printenv", "foo"])
bar
0
>>> os.environ["foo"] = "foo"
>>> subprocess.call(["printenv", "foo"])
foo
0
