jenkins passing in path to python argument as variable - python

in jenkins i have :
sh 'ls ${workspace}'
sh 'cd MVD/utils && python compareWhitelist.py ${workspace}/MVDZOS'
I want to pass the complete path into the python script because I am doing
os.walk(sys.argv[1])
in the python script, I am also printing out the sys.argv[1] but it is returning only "/MVDZOS". How can I get the complete path into the script?

Try using the uppercase WORKSPACE inside Groovy double quotes, so the pipeline interpolates the variable before the shell step runs. With single quotes, Groovy passes ${workspace} through untouched, the shell finds no such variable, and it expands to the empty string — which is why you only see "/MVDZOS".
Ex:
sh 'ls ${workspace}'
sh "cd MVD/utils && python compareWhitelist.py ${WORKSPACE}/MVDZOS"
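On the Python side you can sanity-check what actually arrived. Below is a hypothetical sketch of the compareWhitelist.py entry point (the helper name is made up); it prints the received argument and walks it, which makes a truncated path obvious immediately:

```python
import os
import sys

def walk_tree(root):
    """Return every file path found under root via os.walk."""
    paths = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            paths.append(os.path.join(dirpath, name))
    return paths

if __name__ == "__main__" and len(sys.argv) > 1:
    # If Jenkins interpolated ${WORKSPACE} correctly, this prints
    # the absolute workspace path, not just "/MVDZOS".
    print("received path:", sys.argv[1])
    for p in walk_tree(sys.argv[1]):
        print(p)
```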

Related

run a docker and a command using python script

I have to run a docker and then a command inside the workdir using a python script.
I'm trying to do it as follows:
command = ['gnome-terminal', '-e', "bash -c 'sudo /home/mpark/Escriptori/SRTConverter/shell_docker.sh; echo b; exec $SHELL'"]
p = subprocess.Popen(command)
where 'sudo /home/mpark/Escriptori/SRTConverter/shell_docker.sh' is a shell script that runs docker with root privileges.
The first command, 'sudo /home/mpark/Escriptori/SRTConverter/shell_docker.sh', works fine, but the second one, 'echo b', which has to run inside the container, doesn't work.
Thank you!

Different interpretation of quotes from python os and command line

I am running a python3 script which performs the following snippet on Debian 9:
os.environ["PA_DIR"] = "/home/myname/some_folder"
command_template = ("sudo java -Dconfig.file=$PA_DIR/google.conf "
"-jar ~/big/cromwell-42.jar run $PA_DIR/WholeGenomeGermlineSingleSample.wdl "
"-i {} -o $PA_DIR/au_options.json > FDP{}.log 2>&1")
command = command_template.format("test.json", "1")
os.system("screen -dm -S S{} bash -c '{}'".format("1", command))
The use of PA_DIR works as intended. When I tried it on command line:
PA_DIR="/home/myname/some_folder"
screen -dm -S S1 bash -c 'sudo java -Dconfig.file=$PA_DIR/google.conf -jar ~/big/cromwell-42.jar run $PA_DIR/WholeGenomeGermlineSingleSample.wdl -i test.json -o $PA_DIR/au_options.json > FDP1.log 2>&1'
it doesn't do variable substitution due to single quotes and I had to replace them with double quotes (it complains it cannot find the file /google.conf).
What is different when python runs it?
Thanks!
Python's os.system() invokes the underlying system() function of the C library, which on POSIX systems is equivalent to something like
sh -c "your_command and all its arguments"
So the command and all its arguments are effectively wrapped in double quotes, which means environment-variable substitution takes place. Any single quotes inside the string are irrelevant to that substitution.
You can test it easily. In a shell do something like
$ foo="bar"
$ echo "foo is '$foo'" # Will print foo is 'bar'
$ echo 'foo is "$foo"' # Will print foo is "$foo"
Waiting for your answer to daltonfury42, I'd bet the problem is that, when running on the command line, you are not exporting the PA_DIR environment variable, so it is not present in the second bash interpreter. And it behaves differently because of what Mihir answered.
If you run
PA_DIR=foo
you only declare a bash variable; it is not an environment variable. Then
bash -c "echo $PA_DIR"
will output foo because your current interpreter interpolates $PA_DIR and then spawns a second bash process running the command echo foo. But
bash -c 'echo $PA_DIR'
prevents your bash interpreter from interpolating it, so the second bash process is started with the command echo $PA_DIR. In that second process the variable PA_DIR does not exist.
If you start your journey running
export PA_DIR=foo
this will become an environment variable that will be accessible to children processes, thus
bash -c 'echo $PA_DIR'
will output foo because the nested bash interpreter has access to the variable even if the parent bash interpreter did not interpolate it.
The same is true for any kind of children process. Try running
PA_DIR=foo
python3 -c 'import os; print(os.environ.get("PA_DIR"))'
python3 -c "import os; print(os.environ.get('PA_DIR'))"
export PA_DIR=foo
python3 -c 'import os; print(os.environ.get("PA_DIR"))'
python3 -c "import os; print(os.environ.get('PA_DIR'))"
in your shell. No quotes are involved here!
When you use the os.environ dictionary in a Python script, Python will export the variables for you. That's why you will see the variable interpolated by either
os.system("bash -c 'echo $PA_DIR'")
or
os.system('bash -c "echo $PA_DIR"')
But beware that in each case it is either the parent or the child shell process that interpolates the variable.
You must understand your process tree here:
/bin/bash                # but it could be zsh, fish, sh, ...
  |- /usr/bin/python3    # presumably
      |- /bin/sh         # because os.system uses that
          |- /bin/bash
If you want an environment variable to exist in the most nested process, you must export it anywhere in the upper tree. Or in that very process.
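The export behaviour is easy to observe from Python itself. A minimal sketch, assuming a POSIX sh is available:

```python
import os
import subprocess

# Assigning into os.environ exports the variable, so the child shell
# spawned below sees it regardless of which quote style the command uses.
os.environ["PA_DIR"] = "/home/myname/some_folder"

# $PA_DIR reaches the child shell literally (list form, no parent shell),
# and the child interpolates it from its inherited environment.
child = subprocess.run(["sh", "-c", "echo $PA_DIR"],
                       capture_output=True, text=True)
print(child.stdout.strip())  # /home/myname/some_folder
```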

Use the path as an argument in shell file from python

I want to call from python a shell script which contain the running of another python function. I would like to use for that subprocess method. My code so far look like:
arguments = ["./my_shell.sh", path]
ret_val = subprocess.Popen(arguments, stdout=subprocess.PIPE)
while the script is the following:
#!/bin/sh
cd ...
python -c "from file import method;
method()"
How can I give in the directory (to cd) of the path that I pass as an argument in the shell file?
You can access the script's arguments as $1, $2, etc. So your cd command would simply be cd "$1" (quoted, in case the path contains spaces).

Run bash while loop in Python

I am trying to run a bash while loop inside a Python3.6 script. What I have tried so far is:
subprocess.run(args=['while [ <condition> ]; do <command> done;'])
I get the following error:
FileNotFoundError: [Errno 2] No such file or directory
Is there a way to run such a while loop inside Python?
The part that's tripping you up is providing the args as a list. From the documentation:
If the cmd argument to popen2 functions is a string, the command is executed through /bin/sh. If it is a list, the command is directly executed.
This seems to do what you want:
subprocess.run('i=0; while [ $i -lt 3 ]; do i=`expr $i + 1`; echo $i; done', shell=True)
Notice it's specified as a string instead of a list.
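If you also want the loop's output back in Python rather than on the terminal, capture_output collects it. A sketch of the same loop:

```python
import subprocess

# With shell=True the whole loop is one string handed to /bin/sh;
# capture_output gathers everything the loop echoes to stdout.
result = subprocess.run(
    'i=0; while [ $i -lt 3 ]; do i=`expr $i + 1`; echo $i; done',
    shell=True, capture_output=True, text=True)
print(result.stdout)  # 1, 2, 3 on separate lines
```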
Running a bash for loop in Python 3.x is very much like running a while loop.
#! /bin/bash
for i in */;
do
zip -r "${i%/}.zip" "$i";
done
This will iterate over the current path and zip every directory. To run the above bash script in Python:
import subprocess
subprocess.run('for i in */; do zip -r "${i%/}.zip" "$i"; done', shell=True)

Linux run shell cmd from python, Failed to load config file

I have installed a backup program called rclone on my Raspberry Pi, which runs Debian. I have successfully run the command in the shell to back up a folder to Google Drive, but I really need to do so each time I take a photo with my Python script. I have little experience with Linux compared to others, and I thought I could make a shell script with a basic shebang of
#!/bin/sh
or
#!/bin/bash
then the cmd below
rclone copy /var/www/html/camera_images pictures::folder1
I then made the .sh file executable, and it works if I just click it in the folder and execute it, but if I try to call that .sh script from Python with
os.system('sh /home/pi/py/upload.sh')
or
os.system(' rclone copy /var/www/html/camera_images pictures::folder1 ')
I get an error in the shell saying
Failed to load config file "/root/.rclone.conf" using default - no such directory.
But the .conf file is located in /home/pi, as it should be. And if I try
os.system(' sh rclone copy /var/www/html/camera_images pictures::folder1 ')
I get
sh: 0: Can't open rclone.
How can I run the copy command, or a script that does so, from Python?
This is how I installed rclone:
cd
wget http://downloads.rclone.org/rclone-v1.34-linux-arm.zip
unzip rclone-v1.34-linux-arm.zip
cd rclone-v1.34-linux-arm
sudo cp rclone /usr/sbin/
sudo chown root:root /usr/sbin/rclone
sudo chmod 755 /usr/sbin/rclone
sudo mkdir -p /usr/local/share/man/man1
sudo cp rclone.1 /usr/local/share/man/man1/
sudo mandb
rclone config
Use --config in your rclone command
From docs:
--config string   Config file (default "/home/ncw/.rclone.conf")
Your command should look like:
os.system('rclone copy --config /home/pi/.rclone.conf /var/www/html/camera_images pictures::folder1')
(note there is no leading sh — rclone is a binary, not a shell script).
You should use the subprocess module instead of os.system.
You can use subprocess.Popen to create a process and give it a working directory.
subprocess.Popen(your_command, cwd=path_to_your_executable_dir, shell=True)
(Use shell=True to pass a simple string command among other conveniences).
The shell argument (which defaults to False) specifies whether to use
the shell as the program to execute. If shell is True, it is
recommended to pass args as a string rather than as a sequence.
On Unix with shell=True, the shell defaults to /bin/sh. If args is a
string, the string specifies the command to execute through the shell.
This means that the string must be formatted exactly as it would be
when typed at the shell prompt. This includes, for example, quoting or
backslash escaping filenames with spaces in them. If args is a
sequence, the first item specifies the command string, and any
additional items will be treated as additional arguments to the shell
itself. That is to say, Popen does the equivalent of: ....
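Applied here, a sketch might look like the following, with pwd standing in for the rclone invocation (rclone itself may not be installed where you test this):

```python
import subprocess

# cwd changes the child's working directory before the shell command runs;
# in the real script, replace "pwd" with the rclone copy command.
proc = subprocess.Popen("pwd", cwd="/tmp", shell=True,
                        stdout=subprocess.PIPE, text=True)
out, _ = proc.communicate()
print(out.strip())
```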
Thank you everyone :)
I have it working now with
os.system('rclone copy --config /home/pi/.rclone.conf /var/www/html/camera_images pictures::folder1')
Note that if I put sh at the start I got the error sh: 0: Can't open rclone, though I read yesterday about putting something like ,:0 at the end as a return value? Either way, it works without the sh.
And the subprocess call works too, which I shall use instead.
subprocess.Popen('rclone copy --config /home/pi/.rclone.conf /var/www/html/camera_images pictures::folder1', shell=True)
