Use the path as an argument in a shell file from Python

I want to call, from Python, a shell script that runs another Python function. I would like to use the subprocess module for that. My code so far looks like:
arguments = ["./my_shell.sh", path]
ret_val = subprocess.Popen(arguments, stdout=subprocess.PIPE)
while the script is the following:
#!/bin/sh
cd ...
python -c "from file import method;
method()"
How can I make the script cd to the directory given by the path that I pass as an argument to the shell file?

You can access the script's arguments as $1, $2, etc. So your cd command would simply be cd "$1" (quoted, so paths containing spaces still work).
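Putting the pieces together, here is a minimal self-contained sketch; the throwaway script and the "/" directory are stand-ins for the asker's my_shell.sh and path:

```python
import os
import stat
import subprocess
import tempfile

# A stand-in for my_shell.sh: cd to "$1" (quoted, so paths with
# spaces work) and print the resulting working directory.
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write('#!/bin/sh\ncd "$1"\npwd\n')
    script = f.name
os.chmod(script, os.stat(script).st_mode | stat.S_IEXEC)

path = "/"  # hypothetical directory to hand over as $1
out = subprocess.check_output([script, path]).decode().strip()
print(out)  # /
```

Passing the path in the argument list, rather than interpolating it into a shell string, sidesteps quoting problems on the Python side.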

Related

Change directory from python script for calling shell

I would like to build a Python script that can manipulate the state of its calling bash shell, starting with the working directory.
With os.chdir or os.system("ls ..") you can only change the interpreter's own state, but how can I apply those changes to the script's caller?
Thank you for any hint!
You can't do that directly from Python: a child process can never change the environment of its parent process.
But you can create a shell script that you source from your shell (i.e. it runs in the same process), and in that script, call python and use its output as the name of the directory to cd to:
/home/choroba $ cat 1.sh
cd "$(python -c 'print ".."')"
/home/choroba $ . 1.sh
/home $

Execute a custom bash function within python

This question is a simpler version of this question.
In simple terms, I have a custom bash function my_func defined in ~/.bash_profile which uses two more bash functions defined in the same environment.
Also, my_func accepts two arguments, let's say a and b. my_func connects to a remote server and sends some files (these are determined by a and b).
If I type on the bash shell:
. my_func a b
everything works fine and I get some print statements on the screen.
However, if I include:
subprocess.call(['#!/bin/bash . my_func a b'], shell=True)
nothing seems to happen.
I tried to export all the bash functions that are used by my_func by including:
subprocess.call(['#!/bin/bash export -f my_func'], shell=True)
and I did the same for the rest of the functions that are used by my_func.
EDIT:
If I use subprocess.call(['bash', '-c', 'my_func a b'], shell=True), the shell just drops to a bash-3.2$ prompt.
You need to export the function before you start the python program:
export -f my_func
python foo.py
The above example might not work if the system's default shell (/bin/sh) is not bash. To circumvent this, you can invoke bash explicitly from subprocess:
$ function foo() { echo "bar" ; }
$ export -f foo
$ cat foo.py
import subprocess
subprocess.call(['bash', '-c', 'foo'])
$ python foo.py
bar
Alternative:
I would put the function into a lib file, let's say:
# /usr/share/my_lib/my_lib.sh
function my_func() {
# do something
}
Then I would expose the function via a script in PATH:
#!/bin/bash
# /usr/local/bin/my_prog
source /usr/lib/my_lib/my_lib.sh
my_func "$1" "$2"
In Python you would just:
subprocess.call(['/usr/local/bin/my_prog', 'a', 'b'])
Btw: If you don't need that function somewhere else, you can just put it directly into /usr/local/bin/my_prog.
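The wrapper approach can be exercised end to end; a sketch where a temporary script stands in for /usr/local/bin/my_prog, and an echo stands in for the real my_func body:

```python
import os
import subprocess
import tempfile

# Stand-in for /usr/local/bin/my_prog: defines my_func and calls it
# with the two arguments the wrapper receives.
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write('#!/bin/bash\n'
            'my_func() { echo "sending $1 and $2"; }\n'
            'my_func "$1" "$2"\n')
    prog = f.name
os.chmod(prog, 0o755)

out = subprocess.check_output([prog, "a", "b"]).decode().strip()
print(out)  # sending a and b
```

Because the function lives inside the wrapper script, Python needs no knowledge of bash functions at all; it just runs an ordinary executable.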

Pass arguments to Python based on a wildcard

I have a shell script test.sh as below:
#!/bin/sh
ARG1=/bin/file1.txt
ARG2=/bin/testfile.txt
ARG3=/bin/samplefile.txt
test.py $ARG1 $ARG2 $ARG3
The Python script reads the arguments and copies the files to another location. Instead of defining all the arguments separately as ARG1, ARG2, and ARG3, I want to use a wildcard such as *.txt to define them and pass them to test.py.
I can't change the Python file; all I can change is the test.sh file. So basically: define the variables using *.txt and pass the arguments to test.py.
I'm not very familiar with shell scripting. Is there a way I can save these variables in an array and then pass them to the Python script?
Just call
test.py /bin/*.txt
and bash will expand this to
test.py /bin/file1.txt /bin/testfile.txt /bin/samplefile.txt
To test shell expansions, you can use echo:
echo /bin/*.txt
or
echo /bin/newfile /bin/*.txt
which will then echo the list of files.
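For completeness, if the same expansion ever needs to happen on the Python side instead (hypothetical here, since only test.sh may change), the glob module produces the identical list:

```python
import glob
import os
import tempfile

# A throwaway directory with sample files standing in for /bin
d = tempfile.mkdtemp()
for name in ("file1.txt", "testfile.txt", "notes.md"):
    open(os.path.join(d, name), "w").close()

# glob.glob performs the same *.txt expansion the shell does
txt_files = sorted(glob.glob(os.path.join(d, "*.txt")))
print([os.path.basename(p) for p in txt_files])  # ['file1.txt', 'testfile.txt']
```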

How to move between directories using subprocess

I want to change the current directory using subprocess.
For example:
import os, sys, subprocess
os.environ['a'] = '/home'
os.environ['b'] = '/'
subprocess.call('cd $a', shell=True)
subprocess.call('ls', shell=True)
subprocess.call('cd $b', shell=True)
subprocess.call('ls', shell=True)
I expected this to work like the following Unix command-line session:
$ export a='/home'
$ export b='/'
$ cd $a
$ ls
$ cd $b
$ ls
But it doesn't work that way.
How can I change the current directory?
Thanks.
To change the directory of the Python process itself, just use os.chdir() instead. A cd run via subprocess only affects the short-lived child shell, which exits immediately afterwards.
You can also execute commands in specific directories with subprocess.Popen(...): it has an optional parameter cwd=None. Use it to specify the working directory of the child process.
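A minimal sketch of both options, using "/" as the target directory:

```python
import os
import subprocess

# Each subprocess.call spawns a fresh child, so a "cd" there never
# affects the Python process. Change the interpreter itself instead:
os.chdir("/")
print(os.getcwd())  # /

# Or leave the interpreter alone and pick the child's directory per call:
out = subprocess.check_output("pwd", shell=True, cwd="/").decode().strip()
print(out)  # /
```

The cwd= form is usually preferable when only one command needs to run elsewhere, since it leaves the interpreter's own working directory untouched.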
Also, you could take a look at a small module I wrote that fills in some functionality missing from the Python standard library, probably this module in particular: https://github.com/ssbarnea/tendo/blob/master/tendo/tee.py

How to make a call to an executable from Python script?

I need to execute this command from my Python script.
Is it possible? The program generates some output, with some files being written. How do I access these files? I have tried the subprocess.call function, but without success.
fx#fx-ubuntu:~/Documents/projects/foo$ bin/bar -c somefile.xml -d text.txt -r aString -f anotherString >output
The application "bar" also references some libraries, and it creates the file "bar.xml" besides the output. How do I get access to these files? Just by using open()?
Thank you,
Edit:
The only error from the Python runtime is this line:
$ python foo.py
bin/bar: bin/bar: cannot execute binary file
For executing the external program, do this:
import subprocess
args = ("bin/bar", "-c", "somefile.xml", "-d", "text.txt", "-r", "aString", "-f", "anotherString")
#Or just:
#args = "bin/bar -c somefile.xml -d text.txt -r aString -f anotherString".split()
popen = subprocess.Popen(args, stdout=subprocess.PIPE)
output, _ = popen.communicate()  # waits for exit and reads stdout without deadlocking on a full pipe
print(output)
And yes, assuming your bin/bar program wrote some other assorted files to disk, you can open them as normal with open("path/to/output/file.txt"). Note that you don't need to rely on a subshell to redirect the output to a file on disk named "output" if you don't want to. I'm showing here how to directly read the output into your python program without going to disk in between.
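On Python 3, the Popen/wait/read dance above collapses into a single call with subprocess.run; a sketch with echo standing in for bin/bar:

```python
import subprocess

# subprocess.run (Python 3.5+) starts the process, waits for it to
# finish, and captures its output in one step.
result = subprocess.run(["echo", "hello"], stdout=subprocess.PIPE)
print(result.stdout.decode().strip())  # hello
```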
The simplest way is:
import os
cmd = 'bin/bar --option --otheroption'
os.system(cmd) # returns the exit status
You access the files in the usual way, by using open().
If you need to do more complicated subprocess management then the subprocess module is the way to go.
For executing a Unix executable, I did the following on my Mac OS X machine and it worked for me:
import os
cmd = './darknet classifier predict data/baby.jpg'
so = os.popen(cmd).read()
print(so)
Here print(so) outputs the result.
