How to pass an argument from Python code to a bash script?

I have a Python script that starts by taking a string variable, say "element_name", from the user, builds some sub-folders based on this string, and moves some output files created by the script into those folders.
On the other hand, I have a bash script that needs to run some commands inside the sub-folders created by the Python code.
How can I make those folders known in bash? How can I pass "element_name" from Python to bash?
In the Python code "a.py" I tried
first = subprocess.Popen(['/bin/echo', element_name], stdout=subprocess.PIPE)
second = subprocess.Popen(['bash', 'path/to/script', '--args'], stdin=first.stdout)
and then in bash
source a.py
echo $element_name
but it doesn't work.

It's not clear from your question what is in your scripts, but I would guess that
subprocess.run(['/bin/bash', 'path/to/script', '--args', element_name])
does what you intend: it passes the value of element_name to the script as a command-line argument.
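For instance, a minimal sketch (the path and value here are hypothetical); inside the bash script the value then arrives as a positional parameter, e.g. "$1" if it is the first argument:
import subprocess

element_name = 'oxygen'  # hypothetical value taken from the user
# the bash script receives element_name as a positional argument
subprocess.run(['/bin/bash', 'path/to/script', element_name], check=True)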

I found a way. What I did was to write the argument into a separate bash file and source that file from my main bash script. Now everything works well.
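For example, a minimal sketch of that workaround (the file name vars.sh is hypothetical):
# Python side: write the value into a small shell file
with open('vars.sh', 'w') as f:
    f.write('element_name="%s"\n' % element_name)
The main bash script can then run "source vars.sh" and use $element_name as usual.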

Using sys.argv to pass command line arguments to a python script

I know this question has been asked in several variations, but none of the answers seem to work for my specific case:
(btw all mentioned files are in my PATH)
I have a simple python script called test.py:
import sys
print('Hello world!')
print(sys.argv)
counter = 0
while True:
    counter += 1
The counter is just there to keep the command window open, so don't worry about that.
When I enter
test.py test test
into cmd I get the following output:
Hello world!
['C:\\Users\\path\\to\\test.py']
For some reason unknown to me, the two other arguments (sys.argv[1] and sys.argv[2]) are missing.
However, when I create a .bat file like this:
@C:\Users\path\to\python.exe C:\Users\path\to\test.py %*
and call it in cmd
test.bat test test
I get my desired output:
Hello world!
['C:\\Users\\path\\to\\test.py', 'test', 'test']
I've read that the %* in the .bat file means that all command-line arguments are passed to the Python script, but why exactly are these arguments not passed to the Python script when I explicitly call it in cmd?
From what I've read all command line arguments entered after the script name should be passed to said script but for some reason it doesn't work that way.
What am I overlooking here?
I'm guessing that you need to actually run the script through the command line with C:\Users\path\to\python.exe test.py test test.
I'm not sure how Windows handles just test.py test test, but from my limited experience it is probably using the .py file association to open test.py with the Python interpreter; if that association's command does not end with %* (the way your .bat file does), the other two arguments are never passed in.
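As a quick check, a small diagnostic sketch you could drop into test.py:
import sys

# show which interpreter ran the script and what argument list it received;
# if argv is truncated when invoked as "test.py test test", the file
# association is dropping the extra arguments
print(sys.executable)
print(sys.argv)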

How to call . /home/test.sh file in python script

I have a file /home/test.sh which contains some environment variables and is loaded with ". /home/test.sh" (the space between the first "." and "/" is intentional). I need to load this file and then run the .py. If I run the command manually on the Linux server first and then run the Python script, it generates the required output. However, I want to call ". /home/test.sh" from within Python to load the profile and run the rest of the code. If this profile is not loaded, the Python script runs and gives 0 as an output.
The call
subprocess.call('. /home/test.sh', shell=True)
runs fine, but the profile is not loaded into the environment of the Python process, so the rest of the code does not give the desired output.
Can someone help?
Environment variables set in a child process are not inherited by the parent process, which is why your simple approach does not work.
If you are trying to pick up environment variables that have been set in your test.sh, then one thing you could do instead is to use env in a sub-shell to write them to stdout after sourcing the script, and then in Python you can parse these and set them locally.
The code below will work provided that test.sh does not write any output itself. (If it does, you could work around that by echoing some separator string after sourcing it and before running env, and then stripping off the separator string and everything before it in the Python code.)
import subprocess
import os

p = subprocess.Popen(". /home/test.sh; env -0", shell=True,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
out, _ = p.communicate()
# env -0 separates the entries with NUL bytes, so even values that
# contain newlines are parsed correctly
for varspec in out.decode().split("\x00")[:-1]:
    pos = varspec.index("=")
    name = varspec[:pos]
    value = varspec[pos + 1:]
    os.environ[name] = value
# just to test whether it works - output of the following should include
# the variables that were set
os.system("env")
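For the case where test.sh does print output of its own, here is a hedged sketch of the separator workaround described above (the marker string is arbitrary, chosen so it will not appear in the script's output):
import os
import subprocess

p = subprocess.Popen(". /home/test.sh; echo '__ENV_MARKER__'; env -0",
                     shell=True, stdout=subprocess.PIPE)
out, _ = p.communicate()
# keep only what follows the marker, i.e. the env -0 dump
env_blob = out.decode().split("__ENV_MARKER__\n", 1)[1]
for varspec in env_blob.split("\x00")[:-1]:
    name, _, value = varspec.partition("=")
    os.environ[name] = value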
It is also worth considering that if all that you want to do is set some environment variables every time before you run any python code, then one option is just to source your test.sh from a shell-script wrapper, and not try to set them inside python at all:
#!/bin/sh
. /home/test.sh
exec /path/to/your/python/script "$@"
Then when you want to run the Python code, you run the wrapper instead.

Executing a profile load shell script from a python program [duplicate]

This question already has answers here:
how to "source" file into python script
(8 answers)
Closed 3 years ago.
I am struggling to execute a shell script from a Python program. The script is a profile-loading script and is run manually as:
. /path/to/file
The script can't be run as an sh script, because the calling programs load some configuration from it, so it must be run as . /path/to/file.
Please guide me on how I can integrate this into my Python script. I am using the subprocess.Popen command to run the script but, as said, the only way it works is to run it as . /path/to/file, so I am not getting the right result.
Without knowledge of the precise reason the script needs to be sourced, this is slightly speculative.
The fundamental problem is this: How do I get a source command to take effect outside the shell script?
Let's say your sourced file does something like
export fnord="value"
This cannot (usefully) be run in a subshell (as a normally executed script would) because the environment variable and its value will be lost when the script terminates. The solution is to source (aka .) this snippet from an already running shell; then the value stays in that shell's environment until that shell terminates.
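A quick sketch showing the problem (using the fnord example from above):
import os
import subprocess

# the export happens inside a child shell, and dies with it; the parent
# Python process never sees the variable
subprocess.call('export fnord="value"', shell=True)
print(os.environ.get('fnord'))  # prints None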
But Python is not a shell, and there is no general way for Python to execute arbitrary shell script code, short of reimplementing the shell in Python. You can reimplement a small subset of the shell's functionality with something like
import os

with open('/path/to/file') as shell_source:
    lines = shell_source.readlines()
for line in lines:
    if line.strip().startswith('export '):
        var, value = line[7:].strip().split('=', 1)
        if value.startswith('"'):
            value = value.strip('"')
        elif value.startswith("'"):
            value = value.strip("'")
        os.environ[var] = value
with some very strict restrictions (let's not say naïve assumptions) on the allowable shell script syntax in the file. But what if the file contained something else than a series of variable assignments, or the assignment used something other than trivial quoted strings in the values? (Even the export might or might not be there. Its significance is to make the variable visible to subprocesses of the current shell; maybe that is not wanted or required? Also export variable=value is not portable; proper Bourne shell script syntax would use variable=value; export variable or one of the many variations.)
If you know what exactly your Python script needs from the shell script, maybe do something like
import os
import subprocess

r = subprocess.run('. /path/to/file; printf "%s\n" "$somevariable"',
                   shell=True, capture_output=True, text=True)
os.environ['somevariable'] = r.stdout.split('\n')[-2]
to source the entire script in a subshell, then print to standard output the part you actually need, and capture that from your Python script (and assign it to an environment variable if that's what you eventually need to accomplish).
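If you need several values, a hedged variation on the same idea (the variable names are hypothetical): printf "%s\0" prints each argument NUL-terminated, so even values containing newlines survive the round trip.
import os
import subprocess

names = ['somevariable', 'anothervariable']  # hypothetical variable names
# source the file, then print each requested variable NUL-terminated
script = '. /path/to/file; printf "%s\\0" ' + ' '.join('"$%s"' % n for n in names)
r = subprocess.run(script, shell=True, capture_output=True, text=True)
for name, value in zip(names, r.stdout.split('\0')):
    os.environ[name] = value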

Passing variables from Python GUI to PowerShell

So I am creating an application that connects printers with a Python GUI, running PowerShell scripts in the background. I was wondering if there is a way to pass a variable entered in a Python widget into a PowerShell script invoked by Python. This variable would be the name of the printer, specified in Python, so that I do not have to create separate scripts for each printer.
My code in Python that calls upon the PS script:
def connect():
    if self.printerOpts.get() == 'Chosen Printer':
        subprocess.call(["C:\\WINDOWS\\system32\\WindowsPowerShell\\v1.0\\powershell.exe", '-ExecutionPolicy', 'Unrestricted', '.\'./ScriptName\';'])
PS script that connects printer to computer:
Add-Printer -ConnectionName \\server\printer -AsJob
Basically, I am wondering if I can pass a variable from Python into the "printer" part of my PS script so that I do not have to create a different script for each printer that I would like to add.
A better way to do this would be completely in PowerShell or completely in Python.
What you're after is doable. You can pass it in the same way that you have passed -ExecutionPolicy Unrestricted, by ensuring that the PowerShell script is expecting the variable.
My Python is non-existent, so please bear with me if that part doesn't work.
Python
myPrinter = 'PrinterNameHere'  # string variable in Python with the printer name (placeholder value)
subprocess.call(["C:\\WINDOWS\\system32\\WindowsPowerShell\\v1.0\\powershell.exe",'-ExecutionPolicy','Unrestricted', '.\'./ScriptName\';','-printer',myPrinter])
PowerShell
param(
    $printer
)
Add-Printer -ConnectionName \\server\$printer -AsJob
The way that worked for me was first to specify that I was passing a variable as a string in my PS script:
param([string]$path)
Add-Printer -ConnectionName \\server\$path
My PS script was not expecting this variable before. In my Python script I first defined my variable named path as a string, and then added path at the end of my subprocess call.
path = "c"
subprocess.call(["C:\\WINDOWS\\system32\\WindowsPowerShell\\v1.0\\powershell.exe",'-ExecutionPolicy','Unrestricted', 'Script.ps1', path])

Running python script within a shell script: files don't save

I am very new to shell scripting, so I'm still figuring things out. Here is my problem:
I have a Python .py file which creates multiple files and saves them to a directory. I need to run that file from a shell script. For some reason, the shell script executes the Python script, but no new files appear in my directory. When I just run the .py file directly, everything works fine.
Here's what my shell script looks like:
#!/bin/bash
cd /home/usr/directory
python myfile.py
Within my python script, the files that are saved are pickled object instances. So every one of them looks something like this:
f = file('/home/usr/anotherdirectory/myfile.p','w')
pickle.dump(myObject,f)
f.close()
This line:
f = file('/home/usr/directory/myfile.p','w')
Should be:
f = open('/home/usr/directory/myfile.p','wb+')
For best practices it should be done like this:
with open('/home/usr/directory/myfile.p','wb+') as fs:
    pickle.dump(myObject, fs)
The documentation for the file function states:
When opening a file, it’s preferable to use open() instead of invoking this constructor directly.
Problems like this may be one of the reasons why. Try changing
f = file('/home/usr/directory/myfile.p','w')
to
f = open('/home/usr/directory/myfile.p','w')
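Putting both answers together, a minimal self-contained sketch (the object is a hypothetical stand-in; note that pickle needs a binary-mode file handle in Python 3):
import pickle

myObject = {'example': 1}  # hypothetical stand-in for the object being saved

# open() in binary mode, with a context manager that closes (and flushes)
# the file even if pickle.dump raises
with open('/home/usr/anotherdirectory/myfile.p', 'wb') as fs:
    pickle.dump(myObject, fs)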
