How to read a variable exported from a shell script, in Python

I have two scripts:
1. demo.ksh
2. demo.py
In demo.ksh I am exporting a variable:
#!/bin/ksh
TEST="Hello"
export TEST
In demo.py I am executing demo.ksh and trying to read the exported value:
import os
import subprocess
cmd = '. demo.ksh'  # I even tried 'demo.ksh' (without the leading dot)
subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
print(os.getenv('TEST'))
print(os.environ['TEST'])
I am expecting:
Hello
Hello
But I am getting:
None
KeyError: 'TEST'
Although it is a simple exercise, I could not find the correct solution. Please help me understand what is wrong with my code.

Exporting a variable makes it available to sub-processes of the spawned shell process, but not to the parent process (i.e. your Python program).
To get the expected output try a shell script like this:
#!/bin/ksh
TEST="Hello"
export TEST
python demo.py
You can instead communicate with the subprocess via STDOUT. For this, subprocess.check_output can be useful.
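For example, a minimal sketch (assuming demo.ksh lives in the current directory and ksh is on the PATH): source the script in a child shell, echo the exported value, and capture it in Python:
import subprocess
# Source demo.ksh in a child ksh, then write the exported value to stdout
# so that the parent Python process can capture it.
value = subprocess.check_output(
    ['ksh', '-c', '. ./demo.ksh; printf "%s" "$TEST"'],
    universal_newlines=True,
)
print(value)  # -> Hello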

Related

Some functions from a library work in the Python shell but not in a script

I'm having trouble getting certain functions from a library called art (https://github.com/sepandhaghighi/art) to run in a script, though they work fine in a shell. The script/commands entered sequentially look like this:
from art import *
randart()      # this function fails in the script, but succeeds in the shell
tart("test")   # different function, same library, succeeds in both shell and script
import sys
print(sys.version)
The python version is 3.7.5 for both the shell and the script. The first function does not throw an error when run in a script, but does not give any output. Its desired output is a random ascii_art from a collection. I feel like I'm missing something really simple. Any ideas? The documentation on github reports "Some environments don't support all 1-Line arts", however they are the same python version on the same machine. Are there other portions of the environment that could be the cause?
You need to print randart() when running it from a script. Make a habit of using print() for anything you want displayed. The interactive shell echoes the value of an expression by default, whereas in a script you have to tell Python explicitly what to do with a value.
So use this:
from art import *
print(randart())
In the shell the result is implicitly printed; in a script you must print it explicitly:
print(randart())
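For illustration, a minimal sketch (assuming the art package is installed, as in the question):
from art import *
randart()         # produces no visible output when run as a script
print(randart())  # prints a random ASCII art piece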

Running a Python script from Perl, with an argument on stdin, and saving the stdout output

My perl script is at path:
a/perl/perlScript.pl
my python script is at path:
a/python/pythonScript.py
pythonScript.py gets an argument from stdin and returns its result to stdout. From perlScript.pl, I want to run pythonScript.py with the argument hi on stdin, and save the result in some variable. This is what I tried:
my $ret = `../python/pythonScript.py < hi`;
but I got the following error:
The system cannot find the path specified.
Can you explain the path can't be found?
The qx operator (backticks) starts a shell (sh), in which the prog < input syntax expects a file named input, from which it reads lines and feeds them to the program prog. But you want the Python script to receive the string hi on its STDIN, not the lines of a file named hi.
One way is to do just that directly: my $ret = qx(echo "hi" | python_script);
But I'd suggest to consider using modules for this. Here is a simple example with IPC::Run3
use warnings;
use strict;
use feature 'say';
use IPC::Run3;
my @cmd = ('program', 'arg1', 'arg2');
my $in = "hi";
run3 \@cmd, \$in, \my $out;
say "script's stdout: $out";
The program is the path to your script if it is executable, or perhaps python script.py. This is run via system, so the output is obtained once the command completes, which is consistent with the attempt in the question. See the module's documentation for details of its operation.
This module is intended to be simple while it aims to "satisfy 99% of the need for using system, qx, and open3 [...]". For far more power and control, see IPC::Run.
You're getting this error because you're using shell redirection instead of just passing an argument.
../python/pythonScript.py < hi
tells your shell to read input from a file called hi in the current directory, rather than using hi as an argument. What you meant to do is
my $ret = `../python/pythonScript.py hi`;
which correctly executes your Python script with hi as an argument and returns the result in the variable $ret.
Some of the other answers assume that hi must be passed as a command-line parameter to the Python script, but the asker says it comes from stdin.
Thus:
my $ret = `echo "hi" | ../python/pythonScript.py`;
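The question does not show pythonScript.py itself; a minimal sketch of a script that reads its input from stdin and writes its result to stdout (the uppercasing is just a placeholder for the real work) might look like:
#!/usr/bin/env python
import sys

data = sys.stdin.read().strip()   # e.g. the "hi" piped in from Perl
sys.stdout.write(data.upper())    # write the result to stdout for the caller to capture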
To launch your external script you can do
system "python ../python/pythonScript.py hi";
and then in your python script
import sys

def yourFct(a):
    ...

if __name__ == "__main__":
    yourFct(sys.argv[1])
You can find more information on the Python part in the documentation for sys.argv.

Pass variable from Python to Bash

I am writing a bash script in which a small Python script is embedded. I want to pass a variable from Python to bash. After some searching I only found methods based on os.environ.
I just cannot make it work. Here is my simple test.
#!/bin/bash
export myvar='first'
python - <<EOF
import os
os.environ["myvar"] = "second"
EOF
echo $myvar
I expected it to output second; however, it still outputs first. What is wrong with my script? Also, is there any way to pass the variable without export?
Summary
Thanks for all the answers. Here is my summary.
A Python script embedded inside a bash script runs as a child process, which by definition cannot affect the parent bash environment.
The solution is to pass assignment strings out from Python and eval them subsequently in bash.
An example is
#!/bin/bash
a=0
b=0
assignment_string=$(python -<<EOF
var1=1
var2=2
print('a={};b={}'.format(var1,var2))
EOF
)
eval $assignment_string
echo $a
echo $b
Unless Python is used to do some kind of operation on the original data, there's no need to import anything. The answer could be as lame as:
myvar=$(python - <<< "print('second')") ; echo "$myvar"
Suppose for some reason Python is needed to spit out a bunch of bash variables and assignments, or (cautiously) compose code on-the-fly. An eval method:
myvar=first
eval "$(python - <<< "print('myvar=second')" )"
echo "$myvar"
Complementing Cyrus's useful comment on the question: you just can't do it. Here is why.
Setting an environment variable sets it only for the current process and any child processes it launches. os.environ sets it only for the embedded Python process that executes the code you provided. When that process finishes, it goes away, and so does the environment variable.
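A quick sketch of that behaviour (the child shell spawned from Python sees the change, while the parent bash that started Python never does):
import os
import subprocess

os.environ["myvar"] = "second"                 # visible to this Python process...
subprocess.call(["sh", "-c", "echo $myvar"])   # ...and to its children: prints "second"
# but the parent shell that launched Python still sees "first"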
You could do that with a shell script instead and source it, so that the assignments take effect in the current shell.
There are a few "dirty" ways of getting something like this done. Here is an example:
#!/bin/bash
myvar=$(python - <<EOF
print "second"
EOF
)
echo "$myvar"
The output of the python process is stored in a bash variable. It gets a bit messy if you want to return more complex stuff, though.
You can make Python print a value and pass it to bash:
pfile.py
print(100)
bfile.sh
var=$(python pfile.py)
echo "$var"
output: 100
Well, this may not be what you want, but one option could be running the other bash commands from Python using subprocess:
import subprocess
x =400
subprocess.call(["echo", str(x)])
But this is more of a temporary workaround. The other solutions are more along the lines of what you are looking for.
Hope I was able to help!

Set a bash variable from a Python script

I'm calling a Python script inside my bash script and I was wondering if there is a simple way to set my bash variables from within my Python script.
Example:
My bash script:
#!/bin/bash
someVar=""
python3 /some/folder/pythonScript.py
My python script:
anotherVar="HelloWorld"
Is there a way I can set someVar to the value of anotherVar? I was thinking of writing properties to a file inside the Python script and then reading them from my bash script, but maybe there is another way. Also, I don't know whether it makes any difference, but I could give both variables the same name (someVar/someVar instead of someVar/anotherVar).
No. When you execute Python, you start a new process, and every process has access only to its own memory. Imagine what would happen if a process could influence another process's memory! Even for parent/child processes like this, it would be a huge security problem.
You can make python print() something and use that, though:
#!/usr/bin/env python3
print('Hello!')
And in your shell script:
#!/usr/bin/env bash
someVar=$(python3 myscript.py)
echo "$someVar"
There are, of course, many other IPC techniques you could use, such as sockets, pipes, shared memory, etc. But without more context, it's difficult to make a specific recommendation.
shlex.quote() in Python 3, or pipes.quote() in Python 2, can be used to generate code which can be evaled by the calling shell. Thus, if the following script:
#!/usr/bin/env python3
import sys, shlex
print('export foobar=%s' % (shlex.quote(sys.argv[1].upper())))
...is named setFoobar and invoked as:
eval "$(setFoobar argOne)"
...then the calling shell will have an environment variable set with the name foobar and the value argOne.
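To see why the quoting matters, a quick illustration of shlex.quote on an argument containing shell metacharacters:
import shlex
print('export foobar=%s' % shlex.quote("two words; rm -rf /"))
# prints: export foobar='two words; rm -rf /'
# The dangerous part stays inside single quotes, so eval treats it as plain data.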

Access a script's variables and functions in interpreter after runtime

So let's say I have a script, script1. Is there a way to interact with script1's variables and functions in an interpreter after or during its runtime?
I'm using IDLE and Python 2.7, but I'm wondering if I could do this in any interpreter not just IDLE's.
Say in my script, get = requests.get("example.com"). I'd like to hit F5 or whatever to run my script, and then instead of the console unloading all of the variables from memory, I'd like to be able to access the same get variable.
Is this possible?
That's a serious question. You might need to consult this page:
https://docs.python.org/2/using/cmdline.html#miscellaneous-options
Note the -i option; it makes the interpreter enter interactive mode after executing the given script.
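For instance (a sketch; the script name and URL are placeholders), running the question's example with -i keeps the interpreter open afterwards with get still defined:
# script1.py -- a minimal sketch of the script from the question
import requests

get = requests.get("http://example.com")
# Run it as:  python -i script1.py
# When the script finishes you land at the >>> prompt, where `get` is still
# bound and can be inspected, e.g. get.status_code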
You can do it like this:
# file: foo.py
import requests

def req():
    get = requests.get("http://example.com")
    return get
and then use it from a console:
import foo
get = foo.req()
