Running scripts whose purpose is to change shell variables with subprocess.Popen - python

I have a series of scripts whose invocation I am automating in Python with subprocess.Popen. Basically, I call script A, then script B, then script C, and so forth.
Script A sets a bunch of local shell variables, with commands such as set SOME_VARIABLE=SCRIPT_A and set PATH=%SCRIPT_A:/=\;%PATH%.
Scripts B and C then need to see the effects of this. On Unix, you would run script A with "source script_a.sh", and the effect would persist in the current shell. However, subprocess.Popen effectively launches each script in a new window (kind of), so the variables are lost.
Apparently subprocess.Popen is not the command I want for this. How would I do it?
Edit: I have tried parsing the file (which is all 'set' statements) and passing the results as a dictionary to 'env' in subprocess.Popen, but it doesn't seem to have worked.

You can pass the env argument to Popen with a Python dictionary containing the variables; then you don't need to run the script that just sets them.
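For instance, a minimal sketch, assuming the variables are known ahead of time (the names echo the question's set commands, and script_b.bat is a stand-in for script B):
import os
import subprocess

# Start from the current environment and add what script A would have set.
env = os.environ.copy()
env['SOME_VARIABLE'] = 'SCRIPT_A'

# Run script B with the augmented environment; shell=True lets cmd
# resolve the batch file on Windows.
subprocess.Popen('script_b.bat', env=env, shell=True).wait()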

If script A gets generated by another process, you will either need to change that process so the output file is in a format you can source (import) into your Python script, or write a parser in Python that digests the variables out of script A and sets them within your Python script.
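A rough sketch of such a parser, assuming every meaningful line in script A has the form set NAME=VALUE (note it does not expand %VAR% references like the PATH example in the question):
import os

def parse_set_file(path):
    # Build on the current environment so inherited variables survive.
    env = os.environ.copy()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.lower().startswith('set ') and '=' in line:
                name, _, value = line[4:].partition('=')
                env[name.strip()] = value
    return env

# Then pass the result straight to Popen, e.g.:
# subprocess.Popen('script_b.bat', env=parse_set_file('script_a.bat'), shell=True)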

If all you want is to call a series of batch files from Python, you could create a helper batch file that calls each of them in turn, like this:
call scriptA
call scriptB
call scriptC
and run the helper batch file with subprocess.Popen.
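Because all three scripts then run inside a single cmd interpreter, the variables set by script A stay visible to scripts B and C. A sketch, assuming the helper is saved as helper.bat:
import subprocess

# One cmd process runs the whole chain, so variables set by scriptA
# survive through scriptB and scriptC.
subprocess.Popen('helper.bat', shell=True).wait()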

Related

How to call a Python script that requires command-line arguments (that are not static) from RStudio?

I am trying to create a Shiny app where a user chooses a short string from a drop-down menu; that string is then passed to a Python script, which outputs some values used in the Shiny app.
Using reticulate's py_run_file function with the needed values hardcoded works great. However, using this:
py_run_file('test_script.py arg1')
gives this:
Error in py_run_file_impl(file, local, convert) :
Unable to open file 'test_script.py arg1' (does it exist?)
Several threads suggest using a system() call to run a .py script with command-line arguments, but I don't think that would be feasible here because the argument needs to be able to change. Other threads have suggested creating a Python file that calls the original Python file using os.system() with arguments, but that also doesn't work for my situation.
Does anyone have any ideas?
Thanks
If anyone else is struggling with this problem: I found a workaround.
Instead of feeding an argument to the Python script, I just create a variable in R's global environment and then reference it in the Python script.
I had no idea you could reference R variables with calls such as r.RVar in the Python script, similar to py$PythonVar when accessing Python variables in R scripts.
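In other words, the Python side of the workaround can be as small as this sketch (RVar is whatever name you gave the variable in R's global environment):
# test_script.py, run via reticulate::py_run_file('test_script.py')
value = r.RVar  # 'r' is the object reticulate exposes for reading R variables
print(value)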

How to save data via export in a Python program?

I want to automate a specific task, and I have a bash script that reads data from user arguments when it is run:
bash my_script.sh /etc/??? path/dst
So far, I've been getting the data in my script through the positional parameters ($1, $2) and variables, and that's fine.
But inside the script, I want to run a python program. So...
python test.py arg1 arg2 arg3
The thing is, I have to process the data in Python and then access the output back in my_script.sh.
There is a constraint that I shouldn't create a file on the system. So I wondered about using export, but export is volatile: it only keeps the variable alive while that process is open, and when I get back to my_script.sh there is no trace of the data. I also don't have the privilege to write the variable and its data to ~/.bashrc.
Also, I have read this and this, but they don't work the way I wanted.
If you think the way I'm doing it is wrong, please let me know.
Thanks to @chepner's comment, the solution was using $() instead of export.
As mentioned in the comments:
export only passes information from a parent to a child process, not the other direction.
So one of the correct ways to get output from another application within your script is as follows:
output=$(python test.py arg1 arg2 arg3)
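On the Python side, anything the script writes to standard output becomes the value of $output, so a sketch of test.py only needs to print its result:
# test.py
import sys

args = sys.argv[1:]    # arg1 arg2 arg3 from my_script.sh
print(' '.join(args))  # whatever is printed here is captured by $()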

Passing modified environment using subprocess

I have a script that adds variables to the environment. That script is called through
subprocess.call('. myscript.sh', shell=True)
Is there a way I can get the modified environment and use it on my next subprocess call? This question shows you can get the output of one call and chain it to another call (Python subprocess: chaining commands with subprocess.run). Is there something similar for passing the environment?
You'll have to output the variables' content somehow. You're spawning a new process that will not propagate the environment variables back, so your Python app will not see those values.
You could either make the script echo those to some file, or to the standard output if possible.
(Technically, it would be possible to stop the process and extract the values if you really wanted to hack that, but it's a bad idea.)
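A hedged sketch of the standard-output route: source the script in a child shell, print the resulting environment, and parse it back into a dict for the next call. The parsing is naive (values containing newlines would break it), and next_step is a hypothetical command:
import subprocess

# Source myscript.sh, then dump the resulting environment to stdout.
out = subprocess.check_output(['bash', '-c', '. ./myscript.sh && env'],
                              text=True)

# Parse NAME=VALUE lines back into a dict.
new_env = dict(line.split('=', 1) for line in out.splitlines() if '=' in line)

# Use the captured environment for the next call.
subprocess.call(['next_step'], env=new_env)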

Retrieve environment variables from popen

I am working on converting an older system of batch files to Python. I have encountered a batch file that sets the environment variables for the batch file that called it. Those values are then used again in future calls to batch files. My environment variables seem to flow down the chain of calls, but I am unable to bring them back up. I have read two different threads on here that talk about something similar, but it's not exactly the same. Is it possible to retrieve these environment variables from the subprocess?
Python subprocess/Popen with a modified environment
Set Environmental Variables in Python with Popen
What I am doing now:
p = Popen(process_, cwd=wd, stdout=PIPE, shell=True,
          universal_newlines=True, stderr=STDOUT, env=env)
The old flow of bat files:
foo.bat calls foo-setparam.bat
foo-setparam.bat sets some variables
foo.bat uses these variables
foo.bat calls bar.bat
bar.bat uses variables set by both foo.bat and foo-setparam.bat
Update:
I have tried adding "call " in front of my Popen parameter. I got the same behavior.
It's generally not possible for a child process to set parent environment variables, unless you want to go down a mildly evil path of creating a remote thread in the parent process and using that thread to set variables on your behalf.
It's much easier to have the Python script write a temp file that contains a sequence of set commands, like:
set VAR1=newvalue1
set VAR2=newvalue2
Then modify the calling batch file to call your Python script like:
python MyScript.py
call MyScript_Env.cmd
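The Python side of that idea is only a few lines; a sketch, with VAR1/VAR2 mirroring the example above:
# Inside MyScript.py: write the companion file that the batch script
# then `call`s to pull the values into its own shell.
with open('MyScript_Env.cmd', 'w') as f:
    f.write('set VAR1=newvalue1\n')
    f.write('set VAR2=newvalue2\n')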

Python Script Executed with Makefile

I am writing Python scripts and executing them from a Makefile. The Python scripts are used to process data in a pipeline. I would like the Makefile to execute the script every time I make a change to my Python scripts.
Does anyone have an idea of how to do this?
That's not a lot of information, so this answer is a bit vague. The basic principle of Makefiles is to list dependencies for each target; in this case, your target (let's call it foo) depends on your Python script (let's call it do-foo.py):
foo: do-foo.py
	python do-foo.py > foo
Now foo will be rerun whenever do-foo.py changes (provided, of course, you call make).
And in case the scripts that need to be run don't produce any useful output file that could serve as a target, you can just use a dummy target:
scripts=a.py b.py c.py
checkfile=.pipeline_up_to_date

$(checkfile): $(scripts)
	touch $(checkfile)
	echo "Launching some commands now."

default: $(checkfile)
If you want the Makefile to be run automatically right after you save, pyinotify, a Python wrapper for inotify, might be the only option under Linux. It registers with the kernel to detect filesystem changes and calls back into your function.
See my previous post on that topic.
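A rough sketch of that approach, assuming pyinotify is installed: rerun make whenever a .py file in the current directory is saved.
import subprocess
import pyinotify

class Rebuild(pyinotify.ProcessEvent):
    def process_IN_CLOSE_WRITE(self, event):
        # Fires when a file opened for writing is closed, i.e. saved.
        if event.pathname.endswith('.py'):
            subprocess.call(['make'])

wm = pyinotify.WatchManager()
notifier = pyinotify.Notifier(wm, Rebuild())
wm.add_watch('.', pyinotify.IN_CLOSE_WRITE)
notifier.loop()  # blocks, dispatching filesystem events to Rebuild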
