This question already has answers here:
How to use an existing Environment variable in subprocess.Popen()
(3 answers)
Closed 4 years ago.
I am writing a Python program running in a Linux environment. I cannot use paramiko etc. in this environment.
I have written a series of methods to interact with the command line; this is the one with the issue:
import subprocess
def echo(self, echo_arg):
    cmd = subprocess.Popen(["echo", echo_arg], stdout=subprocess.PIPE)
    return cmd.communicate()[0]
In Linux I have an environment variable UPFW_WORK_PATH.
when I later call...
self.echo("$UPFW_WORK_PATH")
the console output returned is literally:
$UPFW_WORK_PATH
however when I type into the terminal...
echo $UPFW_WORK_PATH
I am returned (not actual path names):
/example/file/path
What is causing this discrepancy between manually typing "echo" into the terminal and my Python method calling echo via subprocess?
When you run echo $x from the shell, it is the shell that expands the variable into its value. So if the value of x is 5, for example, the argument that echo receives is 5. It will never know about the variable.
So the solution is to retrieve the value of the environment variable in your Python program and pass that value to echo:
import subprocess
import os
echo_arg = os.environ['UPFW_WORK_PATH']
cmd = subprocess.Popen(["echo", echo_arg], stdout=subprocess.PIPE)
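If the argument mixes variable references with literal text, os.path.expandvars can do the shell-style expansion in Python before the string ever reaches a subprocess. A minimal sketch (the variable is set inside the example only so it is self-contained; on your system it would already exist):

```python
import os

# Set the variable here only for demonstration; in practice
# UPFW_WORK_PATH would already be defined in the environment.
os.environ["UPFW_WORK_PATH"] = "/example/file/path"

# Expand $VAR / ${VAR} references inside the string, shell-style.
expanded = os.path.expandvars("$UPFW_WORK_PATH/logs")
print(expanded)  # /example/file/path/logs
```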
This question already has answers here:
How can I specify working directory for a subprocess
(2 answers)
Closed 8 months ago.
I'm new to subprocess and I have a question.
I can't find a proper solution online.
I want my path to be in a variable --> that will be passed to a function --> that will be passed to a subprocess.
I'm not allowed to show my real code, but this simple example (that I just can't get to work) would help me a lot.
This code snippet should do:
Just take my path from a variable.
"cd" the path in CMD.
Open a file that is located in this path.
So far I tried:
import subprocess
test_path = "C:/randome_path/.."
def Test_Function(test_path):
    subprocess.call("cd", test_path, shell=True)
    subprocess.call("python file.py", shell=True)
Test_Function()
My ErrorMessage is:
TypeError: Test_Function() missing 1 required positional argument: 'test_path'
Thank you for your time!
First, you need to pass a parameter to your function, because that's how you declared it:
Test_Function(test_path) # here the function call with parameter
or using the key-value "approach"
another_path = # ...
Test_Function(test_path=another_path)
Second: the command is expecting a string, not a further parameter:
subprocess.call(f"python file.py", shell=True, cwd=test_path)
Note 1: to execute a command such as python file.py, it is assumed that Python's location is on the PATH environment variable.
Note 2: subprocess may behave somewhat differently under Windows.
Try without shell=True. The command should now be given as a list of strings:
def Test_Function(test_path):
    subprocess.call(["python", "file.py"], cwd=test_path)
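Putting the pieces together, a self-contained sketch of this cwd-based approach (sys.executable is used so the right interpreter is found; the temporary directory and its file.py are stand-ins for the real path, which the question does not show):

```python
import os
import subprocess
import sys
import tempfile

def run_script_in(path):
    # cwd makes the child process start in `path`,
    # which replaces the separate "cd" step entirely.
    return subprocess.call([sys.executable, "file.py"], cwd=path)

# Demo setup: a throwaway directory containing a trivial file.py.
with tempfile.TemporaryDirectory() as test_path:
    with open(os.path.join(test_path, "file.py"), "w") as f:
        f.write("import os\nprint('running in', os.getcwd())\n")
    run_script_in(test_path)  # note: the path is passed explicitly
```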
This question already has answers here:
Python, subprocess, how to pass multiples variables [duplicate]
(2 answers)
Closed 2 years ago.
I'm trying to call subprocess from a Python script. The script calls 'lftp' on Linux with the specific parameters shown below. The problem is that I cannot pass the filename (the filename is different every day).
I have tried almost every combination (for example: ${fname}, $fname, {fname}, and so on) without success. I'm running out of ideas, so I'm asking for help.
Every time, I get this response from the FTPS server: Access failed: 550 The system cannot find the file specified. I can log on and change folders properly.
import subprocess
import datetime
fname=different_every_day
proc = subprocess.call(
    ["lftp", "-u", "user:password", "ftps://servername:990", "-e",
     "set ftp:ssl-protect-data true; set ftp:ssl-force true; "
     "set ssl:verify-certificate no;get ${fname}"])
print(proc)
P.S. Wagnifico was closest to the proper answer, so I will accept his answer, but for others who need a solution, it should be as below:
proc = subprocess.call(
    ["lftp", "-u", "user:pass", "ftps://example.something", "-e",
     "set ftp:ssl-protect-data true; set ftp:ssl-force true; "
     "set ssl:verify-certificate no;cd Ewidencja;pget " + '"' + fname + '"'])
You are mixing Python and environment variables.
When you use ${fname}, bash treats fname as an environment variable, something known by your OS. Since it is not defined, the shell substitutes an empty value, and so the file is not found.
You either need to define fname in your terminal and then read it from Python, as in the question:
export fname='2020-10-29 - All computers.xls'
python your_code.py
Also, you need to add the flag shell=True when you call subprocess.call
Or define it entirely in python:
fname = '2020-10-29 - All computers.xls'
proc = subprocess.call(
    ["lftp", "-u", "user:password", "ftps://servername:990", "-e",
     "set ftp:ssl-protect-data true; set ftp:ssl-force true; "
     "set ssl:verify-certificate no;get " + fname])
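Since the -e argument is just a Python string, an f-string makes the interpolation explicit, and quoting the filename inside the lftp script guards against spaces in the name. A sketch with placeholder credentials and server:

```python
fname = "2020-10-29 - All computers.xls"

# Build the lftp command list; the filename is quoted inside the -e
# script so names containing spaces survive lftp's own parsing.
lftp_script = (
    "set ftp:ssl-protect-data true; set ftp:ssl-force true; "
    f'set ssl:verify-certificate no; get "{fname}"'
)
cmd = ["lftp", "-u", "user:password", "ftps://servername:990", "-e", lftp_script]
print(cmd[-1])
```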
Here, try it:
import os
import time

def python_to_bash(cli_args):
    output = os.popen(cli_args).read()
    return output

file_name = str(time.time()) + ".xls"
python_to_bash('lftp -u user:password ftps://servername:990 '
               '-e "set ftp:ssl-protect-data true; set ftp:ssl-force true; '
               'set ssl:verify-certificate no; get ' + file_name + '"')
I don't know if the command you need is exactly right, but when I need to build any dynamic names, I use this form.
This question already has answers here:
Is it possible to change the Environment of a parent process in Python?
(4 answers)
Closed 4 years ago.
I have a bash script that looks like this:
python myPythonScript.py
python myOtherScript.py $VarFromFirstScript
and myPythonScript.py looks like this:
print("Running some code...")
VarFromFirstScript = someFunc()
print("Now I do other stuff")
The question is, how do I get the variable VarFromFirstScript back to the bash script that called myPythonScript.py?
I tried os.environ['VarFromFirstScript'] = VarFromFirstScript, but this doesn't work (I assume the Python process has a different environment from the calling bash script).
You cannot propagate an environment variable to the parent process. But you can print the variable and assign its output back to a variable name in your shell:
VarFromFirstScript=$(python myOtherScript.py $VarFromFirstScript)
You must not print anything else to stdout in your code; use stderr for other messages:
import sys

sys.stderr.write("Running some code...\n")
VarFromFirstScript = someFunc()
sys.stdout.write(VarFromFirstScript)
An alternative would be to create a file with the variables to set and have your shell parse it (you can generate a shell snippet that the parent shell sources):
import shlex

with open("shell_to_source.sh", "w") as f:
    f.write("VarFromFirstScript={}\n".format(shlex.quote(VarFromFirstScript)))
(shlex.quote avoids code injection from Python; courtesy of Charles Duffy.)
then after calling python:
source ./shell_to_source.sh
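A self-contained sketch of the whole round trip (the variable value here is a stand-in, and shell_to_source.sh is the hypothetical filename from the answer):

```python
import shlex

VarFromFirstScript = "some value; with $pecial chars"

# Write a shell assignment that the parent script can safely source;
# shlex.quote wraps the value so the shell treats it as one literal word.
with open("shell_to_source.sh", "w") as f:
    f.write("VarFromFirstScript={}\n".format(shlex.quote(VarFromFirstScript)))

# The calling bash script would then run:
#   python myPythonScript.py
#   source ./shell_to_source.sh
#   echo "$VarFromFirstScript"
```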
You can only pass environment variables from parent process to child.
When the child process is created the environment block is copied to the child - the child has a copy, so any changes in the child process only affects the child's copy (and any further children which it creates).
To communicate with the parent, the simplest way is to use command substitution in bash, where we capture stdout:
Bash script:
#!/bin/bash
var=$(python myPythonScript.py)
echo "Value in bash: $var"
Python script:
print("Hollow world!")
Sample run:
$ bash gash.sh
Value in bash: Hollow world!
If you have other print statements in Python, you will need to filter the output down to only the data you require, possibly by marking the data with a well-known prefix.
If you have many print statements in Python, then this solution is not scalable, so you might need to use process substitution, like this:
Bash script:
#!/bin/bash
while read -r line
do
    if [[ $line = ++++* ]]
    then
        # Strip out the marker
        var=${line#++++}
    else
        echo "$line"
    fi
done < <(python myPythonScript.py)
echo "Value in bash: $var"
Python script:
def someFunc():
    return "Hollow World"
print("Running some code...")
VarFromFirstScript = someFunc()
# Prefix our data with a well-known marker
print("++++" + VarFromFirstScript)
print("Now I do other stuff")
Sample Run:
$ bash gash.sh
Running some code...
Now I do other stuff
Value in bash: Hollow World
I would source your script; this is the most commonly used method. It executes the script under the current shell instead of loading another one. Because the same shell is used, any env variables you set will still be accessible when the script exits. Either . /path/to/script.sh or source /path/to/script.sh will work (. works in some shells where source doesn't).
This question already has answers here:
Python running in Windows subprocess.call with spaces and parameters
(2 answers)
Closed 4 months ago.
To run a command in Python on Windows, I do:
import subprocess
subprocess.check_output(lsCommand, shell=True)
where lsCommand is a list of strings that make up the command. This works, except when an argument contains spaces. For example, copying and changing a name:
To try and do cp "test 123" test123:
lsCommand = ['cp', 'test 123', 'test123']
subprocess.check_output(lsCommand, shell=True)
fails because it thinks I am trying to do cp "test" "123" test123. The error (while doing Google Storage stuff):
python: can't open file 'c:\GSUtil\gsutil.py cp -n gs://folderl/test': [Errno 22] Invalid argument
Then I try
subprocess.check_output('cp "test 123" test123', shell=True)
Same error. Any ideas?
cp is not a cmd.exe internal command, and therefore you don't need shell=True (though you might need to specify a full path to cp.exe).
The internal interface for starting a new subprocess on Windows uses a string, i.e., it is up to the specific application how to interpret its command line. The default MS C runtime rules (implemented in subprocess.list2cmdline(), which is called implicitly if you pass a list on Windows) should work fine in this case:
#!/usr/bin/env python
from subprocess import check_call
check_call(['cp', 'test 123', 'test123'])
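You can inspect the quoting that list2cmdline applies yourself. A quick sketch (list2cmdline is technically an undocumented helper, so rely on it only for inspection, not in production code):

```python
import subprocess

# Arguments containing spaces are wrapped in double quotes;
# plain arguments are passed through untouched.
line = subprocess.list2cmdline(["cp", "test 123", "test123"])
print(line)  # cp "test 123" test123
```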
If you want to use shell=True, then the program that interprets the command line is cmd.exe, and you should use its escape rules (e.g., ^ is a metacharacter) and pass the command as a single string, exactly as you would type it in the Windows console:
check_call('copy /Y /B "test 123" test123', shell=True)
Obviously, you don't need to start an external process to copy a file in Python:
import shutil
shutil.copy('test 123', 'test123')
For Ubuntu:
subprocess.check_output(['list', 'of', 'commands with spaces'])
For Windows:
subprocess.check_output('single command "string with spaces"')
Thanks for the info that I don't need shell=True.
This question already has answers here:
How to store the result of an executed shell command in a variable in python? [duplicate]
(4 answers)
Closed 7 years ago.
I'm running dmidecode in Linux to get a list of hardware information. What is the best way to read over the output and select certain bits of information? For example, getting the Product Name: part of the dmidecode output?
At the moment, I'm writing the subprocess output to a file and then reading over the file for a given string. This seems like an inefficient way of doing things.
I also know about the python-dmidecode module, but for the life of me I can't get it working; it keeps saying there's no bios attribute.
If you know the specific keyword you are looking for, you can type: dmidecode -s keyword
In your case it would be:
dmidecode -s system-product-name
You can also filter by type. For example:
To return System information:
dmidecode -t1
To return BaseBoard information:
dmidecode -t2
To return Chassis Information:
dmidecode -t3
There are multiple ways to get the output of the command in your Python script using the subprocess module.
subprocess.Popen() - you can start the command-line process using the Popen class, specifying stdout as subprocess.PIPE, and then use the communicate function to get the result. Example -
import subprocess
p = subprocess.Popen(['dmidecode'], stdout=subprocess.PIPE)
result = p.communicate()[0]
subprocess.check_output() - this function returns the output of the command (written to stdout) as a byte string after executing the command. Example -
import subprocess
result = subprocess.check_output(['dmidecode'])
For your particular case, subprocess.check_output() is probably more suitable, as you do not need to provide any input to the process.
With subprocess.Popen() you can also provide input to the process by piping its stdin.
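To pull a single field such as Product Name: out of the captured output without writing a temp file, a small parsing helper works on the string directly. A sketch (the sample text below imitates dmidecode output rather than coming from a real machine; in practice you would feed it the result of check_output):

```python
def extract_field(output, field):
    # Scan each line for "Field: value" and return the value.
    for line in output.splitlines():
        line = line.strip()
        if line.startswith(field + ":"):
            return line.split(":", 1)[1].strip()
    return None

# Stand-in for: subprocess.check_output(['dmidecode'], text=True)
sample = """\
System Information
        Manufacturer: Example Corp
        Product Name: ExampleBoard 9000
        Serial Number: 0000000
"""
print(extract_field(sample, "Product Name"))  # ExampleBoard 9000
```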