This question already has answers here:
subprocess.call() arguments ignored when using shell=True w/ list [duplicate]
(2 answers)
Python subprocess.call seems to ignore parameters
(2 answers)
Closed 4 years ago.
I am trying to automate nmap scans using the subprocess module. I pass three variables to subprocess.call and expect the command to run. Here is my code:
import subprocess
TOOL = 'nmap'
joined = '-p1 5000'
target = 'localhost'
subprocess.call([TOOL, joined, target], shell=True)
This should lead to nmap -p1 5000 localhost being run on my system, which is a valid command. However, the call method seems to recognize only TOOL (nmap), and it just prints out the options for nmap. Does anyone know what I'm missing here?
I don't have nmap installed, but you need to set shell=False and split the parameters:
import subprocess
TOOL = 'ls'
joined = '-a -l'
target = '/tmp'
print(subprocess.call([TOOL, *joined.split(), target], shell=False))
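A minimal sketch of the same idea using shlex.split, which also copes with quoted arguments; the final call is left commented out since it assumes nmap is on PATH:

```python
import shlex
import subprocess

# shlex.split turns the option string into separate list items,
# which is what subprocess needs when shell=False
options = shlex.split("-p1 5000")
cmd = ["nmap", *options, "localhost"]
print(cmd)  # ['nmap', '-p1', '5000', 'localhost']
# subprocess.call(cmd)  # uncomment to run; requires nmap installed
```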
This question already has answers here:
How can I specify working directory for a subprocess
(2 answers)
Closed 8 months ago.
I'm new to subprocess and I have a question.
I can't find a proper solution online.
I want my path to be in a variable, which will be passed to a function, which will be passed to a subprocess.
I'm not allowed to show my real code, but this simple example (that I just can't get to work) would help me a lot.
This code snippet should do:
Just take my path from a variable.
"cd" the path in CMD.
Open a file that is located in this path.
So far I tried:
import subprocess
test_path = "C:/randome_path/.."
def Test_Function(test_path):
    subprocess.call("cd", test_path, shell = True)
    subprocess.call("python file.py", shell = True)

Test_Function()
My ErrorMessage is:
TypeError: Test_Function() missing 1 required positional argument: 'test_path'
Thank you for your time!
First, you need to pass a parameter to your function, because that's how you declared it:
Test_Function(test_path) # here the function call with parameter
or using the keyword-argument approach:
another_path = # ...
Test_Function(test_path=another_path)
Second: the command expects a string, not a further parameter. Use the cwd argument instead of a separate cd call:
subprocess.call("python file.py", shell=True, cwd=test_path)
Note 1: to execute a command such as python file.py, it is assumed that Python's location is on the PATH environment variable.
Note 2: subprocess may behave somewhat differently under Windows.
Try without shell=True. The commands should now be given as a list of strings:
def Test_Function(test_path):
    subprocess.call(["python", "file.py"], cwd=test_path)
This question already has answers here:
Python, subprocess, how to pass multiples variables [duplicate]
(2 answers)
Closed 2 years ago.
I'm trying to call subprocess from a Python script. The script would call 'lftp' on Linux with the specific parameters shown below. The problem is that I cannot pass the filename (the filename will be different every day).
I have tried almost every combination without success (for example: ${fname}, $fname, {fname}, and so on). I'm running out of ideas, so I'm asking for help.
Every time I get this response from the ftps server: Access failed: 550 The system cannot find the file specified. I can log on and change folders properly.
import subprocess
import datetime
fname=different_every_day
proc=subprocess.call(
["lftp", "-u", "user:password", "ftps://servername:990", "-e",
"set ftp:ssl-protect-data true; set ftp:ssl-force true; "
"set ssl:verify-certificate no;get ${fname}"])
print(proc)
P.S. The closest to a proper answer was wagnifico, so I will accept his answer, but for others who need a solution, it should be as below:
proc = subprocess.call(
    ["lftp", "-u", "user:pass", "ftps://example.something", "-e",
     "set ftp:ssl-protect-data true; set ftp:ssl-force true; "
     "set ssl:verify-certificate no;cd Ewidencja;pget " + '"' + fname + '"'])
You are mixing Python and environment variables.
When you write ${fname}, you are using shell syntax, but no shell ever sees it: without shell=True the string is passed to lftp literally, so it looks for a file actually named ${fname} and fails. Even with a shell involved, fname would have to be defined as an environment variable, not just a Python variable.
You either need to define fname in your terminal and then read it in Python, as in the question:
export fname='2020-10-29 - All computers.xls'
python your_code.py
In that case you also need to add shell=True and pass the command as a single string, so the shell can expand ${fname} when you call subprocess.call.
Or define it entirely in python:
fname='2020-10-29 - All computers.xls'
proc=subprocess.call(
["lftp", "-u", "user:password", "ftps://servername:990", "-e",
"set ftp:ssl-protect-data true; set ftp:ssl-force true; "
"set ssl:verify-certificate no;get " + fname])
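Since the example filename contains spaces, a hedged sketch of the same call that also quotes the name inside the -e script (the server details are placeholders from the question):

```python
import subprocess

fname = "2020-10-29 - All computers.xls"

# Quote the filename inside the lftp script so the spaces survive;
# an f-string keeps the concatenation readable
lftp_script = (
    "set ftp:ssl-protect-data true; "
    "set ftp:ssl-force true; "
    "set ssl:verify-certificate no; "
    f'get "{fname}"'
)
cmd = ["lftp", "-u", "user:password", "ftps://servername:990", "-e", lftp_script]
print(cmd[-1])
# subprocess.call(cmd)  # uncomment when lftp and the server are reachable
```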
Here, try it:
import os
import time
def python_to_bash(cli_args):
    output = os.popen(cli_args).read()
    return output
file_name = str(time.time())+".xls"
python_to_bash("lftp -u user:password ftps://servername:990 -e set ftp:ssl-protect-data true set ftp:ssl-force true set ssl:verify-certificate no get "+file_name)
I don't know if the command you need is right, but when I need to build dynamic names, I use this form.
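As a side note, os.popen works, but subprocess.run (Python 3.5+) gives the same one-liner with better error handling; a sketch, using echo as a stand-in for the lftp command:

```python
import subprocess

def python_to_bash(cli_args):
    # capture_output=True collects stdout/stderr; text=True decodes
    # the bytes into a str
    result = subprocess.run(cli_args, shell=True, capture_output=True, text=True)
    return result.stdout

print(python_to_bash("echo hello"))
```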
This question already has answers here:
How to use an existing Environment variable in subprocess.Popen()
(3 answers)
Closed 4 years ago.
I am writing a Python program running in a Linux environment.
I cannot use paramiko etc. in this environment.
I have written a series of methods to interact with the command line; the one with the issue is...
import subprocess
def echo(self, echo_arg):
    cmd = subprocess.Popen(["echo", echo_arg], stdout=subprocess.PIPE)
    return cmd.communicate()[0]
In Linux I have an environment variable UPFW_WORK_PATH.
when I later call...
self.echo("$UPFW_WORK_PATH")
the console output returned is literally:
$UPFW_WORK_PATH
however when I type into the terminal...
echo $UPFW_WORK_PATH
I am returned (not actual path names):
/example/file/path
What is causing this discrepancy between manually typing echo in the terminal and my Python method calling echo via subprocess?
When you run echo $x from the shell, it is the shell that expands the variable into its value. So if the value of x is 5, for example, the argument that echo receives is 5. It will never know about the variable.
So the solution is to retrieve the value of the environment variable in your python program and pass that value to echo:
import subprocess
import os
echo_arg = os.environ['UPFW_WORK_PATH']
cmd = subprocess.Popen(["echo", echo_arg], stdout=subprocess.PIPE)
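If the string may contain several variables, os.path.expandvars performs the same substitution the shell would; a sketch, where UPFW_WORK_PATH is set in the code only for the demonstration:

```python
import os
import subprocess

os.environ["UPFW_WORK_PATH"] = "/example/file/path"  # demo value only

# expandvars substitutes $VAR references the way the shell does
expanded = os.path.expandvars("$UPFW_WORK_PATH")
proc = subprocess.Popen(["echo", expanded], stdout=subprocess.PIPE)
print(proc.communicate()[0])  # b'/example/file/path\n'
```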
This question already has answers here:
How to store the result of an executed shell command in a variable in python? [duplicate]
(4 answers)
Closed 7 years ago.
I'm running dmidecode on Linux to get a list of hardware information. What is the best way to read over the output and select certain bits of information, for example the Product Name: part of the dmidecode output?
At the moment I'm writing the subprocess output to a file and then reading over the file for a given string, which seems an inefficient way of doing things.
Also, I know about the python-dmidecode module, but for the life of me I can't get it working; it keeps saying there's no bios attribute.
If you know the specific keyword you are looking for, you can type: dmidecode -s keyword
In your case it would be:
dmidecode -s system-product-name
You can also filter by type. For example:
To return System information:
dmidecode -t1
To return BaseBoard information:
dmidecode -t2
To return Chassis Information:
dmidecode -t3
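Called from Python, that keyword lookup can be wrapped in a few lines; a sketch, noting that dmidecode itself requires root privileges and must be on PATH:

```python
import subprocess

def dmi_value(keyword):
    # Thin wrapper around "dmidecode -s <keyword>"; check_output raises
    # CalledProcessError if dmidecode exits non-zero (e.g. not root)
    return subprocess.check_output(
        ["dmidecode", "-s", keyword], text=True
    ).strip()

# Example (run as root):
# print(dmi_value("system-product-name"))
```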
There are multiple ways with which you can get the output of the command in your python script using subprocess module.
subprocess.Popen() - you can start the command line process using this Popen class, specifying stdout as subprocess.PIPE, and then use the communicate function to get the results. Example -
import subprocess
p = subprocess.Popen(['dmidecode'] , stdout=subprocess.PIPE)
result = p.communicate()[0]
subprocess.check_output() - this function returns the output of the command (output to stdout) as a byte string after executing the command. Example -
import subprocess
result = subprocess.check_output(['dmidecode'])
For your particular case, subprocess.check_output() is most probably more suited as you do not need to provide any inputs to the process.
With subprocess.Popen() you can also provide inputs to the process, by piping its stdin.
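To pick out just the Product Name without writing to a file, the check_output result can be scanned line by line; a sketch, with the field name assumed from typical dmidecode output and root required to run the tool itself:

```python
import subprocess

def product_name(output=None):
    # Pass dmidecode's output in for testing, or let the function
    # run the command itself (which needs root)
    if output is None:
        output = subprocess.check_output(["dmidecode"], text=True)
    for line in output.splitlines():
        line = line.strip()
        if line.startswith("Product Name:"):
            return line.split(":", 1)[1].strip()
    return None
```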
This question already has answers here:
How to read/process command line arguments?
(22 answers)
Closed 8 years ago.
So I've been at this one for a little while and can't seem to get it. I'm trying to execute a Python script via the terminal and want to pass a string value with it. That way, when the script starts, it can check that value and act accordingly. Like this:
sudo python myscript.py mystring
How can I go about doing this? I know there's a way to start and stop a script using bash, but that's not really what I'm looking for. Any and all help accepted!
Try the following inside your script:
import sys
arg1 = str(sys.argv[1])
print(arg1)
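Since sys.argv[1] raises IndexError when the argument is missing, a slightly more defensive sketch wraps the lookup in a small function:

```python
import sys

def get_arg(argv):
    # Return the first command-line argument, or None when it is
    # missing, so the script can fail gracefully
    return argv[1] if len(argv) > 1 else None

if __name__ == "__main__":
    print(get_arg(sys.argv))
```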
You only need quotes around the argument if it contains spaces or shell-special characters, but they never hurt:
sudo python myscript.py 'mystring'
Also, you shouldn't have to run it with sudo.