How to execute an external Windows command with multiple parameters using Python

I need to execute the command below using Python but am unable to:
cmd="C:\Program Files\Java\jdk1.7.0_51\bin\java.exe" -classpath ./;sqljdbc4.jar InsertTestIncidentData -h 172.20.240.57 -p 1433 -u sa -w Recnex#1 -d ePO_WINEP02 -n 10

import os
cmd='"C:\Program Files\Java\jdk1.7.0_51\bin\java.exe" -classpath ./;sqljdbc4.jar InsertTestIncidentData -h 172.20.240.57 -p 1433 -u sa -w Recnex#1 -d ePO_WINEP02 -n 10'
os.system(cmd)
Since there is a space in "Program Files", the path itself must be wrapped in double quotes inside the command string.

cmd = r'"C:\Program Files\Java\jdk1.7.0_51\bin\java.exe" -classpath ./;sqljdbc4.jar InsertTestIncidentData -h 172.20.240.57 -p 1433 -u sa -w Recnex#1 -d ePO_WINEP02 -n 10'
Use a raw string (r'...') here: otherwise backslash sequences such as the \b in \bin are interpreted as escape characters.
You can use os.system:
import os
os.system(cmd)
or with subprocess:
import subprocess
ret = subprocess.Popen(cmd, shell=True)
print("Return status:", ret.wait())
If you want to capture and verify the command's output:
import subprocess
output = subprocess.check_output(cmd, shell=True)
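As an alternative sketch that sidesteps manual quoting entirely, pass the program and its arguments as a list. subprocess.list2cmdline is the helper subprocess itself uses on Windows to join such a list into one command line, so printing it shows what would actually be executed (the java path and arguments below are the ones from the question, trimmed):

```python
import subprocess

# Each argument is its own list element; no manual quoting required.
args = [
    r"C:\Program Files\Java\jdk1.7.0_51\bin\java.exe",
    "-classpath", "./;sqljdbc4.jar",
    "InsertTestIncidentData",
    "-h", "172.20.240.57", "-p", "1433",
]
# list2cmdline shows the command line subprocess would build on Windows:
# the path containing a space comes out quoted automatically.
print(subprocess.list2cmdline(args))
# subprocess.call(args)  # would actually run it, with no shell involved
```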

Related

Executing a bash command in a docker container from python script on host fails

I am trying to execute a bash command from a Python script; the command is wrapped in a docker exec call because it needs to run inside a container.
This script is executed on the host machine:
import shlex
import subprocess

command_line_string = f"java -cp {omnisci_utility_path}:{driver_path} com.mapd.utility.SQLImporter" \
                      f" -u {omni_user} -p {omni_pass} -db {database_name} --port {omni_port}" \
                      f" -t {self.table_name} -su {denodo_user} -sp {denodo_pass}" \
                      f" -c {self.reader.connection_string}" \
                      f" -ss \"{read_data_query}\""
# in prod we have docker, so we wrap it in docker exec:
if args.env_type == "prod":
    command_line_string = f"docker exec -t {args.container_id} /bin/bash -c \"{command_line_string}\""
command_line_args = shlex.split(command_line_string)
command_line_process = subprocess.Popen(
    command_line_args,
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
)
process_output, _ = command_line_process.communicate()
However, when I execute the command, supplying the arguments I get a "Java Usage" response suggesting that the java command I am invoking did not have the correct parameters:
2021-09-01:09:19:09 [default_omnisci_ingestion.py:64] INFO - docker exec -t 5d874bffcdf8 /bin/bash -c "java -cp /omnisci/bin/omnisci-utility-5.6.5.jar
:/root/denodo-8-vdp-jdbcdriver.jar com.mapd.utility.SQLImporter -u admin -p mypass -db omnisci --port 6274 -t MyTable -su sourceDBuser -sp sourceDBpass -c jdbc:vdb://sourceDBURL -ss "SELECT
basin as Basin,
reservoir as Reservoir, cast(case when wkt like '%M%' Then wkt Else replace(wkt, 'POLYGON ', 'MULTIPOLYGON (') || ')' End as varchar(999999)) as wkt
FROM
schema.myTable;""
2021-09-01:09:19:10 [command_executor.py:10] INFO - Usage: java [options] <mainclass> [args...]
2021-09-01:09:19:10 [command_executor.py:10] INFO - (to execute a class)
2021-09-01:09:19:10 [command_executor.py:10] INFO - or java [options] -jar <jarfile> [args...]
2021-09-01:09:19:10 [command_executor.py:10] INFO - (to execute a jar file)
...
I know that the problem is due to the use of quotes but I just don't understand how to go about them.
For example, the java command I am nesting inside /bin/bash -c needs to be wrapped in quotes, like so:
/bin/bash -c "java -cp ..."
Note: the command works fine in our dev env, where we do not have the docker setup and execute the command as-is, but on Stage the system runs in a container, which is why I need docker exec to invoke the same command in the container.
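The quoting behaviour at play can be reproduced with shlex.split on a reduced example (the container id and jar name below are placeholders, not the ones from the log). A double-quoted section survives splitting as a single argument, which is exactly what /bin/bash -c needs; an unescaped double quote inside it, like the ones around the SELECT query, terminates that argument early.

```python
import shlex

# Placeholder container id and class name; the point is the quoting.
cmd = 'docker exec -t abc123 /bin/bash -c "java -cp app.jar Main"'
args = shlex.split(cmd)
# The whole double-quoted inner command stays one argument:
print(args[-1])  # java -cp app.jar Main
```

Escaping the inner quotes (\" within the outer pair), or building the argument list directly instead of one big string, avoids the truncation.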

Assign variable read from file in Makefile recipe

I am trying to do the following in a Makefile recipe:

1. Get the server container IP using a Python script.
2. Build the command to run within the docker container.
3. Run the command in the docker container.

test:
	SIP=$(shell python ./scripts/script.py get-server-ip)
	CMD="iperf3 -c ${SIP} -p 33445"
	docker exec server ${CMD}
I get this
$ make test
SIP=172.17.0.6
CMD="iperf3 -c -p 33445"
docker exec server
"docker exec" requires at least 2 arguments.
See 'docker exec --help'.
Usage: docker exec [OPTIONS] CONTAINER COMMAND [ARG...]
Run a command in a running container
make: *** [test] Error 1
I ended up with something like this, joining the commands with ; \ so they all run in a single shell invocation (which is why the variable assignments now survive):
SERVER_IP=$(shell python ./scripts/script.py get-server-ip); \
SERVER_CMD="iperf3 -s -p ${PORT} -4 --logfile s.out"; \
CLIENT_CMD="iperf3 -c $${SERVER_IP} -p ${PORT} -t 1000 -4 --logfile c.out"; \
echo "Server Command: " $${SERVER_CMD}; \
echo "Client Command: " $${CLIENT_CMD}; \
docker exec -d server $${SERVER_CMD}; \
docker exec -d client $${CLIENT_CMD};
This seems to work ok. Would love to hear if there are other ways of doing this.
You could write something like this. Here I used a target-specific variable, assuming the IP address is required only in this rule. iperf_command is defined as a variable since the format looks fixed apart from the IP address, which is injected via the call function. Also, as the rule is not supposed to produce a file named after the target, I added a .PHONY declaration as well.
iperf_command = iperf3 -c $1 -p 33445

.PHONY: test
test: iperf_server_ip = $(shell python ./scripts/script.py get-server-ip)
test:
	docker exec server $(call iperf_command,$(iperf_server_ip))
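Since the IP lookup already lives in a Python script, another option is to drive both steps from Python and let Make simply invoke the script. A sketch under those assumptions (the script path and port come from the question; build_client_cmd is a hypothetical helper):

```python
import subprocess

def build_client_cmd(server_ip, port=33445):
    # Building the command as a list sidesteps Make's per-line shells
    # and all of the quoting issues.
    return ["docker", "exec", "server", "iperf3", "-c", server_ip, "-p", str(port)]

print(build_client_cmd("172.17.0.6"))
# server_ip = subprocess.check_output(
#     ["python", "./scripts/script.py", "get-server-ip"], text=True).strip()
# subprocess.run(build_client_cmd(server_ip), check=True)
```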

How to run the docker commands from python?

I want to run a set of docker commands from python.
I tried creating a script like the one below and running it from Python using a paramiko SSH client, to connect to the machine where docker is running:
#!/bin/bash
# Get container ID
container_id="$(docker ps | grep hello | awk '{print $1}')"
docker exec -it $container_id sh -c "cd /var/opt/bin/ && echo $1 && echo $PWD && ./test.sh -q $1"
But docker exec ... never gets executed.
So I tried to run the Python script below directly on the machine where docker is running:
import subprocess
docker_run = 'docker exec 7f34a9c1b78f /bin/bash -c "cd /var/opt/bin/ && ls -a"'.split()
subprocess.call(docker_run, shell=True)
I get a message: "Usage: docker COMMAND..."
But I get the expected results if I run the command
docker exec 7f34a9c1b78f /bin/bash -c "cd /var/opt/bin/ && ls -a"
directly on the machine.
How can I run multiple docker commands from a Python script? Thanks!
You have a mistake in your call to subprocess.call. subprocess.call expects a command with a series of parameters. You've given it a list of parameter pieces.
This code:
docker_run = "docker exec 7f34a9c1b78f /bin/bash -c \"cd /var/opt/bin/ && ls -a\"".split()
subprocess.call(docker_run, shell=True)
Runs this:
subprocess.call([
    'docker', 'exec', '7f34a9c1b78f', '/bin/bash', '-c',
    '"cd', '/var/opt/bin/', '&&', 'ls', '-a"'
], shell=True)
Instead, I believe you want:
subprocess.call([
    'docker', 'exec', '7f34a9c1b78f', '/bin/bash', '-c',
    '"cd /var/opt/bin/ && ls -a"'  # Notice how this is only one argument.
], shell=True)
You might need to tweak that second call. I suspect you don't need the quotes ('cd /var/opt/bin/ && ls -a' might work instead of '"cd /var/opt/bin/ && ls -a"'), but I haven't tested it.
The following worked. Removing the double quotes (and note that shell=True is dropped as well):
subprocess.call([
    'docker', 'exec', '7f34a9c1b78f', '/bin/bash', '-c',
    'cd /opt/teradata/tdqgm/bin/ && ./support-archive.sh -q 6b171e7a-7071-4975-a3ac-000000000241'
])
If you are not sure how a command should be split up to pass as arguments to a subprocess function, the shlex module can do it for you:
https://docs.python.org/2.7/library/shlex.html#shlex.split
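In the other direction, shlex.quote (Python 3) wraps a string in single quotes as needed, which is a safe way to build the /bin/bash -c argument; a sketch using the container id and command from the question:

```python
import shlex

inner = "cd /var/opt/bin/ && ls -a"
# quote() makes the inner command survive as exactly one word...
quoted = shlex.quote(inner)
print(quoted)
# ...and split() is the inverse: one string back into an argument list.
args = shlex.split("docker exec 7f34a9c1b78f /bin/bash -c " + quoted)
print(args[-1])  # cd /var/opt/bin/ && ls -a
```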

Execute a shell command with python variables

I had a bash script that generated a username, password, and SSH key for a user.
The part creating the SSH key:
su $user -c "ssh-keygen -f /home/$user/.ssh/id_rsa -t rsa -b 4096 -N ''"
How can I do the same in Python with os.system? I tried this:
os.system('su %s -c "ssh-keygen -f /home/%s/.ssh/id_rsa -t rsa -b 4096 -N ''"', user)
TypeError: system() takes at most 1 argument (2 given)
Also I tried:
os.system('su user -c "ssh-keygen -f /home/user/.ssh/id_rsa -t rsa -b 4096 -N ''"')
Of course, it doesn't work either.
Format the command string in Python first, then run it with the os package; for instance:
import os
user = 'joe'
ssh_dir = "/home/{}/.ssh/id_rsa".format(user)
os.system("ssh-keygen -f {} -t rsa -b 4096 -N ''".format(ssh_dir))
os.system is very close to a bash command line because it runs the command through a shell (like its cousin subprocess.call with shell=True).
In your case there is little benefit in using subprocess, since your command itself runs another command (su ... -c), so you cannot fully use subprocess's argument protection anyway.
Pass the exact command; the only change needed is to escape the single quotes, otherwise Python sees '' as string end + string start (your string is already delimited by single quotes) and they are eliminated.
Check this simpler example:
>>> 'hello '' world'
'hello world'
>>> 'hello \'\' world'
"hello '' world"
that's a kind of worst case, when you cannot use either double or single quotes to protect the string because you're using the other flavour within. In that case, escape the quotes using \ (and substitute the Python variable, since $user would only be expanded by the shell from its environment):
os.system('su {0} -c "ssh-keygen -f /home/{0}/.ssh/id_rsa -t rsa -b 4096 -N \'\'"'.format(user))
Use the subprocess module:
import subprocess
username = 'user'
result, err = subprocess.Popen(
    'su %s -c "ssh-keygen -f /home/%s/.ssh/id_rsa -t rsa -b 4096 -N \'\'"' % (username, username),
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,  # without this, err is always None
    shell=True
).communicate()
if err:
    print('Something went wrong')
else:
    print(result)
Edit: this is the 'fast' way to do it, but you shouldn't use shell=True if you can't control the input, since it allows arbitrary code execution, as said here
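A sketch of the no-shell variant: pass su its arguments as a list, so only the inner ssh-keygen string needs any quoting at all (keygen_cmd is a hypothetical helper, the username is a placeholder, and actually running it requires root):

```python
import subprocess

def keygen_cmd(user):
    # su -c takes the inner command as a single string argument,
    # so the empty-passphrase quotes need no extra escaping here.
    return [
        "su", user, "-c",
        "ssh-keygen -f /home/{}/.ssh/id_rsa -t rsa -b 4096 -N ''".format(user),
    ]

print(keygen_cmd("joe"))
# subprocess.run(keygen_cmd("joe"), check=True)  # requires root
```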

How to do this Python subprocess call without using shell=True?

For example, in /tmp I have files ending in .txt, .doc, and .jpg that I'd like to delete in one step using shred and subprocess.
The following does the job:
subprocess.call('bash -c "shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}"', shell=True)
How would I run this command without using shell=True? I've tried the following:
subprocess.call(['bash', '-c', '"shred -n 10 -uz /tmp/{*.txt,*.pdf,*.doc}"'])
subprocess.call(['bash', '-c', 'shred', '-n 10', '-uz', '/tmp/{*.txt,*.pdf,*.doc}'])
Any suggestions?
I believe that other guy is spot on (haven't tried it myself, though). However, if you ever find yourself having similar issues again, shlex.split(s) might be helpful. It takes the string s and splits it "using shell-like syntax".
In [3]: shlex.split(s)
Out[3]: ['bash', '-c', 'shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}']
subprocess.call(['bash', '-c', 'shred -n 10 -uz /tmp/{*.txt,*.pdf,*.doc}'])
You can tell how a command is expanded and split up with:
$ printf "Argument: %s\n" bash -c "shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}"
Argument: bash
Argument: -c
Argument: shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}
In the more general case (but overkill here), if you're ever in doubt of what's executed by something with which parameters, you can use strace:
$ cat script
import subprocess
subprocess.call('bash -c "shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}"', shell=True)
$ strace -s 1000 -fe execve python script
...
execve("/bin/bash", ["bash", "-c", "shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}"], [/* 49 vars */]) = 0
...
$
If the command is coming from a trusted source e.g., it is hardcoded then there is nothing wrong in using shell=True:
#!/usr/bin/env python
from subprocess import check_call
check_call("shred -n 10 -uz /tmp/{*.txt,*.pdf,*.doc}",
           shell=True, executable='/bin/bash')
/bin/bash is used to support the {} brace expansion inside the command; thanks to executable='/bin/bash', the command is not run by the default /bin/sh.
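If you would rather drop the shell entirely, a sketch that does the expansion in Python with the glob module instead of bash (the patterns are from the question; the shred call is left commented out so the snippet is safe to run, and expand is a hypothetical helper):

```python
import glob
import itertools

def expand(patterns):
    # Expand each wildcard pattern in Python instead of relying on bash.
    return list(itertools.chain.from_iterable(sorted(glob.glob(p)) for p in patterns))

files = expand(["/tmp/*.txt", "/tmp/*.pdf", "/tmp/*.doc"])
# import subprocess
# if files:
#     subprocess.call(["shred", "-n", "10", "-uz"] + files)  # no shell needed
```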
