Using the subprocess module in Python to run an isql command

I have to run the isql command using Python.
Currently I'm doing it this way:
ps = subprocess.Popen("""./isql -I /app/sybase/interfaces_global -S %s -U %s -P %s -D %s -s "|" -w 99999 <<EOF
SET NOCOUNT ON
%s
go
EOF""" % (mdbserver, muserid, mpassword, mdatabase, User_Query),
                      stdout=subprocess.PIPE, shell=True, cwd=sybase_path)
But this method depends on the /tmp directory of my server because of the here document: every time I run it, it creates a temporary file in /tmp, and when /tmp is full the script fails to run the query against the database.
How can I use the same command with shell=False, so that I can get rid of the here document and the temporary file creation?
This doesn't work:
ps = subprocess.Popen("./isql","-I","/app/sybase/interfaces_global","-S",mdbserver,"-U",muserid,"-P",mpassword,"-D",mdatabase,"-s","|","-w","99999","\nSET NOCOUNT ON\n",User_Query,"\ngo",stdout=subprocess.PIPE,shell=False,cwd=sybase_path)

You could replace the here-document by setting stdin=PIPE and providing the input as a string via the .communicate() method, as @Hans Then suggested:
from subprocess import Popen, PIPE
from textwrap import dedent

isql = Popen(['./isql', '-I', '/app/sybase/...',
              '-S', mdbserver,
              '-U', muserid, ...,
              '-w', '99999'], stdin=PIPE, stdout=PIPE, cwd=sybase_path)
output = isql.communicate(dedent("""\
    SET NOCOUNT ON
    {}
    go
    """.format(User_Query)))[0]

Check out the subprocess communicate() method. You can use it to send the isql commands to the interpreter.
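On Python 3, communicate() expects bytes unless the pipes are opened in text mode. Here is a minimal sketch of the same approach with universal_newlines=True; the elided flags are copied back in from the question's original command, so adjust them to your own setup:

from subprocess import Popen, PIPE
from textwrap import dedent

# Sketch only: flags taken from the question's command line, not verified here.
isql = Popen(['./isql', '-I', '/app/sybase/interfaces_global',
              '-S', mdbserver, '-U', muserid, '-P', mpassword,
              '-D', mdatabase, '-s', '|', '-w', '99999'],
             stdin=PIPE, stdout=PIPE, universal_newlines=True, cwd=sybase_path)
output = isql.communicate(dedent("""\
    SET NOCOUNT ON
    {}
    go
    """.format(User_Query)))[0]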

Related

Run shell script in python with specific parameters

I wish to run a script, let's call it api.sh. The script takes various arguments:
-t token
-r rules.json
-s data.json
and it is going to create a new json file, e.g. data_2.json.
When I run this in terminal I use the following command:
./api.sh -t token -r rules.json -s data.json > data_2.json
However, I wish to run this command line in Python. Any suggestions are appreciated.
Thanks,
I don't know if it supports Python, but you can use getopts.
Does this test.py work:
import subprocess
from subprocess import Popen

path_to_output_file = 'data_2.json'
myoutput = open(path_to_output_file, 'w+')
p = Popen(["./api.sh", "-t", "token", "-r", "rules.json", "-s", "data.json"],
          stdout=myoutput, stderr=subprocess.PIPE, universal_newlines=True)
output, errors = p.communicate()
You can refer to the subprocess documentation for details.
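For what it's worth, on Python 3.5+ the same redirection can be written with subprocess.run; a minimal sketch assuming the same api.sh arguments as above:

import subprocess

# Sketch only: same arguments as above; check=True raises if api.sh exits non-zero.
with open('data_2.json', 'w') as out:
    result = subprocess.run(
        ["./api.sh", "-t", "token", "-r", "rules.json", "-s", "data.json"],
        stdout=out, stderr=subprocess.PIPE, universal_newlines=True, check=True)
print(result.stderr)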

subprocess command execution

What is the best way to execute the below command in Python in a single line?
echo $(readlink /sys/dev/block/$(mountpoint -d /))
I tried doing it with individual os.system(cmd) calls: running "mountpoint -d /" first, taking the output, appending it with "readlink /sys/dev/block/${0}".format(out.strip()), and doing an echo, which works. I also tried subprocess, subprocess.Popen and subprocess.check_output, but they raise CalledProcessError.
cmd = "echo $(readlink /sys/dev/block/$(mountpoint -d /))"
You have to call the subcommand separately. And you can use Python methods to read the link:
import subprocess
import os
path = "/"
device = subprocess.run(["mountpoint", "-d", path], stdout=subprocess.PIPE, encoding="utf8").stdout.strip()
link = os.readlink("/sys/dev/block/" + device)
print(link)
You probably want to use something like the following:
cmd = "bash -c 'echo $(readlink /sys/dev/block/$(mountpoint -d /))'"
echo doesn't substitute $() blocks; that's what your shell does, so you have to invoke a shell. os.system(cmd) should work then.
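If you really want it in a single line from Python, a minimal sketch that delegates the $() substitution to a shell (same command as in the question, Python 3.6+ assumed for the encoding argument):

import subprocess

# Sketch only: shell=True lets /bin/sh perform the $() command substitution.
link = subprocess.check_output(
    "readlink /sys/dev/block/$(mountpoint -d /)", shell=True, encoding="utf8").strip()
print(link)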

wget missing url with shlex and subprocess

I'm struggling to understand why this fails with a wget: missing URL error:
import shlex
import subprocess
copy_command = "wget -O - 'http://example.com/somepath/somefile.txt?someparam=test' | sshpass -p pass ssh user@localhost -p 2222 \"cat - > /upload/somefile.txt\""
cmd = shlex.split(copy_command, posix=False)
with subprocess.Popen(
    cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=True
) as proc:
    output, error = proc.communicate()
What am I missing here? If I just give subprocess the copy_command string directly, then it works without issues.
Setting up a pipeline requires the parent process to spawn all the programs involved and then connect (pipe) the stdio of one to another.
The Python documentation for subprocess explains how to do this.
It works with a string argument and shell=True because then Python just hands the command line off to a sub-shell, and that shell handles all those details.
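A minimal sketch of building that pipeline in Python itself (no shell), following the "replacing shell pipeline" pattern from the subprocess documentation and assuming the same wget/sshpass commands as in the question:

import subprocess

# Sketch only: host, port and paths are taken from the question's command line.
wget = subprocess.Popen(
    ["wget", "-O", "-", "http://example.com/somepath/somefile.txt?someparam=test"],
    stdout=subprocess.PIPE)
ssh = subprocess.Popen(
    ["sshpass", "-p", "pass", "ssh", "user@localhost", "-p", "2222",
     "cat - > /upload/somefile.txt"],
    stdin=wget.stdout)
wget.stdout.close()  # let wget receive SIGPIPE if ssh exits early
ssh.communicate()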

How to run the bash command as a system user without giving that user the right to run commands as any user

I have written a python script which includes this line:
response = subprocess.check_output(['/usr/bin/sudo /bin/su - backup -c "/usr/bin/ssh -q -o StrictHostKeyChecking=no %s bash -s" <<\'EOF\'\nPATH=/usr/local/bin:$PATH\nmvn --version|grep -i Apache|awk \'{print $3}\'|tr -d \'\n\'\nEOF' % i], shell=True)
This is in a for loop that goes through a list of hostnames; for each one I want to check the result of the command. This works fine when I run it myself; however, the script is to be run by a system user (shinken, a Nagios fork) and at that point I hit an issue. It works if I give the shinken user unrestricted sudo rights:
shinken ALL=(ALL) NOPASSWD: ALL
However, I wanted to restrict the user to only allow it to run as the backup user:
shinken ALL=(backup) NOPASSWD: ALL
But when I run the script I get:
sudo: no tty present and no askpass program specified
I have read around this and tried a few things to fix it. I tried adding -t to my ssh command, but that didn't help. I believe I should be able to run the command with something similar to:
response = subprocess.check_output(['/usr/bin/sudo -u backup """ "/usr/bin/ssh -q -o StrictHostKeyChecking=no %s bash -s" <<\'EOF\'\nPATH=/usr/local/bin:$PATH\njava -version|grep -i version|awk \'{print $3}\'|tr -d \'\n\'\nEOF""" ' % i], shell=True)
But then I get this response:
subprocess.CalledProcessError: Command '['/usr/bin/sudo -u backup """ "/usr/bin/ssh -q -o StrictHostKeyChecking=no bamboo-agent-01 bash -s" <<\'EOF\'\nPATH=/usr/local/bin:$PATH\njava -version|grep -i version|awk \'{print $3}\'|tr -d \'\n\'\nEOF""" ']' returned non-zero exit status 1
If I run the command manually I get:
sudo: /usr/bin/ssh: command not found
Which is strange because that's where it lives.... I've no idea if what I'm trying is even possible. Thanks for any suggestions!
As for sudo:
shinken ALL=(backup) NOPASSWD: ALL
...only works when you switch directly from shinken to backup. You aren't doing that here: sudo su - backup tells sudo to switch to root and to run the command su - backup as root. Obviously, then, if you're going to use sudo su (which I've advised against elsewhere), your /etc/sudoers configuration needs to support that.
Because your /etc/sudoers doesn't allow the switch to root you're requesting, sudo tries to prompt for a password, which requires a TTY, and that is what causes the failure.
Below, I'm rewriting the script to switch directly from shinken to backup, without going through root and running su:
As for the script:
import subprocess

remote_script = '''
PATH=/usr/local/bin:$PATH
mvn --version 2>&1 | awk '/Apache/ { print $3 }'
'''

def maven_version_for_host(hostname):
    # storing the command lets us pass it when constructing a CalledProcessError later;
    # could move it directly into the Popen creation if you don't need that.
    cmd = [
        'sudo', '-u', 'backup', '-i', '--',
        'ssh', '-q', '-o', 'StrictHostKeyChecking=no', str(hostname),
        'bash -s'  # arguments in remote-command position to ssh all get concatenated
                   # together, so passing them as one command aids clarity.
    ]
    proc = subprocess.Popen(cmd,
                            stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    response, error_string = proc.communicate(remote_script)
    if proc.returncode != 0:
        raise subprocess.CalledProcessError(proc.returncode, cmd, error_string)
    return response.split('\n', 1)[0]
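Hypothetical usage, standing in for the question's loop over hostnames (the second name below is made up for illustration):

# Sketch only: replace the list with your real inventory of hosts.
for host in ['bamboo-agent-01', 'bamboo-agent-02']:
    print(host, maven_version_for_host(host))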

Python subprocess piping to stdin

I'm trying to use Python's Popen to achieve what this does on the command line:
echo "hello" | docker exec -i $3 sh -c 'cat >/text.txt'
The goal is to pipe the "hello" text into the docker exec command and have it written to the docker container.
I've tried this but can't seem to get it to work.
import subprocess
from subprocess import Popen, PIPE, STDOUT
p = Popen(('docker', 'exec', '-i', 'nginx-ssl', 'sh', '-c', 'cat >/text.txt'), stdin=subprocess.PIPE)
p.stdin.write('Hello')
p.stdin.close()
You need to give stdin the newline as well:
p.stdin.write('Hello\n')
It is the same with sys.stdout: you don't need to give print a newline because it adds one for you, but any writing to a file that you do manually needs to include it. You should use p.communicate('Hello') instead, though; it's made for that.
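A minimal sketch of the communicate() version, assuming the same nginx-ssl container name as in the question:

from subprocess import Popen, PIPE

# Sketch only: communicate() writes the string to stdin and then closes it.
p = Popen(['docker', 'exec', '-i', 'nginx-ssl', 'sh', '-c', 'cat > /text.txt'],
          stdin=PIPE, universal_newlines=True)
p.communicate('hello\n')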
