Python subprocess calling mysqldump with .cnf file fails when shell=False - python

What I want to do is call mysqldump to make a backup of a database table.
I have assembled a list of command-line arguments, where db and tbl are the database and table names, respectively, and cnf is a properly formatted MySQL configuration file (containing host, user, and password):
args = ["mysqldump", "--defaults-extra-file=" + cnf, db, tbl]
print(" ".join(args))
The output of the print is,
mysqldump --defaults-extra-file=dbserver.cnf test mytable
When I copy/paste the above line into the (bash) shell, it works.
When I use subprocess.Popen, the command fails,
import subprocess as sp
...
proc = sp.Popen(args, shell=False, stdin=sp.PIPE, stdout=sp.PIPE)
stdout, stderr = proc.communicate()  # communicate() returns a (stdout, stderr) tuple
retcode = proc.returncode            # communicate() has already waited for the process
if retcode > 0:
    print("error", retcode)
But when I join the args together, and call subprocess.Popen with shell=True,
the command works as desired,
cmd = " ".join(args)
proc = sp.Popen(cmd, shell=True, stdin=sp.PIPE, stdout=sp.PIPE)
stdout, stderr = proc.communicate()
retcode = proc.returncode
if retcode > 0:
    print("error", retcode)
Further investigation reveals that the first command does not seem to use the user/password credentials from the config file, but the second variant does.
Responses to other questions about subprocess.Popen make it clear that one should avoid shell=True, yet the first approach above fails.
Can anyone identify what I am doing wrong above?
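One way to diagnose the shell=False failure is to capture stderr instead of discarding it, since that is where mysqldump reports credential and option problems. A minimal sketch (Python 3); a stand-in command is used here because mysqldump needs a live server, so substitute the args list from the question:

```python
import subprocess as sp

# Stand-in for mysqldump: writes an error to stderr and exits non-zero,
# the same shape of failure the question describes.
args = ['python3', '-c',
        'import sys; sys.stderr.write("Access denied\\n"); sys.exit(2)']
proc = sp.run(args, stdout=sp.PIPE, stderr=sp.PIPE, text=True)
if proc.returncode:
    # The captured stderr usually explains the failure, e.g. a relative
    # --defaults-extra-file path being resolved against a different cwd.
    print('error', proc.returncode, proc.stderr)
```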

Related

Python Expect Like Behavior to capture /dev/tty (Provide passphrase to ssh / ssh-add)

I would like to respond to the ssh / ssh-add password prompt when launched using Python subprocess, but since ssh reads the password directly from /dev/ttyXX and not stdin, standard PIPEs do not work.
Is there a way to wrap subprocess with a virtual tty that I can control in Python, so that I can send a password to the pty programmatically (as Expect seems able to do)? I would rather not use an Expect-like module for Python if possible.
import os
import subprocess
import sys

# Does not work -- ssh-add still grabs the tty
my_env = os.environ.copy()
my_env['SSH_ASKPASS'] = my_env['HOME'] + '/bin/foo'
keyfile = './id_rsa'
keypass = b'password\n'
proc = subprocess.Popen('/usr/bin/ssh-add -t 86400 ' + keyfile,
                        shell=True,
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        env=my_env)
# Always results in a timeout, as keypass is sent to stdin, NOT /dev/ttyXX
stdout_data, stderr = proc.communicate(input=keypass, timeout=60)
if stderr:
    print("#LOG ERROR - Could not add password to ssh-agent [" + keyfile + "].")
    sys.exit(1)
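The stdlib pty module can cover the Expect-style part without a third-party dependency: pty.fork() makes a fresh pseudo-terminal the child's controlling tty, so a program that opens /dev/tty reads whatever is written to the master side. A sketch with a stand-in child that reads /dev/tty; for the real case, replace the execvp line with the ssh-add invocation:

```python
import os
import pty

# pty.fork() gives the child a new pseudo-terminal as its controlling tty,
# so /dev/tty in the child resolves to the pty, not to our terminal.
pid, master_fd = pty.fork()
if pid == 0:  # child: stand-in for ssh-add, reads its "password" from /dev/tty
    os.execvp('python3', ['python3', '-c',
              "print('got:', open('/dev/tty').readline().strip())"])

os.write(master_fd, b'password\n')  # answer the tty prompt programmatically

output = b''
while True:
    try:
        chunk = os.read(master_fd, 1024)
    except OSError:  # on Linux, reads fail with EIO once the child exits
        break
    if not chunk:
        break
    output += chunk
os.waitpid(pid, 0)
```

In a real wrapper you would read from master_fd until the passphrase prompt appears before writing, rather than writing immediately.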

How can I read the error content outputs of mysql command executed from python

bash_fc = rf"mysql -u {source_user_name} -h {source_ipv4_addr} --password={source_db_password}"
When I use the following functions for the above command,
import os
from subprocess import PIPE, run

def cons_r(cmd):
    response = run(cmd, stdout=PIPE, stderr=PIPE, universal_newlines=True, shell=True)
    return response.stdout

response = os.popen(bash_fc)
mysql: [Warning] Using a password on the command line interface can be insecure.
mysql: Unknown OS character set 'cp857'.
mysql: Switching to the default character set 'utf8mb4'.
ERROR 1045 (28000): Access denied for user 'root'@'pc.mshome.net' (using password: YES)
I can't read the output, is there a method you know of so I can read it?
You can read errors from standard error. Note that with shell=True the command should be a single string, not a list:
import subprocess as sp

ret = sp.run('ls *.bad', stderr=sp.PIPE,
             stdout=sp.PIPE, shell=True, encoding="cp857")
if ret.returncode == 0:
    print(ret.stdout)
else:
    print(ret.stderr)
This change fixed the problem (stderr is merged into stdout, so everything comes back in response.stdout):
from subprocess import PIPE, run, STDOUT

def cons_r(cmd):
    response = run(cmd, stdout=PIPE, stderr=STDOUT, universal_newlines=True, shell=True)
    return response.stdout
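On Python 3.7+ the same thing reads a little more directly: capture_output=True wires up both pipes and text=True replaces universal_newlines=True, so stdout and stderr can be inspected separately. A sketch with a command that is expected to fail, standing in for the mysql call:

```python
from subprocess import run

# ls on a path that does not exist: the complaint lands on stderr and the
# exit code is non-zero, just like the mysql error output above.
res = run('ls no-such-file.bad', shell=True, capture_output=True, text=True)
print('exit code:', res.returncode)
print('stderr:', res.stderr)
```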

Popen / Wait - Wait never finishes

import sys
import os
from subprocess import Popen, PIPE, STDOUT
# Transfer Database
print('Transferring from ' + mysql_source_database)
mysql = Popen(f"mysql -h {mysql_dest_host} -P 3306 -u {mysql_dest_username} -p{mysql_dest_pw} {mysql_dest_database}".split(), stdin=PIPE, stdout=PIPE)
dbnamerewrite = Popen(f"sed s/{mysql_source_database}/{mysql_dest_database}/g".split(), stdin=PIPE, stdout=mysql.stdin)
mysqldump = Popen(f"mysqldump --set-gtid-purged=OFF --column-statistics=0 -h {mysql_source_host} -P 3306 -u {mysql_source_username} -p{mysql_source_pw} {mysql_source_database}".split(), stdout=dbnamerewrite.stdin)
mysql_stdout = mysql.communicate()[0]
mysqldump.wait()
The above code does what I want, but it never stops waiting. Does anyone know how to fix the wait? If I Ctrl-C it after the SQL work has finished, this is the error:
^CERROR 1064 (42000) at line 3829: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 3
Traceback (most recent call last):
File "test.py", line 19, in <module>
mysql_stdout = mysql.communicate()[0]
File "/usr/lib/python3.8/subprocess.py", line 1028, in communicate
stdout, stderr = self._communicate(input, endtime, timeout)
File "/usr/lib/python3.8/subprocess.py", line 1868, in _communicate
ready = selector.select(timeout)
File "/usr/lib/python3.8/selectors.py", line 415, in select
fd_event_list = self._selector.poll(timeout)
KeyboardInterrupt
One thing is that you should drop the explicit call to mysqldump.wait(). According to the docs:
Note: This will deadlock when using stdout=PIPE or stderr=PIPE and the child process generates enough output to a pipe such that it blocks waiting for the OS pipe buffer to accept more data. Use Popen.communicate() when using pipes to avoid that.
mysql.communicate is sufficient in this case, because it will not receive an EOF until all the elements up the pipeline send one. So mysql.communicate() returning directly implies that the other two processes are done.
Another problem is that with the ordering of processes that you have, you will have to call communicate on all of them in reverse order to get data flowing through the pipeline. One solution is to do just that:
dest_param = ['-h', mysql_dest_host, '-P', '3306', '-u', mysql_dest_username,
              f'-p{mysql_dest_pw}', mysql_dest_database]
src_param = ['-h', mysql_source_host, '-P', '3306', '-u', mysql_source_username,
             f'-p{mysql_source_pw}', mysql_source_database]
mysql = Popen(['mysql'] + dest_param,
              stdin=PIPE, stdout=PIPE)
dbnamerewrite = Popen(['sed', f's/{mysql_source_database}/{mysql_dest_database}/g'],
                      stdin=PIPE, stdout=mysql.stdin)
mysqldump = Popen(['mysqldump', '--set-gtid-purged=OFF', '--column-statistics=0'] + src_param,
                  stdout=dbnamerewrite.stdin)
mysqldump.communicate()
dbnamerewrite.communicate()
mysql_stdout = mysql.communicate()[0]
The other alternative is to set up your pipe in the opposite order, in which case you only need to communicate with the last process:
dest_param = ['-h', mysql_dest_host, '-P', '3306', '-u', mysql_dest_username,
              f'-p{mysql_dest_pw}', mysql_dest_database]
src_param = ['-h', mysql_source_host, '-P', '3306', '-u', mysql_source_username,
             f'-p{mysql_source_pw}', mysql_source_database]
mysqldump = Popen(['mysqldump', '--set-gtid-purged=OFF', '--column-statistics=0'] + src_param,
                  stdout=PIPE)
dbnamerewrite = Popen(['sed', f's/{mysql_source_database}/{mysql_dest_database}/g'],
                      stdin=mysqldump.stdout, stdout=PIPE)
mysql = Popen(['mysql'] + dest_param, stdin=dbnamerewrite.stdout, stdout=PIPE)
mysql_stdout = mysql.communicate()[0]
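When wiring one process's stdout to the next one's stdin like this, the subprocess documentation's "replacing shell pipeline" recipe also closes the parent's copy of each intermediate pipe. A minimal runnable version of the same reverse-order pattern, with echo and sed standing in for mysqldump | sed | mysql:

```python
from subprocess import Popen, PIPE

# Two-stage stand-in pipeline: only the last process needs communicate().
p1 = Popen(['echo', 'source_db'], stdout=PIPE)
p2 = Popen(['sed', 's/source_db/dest_db/g'], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()  # allow p1 to receive SIGPIPE if p2 exits early
out, _ = p2.communicate()
print(out.decode().strip())  # dest_db
```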

Mailx won't send through python

I'm rewriting a shell script to python and a part of it includes sending notifications via mailx.
I can't seem to get the subprocess right.
result = subprocess.run(["/bin/mailx", "-r", "sender@email.com", "-s", "Test", "recipient@email.com"], check=True)
When I run this on the server, the command prints a blank row and never completes. I thought it might be because mailx is waiting for the email body, since I get much the same behaviour when sending through bash without a body, so I got these tips:
1. result = subprocess.run(["echo", "Testing", "|", "/bin/mailx", "-r", "sender@email.com", "-s", "Test", "recipient@email.com"], check=True)
2. result = subprocess.run(["/bin/mailx", "-r", "sender@email.com", "-s", "Test", "recipient@email.com", b"Testingtesting"], check=True)
When testing 1, it just echoes out everything after echo.
When testing 2, I get the blank row again.
Using subprocess.Popen you can do it as below:
import subprocess

cmd = """
echo 'Message Body' | mailx -s 'Message Title' -r sender@someone.com receiver@example.com
"""
result = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
output, errors = result.communicate()
Regarding shell=True, from the documentation:
shell=False disables all shell based features, but does not suffer from this vulnerability; see the Note in the Popen constructor documentation for helpful hints in getting shell=False to work.
The use of shell=True is strongly discouraged in cases where the command string is constructed from external input.
In your case, if you are not passing user input to subprocess.Popen, you are safe.
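A shell=False variant is also possible here: the body that echo was piping in can be passed with input=, keeping the argument-list form. Sketch with cat as a stand-in, since mailx and the addresses are placeholders from the question:

```python
import subprocess

# input= feeds the child's stdin, which is what `echo 'Message Body' |`
# was doing. For the real call, replace ['cat'] with
# ['/bin/mailx', '-r', 'sender@email.com', '-s', 'Test', 'recipient@email.com'].
result = subprocess.run(['cat'], input='Message Body\n',
                        capture_output=True, text=True, check=True)
print(result.stdout)  # cat echoes the body back
```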

Cannot get subprocess output in Django view

I want to emulate a command-line call of my script (a TensorFlow neural chatbot model) in a Django view and capture the console output in a variable.
When I run it manually in a terminal on my server:
python3 var/www/engine/chatbot/udc_predict.py --model_dir=var/www/engine/chatbot/runs/1486057482/
the output is printed as expected.
So in my Django view i do:
import subprocess
answer = subprocess.check_output(['python3', 'var/www/engine/chatbot/udc_predict.py','--model_dir=var/www/engine/chatbot/runs/1486057482/'], shell=True, stderr=subprocess.STDOUT, timeout=None)
print('answer', answer)
The answer variable is printed as b'' in the Apache error log.
I cannot figure out what's wrong in my call.
The answer is to use .communicate() and PIPE. (Note also that with shell=True the command must be a single string; when a list is passed, only its first element is run as the command.)
from subprocess import Popen, PIPE

proc = Popen(
    "python3 var/www/engine/chatbot/udc_predict.py --model_dir=var/www/engine/chatbot/runs/1486057482/",
    shell=True,
    stdout=PIPE, stderr=PIPE
)
res = proc.communicate()  # waits for the process; a bare wait() with PIPE can deadlock
if proc.returncode:
    print(res[1])
print('result:', res[0])
answer = res[0]
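The b'' in the original check_output call has a specific cause that is easy to demonstrate: on POSIX, with shell=True only the first element of a list is treated as the command, and the remaining elements become arguments to /bin/sh itself, so the script path and --model_dir never reach python3. With echo as a stand-in:

```python
import subprocess

# shell=True + list: runs a bare `echo`; 'hello' goes to the shell, not echo
broken = subprocess.check_output(['echo', 'hello'], shell=True)
# no shell: the argv list is passed through intact
fixed = subprocess.check_output(['echo', 'hello'])
print(repr(broken), repr(fixed))
```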

Categories

Resources