mysql source command does nothing inside docker container - python

Description
I'm running a Docker container with MySQL in it, and I want to run a Python script after MySQL has started which will apply a dump to it.
Here is a snippet of Dockerfile:
FROM mysql:5.6
RUN apt-get update && \
apt-get install -y python
ADD apply_dump.py /usr/local/bin/apply_dump.py
ADD starter.sh /usr/local/bin/starter.sh
CMD ["/usr/local/bin/starter.sh"]
starter.sh:
nohup python '/usr/local/bin/apply_dump.py' &
mysqld
apply_dump.py:
import os
import urllib
import gzip
import shutil
import subprocess
import time
import logging
import sys
# wait for mysql server
time.sleep(5)
print "Start dumping"
dumpName = "ggg.sql"
dumpGzFile = dumpName + ".gz"
dumpSqlFile = dumpName + ".sql"
print "Loading dump {}...".format(dumpGzFile)
urllib.urlretrieve('ftp://ftpUser:ftpPassword@ftpHost/' + dumpGzFile, '/tmp/' + dumpGzFile)
print "Extracting dump..."
with gzip.open('/tmp/' + dumpGzFile, 'rb') as f_in:
    with open('/tmp/' + dumpSqlFile, 'wb') as f_out:
        shutil.copyfileobj(f_in, f_out)
print "Dropping database..."
subprocess.call(["mysql", "-u", "root", "-proot", "-e", "drop database if exists test_db"])
print "Creating database..."
subprocess.call(["mysql", "-u", "root", "-proot", "-e", "create schema test_db"])
print "Applying dump..."
subprocess.call(["mysql", "--user=root", "--password=root", "test_db", "-e" "source /tmp/{}".format(dumpSqlFile)])
print "Done"
content of ggg.sql.gz is pretty simple:
CREATE TABLE test_table (id INT NOT NULL,PRIMARY KEY (id));
Problem
The database is created, but the table is not. If I go into the container and run this script manually, the table is created. If I replace the source command with a direct SQL CREATE statement, that works as well. But in reality the dump file will be pretty big and only the source command will cope with it (or is there another way?). Am I doing something wrong?
Thanks in advance.

Try passing your source SQL file into the MySQL command like this, instead of using the -e flag:
subprocess.call(["mysql", "--user=root", "--password=root", "test_db", "<", "/tmp/%s" % dumpSqlFile])
This will call import your SQL file using the widely used syntax:
mysql --user=root --password=root test_db < /tmp/source.sql
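If you'd rather avoid shell=True, an equivalent sketch is to hand the dump file to mysql on stdin (dumpSqlFile as in the question's script):
import subprocess
# feed the dump file to mysql on stdin; equivalent to the shell redirection above
with open("/tmp/" + dumpSqlFile, "rb") as dump_file:
    subprocess.call(["mysql", "--user=root", "--password=root", "test_db"], stdin=dump_file)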

Related

Python 2.7, PgAdmin III & BASH 4.3.48: Error Invalid Provider when running py script inside bash script

This is my bash script:
# Binarize raster for later polygon creation
path=/home/rose/Desktop/test/DSM_BM24_2017_1000_4735.tif
pathdir=$(echo $path | cut -d "/" -f 1,2,3,4,5)
gdal_calc.py -A $path --outfile=$pathdir/binary.tif --NoDataValue=-9999 --calc "1 * (A != -9999)"
# Polygonize raster
binary=$pathdir/binary.tif
b=$(basename $binary)
basebinary=$(echo $b | cut -d "." -f 1)
gdal_polygonize.py $binary $pathdir/polygon$basebinary.shp polygon$basebinary
# Make EPSG:2193 projection file for polygon
ogr2ogr -a_srs EPSG:2193 -f "ESRI Shapefile" /home/rose/Desktop/test/finalpolygonbinary.shp /home/rose/Desktop/test/polygonbinary.shp
# Define variable to import to py and postgis script 'pypostgis'
polygon='$pathdir/finalpolygon$basebinary.shp'
# Call python script
python pypostgis.py $polygon
This is my python script that is being called in my bash script, which grabs the polygon shapefile and imports it into a table in PgAdmin III:
import psycopg2
import qgis.utils
import os
import sys
from qgis.core import *
from qgis.core import QgsVectorLayer, QgsVectorLayerImport
from qgis.core import QgsDataSourceURI, QgsVectorLayer
# import layer into database shp2pgs function
def shp2pgs(layer):
    conn = psycopg2.connect("dbname='rosespace' host='localhost' port = '5432' user='rose' password='postgres' ")
    cursor = conn.cursor()
    uri = "dbname='rosespace' host='localhost' port=5432 user='rose' password='postgres' table=\"public\".\"%s\" (geom) sql=" % layer.name().lower()
    importvector = QgsVectorLayerImport.importLayer(layer, uri, "postgres", layer.crs(), False, False)
    print importvector
# Import and use $polygon variable from bash script for layer, which will eventually be called by the function above
shapefile = sys.argv[1]
print shapefile
basename_shp = (os.path.splitext(os.path.basename(shapefile))[0]).lower()
layer = QgsVectorLayer(shapefile, basename_shp, 'ogr')
#Call function with shapefile variable - layer, originally from above bash script
shp2pgs(layer)
When I ran the bash script, this was the output of print importvector:
print importvector
(9, u'Unable to load postgres provider')
Would any of you happen to know what this error is pointing to / means and if there is a solution?
I'm guessing it's something to do with my imported variable from bash - sys.argv[1] but I've checked it, and it is a string type. I have also put quotes around the variable and it still didn't work.
I have also checked the python script substituting sys.argv[1] with a normal string filepath and it worked perfectly within python console in QGIS 2.18.
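One thing that often trips up PyQGIS scripts run outside the QGIS console (e.g. from bash) is that the data providers are only registered once QgsApplication is initialised; a minimal sketch, assuming a QGIS 2.x install (the prefix path is illustrative):
from qgis.core import QgsApplication
QgsApplication.setPrefixPath("/usr", True)  # illustrative; point this at your QGIS install prefix
qgs = QgsApplication([], True)
qgs.initQgis()
# ... build the QgsVectorLayer and call shp2pgs(layer) here ...
qgs.exitQgis()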

Postgres - how can you execute a multiline query from Python using psql on Windows?

I'd like to be able to run multiline queries with psql, and was trying to write a library function to do so, but I get a "Permission denied" error -
import os
import tempfile
sql = 'select 1;'
with tempfile.NamedTemporaryFile('w') as f:
    f.write(sql)
    cmd = f'psql --file "{f.name}"'
    os.system(cmd)  # error: Permission denied
This is not as nice looking, but it works:
f = tempfile.NamedTemporaryFile('w', delete=False)
f.write(sql)
f.close()
cmd = f'psql --file "{f.name}"'
os.system(cmd)
os.remove(f.name)
The error seems to be due to Windows not being able to open a file a second time -
Whether the name can be used to open the file a second time, while the named temporary file is still open, varies across platforms (it can be so used on Unix; it cannot on Windows NT or later).
https://docs.python.org/3.6/library/tempfile.html#tempfile.NamedTemporaryFile
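Another option that avoids reopening a temp file altogether is to pipe the SQL to psql on stdin instead of via --file; a minimal sketch using the same sql string:
import subprocess
# psql reads the statements from stdin, so no temporary file is needed
subprocess.run(["psql"], input=sql, universal_newlines=True)  # text=True on newer Pythons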

How to reference env directories of openshift in a python script?

OpenShift has these default dirs:
# $_ENV['OPENSHIFT_INTERNAL_IP'] - IP Address assigned to the application
# $_ENV['OPENSHIFT_GEAR_NAME'] - Application name
# $_ENV['OPENSHIFT_GEAR_DIR'] - Application dir
# $_ENV['OPENSHIFT_DATA_DIR'] - For persistent storage (between pushes)
# $_ENV['OPENSHIFT_TMP_DIR'] - Temp storage (unmodified files deleted after 10 days)
How do I reference them in a Python script?
Example script: create a log file in the log directory and another log in the data directory.
from time import strftime
now= strftime("%Y-%m-%d %H:%M:%S")
fn = "${OPENSHIFT_LOG_DIR}/test.log"
fn2 = "${OPENSHIFT_DATA_DIR}/test.log"
#fn = "test.txt"
input = "appended text " + now + " \n"
with open(fn, "ab") as f:
    f.write(input)
with open(fn2, "ab") as f:
    f.write(input)
Can this script be used with cron?
EDIT: the bash file:
#! /bin/bash
#date >> ${OPENSHIFT_LOG_DIR}/new.log
source $OPENSHIFT_HOMEDIR/python-2.6/virtenv/bin/activate
python file.py
date >> ${OPENSHIFT_DATA_DIR}/new2data.log
import os
os.getenv("OPENSHIFT_INTERNAL_IP")
should work.
So with your example, modify to:-
import os
OPENSHIFT_LOG_DIR = os.getenv("OPENSHIFT_LOG_DIR")
fn = os.path.join(OPENSHIFT_LOG_DIR, "test.log")
And, yes, you can call this python script with a cron by referencing your bash script if you want... Like this for example:-
#!/bin/bash
date >> ${OPENSHIFT_LOG_DIR}/status.log
chmod +x status
cd ${OPENSHIFT_REPO_DIR}/wsgi/crawler
nohup python file.py 2>&1 &
Those variables OPENSHIFT_* are provided as environment variables on OpenShift -- so the $_ENV["OPENSHIFT_LOG_DIR"] is an example to get the value inside a php script.
In python, the equivalent would just be os.getenv("OPENSHIFT_LOG_DIR").
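Putting that together for the logging script in the question, a minimal sketch (the log file name is just illustrative):
import os
from time import strftime

now = strftime("%Y-%m-%d %H:%M:%S")
line = "appended text " + now + " \n"
# OPENSHIFT_* are plain environment variables on the gear
for var in ("OPENSHIFT_LOG_DIR", "OPENSHIFT_DATA_DIR"):
    with open(os.path.join(os.getenv(var), "test.log"), "ab") as f:
        f.write(line)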
Made edits to Calvin's post above and submitted 'em.
Re: the question of where file.py exists -- use os.getenv("OPENSHIFT_REPO_DIR") as the base directory where all your code would be located on the gear where your app is running.
So if your file is located in .openshift/misc/file.py -- then just use:
os.path.join(os.getenv("OPENSHIFT_REPO_DIR"), ".openshift", "misc", "file.py")
to get the full path.
Or in bash, the equivalent would be:
$OPENSHIFT_REPO_DIR/.openshift/misc/file.py
HTH

Automated database restore from *.sql files

I am trying to write a Python script to restore our database. We store all our tables (individually) in the repository. Since typing "source table1.sql", "source table2.sql", ... would be cumbersome, I've written a script to do this automatically.
I've found a solution using Popen.
process = Popen('mysql %s -u%s -p%s' % (db, "root", ""), stdout=PIPE, stdin=PIPE, shell=True)
output = process.communicate('source ' + file)[0]
The method appears to work very well, however, for each table, it prompts me for a password. How do I bypass this to either get it prompt for a password only once or have the subprocess read the password from a config file?
Is there a better way to do this? I've tried to do this using a Windows batch script, but as you'd expect, that is a lot less flexible than using Python.
Since you apparently have an empty password, remove the -p option; -p without a password makes mysql prompt for one:
from subprocess import Popen, PIPE
process = Popen('mysql %s -u%s' % (db, "root"), stdout=PIPE, stdin=PIPE, shell=True)
output = process.communicate('source ' + file)[0]
In order to prevent exposing the password to anyone with permission to see running processes, it's best to put the password in a config file, and call that from the command-line:
The config file:
[client]
host=host_name
user=user_name
password=your_pass
Then the command-line:
mysql --defaults-extra-file=your_configfilename
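Combined with the Popen call from the question, that might look like this (the config file path is illustrative; --defaults-extra-file has to come before the other options):
from subprocess import Popen, PIPE
# credentials come from the [client] section of the config file, so nothing is exposed on the command line
process = Popen('mysql --defaults-extra-file=/path/to/your_configfilename %s' % db,
                stdout=PIPE, stdin=PIPE, shell=True)
output = process.communicate('source ' + file)[0]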
Well, you could pass it on the command line after reading it from a file:
with open('secret_password.txt', 'r') as f:
    password = f.read()
process = Popen('mysql %s -u%s -p%s' % (db, "root", password), stdout=PIPE, stdin=PIPE, shell=True)
Otherwise you could investigate pexpect, which lets you interact with processes. Other alternatives include reading from a config file (like ConfigParser) or simply making it a Python module and importing the password as a variable.
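For the ConfigParser route, a minimal sketch, assuming a [client] section like the one above in a file called db.cfg (the file name is illustrative; db and file as in the question):
import ConfigParser  # configparser in Python 3
from subprocess import Popen, PIPE

config = ConfigParser.ConfigParser()
config.read('db.cfg')
password = config.get('client', 'password')

process = Popen('mysql %s -u%s -p%s' % (db, "root", password), stdout=PIPE, stdin=PIPE, shell=True)
output = process.communicate('source ' + file)[0]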

python sub-process

I usually execute a Fortran file in Linux (manually) as:
Connect to the server
Go to the specific folder
ifort xxx.for -o xxx && ./xxx (where 'xxx.for' is my Fortran file and 'xxx' is Fortran executable file)
But I need to call my fortran file (xxx.for) from python (I'm a beginner), so I used subprocess with the following command:
cmd = ["ssh", sshConnect, "cd %s;"%(workDir), Fortrancmd %s jobname "%s -o %s" exeFilename "%s && %s ./ %s%s"%(exeFilename)]
But I get an error, and I'm not sure what's wrong. Here's the full code:
import string
import subprocess as subProc
from subprocess import Popen as ProcOpen
from subprocess import PIPE
import numpy
import subprocess
userID = "pear"
serverName = "say4"
workDir = "/home/pear/2/W/fortran/"
Fortrancmd = "ifort"
jobname = "rad.for"
exeFilename = "rad"
sshConnect=userID+"@"+servername
cmd=["ssh", sshConnect, "cd %s;"%(workDir), Fortrancmd %s jobname "%s -o %s" exeFilename "%s && %s ./ %s%s"%(exeFilename)]
# command to execute fortran files in Linux
# ifort <filename>.for -o <filename> && ./<filename> (press enter)
# example: ifort xxx.for -o xxx && ./xxx (press enter)
print cmd
How can I write a python program that performs all 3 steps described above and avoids the error I'm getting?
there are some syntax errors...
original:
cmd=["ssh", sshConnect, "cd %s;"%(workDir), Fortrancmd %s jobname "%s -o %s" exeFilename "%s && %s ./ %s%s"%(exeFilename)]
I think you mean:
cmd = [
"ssh",
sshConnect,
"cd %s;" % (workDir,),
"%s %s -o %s && ./%s" % (Fortrancmd, jobname, exeFilename, exeFilename)
]
A few notes:
a tuple with one element requires a trailing comma, as in (workDir,), to be interpreted as a tuple (vs. simple order-of-operations parens)
it is probably easier to construct your Fortran command with a single string-format operation
PS - For readability it is often a good idea to break long lists into multiple lines :)
my advice
I would recommend looking at this stackoverflow thread for ssh instead of using subprocess
For the manual part you may want to look into pexpect or, for Windows, wexpect. These allow you to run subprocesses and pass them input under interactive conditions.
However, most of what you're doing sounds like it would work well in a shell script. For simplicity, you could make a shell script on the server side for your server-side operations, and then plug its path into the ssh statement:
ssh user@host "/path/to/script.sh"
one error:
you have an unquoted %s in your list of args, so your string formatting will fail.
Here is a complete example of using the subprocess module to run a remote command via ssh (a simple echo in this case) and grab the results, hope it helps:
>>> import subprocess
>>> proc = subprocess.Popen(("ssh", "remoteuser@host", "echo", "1"), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
>>> stdout, stderr = proc.communicate()
Which in this case returns: ('1\n', '')
Note that to get this to work without requiring a password you will likely have to add your local user's public key to ~remoteuser/.ssh/authorized_keys on the remote machine.
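Applied to the question, the same pattern might look roughly like this (variable names as defined in the question; assumes key-based auth so there is no password prompt):
import subprocess
# compile and run the Fortran program as a single remote shell command
remote_cmd = "cd %s && %s %s -o %s && ./%s" % (workDir, Fortrancmd, jobname, exeFilename, exeFilename)
proc = subprocess.Popen(["ssh", sshConnect, remote_cmd],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = proc.communicate()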
You could use fabric for steps 1 and 2.
This is the basic idea:
from fabric.api import *
env.hosts = ['host']
dir = '/home/...'
def compile(file):
    with cd(dir):
        run("ifort %s.for -o %s" % (file, file))
        run("./%s > stdout.txt" % file)
Create a fabfile.py with this content, then run fab compile:filename.
Do you have to use Python?
ssh user@host "command"
