REST API for Azure SQL DB on Linux Machine - python

Question. I would like to create a REST API in Python for data stored in an Azure SQL DB, supporting GET and POST operations. Currently I manage to print the results of my query to the terminal, but how do I convert them to JSON format and keep the service running 24/7 on Linux (perhaps on a different port)? Below is my script:
import pyodbc
from flask import Flask, jsonify, request
from flask_restful import Resource, Api

app = Flask(__name__)
api = Api(app)

class Energy(Resource):
    def get(self):
        server = 'testserver.database.windows.net'
        database = 'testdb'
        username = 'admin'
        password = '735t'
        driver = '{ODBC Driver 13 for SQL Server}'
        connexion = pyodbc.connect('DRIVER=' + driver + ';PORT=1433;SERVER=' + server +
                                   ';DATABASE=' + database + ';UID=' + username + ';PWD=' + password)
        cursor = connexion.cursor()
        cursor.execute("SELECT TOP (100) * FROM [dbo].[Power_Meter]")
        row = cursor.fetchone()
        while row:
            GeneratedCode = str(row[0])
            ReportedDate = str(row[1])
            print(str(row[0]) + " " + str(row[1]))
            row = cursor.fetchone()
        rest_row = jsonify(row)
        return rest_row

api.add_resource(Energy, '/DPM')

if __name__ == '__main__':
    app.run(debug=True)
and this is the output at localhost:5000/DPM:
null
Can anyone suggest how to go about solving this issue? Thanks

If you want to run your script 24/7 on Linux, you could execute it as a background task:

nohup python sql.py >> test.log &

From man nohup: "nohup - run a command immune to hangups, with output to a non-tty". The trailing & makes the command run in the background.

If you want to change the port, just change it like below:

app.run(host='0.0.0.0', port=5000)

I suggest you store the output to a file; then you could parse the data into JSON format.
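As for the null response itself (an addition, not part of the original answer): jsonify(row) only runs after the while loop has exhausted the cursor, so row is always None at that point. Below is a minimal sketch of the get method that returns the rows as JSON instead, assuming a conn_str variable holding the connection string assembled in the question:

class Energy(Resource):
    def get(self):
        connexion = pyodbc.connect(conn_str)  # conn_str: the DRIVER/SERVER/... string from the question
        cursor = connexion.cursor()
        cursor.execute("SELECT TOP (100) * FROM [dbo].[Power_Meter]")
        # Collect every row into a JSON-serializable list of dicts,
        # keyed by the column names reported by the driver
        columns = [col[0] for col in cursor.description]
        rows = [dict(zip(columns, [str(v) for v in record]))
                for record in cursor.fetchall()]
        connexion.close()
        return jsonify(rows)

With this shape, localhost:5000/DPM returns a JSON array of objects rather than null.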

Related

creating a webservice in flask that queries a Mysql database and returns json

Please help:
I am trying to create a webservice in Flask (this is my first time) that takes a string, queries an external MySQL database, and returns one row as JSON.
I am sure there are other issues with the code below (all suggestions much appreciated), but I cannot even see them yet, because when I try to access example.com/webservice/vin/ I get "Internal Server Error
The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application." from nginx.
Please can someone guide me where I am going wrong?
example.com returns Hello from app_name running in docker...yayy. It's the other route that isn't working.
import os
import json

from flask import Flask, jsonify, make_response
from flask_mysqldb import MySQL  # flask.ext.* import paths were removed in modern Flask

app = Flask(__name__)

app.config['MYSQL_HOST'] = 'xxx'
app.config['MYSQL_USER'] = 'xxx'
app.config['MYSQL_PASSWORD'] = 'xxx'
app.config['MYSQL_DB'] = 'xxx'

mysql = MySQL(app)

@app.route("/")
def index():
    # Use os.getenv("key") to get environment variables
    app_name = os.getenv("APP_NAME")
    if app_name:
        return f"Hello from {app_name} running in a Docker container behind Nginx!"
    return "Hello from Flask"

@app.route('/webservice/vin/<vin>', methods=['GET'])
def get_vehicle(vin):
    sql = "SELECT * FROM `table` where column = '(%s )';" % (vin)
    cur = mysql.connection.cursor()
    cur.execute(sql)
    row_headers = [x[0] for x in cur.description]  # this will extract row headers
    rv = cur.fetchall()
    json_data = []
    for result in rv:
        json_data.append(dict(zip(row_headers, result)))
    return json.dumps(json_data)

@app.errorhandler(404)
def not_found(error):
    return make_response(jsonify({'error': 'Not found'}), 404)
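An aside that is not from the original post: interpolating vin into the SQL with % leaves the literal parentheses and quotes in the statement, so the WHERE clause will never match, and it is open to injection. A parameterized sketch, with `vehicles` and `vin` as placeholder table/column names:

@app.route('/webservice/vin/<vin>', methods=['GET'])
def get_vehicle(vin):
    cur = mysql.connection.cursor()
    # Let the driver escape the value instead of formatting it into the string
    cur.execute("SELECT * FROM `vehicles` WHERE `vin` = %s", (vin,))
    row_headers = [x[0] for x in cur.description]
    rows = cur.fetchall()
    return json.dumps([dict(zip(row_headers, r)) for r in rows])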

Flask API facing InterfaceError with PostgreSQL

I have a Flask API based on the Flask-RESTPlus extension, hosted on Google App Engine. The API does a basic job of fetching data from a Google Cloud SQL PostgreSQL database. The API otherwise works fine, but sometimes it starts returning InterfaceError: cursor already closed.
Strangely, when I do a gcloud app deploy, the API starts working fine again.
Here's a basic format of the API:
import simplejson as json
import psycopg2
from flask import Flask, jsonify
from flask_restplus import Api, Resource, fields
from psycopg2.extras import RealDictCursor

app = Flask(__name__)
app.config['SWAGGER_UI_JSONEDITOR'] = True

api = Api(app=app,
          doc='/docs',
          version="1.0",
          title="Title",
          description="description")

ns_pricing = api.namespace('cropPricing')

db_user = "xxxx"
db_pass = "xxxx"
db_name = "xxxxx"
cloud_sql_connection_name = "xxxxxx"

conn = psycopg2.connect(user=db_user,
                        password=db_pass,
                        host='xxxxx',
                        dbname=db_name)

@ns_pricing.route('/list')
class States(Resource):
    def get(self):
        """
        List all the states for which data is available.
        """
        cur = conn.cursor(cursor_factory=RealDictCursor)
        query = """
            SELECT DISTINCT state
            FROM db.table
        """
        conn.commit()
        cur.execute(query)
        states = json.loads(json.dumps(cur.fetchall()))
        if len(states) == 0:
            return jsonify(data=[],
                           status="Error",
                           message="Requested data not found")
        else:
            return jsonify(status="Success",
                           message="Successfully retrieved states",
                           data=states)
What should I fix to not see the error anymore?
It would be good to use an ORM such as SQLAlchemy / Flask-SQLAlchemy, which would handle establishing and re-establishing the connection for you.
If you stay with plain psycopg2, you can use try/except to catch the exception and re-establish the connection:

try:
    cur.execute(query)
except psycopg2.InterfaceError as err:
    print(err)
    conn = psycopg2.connect(....)
    cur = conn.cursor()
    cur.execute(query)
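Building on that, a sketch (not part of the original answer) that wraps the retry in a small helper so every endpoint can recover from a dropped connection; db_user, db_pass, db_name and the 'xxxxx' host placeholder are reused from the question:

def run_query(query):
    """Execute query, reconnecting once if the connection has gone away."""
    global conn
    try:
        cur = conn.cursor(cursor_factory=RealDictCursor)
        cur.execute(query)
    except (psycopg2.InterfaceError, psycopg2.OperationalError):
        # Reconnect and retry once; a second failure propagates
        conn = psycopg2.connect(user=db_user, password=db_pass,
                                host='xxxxx', dbname=db_name)
        cur = conn.cursor(cursor_factory=RealDictCursor)
        cur.execute(query)
    return cur.fetchall()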

Azure Translation API doesn't deliver results when handing over data from ArangoDB

I'm struggling a little to get a result from my Azure Translation API call.
My code is based on https://github.com/MicrosoftTranslator/PythonConsole, and that works perfectly.
I also have an ArangoDB instance with some test data, which works and gives me this: Result on db test
However, if I combine both as follows:
from xml.etree import ElementTree
from auth import AzureAuthClient
from arango import ArangoClient
import requests

client = ArangoClient(
    protocol='http',
    host='localhost',
    port=32768,
    username='root',
    password='password',
    enable_logging=True
)
db = client.database('testdb')
test = db.collection('testcol')

def GetTextAndTranslate(finalToken):
    fromLangCode = "en"
    toLangCode = "de"
    textToTranslate = " "
    for t in test:
        # text to translate
        textToTranslate = t['name']
        # Call to Microsoft Translator Service
        headers = {"Authorization ": finalToken}
        translateUrl = "http://api.microsofttranslator.com/v2/Http.svc/Translate?text={}&to={}".format(textToTranslate, toLangCode)
        translationData = requests.get(translateUrl, headers=headers)
        # parse xml return values
        translation = ElementTree.fromstring(translationData.text.encode('utf-8'))
        # display translation if needed
        print(translation.text)

if __name__ == "__main__":
    # Add your client secret in the next line
    client_secret = 'azurepassword'
    auth_client = AzureAuthClient(client_secret)
    bearer_token = 'Bearer ' + auth_client.get_access_token()
I just get nothing. The console takes less than a second and then I can enter a new command on the terminal, but no result is displayed; I also tried writing it to a file. Azure tells me that I called the API, but I can't see what was processed there.
Thanks for your help!
I tried to test your code for calling the Azure Translator API, and I found that the translator part of your code works fine and the Arango part also works fine. Since your code as posted is not complete, the only issue I can guess is that the function GetTextAndTranslate(finalToken) should be defined as GetTextAndTranslate(test, finalToken) so that the test collection can be passed in, and then actually called, as below.
def GetTextAndTranslate(test, finalToken):
    # Your code
    ........

if __name__ == "__main__":
    client = ArangoClient(
        protocol='http',
        host='localhost',
        port=32768,
        username='root',
        password='password',
        enable_logging=True
    )
    db = client.database('testdb')
    test = db.collection('testcol')
    # Add your client secret in the next line
    client_secret = 'azurepassword'
    auth_client = AzureAuthClient(client_secret)
    bearer_token = 'Bearer ' + auth_client.get_access_token()
    GetTextAndTranslate(test, bearer_token)
Hope it helps. If you have any updates, please feel free to let me know.
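One small aside, not from the original answer: letting requests build the query string avoids manual encoding problems when the ArangoDB text contains spaces or umlauts. A sketch against the same v2 endpoint used in the question:

# requests URL-encodes everything passed via `params`;
# the endpoint and Authorization header follow the question's code
translateUrl = "http://api.microsofttranslator.com/v2/Http.svc/Translate"
translationData = requests.get(
    translateUrl,
    params={"text": textToTranslate, "to": toLangCode},
    headers={"Authorization": finalToken},
)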

MySQL UPDATE doesn't work on GAE when called from a browser

I am writing a verify-email-address Python file for Google App Engine. (Yes, I know Django has tools for this, but I wanted to write my own because that is how I learn.)
Below is the Python code. The code returns "Email Account Verified", which suggests the queries worked. However, when I look at the "active" column in the database, it is still 0.
If I run the query string that logging.info("%s", db_query) prints directly against the database, it works and the column is updated to 1.
All my other Python code (with UPDATEs) works fine; the only difference is that those Python files are called from my iOS app and this one is called from a browser.
# Make the libs folder with 3rd party libraries and common methods
import sys
sys.path.insert(0, 'libs')

# Imports
import logging
import webapp2
from django.utils.html import strip_tags
import common
import MySQLdb
import json

VERIFIED_HTML = """\
<html>
<body>
<h1>Email Account Verified</h1>
</body>
</html>
"""

ERROR_HTML = """\
<html>
<body>
<h1>ERROR</h1>
</body>
</html>
"""

class VerifyEmail(webapp2.RequestHandler):
    def get(self):
        user_email = strip_tags(self.request.get('user_email').lower().strip())
        user_activation_hash = strip_tags(self.request.get('user_activation_hash').strip())
        logging.info("User Email = %s", user_email)
        logging.info("User Activation Hash = %s", user_activation_hash)
        # Get the database connection to Google Cloud SQL
        db = common.connect_to_google_cloud_sql()
        db_cursor = db.cursor(MySQLdb.cursors.DictCursor)
        # Query for a user with this email and activation hash
        db_query = """SELECT email, activation_hash FROM users
                      WHERE email='%s' AND activation_hash='%s'""" % (user_email, user_activation_hash)
        db_cursor.execute(db_query)
        # If there is exactly one matching record, mark the account active
        if db_cursor.rowcount == 1:
            db_query = """UPDATE users SET active=%s WHERE email='%s';""" % (1, user_email)
            logging.info("%s" % db_query)
            if db_cursor.execute(db_query):
                self.response.write(VERIFIED_HTML)
            else:
                self.response.write(ERROR_HTML)
        else:  # either no user, or activation_hash doesn't match
            self.response.write(ERROR_HTML)
Connect to Google Cloud SQL:

def connect_to_google_cloud_sql():
    #hostname = DEV_DB_HOSTNAME
    #hostname = PROD_DB_HOSTNAME
    db_username = 'dummy_user'  # not real
    db_password = 'dummypassword'  # not real
    # If PROD or deployed testing, use unix_socket
    if os.getenv('SERVER_SOFTWARE') and os.getenv('SERVER_SOFTWARE').startswith('Google App Engine/'):
        db = MySQLdb.connect(unix_socket='/cloudsql/' + _DATABASE_HOSTNAME, db='dummydbname',
                             user=db_username, passwd=db_password)
    else:  # local testing uses host
        db = MySQLdb.connect(host=_DATABASE_HOSTNAME, port=3306, db='dummydbname',
                             user=db_username, passwd=db_password)
    logging.info("Got DB Connection")
    return db
Any suggestions? Is it a GAE Cloud SQL privileges issue? Or maybe because I was using my browser against the local App Engine running on my local IP?
You need to call .commit() on the MySQLdb connection after executing queries; that is why your update is failing. The UPDATE happens inside a transaction, but when your code ends without committing, the changes to the DB are rolled back, despite having told the user the update succeeded.
You can also enable autocommit on the connection so each statement is committed as it executes: db.autocommit(True).
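Concretely, a sketch of the relevant part of the handler with the commit added (names follow the question's code):

if db_cursor.execute(db_query):
    db.commit()  # persist the UPDATE; without this the transaction is rolled back
    self.response.write(VERIFIED_HTML)
else:
    self.response.write(ERROR_HTML)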

Convert Python2.6 to Google App Engine compatible (multiprocessing)

I've made a little script in Python which uses multiprocessing. I've thought of running it on Google App Engine as a cron job, but unfortunately Google App Engine doesn't support multiprocessing. Can anyone help me convert this into Google App Engine compatible code (perhaps using Google App Engine tasks)?
from multiprocessing import Pool
import MySQLdb
import urllib
import urllib2

def f(email_url):
    url = "http://my-domain.com/cron.php"
    values = {"email": email_url[0], "url": email_url[1]}
    data = urllib.urlencode(values)
    req = urllib2.Request(url, data)
    urllib2.urlopen(req)

if __name__ == '__main__':
    p = Pool()
    emails_urls = list()
    conn = MySQLdb.connect(host="XXX.XXX.XXX.XXX", user="USERNAME",
                           passwd="PASSWORD", db="MY-DATABASE")
    cursor = conn.cursor()
    cursor.execute("SELECT email, url FROM data")
    rows = cursor.fetchall()
    for row in rows:
        emails_urls.append((row[0], row[1]))
    cursor.close()
    conn.close()
    p.map(f, emails_urls)
Take a look at Task Queues. You can insert units of work into a task queue and configure how many tasks from a queue are executed simultaneously.
See: http://code.google.com/intl/de-DE/appengine/docs/python/taskqueue/
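A sketch of what that could look like on the App Engine Python runtime (the /worker URL is a hypothetical handler): enqueue one task per (email, url) row instead of mapping over a Pool:

from google.appengine.api import taskqueue

def enqueue_all(emails_urls):
    # One task per row; the queue configuration controls how many run at once
    for email, url in emails_urls:
        taskqueue.add(url='/worker', params={'email': email, 'url': url})

The /worker handler would then perform the urllib2 POST to cron.php that f() does today.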
