I've made a little script in Python which uses multiprocessing. I've thought of running it on Google App Engine as a cron job, but unfortunately Google App Engine doesn't support the multiprocessing module. Can anyone help me convert this into Google App Engine compatible code (perhaps using App Engine tasks)?
from multiprocessing import Pool

import MySQLdb
import urllib
import urllib2


def f(email_url):
    url = "http://my-domain.com/cron.php"
    values = {"email": email_url[0], "url": email_url[1]}
    data = urllib.urlencode(values)
    req = urllib2.Request(url, data)
    urllib2.urlopen(req)


if __name__ == '__main__':
    p = Pool()
    emails_urls = list()
    conn = MySQLdb.connect(host="XXX.XXX.XXX.XXX", user="USERNAME",
                           passwd="PASSWORD", db="MY-DATABASE")
    cursor = conn.cursor()
    cursor.execute("SELECT email, url FROM data")
    rows = cursor.fetchall()
    for row in rows:
        emails_urls.append((row[0], row[1]))
    cursor.close()
    conn.close()
    p.map(f, emails_urls)
Take a look at Task Queues.
You can insert units of work into a task queue and configure how many tasks from a queue are executed simultaneously.
Take a look here: http://code.google.com/intl/de-DE/appengine/docs/python/taskqueue/
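For illustration, here is a hedged sketch of the cron-plus-queue split on the first-generation Python runtime; the '/send' handler path, the 'emails' queue name, and fetch_rows() are placeholder names of my own, not part of the original script:

import urllib
import urllib2

import webapp2
from google.appengine.api import taskqueue


def fetch_rows():
    # Placeholder: run the "SELECT email, url FROM data" query here
    # (MySQLdb connection as in the original script) and return the rows.
    return []


class EnqueueHandler(webapp2.RequestHandler):
    def get(self):
        # Hit by cron: enqueue one task per (email, url) row instead of
        # forking worker processes with multiprocessing.Pool.
        for email, url in fetch_rows():
            taskqueue.add(queue_name='emails',
                          url='/send',
                          params={'email': email, 'url': url})


class SendHandler(webapp2.RequestHandler):
    def post(self):
        # Each task runs as its own request, replacing one Pool worker.
        values = {'email': self.request.get('email'),
                  'url': self.request.get('url')}
        data = urllib.urlencode(values)
        urllib2.urlopen(urllib2.Request('http://my-domain.com/cron.php', data))

How many tasks run at once is configured per queue in queue.yaml via the rate and max_concurrent_requests settings.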
I have a Flask API based on the Flask-RestPlus extension, hosted on Google App Engine. The API does the basic job of fetching data from a Google Cloud SQL PostgreSQL instance. The API generally works fine, but sometimes it starts returning InterfaceError: cursor already closed.
Strangely, when I do a gcloud app deploy, the API starts working fine again.
Here's a basic format of the API:
import simplejson as json
import psycopg2
from flask import Flask, jsonify
from flask_restplus import Api, Resource, fields
from psycopg2.extras import RealDictCursor

app = Flask(__name__)
app.config['SWAGGER_UI_JSONEDITOR'] = True

api = Api(app=app,
          doc='/docs',
          version="1.0",
          title="Title",
          description="description")

ns_pricing = api.namespace('cropPricing')

db_user = "xxxx"
db_pass = "xxxx"
db_name = "xxxxx"
cloud_sql_connection_name = "xxxxxx"

conn = psycopg2.connect(user=db_user,
                        password=db_pass,
                        host='xxxxx',
                        dbname=db_name)


@ns_pricing.route('/list')
class States(Resource):
    def get(self):
        """
        List all the states for which data is available.
        """
        cur = conn.cursor(cursor_factory=RealDictCursor)
        query = """
                SELECT DISTINCT state
                FROM db.table
                """
        conn.commit()
        cur.execute(query)
        states = json.loads(json.dumps(cur.fetchall()))
        if len(states) == 0:
            return jsonify(data=[],
                           status="Error",
                           message="Requested data not found")
        else:
            return jsonify(status="Success",
                           message="Successfully retrieved states",
                           data=states)
What should I fix so that I don't see this error anymore?
It would be better to use an ORM such as SQLAlchemy / Flask-SQLAlchemy, which handles establishing and re-establishing connections for you.
If you stick with plain psycopg2, though, you can use try/except to catch the exception and re-establish the connection:
try:
    cur.execute(query)
except psycopg2.InterfaceError as err:
    print(err)
    conn = psycopg2.connect(....)
    cur = conn.cursor()
    cur.execute(query)
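If you do switch to SQLAlchemy, here is a minimal sketch of an engine that survives dropped connections (pool_pre_ping needs SQLAlchemy 1.2+; the connection string values are placeholders):

from sqlalchemy import create_engine

# Create one engine per process; the pool replaces dead connections.
engine = create_engine(
    'postgresql+psycopg2://db_user:db_pass@host/db_name',
    pool_pre_ping=True,   # test each connection before handing it out
    pool_recycle=1800,    # proactively recycle connections older than 30 min
)


def get_states():
    # Borrow a pooled connection per request instead of a module-level conn.
    with engine.connect() as conn:
        rows = conn.execute("SELECT DISTINCT state FROM db.table")
        return [row[0] for row in rows]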
I would like to create a REST API for data stored in an Azure SQL DB that will allow me to do GET and POST operations using Python. Currently I've managed to print the results of my query to the terminal, but how do I convert them to JSON format and let the script run 24/7 on Linux (perhaps changing the port)? Below is my script:
import pyodbc
from flask import Flask, jsonify, request
from flask_restful import Resource, Api

app = Flask(__name__)
api = Api(app)


class Energy(Resource):
    def get(self):
        server = 'testserver.database.windows.net'
        database = 'testdb'
        username = 'admin'
        password = '735t'
        driver = '{ODBC Driver 13 for SQL Server}'
        connexion = pyodbc.connect('DRIVER=' + driver + ';PORT=1433;SERVER=' + server +
                                   ';PORT=1443;DATABASE=' + database +
                                   ';UID=' + username + ';PWD=' + password)
        cursor = connexion.cursor()
        cursor.execute("SELECT TOP (100) * FROM [dbo].[Power_Meter]")
        row = cursor.fetchone()
        while row:
            GeneratedCode = str(row[0])
            ReportedDate = str(row[1])
            print(str(row[0]) + " " + str(row[1]))
            row = cursor.fetchone()
        rest_row = jsonify(row)
        return rest_row


api.add_resource(Energy, '/DPM')

if __name__ == '__main__':
    app.run(debug=True)
and this is the output at localhost:5000/DPM:
null
Can anyone suggest how to go about solving this issue? Thanks
If you want to run your script 24/7 on Linux, you could execute it as a background task:

    nohup python sql.py >> test.log &

From man nohup:

    nohup - run a command immune to hangups, with output to a non-tty

The trailing & tells the shell to run the command in the background.
If you want to change the port, just pass it to app.run(), like below:

    app.run(host='0.0.0.0', port=5000)

I'd also suggest storing the output in a file; then you can parse the data into JSON format.
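Note that the null you see comes from the while loop itself: it exhausts the cursor, so row is None by the time jsonify(row) runs. To return the data as JSON, collect the rows into a list of dicts first; a sketch (the helper name and column handling are illustrative):

def rows_to_dicts(cursor):
    # Map column names (from cursor.description) onto each fetched row.
    columns = [col[0] for col in cursor.description]
    return [dict(zip(columns, [str(v) for v in row]))
            for row in cursor.fetchall()]

# ...inside Energy.get(), after cursor.execute(...):
#     return jsonify(rows=rows_to_dicts(cursor))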
I'm struggling a little to get a return value from my Azure Translator API call.
My code is based on https://github.com/MicrosoftTranslator/PythonConsole, and that code works perfectly.
I also have an ArangoDB instance with some test data, which works on its own and returns the expected documents.
However, if I combine both as follows:
from xml.etree import ElementTree

from auth import AzureAuthClient
from arango import ArangoClient
import requests

client = ArangoClient(
    protocol='http',
    host='localhost',
    port=32768,
    username='root',
    password='password',
    enable_logging=True
)
db = client.database('testdb')
test = db.collection('testcol')


def GetTextAndTranslate(finalToken):
    fromLangCode = "en"
    toLangCode = "de"
    textToTranslate = " "
    for t in test:
        # text to translate
        textToTranslate = t['name']
        # Call to Microsoft Translator Service
        headers = {"Authorization ": finalToken}
        translateUrl = "http://api.microsofttranslator.com/v2/Http.svc/Translate?text={}&to={}".format(textToTranslate, toLangCode)
        translationData = requests.get(translateUrl, headers=headers)
        # parse xml return values
        translation = ElementTree.fromstring(translationData.text.encode('utf-8'))
        # display translation if needed
        print(translation.text)


if __name__ == "__main__":
    # Add your client secret in the next line
    client_secret = 'azurepassword'
    auth_client = AzureAuthClient(client_secret)
    bearer_token = 'Bearer ' + auth_client.get_access_token()
I just get nothing. The console runs for less than a second and then I can enter a new command at the terminal, but no result is displayed; I also tried writing the output to a file. Azure tells me that I called the API, but I can't see what was processed there.
Thanks for your help!
I tried to test your code for calling the Azure Translator API, and I found that both the translator part and the Arango part work fine on their own. Since the code you posted is not complete, my only guess is that the function GetTextAndTranslate(finalToken) should be defined as GetTextAndTranslate(test, finalToken) so that the test collection can be passed in as an argument, like below.
def GetTextAndTranslate(test, finalToken):
    # Your code
    ........


if __name__ == "__main__":
    client = ArangoClient(
        protocol='http',
        host='localhost',
        port=32768,
        username='root',
        password='password',
        enable_logging=True
    )
    db = client.database('testdb')
    test = db.collection('testcol')

    # Add your client secret in the next line
    client_secret = 'azurepassword'
    auth_client = AzureAuthClient(client_secret)
    bearer_token = 'Bearer ' + auth_client.get_access_token()
    GetTextAndTranslate(test, bearer_token)
Hope it helps. If there is any update, please feel free to let me know.
I'm trying to create REST API endpoints using the Flask framework. This is my fully working script:
from flask import Flask, jsonify
from flask_restful import Resource, Api
from flask_restful import reqparse
from sqlalchemy import create_engine
from flask.ext.httpauth import HTTPBasicAuth
from flask.ext.cors import CORS

conn_string = "mssql+pyodbc://x:x@x:1433/x?driver=SQL Server"

auth = HTTPBasicAuth()


@auth.get_password
def get_password(username):
    if username == 'x':
        return 'x'
    return None


app = Flask(__name__)
cors = CORS(app)
api = Api(app)


class Report(Resource):
    decorators = [auth.login_required]

    def get(self):
        parser = reqparse.RequestParser()
        parser.add_argument('start', type=str)
        parser.add_argument('end', type=str)
        args = parser.parse_args()
        e = create_engine(conn_string)
        conn = e.connect()
        stat = """
            select x from report
            """
        query = conn.execute(stat)
        json_dict = []
        for i in query.cursor.fetchall():
            res = {'x': i[0], 'xx': i[1]}
            json_dict.append(res)
        conn.close()
        e.dispose()
        return jsonify(results=json_dict)


api.add_resource(Report, '/report')

if __name__ == '__main__':
    app.run(host='0.0.0.0')
The issue is that I get results from this API for only a day or so, after which I stop getting results unless I restart my script (or sometimes even my VM), after which I get results again. I reckon there is some issue with the database connection pool, but I'm closing the connection and disposing of the engine as well. I have no idea why the API gives me results for only a limited time, because of which I have to restart my VM every single day. Any ideas?
In my experience, the issue is caused by calling create_engine(conn_string) inside the Report class, so that a DB connection pool is created and destroyed on every REST request. That is not the correct way to use SQLAlchemy, and it can cause I/O resource clashes around the DB connections; see the engine.dispose() description at http://docs.sqlalchemy.org/en/rel_1_0/core/connections.html#sqlalchemy.engine.Engine.
To resolve the issue, move e = create_engine(conn_string) to module level, just below the line conn_string = "mssql+pyodbc://x:x@x:1433/x?driver=SQL Server", and remove e.dispose() from the Report class, as shown below.
conn_string = "mssql+pyodbc://x:x@x:1433/x?driver=SQL Server"
e = create_engine(conn_string)  # moved to here, at module level

In the def get(self) method:

    args = parser.parse_args()
    # moved away: e = create_engine(conn_string)
    conn = e.connect()

and

    conn.close()
    # removed: e.dispose()
    return jsonify(results=json_dict)
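Putting the pieces together, here is a minimal sketch of the corrected layout, reusing the imports and names from your script; the engine and its pool now live for the whole process:

# (imports as in your script)
conn_string = "mssql+pyodbc://x:x@x:1433/x?driver=SQL Server"
e = create_engine(conn_string)  # one engine (and pool) per process


class Report(Resource):
    decorators = [auth.login_required]

    def get(self):
        parser = reqparse.RequestParser()
        parser.add_argument('start', type=str)
        parser.add_argument('end', type=str)
        args = parser.parse_args()
        conn = e.connect()  # borrow a pooled connection
        query = conn.execute("select x from report")
        json_dict = [{'x': i[0], 'xx': i[1]} for i in query.cursor.fetchall()]
        conn.close()  # return the connection to the pool; no dispose()
        return jsonify(results=json_dict)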
I am writing a verify-email-address Python file for Google App Engine. (Yes, I know Django has built-in support for this, but I wanted to write my own, because that is how I learn.)
Below is the Python code. The code returns "Email Account Verified", which suggests the queries worked. However, when I look at the active column in the database, it is still 0.
If I run the query string that logging.info("%s", db_query) logs directly against the database, it works and active is updated to 1.
All my other Python code (with UPDATEs) works fine; the only difference is that those Python files are called from my iOS app and this one is called from a browser.
# Make the libs folder with 3rd party libraries and common methods
import sys
sys.path.insert(0, 'libs')

# Imports
import logging
import webapp2
from django.utils.html import strip_tags
import common
import MySQLdb
import json

VERIFIED_HTML = """\
<html>
  <body>
    <h1>Email Account Verified</h1>
  </body>
</html>
"""

ERROR_HTML = """\
<html>
  <body>
    <h1>ERROR</h1>
  </body>
</html>
"""


class VerifyEmail(webapp2.RequestHandler):
    def get(self):
        user_email = strip_tags(self.request.get('user_email').lower().strip())
        user_activation_hash = strip_tags(self.request.get('user_activation_hash').strip())
        logging.info("User Email = %s", user_email)
        logging.info("User Activation Hash = %s", user_activation_hash)
        # Get the database connection to Google Cloud SQL
        db = common.connect_to_google_cloud_sql()
        db_cursor = db.cursor(MySQLdb.cursors.DictCursor)
        # Check to see if the user already exists
        db_query = """SELECT email, activation_hash FROM users
                      WHERE email='%s' AND activation_hash='%s'""" % (user_email, user_activation_hash)
        db_cursor.execute(db_query)
        # If there is one record matching, flip the active flag
        if db_cursor.rowcount == 1:
            db_query = """UPDATE users SET active=%s WHERE email='%s';""" % (1, user_email)
            logging.info("%s" % db_query)
            if db_cursor.execute(db_query):
                self.response.write(VERIFIED_HTML)
            else:
                self.response.write(ERROR_HTML)
        else:  # either no user, or activation_hash doesn't match
            self.response.write(ERROR_HTML)
And here is connect_to_google_cloud_sql() from common.py:
def connect_to_google_cloud_sql():
    # hostname = DEV_DB_HOSTNAME
    # hostname = PROD_DB_HOSTNAME
    db_username = 'dummy_user'      # not real
    db_password = 'dummypassword'   # not real
    # If PROD or deployed testing, use unix_socket
    if os.getenv('SERVER_SOFTWARE') and os.getenv('SERVER_SOFTWARE').startswith('Google App Engine/'):
        db = MySQLdb.connect(unix_socket='/cloudsql/' + _DATABASE_HOSTNAME, db='dummydbname',
                             user=db_username, passwd=db_password)
    else:  # Local testing uses host
        db = MySQLdb.connect(host=_DATABASE_HOSTNAME, port=3306, db='dummydbname',
                             user=db_username, passwd=db_password)
    logging.info("Got DB Connection")
    return db
Any suggestions? Is it a Google Cloud SQL privileges issue?
Maybe it's because I was using my browser against the local App Engine dev server running on my local IP?
You need to call .commit() on the MySQLdb connection after executing your UPDATE. This is why your update is failing: the UPDATE runs inside a transaction, but when your code ends without committing that transaction, the changes to the DB are rolled back, even though execute() reported success.
You can also enable autocommit on the connection so that every statement is committed implicitly: db.autocommit(True).
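Concretely, in your handler the fix is one extra line; a sketch, shown here with the query parameterized, which is safer than string interpolation:

db = common.connect_to_google_cloud_sql()
db_cursor = db.cursor(MySQLdb.cursors.DictCursor)
db_cursor.execute("UPDATE users SET active=%s WHERE email=%s", (1, user_email))
db.commit()  # without this, the UPDATE is rolled back when the connection closes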