I have a Python file called testing_file.py:
from datetime import datetime
import MySQLdb

# Open database connection
class DB():
    def __init__(self, server, user, password, db_name):
        db = MySQLdb.connect(server, user, password, db_name)
        self.cur = db.cursor()

    def time_statistic(self, start_date, end_date):
        time_list = {}
        sql = "SELECT activity_log.datetime, activity_log.user_id FROM activity_log"
        self.cur.execute(sql)
        self.date_data = self.cur.fetchall()
        for content in self.date_data:
            timestamp = str(content[0])
            datetime_object = datetime.strptime(timestamp, '%Y-%m-%d %H:%M:%S')
            timestamps = datetime.strftime(datetime_object, "%Y-%m-%d")
            if start_dt <= timestamps and timestamps <= end_dt:
                if timestamps not in time_list:
                    time_list[timestamps] = 1
                else:
                    time_list[timestamps] += 1
        return json.dumps(time_list)

start_date = datetime.strptime(str('2017-4-7'), '%Y-%m-%d')
start_dt = datetime.strftime(start_date, "%Y-%m-%d")
end_date = datetime.strptime(str('2017-5-4'), '%Y-%m-%d')
end_dt = datetime.strftime(end_date, "%Y-%m-%d")

db = DB("host", "user_db", "pass_db", "db_name")
db.time_statistic(start_date, end_date)
I want to access the result (time_list) through an API using Flask. This is what I've written so far; it doesn't work, and I've also tried another way:
from flask import Flask
from testing_api import *

app = Flask(__name__)

@app.route("/")
def get():
    db = DB("host", "user_db", "pass_db", "db_name")
    d = db.time_statistic()
    return d

if __name__ == "__main__":
    app.run(debug=True)
Question: This is my first time working with an API and Flask. Can anyone please help me through this? Any hints are appreciated. Thank you.
I've got an empty result: {}
There are many things wrong with what you are doing.
1. def get(self, DB) - why self? This function does not belong to a class; it is not an instance method. self is a reference to the class instance when an instance method is called. Here it is not only unnecessary, it is plainly wrong.
2. If you look into Flask's routing documentation a little, you will see how to declare a route with a parameter. In essence you should do something like this:
#app.route("/path/<variable>")
def route_func(variable):
return variable
3. Finally, one more thing I would like to mention: please do not name a regular Python file test_<filename>.py unless you plan to use it as a unit-testing file. It is very confusing.
Oh, and you have already imported DB from your module, so there is no need to pass it as a parameter to the function; it is available inside it anyway.
There are quite a few things that are wrong (ranging from "useless and unclear" to "plain wrong") in your code.
Regarding the TypeError: as the error message says, your get() function expects two arguments (self and DB) which won't be passed by Flask - and which are actually not used in the function anyway. Remove both arguments and you'll get rid of this error - only to find that you now have a NameError inside the get() function (obviously, since you neither imported time_statistic nor defined start_date and end_date).
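To make those points concrete, a minimal sketch of the Flask side might look like this (assumptions: the module shown above is importable as testing_file, and time_statistic is adjusted to compare against the start_date / end_date strings it receives rather than the module-level start_dt / end_dt):

from flask import Flask
from testing_file import DB  # assumes the file above is saved as testing_file.py

app = Flask(__name__)

@app.route("/")
def get():
    # hard-coded here only for the sketch; these could come from query parameters
    db = DB("host", "user_db", "pass_db", "db_name")
    # time_statistic already returns a JSON string, which Flask can return directly
    return db.time_statistic("2017-04-07", "2017-05-04")

if __name__ == "__main__":
    app.run(debug=True)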
I hope everything is going well.
I'm working on a really big project that wasn't set up by me. The project is built using Flask and CORS.
I'm trying to create a query to update a row with SQLAlchemy, following the structure the project already has. Basically it looks like this:
#app.route("/update-topic", methods=['PATCH'])
async def update_by_id():
input_data = request.get_json()
await update_record(input_data)
return ApplicationTopicSchema(many=True).dump(data)
As you can see, the code above is just a simple endpoint with the PATCH method that gets the input data and passes it to a function update_record(). That function is in charge of updating the record, as you can see in the next snippet:
from sqlalchemy import and_, update

class AppTopics(Base):
    __tablename__ = AppTopics.__table__

    async def update_record(self, data):
        id_data = data['id']
        query = self.__tablename__.update().\
            where(self.__tablename__.c.id == id_data).values(**data).returning(self.__tablename__)
        await super().fetch_one(query=query)
        return 'updated'
Basically it's something like that, and when I try to use the endpoint I get the following error message:
TypeError: The response value returned by the view function cannot be None
Executing <Handle <TaskWakeupMethWrapper object at 0x000001CAD3970F10>(<Future f
inis...events.py:418>) created at C:\Python\Python380\lib\asyncio\tasks.py:881>
Also, I tried structuring the query another way, like this:
from sqlalchemy import and_, update

class AppTopics(Base):
    __tablename__ = AppTopics.__table__

    async def update_record(self, data):
        u = update(self.__tablename__)
        u = u.values({"topic": data['topic']})
        u = u.where(self.__tablename__.c.id == data['id'])
        await super().fetch_one(query=u)
        return 'updated'
However, I got the same error.
Do you know what is happening and what this error means?
TypeError: The response value returned by the view function cannot be None
Executing <Handle <TaskWakeupMethWrapper object at 0x000001B1B4861100>(<Future f
inis...events.py:418>) created at C:\Python\Python380\lib\asyncio\tasks.py:881>
Thanks in advance for your help and time.
Have a good day, evening, afternoon :)
The error message "TypeError: The response value returned by the view function cannot be None" is indicating that the view function (in this case, the update_by_id function) is not returning a value.
It seems that the function update_record does not return anything. If you want to return the string "updated" after updating the record, you should use a return statement like this:
async def update_record(self, data):
    # update code here
    return 'updated'
And in the update_by_id function you should capture the return value of await update_record(input_data) so you can return it:
async def update_by_id():
    input_data = request.get_json()
    result = await update_record(input_data)
    return result
Another point: in the second example you are not returning anything either, so you should add a return statement before the end of the function.
Also, you are returning ApplicationTopicSchema(many=True).dump(data), but the variable data is not defined in the function; you should use the result variable returned by update_record instead:
async def update_by_id():
    input_data = request.get_json()
    result = await update_record(input_data)
    return ApplicationTopicSchema(many=True).dump(result)
It's also worth noting that in the first example update_record is defined as an instance method (it takes self), yet the view calls it as a plain function; that mismatch could cause issues with the class.
It's also important to check whether the fetch_one call from super() is actually awaited and whether it returns something; otherwise it could be the cause of the None return value.
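For illustration, here is a rough sketch of how update_record could hand the updated row back to the view (it keeps the question's structure; whether super().fetch_one() returns the row produced by RETURNING is an assumption based on the code shown):

async def update_record(self, data):
    query = (
        self.__tablename__.update()
        .where(self.__tablename__.c.id == data['id'])
        .values(**data)
        .returning(self.__tablename__)
    )
    row = await super().fetch_one(query=query)
    return row  # give the view something other than None to serialize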
My understanding and knowledge are limited, but I hope this helps. Feel free to shoot me any further questions.
My Flask app gets data from a URL, but only during a certain time window. Outside that window the URL returns no data, so the app should fall back to the last data fetched from the URL, which is saved in the cache. In other words, I want to reuse the last data in the cache.
import re
import requests
from flask_app import app
from flask import jsonify, abort, make_response, request
from flask.ext.sqlalchemy import SQLAlchemy
from flask.ext.cache import Cache
from datetime import datetime, time

app.config['CACHE_TYPE'] = 'simple'
app.cache = Cache(app)

@app.route('/top', methods=['GET'])
@app.cache.cached(timeout=60)
def top():
    now = datetime.now()
    now_time = now.time()
    if now_time >= time(10,30) and now_time <= time(16,30):
        print "within time, use data saved in cache"
        # function that uses the last data queried from the url, saved in cache
    else:
        page = requests.get('http://www.abvfc.com')
        data = re.findall(r'items:(.*)', page.content)
        return jsonify(data)
The problem is that I can't get the last cached data. If the /top API hasn't been accessed in the last 60 seconds, there is no data at all.
Cache the data one minute before the URL stops returning data at 16:30.
Users should be able to use the cached data outside the time range.
I am not familiar with caching, so maybe my current idea is not the best way.
I am not a Flask user, but perhaps this is the decorator you want:
import time
from functools import wraps

def timed_cache(cache_time: int, nullable: bool = False):
    result = ''
    timeout = 0

    def decorator(function):
        @wraps(function)
        def wrapper(*args, **kwargs):
            nonlocal result
            nonlocal timeout
            # recompute when the cache has expired, or when the cached value
            # is empty and empty values are not allowed
            if timeout <= time.time() or not (nullable or result):
                result = function(*args, **kwargs)
                timeout = time.time() + cache_time
            return result
        return wrapper
    return decorator
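A quick illustration of how the decorator above might be applied (the function name and return value are made up for the example):

import time

@timed_cache(cache_time=60)
def fetch_top():
    # stand-in for the real request/parsing step from the question
    return {"fetched_at": time.time()}

first = fetch_top()
second = fetch_top()  # within 60 seconds: served from the cache, same dict as `first`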
Assuming that you only want your cache to be used between 10:30 and 16:30, I'd change the approach you are taking with the cache a bit.
The problem I see with your current implementation is that, apart from the undesirable cache expiration during the critical time range, you'd also need to disable the cache at the moments when you want to actually return an updated response.
That said, I'll use a different strategy: saving to cache every time I compute an updated response and retrieving the response from the cache in the critical time period.
Based on the Flask-Cache documentation, I'd modify your code as follows:
import re
import requests
from flask_app import app
from flask import jsonify, abort, make_response, request
from flask.ext.sqlalchemy import SQLAlchemy
from flask.ext.cache import Cache
from datetime import datetime, time

app.config['CACHE_TYPE'] = 'simple'
app.cache = Cache(app)

@app.route('/top', methods=['GET'])
def top():
    now = datetime.now()
    now_time = now.time()
    if now_time >= time(10,30) and now_time <= time(16,30):
        print "within time, use data saved in cache"
        # note: this is None if nothing has been cached yet (e.g. the first run of the day)
        return app.cache.get('last_top_response')
    page = requests.get('http://www.abvfc.com')
    data = re.findall(r'items:(.*)', page.content)
    response = jsonify(data)
    app.cache.set('last_top_response', response)
    return response
I hope this suits your needs.
I'm serving a Flask app and would like to reload a pickle file at a specific time (e.g. 9 AM every day). I've tried putting a while loop with a time counter at the end of my Flask app, but this ends up hanging the application. Currently the setup is...
# main.wsgi
from main import app as application

# main.py
import pickle

with open("/path/to/pickle.file", "rb") as f:
    data = pickle.load(f)

@app.route("/")
def func():
    return render_template("base.html", data_to_serve=data)

# Can I write something here to reload the data at specific time points?
I am assuming the goal here is to do what I call a "poor man's cache". Ideally you'd opt for something like pymemcache and Flask's cache utils, but the snippet below will accomplish what you want. You could refactor it to reload the pickle each time, but that would rather defeat the purpose, I think.
Additionally, note that I have used a range of time to return the pickle data; 9 AM to 12 PM. You can also do something like if now.time() == time(hour=9) to accomplish what you want.
import pickle
from datetime import datetime, time

from flask import Flask, render_template

app = Flask(__name__)

with open("/path/to/pickle.file", "rb") as f:
    cached_data = pickle.load(f)

START_TIME = time(hour=9)
END_TIME = time(hour=12)  # Can also use something like timedelta

def in_range():
    now = datetime.now()
    if START_TIME <= now.time() <= END_TIME:
        return True
    return False

@app.route("/")
def func():
    if in_range():
        return render_template("base.html", data_to_serve=cached_data)
    # else do normal business
    data = 'compute new data...'
    return render_template("base.html", data_to_serve=data)
Happy coding!
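If the pickle file itself is regenerated on disk by another process, one hedged variant is to re-read it lazily whenever its modification time changes; the path and variable names below are placeholders, not part of the original answer:

import os
import pickle

PICKLE_PATH = "/path/to/pickle.file"  # placeholder path from the question
_cached = {"mtime": None, "data": None}

def load_pickle():
    """Re-read the pickle only when the file on disk has changed."""
    mtime = os.path.getmtime(PICKLE_PATH)
    if _cached["mtime"] != mtime:
        with open(PICKLE_PATH, "rb") as f:
            _cached["data"] = pickle.load(f)
        _cached["mtime"] = mtime
    return _cached["data"]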
If you want to reload the data at a specific point in time, you have two options:
1. Do it from the client side, using JavaScript and AJAX requests driven by a timer.
2. Use WebSockets. There is a library for Flask called flask-socketio; a minimal sketch of this option follows below.
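For option 2, something along these lines might work (a sketch only, not tested against the question's app: the pickle path, event name, and polling interval are placeholders, and clients would need to listen for the event in JavaScript):

import pickle
from datetime import datetime

from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)

def reload_loop():
    # Background task: around 9 AM each day, re-read the pickle and push it to clients.
    while True:
        socketio.sleep(60)  # placeholder polling interval
        now = datetime.now()
        if now.hour == 9 and now.minute == 0:
            with open("/path/to/pickle.file", "rb") as f:  # placeholder path
                data = pickle.load(f)
            socketio.emit("data_reloaded", {"data": str(data)})

if __name__ == "__main__":
    socketio.start_background_task(reload_loop)
    socketio.run(app, debug=True)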
I'm trying to find a way to store a function in a Django Model.
My use case is an emailer system with planned and conditional sending.
"send this email tomorrow if John has not logged in".
Ideally I would like to create emails with a test function field.
def my_test(instance):
    if (now() - instance.user.last_login) > timedelta(days=1):
        return False
    return True

Email(send_later_date=now() + timedelta(days=1),
      conditional=my_test)
Here is what I'm thinking about
import importlib

Email(send_later_date=now() + timedelta(days=1),
      conditional='my_app.views.my_test')

class Email(models.Model):
    send_later_date = models.DateTimeField(null=True, blank=True)
    conditional = models.TextField()

    def send_email(self):
        if self.send_later_date < now():
            if not self.conditional:
                function_send_email()
            else:
                function_string = self.conditional
                mod_name, func_name = function_string.rsplit('.', 1)
                mod = importlib.import_module(mod_name)
                func = getattr(mod, func_name)
                if func():
                    function_send_email()
How would you do that? I was thinking of storing the function name in a text field. How would you run it when needed? importlib seems interesting...
Storing the function name is a valid approach; it is actually the one used by the pickle module. It has the benefit (in your case) that if you update the code of a function, the change automatically applies to existing e-mails (but be careful with backward compatibility).
For convenience and safety (especially if the function name could come from user input), you may want to keep all such functions in the same module and store only the function name in the DB, not the full import path. You can then simply import the function repository module and look the function up with getattr. I imagine you will also want to pass parameters to these functions; if you don't want to restrict yourself to a given number / order / type of arguments, you could store a dictionary in the DB as a JSON string and expand it with the ** operator.
import json

import my_app.func_store as store

func = getattr(store, self.conditional)        # look the function up by name
params = json.loads(self.conditional_params)   # kwargs stored as a JSON string
if func(**params):
    function_send_email()
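Tied back to the question's model, the whole thing could look roughly like this (a sketch only: my_app.func_store, conditional_params, and function_send_email are the names assumed above, the last one being the question's placeholder for the actual send):

import importlib
import json

from django.db import models
from django.utils.timezone import now


class Email(models.Model):
    send_later_date = models.DateTimeField(null=True, blank=True)
    conditional = models.TextField(blank=True)            # bare function name, e.g. 'my_test'
    conditional_params = models.TextField(default='{}')   # JSON-encoded kwargs

    def send_email(self):
        if self.send_later_date and self.send_later_date < now():
            if not self.conditional:
                function_send_email()  # question's placeholder for the actual send
                return
            store = importlib.import_module('my_app.func_store')  # assumed repository module
            func = getattr(store, self.conditional)
            params = json.loads(self.conditional_params)
            if func(**params):
                function_send_email()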
I'm brand new to Python and I'm trying to write an extension to an app that imports GA (Google Analytics) information and parses it into MySQL. There is a shamefully sparse amount of information on the topic. The Google docs only seem to have examples in JS and Java...
...I have gotten to the point where my user can authenticate into GA using AuthSub. That code is here:
import gdata.service
import gdata.analytics
from django import http
from django import shortcuts
from django.shortcuts import render_to_response

def authorize(request):
    next = 'http://localhost:8000/authconfirm'
    scope = 'https://www.google.com/analytics/feeds'
    secure = False  # set secure=True to request secure AuthSub tokens
    session = False
    auth_sub_url = gdata.service.GenerateAuthSubRequestUrl(next, scope, secure=secure, session=session)
    return http.HttpResponseRedirect(auth_sub_url)
So, the next step is getting at the data. I have found this library (beware, the UI is offensive): http://gdata-python-client.googlecode.com/svn/trunk/pydocs/gdata.analytics.html
However, I have found it difficult to navigate. It seems like I should be using gdata.analytics.AnalyticsDataEntry.getDataEntry(), but I'm not sure what it is asking me to pass it.
I would love a push in the right direction. I feel I've exhausted google looking for a working example.
Thank you!!
EDIT: I have gotten farther, but my problem still isn't solved. The method below returns data (I believe)... The error I get is: "'str' object has no attribute '_BecomeChildElement'". I believe I am returning a feed? However, I don't know how to drill into it. Is there a way for me to inspect this object?
def auth_confirm(request):
    gdata_service = gdata.service.GDataService('iSample_acctSample_v1.0')
    feedUri = 'https://www.google.com/analytics/feeds/accounts/default?max-results=50'
    # request feed
    feed = gdata.analytics.AnalyticsDataFeed(feedUri)
    print str(feed)
Maybe this post can help out. It seems there are no Analytics-specific bindings yet, so you are working with the generic gdata library.
I've been using GA for a little over a year now, and since about April 2009 I have used the Python bindings supplied in a package called python-googleanalytics by Clint Ecker et al. So far, it works quite well.
Here's where to get it: http://github.com/clintecker/python-googleanalytics.
Install it the usual way.
To use it: First, so that you don't have to manually pass in your login credentials each time you access the API, put them in a config file like so:
[Credentials]
google_account_email = youraccount@gmail.com
google_account_password = yourpassword
Name this file '.pythongoogleanalytics' and put it in your home directory.
And from an interactive prompt type:
from googleanalytics import Connection
import datetime

connection = Connection()  # pass in id & pw as strings **if** not in config file
account = connection.get_account(<*your GA profile ID goes here*>)
start_date = datetime.date(2009, 12, 1)
end_date = datetime.date(2009, 12, 13)
# account object does the work, specify what data you want w/
# 'metrics' & 'dimensions'; see 'USAGE.md' file for examples
account.get_data(start_date=start_date, end_date=end_date, metrics=['visits'])
The get_account call returns an account object (bound to the variable 'account' above), and get_data returns a Python list which contains your data.
You need 3 files within the app: client_secrets.json, analytics.dat and google_auth.py.
Create a module Query.py within the app:
class Query(object):
    def __init__(self, startdate, enddate, filter, metrics):
        self.startdate = startdate.strftime('%Y-%m-%d')
        self.enddate = enddate.strftime('%Y-%m-%d')
        self.filter = "ga:medium=" + filter
        self.metrics = metrics
Example models.py (it has the following function):
import google_auth

service = google_auth.initialize_service()

def total_visit(self):
    object = AnalyticsData.objects.get(utm_source=self.utm_source)
    trial = Query(object.date.startdate, object.date.enddate, object.utm_source, "ga:sessions")
    result = service.data().ga().get(ids='ga:<your-profile-id>', start_date=trial.startdate, end_date=trial.enddate, filters=trial.filter, metrics=trial.metrics).execute()
    total_visit = result.get('rows')
    <yr save command, ColumnName.object.create(data=total_visit) goes here>