app.py

from flask import Flask, render_template, request, jsonify, json, g
import mysql.connector

app = Flask(__name__)

class TestMySQL():
    @app.before_request
    def before_request():
        try:
            g.db = mysql.connector.connect(user='root', password='root', database='mysql')
        except mysql.connector.errors.Error as err:
            resp = jsonify({'status': 500, 'error': "Error: {}".format(err)})
            resp.status_code = 500
            return resp

    @app.route('/')
    def input_info(self):
        try:
            cursor = g.db.cursor()
            cursor.execute('CREATE TABLE IF NOT EXISTS testmysql (id INT NOT NULL AUTO_INCREMENT PRIMARY KEY, '
                           'name VARCHAR(40) NOT NULL, email VARCHAR(40) NOT NULL UNIQUE)')
            cursor.close()
test.py

import unittest
from unittest.mock import patch
from app import *

class Test(unittest.TestCase):
    def test_connection1(self):
        with patch('__main__.mysql.connector.connect') as mock_mysql_connector_connect:
            object = TestMySQL()
            object.before_request()  # RuntimeError on calling this
I am importing app into test.py for unit testing. On calling the 'before_request' function in test.py, it throws RuntimeError: working outside of application context.
The same happens when calling 'input_info()'.
Flask has an Application Context, and it seems like you'll need to do something like:
def test_connection(self):
    with app.app_context():
        # test code
You can probably also shove the app.app_context() call into a test setup method as well.
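For example, a test setup method might look like this (a minimal sketch: a bare Flask app stands in for the one imported from app.py, and the use of g is purely illustrative):

```python
import unittest

from flask import Flask, g

app = Flask(__name__)  # stands in for the app imported from app.py

class TestMySQL(unittest.TestCase):
    def setUp(self):
        # Push an application context so code that touches g or
        # current_app works outside an actual request.
        self.ctx = app.app_context()
        self.ctx.push()

    def tearDown(self):
        # Pop the context again so tests stay isolated.
        self.ctx.pop()

    def test_uses_app_context(self):
        g.db = 'fake-connection'  # anything that uses g now works
        self.assertEqual(g.db, 'fake-connection')
```

Run it with `python -m unittest` as usual; every test method then executes inside an application context.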
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///todo.db'
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
db = SQLAlchemy(app)
app.app_context().push()
Run in terminal
>python
>>>from app import app
>>>from app import db
>>>db.create_all()
Now it should work
I followed the answer from @brenns10 when I ran into a similar problem using pytest.
I followed the suggestion of putting it into the test setup, and this works:

import pytest
from src.app import app

@pytest.fixture
def app_context():
    with app.app_context():
        yield

def some_test(app_context):
    # <test code that needs the app context>
I am using Python 3.8 and had to use a small variation on the answers already posted. I included the below in my pytest file and didn't have to change anything else in the rest of the tests.
import pytest
from flask import Flask

@pytest.fixture(autouse=True)
def app_context():
    app = Flask(__name__)
    with app.app_context():
        yield
This can also be used as a context manager.
The main difference to note here is that the Flask app is created within the test file rather than being imported from the main application file.
I have a simple Flask app like this:
simple_app/lib/oracle.py:
import cx_Oracle
from flask import current_app

def create_pool():
    # Create session pool
    pool = cx_Oracle.SessionPool(user=current_app.config['USER'],
                                 password=current_app.config['PASSWD'],
                                 dsn=current_app.config['DSN'],
                                 min=4,
                                 max=10,
                                 increment=1,
                                 threaded=True,
                                 getmode=cx_Oracle.SPOOL_ATTRVAL_WAIT)
    return pool
simple_app/routes/__init__.py:
from .hello import hello_bp
from .oracle_pdb_backup import oracle_pdb_backup_bp

def init_app(app):
    app.register_blueprint(hello_bp)
    app.register_blueprint(oracle_pdb_backup_bp)
simple_app/routes/hello.py:
from flask import Blueprint, jsonify

hello_bp = Blueprint('hello', __name__)

@hello_bp.route('/hello', methods=['GET'])
def hello_get():
    return jsonify(message = f'hello world!'), 200
simple_app/routes/oracle_pdb_backup.py:
from flask import Blueprint, jsonify
from ..lib.oracle import *

oracle_pdb_backup_bp = Blueprint('oracle_pdb_backup', __name__)

@oracle_pdb_backup_bp.route('/oracle/pdb/backup', methods=['GET'])
def oracle_pdb_backup_post():
    pool = ?????? how to access pool defined at app creation ??????
    # Acquire a connection from the pool
    connection = pool.acquire()
    # Use the pooled connection
    cursor = connection.cursor()
    cursor.execute('select * from dual')
    result = cursor.fetchall()
    cursor.close()
    # Release the connection to the pool
    pool.release(connection)
    return jsonify(message = result), 200
simple_app/__init__.py:
from flask import Flask
from flask_cors import CORS
from .lib.oracle import *
from .routes import *

def create_app():
    app = Flask(__name__)
    CORS(app)
    app.secret_key = 'dev'
    app.config.from_object('configSIMPLE')
    with app.app_context():
        create_pool()
    routes.init_app(app)
    return app
I start the app with the following commands:
export PATH=~/python3/bin:$PATH
export LD_LIBRARY_PATH=~/instantclient_19_12
cd ~/myapps
export FLASK_RUN_HOST=`hostname`
export FLASK_RUN_PORT=8080
export FLASK_APP=simple_app
export FLASK_ENV=development
export FLASK_DEBUG=1
flask run
How can I create the cx_Oracle.SessionPool ONCE in create_app() and use it in all my routes?
Note: the pool is created in the create_pool() function in simple_app/lib/oracle.py.
First, I don't know if:

with app.app_context():
    create_pool()

is the correct way to initialize the pool at app creation.
Second, how do I use the created pool in my routes? See the following line in simple_app/routes/oracle_pdb_backup.py:

pool = ?????? how to access pool defined at app creation ??????
Could you please tell me what I should modify to make it work?
Thanks for your inputs
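One common way to do this (a sketch under assumptions, not a confirmed answer from this thread: the 'db_pool' key and the use of app.extensions are illustrative choices, and a plain object stands in for the cx_Oracle pool so the example is self-contained) is to create the pool once in create_app(), store it on the app object, and read it back through current_app in any route or blueprint:

```python
from flask import Flask, current_app, jsonify

def create_pool():
    # Placeholder standing in for cx_Oracle.SessionPool(...)
    return object()

def create_app():
    app = Flask(__name__)
    # Create the pool ONCE per app and stash it on the app object;
    # 'db_pool' is an arbitrary key chosen for this sketch.
    app.extensions['db_pool'] = create_pool()

    @app.route('/pool-id')
    def pool_id():
        # Any route can reach the shared pool via current_app.
        pool = current_app.extensions['db_pool']
        return jsonify(pool_id=id(pool))

    return app

app = create_app()
```

In a blueprint such as oracle_pdb_backup.py the route body would then start with pool = current_app.extensions['db_pool'] instead of the ?????? placeholder.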
I've been struggling with this for a while now. I have a Flask app that is executed in my app.py file. In this file I have a bunch of endpoints that call different functions from other files. In another file, extensions.py, I've instantiated a class that contains a Redis connection. See the file structure below.
# app.py
from flask import Flask
from extensions import redis_obj

app = Flask(__name__)

@app.route('/flush-cache', methods=['POST'])
def flush_redis():
    result = redis_obj.flush_redis_cache()
    return result
# extensions.py
from redis_class import CloudRedis

redis_obj = CloudRedis()

# redis_class.py
import redis

class CloudRedis:
    def __init__(self):
        self.conn = redis.Redis(connection_pool=redis.ConnectionPool.from_url('REDIS_URL',
                                                                              ssl_cert_reqs=None))

    def flush_redis_cache(self):
        try:
            self.conn.flushdb()
            return 'OK'
        except:
            return 'redis flush failed'
I've been attempting to use monkeypatching in a test to patch flush_redis_cache, so that when I run flush_redis() the call to redis_obj.flush_redis_cache() will just return 'OK', since I've already tested the CloudRedis class in other pytests. However, no matter what I've tried I haven't been able to successfully patch this. This is what I have below.
import pytest
from extensions import redis_obj
from app import app

@pytest.fixture()
def client():
    yield app.test_client()

def test_flush_redis_when_redis_flushed(client, monkeypatch):
    # setup
    def get_mock_flush_redis_cache():
        return 'OK'
    monkeypatch.setattr(cloud_reids, 'flush_redis_cache', get_mock_flush_redis_cache)
    cloud_redis.flush_redis = get_mock_flush_redis_cache
    # act
    res = client.post('/flush-cache')
    result = flush_redis()
Does anyone have any ideas on how this can be done?
I'm now trying to make an application that processes HTTP requests with REST APIs. The application uses Flask for the web framework, and it uses Celery for the asynchronous tasks.
Here's the application structure.
app.py
celery_app.py
controller/
controller.py
task/
task.py
(1) app.py
There are no lines for the Celery configuration.
from flask import Flask
...
app = Flask(__name__)
...
if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000, debug=True)
(2) celery_app.py
from consts.consts import Consts
from kombu.utils.url import quote
from celery import Celery
from config.config import Config
config = Config()
db_uri = 'db+' + config.get_db_uri()
aws_access_key = quote(config.get_config(Consts.AWS, Consts.AWS_ACCESS_KEY))
aws_secret_key = quote(config.get_config(Consts.AWS, Consts.AWS_SECRET_KEY))
broker_url = "sqs://{aws_access_key}:{aws_secret_key}@".format(
    aws_access_key=aws_access_key, aws_secret_key=aws_secret_key
)

celery = Celery(
    broker=broker_url,
    backend=db_uri,
    include=['task']
)
(3) task/task.py
I put all the tasks here.
from celery_app import celery
from flask import current_app as app
from model.model import db

@celery.task
def check_friend_status_from_db(user1, user2):
    status = db.engine.execute(
        "QUERY").fetchone()
    return status
Now, the controller/controller.py file imports and calls the tasks as follows.
(4) controller/controller.py
from flask import Blueprint, request, json, render_template, jsonify
from mysql.connector import Error
from sqlalchemy import or_, and_
from consts.consts import StatusCode
from consts.consts import Consts
from model.model import db, FriendRequest
from outbound.request import Request
from util.logging import logger
from task.task import check_friend_status_from_db

controller_blueprint = Blueprint('controller_blueprint', __name__)
outbound_request = Request()

@controller_blueprint.route('/friend/status/<requested_from>/<requested_to>', methods=['GET'])
def check_friend_status(requested_from, requested_to):
    logger.info('Checking the friend status')
    try:
        status = check_friend_status_from_db.apply_async((requested_from, requested_to)).get()
        if status is None:
            response = {
                Consts.STATUS_CODE: StatusCode.OK,
                Consts.FRIEND_STATUS: StatusCode.NO_RELATION
            }
        else:
            response = {
                Consts.STATUS_CODE: StatusCode.OK,
                Consts.FRIEND_STATUS: status
            }
    except Error as e:
        logger.error("TypeError: %s", e)
        response = {
            Consts.STATUS_CODE: StatusCode.ERROR
        }
    json_response = jsonify(response)
    logger.info(json_response)
    return json_response
When I run the code, I get the error mentioned in the title:
RuntimeError: No application found. Either work inside a view function or push an application context
and it turns out to be this part under the try block in the controller where the error is coming from.
status = check_friend_status_from_db.apply_async((requested_from, requested_to)).get()
Any solutions, please?
It looks like you need to register your blueprint controller_blueprint. Since this blueprint is not registered with your app, you are working outside the context of your application, hence the error.
You can do so in your app.py:
from controller.controller import controller_blueprint
app = Flask(__name__)
app.register_blueprint(controller_blueprint)
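A runnable sketch of that registration (with a stub view standing in for the question's controller and Celery pieces, which are omitted here so the example is self-contained):

```python
from flask import Blueprint, Flask, jsonify

# Stands in for controller/controller.py
controller_blueprint = Blueprint('controller_blueprint', __name__)

@controller_blueprint.route('/friend/status/<requested_from>/<requested_to>', methods=['GET'])
def check_friend_status(requested_from, requested_to):
    # Stub body; the real view dispatches the Celery task.
    return jsonify(status='OK')

# Stands in for app.py: until register_blueprint runs,
# the blueprint's routes do not exist on the app at all.
app = Flask(__name__)
app.register_blueprint(controller_blueprint)
```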
I am using Flask to create a simple API. The API simply returns values from MongoDB. Everything works great if I make the connection within the same function. I am not making the connection at the start of the file because I am using a uWSGI and Nginx server on Ubuntu, and doing so causes a forking problem.
However, I have to use this connection with other APIs, so I thought I'd make a separate class for the connection that each API simply calls. I'm doing this to keep the code manageable. However, when I try this code it always shows an internal server error. I tried making the function static too, but the error persists.
Note - I have replaced mongodb address with xxx as i am using mongodbatlas account here
from flask import Flask
from flask import request, jsonify
from flask_pymongo import pymongo
from pymongo import MongoClient
from flask_restful import Resource, Api, reqparse

app = Flask(__name__)
api = Api(app)

# client = MongoClient("xxx")
# db = client.get_database('restdb')
# records = db.stars

class dbConnect():
    def connect(self):
        client = MongoClient("xxx")
        db = client.get_database('restdb')
        records = db.stars
        return records

class Order(Resource):
    def get(self):
        # client = MongoClient("xxx")
        # db = client.get_database('restdb')
        # records = db.stars
        # star = records
        star = dbConnect.connect
        output = []
        for s in star.find():
            output.append({'name' : s['name'], 'distance' : s['distance']})
        return jsonify({'result' : output})

api.add_resource(Order, '/')

if __name__ == "__main__":
    app.run(host='0.0.0.0')
ERROR {"message": "Internal Server Error"}
Preliminary investigation suggests that you haven't instantiated your dbConnect class. Also, you haven't called the method connect properly.
class Order(Resource):
    def get(self):
        db = dbConnect()  # This was missing
        star = db.connect()  # This is how you call the method properly.
        output = []
        for s in star.find():
            output.append({'name' : s['name'], 'distance' : s['distance']})
        return jsonify({'result' : output})
Also, class dbConnect() should be declared as class dbConnect:.
I'm trying to create REST API endpoints using flask framework. This is my fully working script:
from flask import Flask, jsonify
from flask_restful import Resource, Api
from flask_restful import reqparse
from sqlalchemy import create_engine
from flask_httpauth import HTTPBasicAuth
from flask_cors import CORS

conn_string = "mssql+pyodbc://x:x@x:1433/x?driver=SQL Server"

auth = HTTPBasicAuth()

@auth.get_password
def get_password(username):
    if username == 'x':
        return 'x'
    return None

app = Flask(__name__)
cors = CORS(app)
api = Api(app)

class Report(Resource):
    decorators = [auth.login_required]

    def get(self):
        parser = reqparse.RequestParser()
        parser.add_argument('start', type = str)
        parser.add_argument('end', type = str)
        args = parser.parse_args()

        e = create_engine(conn_string)
        conn = e.connect()
        stat = """
        select x from report
        """
        query = conn.execute(stat)
        json_dict = []
        for i in query.cursor.fetchall():
            res = {'x': i[0], 'xx': i[1]}
            json_dict.append(res)
        conn.close()
        e.dispose()
        return jsonify(results=json_dict)

api.add_resource(Report, '/report')

if __name__ == '__main__':
    app.run(host='0.0.0.0')
The issue is that I get results when I call this API only for a day or so, after which I stop getting results unless I restart my script (or sometimes even my VM), after which I get results again. I reckon there is some issue with the database connection pool or something, but I'm closing the connection and disposing of the engine as well. I have no idea why the API gives me results only for a limited time, because of which I have to restart my VM every single day. Any ideas?
Per my experience, the issue was caused by calling create_engine(conn_string) inside the Report class, so that a database connection pool is created and destroyed for every REST request. That is not the correct way to use SQLAlchemy, and it causes I/O resource clashes around the DB connection; see the Engine.dispose() description at http://docs.sqlalchemy.org/en/rel_1_0/core/connections.html#sqlalchemy.engine.Engine.
To resolve the issue, move e = create_engine(conn_string) to module level, just below the line conn_string = "mssql+pyodbc://x:x@x:1433/x?driver=SQL Server", and remove the e.dispose() call from the Report class, as below.

conn_string = "mssql+pyodbc://x:x@x:1433/x?driver=SQL Server"
e = create_engine(conn_string)  # Moved to here

In the def get(self) function:

args = parser.parse_args()
# Moved out: e = create_engine(conn_string)
conn = e.connect()

and

conn.close()
# Removed: e.dispose()
return jsonify(results=json_dict)
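The resulting shape can be sketched in a self-contained way (using an in-memory SQLite engine in place of the question's MSSQL connection string, purely for illustration):

```python
from sqlalchemy import create_engine, text

# The engine (and its connection pool) is created ONCE, at module level.
engine = create_engine('sqlite://')

def get_report():
    # Each request only borrows a connection from the shared pool and
    # returns it on exit; the engine itself is never disposed.
    with engine.connect() as conn:
        return conn.execute(text('select 1')).scalar()
```

Because the pool outlives any single request, repeated calls reuse connections instead of exhausting database resources over time.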