I know it is possible to create a session object using the session_transaction() method. However, is there a way to access the current session object that gets created when, for example, the "/" route gets hit? I did from flask import session to access the session, but it's empty. Let me know if it is possible. Thanks.
The session_transaction() helper is what you're looking for. As the docs note, though, you have to work through the test client instance you create in your with statement.
with app.test_client() as c:
    with c.session_transaction() as sess:
        sess['a_key'] = 'a value'

    # once this is reached the session was stored
    result = c.get('/a_url')
    # NOT part of the 2nd context
Note that this won't work if you make the request within the scope of the with c.session_transaction() as sess block; the request needs to be made after that block has closed.
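To make the ordering concrete, here is a minimal, self-contained sketch; the app, secret key, and /a_url view are illustrative stand-ins, not part of the original setup:
from flask import Flask, session

app = Flask(__name__)
app.secret_key = "test secret"  # sessions need a secret key

@app.route('/a_url')
def a_url():
    # the view reads whatever the test pre-seeded into the session
    return session.get('a_key', 'missing')

def test_preseeded_session():
    with app.test_client() as c:
        with c.session_transaction() as sess:
            sess['a_key'] = 'a value'
        # the request is made only after the session_transaction block has closed
        result = c.get('/a_url')
        assert result.data == b'a value'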
If you want to read the session data written in your view from the test, one way is to mock the session object as a dict and verify its contents in your test. Here's an example using Python's unittest.mock:
app.py
from flask import Flask, session, request
app = Flask(__name__)
app.config["SECRET_KEY"] = "my secret key"
@app.route("/", methods=["POST"])
def index():
    session["username"] = request.form["username"]
    return "Username saved in session"
test_index.py
from unittest.mock import patch
from app import app
def test_index():
    with patch("app.session", dict()) as session:
        client = app.test_client()
        response = client.post("/", data={
            "username": "test"
        })
        assert session.get("username") == "test"
        assert response.data == b"Username saved in session"
You can use any mocking solution you prefer, of course.
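Alternatively, if you'd rather not mock anything, you can usually read the real session back through session_transaction() after the request has been made; a sketch against the same app.py as above:
from app import app

def test_index_without_mocking():
    client = app.test_client()
    response = client.post("/", data={"username": "test"})
    assert response.data == b"Username saved in session"

    # opened after the request, the transaction reflects the session
    # cookie the test client received from the view
    with client.session_transaction() as sess:
        assert sess.get("username") == "test"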
I'm getting started with Flask and Pytest in order to implement a REST service with unit tests, but I'm having some trouble.
I'd like to write a simple test for my simple endpoint, but I keep getting a "Working outside of application context." error when running the test.
This is the endpoint:
from flask import jsonify, request, Blueprint
STATUS_API = Blueprint('status_api', __name__)
def get_blueprint():
    """Return the blueprint for the main app module"""
    return STATUS_API

@STATUS_API.route('/status', methods=['GET'])
def get_status():
    return jsonify({
        'status': 'alive'
    })
And this is how I'm trying to test it (I know the assertion itself should fail):
import pytest
from routes import status_api
def test_get_status():
    assert status_api.get_status() == ''
I'm guessing I just can't call the view function without building the whole app, but if that's the case I don't really know how to approach this problem.
The Flask documentation on testing is pretty good.
Instead of importing the view functions, you should create a so-called test client, e.g. as a pytest fixture.
For my last Flask app this looked like:
@pytest.fixture
def client():
    app = create_app()
    app.config['TESTING'] = True

    with app.app_context():
        with app.test_client() as client:
            yield client
(create_app is my app factory)
Then you can easily create tests as follows:
def test_status(client):
    rv = client.get('/stats')
    assert ...
As mentioned at the beginning, the official documentation is really good.
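For the /status endpoint from the question, a test built on that fixture could look roughly like this, assuming create_app() registers the status_api blueprint without a URL prefix:
def test_get_status(client):
    rv = client.get('/status')
    assert rv.status_code == 200
    assert rv.get_json() == {'status': 'alive'}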
Have you considered trying an API client/development tool? Insomnia and Postman are popular ones. Using one might help you resolve this.
I'm having difficulty with my Cloud Function in GCP, which is simply supposed to return the raw XML stored in a GCS bucket when invoked with a basic GET request. It works fine without any authentication. However, since I added the Flask-HTTPAuth package to the mix in order to add some measure of security before exposing the endpoint, the application still deploys fine but crashes, without any hint as to why, as soon as it is invoked. The error in SD Logging is as follows:
severity: "DEBUG"
textPayload: "Function execution took 1847 ms, finished with status: 'crash'"
timestamp: "2020-07-15T17:22:15.158036700Z"
The function in question (anonymized):
from flask import Flask, request, jsonify, make_response, abort
from flask_httpauth import HTTPBasicAuth
from google.cloud import storage, secretmanager
import google.cloud.logging
import logging
import sys
app = Flask(__name__)
auth = HTTPBasicAuth()
PROJECT_ID = 'example_project'
GCS_BUCKET = 'example_bucket'
users = ['example_user']
# Instantiate logger
client = google.cloud.logging.Client()
client.get_default_handler()
client.setup_logging()
@auth.verify_password
def verify_password(username, password):
    # Instantiate the Secret Manager client.
    sm_client = secretmanager.SecretManagerServiceClient()

    # Load secrets
    name = sm_client.secret_version_path(PROJECT_ID, 'example_secrets_ref', 1)
    secrets_pass = sm_client.access_secret_version(name)
    passwords = [secrets_pass]

    if username in users and password in passwords:
        logging.info('auth success')
        return username

    logging.info('auth fail')
    return abort(403)
@app.route('/')
@auth.login_required
def latest_xml():
    try:
        request_json = request.get_json()
        storage_client = storage.Client(project=PROJECT_ID)
        bucket = storage_client.get_bucket(GCS_BUCKET)
        blob = bucket.get_blob('latest_pull.xml')
        latest_xml = blob.download_as_string()
        logging.info('Loaded blob from GCS')
        return latest_xml
    except Exception as e:
        logging.error(str(e))
        logging.error("Failed to load blob from GCS")
        sys.exit(1)

if __name__ == '__main__':
    app.run()
I've tried setting the entrypoint to both the main function and the auth function, to no avail. My question is: is it even possible to use basic auth in a GCP Cloud Function, or am I barking up the wrong tree here?
Your function doesn't follow the standard signature for an HTTP Cloud Function:
def latest_xml(request):
    ...
Here you are running a Flask web server yourself, which is neither needed nor used by Cloud Functions. However, I'd recommend you have a look at Cloud Run and add a simple, generic Dockerfile to deploy: you can ship your "function" as-is in a container and get the same behavior as Cloud Functions.
EDIT
When you use Flask, the request object is a global for each request. You use it like this:
request_json = request.get_json()
With Cloud Functions, this object is caught by the Cloud Functions platform and passed as a parameter to your function.
In the request object you have the body of the request (typically empty for a GET, for example), but also all of the request context: headers, user agent, source IP, ...
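Put together, a Cloud Functions style entry point would take the request as a parameter and read the Basic auth credentials straight from its headers; a rough sketch, where credentials_are_valid() and fetch_latest_xml_from_gcs() are hypothetical helpers standing in for the Secret Manager and GCS code above:
def latest_xml(request):
    # 'request' is the flask.Request object Cloud Functions passes in
    auth = request.authorization  # parsed Basic auth header, or None
    if auth is None or not credentials_are_valid(auth.username, auth.password):
        return ('Unauthorized', 401)

    # body and headers are read from the same request object
    request_json = request.get_json(silent=True)
    return fetch_latest_xml_from_gcs()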
I am trying to locally test a Python function that I hope to deploy as a Google Cloud Function. These functions seem to be essentially Flask based, and I have found that the best way to return JSON is to use Flask's jsonify function. This seems to work fine when deployed, but I want to set up some local unit tests, and this is where I got stuck. Simply adding the line to import jsonify results in the following error:
RuntimeError: Working outside of application context.
There are several posts here on Stack Overflow that seem relevant to this issue, and yet Google Cloud Functions do not really follow the Flask pattern. There is no app context, as far as I can tell, and there are no decorators. All of the examples I've found have not been useful for this particular use case. Can anyone suggest a method for constructing a unit test that will respect the application context and still jibe with the GCF pattern here?
I have a unit test, which I can share, but you will see the same error if you run the following, with the method invocation inside main:
import os
import json
from flask import jsonify
from unittest.mock import Mock
def dummy_request(request):
    request_json = request.get_json()
    if request_json and 'document' in request_json:
        document = request_json['document']
    else:
        raise ValueError("JSON is invalid, or missing a 'document' property")
    data = document
    return jsonify(data)

if __name__ == '__main__':
    data = {"document": "This is a test document"}
    request = Mock(get_json=Mock(return_value=data), args=data)
    result = dummy_request(request)
    print(result)
You don't really need to test whether flask.jsonify works as expected, right? It's a third-party function.
What you're actually trying to test is that flask.jsonify was called with the right data, so instead you can just patch flask.jsonify, and make assertions on whether the mock was called:
import flask
from unittest.mock import Mock, patch
def dummy_request(request):
    request_json = request.get_json()
    if request_json and 'document' in request_json:
        document = request_json['document']
    else:
        raise ValueError("JSON is invalid, or missing a 'document' property")
    data = document
    return flask.jsonify(data)

@patch('flask.jsonify')
def test(mock_jsonify):
    data = {"document": "This is a test document"}
    request = Mock(get_json=Mock(return_value=data), args=data)
    dummy_request(request)
    mock_jsonify.assert_called_once_with("This is a test document")

if __name__ == '__main__':
    test()
I'd recommend you take a look at Flask's documentation on testing Flask apps; it describes pretty well how to set up a test and get an application context.
P.S. jsonify requires an application context, but json.dumps does not. Maybe you can use the latter?
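For comparison, a version of the test that keeps jsonify but supplies the missing context might look like this; a sketch that reuses dummy_request from the snippet above, with a throwaway Flask app created only to provide the context:
import flask
from unittest.mock import Mock

app = flask.Flask(__name__)

def test_dummy_request_with_context():
    data = {"document": "This is a test document"}
    request = Mock(get_json=Mock(return_value=data), args=data)
    # jsonify needs an application/request context to build its response
    with app.test_request_context():
        result = dummy_request(request)
        assert result.get_json() == "This is a test document"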
I came across the same issue. As you've said, the Flask testing approach doesn't seem to fit well with Cloud Functions, and I was happy with how the code worked, so I didn't want to change that. Adding an application context in the test's setUp() and then using it for the required calls worked for me. Something like this...
import unittest
import main
from flask import Flask
class TestSomething(unittest.TestCase):

    def setUp(self):
        self.app = Flask(__name__)

    def test_something(self):
        with self.app.app_context():
            (body, code) = main.request_something()
            self.assertEqual(200, code, "The request did not return a successful response")

if __name__ == '__main__':
    unittest.main()
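If the function under test also reads flask.request (not just jsonify), the same idea works with test_request_context(), which pushes a request context on top of the application context. A sketch for the same TestSomething class, assuming request_something() reads a posted JSON body and a reasonably recent Flask/Werkzeug that accepts the json keyword:
    def test_something_with_request(self):
        with self.app.test_request_context('/', json={"document": "test"}):
            (body, code) = main.request_something()
            self.assertEqual(200, code)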
I am trying to test the following code with nose. The app.py file is below:
from flask import Flask, session, redirect, url_for, request
app = Flask(__name__)
@app.route('/')
def index():
    session['key'] = 'value'
    print('>>>> session:', session)
    return redirect(url_for("game"))
The test file is below:
from nose.tools import *
from flask import session
from app import app
app.config['TESTING'] = True
web = app.test_client()
def test_index():
    with app.test_request_context('/'):
        print('>>>>test session:', session)
        assert_equal(session.get('key'), 'value')
When I run the test file, I get an assertion error None != 'value', and the print statement in the test file prints an empty session object. Moreover, the print statement in app.py does not print anything. Does this mean that the index function isn't running?
Why is this happening? According to the Flask documentation (http://flask.pocoo.org/docs/1.0/testing/#other-testing-tricks), I should have access to the session contents through test_request_context().
Also, if I write the test_index function like this instead, the test works (and both print statements, in app.py and in the test file, are executed):
def test_index():
    with app.test_client() as c:
        rv = c.get('/')
        print('>>>>test session:', session)
        assert_equal(session.get('key'), 'value')
What is the difference between using Flask.test_client() and Flask.test_request_context() in a 'with' statement? As I understand it, the point of both is to keep the request context around for longer.
You're only setting up your request context; you need to actually have your app dispatch the request for anything to happen, similar to the c.get() in your full test client example.
Give the following a try and I think you'll have better luck:
def test_index():
    with app.test_request_context('/'):
        app.dispatch_request()
        print('>>>>test session:', session)
        assert_equal(session.get('key'), 'value')
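One caveat worth knowing: dispatch_request() only looks up and calls the view function. If the app also relies on before_request/after_request hooks, full_dispatch_request() exercises those too; a sketch of the same test using it:
def test_index():
    with app.test_request_context('/'):
        app.full_dispatch_request()  # also runs before/after request hooks
        print('>>>>test session:', session)
        assert_equal(session.get('key'), 'value')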
I have a Flask REST API, running with a gunicorn/nginx stack. There is global SQLAlchemy session set up once for each thread that the API runs on. I set up an endpoint /test/ for running the unit tests for the API. One test makes a POST request to add something to the database, then has a finally: clause to clean up:
def test_something():
    try:
        url = "http://myposturl"
        data = {"content": "test post"}
        headers = {'content-type': 'application/json'}
        result = requests.post(url, json=data, headers=headers).json()
        validate(result, myschema)
    finally:
        db.sqlsession.query(MyTable).filter(MyTable.content == "test post").delete()
        db.sqlsession.commit()
The problem is that the thread to which the POST request was made now has a "test post" object in its session, but the database has no such object, because the thread the tests ran on deleted it from the database. So when I make a GET request to the server, about 1 in 4 times (I have 4 gunicorn workers) I get the "test post" object, and 3 in 4 times I do not. The threads each have their own session object and they are getting out of sync, but I don't really know what to do about it....
Here is my setup for my SQLAlchemy session:
def connectSQLAlchemy():
    import sqlalchemy
    import sqlalchemy.orm
    engine = sqlalchemy.create_engine(connection_string(DBConfig.USER, DBConfig.PASSWORD, DBConfig.HOST, DBConfig.DB))
    session_factory = sqlalchemy.orm.sessionmaker(bind=engine)
    Session = sqlalchemy.orm.scoped_session(session_factory)
    return Session()

# Create a global session for everyone
sqlsession = connectSQLAlchemy()
Please use Flask-SQLAlchemy if you're using Flask; it takes care of the session lifecycle for you.
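For reference, a minimal Flask-SQLAlchemy setup looks roughly like this; the database URI is a placeholder:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'postgresql://user:password@host/mydb'
db = SQLAlchemy(app)

# db.session is a scoped session tied to the app context; Flask-SQLAlchemy
# creates it on demand and removes it for you when the context ends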
If you insist on doing it yourself, the correct pattern is to create a session for each request instead of having a global session. You should be doing
Session = scoped_session(session_factory, scopefunc=flask._app_ctx_stack.__ident_func__)
return Session
instead of
Session = scoped_session(session_factory)
return Session()
And do
session = Session()
every time you need a session. By virtue of the scoped_session and the scopefunc, this will give you a different session for each request, but the same session within a single request.
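With that scoped_session registry in place, you'd also want to remove the session when the app context ends so nothing leaks between requests; a sketch, assuming Session is the registry returned above:
@app.teardown_appcontext
def remove_session(exception=None):
    # returns the connection to the pool and discards the context-local
    # session so the next request gets a fresh one
    Session.remove()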
Figured it out. What I did was to add a setup and teardown to the request in my app's __init__.py:
@app.before_request
def startup_session():
    db.session = db.connectSQLAlchemy()

@app.teardown_request
def shutdown_session(exception=None):
    db.session.close()
still using the global session object in my db module:
db.py:
....
session = None
....
The scoped_session handles the different threads, I think...
Please advise if this is a terrible way to do this for some reason. =c)