Suppose my main.py is like this (this is a simplified example, in my app I use an actual database and I have two different database URIs for development and testing):
from fastapi import FastAPI
from pydantic import BaseSettings

app = FastAPI()

class Settings(BaseSettings):
    ENVIRONMENT: str

    class Config:
        env_file = ".env"
        case_sensitive = True

settings = Settings()

databases = {
    "dev": "Development",
    "test": "Testing"
}

database = databases[settings.ENVIRONMENT]

@app.get("/")
def read_root():
    return {"Environment": database}
while the .env is
ENVIRONMENT=dev
Suppose I want to test my code and I want to set ENVIRONMENT=test to use a testing database. What should I do? In FastAPI documentation (https://fastapi.tiangolo.com/advanced/settings/#settings-and-testing) there is a good example but it is about dependencies, so it is a different case as far as I know.
My idea was the following (test.py):
import pytest
from fastapi.testclient import TestClient
from main import app

@pytest.fixture(scope="session", autouse=True)
def test_config(monkeypatch):
    monkeypatch.setenv("ENVIRONMENT", "test")

@pytest.fixture(scope="session")
def client():
    return TestClient(app)

def test_root(client):
    response = client.get("/")
    assert response.status_code == 200
    assert response.json() == {"Environment": "Testing"}
but it doesn't work.
Furthermore I get this error:
ScopeMismatch: You tried to access the 'function' scoped fixture 'monkeypatch' with a 'session' scoped request object, involved factories
test.py:7: def test_config(monkeypatch)
env\lib\site-packages\_pytest\monkeypatch.py:16: def monkeypatch()
while according to the official pytest documentation it should work (https://docs.pytest.org/en/3.0.1/monkeypatch.html#example-setting-an-environment-variable-for-the-test-session). I have the latest version of pytest installed.
I tried to use specific test environment variables because of this: https://pydantic-docs.helpmanual.io/usage/settings/#field-value-priority.
To be honest, I'm lost; my only real aim is to have a different test configuration (in the same way Flask works: https://flask.palletsprojects.com/en/1.1.x/tutorial/tests/#setup-and-fixtures). Am I approaching the problem the wrong way?
Pydantic settings objects are mutable, so you can simply override them in your test.py:
from main import settings
settings.ENVIRONMENT = 'test'
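If you go this route, remember to restore the original value so other tests are not affected. A minimal sketch of the override-and-restore pattern, using a plain stand-in class in place of the real pydantic Settings instance from main.py:

```python
# Stand-in for the module-level pydantic BaseSettings singleton in main.py.
class Settings:
    def __init__(self, ENVIRONMENT="dev"):
        self.ENVIRONMENT = ENVIRONMENT

settings = Settings()  # the shared instance every module imports

def test_uses_test_environment():
    original = settings.ENVIRONMENT
    settings.ENVIRONMENT = "test"        # mutate the shared instance for this test
    try:
        assert settings.ENVIRONMENT == "test"
    finally:
        settings.ENVIRONMENT = original  # restore so later tests still see "dev"

test_uses_test_environment()
assert settings.ENVIRONMENT == "dev"
```

In a pytest suite the restore step is usually handled by a fixture teardown rather than a try/finally in each test.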
This is a simple way that works for me. Consider that you have a configuration file named APPNAME.cfg with the following settings:
DEV_DSN='DSN=my_dev_dsn; UID=my_dev_user_id; PWD=my_dev_password'
PROD_DSN='DSN=my_prod_dsn; UID=my_prod_user_id; PWD=my_prod_password'
Set your environment variable according to your OS or Docker setup. For Linux you could enter:
export MY_ENVIRONMENT=DEV
Now consider the following settings.py:
from pydantic import BaseSettings
import os

class Settings(BaseSettings):
    DSN: str

    class Config():
        env_prefix = f"{os.environ['MY_ENVIRONMENT']}_"
        env_file = "APPNAME.cfg"
Your app would simply need to do the following:
from settings import Settings
s = Settings()
db = pyodbc.connect(s.DSN)
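To make the lookup concrete, here is a small sketch of how the prefix resolves the DSN field to DEV_DSN or PROD_DSN. This is plain Python illustrating the mechanism, not pydantic's actual code; cfg mirrors the APPNAME.cfg values above:

```python
# Values from APPNAME.cfg, as pydantic would read them from the env_file.
cfg = {
    "DEV_DSN": "DSN=my_dev_dsn; UID=my_dev_user_id; PWD=my_dev_password",
    "PROD_DSN": "DSN=my_prod_dsn; UID=my_prod_user_id; PWD=my_prod_password",
}

environment = "DEV"              # taken from the OS/Docker environment variable
env_prefix = f"{environment}_"   # what Config.env_prefix evaluates to: "DEV_"
dsn = cfg[env_prefix + "DSN"]    # pydantic resolves the field DSN as DEV_DSN
assert dsn == cfg["DEV_DSN"]
```

Switching the environment variable to PROD flips every prefixed field to its production value without touching the code.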
Bumping an old thread because I found a solution that was a bit cleaner for my use case. I was having trouble getting test-specific dotenv files to load only while tests were running, when I also had a local development dotenv in the project dir.
You can do something like the below, where test.environment is a special dotenv file that is NOT an env_file path in the settings class Config. Because env vars > dotenv for BaseSettings, this will override any settings from a local .env, as long as it runs in conftest.py before your settings class is imported. It also guarantees that your test environment is only active while tests are being run.
# conftest.py
from dotenv import load_dotenv

# load the test env vars BEFORE the settings singleton is imported
load_dotenv("tests/fixtures/test.environment", override=True)

from app import settings  # singleton instance of the BaseSettings class
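The override=True flag is what makes this reliable: python-dotenv only replaces variables that are already set in the process when override is enabled. A sketch of that behaviour, mimicking load_dotenv with a plain function rather than the real library:

```python
import os

os.environ["ENVIRONMENT"] = "dev"  # e.g. leaked in from a local .env

def load_dotenv_sketch(values, override):
    # mimics python-dotenv: set each key, skipping keys that already
    # exist in the environment unless override is requested
    for key, value in values.items():
        if override or key not in os.environ:
            os.environ[key] = value

load_dotenv_sketch({"ENVIRONMENT": "test"}, override=False)
assert os.environ["ENVIRONMENT"] == "dev"   # existing value wins

load_dotenv_sketch({"ENVIRONMENT": "test"}, override=True)
assert os.environ["ENVIRONMENT"] == "test"  # override replaces it
```

Without override=True, a developer's local .env loaded earlier would silently win over the test file.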
It's really tricky to mock the environment when pydantic is involved.
I only achieved the desired behaviour with dependency injection in FastAPI and a get_settings function, which itself seems to be good practice, since even the documentation says to do so.
Suppose you have
...

class Settings(BaseSettings):
    ENVIRONMENT: str

    class Config:
        env_file = ".env"
        case_sensitive = True

def get_settings() -> Settings:
    return Settings()

databases = {
    "dev": "Development",
    "test": "Testing"
}

@app.get("/")
def read_root(settings: Settings = Depends(get_settings)):
    # resolve the database per request, so dependency overrides take effect
    return {"Environment": databases[settings.ENVIRONMENT]}
And in your tests you would write:
import pytest

from main import app, get_settings, Settings

def get_settings_override() -> Settings:
    return Settings(ENVIRONMENT="test")

@pytest.fixture(autouse=True)
def override_settings() -> None:
    app.dependency_overrides[get_settings] = get_settings_override
You can use scope="session" on the fixture if you'd like.
This overrides your ENVIRONMENT variable without touching the rest of the configuration variables.
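The override mechanism itself is just a dictionary lookup: app.dependency_overrides maps the original dependency callable to its replacement, and FastAPI consults that dict before calling the real dependency. A runnable sketch of the idea (plain Python illustrating the concept, not FastAPI's internals):

```python
def get_settings():
    return {"ENVIRONMENT": "dev"}      # the real dependency

def get_settings_override():
    return {"ENVIRONMENT": "test"}     # the test replacement

dependency_overrides = {}              # what app.dependency_overrides holds

def resolve(dependency):
    # FastAPI checks the overrides dict first, then falls back to the original
    return dependency_overrides.get(dependency, dependency)()

assert resolve(get_settings)["ENVIRONMENT"] == "dev"     # no override yet
dependency_overrides[get_settings] = get_settings_override
assert resolve(get_settings)["ENVIRONMENT"] == "test"    # override in effect
```

This is why the override only works for code that receives the settings through Depends(get_settings): module-level calls to get_settings() never go through the dict.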
Related
I am trying to include a router in the main FastAPI router:
from fastapi import FastAPI
from test.main.app.google_calendar_wrapper import app as calendar_manager_router

app = FastAPI()

app.include_router(calendar_manager_router, prefix="/calendar_manager", tags=["calendar_manager"])

@app.get("/")
def root():
    return {"message": "Home Page"}
However, when running
uvicorn test.main.app.webhook.router:app --port 8050 --reload
I get an error:
AttributeError: 'FastAPI' object has no attribute 'default_response_class'
My file structure is:
test
|  main
|  |  app
|  |  |  google_calendar_wrapper
|  |  |  |  endpoints.py
|  |  |  |  __init__.py
|  |  |  webhooks
|  |  |  |  router.py
So far I have tried:
Not including the router, in this case the application starts normally
google_calendar_wrapper with and without an __init__.py. With an __init__.py, I tried exporting google_calendar_wrapper and it still raised the same error.
Both routers work independently of each other, but nothing has helped so far and I have not found any solutions.
Here is the calendar_manager_router definition:
from fastapi import FastAPI
app = FastAPI()
@app.get("/")
def root():
    return {"message": "Hello World"}

@app.get("/health")
def health():
    """Api health endpoint."""
    return {"Api is up and running"}
FastAPI's include_router accepts an APIRouter, but the object you imported in the main file, calendar_manager_router, is another FastAPI object. In your google_calendar_wrapper, you should be defining an APIRouter and that's what you import and include in your main app.
In google_calendar_wrapper, change it to:
from fastapi import APIRouter

router = APIRouter()  # <---------

@router.get("/")
def root():
    return {"message": "Hello World"}

@router.get("/health")
def health():
    """Api health endpoint."""
    return {"Api is up and running"}
Notice the change to use APIRouter.
Then in your main app:
from test.main.app.google_calendar_wrapper import router as calendar_manager_router
...
app = FastAPI()
app.include_router(
    calendar_manager_router,
    prefix="/calendar_manager",
    tags=["calendar_manager"]
)
See the FastAPI tutorials on Bigger Applications - Multiple Files:
You want to have the path operations related to your users separated from the rest of the code, to keep it organized.
But it's still part of the same FastAPI application/web API (it's part of the same "Python Package").
You can create the path operations for that module using APIRouter.
...
You can think of APIRouter as a "mini FastAPI" class.
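One practical consequence: include_router prepends the prefix to each route's path, so the mounted routes answer at the combined paths. A tiny sketch of how the final paths compose:

```python
# How include_router composes the final paths: prefix + route path.
prefix = "/calendar_manager"
router_paths = ["/", "/health"]          # the two routes on the APIRouter
mounted = [prefix + path for path in router_paths]
assert mounted == ["/calendar_manager/", "/calendar_manager/health"]
```

This is also why the prefix must not end with a slash: the route paths already start with one.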
I have a Django project. By default, testing in Django only works with SQL databases, but I need it to work with MongoDB and mongoengine.
I use Django 1.9 and mongoengine 0.9, because that version supports Django.
I follow the docs here https://mongoengine.readthedocs.io/en/v0.9.0/django.html
and django docs for test https://docs.djangoproject.com/en/1.8/topics/testing/tools/
The problem is how I can config the test file to tell it I want to use mongodb database. Without any setup, the test file look like this:
import unittest
from django.test import Client
from .models import User

class UserTests(unittest.TestCase):
    def setUp(self):
        self.client = Client()

    def test_create_user(self):
        self.client.post('/users/', {'first_name': 'aaa', 'last_name': 'bbb',
                                     'username': 'xxx', 'email': 'abc@gmail.com'})
        ...
...
The error when running python manage.py test is:
raise ImproperlyConfigured("settings.DATABASES is improperly configured. "
ImproperlyConfigured: settings.DATABASES is improperly configured. Please supply the ENGINE value. Check settings documentation for more details.
In settings.py:
from mongoengine import connect

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.dummy',
    },
}

connect(
    host='mongodb://localhost/book'
)
1. Define custom DiscoverRunner for NoSQLTests
For example in yourapp/tests.py
from django.test.runner import DiscoverRunner

class NoSQLTestRunner(DiscoverRunner):
    def setup_databases(self, **kwargs):
        pass

    def teardown_databases(self, old_config, **kwargs):
        pass
2. Define custom TestCase class for NoSQLTests.
For example in yourapp/tests.py
from django.test import TestCase

class NoSQLTestCase(TestCase):
    def _fixture_setup(self):
        pass

    def _fixture_teardown(self):
        pass
3. Change default TEST_RUNNER in your settings.py
TEST_RUNNER = 'yourapp.tests.NoSQLTestRunner'
4. Write tests
Tests that do not require a database:
class YourTest(NoSQLTestCase):
    def test_foo(self):
        to_compare = 'foo'
        assumed = 'foo'
        self.assertEqual(to_compare, assumed)
For tests that require a database, use mocking:
https://docs.mongoengine.org/guide/mongomock.html
Step by step
Install mongomock: pip install mongomock
Write test:
from mongoengine import connect, disconnect, Document, StringField

class Foo(Document):
    content = StringField()

class TestFoo(NoSQLTestCase):
    @classmethod
    def setUpClass(cls):
        # use the default connection alias so Foo picks it up
        connect('mongoenginetest', host='mongomock://localhost')

    @classmethod
    def tearDownClass(cls):
        disconnect()

    def test_thing(self):
        foo = Foo(content='bar')
        foo.save()
        fresh_foo = Foo.objects().first()
        assert fresh_foo.content == 'bar'
Testing Django while using MongoDB can be done by creating a custom test case whose setUp connects to MongoDB using mongoengine, and whose tearDown drops the testing database and disconnects.
from django.conf import settings
from django.test import TestCase
import mongoengine

class MongoTestCase(TestCase):
    def setUp(self):
        mongoengine.connection.disconnect()
        mongoengine.connect(
            host=settings.MONGO['host'],
            port=settings.MONGO['port'],
            db=settings.MONGO['db'],
            username=settings.MONGO['username'],
            password=settings.MONGO['password']
        )
        super().setUp()

    def tearDown(self):
        from mongoengine.connection import get_connection, disconnect
        connection = get_connection()
        connection.drop_database(settings.MONGO['db'])
        disconnect()
        super().tearDown()
Then you can use this test case in your tests. For example, assuming we have a model named 'App':
from models import App

class AppCreationTest(MongoTestCase):
    def test(self):
        app = App(name="first_app")
        app.save()
        assert App.objects.first().name == app.name
You can run these tests with python manage.py test.
Here is a small gist for your reference
I have a project and I've defined my db.py module as:
app = get_global_flask_app()
app.config['SQLALCHEMY_DATABASE_URI'] = "postgresql://foo:bar@127.0.0.1:5432/test"
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False
db = SQLAlchemy(app)
db.create_all()
Then I import db from db.py into the modules that need to query the database and insert data (db.session.query()).
However, this means that when I write test code (pytest) for any module that imports db.py, I need to define SQLALCHEMY_DATABASE_URI. One solution is to make db a lazy attribute, so that the code above only executes in tests if the database is actually used/tested. Is there a common design pattern for Flask + SQLAlchemy + SQLALCHEMY_DATABASE_URI that I'm missing? How would you solve this problem? Flask config?
The way we normally solve this problem is with an application factory and a config.
This means you have a function somewhere in your project that looks something like this (taken from the documentation with modifications):
def create_app(config_filename=None, settings_override=None):
    app = Flask(__name__)
    if config_filename:
        app.config.from_pyfile(config_filename)
    app.config.from_object(settings_override)

    from yourapplication.model import db
    db.init_app(app)

    from yourapplication.views.admin import admin
    from yourapplication.views.frontend import frontend
    app.register_blueprint(admin)
    app.register_blueprint(frontend)

    return app
Then (and hopefully you're using pytest) in your root test directory, you have a conftest file which automatically prepares your test environment something like this:
import pytest

from your_project import create_app

class TestConfig:
    SQLALCHEMY_DATABASE_URI = "postgresql://foo:bar@127.0.0.1:5432/test"
    SQLALCHEMY_TRACK_MODIFICATIONS = False
    # any other settings...

@pytest.fixture(autouse=True)
def app(request):
    app = create_app(settings_override=TestConfig)
    ctx = app.app_context()
    ctx.push()

    def teardown():
        ctx.pop()

    request.addfinalizer(teardown)
    return app
Typically, we create another fixture (also autouse=True) that handles DB set-up, flushing, and possibly loading fixtures, and we only use it in tests which need to access the DB (integration or functional tests). In practice that simply means including it in a conftest file in the same directory as our integration tests.
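A sketch of what such a DB fixture might look like, written as a plain generator so the setup/teardown order is visible. The FakeDB class is a hypothetical stand-in for Flask-SQLAlchemy's db object; under pytest the function would be decorated with @pytest.fixture(autouse=True):

```python
calls = []  # records the order of events so we can verify it below

class FakeDB:
    """Stand-in for flask_sqlalchemy's db object (hypothetical)."""
    def create_all(self):
        calls.append("create_all")
    def drop_all(self):
        calls.append("drop_all")

db = FakeDB()

def db_fixture():
    db.create_all()   # build a fresh schema before each test
    yield db          # the test body runs here
    db.drop_all()     # tear the schema down afterwards

# Drive the generator the way pytest drives a yield fixture:
gen = db_fixture()
next(gen)                     # setup phase
calls.append("test body")     # the test itself
try:
    next(gen)                 # teardown phase
except StopIteration:
    pass

assert calls == ["create_all", "test body", "drop_all"]
```

The key property is that teardown runs even when it is driven by the fixture machinery, so every test starts from a clean schema.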
Instead of flask-peewee I'm using plain peewee package.
Here's the way I'm initializing the database:
import os

# just extending the base Flask with a yaml config reader
from .extensions.flask_app import Flask
# peewee's wrapper around the database
from playhouse.flask_utils import FlaskDB

db_wrapper = FlaskDB()

# define the application factory
def create_app(env):
    app = Flask(__name__)
    # load config depending on the environment
    app.config.from_yaml(os.path.join(app.root_path, 'config.yml'), env)
    # init extensions
    db_wrapper.init_app(app)
    # ...
I know that I should call this to create tables:
from .models import User

db_wrapper.database.connect()
db_wrapper.database.create_tables([User])
But where do I put the table creation code, so that the database would be already initialized?
Edit
Looking at the docs I found out that I can use User.create_table(fail_silently=True) like that:
# in app/__init__.py

# define the application factory
def create_app(env):
    app = Flask(__name__)
    # load config depending on the environment
    app.config.from_yaml(os.path.join(app.root_path, 'config.yml'), env)
    # init extensions
    db_wrapper.init_app(app)
    create_tables()
    # rest of the initialization

def create_tables():
    from .models import User
    User.create_table(fail_silently=True)
Is it alright to do it here? Or is there a better way/tool for this?
Edit
Figured it out. Please, see my answer below.
Update
I didn't know about the built-in CLI support in Flask. Consider whether you need this approach at all, since you can do these things out of the box (see the documentation).
I can utilize the flask-script package; I've done it before, I just overlooked it.
Activate your virtual environment and run:
pip install flask-script
Then create manage.py file in your root directory, add these lines:
import os
import unittest

from app import create_app, db_wrapper
from app.models import *
from flask_script import Manager, Shell

# create the application instance
app = create_app(os.getenv('FLASK_ENV', 'development'))

# instantiate the script manager
manager = Manager(app)

def make_shell_context():
    return dict(app=app, db_wrapper=db_wrapper, User=User)

@manager.command
def run_shell():
    Shell(make_context=make_shell_context).run(no_ipython=True, no_bpython=True)

# here's my simple command
@manager.command
def create_tables():
    User.create_table(fail_silently=True)

@manager.command
def run_tests():
    tests = unittest.TestLoader().discover('tests')
    unittest.TextTestRunner(verbosity=2).run(tests)

# run it
if __name__ == '__main__':
    manager.run()
In order to simplify the main __init__.py module, I want to push helper functionality into a different file/class. This requires passing many Flask extension instances when initializing the class, which seems inelegant. My current structure is as follows:
__init__.py:
from flask import Flask, render_template, request
from flask.ext.sqlalchemy import SQLAlchemy
from flask_mail import Mail
from FEUtils import FEUtils
# .. and more imports of various extensions ..

db = SQLAlchemy()
app = Flask(__name__)
db.init_app(app)
mail = Mail(app)
fe_utils = FEUtils(db, mail, app.config)

# Flask code..

if __name__ == '__main__':
    app.run()
and FEUtils.py:
from models import User

class FEUtils(object):
    def __init__(self, db, mail, config):
        self.session = db.session  # to access the database
        self.mail = mail           # to send emails
        self.config = config      # to access the app config dictionary

    def count_users(self):  # example helper method
        return self.session.query(User).count()
This all works fine, but seems cumbersome. I'd like the helper class to inherit the various extension instances from the main module, and be able to access the flask config parameters from within the helper class, without passing each when the helper class is instantiated.
Asked differently: is there an elegant way to have the helper class behave as if each of its methods were defined in the main module?