Multiple files sharing one Main instance - Python

Multiple files in a project:
db.py
Sets up and handles the firebase_admin connection
model.py
Abstract class for model CRUD
session.py, config.py, ...
Classes that extend Model
main.py
Imports the other files and coordinates them
main.py, basically:

import db

class App():
    ...

main = App()
main.db = db.start(options)
model.py, basically:

from abc import abstractmethod, ABC

class Model(ABC):
    ...

    @abstractmethod
    def gen_key(self):
        pass

    def create(self):
        main.db.users_ref.set({
            self.gen_key(): self.output_data()
        })
but model.py cannot access main.
I tried

global main
main = App()

in main.py; it doesn't work.
I also tried import main in model.py, but got a circular import error.
Do I need to pass main to every single instance? That seems troublesome.
I made model.py because I'm trying to make the different parts of Firebase access more consistent. I learned this pattern from the PHP Yii framework; it's for MySQL, but it makes sense to me, so I copied the approach.
You can see a gen_key in Model. I want every "table" to have a gen_key method.
For example, Session.gen_key will return self.date, and Config.gen_key will return self.type + "_" + self.date.
With that, I can just define the key-generation rule and run:

record = Session()
record.date = today
record.save()

Then things would automatically organise themselves in Firebase.
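One common way around the circular import is to move the shared state into a third module that both main.py and model.py import, so neither has to import the other. Below is a minimal single-file sketch of that idea (module boundaries marked in comments; the Registry name and the string stand-in for the db connection are hypothetical):

```python
# registry.py (hypothetical) -- shared-state module imported by both
# main.py and model.py, which breaks the main <-> model import cycle
class Registry:
    app = None  # filled in once at startup by main.py

# model.py -- looks the shared instance up at call time, not import time
class Model:
    def create(self):
        return Registry.app.db  # resolved only when create() runs

# main.py -- wires everything up once
class App:
    pass

Registry.app = App()
Registry.app.db = "firebase-connection"  # stand-in for db.start(options)

print(Model().create())  # firebase-connection
```

The key point is that Model reads Registry.app inside a method body, so the attribute is only looked up after main.py has populated it.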

Django setUpTestData does not deepcopy related files

I want to use Django's setUpTestData to prepare some heavy data shared between multiple unit tests. Instances of my model are created with some file fields.
However, from one test to another, the file contents are not renewed, which causes side effects between my tests.
Here is a minimal example:
models.py

from django.db import models

class MyModel(models.Model):
    my_file = models.FileField(upload_to="tests/")
test.py

from django.core.files.base import ContentFile
from django.test import TestCase
from core.models import MyModel

class Test(TestCase):
    @classmethod
    def setUpTestData(cls):
        cls.instance = MyModel()
        cls.instance.my_file.save("file.txt", ContentFile("Hello from setUpTestData"))

    def test_a(self):
        with open(self.instance.my_file.path, "r") as fp:
            self.assertEqual(fp.read(), "Hello from setUpTestData")
        self.instance.my_file.save("file.txt", ContentFile("Modified data"))

    def test_b(self):
        with open(self.instance.my_file.path, "r") as fp:
            self.assertEqual(fp.read(), "Hello from setUpTestData")
        self.instance.my_file.save("file.txt", ContentFile("Modified data"))
Running either test alone works; however, running one after the other fails:
AssertionError: 'Modified datatUpTestData' != 'Hello from setUpTestData'
- Modified datatUpTestData
+ Hello from setUpTestData
How can I ensure that the files are correctly reset? Am I affected by these lines from the documentation?
Objects assigned to class attributes in setUpTestData() must support creating deep copies with copy.deepcopy()
I feel like FileField should be handled by Django by default, but it doesn't work. What should I do? Should I try to override __deepcopy__ for my models? Modifying my code for testing purposes is a bad pattern.
I've found a solution using setUp and tearDown:

import shutil

from django.conf import settings

def setUp(self):  # copy the original files aside
    shutil.copytree(settings.MEDIA_ROOT, "/tmp/tests", dirs_exist_ok=True)

def tearDown(self):  # restore the original files
    shutil.copytree("/tmp/tests", settings.MEDIA_ROOT, dirs_exist_ok=True)
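This works because copytree snapshots the media tree before each test and restores it afterwards. A self-contained sketch of the same mechanism, using throwaway temporary directories in place of settings.MEDIA_ROOT and "/tmp/tests" (requires Python 3.8+ for dirs_exist_ok):

```python
import pathlib
import shutil
import tempfile

media = pathlib.Path(tempfile.mkdtemp())             # stand-in for settings.MEDIA_ROOT
backup = pathlib.Path(tempfile.mkdtemp()) / "tests"  # stand-in for "/tmp/tests"

(media / "file.txt").write_text("Hello from setUpTestData")

shutil.copytree(media, backup, dirs_exist_ok=True)  # setUp: snapshot the originals
(media / "file.txt").write_text("Modified data")    # a test mutates the file
shutil.copytree(backup, media, dirs_exist_ok=True)  # tearDown: restore the snapshot

print((media / "file.txt").read_text())  # Hello from setUpTestData
```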

Dependency Injection problem with FastAPI on Python

Good day! Please tell me how to solve the following problem with Python + FastAPI.
There is a test project:

app/main.py - main file
app/routes/users.py - set of API methods
app/repos/factory.py - repository factory
app/repos/user_repository.py - repositories
app/handlers/factory.py - handler factory
app/handlers/users.py - handlers
app/domain/user.py - data class

The main and routes structure is the same as in the example https://fastapi.tiangolo.com/tutorial/bigger-applications/
In routes/users.py:

from fastapi import APIRouter, Depends
from ..handlers import factory

router = APIRouter()

@router.get("/users/", tags=["users"])
def read_users(handler=Depends(factory.get_handler)):
    return handler.get_all()
In handlers/factory.py:

from fastapi import Depends
from .users import UserHandler1

def get_handler(handler=Depends(UserHandler1)):
    return handler
In handlers/users.py:

from fastapi import Depends
from ..repos import factory

class UserHandler1:
    def __init__(self):
        pass

    def get_all(self, repo=Depends(factory.get_repo)):
        return repo.get_all()
repos/factory.py:

from fastapi import Depends
from ..repos.user_repository import UserRepository

def get_repo(repo=Depends(UserRepository)):
    return repo
repos/user_repository.py:

from ..domain.user import User

class UserRepository:
    def __init__(self):
        pass

    def get_all(self):
        return [User(1, 'A'), User(2, 'B'), User(3, 'C')]
domain/user.py:

class User:
    id: int
    name: str

    def __init__(self, id, name):
        self.id = id
        self.name = name
Then I run the hypercorn server with app.main:app --reload
and call the API method http://127.0.0.1:8000/users/,
which gives the error AttributeError: 'Depends' object has no attribute 'get_all'.
If I remove the handlers layer and do this, then everything works:
routes/users.py:

from fastapi import APIRouter, Depends
from ..repos import factory

router = APIRouter()

@router.get("/users/", tags=["users"])
def read_users(repo=Depends(factory.get_repo)):
    return repo.get_all()
It also works if I completely remove all Depends and create UserRepository and UserHandler1 directly in the factories.
Question 1: How do I use Depends in this case, and why doesn't it work?
In general, the factory does not look like a good solution to this problem. I saw an example of a DI implementation using multiple inheritance, but to me it is the same as the factory method.
I also tried the Pinject library, but it requires building a dependency graph up front, which needs to be saved somewhere so it can be accessed in the API handlers.
Question 2 (more important): How can dependency injection be applied in this case?
As noted in the comments, a dependency can be anything that is callable, and thus a class as well; otherwise, the __call__ method must be implemented. The only caveat with a class dependency is that the class will only be initialized (i.e. only the __init__(...) method will be called).
So, in order to have a class as a dependency, as in the example at https://fastapi.tiangolo.com/tutorial/dependencies/classes-as-dependencies/#shortcut, you just need to call the target functions within __init__ and set the values as attributes of the instance.
repos/user_repository.py:

from ..domain.user import User

class UserRepository:
    def __init__(self):
        self.get_all()

    def get_all(self):
        self.users = [User(1, 'A'), User(2, 'B'), User(3, 'C')]

repos/factory.py:

from fastapi import Depends
from ..repos.user_repository import UserRepository

def get_repo(repo=Depends(UserRepository)):
    print(repo.users)  # this will print the list of users
    return repo
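To see why the original version failed, note that FastAPI only resolves Depends markers it finds while inspecting endpoint and dependency signatures; in an ordinary method call the default value remains the marker object itself. A pure-Python stand-in (not real FastAPI, all names local to this sketch) demonstrating the mechanism:

```python
class Depends:  # minimal stand-in for fastapi.Depends
    def __init__(self, dependency):
        self.dependency = dependency

def get_repo():
    return "a real repository"

class UserHandler1:
    def get_all(self, repo=Depends(get_repo)):
        # no framework substituted the dependency here, so repo is
        # still the marker object -- hence the AttributeError
        return repo

result = UserHandler1().get_all()
print(type(result).__name__)  # Depends
```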
QUESTION 2
NB: this is a modelling question. Here I propose what I believe is a suitable approach from my point of view; it does not necessarily have to be the best or simplest one.
Answering your second question, I would not advise such complex dependency chains. If the dependencies are at the router level, you can simply add them to the router using the dependencies=[...] parameter and providing a list of dependency classes/functions.
Alternatively, you could declare all of the dependencies as function parameters of the endpoint, as you did for the factory. This method may lead to big chunks of code being copied and pasted, so I advise the approach above.
If you need to process the data parameters, then you add them to the request and access them from within the endpoint. See FastAPI get user ID from API key for a minimal example.

Import Python Module to Class Variable

I am trying to import a class in Python and, after importing it, set a class variable to the imported class. I have searched Google as well as Stack Overflow for an answer but have not been able to find one.
For example:

DB.py:

class DB:
    def __init__(self):
        # __init__ sets up the db connection
        ...

    def FetchAll(self):
        # FetchAll fetches all records from the database
        ...

Ex.py:

class Ex:
    def __init__(self):
        import DB.py as DB
        self.db = DB.DB()

    def FetchAll(self):
        result_set = self.db.FetchAll()

I would like to be able to access FetchAll() in the DB class from the Ex class through a variable. I know this is possible in PHP by using the "protected" keyword in a class.
Thank you for any help you can offer.
Just
import DB
You provide a module name (that can be located given the current module search path), not a file name.
Check the tutorial for a beginner’s overview of how modules and imports work.
You can either just use the name of the module as the class variable:

import DB

class Ex:
    # define methods
    # call FetchAll with DB.DB().FetchAll()

Or bind a different name with the 'as' keyword:

import DB as x

class Ex:
    # define methods
    # call FetchAll with x.DB().FetchAll()

And as the others are saying, import all modules at the top of the script.
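As a runnable analogy, with a stdlib module standing in for the hypothetical DB module, storing the import under a class variable looks like this:

```python
import json as x  # stands in for "import DB as x"

class Ex:
    db = x  # the imported module stored as a class variable

    def fetch_all(self):
        # call through the class variable, analogous to self.db.FetchAll()
        return self.db.loads('[1, 2, 3]')

print(Ex().fetch_all())  # [1, 2, 3]
```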

Flask initialisation for unit test and app

I've got a Flask application that I'd like to run some unit tests on. To do so, I create a new Flask object, initialise blueprints, SQLAlchemy and a few other packages, and execute the test case.
However, I've noticed that some endpoints on the test Flask object are missing, which made me wonder how initialisation is generally handled in Flask.
Taking a minimal example, an endpoint would be created like so:
from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello World!'

if __name__ == '__main__':
    app.run()
If I were to create a new Flask object somewhere in my test case's setUp method, it would most certainly not contain the route '/', as that route was registered on another Flask object (the one from the file above). So my question is: how should a test case be written, and how is initialisation meant to work in general? Somewhere I read that one should avoid initialisation at import time (i.e. at module level), but this seems impossible if decorators are used.
You don't create a new Flask object in your test cases; you import your existing app instead.
In many project setups you have already added all your extensions to that app. In many of mine I have a factory method that takes a configuration object and returns the fully initialized app object; I use this to create a base test case:
import unittest

import project

class Config(object):
    DEBUG = False
    TESTING = True
    CACHE_NO_NULL_WARNING = True  # silence Flask-Cache warning
    SECRET_KEY = 'SECRET_KEY'

class ProjectCoreViewCase(unittest.TestCase):
    """Base test case for the Project core app"""
    def setUp(self, **kwargs):
        config = Config()
        config.__dict__.update(kwargs)
        app = project.create_app(config)
        self.app = app.test_client()
Any tests can then use self.app as the test client.
This is a base test case you'd inherit from; the setUp() method allows additional configuration to be set by passing keyword arguments to a super() call:
class ConcreteTestCase(ProjectCoreViewCase):
    def setUp(self):
        super(ConcreteTestCase, self).setUp(
            SQLALCHEMY_DATABASE_URI='your_test_specific_connection_uri',
        )
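The project.create_app factory assumed above can be sketched framework-agnostically. The App class and route registration below are hypothetical stand-ins for flask.Flask, but copying uppercase attributes mirrors what Flask's config.from_object does:

```python
class App:
    """Minimal stand-in for flask.Flask, just enough to show the factory shape."""
    def __init__(self):
        self.config = {}
        self.routes = {}

    def route(self, rule):
        def decorator(fn):
            self.routes[rule] = fn  # register the view under its URL rule
            return fn
        return decorator

def create_app(config):
    app = App()
    # copy UPPERCASE attributes from the config object, as from_object does
    app.config.update({k: getattr(config, k) for k in dir(config) if k.isupper()})

    @app.route('/')
    def hello_world():
        return 'Hello World!'

    return app

class Config:
    TESTING = True

app = create_app(Config())
print(app.config['TESTING'])  # True
print(app.routes['/']())      # Hello World!
```

Because the factory builds a fresh, fully routed app each time it is called, each test run gets a complete application instead of a half-initialized module-level object.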

Python - Accessing subclasses' variables from parent class while calling classmethods

I'm trying to build a sort of "mini Django model" for working with Django and MongoDB without using the Django norel dist (I don't need ORM access for these...).
So, what I'm trying to do is mimic the standard behaviour, or "implementation", of Django's default models. That's what I've got so far:
File "models.py" (the base):

from django.conf import settings
import pymongo

class Model(object):
    @classmethod
    def db(cls):
        db = pymongo.Connection(settings.MONGODB_CONF['host'], settings.MONGODB_CONF['port'])

    @classmethod
    class objects(object):
        @classmethod
        def all(cls):
            db = Model.db()  # not using this yet... not even sure it's the best way to do it
            print Model.collection
File "mongomodels.py" (the implementation):

from mongodb import models

class ModelTest1(models.Model):
    database = 'mymongodb'
    collection = 'mymongocollection1'

class ModelTest2(models.Model):
    database = 'mymongodb'
    collection = 'mymongocollection2'
File "views.py" (the view):

from mongomodels import ModelTest1, ModelTest2

print ModelTest1.objects.all()  # should print 'mymongocollection1'
print ModelTest2.objects.all()  # should print 'mymongocollection2'

The problem is that it's not accessing the variables from ModelTest1, but from the original Model. What's wrong?
You must give objects some sort of link to the class that contains it. Currently, you are just hard-coding it to use Model's attributes. Because you are not instantiating these classes, you will have to use either a decorator or a metaclass to create the objects class for you in each subclass of Model.
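One way to realize that advice in modern Python without writing a full metaclass, sketched here with hypothetical Manager/objects names, is __init_subclass__: each subclass gets its own manager object carrying a link back to its owner class:

```python
class Manager:
    def __init__(self, owner):
        self.owner = owner  # link back to the concrete model class

    def all(self):
        return self.owner.collection

class Model:
    collection = None

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        cls.objects = Manager(cls)  # each subclass gets its own manager

class ModelTest1(Model):
    collection = 'mymongocollection1'

class ModelTest2(Model):
    collection = 'mymongocollection2'

print(ModelTest1.objects.all())  # mymongocollection1
print(ModelTest2.objects.all())  # mymongocollection2
```

Because the manager is created per subclass at class-definition time, each one resolves attributes like collection on the right model rather than on the base Model.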
