Django: How to create a model instance in the test db using an object factory?

I want to keep my unit tests' setUp function free of tons of repeated model-creation lines like 1) create a user, 2) now create an employee with an FK to this user, and so on.
In order to do that I've made a simple factory of dummy objects, but I might have made some mistakes or just misunderstood something. Here's a piece of the factory (dummy_data is just a bunch of dicts):
from abc import ABC
from users.models import User
from employees.models import Employee
from .dummy_data import (
    user_data,
    employee_data,
)

class DummyObjectFactory(ABC):
    """Factory of dummy test objects"""
    def get_dummy_object(self):
        """Return dummy object"""

class DummyUser(DummyObjectFactory):
    def get_dummy_object(self) -> User:
        return User.objects.create_user(**user_data)

class DummyEmployee(DummyObjectFactory):
    def get_dummy_object(self) -> Employee:
        user = DummyUser().get_dummy_object()
        return Employee.objects.create(**employee_data, user=user)

dummy_factory = {
    "User": DummyUser().get_dummy_object(),
    "Employee": DummyEmployee().get_dummy_object(),
}
dummy_factory = dot_dict(dummy_factory)
Then I wrap all of these factories in a dot-notation dictionary (dot_dict) so they can be called as dummy_factory.Name. My intention was to call the factory with the desired model name and have it create an instance of that model.
The problem: when I call it in a test's setUp method like test_user = dummy_factory.User, the object is created in the actual database, but I want it created in the test database.
Example of test:
class TestEmployeesListView(TestCase):
    def setUp(self):
        self.test_user = dummy_factory.User
        self.test_employee = dummy_factory.Employee
        self.client = Client()

    def test_listview_deny_anonymous_user(self):
        response = self.client.get(reverse('employees:employees-list'))
        self.assertRedirects(response, '/login/?next=/employees/')
Yes, I've searched for a solution and found the Factory Boy and Faker libraries, but I want to finish my own factory and make it work properly. Thanks for your attention.

So I made it work. What I did was:
Added the @abstractmethod decorator in the abstract class.
Every concrete factory method must have the @classmethod decorator and receive cls as an argument:
class DummyUser(DummyObjectFactory):
    @classmethod
    def get_dummy_object(cls) -> User:
        return User.objects.create_user(**user_data)
It now works as intended: the factory creates objects in the test db. Thank you folks for participating.
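For anyone landing here later, a minimal sketch of how the factory mapping can stay lazy (my own illustration, not the exact code above): store the classmethods themselves in the dict and call them inside setUp, so nothing touches the database at import time and every object is created inside the test transaction.

dummy_factory = {
    "User": DummyUser.get_dummy_object,        # note: no parentheses here
    "Employee": DummyEmployee.get_dummy_object,
}

class TestEmployeesListView(TestCase):
    def setUp(self):
        self.test_user = dummy_factory["User"]()        # created in the test DB
        self.test_employee = dummy_factory["Employee"]()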

Related

FactoryBoy is accessing normal DB instead of TEST DB

I'm trying to create some objects in the setUp method of a Django test case. I use FactoryBoy, which helps me with creating the objects, but it seems that FactoryBoy can't find any objects in the database.
factories.py
class ProductFactory(DjangoModelFactory):
    ...
    market_category = factory.fuzzy.FuzzyChoice(list(MarketplaceCategory.objects.all()))

    class Meta:
        model = Product
tests.py
from django.test import TestCase
from marketplaces.models import MarketplaceCategory

class MyTestCase(TestCase):
    def setUp(self) -> None:
        ...
        self.marketplace_category = MarketplaceCategoryFactory.create()
        print(MarketplaceCategory.objects.first().pk)  # prints 1
        self.product = ProductFactory(created_by=self.user)
As you can see, ProductFactory tries to populate Product.market_category with a random MarketplaceCategory object.
The problem is that the chosen object does not seem to exist, even though I created one beforehand and made sure it is in the db (it has a pk).
EDIT: It chose a MarketplaceCategory object with pk=25, but there is only one such object in the test db, with pk=1. I think it accesses the Django development DB instead of the test one.
The error:
psycopg2.errors.ForeignKeyViolation: insert or update on table "products_product" violates foreign key constraint "products_product_market_category_id_2d634517_fk"
DETAIL: Key (market_category_id)=(25) is not present in table "marketplaces_marketplacecategory".
Do you have any idea why it behaves this way? It looks like the Factory is accessing the real DB instead of testdb for some reason.
Defining the "market_category" field like that is going to cause issues: the queryset that populates the choices is executed at some arbitrary time, whenever the module is imported, and the instances returned may no longer exist by the time the factory runs. You should use a SubFactory:
class ProductFactory(DjangoModelFactory):
    market_category = factory.SubFactory(MarketplaceCategoryFactory)

    class Meta:
        model = Product
Alternatively, pass the queryset directly to FuzzyChoice to get a random existing value; don't convert it to a list:
class ProductFactory(DjangoModelFactory):
    market_category = factory.fuzzy.FuzzyChoice(MarketplaceCategory.objects.all())

    class Meta:
        model = Product
The SubFactory will then create a MarketplaceCategory instance whenever you create a product, but you can pass "market_category" to the factory to override it:
class MyTestCase(TestCase):
    def setUp(self) -> None:
        self.marketplace_category = MarketplaceCategoryFactory.create()
        self.product = ProductFactory(created_by=self.user, market_category=self.marketplace_category)
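(For completeness: the SubFactory option above assumes a MarketplaceCategoryFactory roughly like the following. The question doesn't show it, so the field here is a placeholder.)

import factory
from factory.django import DjangoModelFactory
from marketplaces.models import MarketplaceCategory

class MarketplaceCategoryFactory(DjangoModelFactory):
    name = factory.Sequence(lambda n: "Category %d" % n)  # placeholder field

    class Meta:
        model = MarketplaceCategory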

How to manage a peewee database in a separate module?

I want to have my database implementation in a separate module or class. But I am struggling with a few details. A simple example:
from peewee import *

db = SqliteDatabase(':memory:')

class BaseModel(Model):
    class Meta:
        database = db

class User(BaseModel):
    name = CharField()

db.connect()
db.create_tables([User,])
db.commit()

@db.atomic()
def add_user(name):
    User.create(name=name).save()

@db.atomic()
def get_user(name):
    return User.get(User.name == name)
So far this is working fine. I can implement my interface to the database here and import this as a module.
Now I want to be able to choose the database file at runtime. So I need a way to define the Model classes without defining SqliteDatabase('somefile') before. I tried to encapsulate everything in a new Database class, which I can later import and create an instance from:
from peewee import *

class Database:
    def __init__(self, dbfile):
        self.db = SqliteDatabase(dbfile)

        class BaseModel(Model):
            class Meta:
                database = self.db

        class User(BaseModel):
            name = CharField()

        self.User = User
        self.db.connect()
        self.db.create_tables([User,])
        self.db.commit()

    @self.db.atomic()  # Error as self is not known on this level
    def add_user(self, name):
        self.User.create(name=name).save()

    @self.db.atomic()  # Error as self is not known on this level
    def get_user(self, name):
        return self.User.get(self.User.name == name)
Now I can call, for example, database = Database('database.db') or choose any other file name. I can even use multiple database instances in the same program, each with its own file.
However, there are two problems with this approach:
I still need to specify the database driver (SqliteDatabase) before defining the Model classes. To solve this I define the Model classes within the __init__() method and then create an alias with self.User = User. I don't really like this approach (it just doesn't feel like neat code), but at least it works.
I cannot use the @db.atomic() decorator, since self is not known at class level; I would need an instance here.
So this class approach does not seem to work very well. Is there some better way to define the Model classes without having to choose where you want to store your database first?
If you need to change the database driver at runtime, then Proxy is the way to go:
# database.py
import peewee as pw

proxy = pw.Proxy()

class BaseModel(pw.Model):
    class Meta:
        database = proxy

class User(BaseModel):
    name = pw.CharField()

def add_user(name):
    with proxy.atomic() as txn:
        User.create(name=name).save()

def get_user(name):
    with proxy.atomic() as txn:
        return User.get(User.name == name)
From now on, each time you load the module it won't need a database to be initialized. Instead, you can initialize one at runtime and switch between multiple databases as follows:
# main.py
import peewee as pw
import database as db

sqlite_1 = pw.SqliteDatabase('sqlite_1.db')
sqlite_2 = pw.SqliteDatabase('sqlite_2.db')

db.proxy.initialize(sqlite_1)
sqlite_1.create_tables([db.User], safe=True)
db.add_user(name="Tom")

db.proxy.initialize(sqlite_2)
sqlite_2.create_tables([db.User], safe=True)
db.add_user(name="Jerry")
But if the connection is the only thing that matters, then the database's init() method will be enough.
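For completeness, a rough sketch of that deferred-initialization route (same model layout as above; treat it as illustrative rather than the answer's exact code):

# database.py
import peewee as pw

db = pw.SqliteDatabase(None)  # deferred: no file name chosen yet

class BaseModel(pw.Model):
    class Meta:
        database = db

class User(BaseModel):
    name = pw.CharField()

# main.py
import database

database.db.init('chosen_at_runtime.db')  # pick the file at runtime
database.db.connect()
database.db.create_tables([database.User], safe=True)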
Now I want to be able to choose the database file at runtime. So I need a way to define the Model classes without defining SqliteDatabase('somefile') before. I tried to encapsulate everything in a new Database class, which I can later import and create an instance from
Peewee uses the Meta class to define the table name (Model.Meta.db_table) and the database (Model.Meta.database).
Set these attributes before calling any Model-specific code (either to create a table or to run DML statements): 'Allow to define database dynamically'
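Roughly, the idea looks like this (a hedged sketch against the User model defined earlier; newer peewee versions also offer database.bind() / Model.bind() for the same purpose):

from peewee import SqliteDatabase

db = SqliteDatabase('chosen_at_runtime.db')   # chosen at runtime
User._meta.database = db                      # rebind the already-defined model
db.connect()
db.create_tables([User], safe=True)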
Question: I cannot use the @db.atomic() decorator since self is not known at class level
Do it the same way you do it with self.User.
I wonder about atomic() rather than atomic, but you say it is working fine.
class Database:
    def __init__(self, dbfile):
        self.db = SqliteDatabase(dbfile)
        ...

        @self.db.atomic()
        def __add_user(self, name):
            self.User.create(name=name).save()
        self.add_user = __add_user

        @self.db.atomic()
        def __get_user(self, name):
            return self.User.get(self.User.name == name)
        self.get_user = __get_user
Related: Define models separately from Database() initialization

Inheriting setUp method Python Unittest

I have a question regarding unittest with Python! Let's say that I have a docker container set up that handles a specific api endpoint (let's say users, ex: my_site/users/etc/etc/etc). There are quite a few different layers that are broken up and handled for this container. Classes that handle the actual call and response, logic layer, data layer. I am wanting to write tests around the specific calls (just checking for status codes).
There are a lot of different classes that act as handlers for the given endpoints. There are a few things that I would have to set up differently for each one; however, each one inherits from Application and uses some methods from it. I want to write a setUp class for my unittest so I don't have to re-establish this each time. Any advice will help. So far I've mainly seen that inheritance is a bad idea with testing; however, I only want to use it for setUp. Here's an example:
class SetUpClass(unittest.TestCase):
    def setUp(self):
        self._some_data = data_set.FirstOne()
        self._another_data_set = data_set.SecondOne()

    def get_app(self):
        config = Config()
        return Application(config,
                           first_one=self._some_data,
                           second_one=self._another_data_set)

class TestFirstHandler(SetUpClass, unittest.TestCase):
    def setUp(self):
        new_var = something

    def tearDown(self):
        pass

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users')
        self.assertEqual(res.code, 200)

class TestSecondHandler(SetUpClass, unittest.TestCase):
    def setUp(self):
        different_var_thats_specific_to_this_handler = something_else

    def tearDown(self):
        pass

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users/account/?something_custom={}'.format('WOW'))
        self.assertEqual(res.code, 200)
Thanks again!!
As mentioned in the comments, you just need to learn how to use super(). You also don't need to repeat TestCase in the list of base classes.
Here's the simple version for Python 3:
class TestFirstHandler(SetUpClass):
    def setUp(self):
        super().setUp()
        new_var = something

    def tearDown(self):  # Easier to not declare this if it's empty.
        super().tearDown()

    def test_this_handler(self):
        # This specific handler needs the application to function
        # but I don't want to define it in this test class
        res = self.fetch('some_url/users')
        self.assertEqual(res.code, 200)

Python - Accessing subclasses' variables from parent class while calling classmethods

I'm trying to build sort of a "mini Django model" for working with Django and MongoDB without using the Django nonrel dist (I don't need ORM access for these...).
So, what I'm trying to do is mimic the standard behavior or "implementation" of Django's default models... that's what I've got so far:
File "models.py" (the base)
from django.conf import settings
import pymongo

class Model(object):

    @classmethod
    def db(cls):
        db = pymongo.Connection(settings.MONGODB_CONF['host'], settings.MONGODB_CONF['port'])

    @classmethod
    class objects(object):

        @classmethod
        def all(cls):
            db = Model.db()  # Not using yet... not even sure if that's the best way to do it
            print Model.collection
File "mongomodels.py" (the implementation)
from mongodb import models

class ModelTest1(models.Model):
    database = 'mymongodb'
    collection = 'mymongocollection1'

class ModelTest2(models.Model):
    database = 'mymongodb'
    collection = 'mymongocollection2'
File "views.py" (the view)
from mongomodels import ModelTest1, ModelTest2

print ModelTest1.objects.all()  # Should print 'mymongocollection1'
print ModelTest2.objects.all()  # Should print 'mymongocollection2'
The problem is that it's not accessing the variables from ModelTest1, but from the original Model... what's wrong??
You must give objects some sort of link to the class that contains it. Currently, you are just hard-coding it to use Model's attributes. Because you are not instantiating these classes, you will have to use either a decorator or a metaclass to create the objects class for you in each subclass of Model, as sketched below.
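To make that hint concrete, here is a rough metaclass sketch (written for Python 3 with the modern pymongo MongoClient; the Manager name and its all() body are my own illustration, not the asker's code):

from django.conf import settings
import pymongo

class Manager(object):
    """Per-model query helper; holds a reference to the concrete model class."""
    def __init__(self, model):
        self.model = model

    def all(self):
        client = pymongo.MongoClient(settings.MONGODB_CONF['host'],
                                     settings.MONGODB_CONF['port'])
        # self.model is the subclass, so its own 'database' and
        # 'collection' attributes are used here, not Model's.
        return list(client[self.model.database][self.model.collection].find())

class ModelMeta(type):
    def __init__(cls, name, bases, attrs):
        super(ModelMeta, cls).__init__(name, bases, attrs)
        if bases:  # skip the abstract base itself
            cls.objects = Manager(cls)

class Model(metaclass=ModelMeta):
    pass

class ModelTest1(Model):
    database = 'mymongodb'
    collection = 'mymongocollection1'

ModelTest1.objects.all()  # queries mymongodb.mymongocollection1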

GAE/Python - What is the best way to pass the request from the controller to the model?

In the following example code I illustrate how I solved passing the request from the controller to the model. Is there a better way to do this?
models/MyModel.py
from google.appengine.ext import db

class MyModel(db.Model):
    a = db.StringProperty(required=True)
    b = db.StringProperty(required=True)
    c = db.StringProperty(required=True)

class Update:
    def __init__(self, x):
        self.x = x

    def do_something(self):
        myquery = db.GqlQuery('SELECT * FROM MyModel WHERE a = :1', self.x)
        v = myquery.get()
        if not v:
            somevalue = MyModel(a = self.request.get('a'),  # Doesn't work unless
                                b = self.request.get('b'),  # the request has been
                                c = self.request.get('c'))  # passed
            somevalue.put()
controllers/MyController.py
import webapp2
from models.MyModel import Update

class WebHandler(webapp2.RequestHandler):
    def get(self):
        var_x = "string"
        var_y = "string"
        z = Update(var_x)
        z.request = self.request  # <--- Is there a better way?
        z.do_something()
Edit (5/30/2012):
I ended up creating a function to pass the request variables to the Model as arguments. I was hoping there was some Python magic that would avoid all the repetition, but I didn't find any.
Handing over the request to the model means you're mixing model and controller.
Normally the model shouldn't even be aware that there is a request. Getting the parameters from the request is the controller's job, not the model's. If your do_something method has to work with parameters, then it should receive them as arguments and return a result (usually an entity); see the sketch below.
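A sketch of that idea, reusing the names from the question (the exact signature is illustrative, not prescriptive):

# models/MyModel.py -- the model layer sees plain values, never the request
from google.appengine.ext import db

class Update:
    def __init__(self, x):
        self.x = x

    def do_something(self, a, b, c):
        v = db.GqlQuery('SELECT * FROM MyModel WHERE a = :1', self.x).get()
        if not v:
            MyModel(a=a, b=b, c=c).put()

# controllers/MyController.py -- the controller unpacks the request
import webapp2
from models.MyModel import Update

class WebHandler(webapp2.RequestHandler):
    def get(self):
        z = Update("string")
        z.do_something(self.request.get('a'),
                       self.request.get('b'),
                       self.request.get('c'))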
You need a little more encapsulation. Another level of abstraction is useful.
Create a class representing the actual thing that you're using MyModel to persist, with a, b, and c attributes or properties:
class Thing(object):
    def __init__(self, a, b, c):
        self.a = a
        self.b = b
        self.c = c
This class doesn't know anything about requests, or the datastore, or persistence. It's where all the actual business logic surrounding the thing itself lives.
You can instantiate this class without needing a request or a database, and you can unit test all of the business logic without having to mock out any of GAE's plumbing.
You can then write code in your request handler that instantiates this thing, e.g.:
def get(request):
    thing = Thing(a=request.get('a'), b=request.get('b'), c=request.get('c'))
    u = Update(some_params)
    u.do_something(thing)
You can add a class method to MyModel that puts a thing, e.g.:
@classmethod
def put_thing(cls, thing):
    model = MyModel(a=thing.a, b=thing.b, c=thing.c)
    model.put()
and then your do_something logic can just call MyModel.put_thing(thing) if it needs to add it to the database.
You can add update as a class method on the MyModel class with the @classmethod decorator and call MyModel.update(a=x), for example:
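A sketch of what that could look like (the look-up-by-"a" step is borrowed from the question's do_something; treat it as illustrative):

from google.appengine.ext import db
import webapp2

class MyModel(db.Model):
    a = db.StringProperty(required=True)
    b = db.StringProperty(required=True)
    c = db.StringProperty(required=True)

    @classmethod
    def update(cls, a, b, c):
        # Create the entity only if no entity with this "a" exists yet.
        if not cls.gql('WHERE a = :1', a).get():
            cls(a=a, b=b, c=c).put()

class WebHandler(webapp2.RequestHandler):
    def get(self):
        MyModel.update(a=self.request.get('a'),
                       b=self.request.get('b'),
                       c=self.request.get('c'))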
