I'm trying to create some objects in the setUp method of a Django test case. I use FactoryBoy to help me create the objects, but it seems that FactoryBoy can't find any objects in the database.
factories.py
class ProductFactory(DjangoModelFactory):
    ...
    market_category = factory.fuzzy.FuzzyChoice(list(MarketplaceCategory.objects.all()))

    class Meta:
        model = Product
tests.py
from django.test import TestCase
from marketplaces.models import MarketplaceCategory


class MyTestCase(TestCase):
    def setUp(self) -> None:
        ...
        self.marketplace_category = MarketplaceCategoryFactory.create()
        print(MarketplaceCategory.objects.first().pk)  # prints 1
        self.product = ProductFactory(created_by=self.user)
As you can see, ProductFactory tries to populate Product.market_category with a random MarketplaceCategory object.
The problem is that the chosen object does not seem to exist, even though I created one beforehand and made sure it is in the DB (it has a pk).
EDIT: It chose a MarketplaceCategory object with pk=25, but there is only one such object in the test DB, and it has pk=1. I think it accesses the Django development DB instead of the test one.
The error:
psycopg2.errors.ForeignKeyViolation: insert or update on table "products_product" violates foreign key constraint "products_product_market_category_id_2d634517_fk"
DETAIL: Key (market_category_id)=(25) is not present in table "marketplaces_marketplacecategory".
Do you have any idea why it behaves this way? It looks like the factory is accessing the real DB instead of the test DB for some reason.
Defining the "market_category" field like that is going to cause issues, the queryset that populates the choices is going to be executed at some random time whenever the module is imported and the instances returned may no longer exist. You should use a SubFactory
class ProductFactory(DjangoModelFactory):
    market_category = factory.SubFactory(MarketplaceCategoryFactory)

    class Meta:
        model = Product
Alternatively, pass the queryset directly to FuzzyChoice to get a random existing value; don't convert it to a list:
class ProductFactory(DjangoModelFactory):
    market_category = factory.fuzzy.FuzzyChoice(MarketplaceCategory.objects.all())

    class Meta:
        model = Product
The SubFactory will then create an instance whenever you create a product, but you can pass "market_category" to the factory to override it:
class MyTestCase(TestCase):
    def setUp(self) -> None:
        self.marketplace_category = MarketplaceCategoryFactory.create()
        self.product = ProductFactory(created_by=self.user, market_category=self.marketplace_category)
Related
I've implemented a simple Flask application with MongoDB that now needs some upgrades.
Let's say we have a model class for Foo and a model class for Bar, which contains a reference field to Foo:
class Foo(Document):
    title = StringField()

class Bar(Document):
    name = StringField()
    foo = ReferenceField('Foo')
The Flask application has been running and doing its job for a while, so there is now some data in the DB.
Due to changed requirements, we need to refactor the Foo class to subclass it from a new superclass:
class SuperFoo(Document):
    meta = {'allow_inheritance': True}
    # [...]

class Foo(SuperFoo):
    # [...]

class Bar(Document):
    name = StringField()
    foo = ReferenceField('Foo')
The code above works well with an empty database.
But when there is already data in it, MongoEngine raises an exception when Flask-Admin tries to show a Bar instance (in edit mode):
File "[...]/site-packages/mongoengine/fields.py", line 1124, in __get__
raise DoesNotExist('Trying to dereference unknown document %s' % value)
mongoengine.errors.DoesNotExist: Trying to dereference unknown document DBRef('super_foo', ObjectId('5617a08939c6c70cbaa2af6e'))
I suppose the data model needs to be migrated in some way.
How?
Thanks,
Alessandro
After a little analysis I came up with a solution to the problem.
MongoEngine creates a new collection, super_foo.
Documents of every inherited class go into this super_foo collection with an additional attribute, _cls.
Its value is the CamelCased hierarchy path of that class; in this example the documents will have the field
'_cls': 'SuperFoo.Foo'.
What I've done is copy every document from the old foo collection into the new super_foo one, adding the field {'_cls': u'SuperFoo.Foo'} to each.
The migration function should look like this:
def migrationFunc():
    from pymongo.errors import DuplicateKeyError
    from my.app import models

    _cls = {'_cls': u'SuperFoo.Foo'}
    fromOldCollection = models.Foo._collection
    toSuperCollection = models.SuperFoo._collection
    for doc in fromOldCollection.find():
        doc.update(_cls)
        try:
            toSuperCollection.insert(doc)
        except DuplicateKeyError:
            logger.error('...')
Then I updated the code base of the models with the actual new hierarchy:
class SuperFoo(Document):
    meta = {'allow_inheritance': True}
    # [...]

# was: class Foo(Document)
class Foo(SuperFoo):
    # [...]
All back references to Foo in the Bar collection, or elsewhere, are preserved.
I'm trying to wrap peewee models and classes in another interface, and I want to dynamically assign a model to a database. I'm using the peewee.Proxy class for this, but I don't want to use a global variable to make initialization of this proxy available. I wanted to make a class method for changing the inner Meta class of the base model, but I get the following error:
AttributeError: type object 'BaseModel' has no attribute 'Meta'
The code that I have:
import peewee as pw

class BaseModel(pw.Model):
    class Meta:
        database = pw.Proxy()

    @classmethod
    def configure_proxy(cls, database: pw.Database):
        cls.Meta.database.initialize(database)
Of course, I could access this variable by calling BaseModel.Meta.database, but that is less intuitive in my opinion.
Do you have any suggestions?
Peewee transforms the inner Meta class into an object accessible at ModelClass._meta after the class is constructed.
Change .Meta to ._meta:
class BaseModel(pw.Model):
    class Meta:
        database = pw.Proxy()

    @classmethod
    def configure_proxy(cls, database: pw.Database):
        cls._meta.database.initialize(database)
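For example, the proxy can then be bound to a real database once at startup (the SQLite filename here is just an illustration, not something from the question):

import peewee as pw

# Hypothetical startup code: initialize the proxy behind BaseModel once.
BaseModel.configure_proxy(pw.SqliteDatabase('app.db'))  # 'app.db' is an example filename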
I don't know exactly why you are having this problem, and I'd be interested in the full answer.
The problem is with the name Meta. I'm guessing there's something by that name defined in pw.Model but I haven't been through it all yet.
That said, this (for example) works:
import peewee as pw

class BaseModel(pw.Model):
    class MyMeta:
        database = pw.Proxy()

    @classmethod
    def configure_proxy(cls, database: pw.Database):
        cls.MyMeta.database.initialize(database)
I want to have my database implementation in a separate module or class. But I am struggling with a few details. A simple example:
from peewee import *

db = SqliteDatabase(':memory:')

class BaseModel(Model):
    class Meta:
        database = db

class User(BaseModel):
    name = CharField()

db.connect()
db.create_tables([User,])
db.commit()

@db.atomic()
def add_user(name):
    User.create(name=name).save()

@db.atomic()
def get_user(name):
    return User.get(User.name == name)
So far this is working fine. I can implement my interface to the database here and import this as a module.
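For instance, assuming the snippet above lives in a module called mydb (the module name is just an example), using it looks like this:

import mydb  # hypothetical module name for the snippet above

mydb.add_user('alice')
print(mydb.get_user('alice').name)  # -> 'alice'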
Now I want to be able to choose the database file at runtime. So I need a way to define the Model classes without defining SqliteDatabase('somefile') before. I tried to encapsulate everything in a new Database class, which I can later import and create an instance from:
from peewee import *

class Database:
    def __init__(self, dbfile):
        self.db = SqliteDatabase(dbfile)

        class BaseModel(Model):
            class Meta:
                database = self.db

        class User(BaseModel):
            name = CharField()

        self.User = User
        self.db.connect()
        self.db.create_tables([User,])
        self.db.commit()

    @self.db.atomic()  # Error: self is not known at this level
    def add_user(self, name):
        self.User.create(name=name).save()

    @self.db.atomic()  # Error: self is not known at this level
    def get_user(self, name):
        return self.User.get(self.User.name == name)
Now I can call, for example, database = Database('database.db') or choose any other file name. I can even use multiple database instances in the same program, each with its own file.
However, there are two problems with this approach:
1. I still need to specify the database driver (SqliteDatabase) before defining the Model classes. To solve this I define the Model classes within the __init__() method and then create an alias with self.User = User. I don't really like this approach (it just doesn't feel like neat code), but at least it works.
2. I cannot use the @db.atomic() decorator, since self is not known at class level; I would need an instance here.
So this class approach does not seem to work very well. Is there some better way to define the Model classes without having to choose where you want to store your database first?
If you need to change the database driver at runtime, then Proxy is the way to go:
# database.py
import peewee as pw

proxy = pw.Proxy()

class BaseModel(pw.Model):
    class Meta:
        database = proxy

class User(BaseModel):
    name = pw.CharField()

def add_user(name):
    with proxy.atomic() as txn:
        User.create(name=name).save()

def get_user(name):
    with proxy.atomic() as txn:
        return User.get(User.name == name)
From now on, when you load the module, it won't need a database to be initialized. Instead, you can initialize it at runtime and switch between multiple databases as follows:
# main.py
import peewee as pw
import database as db

sqlite_1 = pw.SqliteDatabase('sqlite_1.db')
sqlite_2 = pw.SqliteDatabase('sqlite_2.db')

db.proxy.initialize(sqlite_1)
sqlite_1.create_tables([db.User], safe=True)
db.add_user(name="Tom")

db.proxy.initialize(sqlite_2)
sqlite_2.create_tables([db.User], safe=True)
db.add_user(name="Jerry")
But if the connection is the only thing that matters, then the init() method will be enough.
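A minimal sketch of that deferred-initialization route (the filenames are just examples):

# database.py
import peewee as pw

# Defer initialization: no filename is given yet.
db = pw.SqliteDatabase(None)

class BaseModel(pw.Model):
    class Meta:
        database = db

# main.py, at runtime
# from database import db
db.init('chosen_at_runtime.db')  # example filename chosen at runtime
db.connect()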
Now I want to be able to choose the database file at runtime. So I need a way to define the Model classes without defining SqliteDatabase('somefile') before. I tried to encapsulate everything in a new Database class, which I can later import and create an instance from.
Peewee uses the inner Meta class to define the name of the table (Model.Meta.db_table) and the database (Model.Meta.database).
Set these attributes before calling any model-specific code (either to create a table or to run DML statements).
See also: 'Allow to define database dynamically'.
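A sketch of what that looks like at runtime (after class construction the inner Meta is exposed as Model._meta, as noted in the earlier peewee answer; in peewee 3.x the documented route for this is Database.bind()):

from peewee import SqliteDatabase

# Example: point the already-defined User model at a database chosen at runtime.
runtime_db = SqliteDatabase('chosen_at_runtime.db')  # example filename

User._meta.database = runtime_db   # what this answer describes
# runtime_db.bind([User])          # peewee 3.x equivalent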
Question: I cannot use the @db.atomic() decorator since self is not known at class level.
Do it as you do it with self.User.
I wonder about atomic() instead of atomic, but you say it is working fine.
class Database:
    def __init__(self, dbfile):
        self.db = SqliteDatabase(dbfile)
        ...

        # `self` is captured from __init__'s scope (a closure), so the inner
        # functions don't need it as a parameter.
        @self.db.atomic()
        def __add_user(name):
            self.User.create(name=name).save()
        self.add_user = __add_user

        @self.db.atomic()
        def __get_user(name):
            return self.User.get(self.User.name == name)
        self.get_user = __get_user
Related: Define models separately from Database() initialization
I am generating a Django model based on an abstract model class AbstractAttr and a normal model (let's say Foo).
I want my foo/models.py to look like this:
from bar.models import Attrs

# ...

class Foo(models.Model):
    ...
    attrs = Attrs()
In the Attrs class, which mimics a field, I have a contribute_to_class method that generates the required model using type(). The generated model class is called FooAttr.
Everything works. If I migrate, I see FooAttr appear in the proper table.
EXCEPT FOR ONE THING.
I want to be able to write from foo.models import FooAttr. Somehow my generated FooAttr class is not bound to the models.py file in which it is generated.
If I change my models.py to this:
class Foo(models.Model):
    # ...

FooAttr = generate_foo_attr_class(...)
it works, but this is not what I want (for example, this forces the dev to guess the generated class name).
Is what I want possible: defining the class somewhat like in the first example AND binding it to the specific models.py module?
The project (pre-Alpha) is here (in develop branch):
https://github.com/zostera/django-mav
Some relevant code:
def create_model_attribute_class(model_class, class_name=None, related_name=None, meta=None):
    """
    Generate a value class (derived from AbstractModelAttribute) for a given model class

    :param model_class: The model to create a AbstractModelAttribute class for
    :param class_name: The name of the AbstractModelAttribute class to generate
    :param related_name: The related name
    :return: A model derived from AbstractModelAttribute with an object pointing to model_class
    """
    if model_class._meta.abstract:
        # This can't be done, because `object = ForeignKey(model_class)` would fail.
        raise TypeError("Can't create attrs for abstract class {0}".format(model_class.__name__))

    # Define inner Meta class
    if not meta:
        meta = {}
    meta['app_label'] = model_class._meta.app_label
    meta['db_tablespace'] = model_class._meta.db_tablespace
    meta['managed'] = model_class._meta.managed
    meta['unique_together'] = list(meta.get('unique_together', [])) + [('attribute', 'object')]
    meta.setdefault('db_table', '{0}_attr'.format(model_class._meta.db_table))

    # The name of the class to generate
    if class_name is None:
        value_class_name = '{name}Attr'.format(name=model_class.__name__)
    else:
        value_class_name = class_name

    # The related name to set
    if related_name is None:
        model_class_related_name = 'attrs'
    else:
        model_class_related_name = related_name

    # Make a type for our class
    value_class = type(
        str(value_class_name),
        (AbstractModelAttribute,),
        dict(
            # Set to same module as model_class
            __module__=model_class.__module__,
            # Add a foreign key to model_class
            object=models.ForeignKey(
                model_class,
                related_name=model_class_related_name
            ),
            # Add Meta class
            Meta=type(
                str('Meta'),
                (object,),
                meta
            ),
        ))

    return value_class


class Attrs(object):
    def contribute_to_class(self, cls, name):
        # Called from django.db.models.base.ModelBase.__new__
        mav_class = create_model_attribute_class(model_class=cls, related_name=name)
        cls.ModelAttributeClass = mav_class
I see you create the model from within models.py, so I think you should be able to add it to the module's globals. How about this:
new_class = create_model_attribute_class(**kwargs)
globals()[new_class.__name__] = new_class
del new_class  # no need to keep the original name around
Thanks all for thinking about this. I have updated the source code of the project at GitHub and added more tests. See https://github.com/zostera/django-mav
Since the actual generation of the models is done outside of foo/models.py (it takes place in mav/models.py), it seems Pythonically impossible to link the model to foo/models.py. Also, after rethinking this, it seems too automagical for Python (explicit is better than implicit, no magic).
So my new strategy is to use simple functions and a decorator to make it easy to add mav, and to link the generated models to mav/attrs.py, so I can universally do from mav.attrs import FooAttr. I also link the generated class to the Foo model as Foo._mav_class.
(In this comment, Foo is of course used as an example model that we want to add model-attribute-value to).
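A rough sketch of what that decorator strategy could look like; the name add_mav and the way the generated class is registered into mav.attrs are illustrative guesses, not necessarily the project's actual API:

# mav/decorators.py (hypothetical module layout)
from mav import attrs as attrs_module
from mav.models import create_model_attribute_class  # assumed import path

def add_mav(model_class):
    """Class decorator: generate the attribute model and register it."""
    attr_class = create_model_attribute_class(model_class=model_class)
    # link the generated class to the decorated model
    model_class._mav_class = attr_class
    # expose it so that `from mav.attrs import FooAttr` works
    setattr(attrs_module, attr_class.__name__, attr_class)
    return model_class

With this, foo/models.py only needs to decorate its model with @add_mav, and the import from mav.attrs works once foo.models has been imported.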
I'm trying to build sort of a "mini Django model" for working with Django and MongoDB without using the Django-nonrel distribution (I don't need ORM access for these...).
So, what I'm trying to do is mimic the standard behavior or "implementation" of Django's default models... here is what I've got so far:
File "models.py" (the base)
from django.conf import settings
import pymongo

class Model(object):
    @classmethod
    def db(cls):
        db = pymongo.Connection(settings.MONGODB_CONF['host'], settings.MONGODB_CONF['port'])

    @classmethod
    class objects(object):
        @classmethod
        def all(cls):
            db = Model.db()  # Not using yet... not even sure if that's the best way to do it
            print Model.collection
File "mongomodels.py" (the implementation)
from mongodb import models

class ModelTest1(models.Model):
    database = 'mymongodb'
    collection = 'mymongocollection1'

class ModelTest2(models.Model):
    database = 'mymongodb'
    collection = 'mymongocollection2'
File "views.py" (the view)
from mongomodels import ModelTest1, ModelTest2

print ModelTest1.objects.all()  # Should print 'mymongocollection1'
print ModelTest2.objects.all()  # Should print 'mymongocollection2'
The problem is that it's not accessing the variables from ModelTest1, but from the original Model... what's wrong??
You must give objects some sort of link to the class that contains it. Currently, you are just hard-coding it to use Model's attributes. Because you are not instantiating these classes, you will have to use either a decorator or a metaclass to create the objects class for you in each subclass of Model.
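A minimal sketch of the metaclass route (Python 2 style, to match the question's code; the names are only for illustration):

# Sketch only: a metaclass that rebuilds the inner `objects` class for every
# subclass, so `objects` keeps a reference to the class that contains it.
class ModelMeta(type):
    def __new__(mcs, name, bases, attrs):
        new_class = super(ModelMeta, mcs).__new__(mcs, name, bases, attrs)

        class objects(object):
            model = new_class  # link back to the containing model class

            @classmethod
            def all(cls):
                # now this sees ModelTest1/ModelTest2 attributes, not Model's
                return cls.model.collection

        new_class.objects = objects
        return new_class

class Model(object):
    __metaclass__ = ModelMeta
    database = None
    collection = None

With this, ModelTest1.objects.all() returns 'mymongocollection1' and ModelTest2.objects.all() returns 'mymongocollection2'; a real implementation would open the pymongo connection inside all() instead of only returning the collection name.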