I've implemented a simple Flask application with MongoDB that now needs some upgrades.
Let's say we have a class model for Foo and a class model for Bar, which contains a reference field to Foo:
class Foo(Document):
    title = StringField()

class Bar(Document):
    name = StringField()
    foo = ReferenceField('Foo')
Let the Flask application run for a while doing its job, so that there is now some data in the DB.
Due to changed requirements, we need to refactor the Foo class so that it subclasses a new superclass:
class SuperFoo(Document):
    meta = {'allow_inheritance': True}
    # [...]

class Foo(SuperFoo):
    # [...]

class Bar(Document):
    name = StringField()
    foo = ReferenceField('Foo')
The code above works well with an empty database.
But with existing data, MongoEngine raises an exception when Flask-Admin tries to show a Bar instance (in edit mode):
File "[...]/site-packages/mongoengine/fields.py", line 1124, in __get__
raise DoesNotExist('Trying to dereference unknown document %s' % value)
mongoengine.errors.DoesNotExist: Trying to dereference unknown document DBRef('super_foo', ObjectId('5617a08939c6c70cbaa2af6e'))
I suppose the data model needs to be migrated in some way.
How?
thanks,
alessandro.
After a little analysis I managed to solve the problem.
MongoEngine creates a new collection, super_foo.
Documents of every inherited class go into this super_foo collection with an additional _cls attribute.
Its value is the CamelCased hierarchy path of that class. In this example, documents will have a
'_cls': 'SuperFoo.Foo' field.
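For illustration, a Foo document stored through the new hierarchy would look roughly like this (the title value is made up):
>>> db.super_foo.find_one()
{u'_id': ObjectId('5617a08939c6c70cbaa2af6e'), u'title': u'some title', u'_cls': u'SuperFoo.Foo'}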
What I've done is copy every document from the old foo collection into the new super_foo one, adding the field {'_cls': u'SuperFoo.Foo'} to each.
The migration function should look like:
def migrationFunc():
    import logging
    from pymongo.errors import DuplicateKeyError
    from my.app import models

    logger = logging.getLogger(__name__)
    _cls = {'_cls': u'SuperFoo.Foo'}
    fromOldCollection = models.Foo._collection
    toSuperCollection = models.SuperFoo._collection
    for doc in fromOldCollection.find():
        doc.update(_cls)
        try:
            toSuperCollection.insert(doc)
        except DuplicateKeyError:
            logger.error('...')
Then I updated the code base of the models with the actual new hierarchy:
class SuperFoo(Document):
    meta = {'allow_inheritance': True}
    # [...]

# was: class Foo(Document)
class Foo(SuperFoo):
    # [...]
All references to Foo in the Bar collection, or elsewhere, are preserved.
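As a quick sanity check after the migration (a minimal sketch using the example models above, not taken from the real code):
bar = Bar.objects.first()
print(bar.foo)        # now dereferences against super_foo without raising DoesNotExist
print(bar.foo.title)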
I'm trying to create some objects in the setUp method of a Django test case. I use FactoryBoy, which helps me with creating the objects. But it seems that FactoryBoy can't find any objects in the database.
factories.py
class ProductFactory(DjangoModelFactory):
    ...
    market_category = factory.fuzzy.FuzzyChoice(list(MarketplaceCategory.objects.all()))

    class Meta:
        model = Product
tests.py
from django.test import TestCase
from marketplaces.models import MarketplaceCategory

class MyTestCase(TestCase):
    def setUp(self) -> None:
        ...
        self.marketplace_category = MarketplaceCategoryFactory.create()
        print(MarketplaceCategory.objects.first().pk)  # prints 1
        self.product = ProductFactory(created_by=self.user)
As you can see, ProductFactory tries to populate Product.market_category with a random MarketplaceCategory object.
The problem is that the object does not seem to exist, even though I created it beforehand and made sure it is in the DB (it has a pk).
EDIT: It chose a MarketplaceCategory object with pk=25, but there is only one such object in the test DB, with pk=1. I think it accesses the Django development DB instead of the test one.
The error:
psycopg2.errors.ForeignKeyViolation: insert or update on table "products_product" violates foreign key constraint "products_product_market_category_id_2d634517_fk"
DETAIL: Key (market_category_id)=(25) is not present in table "marketplaces_marketplacecategory".
Do you have any idea why it behaves this way? It looks like the factory is accessing the real DB instead of the test DB for some reason.
Defining the market_category field like that is going to cause issues: the queryset that populates the choices is executed at some arbitrary time, whenever the module is imported, and the instances returned may no longer exist. You should use a SubFactory:
class ProductFactory(DjangoModelFactory):
    market_category = factory.SubFactory(MarketplaceCategoryFactory)

    class Meta:
        model = Product
Alternatively, pass the queryset directly to FuzzyChoice to get a random existing value; don't convert it to a list:
class ProductFactory(DjangoModelFactory):
    market_category = factory.fuzzy.FuzzyChoice(MarketplaceCategory.objects.all())

    class Meta:
        model = Product
The SubFactory approach will create a new instance whenever you create a product, but you can pass market_category to the factory to override it:
class MyTestCase(TestCase):
    def setUp(self) -> None:
        self.marketplace_category = MarketplaceCategoryFactory.create()
        self.product = ProductFactory(created_by=self.user, market_category=self.marketplace_category)
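As a footnote on the difference between the two FuzzyChoice definitions: list(...) forces the queryset while factories.py is being imported, against whatever database is configured at that moment, whereas FuzzyChoice evaluates the choices it is given lazily, the first time the factory needs a value. A minimal side-by-side sketch, reusing the names from the question (the Product import path is an assumption):
import factory
import factory.fuzzy
from factory.django import DjangoModelFactory
from marketplaces.models import MarketplaceCategory
from products.models import Product  # assumed import path

class ProductFactory(DjangoModelFactory):
    class Meta:
        model = Product

    # Evaluated once, at import time of factories.py -- may hit the development DB:
    # market_category = factory.fuzzy.FuzzyChoice(list(MarketplaceCategory.objects.all()))

    # Evaluated lazily, inside the test run, against the test DB:
    market_category = factory.fuzzy.FuzzyChoice(MarketplaceCategory.objects.all())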
I want to create a set of classes, their vars and methods, just from a given text configuration, especially with Django models. For example, I have a list of models to create in models.py:
classes = ["users", "posts", "comments"]
vars = [{"a", "b"}, {"bb", "vv"}, {"aa"}]
# methods = [{....}, {....}, {....}]  not now
In models.py,
I want to do something like this to create those classes:
for i, j in zip(classes, vars):
    create_classes_from_string(i, j)
How can I implement create_classes_from_string, ensuring that it creates tables in my database with that configuration?
I can view this question from two perspectives:
Normal way of dynamically creating a Python class
Create dynamic Django models specifically
But in both cases, the attrs should be defined as a dict mapping each variable name to its value, because defining a variable without a value is meaningless here.
1. Normal way of dynamically creating a Python class
Here we can simply use type() to generate a Python class. The result can then be made available under its own name by adding it to the module namespace via the locals() builtin.
An example is shown below:
classes = ["Class1", "Class2"]
class_fields = [
    {
        'cl1_var1': "test",
        'cl1_var2': 123,
    },
    {
        'cl2_var1': [1, 2, 3],
    }
]

classes_details = list(zip(classes, class_fields))  # Python 3 format

for class_details in classes_details:
    class_name = class_details[0]
    class_attrs = class_details[1]
    class_def = type(
        class_name,
        (object,),  # Base classes tuple
        class_attrs
    )
    locals().update({class_name: class_def})  # To associate the class with the running script
instance1 = Class1()
instance2 = Class2()
Outputs
>>> instance1 = Class1()
>>> instance2 = Class2()
>>>
>>> instance1.cl1_var1
'test'
>>> instance1.cl1_var2
123
>>> instance2.cl2_var1
[1, 2, 3]
Here the class names in the list classes = ["Class1", "Class2"] can be used exactly as given, i.e. Class1(), Class2(), etc. This is achieved by adding the variables Class1 and Class2 to the running script dynamically using the locals() builtin function.
2. Create dynamic Django models specifically
Even though the basic logic remains the same, a couple of changes are required.
First of all, we need to understand dynamic model creation in Django. Django provides clear documentation for this.
Please refer to https://code.djangoproject.com/wiki/DynamicModels
An example is shown below; you can add the script directly to your models.py file:
from django.db import models
from django.db.models import CharField, IntegerField

# This is taken from https://code.djangoproject.com/wiki/DynamicModels#Ageneral-purposeapproach
def create_model(name, fields=None, app_label='', module='', options=None, admin_opts=None):
    class Meta:
        pass

    if app_label:
        setattr(Meta, 'app_label', app_label)

    if options is not None:
        for key, value in options.items():  # iteritems() is Python 2 only
            setattr(Meta, key, value)

    attrs = {'__module__': module, 'Meta': Meta}  # Set up a dictionary to simulate declarations within a class

    if fields:  # Add in any fields that were provided
        attrs.update(fields)

    model = type(name, (models.Model,), attrs)  # Create the class, which automatically triggers ModelBase processing
    return model
classes = ["Class1", "Class2"]
class_fields = [
    {
        'cl1_var1': CharField(max_length=255),
        'cl1_var2': IntegerField(),
    },
    {
        'cl2_var2': IntegerField(),
    }
]

models_details = list(zip(classes, class_fields))

for model_detail in models_details:
    model_name = model_detail[0]
    model_attrs = model_detail[1]
    model_def = create_model(
        model_name,
        fields=model_attrs,
        app_label=__package__,
        module=__name__,
    )
    locals()[model_name] = model_def
Output at the Django shell:
>>> from my_app.models import Class1
>>> Class1(cl1_var1="Able to create dynamic class", cl1_var2=12345).save()
>>> Class1.objects.all().values()
<QuerySet [{'cl1_var1': 'Able to create dynamic class', 'id': 3, 'cl1_var2': 12345}]>
This model is added to the Django app my_app, and this works fine. There are a few things to be noted:
Field attrs should be handled carefully, as you are going to read them from a text file.
The models should be added using locals() so that they can be imported from the app.
The create_model method should be taken from the reference link, as it supports more features like adding admin pages.
Data migration also works with this kind of model.
My Suggestion
The methods explained above work without issue and all of them are supported, but one thing not to forget is that there is a performance difference between dynamically created classes and a real import. Also, this is a rather complex structure, and any change in the code must be made very carefully so as not to break it.
So my suggestion is to read the text file with the configurations and generate the models.py file from it with a small helper script (which can also be written in Python); see the sketch below.
Every time the text config file changes, you regenerate the models.py script. This way you can also review the model definitions.
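A minimal sketch of that generator approach (the config format, file names, and field-type mapping are my own assumptions, not part of the question):
# generate_models.py -- rerun whenever the text configuration changes
import json

FIELD_TEMPLATES = {
    'char': "models.CharField(max_length=255)",
    'int': "models.IntegerField()",
}

def generate(config_path='models.json', out_path='my_app/models.py'):
    # Assumed config format: {"Class1": {"cl1_var1": "char", "cl1_var2": "int"}, ...}
    with open(config_path) as fh:
        config = json.load(fh)
    lines = ["from django.db import models", ""]
    for class_name, fields in config.items():
        lines.append("class {0}(models.Model):".format(class_name))
        for field_name, field_type in fields.items():
            lines.append("    {0} = {1}".format(field_name, FIELD_TEMPLATES[field_type]))
        lines.append("")
    with open(out_path, 'w') as fh:
        fh.write("\n".join(lines))

if __name__ == '__main__':
    generate()
After regenerating models.py, makemigrations and migrate pick up the generated models just like hand-written ones.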
I am using the latest PonyORM on Python 3.6.
I want to do some monkey patching on entity classes created at another stage (to add computed fields).
Is there any chance I can get the list of entity types available from the db object?
In my models.py file:
from pony.orm import *

db = Database()

class OneEntity(db.Entity):
    id = PrimaryKey(int, auto=True)
    nom = Required(str)

class AnotherEntity(db.Entity):
    id = PrimaryKey(int, auto=True)
    someprop = Required(str)
In another file:
from models import *

db.bind(provider='sqlite', filename='test.db', create_db=True)
db.generate_mapping(create_tables=True)

def say_hello():
    """Some dummy proc to monkey patch onto entity classes."""
    print("hello")

# This works, but isn't workable for my use case (too many entity classes)
OneEntity.monkey_patched_method = say_hello

# And here I'd like to be able to list the entity classes programmatically
for Entity in some_code_that_i_dont_know:
    Entity.new_method = say_hello
In PonyORM, the Database object has an entities property, which is a dict of all associated entities:
for entity_name, entity_cls in db.entities.items():
    print(entity_name)
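Applied to the question, the monkey-patching loop then becomes:
for entity_name, entity_cls in db.entities.items():
    entity_cls.new_method = say_hello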
You should be able to obtain the subclasses of Entity using the __subclasses__ method.
This example is from Flask-SQLAlchemy; your results should be similar:
>>> db.Model.__subclasses__()
[myapp.models.User,
myapp.models.Organization,
myapp.models.Customer,
myapp.models.Address,
...
]
In your code, you should do the following:
for Entity in db.Entity.__subclasses__():
    Entity.new_method = say_hello
This isn't specific to Pony, but you can use inspect.getmembers to do this:
import inspect
import models

for name, attr in inspect.getmembers(models):
    if inspect.isclass(attr) and issubclass(attr, db.Entity):
        models.__dict__[name].new_method = say_hello
Basically this will run through all the attributes of the models module and add new_method to any db.Entity subclasses it encounters.
I am generating a Django model based on an abstract model class AbstractAttr and a normal model (let's say Foo).
I want my foo/models.py to look like this:
from bar.models import Attrs

# ...

class Foo(models.Model):
    ....
    attrs = Attrs()
In the Attrs class, which mimics a field, I have a contribute_to_class that generates the required model using type(). The generated model is called FooAttr.
Everything works. If I migrate, I see FooAttr appear in the proper table.
EXCEPT FOR ONE THING.
I want to be able to do from foo.models import FooAttr. Somehow my generated FooAttr class is not bound to the models.py file in which it is generated.
If I change my models.py to this:
class Foo(models.Model):
    # ...

FooAttr = generate_foo_attr_class(...)
it works, but this is not what I want (for example, it forces the dev to guess the generated class name).
Is what I want possible: to define the class somewhat like in the first example AND bind it to the specific models.py module?
The project (pre-Alpha) is here (in develop branch):
https://github.com/zostera/django-mav
Some relevant code:
def create_model_attribute_class(model_class, class_name=None, related_name=None, meta=None):
    """
    Generate a value class (derived from AbstractModelAttribute) for a given model class

    :param model_class: The model to create an AbstractModelAttribute class for
    :param class_name: The name of the AbstractModelAttribute class to generate
    :param related_name: The related name
    :param meta: Optional dict of extra Meta options
    :return: A model derived from AbstractModelAttribute with an object field pointing to model_class
    """
    if model_class._meta.abstract:
        # This can't be done, because `object = ForeignKey(model_class)` would fail.
        raise TypeError("Can't create attrs for abstract class {0}".format(model_class.__name__))

    # Define inner Meta class
    if not meta:
        meta = {}
    meta['app_label'] = model_class._meta.app_label
    meta['db_tablespace'] = model_class._meta.db_tablespace
    meta['managed'] = model_class._meta.managed
    meta['unique_together'] = list(meta.get('unique_together', [])) + [('attribute', 'object')]
    meta.setdefault('db_table', '{0}_attr'.format(model_class._meta.db_table))

    # The name of the class to generate
    if class_name is None:
        value_class_name = '{name}Attr'.format(name=model_class.__name__)
    else:
        value_class_name = class_name

    # The related name to set
    if related_name is None:
        model_class_related_name = 'attrs'
    else:
        model_class_related_name = related_name

    # Make a type for our class
    value_class = type(
        str(value_class_name),
        (AbstractModelAttribute,),
        dict(
            # Set to same module as model_class
            __module__=model_class.__module__,
            # Add a foreign key to model_class
            object=models.ForeignKey(
                model_class,
                related_name=model_class_related_name
            ),
            # Add Meta class
            Meta=type(
                str('Meta'),
                (object,),
                meta
            ),
        ))

    return value_class
class Attrs(object):
    def contribute_to_class(self, cls, name):
        # Called from django.db.models.base.ModelBase.__new__
        mav_class = create_model_attribute_class(model_class=cls, related_name=name)
        cls.ModelAttributeClass = mav_class
I see you create the model from within models.py, so I think you should be able to add it to the module's globals. How about this:
new_class = create_model_attribute_class(**kwargs)
globals()[new_class.__name__] = new_class
del new_class # no need to keep original around
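If the generation happens inside contribute_to_class (as in the Attrs sketch above) rather than directly in models.py, the same idea should work through sys.modules, since cls.__module__ names the models module that triggered the generation; a hedged variant:
import sys

class Attrs(object):
    def contribute_to_class(self, cls, name):
        mav_class = create_model_attribute_class(model_class=cls, related_name=name)
        cls.ModelAttributeClass = mav_class
        # Bind the generated class to the module that defines cls,
        # so `from foo.models import FooAttr` works.
        models_module = sys.modules[cls.__module__]
        setattr(models_module, mav_class.__name__, mav_class)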
Thanks all for thinking about this. I have updated the source code of the project on GitHub and added more tests. See https://github.com/zostera/django-mav
Since the actual generation of the models is done outside of foo/models.py (it takes place in mav/models.py), it seems Pythonically impossible to link the model to foo/models.py. Also, after rethinking this, it seems too automagical for Python (explicit is better, no magic).
So my new strategy is to use simple functions, a decorator to make it easy to add mav, and to link the generated models to mav/attrs.py, so I can universally from mav.attrs import FooAttr. I also link the generated class to the Foo model as Foo._mav_class.
(In this comment, Foo is of course used as an example model that we want to add model-attribute-value to.)
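A rough sketch of what that strategy could look like (names are illustrative, not the actual django-mav code; it assumes mav.attrs has already been imported):
import sys

def add_mav(model_class):
    """Class decorator: generate the *Attr model and register it in mav.attrs."""
    attr_class = create_model_attribute_class(model_class=model_class, related_name='attrs')
    # Link the generated class to the decorated model itself...
    model_class._mav_class = attr_class
    # ...and bind it to mav.attrs so `from mav.attrs import FooAttr` works everywhere.
    setattr(sys.modules['mav.attrs'], attr_class.__name__, attr_class)
    return model_class

# Usage:
# @add_mav
# class Foo(models.Model):
#     ...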
I'm trying to build sort of a "mini Django model" for working with Django and MongoDB without using Django's nonrel distribution (I don't need ORM access for these...).
So, what I'm trying to do is mimic the standard behavior or "implementation" of Django's default models... this is what I've got so far:
File "models.py" (the base):
from django.conf import settings
import pymongo

class Model(object):
    @classmethod
    def db(cls):
        db = pymongo.Connection(settings.MONGODB_CONF['host'], settings.MONGODB_CONF['port'])

    @classmethod
    class objects(object):
        @classmethod
        def all(cls):
            db = Model.db()  # Not using yet... not even sure if that's the best way to do it
            print Model.collection
File "mongomodels.py" (the implementation)
from mongodb import models

class ModelTest1(models.Model):
    database = 'mymongodb'
    collection = 'mymongocollection1'

class ModelTest2(models.Model):
    database = 'mymongodb'
    collection = 'mymongocollection2'
File "views.py" (the view)
from mongomodels import ModelTest1, ModelTest2
print ModelTest1.objects.all() #Should print 'mymongocollection1'
print ModelTest2.objects.all() #Should print 'mymongocollection2'
The problem is that it's not accessing the variables from ModelTest1, but from the original Model... what's wrong??
You must give objects some sort of link to the class that contains it. Currently, you are just hard-coding it to use Model's attributes. Because you are not instantiating these classes, you will have to use either a decorator or a metaclass to create the objects class for you in each subclass of Model.
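A minimal sketch of the metaclass approach (Python 3 syntax; the model and settings names come from the question, the manager logic is my own assumption):
import pymongo
from django.conf import settings

class ModelBase(type):
    def __new__(mcs, name, bases, attrs):
        new_class = super().__new__(mcs, name, bases, attrs)

        class Manager(object):
            model = new_class  # the link back to the concrete subclass

            @classmethod
            def all(cls):
                client = pymongo.MongoClient(settings.MONGODB_CONF['host'],
                                             settings.MONGODB_CONF['port'])
                return list(client[cls.model.database][cls.model.collection].find())

        new_class.objects = Manager
        return new_class

class Model(object, metaclass=ModelBase):
    database = None
    collection = None
With this in place, ModelTest1.objects.all() queries mymongocollection1 and ModelTest2.objects.all() queries mymongocollection2, because each subclass gets its own objects class bound to it at class-creation time.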