I'm starting off with Django, and I'm sort of confused about how the models work. I've searched for a bit and can't find an answer. When creating classes in Python, we have to initialize the object properties, for example:
class Contact(object):
    def __init__(self, name, number):
        self.name = name
        self.number = number
And if we create a subclass, for example, coworker:
class Coworker(Contact):
    def __init__(self, name, number, title):
        Contact.__init__(self, name, number)
        self.title = title
So it makes sense that we still initialize the properties from the superclass, but in Django, why don't we do any initialization? We inherit from the models.Model class:
class Poll(models.Model):
    question = models.CharField(max_length=200)
Why don't we have to initialize CharField from Model before we use it? I hope I'm not being too cryptic with my question. Like I said, I'm just getting started with Django, so any help is appreciated.
Because they're stored in the database. Either your program or the manager will instantiate the model with the correct arguments, so it would be inappropriate for an explicit initializer to blindly override them.
There should be a universal __init__ method in models.Model along these lines (this snippet is from SQLAlchemy, but I believe Django uses a similar mechanism):
def __init__(self, **kwargs):
    cls_ = type(self)
    for k in kwargs:
        if not hasattr(cls_, k):
            raise TypeError(
                "%r is an invalid keyword argument for %s" %
                (k, cls_.__name__))
        setattr(self, k, kwargs[k])
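As a quick illustration (my own example, assuming the Poll model above lives in a configured Django app): a generic __init__ like that is exactly what lets you pass field values as keyword arguments without ever writing an initializer yourself.
# In a Django shell, with the Poll model defined as above:
p = Poll(question="What's new?")   # field values go in as keyword arguments
print(p.question)                  # "What's new?"
p.save()                           # the ORM persists the values to the database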
The Scenario:
class A:
    def __init__(self, key, secret):
        self.key = key
        self.secret = secret

    def same_name_method(self):
        do_some_stuff

    def method_a(self):
        pass


class B:
    def __init__(self, key, secret):
        self.key = key
        self.secret = secret

    def same_name_method(self):
        do_another_stuff

    def method_b(self):
        pass


class C(A, B):
    def __init__(self, *args, **kwargs):
        # I want to init both class A's and B's key and secret
        ## I want to rename class A's and B's same-named method
        any_ideas()
    ...
What I Want:
I want an instance of class C to initialize both class A and class B, because they use different API keys.
I also want to rename A's and B's same_name_method so that I won't be confused about which one I'm calling.
What I Have Done:
For problem one, I have done this:
class C(A, B):
    def __init__(self, *args, **kwargs):
        A.__init__(self, a_api_key, a_api_secret)
        B.__init__(self, b_api_key, b_api_secret)
Comment: I know about super(), but for this situation I do not know how to use it.
For problem two, I added a __new__ to class C:
def __new__(cls, *args, **kwargs):
    cls.platforms = []
    cls.rename_method = []
    for platform in cls.__bases__:
        # fetch platform module name
        module_name = platform.__module__.split('.')[0]
        cls.platforms.append(module_name)
        # rename attr
        for k, v in platform.__dict__.items():
            if not k.startswith('__'):
                setattr(cls, module_name + '_' + k, v)
                cls.rename_method.append(k)
    for i in cls.rename_method:
        delattr(cls, i)  ## this line will raise AttributeError!!
    return super().__new__(cls)
Comment: because I renamed the methods and added the new names as attributes on cls, I need to delete the old method attributes, but I don't know how to delattr them. For now I just leave them alone and don't delete the old methods.
Question:
Any Suggestions?
So, you want some pretty advanced, complicated things, and you don't yet understand well how classes behave in Python.
For your first problem, initializing both classes (and, in general, running any method in all base classes), the correct solution is to make cooperative calls through super().
A call to super() in Python returns a special proxy object that exposes the methods of the next class in the method resolution order (MRO).
So, if A.__init__ and B.__init__ both have to be called, each of them should include a super().__init__() call; one will then call the other's __init__ in the appropriate order, regardless of how they are combined as bases in subclasses. Since object also has an __init__, the last super().__init__() call simply reaches it, which is a no-op. If you have more methods that should run in all base classes, you'd rather build a proper common base class so that the top-most super() call doesn't try to propagate to a non-existent method.
Otherwise, it is just:
class A:
    def __init__(self, akey, asecret, **kwargs):
        self.key = akey
        self.secret = asecret
        super().__init__(**kwargs)


class B:
    def __init__(self, bkey, bsecret, **kwargs):
        self.key = bkey
        self.secret = bsecret
        super().__init__(**kwargs)


class C(A, B):
    # does not even need an explicit `__init__`.
    pass
I think you get the idea. Of course, the parameter names have to differ; ideally, when writing C you don't have to worry about parameter order, but when calling C you do have to worry about supplying all mandatory parameters for C and its bases. If you can't rename the parameters in A or B to be distinct, you could instead rely on parameter order for the call, with each __init__ consuming two positional parameters, but that requires some extra care with inheritance order.
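A quick usage sketch of the classes above (my own example, not from the original post): instantiating C with keyword arguments runs both initializers through the MRO.
# Hypothetical usage of the cooperative A/B/C classes defined above.
c = C(akey="a-key", asecret="a-secret", bkey="b-key", bsecret="b-secret")
print(C.__mro__)        # (C, A, B, object): A.__init__ runs first, then B.__init__
print(c.key, c.secret)  # "b-key b-secret": both classes write self.key/self.secret, so B's values win
Note that because both base classes assign to the same attribute names, the last __init__ in the chain wins; in practice you would also want distinct attribute names, not just distinct parameter names.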
So - up to this point, it is basic Python multiple-inheritance "howto", and should be pretty straightforward. Now comes your strange stuff.
As for the auto-renaming of methods: first things first -
are you quite sure you need inheritance? Maybe having your granular classes for each external service, plus a registry-and-dispatch class that calls the methods on them by composition, would be saner. (I may come back to this later.)
Are you aware that __new__ is called for each instantiation of the class, and all class-attribute mangling you are performing there happens at each new instance of your classes?
So, if the needed method-renaming + shadowing needs to take place at class creation time, you can do that using the special method __init_subclass__, which exists since Python 3.6. It is a special class method that is called once for each derived class of the class it is defined on. So, just create a base class, from which A and B themselves will inherit, and move a properly modified version of the thing you are putting in __new__ there. If you are not using Python 3.6, this should be done in the __new__ or __init__ of a metaclass, not in the __new__ of the class itself.
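For illustration, here is a minimal sketch of that idea (my own code, with hypothetical names; the original poster derived the prefix from the module name, whereas this version uses the class name):
class RenamingBase:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Runs once per subclass: create prefixed aliases for each base's public methods.
        for base in cls.__bases__:
            if base is RenamingBase:
                continue
            prefix = base.__name__.lower()  # hypothetical choice of prefix
            for name, value in base.__dict__.items():
                if callable(value) and not name.startswith('_'):
                    setattr(cls, prefix + '_' + name, value)

class A(RenamingBase):
    def same_name_method(self):
        return "A's implementation"

class B(RenamingBase):
    def same_name_method(self):
        return "B's implementation"

class C(A, B):
    pass

c = C()
print(c.a_same_name_method())  # "A's implementation"
print(c.b_same_name_method())  # "B's implementation"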
Another approach would be to have a custom __getattribute__ method; this could be crafted to provide namespaces for the base classes. It would work only on instances, not on the classes themselves (but could, again, be made to work with a metaclass). __getattribute__ can even hide the same-named methods.
class Base:
    @classmethod
    def _get_base_modules(cls):
        result = {}
        for base in cls.__bases__:
            module_name = base.__module__.split(".")[0]
            result[module_name] = base
        return result

    def _proxy(self, module_name):
        class base:
            def __dir__(base_self):
                return dir(self._base_modules[module_name])

            def __getattr__(base_self, attr):
                original_value = self._base_modules[module_name].__dict__[attr]
                if hasattr(original_value, "__get__"):
                    original_value = original_value.__get__(self, self.__class__)
                return original_value

        base.__name__ = module_name
        return base()

    def __init_subclass__(cls):
        cls._base_modules = cls._get_base_modules()
        cls._shadowed = {
            name
            for module_class in cls._base_modules.values()
            for name in module_class.__dict__
            if not name.startswith("_")
        }

    def __getattribute__(self, attr):
        if attr.startswith("_"):
            return super().__getattribute__(attr)
        cls = self.__class__
        if attr in cls._shadowed:
            raise AttributeError(attr)
        if attr in cls._base_modules:
            return self._proxy(attr)
        return super().__getattribute__(attr)

    def __dir__(self):
        return super().__dir__() + list(self._base_modules)


class A(Base):
    ...


class B(Base):
    ...


class C(A, B):
    ...
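A hypothetical usage sketch of the Base machinery above. In real use, A and B would live in separate modules (the namespace keys come from each base's top-level module name); here the module names are faked so the demo runs in a single file:
# Demo only: fake per-module classes so this runs in one file.
class A(Base):
    def same_name_method(self):
        return "A's implementation"

class B(Base):
    def same_name_method(self):
        return "B's implementation"

A.__module__ = "service_a"   # normally set automatically by the defining module
B.__module__ = "service_b"

class C(A, B):
    ...

c = C()
print(c.service_a.same_name_method())   # "A's implementation"
print(c.service_b.same_name_method())   # "B's implementation"
# c.same_name_method would raise AttributeError: the ambiguous name is shadowed.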
As you can see, this is some fun, but it starts getting really complicated, and all the hoops needed to retrieve the actual attributes from the superclasses after adding an artificial namespace seem to indicate that your problem is not calling for inheritance after all, as I suggested above.
Since you have your small, functional, atomic classes for each "service", you could use a plain, simple, non-meta-at-all class that works as a registry for the various services; you can even enhance it to call the equivalent method on several of the services it handles with a single call:
class Services:
    def __init__(self):
        self.registry = {}

    def register(self, cls, key, secret):
        name = cls.__module__.split(".")[0]
        service = cls(key, secret)
        self.registry[name] = service

    def __getattr__(self, attr):
        if attr in self.registry:
            return self.registry[attr]
        raise AttributeError(attr)
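A hypothetical usage sketch (assuming the A and B classes from the question live in modules named service_a and service_b, since the registry key is derived from the module name):
services = Services()
services.register(A, "a-api-key", "a-api-secret")   # stored under the key "service_a"
services.register(B, "b-api-key", "b-api-secret")   # stored under the key "service_b"

services.service_a.same_name_method()   # unambiguously A's method
services.service_b.same_name_method()   # unambiguously B's method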
I have spent a lot of time researching this, but none of the answers seem to work the way I would like.
I have an abstract class with a class attribute that I want each subclass to be forced to implement:
class AbstractFoo():
    forceThis = 0
So that when I do this
class RealFoo(AbstractFoo):
    pass
it throws an error telling me it can't create the class until I implement forceThis.
How can I do that?
(I don't want the attribute to be read-only, but if that's the only solution, I'll accept it.)
For a class method, I've discovered I can do
from abc import ABCMeta, abstractmethod

class AbstractFoo(metaclass=ABCMeta):
    @classmethod
    @abstractmethod
    def forceThis():
        """This must be implemented"""
so that
class RealFoo(AbstractFoo):
    pass
at least throws the error TypeError: Can't instantiate abstract class RealFoo with abstract methods forceThis when I try to instantiate it.
(Although it doesn't force forceThis to be a class method.)
How can I get a similar error to pop up for the class attribute?
You can do this by defining your own metaclass. Something like:
class ForceMeta(type):
    required = ['foo', 'bar']

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        for prop in mcls.required:
            if not hasattr(cls, prop):
                raise NotImplementedError('must define {}'.format(prop))
        return cls
Now you can use this as the metaclass of your own classes:
class RequiredClass(metaclass=ForceMeta):
    foo = 1
which will raise the error 'must define bar'.
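For contrast (my own follow-up example, not part of the original answer), a class that defines both required attributes is created without complaint, and the check happens at class-definition time rather than at instantiation:
class CompleteClass(metaclass=ForceMeta):
    foo = 1
    bar = 2   # both required attributes are present, so no error is raised

# The NotImplementedError above fires as soon as the incomplete class body is
# executed, i.e. when the class is defined, not when an instance is created.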
I came up with a solution based on those posted earlier. (Thank you @Daniel Roseman and @martineau.)
I created a metaclass called ABCAMeta (the last 'A' stands for 'Attributes').
The class has two ways of working.
1. A class that just uses ABCAMeta as its metaclass must have a property called required_attributes, containing a list of the names of all the attributes you want to require on future subclasses of that class.
2. A class whose parent's metaclass is ABCAMeta must define all the required attributes specified by its parent class(es).
For example:
class AbstractFoo(metaclass=ABCAMeta):
    required_attributes = ['force_this']

class RealFoo(AbstractFoo):
    pass
will throw an error:
NameError: Class 'RealFoo' has not implemented the following attributes: 'force_this'
Exactly how I wanted.
from abc import ABCMeta


class NoRequirements(RuntimeError):
    def __init__(self, message):
        RuntimeError.__init__(self, message)


class ABCAMeta(ABCMeta):
    def __init__(mcls, name, bases, namespace):
        ABCMeta.__init__(mcls, name, bases, namespace)

    def __new__(mcls, name, bases, namespace):
        def get_requirements(c):
            """c is a class that should have a 'required_attributes' attribute.
            This function will get that list of required attributes or
            raise a NoRequirements error if it doesn't find one.
            """
            if hasattr(c, 'required_attributes'):
                return c.required_attributes
            else:
                raise NoRequirements(f"Class '{c.__name__}' has no 'required_attributes' property")

        cls = super().__new__(mcls, name, bases, namespace)

        # true if no parents of the class being created have ABCAMeta as their metaclass
        basic_metaclass = True
        # list of attributes the class being created must implement
        # should stay empty if basic_metaclass stays True
        reqs = []
        for parent in bases:
            parent_meta = type(parent)
            if parent_meta == ABCAMeta:
                # the class being created has a parent whose metaclass is ABCAMeta
                # the class being created must contain the requirements of the parent class
                basic_metaclass = False
                try:
                    reqs.extend(get_requirements(parent))
                except NoRequirements:
                    raise

        # will force subclasses of the created class to define
        # the attributes listed in the required_attributes attribute of the created class
        if basic_metaclass:
            get_requirements(cls)  # just want it to raise an error if it doesn't have the attributes
        else:
            missingreqs = []
            for req in reqs:
                if not hasattr(cls, req):
                    missingreqs.append(req)
            if len(missingreqs) != 0:
                raise NameError(
                    f"Class '{cls.__name__}' has not implemented the following attributes: {str(missingreqs)[1:-1]}")
        return cls
Any suggestions for improvement are welcome in the comments.
Although you can do something very similar with a metaclass, as illustrated in @Daniel Roseman's answer, it can also be done with a class decorator. A couple of advantages they have are that errors will occur when the class is defined, instead of when an instance of one is created, and the syntax for specifying them is the same in both Python 2 and 3. Some folks also find them simpler and easier to understand.
def check_reqs(cls):
    requirements = 'must_have',
    missing = [req for req in requirements if not hasattr(cls, req)]
    if missing:
        raise NotImplementedError(
            'class {} did not define required attribute{} named {}'.format(
                cls.__name__, 's' if len(missing) > 1 else '',
                ', '.join('"{}"'.format(name) for name in missing)))
    return cls


@check_reqs
class Foo(object):  # OK
    must_have = 42


@check_reqs
class Bar(object):  # raises a NotImplementedError
    pass
I'm making a Django-like ORM for my study project, because we are not allowed to use existing ORMs (if you want to use one, you have to code it yourself), and just to educate myself I thought that the same kind of ORM as in Django would be nice.
In the ORM I want to make model definitions in the same style as they are implemented in Django, i.e.:
class Person(models.Model):
    first_name = models.CharField(max_length=30)
    last_name = models.CharField(max_length=30)
Django uses metaclasses, and in my project I'm using them too, but I have a problem with the fact that metaclasses construct classes, not instances, so all attributes are class attributes and are shared between all instances.
This is a generic example of what I tried, but because of what I said earlier, it won't work:
def getmethod(attrname):
    def _getmethod(self):
        return getattr(self, "__" + attrname).get()
    return _getmethod

def setmethod(attrname):
    def _setmethod(self, value):
        return getattr(self, "__" + attrname).set(value)
    return _setmethod

class Metaclass(type):
    def __new__(cls, name, based, attrs):
        ndict = {}
        for attr in attrs:
            if isinstance(attrs[attr], Field):
                ndict['__' + attr] = attrs[attr]
                ndict[attr] = property(getmethod(attr), setmethod(attr))
        return super(Metaclass, cls).__new__(cls, name, based, ndict)

class Field:
    def __init__(self):
        self.value = 0

    def set(self, value):
        self.value = value

    def get(self):
        return self.value

class Mainclass:
    __metaclass__ = Metaclass

class Childclass(Mainclass):
    attr1 = Field()
    attr2 = Field()

a = Childclass()
print "a, should be 0:", a.attr1
a.attr1 = "test"
print "a, should be test:", a.attr1

b = Childclass()
print "b, should be 0:", b.attr1
I tried to look this up in Django's source, but it is too complicated for me to understand, and the "magic" seems to be hidden somewhere.
The question is simple: how does Django do this, in a very simplified example?
The answer is quite simple really, once you check the right code. The metaclass used by Django adds all fields to <model>._meta.fields (well, kinda), but the field attribute is removed from the actual class. The only exception to this is a RelatedField subclass, in which case an object descriptor is added (similar to a property; in fact, a property is an object descriptor, just with a native implementation).
Then, in the __init__ method of the model, the code iterates over all fields and either sets the value provided in *args or **kwargs, or sets a default value on the instance.
In your example, this means that the class Person will never have attributes named first_name and last_name, but both fields are stored in Person._meta.fields. However, an instance of Person will always have attributes named first_name and last_name, even if they are not provided as arguments.
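Here is a rough single-file sketch of that mechanism (my own simplification, not Django's actual code; names like _meta_fields are made up): the metaclass strips Field objects off the class and records them, and __init__ turns them into plain instance attributes.
class Field:
    def __init__(self, default=None):
        self.default = default
        self.name = None  # filled in by the metaclass

class ModelMeta(type):
    def __new__(mcls, name, bases, attrs):
        fields = []
        for attr_name, value in list(attrs.items()):
            if isinstance(value, Field):
                value.name = attr_name
                fields.append(value)
                del attrs[attr_name]       # the field is removed from the class itself
        cls = super().__new__(mcls, name, bases, attrs)
        cls._meta_fields = fields          # roughly what Django keeps in _meta.fields
        return cls

class Model(metaclass=ModelMeta):
    def __init__(self, **kwargs):
        for field in self._meta_fields:
            # instance attribute: either the provided value or the field's default
            setattr(self, field.name, kwargs.pop(field.name, field.default))
        if kwargs:
            raise TypeError("unexpected field(s): %s" % ", ".join(kwargs))

class Person(Model):
    first_name = Field(default="")
    last_name = Field(default="")

p = Person(first_name="Ada")
print(p.first_name, repr(p.last_name))   # Ada ''
print(hasattr(Person, "first_name"))     # False: the class itself has no field attribute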
I'm using Django and want to be able to store classes in a database for things like forms and models so that I can easily make them creatable through a user interface since they are just stored in the database as opposed to a regular file. I don't really know a whole lot about this and am not sure if this is a situation where I need to use exec in python or if there is some other way. My searches on this aren't turning up much of anything.
Basically, it would just be where I do a database call and get the contents of a class, then I want to instantiate it. Any advice is appreciated on how to best do this sort of thing.
EDIT: In response to the idea of a malicious __init__ in the class: these are only for things like forms or models, where what goes into the class is tightly controlled through validation. There would never be an __init__ in the class, and since I would validate everything server side, it would be basically impossible to put anything malicious into the class.
Do not store code in the database!!!
Imagine a class with a malicious __init__ method finding its way into your "class repository" in the database. This means whoever has write access to those database tables has the ability to read any file from your web server and even nuke its file system, since they have the ability to execute any Python code on it.
Don't store the class itself, store the import path as a string in the database (e.g. 'django.forms.CharField')
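For example, a dotted path stored as text can be resolved back into a class at runtime. A minimal sketch (load_class is a made-up helper; Django also provides import_string in django.utils.module_loading for this purpose):
from importlib import import_module

def load_class(dotted_path):
    """Resolve a dotted path like 'django.forms.CharField' into the class object."""
    module_path, _, class_name = dotted_path.rpartition('.')
    module = import_module(module_path)
    return getattr(module, class_name)

field_cls = load_class('django.forms.CharField')   # stored in the DB as plain text
field = field_cls(max_length=100)                   # instantiated only by your own code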
I started doing this same thing for another project, and saved off the code in my local repository. To address the security concerns, I was going to add an allowed-base-classes argument to the field constructor. If you do implement this, let me know, I'd love to have it.
helpers.py
# Python 2 code; ClassType is needed for the isinstance check below.
from types import ClassType

def get_class_from_concrete_classpath(class_path):
    # Unicode will throw errors in the __import__ (at least in 2.6)
    class_path = str(class_path)
    mod_list = class_path.split('.')
    module_path = '.'.join(mod_list[:-1])
    class_name = mod_list[-1]
    base_mod = __import__(module_path, fromlist=[class_name, ])
    return getattr(base_mod, class_name)

def get_concrete_name_of_class(klass):
    """Given a class return the concrete name of the class.

    klass - The reference to the class we're interested in.

    Raises a `TypeError` if klass is not a class.
    """
    if not isinstance(klass, (type, ClassType)):
        raise TypeError('The klass argument must be a class. Got type %s; %s' %
                        (type(klass), klass))
    return '%s.%s' % (klass.__module__, klass.__name__)
fields.py
# Imports assumed for this snippet (Python 2 / older Django; _ is assumed to be ugettext_lazy,
# and the helper functions are assumed to live in helpers.py as above).
from types import NoneType

from django import forms
from django.db import models
from django.utils.translation import ugettext_lazy as _

from helpers import get_class_from_concrete_classpath, get_concrete_name_of_class


class ClassFormField(forms.Field):
    def to_python(self, value):
        return get_concrete_name_of_class(value)


class ClassField(models.CharField):
    """Field used for storing a class as a string for later retrieval"""
    __metaclass__ = models.SubfieldBase

    MAX_LENGTH = 255
    default_error_messages = {
        'invalid': _(u'Enter a valid class variable.'),
    }

    def __init__(self, *args, **kwargs):
        kwargs['max_length'] = kwargs.get('max_length', ClassField.MAX_LENGTH)
        super(ClassField, self).__init__(*args, **kwargs)

    def get_prep_value(self, value):
        if isinstance(value, (basestring, NoneType)):
            return value
        return get_concrete_name_of_class(value)

    def to_python(self, value):
        if isinstance(value, basestring):
            return get_class_from_concrete_classpath(value)
        return value

    def formfield(self, **kwargs):
        defaults = {'form_class': ClassFormField}
        defaults.update(kwargs)
        return super(ClassField, self).formfield(**defaults)
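A hypothetical usage sketch of the field above (the model and its class name are mine, and this assumes the snippets are saved as helpers.py and fields.py as labelled):
# models.py (illustrative only)
from django.db import models
from fields import ClassField

class FormRegistration(models.Model):
    # Stored in the database as a dotted path such as 'django.forms.CharField',
    # but handed back to Python code as the class object itself via to_python().
    form_class = ClassField()

reg = FormRegistration(form_class='django.forms.CharField')
reg.save()
reloaded = FormRegistration.objects.get(pk=reg.pk)
form_field = reloaded.form_class(max_length=50)   # reloaded.form_class is the CharField class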
I have the following models. How do I get access to the __unicode__ of the inheriting tables (Team and Athlete) from the Entity table? I'm trying to display a list of all the Entities that shows name if it is a Team, and firstname and lastname if it is an Athlete.
class Entity(models.Model):
    entity_type_list = (('T', 'Team'), ('A', 'Athlete'))
    type = models.CharField(max_length=2, choices=entity_type_list, default='P')
    pictureurl = models.URLField('Picture Url', verify_exists=False, max_length=255, null=True, blank=True)

class Team(Entity):
    name = models.CharField(max_length=100)

    def __unicode__(self):
        return self.name

class Athlete(Entity):
    firstname = models.CharField(max_length=100)
    lastname = models.CharField(max_length=100)

    def __unicode__(self):
        return '%s %s' % (self.firstname, self.lastname)
This answer from Carl Meyer to the question mentioned earlier by Paul McMillan might be what you're looking for. A subtlety to this problem not captured in some of the answers is how to get at the derived class instances from a QuerySet on Entity.
The Problem
for entity in Entity.objects.all():
    print unicode(entity)  # Calls the Entity class unicode, which is not what you want.
A Solution
Use the InheritanceCastModel mixin in the answer linked above as a base class for Entity. You can then cast from Entity instances to the actual derived class instances. This is particularly handy when you want to use querysets on your parent class (Entity) but access the derived class instances.
class Entity(InheritanceCastModel):
    # your model definition. You can get rid of the entity_type_list and type, as the
    # real_type provided by InheritanceCastModel provides this info
    pass

class Athlete(Entity):
    # unchanged
    pass

class Team(Entity):
    # unchanged
    pass

for entity in Entity.objects.all():
    actual_entity = entity.cast()
    print unicode(actual_entity)  # actual_entity is a Team or an Athlete
From pure Python, you can use the isinstance function:
class Entity:
    def __init__(self):
        if isinstance(self, Team):
            print 'is team'
        elif isinstance(self, Athlete):
            print 'is athlete'

class Team(Entity):
    def __unicode__(self):
        return 'Team'

class Athlete(Entity):
    def __unicode__(self):
        return 'Athlete'
I am not very clear on what you want to do, but in any case you can add a criterion in the derived class instead of checking the __unicode__ method of the derived classes.
E.g. you can ask the class isTypeA? Or why don't you check the type?
Loop over all the entities... if entity.__class__.__name__ == 'Athlete' and entity.firstname and entity.lastname: blah
Hope this helps.
Edit: hmmm looks like I forgot about actually getting the combined list of both entities. Not sure I know of a slick way to do that.
I answered a similar question a while ago. Have a look, I think one of the answers probably solves your problem as well.
How do I access the child classes of an object in django without knowing the name of the child class?
My answer from there was to add this to the parent class:
def get_children(self):
    rel_objs = self._meta.get_all_related_objects()
    return [getattr(self, x.get_accessor_name()) for x in rel_objs if x.model != type(self)]
Then you can call that function to get the child objects (in your case you will only have one) and then call the __unicode__ function on that object.
This is a little ugly, but I think it should work:
entities = Entity.objects.all()
for entity in entities:
    try:
        print entity.team
    except:
        print entity.athlete
Check out http://docs.djangoproject.com/en/1.0/topics/db/models/#id7 for more on multi-table inheritance. Just be careful, because the Django ORM is inevitably a leaky abstraction and things you might normally do with objects can get you in trouble, or do unexpected things.
If I understand correctly, you are simply asking how to call the __unicode__ method of a given object.
Use unicode(instance) and depending on the type of entity, the appropriate implementation will be called polymorphically.
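For instance (my own example), as long as the object in hand really is a Team or an Athlete instance, the right __unicode__ is used:
team = Team(name="Lions")
athlete = Athlete(firstname="Mia", lastname="Hamm")
print unicode(team)      # u'Lions'      (Team.__unicode__)
print unicode(athlete)   # u'Mia Hamm'   (Athlete.__unicode__)
# Note: iterating Entity.objects.all() yields plain Entity instances, where this
# dispatch does not happen; see the casting approach above for that case.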
I don't believe you have to do anything. If Entity is never instantiated directly, you will never call the non-existent Entity.__unicode__ method. However, if you'd like to play it safe, you could add a stub method in your Entity class:
class Entity(models.Model):
    def __unicode__(self):
        pass

class Team(Entity):
    def __unicode__(self):
        return self.name

class Athlete(Entity):
    def __unicode__(self):
        return '%s %s' % (self.firstname, self.lastname)
You are now assured that any class which inherits from Entity will have a __unicode__ method, and you can simply traverse them:
for thing in [Team(), Athlete(), Team(), Athlete()]:
    print unicode(thing)