Django form, overriding __init__, variable scope - Python

In Django, I have a form that is constructed from the view and passed an extra object (person), which is popped from kwargs in the form's __init__. I want to use this object's data in the clean_* methods outside of __init__. How can I fix the scope of this passed information? Thanks!
class RegForm(forms.Form):
    first = forms.CharField(min_length=5)

    def __init__(self, *args, **kwargs):
        person = kwargs.pop("person")
        super(RegForm, self).__init__(*args, **kwargs)

    def clean_first(self):
        # NameError here: 'person' was a local variable inside __init__
        if not self.cleaned_data['first'] == person.first:
            raise forms.ValidationError(_("This information does not match records."))
        else:
            return self.cleaned_data['first']

person should be an instance variable:
def __init__(self, *args, **kwargs):
    self.person = kwargs.pop("person")
    super(RegForm, self).__init__(*args, **kwargs)
Then, in other methods, refer to it as self.person (not just person).
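Putting it together, a minimal sketch of the corrected form (assuming the class is really named RegForm, as in the question):
class RegForm(forms.Form):
    first = forms.CharField(min_length=5)

    def __init__(self, *args, **kwargs):
        # Pop the extra kwarg before calling super(), then keep it on self
        self.person = kwargs.pop("person")
        super(RegForm, self).__init__(*args, **kwargs)

    def clean_first(self):
        # self.person is now visible from every instance method
        if self.cleaned_data['first'] != self.person.first:
            raise forms.ValidationError(_("This information does not match records."))
        return self.cleaned_data['first']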

You should assign it to self:
self.person = kwargs.pop("person")
This is fairly basic Python - you would probably benefit from doing a tutorial.

Related

Python multiple class hierarchy __init__ not being executed

I'm using Django, but this is rather a generic Python question.
I have defined a class that I intend to use to extend the ModelForm and Form classes from django.forms.
The code looks like this:
class FormMixin(object):
    def __init__(self, *args, **kwargs):
        """ every method occurrence must call super """
        super(FormMixin, self).__init__(*args, **kwargs)
        self.new_attr = 'This is an attribute'

class ModelFormAdapter(forms.ModelForm):
    """ I use this class so __init__ signatures match """
    def __init__(self, *args, **kwargs):
        """ every method occurrence must call super """
        super(ModelFormAdapter, self).__init__(*args, **kwargs)

class BaseModelForm(ModelFormAdapter, FormMixin):
    def __init__(self, *args, **kwargs):
        """ BaseModelForm never gets the attribute new_attr """
        super(BaseModelForm, self).__init__(*args, **kwargs)
I have even debugged this, and FormMixin's __init__ method is never called. What am I doing wrong? What I want to achieve is to add some attributes to the form and preprocess field labels and CSS classes.
That's because one of ModelFormAdapter's ancestors (BaseForm) doesn't call super, so the cooperative chain breaks before it ever reaches FormMixin. Put FormMixin first in the parent list.
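A minimal sketch of the fix (same classes as above), with the mixin placed first so the MRO reaches it before the chain stops in Django's BaseForm:
class BaseModelForm(FormMixin, ModelFormAdapter):
    def __init__(self, *args, **kwargs):
        # MRO: BaseModelForm -> FormMixin -> ModelFormAdapter -> ModelForm -> ...
        # FormMixin.__init__ now runs (and calls super) before BaseForm ends
        # the cooperative chain, so new_attr gets set.
        super(BaseModelForm, self).__init__(*args, **kwargs)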

Django: Access and Update fields of a Form Class inside __init__ or save functions

I have a MyModelForm form class for a MyModel model class, and I want to generate a random value for a certain field.
The way I see it, this belongs either in the __init__ or save methods. I tried using self.fields['randfield'], but it throws the error 'MyModelForm' object has no attribute 'fields'.
How can I access and update a field inside the form class so that I can instantiate it with a random value?
Thanks.
EDIT: After using self.fields['randint'].initial I am getting a KeyError. The code is:
def __init__(self, instance=None, *args, **kwargs):
    _fields = ('username', 'email')
    _initial = model_to_dict(instance.user, _fields) if instance is not None else {}
    super(UserDetailsForm, self).__init__(initial=_initial, instance=instance, *args, **kwargs)
    self.fields.update(fields_for_model(User, _fields))
    self.fields['randint'].initial = '987654321'
Use something like this:
class RandomValueForm(ModelForm):
    myfield = forms.IntegerField(initial=0)

    def __init__(self, *args, **kwargs):
        super(RandomValueForm, self).__init__(*args, **kwargs)
        self.fields['myfield'].initial = my_random_generator()
You got this error because you tried to access fields on self without calling the superclass __init__ first. So you need to call the superclass __init__ (i.e. ModelForm's __init__), and then you can access fields.
class MyModelForm(ModelForm):
    def __init__(self, *args, **kwargs):
        super(MyModelForm, self).__init__(*args, **kwargs)
        self.fields['myfield'].initial = my_random_number()
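For a self-contained version, a sketch using the standard library's random module (the model and field names are taken from the question and are illustrative):
import random
from django.forms import ModelForm

class MyModelForm(ModelForm):
    class Meta:
        model = MyModel  # the question's model (assumed)
        fields = ['randfield']

    def __init__(self, *args, **kwargs):
        # Call super() first so self.fields exists
        super(MyModelForm, self).__init__(*args, **kwargs)
        # random.randint is inclusive on both ends
        self.fields['randfield'].initial = random.randint(1, 10 ** 9)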

Injecting function call after __init__ with decorator

I'm trying to find the best way to create a class decorator that does the following:
Injects a few functions into the decorated class
Forces a call to one of these functions AFTER the decorated class' __init__ is called
Currently, I'm just saving off a reference to the 'original' __init__ method and replacing it with my __init__ that calls the original and my additional function. It looks similar to this:
orig_init = cls.__init__

def new_init(self, *args, **kwargs):
    """
    'Extend' wrapped class' __init__ so we can attach to all signals
    automatically
    """
    orig_init(self, *args, **kwargs)
    self._debugSignals()

cls.__init__ = new_init
Is there a better way to 'augment' the original __init__ or inject my call somewhere else? All I really need is for my self._debugSignals() to be called sometime after the object is created. I also want it to happen automatically, which is why I thought after __init__ was a good place.
Extra misc. decorator notes
It might be worth mentioning some background on this decorator. You can find the full code here. The point of the decorator is to automatically attach to any PyQt signals and print when they are emitted. The decorator works fine when I decorate my own subclasses of QtCore.QObject, however I've been recently trying to automatically decorate all QObject children.
I'd like to have a 'debug' mode in the application where I can automatically print ALL signals just to make sure things are doing what I expect. I'm sure this will result in TONS of debug, but I'd still like to see what's happening.
The problem is my current version of the decorator is causing a segfault when replacing QtCore.QObject.__init__. I've tried to debug this, but the code is all SIP generated, which I don't have much experience with.
So, I was wondering if there was a safer, more pythonic way to inject a function call AFTER the __init__ and hopefully avoid the segfault.
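For reference, the pattern the question describes, completed as a class decorator (a sketch; the decorator name is hypothetical and _debugSignals is the asker's own helper):
def debug_all_signals(cls):
    """Class decorator: call self._debugSignals() right after __init__."""
    orig_init = cls.__init__

    def new_init(self, *args, **kwargs):
        orig_init(self, *args, **kwargs)
        self._debugSignals()

    cls.__init__ = new_init
    return cls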
Based on this post and this answer, an alternative way to do this is through a custom metaclass. This would work as follows (tested in Python 2.7):
# define a new metaclass which overrides the "__call__" function
class NewInitCaller(type):
    def __call__(cls, *args, **kwargs):
        """Called when you call MyNewClass() """
        obj = type.__call__(cls, *args, **kwargs)
        obj.new_init()
        return obj

# then create a new class with the __metaclass__ set as our custom metaclass
class MyNewClass(object):
    __metaclass__ = NewInitCaller

    def __init__(self):
        print "Init class"

    def new_init(self):
        print "New init!!"

# when you create an instance
a = MyNewClass()
>>> Init class
>>> New init!!
The basic idea is that:
when you call MyNewClass(), Python looks up the metaclass and finds that you have defined NewInitCaller;
the metaclass's __call__ function is called;
this function creates the MyNewClass instance using type;
the instance runs its own __init__ (printing "Init class");
the metaclass then calls the new_init function of the instance.
Here is the solution for Python 3.x, based on this post's accepted answer. Also see PEP 3115 for reference, I think the rationale is an interesting read.
Changes from the example above are shown with comments; the only real change is the way the metaclass is defined; all the others are trivial 2to3 modifications.
# define a new metaclass which overrides the "__call__" function
class NewInitCaller(type):
    def __call__(cls, *args, **kwargs):
        """Called when you call MyNewClass() """
        obj = type.__call__(cls, *args, **kwargs)
        obj.new_init()
        return obj

# then create a new class with the metaclass passed as an argument
class MyNewClass(object, metaclass=NewInitCaller):  # added argument
    # the Python 2 line "__metaclass__ = NewInitCaller" is removed; it would have no effect here

    def __init__(self):
        print("Init class")  # print is a function in Python 3

    def new_init(self):
        print("New init!!")

# when you create an instance
a = MyNewClass()
>>> Init class
>>> New init!!
Here's a generalized form of jake77's example which implements __post_init__ on a non-dataclass. This enables a subclass's configure() to be invoked automatically, in the correct sequence, after both the base and subclass __init__s have completed.
# define a new metaclass which overrides the "__call__" function
class PostInitCaller(type):
    def __call__(cls, *args, **kwargs):
        """Called when you call BaseClass() """
        print(f"{__class__.__name__}.__call__({args}, {kwargs})")
        obj = type.__call__(cls, *args, **kwargs)
        obj.__post_init__(*args, **kwargs)
        return obj

# then create a new class with the metaclass passed as an argument
class BaseClass(object, metaclass=PostInitCaller):
    def __init__(self, *args, **kwargs):
        print(f"{__class__.__name__}.__init__({args}, {kwargs})")
        super().__init__()

    def __post_init__(self, *args, **kwargs):
        print(f"{__class__.__name__}.__post_init__({args}, {kwargs})")
        self.configure(*args, **kwargs)

    def configure(self, *args, **kwargs):
        print(f"{__class__.__name__}.configure({args}, {kwargs})")

class SubClass(BaseClass):
    def __init__(self, *args, **kwargs):
        print(f"{__class__.__name__}.__init__({args}, {kwargs})")
        super().__init__(*args, **kwargs)

    def configure(self, *args, **kwargs):
        print(f"{__class__.__name__}.configure({args}, {kwargs})")
        super().configure(*args, **kwargs)

# when you create an instance
a = SubClass('a', b='b')
running gives:
PostInitCaller.__call__(('a',), {'b': 'b'})
SubClass.__init__(('a',), {'b': 'b'})
BaseClass.__init__(('a',), {'b': 'b'})
BaseClass.__post_init__(('a',), {'b': 'b'})
SubClass.configure(('a',), {'b': 'b'})
BaseClass.configure(('a',), {'b': 'b'})
I know that the metaclass approach is the pro way, but I have a more readable and easier proposal using @staticmethod:
class Invites(TimestampModel, db.Model):
    id = db.Column(db.Integer, primary_key=True, autoincrement=True)
    invitee_email = db.Column(db.String(128), nullable=False)

    def __init__(self, invitee_email):
        self.invitee_email = invitee_email

    @staticmethod
    def create_invitation(invitee_email):
        """
        Create an invitation,
        save it and fetch it back, because the id
        is generated in the DB
        """
        invitation = Invites(invitee_email)
        db.session.add(invitation)
        db.session.commit()
        return Invites.query.filter(
            Invites.invitee_email == invitee_email
        ).one_or_none()
So I could use it this way:
invitation = Invites.create_invitation("jim@mail.com")
print(invitation.id, invitation.invitee_email)
>>>> 1 jim@mail.com

Django 1.2: How to connect pre_save signal to class method

I am trying to define a "before_save" method in certain classes in my Django 1.2 project. I'm having trouble connecting the signal to the class method in models.py.
class MyClass(models.Model):
    ....
    def before_save(self, sender, instance, *args, **kwargs):
        self.test_field = "It worked"
I've tried putting pre_save.connect(before_save, sender='self') in 'MyClass' itself, but nothing happens.
I've also tried putting it at the bottom of the models.py file:
pre_save.connect(MyClass.before_save, sender=MyClass)
I read about connecting signals to class methods here, but can't figure out the code.
Anybody know what I'm doing wrong?
A working example with classmethod:
class MyClass(models.Model):
    # ....

    @classmethod
    def before_save(cls, sender, instance, *args, **kwargs):
        instance.test_field = "It worked"

pre_save.connect(MyClass.before_save, sender=MyClass)
There's also a great decorator to handle signal connections automatically: http://djangosnippets.org/snippets/2124/
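For reference, later Django versions (1.3+) ship a similar built-in decorator, django.dispatch.receiver; a minimal sketch (not available in 1.2):
from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=MyClass)
def before_save(sender, instance, *args, **kwargs):
    instance.test_field = "It worked"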
I know this question is old, but I was looking for an answer to this earlier today, so why not. It seems from your code that you actually wanted an instance method (given the self and the field assignment). DataGreed addressed how to do this with a class method, and using signals with instance methods is pretty similar.
class MyClass(models.Model):
    test_field = models.CharField(max_length=100)

    def __init__(self, *args, **kwargs):
        super(MyClass, self).__init__(*args, **kwargs)
        pre_save.connect(self.before_save, sender=MyClass)

    def before_save(self, sender, instance, *args, **kwargs):
        self.test_field = "It worked"
I'm not sure if this is a good idea or not, but it was helpful when I needed an instance method called on an object of class A before save from class B.
Rather than use a method on MyClass, you should just use a function. Something like:
def before_save(sender, instance, *args, **kwargs):
    instance.test_field = "It worked"

pre_save.connect(before_save, sender=MyClass)

Python : metaclass + wrapped methods + inheritance = problems

I have a problem in Python for which I cannot find any clean solution...
When calling some methods, I want to execute some code before the method execution and some after, in order (among many other things) to automatically set and clean a context variable.
To achieve this, I have declared the following metaclass:
class MyType(type):
    def __new__(cls, name, bases, attrs):
        # wraps the 'test' method to automate context management and other stuff
        attrs['test'] = cls.other_wrapper(attrs['test'])
        attrs['test'] = cls.context_wrapper(attrs['test'])
        return super(MyType, cls).__new__(cls, name, bases, attrs)

    @classmethod
    def context_wrapper(cls, operation):
        def _manage_context(self, *args, **kwargs):
            # Sets the context to 'blabla' before the execution
            self.context = 'blabla'
            returned = operation(self, *args, **kwargs)
            # Cleans the context after execution
            self.context = None
            return returned
        return _manage_context

    @classmethod
    def other_wrapper(cls, operation):
        def _wrapped(self, *args, **kwargs):
            # DO something with self and *args and **kwargs
            return operation(self, *args, **kwargs)
        return _wrapped
This works like a charm:
class Parent(object):
    __metaclass__ = MyType

    def test(self):
        # Here the context is set:
        print self.context  # prints blabla
But as soon as I subclass Parent, problems appear when I call the parent method with super:
class Child(Parent):
    def test(self):
        # Here the context is set too
        print self.context  # prints blabla
        super(Child, self).test()
        # But now the context is unset, because Parent.test is also wrapped by
        # _manage_context, so this prints 'None', which is not what was expected
        print self.context
I have thought of saving the context before setting it to a new value, but that only partially solves the problem...
Indeed (hang on, this is hard to explain): the parent method is called and the wrappers are executed, but they receive the *args and **kwargs addressed to Parent.test, while self is a Child instance, so self's attributes have irrelevant values if I want to validate them against *args and **kwargs (for example for automated validation purposes). Example:
@classmethod
def validation_wrapper(cls, operation):
    def _wrapped(self, *args, **kwargs):
        # Validate the value of a kwarg.
        # But if this is executed because we called super(Child, self).test(...),
        # `self.some_minimum` will be `Child.some_minimum`, which is irrelevant
        # considering that we called `Parent.test`
        if not kwargs['some_arg'] > self.some_minimum:
            raise ValueError('Validation failed')
        return operation(self, *args, **kwargs)
    return _wrapped
So basically, to solve this problem I see two solutions:
preventing the wrappers from being executed when the method is called with super(Child, self)
having a self that is always of the "right" type
Both solutions seem impossible to me... Does somebody have an idea on how to solve this? A suggestion?
Well, can't you just check if the context is already set in _manage_context? Like this:
def _manage_context(self, *args, **kwargs):
    # Sets the context to 'blabla' before the execution
    if self.context is None:
        self.context = 'blabla'
        returned = operation(self, *args, **kwargs)
        # Cleans the context after execution
        self.context = None
        return returned
    else:
        return operation(self, *args, **kwargs)
Also, this should probably be wrapped in a try/finally block, to ensure the context is reset in case of exceptions.
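A sketch of the same guard with the suggested try/finally, so the context is reset even if the wrapped operation raises:
def _manage_context(self, *args, **kwargs):
    if self.context is not None:
        # Already inside a context: just run the operation
        return operation(self, *args, **kwargs)
    self.context = 'blabla'
    try:
        return operation(self, *args, **kwargs)
    finally:
        # Runs even when `operation` raises
        self.context = None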
Actually I have found a way to prevent the wrappers from being executed when the method is called with super(Child, self):
class MyType(type):
    def __new__(cls, name, bases, attrs):
        # wraps the 'test' method to automate context management and other stuff
        new_class = super(MyType, cls).__new__(cls, name, bases, attrs)
        new_class.test = cls.other_wrapper(new_class.test, new_class)
        return new_class

    @classmethod
    def other_wrapper(cls, operation, new_class):
        def _wrapped(self, *args, **kwargs):
            # DO something with self and *args and **kwargs ...
            # ONLY if self is of type *new_class* !!!
            if type(self) == new_class:
                pass  # do things
            return operation(self, *args, **kwargs)
        return _wrapped
That way, when calling:
super(Child, self).a_wrapped_method
the wrapping code is bypassed!!! That's quite hackish, but it works...
Ok, first, your "solution" is really ugly, but I suppose you know that. :-) So let's try to answer your questions.
First is an implicit "question": why don't you use Python's context managers? They give you much nicer syntax and error management practically for free. See the contextlib module; it can help you greatly. Especially see the section about reentrancy.
Then you'll see that people usually have problems when trying to stack context managers. That's not surprising, since to properly support recursion you need a stack of values, not a single value. [You could look at the source of some reentrant CM, for example redirect_stdout, to see how it's handled.] So your context_wrapper should either:
(cleaner) keep a list of self.contexts, append to it when entering a context, and pop from it when exiting. That way you always get your context, as sketched below.
(more like what you want) keep a single self.context, but also a global counter DEPTH, increased by one on entering and decreased by one on exiting, with self.context reset to None when DEPTH is 0.
As for your second question, I must say I don't quite understand you. self is of the right type. If A is a subclass of B, and self is an instance of A, then it is also an instance of B. If self.some_minimum is "wrong" whether you consider self an instance of A or of B, that means some_minimum is not really an instance attribute of self, but a class attribute of A or B. Right? They can be freely different on A and on B, because A and B are different objects (of their metaclass).
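To make the first suggestion concrete, a minimal sketch (class and method names are illustrative) keeping a stack of contexts with contextlib.contextmanager:
from contextlib import contextmanager

class Contextual(object):
    def __init__(self):
        self.contexts = []  # a stack instead of a single self.context

    @contextmanager
    def managed(self, value):
        self.contexts.append(value)
        try:
            yield self
        finally:
            # Always popped, even on exceptions, so nesting works
            self.contexts.pop()

    @property
    def context(self):
        # The innermost value, or None outside any context
        return self.contexts[-1] if self.contexts else None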
