Raise exception while defining a class improperly - python

How can I write a mixin that raises an Exception if the class using it is not defined properly?
If I do these checks and balances in the __init__ or __new__ methods of the mixin, the Exception is raised only when the erroneous class tries to create an instance. That is too late; ideally the exception should be thrown as soon as the interpreter processes the wrong class definition. (Assume that deciding whether a class is acceptable or not is a trivial matter.)
To illustrate the question:
class ASampleMixin:
    """
    A sample docstring
    """
    def a_method(self):
        raise NotImplementedError

    def class_rule(self):
        if something_is_wrong:  # pseudocode for the actual check
            return False
        return True

    # more methods
class AClass(ASampleMixin, BaseClass):
    """
    This class should satisfy the condition specified in the class_rule method of the mixin
    """
    # some methods
I am currently performing the check in the __init__ method of the mixin, which raises an exception if the rule returns False. Instead, the check needs to happen at the time AClass is read by the interpreter, not when I try to create an instance of AClass.
Is this possible in a dynamically typed language like Python 3.5?

This sounds as if you want to create a custom metaclass that performs the check upon creation of the class object. See the documentation for metaclasses.

A metaclass example as reference:
class CustomType(type):
    def __call__(cls, *args, **kwargs):
        if not CustomType.some_rule(kwargs.pop('some_attr', None)):
            raise Exception('Abort! Abort!')
        return super(CustomType, cls).__call__(*args, **kwargs)

    @staticmethod
    def some_rule(var):
        if type(var) is not str:
            return False
        return True

class A(metaclass=CustomType):
    pass

class B(A):
    pass

b = B(some_attr='f')  # all is well
b = B()               # raises
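Note that __call__ on the metaclass fires when an instance is created, so the example above still reports the problem at instantiation time. To fail at class-definition time, as the question asks, the check belongs in the metaclass's __new__ or __init__, which run while the class statement itself is executed. A minimal Python 3 sketch (RuleCheckerMeta and the classmethod form of class_rule are illustrative, not from the original answer):

class RuleCheckerMeta(type):
    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        # Skip the check for the mixin itself; enforce it on subclasses.
        if bases and not cls.class_rule():
            raise TypeError('{} violates class_rule'.format(name))

class ASampleMixin(metaclass=RuleCheckerMeta):
    @classmethod
    def class_rule(cls):
        # Stand-in rule: subclasses must define a_method themselves.
        return 'a_method' in cls.__dict__

class AClass(ASampleMixin):  # checked as soon as this statement runs
    def a_method(self):
        pass

A class that violates the rule raises TypeError the moment its class statement executes, with no instance required. class_rule has to be a classmethod here, since no instance exists yet.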

Related

How to typehint that an object of a class is also adhering to a Protocol in Python?

I have a set of classes. Let's call them Foo and Bar, where both inherit from a base class Father that is defined outside of the current scope (not by me). I have defined a protocol class DummyProtocol that has a function do_something.
from typing import Protocol

class DummyProtocol(Protocol):
    def do_something(self):
        ...

class Foo(Father):
    def do_something(self):
        pass

class Bar(Father):
    def do_something(self):
        pass
I have a factory function, create_dummy_and_father_instance.
def create_dummy_and_father_instance(cls, *args, **kwargs):
    return cls(*args, **kwargs)
I want to typehint it in a way that cls accepts a class that is a subtype of Father and also implements the DummyProtocol.
So I changed the function to this, to indicate that cls is a type that inherits from both Father and DummyProtocol:
def create_dummy_and_father_instance(
    cls: Type[tuple[Father, DummyProtocol]], *args, **kwargs
):
    return cls(*args, **kwargs)
But I get this error in mypy:
Cannot instantiate type "Type[Tuple[Father, DummyProtocol]]"
I came across the same issue and found this discussion on proposed Intersection types, which seems to be exactly what is needed (e.g. see this comment).
Unfortunately this feature is not yet supported by the Python typing system, but there's a PEP in the making.
You can define a second Father class which inherits from Father and Protocol (see also mypy: how to verify a type has multiple super classes):
from typing import Protocol, Type

class DummyProtocol(Protocol):
    def do_something(self):
        ...

class Father:
    pass

class Father2(Father, DummyProtocol):
    pass

class Foo(Father2):
    def do_something(self):
        pass

class Bar(Father2):
    def do_something(self):
        pass

class FooNot(Father):
    pass

def create_dummy_and_father_instance(
    cls: Type[Father2]
):
    return cls()

create_dummy_and_father_instance(Foo)
create_dummy_and_father_instance(Bar)
create_dummy_and_father_instance(FooNot)  # mypy error ok
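As a side note, if you also want mypy to track the concrete return type of the factory, a TypeVar bound to Father2 can be layered on the same idea (a sketch using the class definitions above):

from typing import Type, TypeVar

T = TypeVar("T", bound=Father2)

def create_dummy_and_father_instance(cls: Type[T], *args, **kwargs) -> T:
    return cls(*args, **kwargs)

foo = create_dummy_and_father_instance(Foo)  # mypy infers foo: Foo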

How to make a class attribute exclusive to the super class

I have a master class for a planet:
class Planet:
    def __init__(self, name):
        self.name = name
        (...)

    def destroy(self):
        (...)
I also have a few classes that inherit from Planet and I want to make one of them unable to be destroyed (not to inherit the destroy function)
Example:
class Undestroyable(Planet):
    def __init__(self, name):
        super().__init__(name)
        (...)
    # Now it shouldn't have the destroy(self) function
So when this is run,
Undestroyable('This Planet').destroy()
it should produce an error like:
AttributeError: Undestroyable has no attribute 'destroy'
The mixin approach in other answers is nice, and probably better for most cases. But nevertheless, it spoils part of the fun: it may oblige you to maintain separate planet hierarchies, with two abstract classes as the ancestors of the "destroyable" and "non-destroyable" branches.
First approach: descriptor decorator
But Python has a powerful mechanism, called the "descriptor protocol", which is used to retrieve any attribute from a class or instance - it is even how methods are ordinarily retrieved from instances - so it is possible to customize method retrieval in a way that checks whether the method "should belong" to that class, and raises AttributeError otherwise.
The descriptor protocol mandates that whenever you try to get an attribute from an instance, Python checks whether the attribute exists in that object's class and, if so, whether the attribute itself has a method named __get__. If it has, __get__ is called (with the instance and the class where it is defined as parameters), and whatever it returns is the attribute. Python uses this to implement methods: functions in Python 3 have a __get__ method that, when called, returns another callable which, in turn, when called, inserts the self parameter into a call to the original function.
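That machinery can be seen directly (a tiny illustration, not part of the original answer):

def greet(self):
    return "hi from " + type(self).__name__

class C:
    pass

bound = greet.__get__(C(), C)  # manually invoke the descriptor protocol
print(bound())                 # prints: hi from C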
So, it is possible to create a class whose __get__ method decides whether to return the function as a bound method or not, depending on the owner class being marked as indestructible - for example, by checking a specific flag (undestroyable in the example below). This can be done with a decorator that wraps the method in this descriptor:
class Muteable:
    def __init__(self, flag_attr):
        self.flag_attr = flag_attr

    def __call__(self, func):
        """Called when the decorator is applied"""
        self.func = func
        return self

    def __get__(self, instance, owner):
        if instance and getattr(instance, self.flag_attr, False):
            raise AttributeError('Objects of type {0} have no {1} method'.format(
                instance.__class__.__name__, self.func.__name__))
        return self.func.__get__(instance, owner)

class Planet:
    def __init__(self, name=""):
        pass

    @Muteable("undestroyable")
    def destroy(self):
        print("Destroyed")

class BorgWorld(Planet):
    undestroyable = True
And on the interactive prompt:
In [110]: Planet().destroy()
Destroyed
In [111]: BorgWorld().destroy()
...
AttributeError: Objects of type BorgWorld have no destroy method
In [112]: BorgWorld().destroy
AttributeError: Objects of type BorgWorld have no destroy method
Note that, unlike simply overriding the method, this approach raises the error when the attribute is retrieved, and it even makes hasattr work:
In [113]: hasattr(BorgWorld(), "destroy")
Out[113]: False
However, it won't work if one tries to retrieve the method directly from the class instead of from an instance: in that case the instance parameter to __get__ is set to None, and we can't tell which class it was retrieved from - only the owner class, where it was declared.
In [114]: BorgWorld.destroy
Out[114]: <function __main__.Planet.destroy>
Second approach: __delattr__ on the metaclass:
While writing the above, it occurred to me that Python does have the __delattr__ special method. If the Planet class itself implemented __delattr__ and we tried to delete the destroy method on specific derived classes, it would not work: __delattr__ guards deletion of attributes on instances - and if you tried to del the "destroy" method on an instance, it would fail anyway, since the method lives on the class.
However, in Python, the class itself is an instance - of its "metaclass", which is usually type. A proper __delattr__ on the metaclass of Planet makes the "disinheritance" of the destroy method possible, by issuing a del UndestroyablePlanet.destroy after class creation.
Again, we use the descriptor protocol to have a proper "deleted method on the subclass":
class Deleted:
    def __init__(self, cls, name):
        self.cls = cls.__name__
        self.name = name

    def __get__(self, instance, owner):
        raise AttributeError("Objects of type '{0}' have no '{1}' method".format(
            self.cls, self.name))

class Deletable(type):
    def __delattr__(cls, attr):
        print("deleting from", cls)
        setattr(cls, attr, Deleted(cls, attr))

class Planet(metaclass=Deletable):
    def __init__(self, name=""):
        pass

    def destroy(self):
        print("Destroyed")

class BorgWorld(Planet):
    pass

del BorgWorld.destroy
And with this method, even trying to retrieve or check for the method's existence on the class itself will work:
In [129]: BorgWorld.destroy
...
AttributeError: Objects of type 'BorgWorld' have no 'destroy' method
In [130]: hasattr(BorgWorld, "destroy")
Out[130]: False
Third approach: a metaclass with a custom __prepare__ method
Since metaclasses allow one to customize the object that holds the class namespace, it is possible to have a namespace object that responds to a del statement within the class body by inserting a Deleted descriptor.
For the programmer using this metaclass it is almost the same thing, except that the del statement is now allowed inside the class body itself:
class Deleted:
    def __init__(self, name):
        self.name = name

    def __get__(self, instance, owner):
        raise AttributeError("No '{0}' method on class '{1}'".format(
            self.name, owner.__name__))

class Deletable(type):
    @classmethod
    def __prepare__(mcls, name, bases, **kwargs):
        class D(dict):
            def __delitem__(self, attr):
                self[attr] = Deleted(attr)
        return D()

class Planet(metaclass=Deletable):
    def destroy(self):
        print("destroyed")

class BorgPlanet(Planet):
    del destroy
(The Deleted descriptor is the correct way to mark a method as 'deleted'; with this approach, though, it cannot know the class name at class-creation time.)
Fourth approach: as a class decorator
And given the "deleted" descriptor, one could simply pass the methods to be removed to a class decorator - there is no need for a metaclass in this case:
class Deleted:
    def __init__(self, cls, name):
        self.cls = cls.__name__
        self.name = name

    def __get__(self, instance, owner):
        raise AttributeError("Objects of type '{0}' have no '{1}' method".format(
            self.cls, self.name))

def mute(*methods):
    def decorator(cls):
        for method in methods:
            setattr(cls, method, Deleted(cls, method))
        return cls
    return decorator

class Planet:
    def destroy(self):
        print("destroyed")

@mute('destroy')
class BorgPlanet(Planet):
    pass
Fifth approach: modifying the __getattribute__ mechanism
For the sake of completeness: what really makes Python reach methods and attributes on the superclass is what happens inside the __getattribute__ call. The object version of __getattribute__ is where the algorithm with the priorities for attribute retrieval ("data descriptor, instance, class, chain of base classes, ...") is encoded.
So, changing that for the class gives a single, easy point to raise a "legitimate" AttributeError, without needing the "non-existent" descriptor used in the previous approaches.
The problem is that object's __getattribute__ does not make use of type's version to search the attribute in the class - if it did, just implementing __getattribute__ on the metaclass would suffice. One has to override it on the class to block instance lookup of a method, and on the metaclass to block class lookup. A metaclass can, of course, inject the needed code:
def blocker_getattribute(target, attr, attr_base):
    try:
        muted = attr_base.__getattribute__(target, '__muted__')
    except AttributeError:
        muted = []
    if attr in muted:
        raise AttributeError("object {} has no attribute '{}'".format(target, attr))
    return attr_base.__getattribute__(target, attr)

def instance_getattribute(self, attr):
    return blocker_getattribute(self, attr, object)

class M(type):
    def __init__(cls, name, bases, namespace):
        cls.__getattribute__ = instance_getattribute

    def __getattribute__(cls, attr):
        return blocker_getattribute(cls, attr, type)

class Planet(metaclass=M):
    def destroy(self):
        print("destroyed")

class BorgPlanet(Planet):
    __muted__ = ['destroy']  # or use a decorator to set this! :-)
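The decorator hinted at in that comment could look like this minimal sketch (the mute name and behavior are illustrative):

def mute(*names):
    def decorator(cls):
        cls.__muted__ = list(names)  # consumed by blocker_getattribute above
        return cls
    return decorator

@mute('destroy')
class BorgPlanet(Planet):
    pass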
If Undestroyable is a unique (or at least unusual) case, it's probably easiest to just redefine destroy():
class Undestroyable(Planet):
    # ...
    def destroy(self):
        cls_name = self.__class__.__name__
        raise AttributeError("%s has no attribute 'destroy'" % cls_name)
From the point of view of the user of the class, this will behave as though Undestroyable.destroy() doesn't exist … unless they go poking around with hasattr(Undestroyable, 'destroy'), which is always a possibility.
If it happens more often that you want subclasses to inherit some properties and not others, the mixin approach in chepner's answer is likely to be more maintainable. You can improve it further by making Destructible an abstract base class:
from abc import abstractmethod, ABCMeta

class Destructible(metaclass=ABCMeta):
    @abstractmethod
    def destroy(self):
        pass

class BasePlanet:
    # ...
    pass

class Planet(BasePlanet, Destructible):
    def destroy(self):
        # ...
        pass

class IndestructiblePlanet(BasePlanet):
    # ...
    pass
This has the advantage that if you try to instantiate the abstract class Destructible, you'll get an error pointing you at the problem:
>>> Destructible()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class Destructible with abstract methods destroy
… similarly if you inherit from Destructible but forget to define destroy():
class InscrutablePlanet(BasePlanet, Destructible):
    pass

>>> InscrutablePlanet()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class InscrutablePlanet with abstract methods destroy
Rather than remove an attribute that is inherited, only inherit destroy in the subclasses where it is applicable, via a mix-in class. This preserves the correct "is-a" semantics of inheritance.
class Destructible(object):
    def destroy(self):
        pass

class BasePlanet(object):
    ...

class Planet(BasePlanet, Destructible):
    ...

class IndestructiblePlanet(BasePlanet):  # Does *not* inherit from Destructible
    ...
You can provide suitable definitions for destroy in any of Destructible, Planet, or any class that inherits from Planet.
Metaclasses and descriptor protocols are fun, but perhaps overkill. Sometimes, for raw functionality, you can't beat good ole' __slots__.
class Planet(object):
    def __init__(self, name):
        self.name = name

    def destroy(self):
        print("Boom! %s is toast!\n" % self.name)

class Undestroyable(Planet):
    __slots__ = ['destroy']  # an unset slot shadows the inherited method

    def __init__(self, name):
        super().__init__(name)

print()
x = Planet('Pluto')           # Small, easy to destroy
y = Undestroyable('Jupiter')  # Too big to fail
x.destroy()
y.destroy()
Boom! Pluto is toast!

Traceback (most recent call last):
  File "planets.py", line 95, in <module>
    y.destroy()
AttributeError: destroy
You cannot inherit only a portion of a class. It's all or nothing.
What you can do is put the destroy function in a second level of the hierarchy: keep the Planet class without the destroy function, and make a DestroyablePlanet class that adds the destroy function, which all the destroyable planets use.
Or you can set a flag in the constructor of the Planet class that determines whether a call to destroy will succeed, and check it inside the destroy function.
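A minimal sketch of that flag-based variant (the names are illustrative, not from the original answer):

class Planet:
    def __init__(self, name, destroyable=True):
        self.name = name
        self._destroyable = destroyable  # checked on every destroy() call

    def destroy(self):
        if not self._destroyable:
            raise AttributeError(
                "%s has no attribute 'destroy'" % type(self).__name__)
        print("Boom! %s is toast!" % self.name)

Planet('Pluto').destroy()                        # works
Planet('Jupiter', destroyable=False).destroy()   # raises AttributeError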

How to enforce method signature for child classes?

Languages like C# and Java have method overloading, which means that if a child class does not implement a method with the exact same signature, it does not override the parent method.
How do we enforce the method signature in child classes in Python? The following code sample shows that the child class overrides the parent method with a different method signature:
>>> class A(object):
...     def m(self, p=None):
...         raise NotImplementedError('Not implemented')
...
>>> class B(A):
...     def m(self, p2=None):
...         print p2
...
>>> B().m('123')
123
While this is not super important, and may even be by design in Python (e.g. *args, **kwargs), I am asking for the sake of clarity whether this is possible.
Please Note:
I have tried @abstractmethod and the ABCs already.
Below is a complete running example showing how to use a metaclass to make sure that subclass methods have the same signatures as their base classes. Note the use of the inspect module. The way I'm using it here makes sure that the signatures are exactly the same, which might not be what you want.
import inspect

class BadSignatureException(Exception):
    pass

class SignatureCheckerMeta(type):
    def __new__(cls, name, baseClasses, d):
        # For each method in d, check to see if any base class already
        # defined a method with that name. If so, make sure the
        # signatures are the same.
        for methodName in d:
            f = d[methodName]
            for baseClass in baseClasses:
                try:
                    fBase = getattr(baseClass, methodName).__func__
                    if not inspect.getargspec(f) == inspect.getargspec(fBase):
                        raise BadSignatureException(str(methodName))
                except AttributeError:
                    # This method was not defined in this base class,
                    # so just go to the next base class.
                    continue
        return type(name, baseClasses, d)

def main():
    class A(object):
        def foo(self, x):
            pass

    try:
        class B(A):
            __metaclass__ = SignatureCheckerMeta
            def foo(self):
                """This override shouldn't work because the signature is wrong"""
                pass
    except BadSignatureException:
        print("Class B can't be constructed because of a bad method signature")
        print("This is as it should be :)")

    try:
        class C(A):
            __metaclass__ = SignatureCheckerMeta
            def foo(self, x):
                """This is ok because the signature matches A.foo"""
                pass
    except BadSignatureException:
        print("Class C couldn't be constructed. Something went wrong")

if __name__ == "__main__":
    main()
Update of the accepted answer to work with python 3.5.
import inspect
from types import FunctionType

class BadSignatureException(Exception):
    pass

class SignatureCheckerMeta(type):
    def __new__(cls, name, baseClasses, d):
        # For each method in d, check to see if any base class already
        # defined a method with that name. If so, make sure the
        # signatures are the same.
        for methodName in d:
            f = d[methodName]
            if not isinstance(f, FunctionType):
                continue
            for baseClass in baseClasses:
                try:
                    fBase = getattr(baseClass, methodName)
                    if not inspect.getargspec(f) == inspect.getargspec(fBase):
                        raise BadSignatureException(str(methodName))
                except AttributeError:
                    # This method was not defined in this base class,
                    # so just go to the next base class.
                    continue
        return type(name, baseClasses, d)

def main():
    class A(object):
        def foo(self, x):
            pass

    try:
        class B(A, metaclass=SignatureCheckerMeta):
            def foo(self):
                """This override shouldn't work because the signature is wrong"""
                pass
    except BadSignatureException:
        print("Class B can't be constructed because of a bad method signature")
        print("This is as it should be :)")

    try:
        class C(A, metaclass=SignatureCheckerMeta):
            def foo(self, x):
                """This is ok because the signature matches A.foo"""
                pass
    except BadSignatureException:
        print("Class C couldn't be constructed. Something went wrong")

if __name__ == "__main__":
    main()
By design, the language doesn't support checking the signatures. For an interesting read, check out:
http://grokbase.com/t/python/python-ideas/109qtkrzsd/abc-what-about-the-method-arguments
From this thread, it does sound like you may be able to write a decorator to check the signature, with abc.same_signature(method1, method2), but I've never tried that.
The reason it is being overridden is that the two methods actually have the same signature. What is written there is akin to doing something like this in Java:
public class A
{
    public void m(String p)
    {
        throw new Exception("Not implemented");
    }
}

public class B extends A
{
    public void m(String p2)
    {
        System.out.println(p2);
    }
}
Note that even though the parameter names are different, the types are the same and thus the methods have the same signature. In statically typed languages like these, we get to explicitly say what the types are going to be ahead of time.
In Python the type of the parameter is dynamically determined at run time when you use the method. This makes it impossible for the Python interpreter to tell which method you actually wished to call when you say B().m('123'). Because neither of the method signatures specifies which type of parameter it expects, they both simply say "I'm looking for a call with one parameter". So it makes sense that the deepest (and most relevant to the actual object you have) method is called, which is class B's method, because the object is an instance of class B.
If you want to process only certain types in a child class method, and pass all others along to the parent class, it can be done like this:
class A(object):
    def m(self, p=None):
        raise NotImplementedError('Not implemented')

class B(A):
    def m(self, p2=None):
        if isinstance(p2, int):
            print p2
        else:
            super(B, self).m(p2)
Then using b gives you the desired output: class B processes ints, and passes any other type along to its parent class.
>>> b = B()
>>> b.m(2)
2
>>> b.m("hello")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 6, in m
  File "<stdin>", line 3, in m
NotImplementedError: Not implemented
I use metaclasses for other purposes in my code, so I rolled a version that uses a class decorator instead. The version below works with Python 3 and also supports decorated methods (yes, this creates a potential loophole, but if you use decorators that change the actual signature, shame on you). To make it work with Python 2, change inspect.isfunction to inspect.ismethod.
import inspect
from functools import wraps

class BadSignatureException(Exception):
    pass

def enforce_signatures(cls):
    for method_name, method in inspect.getmembers(cls, predicate=inspect.isfunction):
        if method_name == "__init__":
            continue
        for base_class in inspect.getmro(cls):
            if base_class is cls:
                continue
            try:
                base_method = getattr(base_class, method_name)
            except AttributeError:
                continue
            if not inspect.signature(method) == inspect.signature(base_method):
                raise BadSignatureException("%s.%s does not match base class %s.%s" % (
                    cls.__name__, method_name, base_class.__name__, method_name))
    return cls

if __name__ == "__main__":
    class A:
        def foo(self, x):
            pass

    def test_decorator(f):
        @wraps(f)
        def decorated_function(*args, **kwargs):
            pass
        return decorated_function

    @enforce_signatures
    class B(A):
        @test_decorator
        def foo(self):
            """This override shouldn't work because the signature is wrong"""
            pass
mypy, and I expect other static type-checkers, will complain if methods on your subclass have a different signature from the methods they override. It seems to me the best way to enforce signatures on child classes is to run mypy (or a similar checker) as part of your workflow.
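For example, given annotated methods, mypy reports an incompatible override out of the box (the exact wording of the message may vary between versions):

class A:
    def m(self, p: int) -> None: ...

class B(A):
    def m(self) -> None: ...  # error: Signature of "m" incompatible with supertype "A"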

@classmethod with Abstract Base Class

I have an Abstract Base Class and subclasses defined as follows (Python 2.7):
import abc
import MyDatabaseModule

class _DbObject(object):
    __metaclass__ = abc.ABCMeta

    def _GetObjectType(self):
        raise NotImplementedError, "Please override in the derived class"

    ObjectType = abc.abstractproperty(_GetObjectType, None)

class Entry(_DbObject):
    _objectTypeID = 'ENTRY'

    def _GetObjectType(self):
        return MyDatabaseModule.DoesSomethingWith(self._objectTypeID)

    ObjectType = property(_GetObjectType, None)
This works fine, meaning that the base class _DbObject cannot be instantiated because it has only an abstract version of the property getter method.
try:
    dbObject = _DbObject()
    print "dbObject.ObjectType: " + dbObject.ObjectType
except Exception, err:
    print 'ERROR:', str(err)
Now I can do:
entry = Entry()
print entry.ObjectType
to get access to the ObjectType property. However, what I would like to be able to do is just:
print Entry.ObjectType
However, wherever I try to insert @classmethod, I get the error classmethod object is not callable.
So, the magic behind the way property works in Python is implemented using the descriptor protocol - property itself is a powerful built-in that provides a descriptor that works well for instances, but not for classes, as you have seen.
So, you need a "class property" - the property built-in can't give you that, but the descriptor protocol can. What the descriptor protocol says is that whenever an attribute is retrieved from a class, if it is an object with a __get__ method, that method is called with (self, instance, owner) - and if it is retrieved from the class, instead of from an instance, the instance parameter is set to None.
BTW, as stated by @Constantinius, this does not really have anything to do with ABCs at all, just with you wanting a "class property".
class classproperty(object):
    def __init__(self, func):
        self.func = func

    def __get__(self, instance, owner):
        return self.func(owner)

class Entry(_DbObject):
    _objectTypeID = 'ENTRY'

    def _GetObjectType(cls):
        return MyDatabaseModule.DoesSomethingWith(cls._objectTypeID)

    ObjectType = classproperty(_GetObjectType)
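For a runnable demonstration without the database dependency, here is the same idea with MyDatabaseModule stubbed out (the stub is illustrative only):

class classproperty(object):
    def __init__(self, func):
        self.func = func

    def __get__(self, instance, owner):
        return self.func(owner)

class Entry(object):
    _objectTypeID = 'ENTRY'

    @classproperty
    def ObjectType(cls):
        return 'type:' + cls._objectTypeID  # stand-in for the database call

print(Entry.ObjectType)  # prints: type:ENTRY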
The problem is not your ABC but the simple fact that there is no such thing as a classproperty in Python - you have to create it on your own. Actually, there is a good question + answer on SO about that. There should be no problem using it with your ABC as well.

How to implement a Required Property in Python

If I have a class such as the one below (only with many more properties), is there a clean way to note which fields are required before calling a particular method?
class Example():
    def __init__(self):
        pass

    @property
    def prop1(self):
        """Have to use property methods to have docstrings..."""
        return self._prop1

    @prop1.setter
    def prop1(self, value):
        # validation logic...
        self._prop1 = value

    def method(self):
        # check all required properties have been added
        ...
I could write an array of all required properties by hand and loop through it in a method, but I was wondering if there is a cleaner way, for example by implementing a @requiredProperty descriptor.
The class is used to generate a POST request for a web API. The request has 25+ parameters, some of which are required and some optional.
Rather than having the method that issues the request loop through an array such as:
required_props = ['prop1','prop2',....]
I was hoping there was a way in Python of adding a required decorator to properties so I wouldn't have to keep track by hand. E.g.:

@property
@required
def prop1(self):
    return self._prop1
Would it not be best to make sure that all the attributes are supplied when an object is initialised? Then all your properties will be defined when you try to access them.
For example,
class Example(object):
    def __init__(self, prop1, prop2):
        self.prop1 = prop1
        self.prop2 = prop2
Also, note from PEP 8:
For simple public data attributes, it is best to expose just the attribute name, without complicated accessor/mutator methods.
So why use properties?
This should work the same way as in any OO language: a required property must be set at construction time. Calling the object's methods must never leave the object in a "bad" state, so that any method can be called on any constructed object.
If the above doesn't hold true, you should think about refactoring your code.
Of course it is always possible to alter a python object to not be valid anymore by poking around in its guts. You don't do that unless you have a good reason. Don't bother checking for this, as your program should just blow up in your face whenever you do something stupid so you learn and stop.
It's hard to tell from your example what problem you are actually trying to solve, but I'm not convinced properties are the answer.
If you just want to check that an instance variable exists, you could use the special attribute __dict__, thus:
% cat ./test.py
#!/usr/bin/env python

class Example():
    def __init__(self):
        self.foo = None

    def method(self):
        assert 'foo' in self.__dict__
        assert 'bar' in self.__dict__

Example().method()

% ./test.py
Traceback (most recent call last):
  File "./test.py", line 12, in <module>
    Example().method()
  File "./test.py", line 10, in method
    assert 'bar' in self.__dict__
AssertionError
But remember... EAFP: Easier to ask for forgiveness than permission.
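In that spirit, a minimal EAFP sketch for the question's use case (names are illustrative): just read the attributes when building the request and translate the failure into a useful error.

class Example(object):
    def method(self):
        try:
            payload = {'prop1': self._prop1}  # just use it and see
        except AttributeError:
            raise ValueError("prop1 must be set before calling method()")
        return payload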
As others have suggested, I suspect you are over-engineering. However, you could use a decorator to define 'required' attributes. Something along the lines of:
import functools

class MissingAttributeError(Exception):
    pass

def requires(*required_attrs):
    def wrapper(method):
        @functools.wraps(method)
        def inner_wrapper(self, *args, **kargs):
            if not all(hasattr(self, attr) for attr in required_attrs):
                raise MissingAttributeError()
            return method(self, *args, **kargs)
        return inner_wrapper
    return wrapper

class Test(object):
    def __init__(self, spam, eggs):
        self.spam = spam
        self.eggs = eggs

    @requires('spam', 'eggs', 'ham')
    def something(self):
        return 'Done'

t = Test('fu', 'bar')
t.something()  ## fails
t.ham = 'nicer than spam'
t.something()  ## succeeds
Although defining attribute dependencies this way has a certain neatness to it, I'm not sure I recommend it.
