Different setters for Python properties

I have a class
class Animal:
    def __init__(self, name='', num_of_owners=0, sleep=0):
        self.name = name
        self.num_of_owners = int(float(num_of_owners))
        self.sleep = float(sleep)
Let's say I'm reading all the properties from some file.
I'm using properties for the getters and setters.
    @property
    def name(self):
        return self._name

    @name.setter
    def name(self, value):
        self._name = value
Now, when reading from the file, I don't want to look up every property in the dictionary I got individually.
So I can run a for loop over the dictionary and write
for name, value in animal_props.iteritems():
    setattr(animal, name, value)
but then all the properties are set as strings.
The thing is, I have about 8 properties: some floats, some ints, some strings.
Is there any way to run this for loop without writing regular setters, so that the specific setter still runs for each property?
Example:
class Animal:
    def __init__(self, name='', num_of_owners=0, sleep=0):
        self._name = name
        self._num_of_owners = int(float(num_of_owners))
        self._sleep = float(sleep)

    @property
    def name(self):
        return self._name

    @name.setter
    def name(self, value):
        self._name = value

    @property
    def num_of_owners(self):
        return self._num_of_owners

    @num_of_owners.setter
    def num_of_owners(self, value):
        self._num_of_owners = int(value)

    @property
    def sleep(self):
        return self._sleep

    @sleep.setter
    def sleep(self, value):
        self._sleep = int(float(value))
d = {'name': 'doggy', 'num_of_owners': '3', 'sleep': '5.643'}
dog = Animal()
for name, value in d.iteritems():
    setattr(dog, name, value)
print type(dog.sleep)
I need the type at the end to be float, since I will later use it as a float.
Creating separate ifs and dispatching to each setter would be fine, but is there any way to do it with just that one for loop?

You are using Python 2 with old-style classes. Properties only work correctly with new-style classes:
class Animal(object):
    ...

If you're really using Python 2 (as your tag suggests), you need to change your class declaration so that you inherit from object. Doing so will make your class a "new-style" class rather than an "old-style" class (which were the only kind of class back in the Python 2.1 days). If you don't know much about these two kinds of class, don't worry about learning about the old ones. Just always create new-style classes (they're the only kind in Python 3).
To make a new-style class, inherit from object:
class Animal(object):
    # ...
Note that if you're not doing type conversions or other kinds of validation in your property getter or setter methods (as with name in your example code), you might as well get rid of the whole property and just use a regular attribute instead. If you find you do need validation later on in your program's design, you can switch back to using a property at that point and the other parts of the code that read or write from the attribute won't need to change.
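For instance, here is a minimal sketch of the fixed setup (assuming Python 2, as in the question, and using float for the sleep conversion since that is the type the asker says they need at the end): once the class inherits from object, setattr in the loop goes through the property setter, which performs the conversion.
class Animal(object):
    def __init__(self, sleep=0):
        self._sleep = float(sleep)

    @property
    def sleep(self):
        return self._sleep

    @sleep.setter
    def sleep(self, value):
        # the conversion happens here, however the value arrives
        self._sleep = float(value)

dog = Animal()
for name, value in {'sleep': '5.643'}.iteritems():
    setattr(dog, name, value)  # goes through the property setter
print type(dog.sleep)  # <type 'float'>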

Why use the @property decorator when I can use a normal method getter? [duplicate]

class Employee:
    def __init__(self, name):
        self.name = name

    def getName(self):
        return self.name

    @property
    def getNameAgain(self):
        return self.name

person = Employee("John")
print(person.getName())  # John <-- (calling method)
print(person.name)       # John <-- (using @property)
So I get the same result from using getName(), so what do we use the @property decorator in getNameAgain() for? May I know which is better/advisable to use? Thank you in advance!
The key difference is in how the access is perceived by users of the property or method.
They certainly look different:
class Foo:
    def __init__(self, name):
        self._name = name

    def get_name(self):
        return self._name

    @property
    def name(self):
        return self._name

f = Foo('bar')
print(f.name)
print(f.get_name())
A method is called with parameters (or empty parentheses if it has none), a property is not.
Also, with a property there can be the expectation of being able to assign values to it, whereas with methods, you'd be looking for a separate setter:
    @name.setter
    def name(self, value):
        self._name = value

    def set_name(self, value):
        self._name = value
f.name = 'baz'
f.set_name('baz')
Again, with the property the syntax is unambiguous, with the setter, you might wonder about additional parameters.
Besides, if you only need a property, why use separate setters and getters and type more parentheses? After all, this is Python, not Java.
So when would you use separate setter and getter methods and not a property? When those aspects are what you want, i.e. you need additional arguments when setting the value, or when retrieving the value, e.g. an encoding.
By the way: I renamed your methods and property to be more in line with standard Python naming conventions. Since the main reason to use properties in the first place is aligning with user expectations (users of your library's code, not of the software, of course), it makes sense to follow conventions when naming things: get_name instead of getName, and ._name for the hidden value instead of .name, which would signal that users are OK to manipulate the value directly (which is fine, if that's expected).
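As a quick illustration of why the property syntax pays off, here is a hedged sketch (the Temperature class and its validation rule are made up for the example): the setter can validate or convert without callers ever changing their plain-assignment syntax.
class Temperature:
    def __init__(self, celsius=0.0):
        self._celsius = float(celsius)

    @property
    def celsius(self):
        return self._celsius

    @celsius.setter
    def celsius(self, value):
        value = float(value)
        if value < -273.15:  # validate without changing the caller's syntax
            raise ValueError('below absolute zero')
        self._celsius = value

t = Temperature()
t.celsius = 21.5   # plain assignment; the setter runs behind the scenes
print(t.celsius)   # 21.5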

How to have an object call a method when another object's attribute changes?

I want to set up a system where a method of one object is called whenever an attribute of a different object is modified, with the new value as an argument to the method. How would I accomplish something like that?
Edit: Code sample/clarification:
My code looks something like this:
class SharedAttribute(object):  # I use this as a container class to share attributes between multiple objects
    def __init__(self, value):
        self.value = value

class BaseObject(object):
    def __init__(self, **shared_attributes):
        for (key, value) in shared_attributes.items():
            if isinstance(value, SharedAttribute):
                setattr(self, key, value)
            else:
                setattr(self, key, SharedAttribute(value))

class MyObject(BaseObject):
    def __init__(self, value1=None, value2=None):
        BaseObject.__init__(self, value1=value1, value2=value2)

    def on_value_changed(self, new_value):
        # Do something here
        pass

    def incrementValue(self):
        self.value1.value += 1
And then it would be implemented like:
a = MyObject(value1=1)
b = MyObject(value1=a.value1)  # a and b store references to the same attribute, so when one changes it, it's changed for the other one too
I want to set it up so if I were to call a.incrementValue(), b.on_value_changed would be called, with the SharedAttribute's new value as an argument. Is there a good way to do that?
One way to do it is to use a manager-style attribute, similar to how Django uses ModelManagers (Model.objects, for example):
class Foo(object):
    def method_to_call(self, new_val):
        print('Hello from manager. Doing stuff with {}'.format(new_val))

class Bar(object):
    manager = Foo()

    def __init__(self, your_attr):
        self.your_attr = your_attr

    def __setattr__(self, key, val):
        if key == 'your_attr':
            self.manager.method_to_call(val)
        super(Bar, self).__setattr__(key, val)
And to use it:
>>> bar = Bar('some value')
Hello from manager. Doing stuff with some value
>>> bar.your_attr = 'A new value'
Hello from manager. Doing stuff with A new value
If you want to use a specific instance of Foo to manage a specific instance of Bar, you can modify __init__() to take an instance of Foo and use that. Since we are making a super() call here, it's important to keep track of your inheritance. I don't know how your app is set up, but if you are using multiple inheritance things can get weird.
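A minimal sketch of that per-instance variant, reusing the Foo manager class from above (the default_manager fallback is my own assumption for illustration):
class Bar(object):
    default_manager = Foo()

    def __init__(self, your_attr, manager=None):
        # each instance may get its own manager; fall back to the shared one
        self.manager = manager if manager is not None else self.default_manager
        self.your_attr = your_attr

    def __setattr__(self, key, val):
        if key == 'your_attr':
            self.manager.method_to_call(val)
        super(Bar, self).__setattr__(key, val)

bar = Bar('some value', manager=Foo())  # this instance notifies its own Foo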
If you're being specific about which attributes are being updated and you have special knowledge of those attributes -- in other words, you don't need to know whenever any attribute of some object changes -- then you probably just want some kind of observer pattern by way of properties.
class A(object):
    def __init__(self, x):
        self._x = x

    @property
    def x(self):
        return self._x

    @x.setter
    def x(self, value):
        self._x = value
        whatever_else_you_wanted_to_happen()
Otherwise if you're trying to get generic (which you probably don't want) then you end up wrapping a class attribute instead of an instance attribute, which has its own pros and cons.
class B(object):
    pass

def wrapper(objmethod):
    def _inner(*args, **kwargs):
        some_other_thing_you_wanted_to_happen()
        return objmethod(*args, **kwargs)
    return _inner

B.__setattr__ = wrapper(B.__setattr__)
That's really sort of JavaScript-y prototype-y though, and so people that like Python looking like Python will probably get their cooking spoons out and whack you with them.
I think ultimately if you want to do this you'll want to just stick to decorators. The attribute accessors aren't exposed in a way that you can just 'hook in' without writing a getter/setter so you'll almost certainly want properties and decorators anyway.
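To tie this back to the question, here is a minimal hedged sketch (the callbacks list on SharedAttribute is my own addition, not part of the original code): interested objects register callables, and a property setter on the shared value notifies them on every change.
class SharedAttribute(object):
    def __init__(self, value):
        self._value = value
        self.callbacks = []  # subscribers interested in changes

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        self._value = new_value
        for callback in self.callbacks:
            callback(new_value)

def on_changed(new_value):
    print('changed to {}'.format(new_value))

shared = SharedAttribute(1)
shared.callbacks.append(on_changed)
shared.value += 1  # prints: changed to 2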

How to let inheritance take precedence over class properties when using a metaclass?

I have posted a similar question before, but I interpreted my problem wrong, so I flagged it for deletion and came forth with the new and correct problem.
My general goal is the following: I want to have a property on a class, so I implement it on a metaclass using a property, as suggested in my question Implementing a class property that preserves the docstring. Additionally, I want users to be able to subclass the base class and override this property with static values. The thing here is that if the user does not provide an attribute, I want to calculate a sensible default, and since all configuration is done at the class level, I need a property at the class level.
A basic example will show this:
class Meta(type):
    @property
    def test(self):
        return "Meta"

class Test(object):
    __metaclass__ = Meta

class TestSub(Test):
    test = "TestSub"

class TestSubWithout(Test):
    pass

print(TestSub.test, TestSubWithout.test)
Here is what it prints:
('Meta', 'Meta')
And what I want it to print:
('TestSub', 'Meta')
Essentially, on TestSub the user overrides the test attribute himself. Now, TestSub is the correct output, since the user provided it. In the case of TestSubWithout, the user instead did not provide the attribute and so the default should be calculated (of course, the real calculation has more than just a static return, this is just to show it).
I know what happens here: First the attribute test is assigned to TestSub and then the metaclass overrides it. But I don't know how to prevent it in a clean and pythonic way.
A property is a "data descriptor", which means that it takes precedence in the attribute search path over values stored in an instance dictionary (or for a class, the instance dictionaries of the other classes in its MRO).
Instead, write your own non-data descriptor that works like property:
class NonDataProperty(object):
    def __init__(self, fget):
        self.fget = fget

    def __get__(self, obj, type):
        if obj:
            return self.fget(obj)
        else:
            return self

    # don't define a __set__ method!
Here's a test of it:
class MyMeta(type):
    @NonDataProperty
    def x(self):
        return "foo"

class Foo(metaclass=MyMeta):
    pass  # does not override x

class Bar(metaclass=MyMeta):
    x = "bar"  # does override x

class Baz:
    x = "baz"

class Baz_with_meta(Baz, metaclass=MyMeta):
    pass  # inherited x value will take precedence over non-data descriptor

print(Foo.x)  # prints "foo"
print(Bar.x)  # prints "bar"
print(Baz_with_meta.x)  # prints "baz"
The cleanest way I could come up with is creating a subclass of property that handles this case:
class meta_property(property):
    def __init__(self, fget, fset=None, fdel=None, doc=None):
        self.key = fget.__name__
        super(meta_property, self).__init__(fget, fset, fdel, doc)

    def __get__(self, obj, type_):
        if self.key in obj.__dict__:
            return obj.__dict__[self.key]
        else:
            return super(meta_property, self).__get__(obj, type_)
This handles the case by storing the name of the function and returning the overridden value if it is present. This seems like an okayish solution but I am open to more advanced suggestions.
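For instance, applied to the question's example (a sketch assuming the Python 2 metaclass syntax used in the question):
class Meta(type):
    @meta_property
    def test(cls):
        return "Meta"  # stands in for the real default calculation

class Test(object):
    __metaclass__ = Meta

class TestSub(Test):
    test = "TestSub"  # found in the class __dict__, so it wins

class TestSubWithout(Test):
    pass  # falls back to the computed default

print(TestSub.test, TestSubWithout.test)  # ('TestSub', 'Meta')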

Detect if super() has a function with decorator?

I'm defining several classes intended to be used for multiple inheritance, e.g.:
class A:
    def __init__(self, bacon=None, **kwargs):
        self.bacon = bacon
        if bacon is None:
            self.bacon = 100
        super().__init__(**kwargs)

class Bacon(A):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)

class Eggs(A):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)

class Spam(Eggs, Bacon):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)
However, I have multiple classes (e.g. possibly Bacon, A, and Spam, but not Eggs) that care about when their property bacon is changed. They don't need to modify the value, only to know what the new value is, like an event. Because of the Multiple Inheritance nature I have set up, this would mean having to notify the super class about the change (if it cares).
I know that it might be possible if I pass the class name to the method decorator, or if I use a class decorator. I don't want to have all the direct self-class referencing, having to create lots of decorators above each class, or forcing the methods to have the same name, as none of these sound very Pythonic.
I was hoping to get syntax that looks something like this:
@on_change(bacon)
def on_bacon_change(self, bacon):
    # read from old/new bacon
    make_eggs(how_much=bacon)
I don't care about the previous value of bacon, so that bacon argument isn't necessary, if this is called after bacon is set.
Is it possible to check if a superclass has a method with this decorator?
If this isn't feasible, are there alternatives for passing events like this up through the multiple-inheritance chain?
EDIT:
The actual calling of the function in Spam would be done in A, by using a @property and @bacon.setter, as that would be the upper-most class that initializes bacon. Once it knows what function to call on self, the problem only lies in propagating the call up the MI chain.
EDIT 2:
If I override the attribute with a @bacon.setter, would it be possible to determine whether the super() class has a setter for bacon?
What you are asking for would probably fit nicely into a more complete framework of signals and so on; it might even invite Aspect Oriented Programming.
Without going deep into it, however, a metaclass and a decorator can do just what you are asking for. I came up with these; I hope they work for you.
If you'd like to evolve this into something robust and usable, write me; if nothing like this exists out there, it would be worth keeping a utility package on PyPI for this.
def setattr_wrapper(cls):
    def watcher_setattr(self, attr, val):
        super(cls, self).__setattr__(attr, val)
        watched = cls.__dict__["_watched_attrs"]
        if attr in watched:
            for method in watched[attr]:
                getattr(self, method)(attr, val)
    return watcher_setattr

class AttrNotifier(type):
    def __new__(metacls, name, bases, dct):
        dct["_watched_attrs"] = {}
        for key, value in dct.items():
            if hasattr(value, "_watched_attrs"):
                for attr in getattr(value, "_watched_attrs"):
                    if not attr in dct["_watched_attrs"]:
                        dct["_watched_attrs"][attr] = set()
                    dct["_watched_attrs"][attr].add(key)
        cls = type.__new__(metacls, name, bases, dct)
        cls.__setattr__ = setattr_wrapper(cls)
        return cls

def on_change(*args):
    def decorator(meth):
        our_args = args
        # ensure that this decorator is stackable
        if hasattr(meth, "_watched_attrs"):
            our_args = getattr(meth, "_watched_attrs") + our_args
        setattr(meth, "_watched_attrs", our_args)
        return meth
    return decorator
# from here on, example of use:
class A(metaclass=AttrNotifier):
    @on_change("bacon")
    def bacon_changed(self, attr, val):
        print("%s changed in %s to %s" % (attr, self.__class__.__name__, val))

class Spam(A):
    @on_change("bacon", "pepper")
    def changed(self, attr, val):
        print("%s changed in %s to %s" % (attr, self.__class__.__name__, val))

a = A()
a.bacon = 5
b = Spam()
b.pepper = 10
b.bacon = 20
(tested in Python 3.2 and Python 2.6, changing the declaration of the A class to the Python 2 metaclass syntax)
edit - some words on what is being done
Here is what happens:
The metaclass picks up all methods marked with the on_change decorator and registers them in a dictionary on the class; this dictionary is named _watched_attrs and can be accessed as a normal class attribute.
The other thing the metaclass does is override the __setattr__ method for the class once it is created. This new __setattr__ just sets the attribute, and then checks the _watched_attrs dictionary to see whether any methods on that class are registered to be called when the attribute that was just set has been modified; if so, it calls them.
The extra indirection level around watcher_setattr (which is the function that becomes each class's __setattr__) is there so that you can register different attributes to be watched on each class in the inheritance chain; all the classes have independently accessible _watched_attrs dictionaries. If it were not for this, only the _watched_attrs of the most specialized class in the inheritance chain would be respected.
You are looking for Python properties:
http://docs.python.org/library/functions.html#property
Searching Google for "override superclass property setter" turned up this StackOverflow question:
Overriding inherited properties’ getters and setters in Python

python: subclass a metaclass

For putting methods of various classes into a global registry I'm using a decorator with a metaclass. The decorator tags the methods, the metaclass puts the functions in the registry:
class ExposedMethod(object):
    def __init__(self, decoratedFunction):
        self._decoratedFunction = decoratedFunction

    def __call__(__self, *__args, **__kw):
        return __self._decoratedFunction(*__args, **__kw)

class ExposedMethodDecoratorMetaclass(type):
    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.iteritems():
            if isinstance(obj, ExposedMethod):
                WorkerFunctionRegistry.addWorkerToWorkerFunction(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class MyClass(object):
    __metaclass__ = ExposedMethodDecoratorMetaclass

    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
I've now come to the point where two function registries are needed. The first thought was to subclass the metaclass and put the other registry in. For that, the __new__ method simply has to be rewritten.
Since rewriting means redundant code, this is not what I really want. So it would be nice if anyone could name a way to put an attribute inside the metaclass which can be read when __new__ is executed. With that, the right registry could be put in without having to rewrite __new__.
Your ExposedMethod instances do not behave as normal instance methods but rather like static methods -- the fact that you're giving one of them a self argument hints that you're not aware of that. You may need to add a __get__ method to the ExposedMethod class to make it a descriptor, just like function objects are -- see here for more on descriptors.
But there is a much simpler way, since functions can have attributes...:
def ExposedMethod(registry=None):
    def decorate(f):
        f.registry = registry
        return f
    return decorate
and in a class decorator (simpler than a metaclass! requires Python 2.6 or better -- in 2.5 or earlier you'll need to stick w/the metaclass or explicitly call this after the class statement, though the first part of the answer and the functionality of the code below are still perfectly fine):
def RegisterExposedMethods(cls):
    for name, f in vars(cls).iteritems():
        if not hasattr(f, 'registry'):
            continue
        registry = f.registry
        if registry is None:
            registry = cls.registry
        registry.register(name, cls.__name__)
    return cls
So you can do:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod(WorkerFunctionRegistry)
    def myCoolExposedMethod(self):
        pass
and the like. This is easily extended to allowing an exposed method to have several registries, get the default registry elsewhere than from the class (it could be in the class decorator, for example, if that works better for you) and avoids getting enmeshed with metaclasses without losing any functionality. Indeed that's exactly why class decorators were introduced in Python 2.6: they can take the place of 90% or so of practical uses of metaclasses and are much simpler than custom metaclasses.
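As an illustration of that extension, here is a hedged sketch of methods feeding several registries at once (SimpleRegistry and its register method are made-up stand-ins for the question's registry classes):
class SimpleRegistry(object):
    # made-up stand-in for WorkerFunctionRegistry and friends
    def __init__(self):
        self.entries = []

    def register(self, method_name, class_name):
        self.entries.append((class_name, method_name))

worker_registry = SimpleRegistry()
retired_registry = SimpleRegistry()

def ExposedMethod(*registries):
    def decorate(f):
        f.registries = registries  # a method may belong to several registries
        return f
    return decorate

def RegisterExposedMethods(cls):
    for name, f in vars(cls).items():
        for registry in getattr(f, 'registries', ()):
            registry.register(name, cls.__name__)
    return cls

@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod(worker_registry, retired_registry)
    def my_cool_exposed_method(self):
        pass

print(worker_registry.entries)  # [('MyClass', 'my_cool_exposed_method')]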
You can use a class attribute to point to the registry you want to use in the specialized metaclasses, e.g.:
class ExposedMethodDecoratorMetaclassBase(type):
    registry = None

    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.items():
            if isinstance(obj, ExposedMethod):
                mcs.registry.register(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class WorkerExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = WorkerFunctionRegistry

class RetiredExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = RetiredFunctionRegistry
Thank you both for your answers. Both helped a lot in finding a proper way to do what I wanted.
My final solution to the problem is the following:
def ExposedMethod(decoratedFunction):
    decoratedFunction.isExposed = True
    return decoratedFunction

class RegisterExposedMethods(object):
    def __init__(self, decoratedClass, registry):
        self._decoratedClass = decoratedClass
        for name, f in vars(self._decoratedClass).iteritems():
            if hasattr(f, "isExposed"):
                registry.addComponentClassToComponentFunction(name, self._decoratedClass.__name__)
        # cloak us as the original class
        self.__class__.__name__ = decoratedClass.__name__

    def __call__(self, *__args, **__kw):
        return self._decoratedClass(*__args, **__kw)

    def __getattr__(self, name):
        return getattr(self._decoratedClass, name)
On a class I wish to expose methods from, I do the following:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
The class decorator is now very easy to subclass. Here is an example:
class DiscoveryRegisterExposedMethods(RegisterExposedMethods):
    def __init__(self, decoratedClass):
        RegisterExposedMethods.__init__(self,
                                        decoratedClass,
                                        DiscoveryFunctionRegistry())
With that, Alex's comment
Your ExposedMethod instances do not behave as normal instance methods ...
is no longer true, since the method is simply tagged and not wrapped.
