Using a property setter prevents a function from binding to an object attribute - Python

I am attempting to use a property to validate values passed to the class constructor (screening for bad values at instance creation time) in the code below. For a reason I don't understand, the function set_sequence won't alter the derived Binary object's attribute seq when called from the @num.setter method; it has to be called again elsewhere. Am I missing something basic about how properties work?
class Number:
    def __init__(self, num):
        self.num = num

class Binary(Number):
    def __init__(self, num):
        super().__init__(num)
        self.seq = ()

    @property
    def num(self):
        return self._num

    @num.setter
    def num(self, value):
        if value < 0:
            raise ValueError('Unsigned numbers cannot be negative')
        self._num = value
        self.set_sequence()  # this calls the function, but the value doesn't get bound

    def set_sequence(self):
        print('called ', end='')
        self.seq = tuple([int(x) for x in bin(self.num)[2:]])
Calling this code as follows:
if __name__ == "__main__":
    n1 = Binary(11)
    print(f"{n1.num} = {n1.seq}")
    n1.set_sequence()
    print(f"{n1.num} = {n1.seq}")
Gives:
called 11 = ()
called 11 = (1, 0, 1, 1)
This throws an exception as expected when a negative value is passed to the constructor, but I don't understand why the function call doesn't behave as expected. Incidentally, this pattern is based on Item 29 of Brett Slatkin's 'Effective Python', the part about using @property to do type checking and value validation when the constructor is called.

In your constructor, super().__init__(num) triggers your @num.setter (which calls set_sequence and assigns self.seq), and the next line, self.seq = (), then overwrites the value the setter just stored.

To get the desired output, you can do it like this. In your example, the seq built by self.set_sequence() is overwritten by the second statement in the constructor.
class Number:
    def __init__(self, num):
        self.num = num

class Binary(Number):
    seq = ()

    def __init__(self, num):
        # or, alternatively, assign self.seq = () here, before calling super()
        super().__init__(num)

    @property
    def num(self):
        return self._num

    @num.setter
    def num(self, value):
        if value < 0:
            raise ValueError('Unsigned numbers cannot be negative')
        self._num = value
        self.set_sequence()  # the value bound here is no longer overwritten

    def set_sequence(self):
        print('called ', end='')
        self.seq = tuple([int(x) for x in bin(self.num)[2:]])

if __name__ == "__main__":
    n1 = Binary(11)
    print(f"{n1.num} = {n1.seq}")
    n1.set_sequence()
    print(f"{n1.num} = {n1.seq}")
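A simpler variant of the same idea (my own suggestion, not part of the answer above, so treat it as a sketch): keep seq as an instance attribute but assign the default before calling super().__init__(num). The setter fires during super().__init__, so it writes seq last; the property and set_sequence definitions stay exactly as in the question.

class Binary(Number):
    def __init__(self, num):
        self.seq = ()            # default first...
        super().__init__(num)    # ...then the setter fills seq via set_sequence()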

Python class does not have value

I am a JavaScript engineer switching into a JS/Python role, and I'm working through some easy LeetCode problems to get some quick Python practice.
I'm trying to create a linked list here, and perhaps I am coming at it from a JS mindset?
Error:
AttributeError: type object 'LinkedListNode' has no attribute 'value'
utils.py
# LinkedList Methods
def createLinkedList(arr):
    head = createLinkedListNode(None, arr.pop(0))

    def populateList(arr, prevNode):
        if arr:
            node = createLinkedListNode(None, arr.pop(0))
            prevNode.next = node
            if arr:
                populateList(arr, node)

    populateList(arr, head)
    return head

def createLinkedListNode(next, value):
    class LinkedListNode:
        def __init__(self):
            self.next = next
            self.value = value
    return LinkedListNode
deleteNode.py
from python.utils import createLinkedList, linkedListToArray
useCase1 = [4, 5, 1, 9]
linkedList = createLinkedList(useCase1)
^ linkedList.value doesn't exist?
Some misunderstandings with Python classes:
The class LinkedListNode should not be defined inside a function.
return LinkedListNode actually returns the class itself, not an instance. To return an instance, you have to call the class: return LinkedListNode().
Using next as a parameter or attribute name is not ideal: next() is a built-in function in Python, so reusing the name shadows it and invites confusion.
If you want to set an attribute, for example self.next_value = next_value, you should make next_value a parameter of the __init__ method, like def __init__(self, next_value).
Here is a simple demo of a linked list:
class LinkedList:
    def __init__(self, value):
        self.value = value
        self.next_value = None

    def __iter__(self):
        yield self.value
        if self.next_value is not None:
            yield from self.next_value
        # otherwise the generator simply ends

    def __getitem__(self, index):
        if index == 0:
            return self.value
        else:
            return self.next_value[index - 1]  # recursively fetch the value at index

    def __str__(self):
        return str(self.value) + ' -> ' + str(self.next_value)

    def __len__(self):
        if self.next_value is None:
            return 1
        else:
            return 1 + len(self.next_value)  # recursively compute the length

    def append(self, value):
        if self.next_value is None:
            self.next_value = LinkedList(value)
        else:
            self.next_value.append(value)

a = LinkedList(2)
a.append(1)
a.append(3)
for num in a:
    print(num, end=", ")
print()
print(a[1])
print(a)
print(len(a))
Output:
2, 1, 3,
1
2 -> 1 -> 3 -> None
3
createLinkedListNode() returns the LinkedListNode class itself, not an instance of the class.
Why are you defining classes and functions inside of other functions? That's an odd way of doing things.
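To make those points concrete, here is a minimal sketch of the question's helpers rewritten accordingly: the class moved to module level, the factory returning an instance rather than the class, and the parameter renamed so it no longer shadows the built-in next(). The names follow the question, but this is only an illustration, not the one correct fix.

class LinkedListNode:
    def __init__(self, next_node, value):
        self.next = next_node
        self.value = value

def createLinkedListNode(next_node, value):
    return LinkedListNode(next_node, value)  # return an instance, not the class

def createLinkedList(arr):
    head = createLinkedListNode(None, arr.pop(0))
    prev = head
    while arr:
        node = createLinkedListNode(None, arr.pop(0))
        prev.next = node
        prev = node
    return head

linkedList = createLinkedList([4, 5, 1, 9])
print(linkedList.value)       # 4
print(linkedList.next.value)  # 5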

Override attribute get behaviour but only by external methods in Python

Consider the following code:
class A():
    def __init__(self, thing):
        self.thing = thing

    def do_something(self):
        if self.thing > 0:
            print('thing is positive')
        else:
            print('thing is not positive')

def some_function(a):
    if a.thing > 0:
        print('this thing is positive')
    else:
        print('this thing is not positive')

class B(A):
    @property
    def thing(self):
        return 0

    @thing.setter
    def thing(self, val):
        pass

    # Purposely don't want to override A.do_something

a = A(5)
print(a.thing)      # 5
a.do_something()    # thing is positive
some_function(a)    # this thing is positive
isinstance(a, A)    # True

b = B(5)
print(b.thing)      # 0
b.do_something()    # thing is not positive (!!! - not what we want - see below)
some_function(b)    # this thing is not positive
isinstance(b, A)    # True
Suppose that do_something is a complicated function which we don't want to override. This could be because it is in an external package and we want to be able to keep using the latest version of this package containing A without having to update B each time. Now suppose that an outside function accesses a.thing by referencing it directly. We want B to extend A so that this external function always sees b.thing == 0. However, we want to do this without modifying the behaviour of internal methods. In the example above, we want to modify the behaviour of some_function, but we do this at the cost of also changing the behaviour of the internal method b.do_something.
The obvious way to fix this would be to have external functions such as some_function use a get_thing() method. But if these external functions have already been written in another package, modifying them is not possible.
Another way would be to have B update the value of self.thing before calling the parent class' method,
class B(A):
    def __init__(self, thing):
        self.thing = 0
        self._thing = thing

    def do_something(self):
        self.thing = self._thing
        rval = super().do_something()
        self.thing = 0
        return rval
however, this seems clunky, and if the developer of A adds new methods, B would change the behaviour of those methods unless it is updated each time.
Is there a best practice for extending a class like this which allows us to override __getattribute__ when called by an external function, but without changing any internal behaviour?
I think you can define class B like the following to achieve what you want:
class B:
    def __init__(self, thing):
        self.thing = 0
        self._a = A(thing)

    def __getattr__(self, name):
        return getattr(self._a, name)
The full code is below:
class A:
    def __init__(self, thing):
        self.thing = thing

    def do_something(self):
        if self.thing > 0:
            print('thing is positive')
        else:
            print('thing is not positive')

def some_function(a):
    if a.thing > 0:
        print('this thing is positive')
    else:
        print('this thing is not positive')

class B:
    def __init__(self, thing):
        self.thing = 0
        self._a = A(thing)

    def __getattr__(self, name):
        return getattr(self._a, name)

if __name__ == '__main__':
    a = A(5)
    print(a.thing)      # 5
    a.do_something()    # thing is positive
    some_function(a)    # this thing is positive

    b = B(5)
    print(b.thing)      # 0
    b.do_something()    # thing is positive
    some_function(b)    # this thing is not positive
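One consequence of delegating through __getattr__, shown below as an assumption of mine rather than part of the answer: because __getattr__ only fires for names not found on B itself, any method the author of A adds later is forwarded automatically, while external reads of b.thing keep hitting the shadowing attribute set in B.__init__.

# Hypothetical: A gains a new method in a later release; B needs no changes.
def halve_thing(self):
    return self.thing / 2

A.halve_thing = halve_thing   # stand-in for the updated version of A

b = B(5)
print(b.halve_thing())  # 2.5, forwarded to the wrapped A(5)
print(b.thing)          # 0, external readers still see B's own attribute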

Class function returns different results

I have two different ways to set up a Python class: one that automatically runs the class method, and one where you need to run it manually.
Manually running the function:
class testclass:
    def __init__(self, value):
        self.value = value

    def validator(self):
        data = self.value[0] + self.value[1]
        data = int(data)
        return data

theClass = testclass('123456')
print(theClass.validator())
This prints "12"
Automatically running the function:
class testclass:
    def __init__(self, value):
        self.value = value
        self.validator()

    def validator(self):
        data = self.value[0] + self.value[1]
        data = int(data)
        return data

theClass = testclass('123456')
print(theClass)
This prints "<__main__.testclass object at 0x011C72B0>".
How can I run the class method automatically and still get 12 as the printed output?
In the second version you are calling the validator function in __init__, but you are not storing the value that validator returns. __init__ cannot return anything but None, so what you can do is assign the value to an instance variable:
class testclass:
    value = 0

    def __init__(self, value):
        self.value = value
        self.value = self.validator()

    def validator(self):
        data = self.value[0] + self.value[1]
        data = int(data)
        return data

theClass = testclass('123456')
print(theClass.value)
Output:
12
In your automatic example, you are not calling theClass; any function call needs ().
You can rename your automatic validator to __call__ and then invoke it as theClass().
See more at https://www.journaldev.com/22761/python-callable-call
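A minimal sketch of that __call__ suggestion, reusing the class from the question (assumed unchanged apart from the renamed method):

class testclass:
    def __init__(self, value):
        self.value = value

    def __call__(self):
        # the old validator body, now run by calling the instance
        data = self.value[0] + self.value[1]
        return int(data)

theClass = testclass('123456')
print(theClass())  # 12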
If you simply want to print the output value, and not use it as a variable, you can define __str__ as part of your class.
class testclass(object):
    def __init__(self, value):
        self.value = value

    def __str__(self):
        return str(self.validator())  # __str__ must return a string

    def validator(self):
        data = self.value[0] + self.value[1]
        data = int(data)
        return data

>>> theClass = testclass('123456')
>>> print(theClass)
12
If you want to use it as a variable, such as theClass + 5, then using a custom class is not the way to go in this case.
Print inside the validator function:
class TestClass:  # sticking to more Pythonic naming conventions
    def __init__(self, value):
        self.value = value
        self.validator()

    def validator(self):
        print(int(self.value[0] + self.value[1]))
This will automatically print validation output whenever an instance is created:
>>> the_class = TestClass('123456')
12

Python proxy class

I'm trying to create a Proxy class that wraps another class. I want the wrapped class to be passed into the proxy's constructor, and the proxy to then dynamically create all of that class's methods on itself.
This is what I have so far, which is not working:
import inspect
from optparse import OptionParser

class MyClass:
    def func1(self):
        print 'MyClass.func1'

    def func2(self):
        print 'MyClass.func1'

class ProxyClass:
    def __init__(self):
        myClass = MyClass()
        members = inspect.getmembers(MyClass, predicate=inspect.ismethod)
        for member in members:
            funcName = member[0]

            def fn(self):
                print 'ProxyClass.' + funcName
                return myClass[funcName]()

            self.__dict__[funcName] = fn

proxyClass = ProxyClass()
proxyClass.func1()
proxyClass.func2()
I think it is the line self.__dict__[funcName] = fn that needs to be changed, but I am not sure to what.
I'm new to Python so if there is a completely different Pythonic way of doing this I would be happy to hear about that too.
I would not explicitly copy the methods of the wrapped class. You can use the magic method __getattr__ to control what happens when you call something on the proxy object, including decorating it as you like; for the call to work, __getattr__ has to return a callable, so you can make that callable do whatever you need (in addition to calling the original method).
I have included an example below.
class A:
    def foo(self): return 42
    def bar(self, n): return n + 5
    def baz(self, m, n): return m ** n

class Proxy:
    def __init__(self, proxied_object):
        self.__proxied = proxied_object

    def __getattr__(self, attr):
        def wrapped_method(*args, **kwargs):
            print("The method {} is executing.".format(attr))
            result = getattr(self.__proxied, attr)(*args, **kwargs)
            print("The result was {}.".format(result))
            return result
        return wrapped_method

proxy = Proxy(A())
proxy.foo()
proxy.bar(10)
proxy.baz(2, 10)
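One caveat with this sketch (my observation, not part of the answer above): __getattr__ wraps every forwarded attribute in a function, so a plain data attribute on the wrapped object would come back as a callable instead of its value. A small variation that only wraps callables could look like this:

class SelectiveProxy:
    def __init__(self, proxied_object):
        self._proxied = proxied_object

    def __getattr__(self, attr):
        target = getattr(self._proxied, attr)
        if not callable(target):
            return target  # pass plain data attributes through unchanged
        def wrapped(*args, **kwargs):
            print("The method {} is executing.".format(attr))
            result = target(*args, **kwargs)
            print("The result was {}.".format(result))
            return result
        return wrapped

proxy = SelectiveProxy(A())
proxy.bar(10)  # prints the trace and returns 15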

Setting a get/set property in a Python memoization decorator class

I have created a decorator memoization class that I am actively using to cache my calls. There are already many excellent suggestions on how to implement memoization in Python.
The class currently uses get and set methods, getCacheTimeOut() and setCacheTimeOut(), to control the cacheTimeOut. While this is an adequate solution, I was hoping to use the @property and @cacheTimeOut.setter decorators so the value could be set directly, for example cacheTimeOut = 120.
The problem is in the details: I do not know how to make these properties accessible through the __get__ method, which binds the different functions defined within the class onto a functools.partial object.
Here is my example script, written for Python 2.7:
import time
from functools import partial
import cPickle

class memoize(object):
    def __init__(self, func):
        self.func = func
        self._cache = {}
        self._timestamps = {}
        self._cacheTimeOut = 120
        self.objtype = None

    def __new__(cls, *args, **kwargs):
        return object.__new__(cls, *args, **kwargs)

    def __get__(self, obj, objtype=None):
        """Used for object methods where the decorator has been placed before the method."""
        self.objtype = objtype
        fn = partial(self, obj)
        fn.resetCache = self.resetCache
        fn.getTimeStamps = self.getTimeStamps
        fn.getCache = self.getCache
        fn._timestamps = self._timestamps
        fn.setCacheTimeOut = self.setCacheTimeOut
        fn.getCacheTimeOut = self.getCacheTimeOut
        return fn

    def __argsToKey(self, *args, **kwargs):
        args = list(args)
        for x, arg in enumerate(args):  # remove the instance from the key
            if self.objtype:
                if isinstance(arg, self.objtype):
                    args.remove(arg)
        key = cPickle.dumps(args, 1) + cPickle.dumps(kwargs, 1)
        return key

    def __call__(self, *args, **kwargs):
        """Main calling function of the decorator."""
        key = self.__argsToKey(*args, **kwargs)
        now = time.time()  # get the current time to query for the key
        if self._timestamps.get(key, now) > now:
            return self._cache[key]
        else:
            value = self.func(*args, **kwargs)
            self._cache[key] = value
            self._timestamps[key] = now + self._cacheTimeOut
            return value

    def __repr__(self):
        '''Return the function's docstring.'''
        return self.func.__doc__

    def resetCache(self):
        """Resets the cache. Currently called manually upon request."""
        self._cache = {}
        self._timestamps = {}

    def getCacheTimeOut(self):
        """Get the cache timeout used to track stale data."""
        return self._cacheTimeOut

    def setCacheTimeOut(self, timeOut):
        """Set the cache timeout to some other value besides 120. Requires an integer value. If you set timeOut to zero you are ignoring the cache."""
        self._cacheTimeOut = timeOut

    def getCache(self):
        """Returns the cache dictionary."""
        return self._cache

    def getTimeStamps(self):
        """Returns the encapsulated timestamp dictionary."""
        return self._timestamps

    @property
    def cacheTimeOut(self):
        """Get cacheTimeOut."""
        return self._cacheTimeOut

    @cacheTimeOut.setter
    def cacheTimeOut(self, timeOut):
        """Set cacheTimeOut."""
        self._cacheTimeOut = timeOut
@memoize
def increment(x):
    increment.count += 1
    print("increment.count:%d, x:%d" % (increment.count, x))
    x += 1
    return x

increment.count = 0  # define the count to track calls to increment vs. cache hits
class basic(object):
    def __init__(self):
        self.count = 0

    @memoize
    def increment(self, x):
        self.count += 1
        print("increment.count:%d, x:%d" % (increment.count, x))
        x += 1
        return x
def main():
    print increment(3)
    print increment(3)

    # What I am actually doing
    print increment.getCacheTimeOut()  # print out the default of 120
    increment.setCacheTimeOut(20)      # set to 20
    print increment.getCacheTimeOut()  # verify that it has been set to 20

    # What I would like to do, which currently does not work
    print increment.cacheTimeOut
    # Assign to the property
    increment.cacheTimeOut = 20

    myObject = basic()
    print myObject.increment(3)
    print myObject.count
    print myObject.increment(3)
    print myObject.count
    print myObject.increment(4)
    print myObject.count
####### Unittest code.
import sys
import time
import unittest
from memoize import memoize

class testSampleUsages(unittest.TestCase):
    # """This series of unit tests is to show the user how to apply memoize calls."""

    def testSimpleUsageMemoize(self):
        @memoize
        def increment(var=0):
            var += 1
            return var
        increment(3)
        increment(3)

    def testMethodBasedUsage(self):
        """Add the @memoize before the method definition."""
        class myClass(object):
            @memoize
            def increment(self, var=0):
                var += 1
                return var

            @memoize
            def decrement(self, var=0):
                var -= 1
                return var

        myObj = myClass()
        myObj.increment(3)
        myObj.increment(3)
        myObj.decrement(6)
        myObj.decrement(6)

    def testMultipleInstances(self):
        @memoize
        class myClass(object):
            def __init__(self):
                self.incrementCountCalls = 0
                self.decrementCountCalls = 0
                self.powCountCall = 0

            # @memoize
            def increment(self, var=0):
                var += 1
                self.incrementCountCalls += 1
                return var

            # @memoize
            def decrement(self, var=0):
                self.decrementCountCalls += 1
                var -= 1
                return var

            def pow(self, var=0):
                self.powCountCall += 1
                return var * var

        obj1 = myClass()  # Memoizing the class above does not seem to work.
        obj2 = myClass()
        obj3 = myClass()

        obj1.increment(3)
        obj1.increment(3)
        # obj2.increment(3)
        # obj2.increment(3)
        # obj3.increment(3)
        # obj3.increment(3)

        obj1.pow(4)
        obj2.pow(4)
        obj3.pow(4)
There's no way to attach a property to a single instance. Being descriptors, properties must be part of a class definition in order to function, which means you can't easily add them to the partial object you create in __get__.
Now, you could create a class of your own to reimplement the behavior of partial with your added property. However, I suspect the limitation is actually to your benefit. If memoize is applied to a method, its state is shared by all instances of the class (and perhaps even instances of subclasses). If you allow the caching details to be adjusted through instances, you might confuse users with cases like:
obj1 = basic()
print obj1.increment.getCacheTimeOut()  # prints the initial value, e.g. 120

obj2 = basic()
obj2.increment.setCacheTimeOut(20)      # change the timeout value via another instance

print obj1.increment.getCacheTimeOut()  # the value seen via the first instance now prints 20
I suggest that you make the memoization-related interfaces of decorated methods accessible only through the class, not through instances. To make that work, you need to update your __get__ method to work if obj is None. It can simply return self:
def __get__(self, obj, objtype=None):
    if obj is None:
        return self
    self.objtype = objtype
    return partial(self, obj)  # no need to attach our methods to the partial anymore
With this change, using a property on the memoize object via the class works:
basic.increment.cacheTimeOut = 20 # set property of the "unbound" method basic.increment
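A quick illustration of what that enables, assuming the @property/@cacheTimeOut.setter pair from the question's memoize class and the modified __get__ above (Python 2 syntax, to match the rest of the code):

print basic.increment.cacheTimeOut   # 120, read through the class
basic.increment.cacheTimeOut = 20    # the property setter runs on the memoize instance
print basic.increment.cacheTimeOut   # 20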
There is actually a way to accomplish this: rebind the decorator per instance to a helper object with a __call__ method.
class Helper(object):
    def __init__(self, d, obj):
        self.d = d
        self.obj = obj
        self.timeout = 0

    def __call__(self, *args, **kwargs):
        print self, self.timeout
        return self.d.func(self.obj, *args, **kwargs)

class decorator(object):
    def __init__(self, func):
        self.func = func
        self.name = func.__name__

    def __get__(self, obj, clazz):
        if obj is None:
            return self
        obj.__dict__[self.name] = Helper(self, obj)
        return obj.__dict__[self.name]

class Foo(object):
    @decorator
    def bar(self, args):
        return args * 2

f = Foo()
g = Foo()
f.bar.timeout = 10
g.bar.timeout = 20

print f.bar(10)
print g.bar(20)
HTH
