Access a subset of functions of a Python class

Using a class that has an XML-RPC proxy as one of its attributes:
def __init__(self):
    self.proxy = ServerProxy(...)
    # ...
I'm trying to ease the use of some of the proxy's functions. Only a subset of the proxy functions are supposed to be used and I thus thought of creating a set of tiny wrapper functions for them like
def sample(self):
    """A nice docstring for a wrapper function."""
    return self.proxy.sample()
Is there a good way of getting a list of all the wrapper functions? I'm thinking of something like dir(), but then I would need to filter out everything except the wrapper functions. XML-RPC introspection (http://xmlrpc-c.sourceforge.net/introspection.html) doesn't help much either, since I don't want to use/provide all of the server's functions.
Maybe setting an attribute on the wrappers together with a @staticmethod get_wrappers() would do the trick. A _wrapper suffix is not appropriate for my use case, and a static list in the class that keeps track of the available wrappers is too error-prone. So I'm looking for good ideas on how best to get a list of the wrapper functions.

I'm not 100% sure if this is what you want, but it works:
def proxy_wrapper(name, docstring):
    def wrapper(self, *args, **kwargs):
        return getattr(self.proxy, name)(*args, **kwargs)
    wrapper.__doc__ = docstring
    wrapper._is_wrapper = True
    return wrapper

class Something(object):
    def __init__(self):
        self.proxy = {}

    @classmethod
    def get_proxy_wrappers(cls):
        return [m for m in dir(cls) if hasattr(getattr(cls, m), "_is_wrapper")]

    update = proxy_wrapper("update", "wraps the proxy's update() method")
    proxy_keys = proxy_wrapper("keys", "wraps the proxy's keys() method")
Then
>>> a = Something()
>>> print a.proxy
{}
>>> a.update({1: 42})
>>> print a.proxy
{1: 42}
>>> a.update({"foo": "bar"})
>>> print a.proxy_keys()
[1, 'foo']
>>> print a.get_proxy_wrappers()
['proxy_keys', 'update']

Use XML-RPC introspection to get the server's method list and intersect it with your object's attributes. Something like:
loc = dir(self)
rem = proxy.system.listMethods()  # however introspection gets a method list
wrapped = [x for x in rem if x in loc]
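A runnable sketch of that intersection idea. The FakeProxy class and the wrapped_methods helper are hypothetical names invented here so the example runs without a live server; a real ServerProxy would answer proxy.system.listMethods():

```python
class FakeProxy:
    """Hypothetical stand-in for xmlrpc.client.ServerProxy, so the
    sketch runs without a live server."""
    def list_methods(self):
        # A real server would answer proxy.system.listMethods()
        return ["sample", "update", "internal_only"]

class Client:
    def __init__(self):
        self.proxy = FakeProxy()

    def sample(self):
        """Wraps the proxy's sample() method."""

    def update(self):
        """Wraps the proxy's update() method."""

def wrapped_methods(obj, remote_names):
    # Keep only the remote methods that the local object also defines.
    return sorted(name for name in remote_names if hasattr(obj, name))

c = Client()
print(wrapped_methods(c, c.proxy.list_methods()))  # ['sample', 'update']
```

Note that this lists every local attribute whose name matches a remote method, so an unrelated helper that happens to share a server method's name would show up too.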

Related

Dynamic Wrapper in Python

I'm looking to create a dynamic wrapper class that exposes the API calls from a provided object using data in the object.
Statically it looks like this:
class Concrete:
    def __init__(self, data):
        self.data = data

    def print_data(self):
        print(self.data)

class Wrapper:
    '''
    One day this will wrap a variety of objects. But today
    it can only handle Concrete objects.
    '''
    def wrap_it(self, concrete):
        self.cco = concrete  # concrete object = cco

    def print_data(self):
        self.cco.print_data()

cco = Concrete(5)
wcco = Wrapper()
wcco.wrap_it(cco)
wcco.print_data()
Produces
5
I'd like to figure out how to do the same thing but make wrap_it dynamic. It should search the concrete object, find its functions, and create functions of the same name that call the corresponding function on the concrete object.
I imagine that the solution involves inspect.signature or at least some use of *args and **kwargs, but I've not seen an example of how to put all this together.
You can use the __getattr__ magic method to hook getting undefined attributes, and forward them to the concrete object:
class DynamicWrapper():
    def wrap_it(self, concrete):
        self.cco = concrete

    def __getattr__(self, k):
        def wrapper(*args, **kwargs):
            print(f'DynamicWrapper calling {k} with args {args} {kwargs}')
            return getattr(self.cco, k)(*args, **kwargs)
        if hasattr(self.cco, k):
            return wrapper
        else:
            raise AttributeError(f'No such field/method: {k}')
cco = Concrete(5)
dwcco = DynamicWrapper()
dwcco.wrap_it(cco)
dwcco.print_data()
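One caveat worth noting (my observation, not part of the answer above): that __getattr__ turns every attribute into a callable wrapper, including plain data attributes. A sketch of a variant that forwards non-callables directly, reusing the Concrete class from the question:

```python
class Concrete:
    def __init__(self, data):
        self.data = data

    def print_data(self):
        return self.data

class DynamicWrapper:
    def wrap_it(self, concrete):
        self.cco = concrete

    def __getattr__(self, k):
        # __getattr__ only fires for names not found normally, so
        # 'cco' itself never loops through here once wrap_it has run.
        target = getattr(self.cco, k)  # raises AttributeError naturally
        if not callable(target):
            return target  # forward plain data attributes as-is
        def wrapper(*args, **kwargs):
            return target(*args, **kwargs)
        return wrapper

c = Concrete(5)
w = DynamicWrapper()
w.wrap_it(c)
assert w.data == 5          # data attribute comes through directly
assert w.print_data() == 5  # method call is forwarded
```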
Use the dir() function to get the attributes of the given object, check if they are callable and assign them to your wrapper, like this:
class Wrapper:
    def wrap_it(self, objToWrap):
        for attr in dir(objToWrap):
            if not attr.startswith('__') and callable(getattr(objToWrap, attr)):
                setattr(self, attr, getattr(objToWrap, attr))
And now, for testing.
>>> cco = Concrete(5)
>>> wcco = Wrapper()
>>> wcco.wrap_it(cco)
>>> wcco.print_data()
5

Write a no-op or dummy class in Python

Let's say I have code like this:
foo = fooFactory.create()
For various reasons, fooFactory.create() could fail to create an instance of Foo.
If it does, I want fooFactory.create() to return a dummy/no-op object. This object should be completely inert - no matter how it is used, it should not do anything or throw any exceptions. Note that foo does not have methods that return values.
I've considered the following options.
First, create a mock. The upside of this is that it's easy and gives me exactly what I want. The downside is that it feels odd to use a mock in production code. More importantly, I have no control over the mock library and so its behavior could change at some point and break my application.
Second, create a dummy NoopFoo/DummyFoo class. I then manually implement the methods it needs to support, and just put pass in the method bodies. The upside is that I know it will never break my application. The downside is that if other methods of Foo are used in the future, I have to know to update NoopFoo/DummyFoo ... or my application may break.
Is there a better option than either of these? Note that I'm new to Python so if it involves more advanced Python features, I would appreciate a little more information. Thanks!
You ask for an object that does nothing. An instance of object does precisely that.
def create():
    return object()
On the other hand, you might actually want it to do something. You might want it to have methods that execute and return None. You might return an instance of this class:
In [1]: class Nop(object):
   ...:     def nop(*args, **kw): pass
   ...:     def __getattr__(self, _): return self.nop
   ...:

In [2]: n = Nop()

In [3]: n.foo
Out[3]: <bound method Nop.nop of <__main__.Nop object at 0x7f9fec034650>>

In [4]: n.foo()

In [5]: n.foo(1, 2, 3)
How about this one:
class MayBeCalled(object):
    def __call__(self, *args, **kwargs):
        return None

class Dummy(object):
    def __getattr__(self, attr):
        return MayBeCalled()

    def __setattr__(self, attr, val):
        pass

>>> a = Dummy()
>>> print a.nn
<__main__.MayBeCalled object at 0x103ca9a90>
>>> print a.nn()
None
>>> a.nn = 23
>>> print a.nn()
None
A Dummy object responds to any attribute access doing nothing.
The previous answers with dummy objects don't work with nested attributes/methods. Here is my solution:
class Dummy:
    def __init__(self, *args, **kwargs):
        pass

    def __call__(self, *args, **kwargs):
        return self

    def __getattr__(self, *args, **kwargs):
        return self
Now we can access nested attributes and methods:
>>> d = Dummy()
>>> d.my_attr
<__main__.Dummy object at 0x10d007160>
>>> d.my_method()
<__main__.Dummy object at 0x10d007160>
>>> d.my_attr.my_method()
<__main__.Dummy object at 0x10d007160>
>>> d.my_attr.my_other_attr
<__main__.Dummy object at 0x10d007160>
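A caveat about this approach (my observation, not raised in the answer): Python looks up special methods on the type rather than the instance, so __getattr__ never sees them. The dummy is therefore still truthy, and str() falls back to object's default repr instead of returning another Dummy. A quick demonstration:

```python
class Dummy:
    def __init__(self, *args, **kwargs):
        pass

    def __call__(self, *args, **kwargs):
        return self

    def __getattr__(self, name):
        return self

d = Dummy()
assert d.anything.nested() is d  # ordinary attribute access is swallowed
assert bool(d) is True           # __bool__ is looked up on the type, not via __getattr__
assert str(d).startswith('<')    # str() uses object's default repr
```

If full inertness matters, the relevant dunders (__bool__, __str__, __len__, ...) have to be defined on the class explicitly.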
Consider the nop package, available via pip install nop. Please do not use the mock package instead (except in tests) because it stores call history!
from nop import NOP
dummy = NOP()
dummy.foo()
NOP
dummy.foo().bar()
NOP
dummy.foo('x', 'y')
NOP
dummy.foo('x', 'y', z=3)
NOP
type(_)
<class 'nop.nop_base.NOP'>
Its source code is here.

How to create an object collection proxy in Python?

I'm trying to create an object collection proxy, which could do something like this:
class A:
    def do_something(self):
        # ...

class B:
    def get_a(self):
        return A()

class Proxy:
    ?

collection = [B(), B()]
proxy = Proxy(collection)
proxy.get_a().do_something()
# ^ for each B in collection get_a() and do_something()
What would be the best architecture / strategy for achieving this?
The key question, I guess is, how to cache the result of get_a() so I can then proxy do_something()
N.B. I don't expect proxy.get_a().do_something() to return anything sensible, it's only supposed to do things.
Simple enough... you may want to adapt it to do some more checking
class A(object):
    def do_something(self):
        print id(self), "called"

class B(object):
    def get_a(self):
        return A()

class Proxy(object):
    def __init__(self, objs):
        self._objs = objs

    def __getattr__(self, name):
        def func(*args, **kwargs):
            return Proxy([getattr(o, name)(*args, **kwargs) for o in self._objs])
        return func

collection = [B(), B()]
proxy = Proxy(collection)
proxy.get_a().do_something()
Results in:
4455571152 called
4455571216 called
The most pythonic way of going about this would probably be a list comprehension:
results = [b.get_a().do_something() for b in collection]
If you want to cache calls to B.get_a(), you can use memoization. A simple way of doing memoization yourself could look like this:
cache = None
# ...
class B:
    def get_a(self):
        global cache
        if cache is None:
            cache = A()
        return cache
If you want to use caching in multiple places, you'll need to cache results based on keys in order to distinguish them, and for convenience's sake write a decorator that you can simply wrap functions with whose results you want to cache.
A good example of this is found in Python Algorithms: Mastering Basic Algorithms in the Python Language (see this question). Modified for your case, to not use the function arguments but the function name as cache key, it would look like this:
from functools import wraps

def memoize(func):
    cache = {}
    key = func.__name__
    @wraps(func)
    def wrap(*args):
        if key not in cache:
            cache[key] = func(*args)
        return cache[key]
    return wrap

class A:
    def do_something(self):
        return 1

class B:
    @memoize
    def get_a(self):
        print "B.get_a() was called"
        return A()

collection = [B(), B()]
results = [b.get_a().do_something() for b in collection]
print results
Output:
B.get_a() was called
[1, 1]

Is this abstract base class with a "better" __repr__() dangerous?

It bugs me that the default __repr__() for a class is so uninformative:
>>> class Opaque(object): pass
...
>>> Opaque()
<__main__.Opaque object at 0x7f3ac50eba90>
... so I've been thinking about how to improve it. After a little consideration, I came up with this abstract base class which leverages the pickle protocol's __getnewargs__() method:
from abc import abstractmethod

class Repro(object):
    """Abstract base class for objects with informative ``repr()`` behaviour."""
    @abstractmethod
    def __getnewargs__(self):
        raise NotImplementedError

    def __repr__(self):
        signature = ", ".join(repr(arg) for arg in self.__getnewargs__())
        return "%s(%s)" % (self.__class__.__name__, signature)
Here's a trivial example of its usage:
class Transparent(Repro):
    """An example of a ``Repro`` subclass."""
    def __init__(self, *args):
        self.args = args

    def __getnewargs__(self):
        return self.args
... and the resulting repr() behaviour:
>>> Transparent("an absurd signature", [1, 2, 3], str)
Transparent('an absurd signature', [1, 2, 3], <type 'str'>)
>>>
Now, I can see one reason Python doesn't do this by default straight away - requiring every class to define a __getnewargs__() method would be more burdensome than expecting (but not requiring) that it defines a __repr__() method.
What I'd like to know is: how dangerous and/or fragile is it? Off-hand, I can't think of anything that could go terribly wrong except that if a Repro instance contained itself, you'd get infinite recursion ... but that's solvable, at the cost of making the code above uglier.
What else have I missed?
If you're into this sort of thing, why not have the arguments automatically picked up from __init__ by using a decorator? Then you don't need to burden the user with manually handling them, and you can transparently handle normal method signatures with multiple arguments. Here's a quick version I came up with:
def deco(f):
    def newFunc(self, *args, **kwargs):
        self._args = args
        self._kwargs = kwargs
        f(self, *args, **kwargs)
    return newFunc

class AutoRepr(object):
    def __repr__(self):
        args = ', '.join(repr(arg) for arg in self._args)
        kwargs = ', '.join('{0}={1}'.format(k, repr(v)) for k, v in self._kwargs.iteritems())
        allArgs = ', '.join([args, kwargs]).strip(', ')
        return '{0}({1})'.format(self.__class__.__name__, allArgs)
Now you can define subclasses of AutoRepr normally, with normal __init__ signatures:
class Thingy(AutoRepr):
    @deco
    def __init__(self, foo, bar=88):
        self.foo = foo
        self.bar = bar
And the __repr__ automatically works:
>>> Thingy(1, 2)
Thingy(1, 2)
>>> Thingy(10)
Thingy(10)
>>> Thingy(1, bar=2)
Thingy(1, bar=2)
>>> Thingy(bar=1, foo=2)
Thingy(foo=2, bar=1)
>>> Thingy([1, 2, 3], "Some junk")
Thingy([1, 2, 3], 'Some junk')
Putting @deco on your __init__ is much easier than writing a whole __getnewargs__. And if you don't even want to have to do that, you could write a metaclass that automatically decorates the __init__ method in this way.
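The metaclass the answer alludes to might be sketched like this (a rough Python 3 version; AutoReprMeta and record_args are names invented here, not the author's code):

```python
def record_args(f):
    # Same idea as the deco above: stash the constructor arguments.
    def new_init(self, *args, **kwargs):
        self._args = args
        self._kwargs = kwargs
        f(self, *args, **kwargs)
    return new_init

class AutoReprMeta(type):
    def __new__(mcls, name, bases, namespace):
        # Wrap __init__ automatically if the class defines one.
        if '__init__' in namespace:
            namespace['__init__'] = record_args(namespace['__init__'])
        return super().__new__(mcls, name, bases, namespace)

class AutoRepr(metaclass=AutoReprMeta):
    _args = ()     # defaults for classes that never define __init__
    _kwargs = {}

    def __repr__(self):
        parts = [repr(a) for a in self._args]
        parts += ['%s=%r' % (k, v) for k, v in self._kwargs.items()]
        return '%s(%s)' % (type(self).__name__, ', '.join(parts))

class Thingy(AutoRepr):
    def __init__(self, foo, bar=88):
        self.foo = foo
        self.bar = bar

print(repr(Thingy(1, bar=2)))  # Thingy(1, bar=2)
```

Subclasses then need no decorator at all; the metaclass wraps each __init__ as the class is created.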
One problem with this whole idea is that there can be some kinds of objects whose state is not fully dependent on the arguments given to their constructor. For a trivial case, consider a class with random state:
import random

class A(object):
    def __init__(self):
        self.state = random.random()
There's no way for this class to correctly implement __getnewargs__, and so your implementation of __repr__ is also impossible. It may be that a class like the one above is not well designed. But pickle can handle it with no problems (I assume using the __reduce__ method inherited from object, but my pickle-fu is not enough to say so with certainty).
This is why it is nice that __repr__ can be coded to do whatever you want. If you want the internal state to be visible, you can make your class's __repr__ do that. If the object should be opaque, you can do that too. For the class above, I'd probably implement __repr__ like this:
def __repr__(self):
    return "<A object with state=%f>" % self.state

Namespaces inside class in Python3

I am new to Python and I wonder if there is any way to aggregate methods into 'subspaces'. I mean something similar to this syntax:
smth = Something()
smth.subspace.do_smth()
smth.another_subspace.do_smth_else()
I am writing an API wrapper and I'm going to have a lot of very similar methods (only different URIs), so I thought it would be good to place them in a few subspaces that correspond to the API request categories. In other words, I want to create namespaces inside a class. I don't know if this is even possible in Python and have no idea what to look for in Google.
I will appreciate any help.
One way to do this is by defining subspace and another_subspace as properties that return objects that provide do_smth and do_smth_else respectively:
class Something:
    @property
    def subspace(self):
        class SubSpaceClass:
            def do_smth(other_self):
                print('do_smth')
        return SubSpaceClass()

    @property
    def another_subspace(self):
        class AnotherSubSpaceClass:
            def do_smth_else(other_self):
                print('do_smth_else')
        return AnotherSubSpaceClass()
Which does what you want:
>>> smth = Something()
>>> smth.subspace.do_smth()
do_smth
>>> smth.another_subspace.do_smth_else()
do_smth_else
Depending on what you intend to use the methods for, you may want to make SubSpaceClass a singleton, but I doubt the performance gain is worth it.
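If rebuilding the namespace object on every attribute access bothers you, one option (my addition, not from the answer) is to cache it per instance with functools.cached_property, available since Python 3.8; the SubSpace class here is a hypothetical stand-in:

```python
from functools import cached_property

class SubSpace:
    """Hypothetical namespace object holding the related methods."""
    def __init__(self, owner):
        self._owner = owner

    def do_smth(self):
        return 'do_smth'

class Something:
    @cached_property
    def subspace(self):
        # Built once per Something instance on first access, then reused.
        return SubSpace(self)

smth = Something()
assert smth.subspace is smth.subspace  # same namespace object every time
assert smth.subspace.do_smth() == 'do_smth'
```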
I had this need a couple years ago and came up with this:
from functools import partial

class Registry:
    """Namespace within a class."""
    def __get__(self, obj, cls=None):
        if obj is None:
            return self
        else:
            return InstanceRegistry(self, obj)

    def __call__(self, name=None):
        def decorator(f):
            use_name = name or f.__name__
            if hasattr(self, use_name):
                raise ValueError("%s is already registered" % use_name)
            setattr(self, use_name, f)
            return f
        return decorator

class InstanceRegistry:
    """
    Helper for accessing a namespace from an instance of the class.

    Used internally by :class:`Registry`. Returns a partial that will pass
    the instance as the first parameter.
    """
    def __init__(self, registry, obj):
        self.__registry = registry
        self.__obj = obj

    def __getattr__(self, attr):
        return partial(getattr(self.__registry, attr), self.__obj)

# Usage:
class Something:
    subspace = Registry()
    another_subspace = Registry()

    @subspace()
    def do_smth(self):
        # `self` will be an instance of Something
        pass

    @another_subspace('do_smth_else')
    def this_can_be_called_anything_and_take_any_parameter_name(obj, other):
        # Call it `obj` or whatever else if `self` outside a class is unsettling
        pass
At runtime:
>>> smth = Something()
>>> smth.subspace.do_smth()
>>> smth.another_subspace.do_smth_else('other')
This is compatible with Py2 and Py3. Some performance optimizations are possible in Py3 because __set_name__ tells us what the namespace is called and allows caching the instance registry.
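The __set_name__ optimization mentioned above could be sketched like this (my own rough version, not the author's code): the Registry learns its attribute name when the class is created and caches one InstanceRegistry per instance under a derived private name:

```python
from functools import partial

class Registry:
    def __set_name__(self, owner, name):
        # Py3.6+: remember which attribute this registry is bound to,
        # so the per-instance cache can live under a private slot.
        self._cache_name = '_registry_cache_' + name

    def __get__(self, obj, cls=None):
        if obj is None:
            return self
        try:
            return getattr(obj, self._cache_name)
        except AttributeError:
            inst = InstanceRegistry(self, obj)
            setattr(obj, self._cache_name, inst)
            return inst

    def __call__(self, name=None):
        def decorator(f):
            setattr(self, name or f.__name__, f)
            return f
        return decorator

class InstanceRegistry:
    def __init__(self, registry, obj):
        self._registry = registry
        self._obj = obj

    def __getattr__(self, attr):
        return partial(getattr(self._registry, attr), self._obj)

class Something:
    subspace = Registry()

    @subspace()
    def do_smth(self):
        return 'done'

smth = Something()
assert smth.subspace is smth.subspace  # cached per instance now
assert smth.subspace.do_smth() == 'done'
```

Since Registry defines only __get__ (a non-data descriptor), the cache is stored under a separate attribute name so the descriptor still fires on each access and can return the cached object.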
