I'm creating several classes to use as state flags (this is more of an exercise, though I'm going to use them in a real project), just like we use None in Python, i.e.
... some_var is None ...
NoneType has several special properties; most importantly it's a singleton, i.e. there can't be more than one NoneType instance during any interpreter session, and its single instance (None) is immutable. I've come up with two possible ways to implement somewhat similar behaviour in pure Python, and I'm eager to know which one looks better from an architectural standpoint.
1. Don't use instances at all.
The idea is to have a metaclass that produces immutable classes. The classes are prohibited from having instances.
class FlagMetaClass(type):
    def __setattr__(self, *args, **kwargs):
        raise TypeError("{} class is immutable".format(self))

    def __delattr__(self, *args, **kwargs):
        raise TypeError("{} class is immutable".format(self))

    def __repr__(self):
        return self.__name__

class BaseFlag(object):
    __metaclass__ = FlagMetaClass

    def __init__(self):
        raise TypeError("Can't create {} instances".format(type(self)))

    def __repr__(self):
        return str(type(self))

class SomeFlag(BaseFlag):
    pass
And we get the desired behaviour
a = BaseFlag
a is BaseFlag # -> True
a is SomeFlag # -> False
Obviously any attempt to set attributes on these classes will fail (of course there are several hacks to overcome this, but the direct way is closed). And the classes themselves are unique objects loaded in a namespace.
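For instance, a quick sanity check might look like this (a minimal sketch, assuming Python 2, where the __metaclass__ attribute is honoured):

try:
    BaseFlag.some_attr = 42
except TypeError as e:
    print(e)  # -> BaseFlag class is immutable

try:
    BaseFlag()
except TypeError as e:
    print(e)  # -> Can't create BaseFlag instances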
2. A proper singleton class
class FlagMetaClass(type):
    _instances = {}

    def __call__(cls):
        if cls not in cls._instances:
            cls._instances[cls] = super(FlagMetaClass, cls).__call__()
        return cls._instances[cls]

    # This may be slightly modified to raise an error instead of
    # returning the same object, e.g.:
    #
    # def __call__(cls):
    #     if cls in cls._instances:
    #         raise TypeError("Can't have more than one {} instance".format(cls))
    #     cls._instances[cls] = super(FlagMetaClass, cls).__call__()
    #     return cls._instances[cls]

    def __setattr__(self, *args, **kwargs):
        raise TypeError("{} class is immutable".format(self))

    def __delattr__(self, *args, **kwargs):
        raise TypeError("{} class is immutable".format(self))

    def __repr__(self):
        return self.__name__

class BaseFlag(object):
    __metaclass__ = FlagMetaClass
    __slots__ = []

    def __repr__(self):
        return str(type(self))

class SomeFlag(BaseFlag):
    pass
Here the Flag classes are real singletons. This particular implementation doesn't raise an error when we try to create another instance, but returns the same object (though it's easy to alter this behaviour). Both classes and instances can't be directly modified. The point is to create an instance of each class upon import like it's done with None.
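For example, the module defining the flags might create the instances at the bottom, so importers compare against ready-made objects, the way None itself works (a minimal sketch; the names are illustrative):

# Module-level instance, created once on import:
some_flag = SomeFlag()

# Later "constructions" return the cached instance:
assert SomeFlag() is some_flag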
Both approaches give me somewhat immutable unique objects that can be used for comparison just like None. To me the second one looks more NoneType-like, since None is an instance, but I'm not sure it's worth the increase in ideological complexity. Looking forward to hearing from you.
Theoretically, it's an interesting exercise. But when you say "though I'm going to use them in a real project" then you lose me.
If the real project is highly unPythonic (using traits or some other package to emulate static typing, using __slots__ to keep people from falling on sharp objects, etc.) -- well, I've got nothing for you, because I've got no use for that, but others do.
If the real project is Pythonic, then do the simplest thing possible.
Your "not use instances at all" answer is the correct one here, but you don't need to do a lot of class definition, either.
For example, if you have a function that could accept None as a real parameter, and you want to tell if the parameter has been defaulted, then just do this:
class NoParameterGiven:
    pass

def my_function(my_parameter=NoParameterGiven):
    if my_parameter is NoParameterGiven:
        <do all my default stuff>
That class is so cheap, there's no reason even to share it between files. Just create it where you need it.
Your state classes are a different story, and you might want to use something like the enum module that @Dunes mentioned -- it has some nice features.
OTOH, if you want to keep it really simple, you could just do something like this:
class MyStates:
    class State1: pass
    class State2: pass
    class State3: pass
You don't need to instantiate anything, and you can refer to them like this: MyStates.State1.
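For instance (a minimal sketch of how such flags might be consumed; the variable names are illustrative):

state = MyStates.State1

if state is MyStates.State1:
    print("in state 1")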
I feel like I have a pretty good grasp on using decorators when dealing with regular functions, but between using methods of base classes for decorators in derived classes, and passing parameters to said decorators, I cannot figure out what to do next.
Here is a snippet of code.
class ValidatedObject:
    ...
    def apply_validation(self, field_name, code):
        def wrap(self, f):
            self._validations.append(Validation(field_name, code, f))
            return f
        return wrap

class test(ValidatedObject):
    ...
    @apply_validation("_name", "oh no!")
    def name_validation(self, name):
        return name == "jacob"
If I try this as is, I get an "apply_validation is not found" error.
If I try it with @self.apply_validation, I get a "self isn't found" error.
I've also been messing around with making apply_validation a class method without success.
Would someone please explain what I'm doing wrong, and the best way to fix this? Thank you.
The issue you're having is that apply_validation is a method, which means you need to call it on an instance of ValidatedObject. Unfortunately, at the time it is being called (during the definition of the test class), there is no appropriate instance available. You need a different approach.
The most obvious one is to use a metaclass that searches through its instance dictionaries (which are really class dictionaries) and sets up the _validations variable based on what it finds. You can still use a decorator, but it probably should be a global function, or perhaps a static method, and it will need to work differently. Here's some code, that uses a metaclass and a decorator that adds function attributes:
class ValidatedMeta(type):
    def __new__(meta, name, bases, dct):
        validations = [Validation(f._validation_field_name, f._validation_code, f)
                       for f in dct.values()
                       if hasattr(f, "_validation_field_name")]
        dct["_validations"] = validations
        return super(ValidatedMeta, meta).__new__(meta, name, bases, dct)

def apply_validation(field_name, code):
    def decorator(f):
        f._validation_field_name = field_name
        f._validation_code = code
        return f
    return decorator

class ValidatedObject(metaclass=ValidatedMeta):
    pass

class test(ValidatedObject):
    @apply_validation("_name", "oh no!")
    def name_validation(self, name):
        return name == "jacob"
After this code runs, test._validations will be [Validation("_name", "oh no!", test.name_validation)]. Note that the method that is passed to Validation is unbound, so you'll need to pass it a self argument yourself when you call it (or perhaps drop the self argument and change the decorator created in apply_validation to return staticmethod(f)).
This code may not do what you want if you have validation methods defined at several levels of an inheritance hierarchy. The metaclass as written above only checks the immediate class's dict for methods with the appropriate attributes. If you need it include inherited methods in _validations too, you may need to modify the logic in ValidatedMeta.__new__. Probably the easiest way to go is to look for _validations attributes in the bases and concatenate the lists together.
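A minimal, untested sketch of that modification, building on the ValidatedMeta above (the concatenation strategy is the one suggested in the previous paragraph):

class ValidatedMeta(type):
    def __new__(meta, name, bases, dct):
        validations = [Validation(f._validation_field_name, f._validation_code, f)
                       for f in dct.values()
                       if hasattr(f, "_validation_field_name")]
        # Prepend whatever the bases already collected, so validation
        # methods defined higher up the hierarchy are kept as well.
        for base in bases:
            validations = getattr(base, "_validations", []) + validations
        dct["_validations"] = validations
        return super(ValidatedMeta, meta).__new__(meta, name, bases, dct)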
Just an example of using a decorator on a class method:
from functools import wraps

def VALIDATE(dec):
    @wraps(dec)
    def _apply_validation(self, name):
        self.validate(name)
        return dec(self, name)
    return _apply_validation

class A:
    def validate(self, name):
        if name != "aamir":
            raise Exception('Invalid name "%s"' % name)

class B(A):
    @VALIDATE
    def name_validation(self, name):
        return name

b = B()
b.name_validation('jacob')  # should raise exception
This question may look silly (since I am new to Python), but can you tell me what the difference is between self and the class name when binding?
class OnlyOne(object):
    class __OnlyOne:
        def __init__(self):
            self.val = None

        def __str__(self):
            return repr(self) + self.val

    instance = None

    def __new__(cls):  # __new__ is always a classmethod
        if not OnlyOne.instance:
            OnlyOne.instance = OnlyOne.__OnlyOne()
        return OnlyOne.instance

    def __getattr__(self, name):
        return getattr(self.instance, name)

    def __setattr__(self, name, value):
        return setattr(self.instance, name, value)
Here, instance is used where I would usually use self... What is the difference between using self and OnlyOne? My intuition tells me that it is a global variable... if it is a global variable, it does not make sense at all (I will edit this if it's a global variable). Thanks!!
Ok, I think I've got a handle on your code ... The way it works is that when the constructor is called:
a = OnlyOne()  # call the constructor; this implicitly calls __new__
At this point, __new__ checks the class to see if an instance has already been created (instance isn't None). If it hasn't, it creates an instance and stores it in the instance class attribute. The instance class attribute is then returned, and that is what gets passed into your methods as self.
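You can see this by constructing it twice (assuming the class above):

a = OnlyOne()
b = OnlyOne()
print(a is b)  # -> True: both names refer to the single __OnlyOne instance
a.val = "spam"
print(b.val)   # -> spam, since a and b share all state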
I think that if you actually need a singleton, then there's something fishy (lazy) about your program design. Singletons allow information to propagate throughout your program in strange ways (imagine functions foo and bar, both of which create an instance of OnlyOne: changes you make in foo show up when you call bar) -- it's somewhat akin to monkey patching.
If, after rethinking your design for a few months, you decide that you really do need a singleton, you can create some sort of factory class which is a lot more transparent...
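As a rough illustration of what such a factory might look like (a minimal sketch; Config and ConfigFactory are hypothetical names):

class Config(object):
    """An ordinary class; nothing singleton-ish hidden in its construction."""

class ConfigFactory(object):
    _shared = None

    @classmethod
    def get(cls):
        # Hand out one shared instance, but make the sharing explicit
        # at the call site instead of hiding it inside __new__.
        if cls._shared is None:
            cls._shared = Config()
        return cls._shared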
I'm seeking advice about design of my code.
Introduction
I have several classes, each representing one file type, e.g. MediaImageFile, MediaAudioFile, and the generic (and also base) class MediaGenericFile.
Each file has two variants, Master and Version, so I created these classes to define their specific behaviour. EDIT: Version represents a resized/cropped/trimmed/etc. variant of the Master file. It's used mainly for previews.
EDIT: The reason why I want to do it dynamically is that this app should be reusable (it's a Django app), and therefore it should be easy to implement another MediaGenericFile subclass without modifying the original code.
What I want to do
First of all, users should be able to register their own MediaGenericFile subclasses without affecting the original code.
Whether a file is a version or a master is easily (one regexp) recognizable from its filename.
/path/to/master.jpg -- master
/path/to/.versions/master_version.jpg -- version
Master/Version classes use some methods/properties of MediaGenericFile, like filename (you need to know the filename to generate a new version).
MediaGenericFile extends LazyFile, which is just lazy File object.
Now I need to put it together…
Used design
Before I started coding the 'versions' feature, I had a factory class MediaFile, which returns the appropriate file type class according to extension:
>>> MediaFile('path/to/image.jpg')
<<< <MediaImageFile 'path/to/image.jpg'>
The classes Master and Version define new methods which use methods and attributes of MediaGenericFile, etc.
Approach 1
One approach is to dynamically create a new type which inherits from Master (or Version) and MediaGenericFile (or a subclass):
class MediaFile(object):
    def __new__(cls, *args, **kwargs):
        ...  # decision about klass
        if version:
            bases = (Version, klass)
            class_name = '{0}Version'.format(klass.__name__)
        else:
            bases = (Master, klass)
            class_name = '{0}Master'.format(klass.__name__)
        new_class = type(class_name, bases, {})
        ...
        return new_class(*args, **kwargs)
Approach 2
The second approach is to create a method 'contribute_to_instance' in Master/Version and call it after creating new_class, but it's trickier than I thought:
class Master(object):
    @classmethod
    def contribute_to_instance(cls, instance):
        methods = (...)
        for m in methods:
            setattr(instance, m, types.MethodType(getattr(cls, m), instance))

class MediaFile(object):
    def __new__(cls, *args, **kwargs):
        ...  # decision about new_class
        obj = new_class(*args, **kwargs)
        if version:
            version_class = Version
        else:
            version_class = Master
        version_class.contribute_to_instance(obj)
        ...
        return obj
However, this doesn't work. There are still problems with calling Master/Version's methods.
Questions
What would be a good way to implement this multiple inheritance?
What is this problem called? :) I was trying to find some solutions, but I simply don't know how to name this problem.
Thanks in advance!
Note to answers
ad larsmans
Comparison and instance checks wouldn't be a problem in my case, because:
Comparisons are redefined anyway
class MediaGenericFile(object):
    def __eq__(self, other):
        return self.name == other.name
I never need to check isinstance(instance, MediaGenericFileVersion). I'm using isinstance(instance, MediaGenericFile) and isinstance(instance, Version), and both work as expected.
Nevertheless, creating a new type per instance sounds to me like a considerable defect.
Well, I could create both variations dynamically in the metaclass and then use them, something like:
>>> MediaGenericFile.version_class
<<< <class MediaGenericFileVersion>
>>> MediaGenericFile.master_class
<<< <class MediaGenericFileMaster>
And then:
class MediaFile(object):
    def __new__(cls, *args, **kwargs):
        ...  # decision about klass
        if version:
            attr_name = 'version_class'
        else:
            attr_name = 'master_class'
        new_class = getattr(klass, attr_name)
        ...
        return new_class(*args, **kwargs)
Final solution
In the end, the design pattern is a factory class. MediaGenericFile subclasses are defined statically, and users can implement and register their own. Master/Version variants are created dynamically (glued together from several mixins) in the metaclass and stored in a 'cache' to avoid the perils mentioned by larsmans.
Thanks to everyone for the suggestions. Finally I understand the metaclass concept. Well, at least I think I understand it. Push origin master…
I'd certainly advise against the first approach of constructing classes in __new__. The problem with it is that you create a new type per instance, which causes overhead and, worse, causes type comparisons to fail:
>>> Ham1 = type("Ham", (object,), {})
>>> Ham2 = type("Ham", (object,), {})
>>> Ham1 == Ham2
False
>>> isinstance(Ham1(), Ham2)
False
>>> isinstance(Ham2(), Ham1)
False
This violates the principle of least surprise because the classes may seem entirely identical:
>>> Ham1
<class '__main__.Ham'>
>>> Ham2
<class '__main__.Ham'>
You can get approach 1 to work properly, though, if you construct the classes at the module level, outside of MediaFile:
classes = {}

for klass in [MediaImageFile, MediaAudioFile]:
    for variant in [Master, Version]:
        # I'd actually do this the other way around,
        # making Master and Version mixins
        bases = (variant, klass)
        name = klass.__name__ + variant.__name__
        classes[name] = type(name, bases, {})
then, in MediaFile.__new__, look the required class up by name in classes. (Alternatively, set the newly constructed classes on the module instead of in a dict.)
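The lookup might then be sketched like this (detect_type and is_version are hypothetical helpers standing in for your extension check and the one-regexp filename test):

class MediaFile(object):
    def __new__(cls, path, *args, **kwargs):
        klass = detect_type(path)                          # hypothetical, e.g. -> MediaImageFile
        variant = Version if is_version(path) else Master  # hypothetical regexp check
        new_class = classes[klass.__name__ + variant.__name__]
        return new_class(path, *args, **kwargs)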
I'm not sure how dynamic you want it to be, but using a "factory pattern" (here using a class factory), is fairly readable and understandable and may do what you want. This could serve as a base... MediaFactory could be smarter, and you could register multiple other classes, instead of hard-coding MediaFactoryMaster etc...
class MediaFactory(object):
    __items = {}

    @classmethod
    def make(cls, item):
        return cls.__items[item]

    @classmethod
    def register(cls, item):
        def func(kls):
            cls.__items[item] = kls
            return kls
        return func

class MediaFactoryMaster(MediaFactory, Master): pass
class MediaFactoryVersion(MediaFactory, Version): pass

class MediaFile(object):
    pass

@MediaFactoryMaster.register('jpg')  # adapt to take ['jpg', 'gif', 'png'] ?
class MediaFileImage(MediaFile):
    pass

@MediaFactoryVersion.register('mp3')  # adapt to take ['mp3', 'ogg', 'm4a'] ?
class MediaFileAudio(MediaFile):
    pass
Another possible MediaFactory.make:
@classmethod
def make(cls, fname):
    name, ext = somefunc(fname)
    kls = cls.__items[ext]
    other = Version if version else Master  # 'version': whether fname matched the version regexp
    return type('{}{}'.format(kls.__name__, other.__name__), (kls, other), {})
How come you're not using inheritance but are playing around with __new__?
class GenericFile(File):
    """Base class"""

class Master(object):
    """Master mixin"""

class Versioned(object):
    """Versioning mixin"""

class ImageFile(GenericFile):
    """Image files"""

class MasterImage(ImageFile, Master):
    """Whatever"""

class VersionedImage(ImageFile, Versioned):
    """Blah blah blah"""
...
It's not clear why you're doing this though. I think there's a weird code smell here. I'd recommend fewer classes with a consistent interface (duck-typing) rather than a dozen classes and isinstance checks throughout the code to make it all work.
Perhaps you can update your question with what you'd like to do in your code, and folks can help either identify the real pattern or suggest a more idiomatic solution.
You do not have to create a new class for each instance. Don't create the new classes in __new__; create them in the metaclass. Define a metaclass in the base class or in the base module. The two "variant" subclasses are easily saved as class attributes of their generic parent, and then __new__ just looks at the filename according to its own rules and decides which subclass to return.
Watch out for __new__ returning an instance of a class other than the one "nominated" during the constructor call. You may have to take steps to invoke __init__ from within __new__.
Subclasses will either have to:
"register" themselves with a factory or parent to be found
be imported and then have the parent or factory find them through a recursive search of cls.__subclasses__() (this might have to happen once per creation, but that's probably not a problem for file handling) -- see the sketch after this list
be found through the use of "setuptools" entry_points-style tools, but that requires more effort and coordination by the user
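The recursive search from the second option might be sketched like this:

def iter_subclasses(cls):
    # Walk cls.__subclasses__() recursively, yielding every descendant.
    for sub in cls.__subclasses__():
        yield sub
        for descendant in iter_subclasses(sub):
            yield descendant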
The OOD question you should be asking is "do the various classes of my proposed inheritance share any properties at all?"
The purpose of inheritance is to share common data or methods that the instances naturally have in common. Aside from both being files, what do Image files and Audio files have in common? If you really want to stretch your metaphors, you could conceivably have AudioFile.view() which could present — for example — a visualization of the power spectra of the audio data, but ImageFile.listen() makes even less sense.
I think your question side-steps this language-independent conceptual issue in favor of the Python-dependent mechanics of an object factory. I don't think you have a proper case of inheritance here, or you've failed to explain what common features your Media objects need to share.
I recently developed a class named DocumentWrapper around some ORM document object in Python to transparently add some features to it without changing its interface in any way.
I just have one issue with this. Let's say I have some User object wrapped in it. Calling isinstance(some_var, User) will return False because some_var indeed is an instance of DocumentWrapper.
Is there any way to fake the type of an object in Python to have the same call return True?
You can use the __instancecheck__ magic method to override the default isinstance behaviour. Note that it has to be defined on the class's metaclass (the type of User), not on the class itself, as the metaclass answer below demonstrates:

@classmethod
def __instancecheck__(cls, instance):
    return isinstance(instance, User)
This is only if you want your object to be a transparent wrapper; that is, if you want a DocumentWrapper to behave like a User. Otherwise, just expose the wrapped class as an attribute.
This mechanism came in with abstract base classes (PEP 3119), so it's available from Python 2.6 onwards as well as in Python 3.
Override __class__ in your wrapper class DocumentWrapper:
class DocumentWrapper(object):
    @property
    def __class__(self):
        return User
>>> isinstance(DocumentWrapper(), User)
True
This way no modifications to the wrapped class User are needed.
Python Mock does the same (see mock.py:612 in mock-2.0.0, couldn't find sources online to link to, sorry).
Testing the type of an object is usually an antipattern in Python. In some cases it makes sense to test the "duck type" of the object, something like:
hasattr(some_var, "username")
But even that's undesirable, for instance there are reasons why that expression might return false, even though a wrapper uses some magic with __getattribute__ to correctly proxy the attribute.
It's usually preferred to allow variables only take a single abstract type, and possibly None. Different behaviours based on different inputs should be achieved by passing the optionally typed data in different variables. You want to do something like this:
def dosomething(some_user=None, some_otherthing=None):
    if some_user is not None:
        pass  # do the "User" type action
    elif some_otherthing is not None:
        pass  # etc...
    else:
        raise ValueError("not enough arguments")
Of course, this all assumes you have some level of control over the code that is doing the type checking. Suppose you don't. For isinstance() to return True, the class must appear in the instance's bases, or the class must have an __instancecheck__. Since you don't control either of those things for the class, you have to resort to some shenanigans on the instance. Do something like this:
def wrap_user(instance):
    class wrapped_user(type(instance)):
        __metaclass__ = type

        def __init__(self):
            pass

        def __getattribute__(self, attr):
            self_dict = object.__getattribute__(type(self), '__dict__')
            if attr in self_dict:
                return self_dict[attr]
            return getattr(instance, attr)

        def extra_feature(self, foo):
            return instance.username + foo  # or whatever

    return wrapped_user()
What we're doing is creating a new class dynamically at the time we need to wrap the instance, and actually inherit from the wrapped object's __class__. We also go to the extra trouble of overriding the __metaclass__, in case the original had some extra behaviors we don't actually want to encounter (like looking for a database table with a certain class name). A nice convenience of this style is that we never have to create any instance attributes on the wrapper class, there is no self.wrapped_object, since that value is present at class creation time.
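Usage would then look roughly like this (user stands in for some existing ORM instance):

wrapped = wrap_user(user)
print(isinstance(wrapped, type(user)))  # -> True: it really subclasses user's class
print(wrapped.username)                 # attribute access is proxied through to user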
Edit: As pointed out in comments, the above only works for some simple types, if you need to proxy more elaborate attributes on the target object, (say, methods), then see the following answer: Python - Faking Type Continued
Here is a solution using a metaclass, but you need to modify the wrapped classes:
>>> import abc
>>> class DocumentWrapper:
        def __init__(self, wrapped_obj):
            self.wrapped_obj = wrapped_obj
>>> class MetaWrapper(abc.ABCMeta):
        def __instancecheck__(self, instance):
            try:
                return isinstance(instance.wrapped_obj, self)
            except AttributeError:
                return isinstance(instance, self)
>>> class User(metaclass=MetaWrapper):
        pass
>>> user = DocumentWrapper(User())
>>> isinstance(user, User)
True
>>> class User2:
        pass
>>> user2 = DocumentWrapper(User2())
>>> isinstance(user2, User2)
False
It sounds like you want to test the type of the object your DocumentWrapper wraps, not the type of the DocumentWrapper itself. If that's right, then the interface to DocumentWrapper needs to expose that type. You might add a method to your DocumentWrapper class that returns the type of the wrapped object, for instance. But I don't think that making the call to isinstance ambiguous, by making it return True when it's not, is the right way to solve this.
The best way is to inherit DocumentWrapper from User itself, or to use the mix-in pattern and inherit from several classes:
class DocumentWrapper(User, object): ...
You can also fake isinstance() results by manipulating obj.__class__, but this is deep-level magic and should not be done.
For putting methods of various classes into a global registry I'm using a decorator with a metaclass. The decorator tags, the metaclass puts the function in the registry:
class ExposedMethod(object):
    def __init__(self, decoratedFunction):
        self._decoratedFunction = decoratedFunction

    def __call__(__self, *__args, **__kw):
        return __self._decoratedFunction(*__args, **__kw)

class ExposedMethodDecoratorMetaclass(type):
    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.iteritems():
            if isinstance(obj, ExposedMethod):
                WorkerFunctionRegistry.addWorkerToWorkerFunction(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class MyClass(object):
    __metaclass__ = ExposedMethodDecoratorMetaclass

    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
I've now come to the point where two function registries are needed. The first thought was to subclass the metaclass and put the other registry in. For that, __new__ simply has to be rewritten.
Since rewriting means redundant code, this is not what I really want. So it would be nice if anyone could name a way to put an attribute inside the metaclass which can be read when __new__ is executed. With that, the right registry could be put in without having to rewrite __new__.
Your ExposedMethod instances do not behave as normal instance methods but rather like static methods -- the fact that you're giving one of them a self argument hints that you're not aware of that. You may need to add a __get__ method to the ExposedMethod class to make it a descriptor, just like function objects are -- see here for more on descriptors.
But there is a much simpler way, since functions can have attributes...:
def ExposedMethod(registry=None):
    def decorate(f):
        f.registry = registry
        return f
    return decorate
and in a class decorator (simpler than a metaclass! requires Python 2.6 or better -- in 2.5 or earlier you'll need to stick w/the metaclass or explicitly call this after the class statement, though the first part of the answer and the functionality of the code below are still perfectly fine):
def RegisterExposedMethods(cls):
    for name, f in vars(cls).iteritems():
        if not hasattr(f, 'registry'):
            continue
        registry = f.registry
        if registry is None:
            registry = cls.registry
        registry.register(name, cls.__name__)
    return cls
So you can do:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod(WorkerFunctionRegistry)
    def myCoolExposedMethod(self):
        pass
and the like. This is easily extended to allowing an exposed method to have several registries, get the default registry elsewhere than from the class (it could be in the class decorator, for example, if that works better for you) and avoids getting enmeshed with metaclasses without losing any functionality. Indeed that's exactly why class decorators were introduced in Python 2.6: they can take the place of 90% or so of practical uses of metaclasses and are much simpler than custom metaclasses.
You can use a class attribute to point to the registry you want to use in the specialized metaclasses, e.g.:
class ExposedMethodDecoratorMetaclassBase(type):
    registry = None

    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.items():
            if isinstance(obj, ExposedMethod):
                mcs.registry.register(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class WorkerExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = WorkerFunctionRegistry

class RetiredExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = RetiredFunctionRegistry
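A class then simply picks the metaclass whose registry it wants; a minimal sketch reusing the ExposedMethod class from the question (Python 2 syntax):

class MyWorkerClass(object):
    __metaclass__ = WorkerExposedMethodDecoratorMetaclass

    @ExposedMethod
    def myCoolExposedMethod(self):
        pass

# On class creation the metaclass registers myCoolExposedMethod
# with WorkerFunctionRegistry.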
Thank you both for your answers. Both helped a lot in finding a proper way to meet my requirement.
My final solution to the problem is the following:
def ExposedMethod(decoratedFunction):
    decoratedFunction.isExposed = True
    return decoratedFunction

class RegisterExposedMethods(object):
    def __init__(self, decoratedClass, registry):
        self._decoratedClass = decoratedClass
        for name, f in vars(self._decoratedClass).iteritems():
            if hasattr(f, "isExposed"):
                registry.addComponentClassToComponentFunction(name, self._decoratedClass.__name__)
        # cloak us as the original class
        self.__class__.__name__ = decoratedClass.__name__

    def __call__(self, *__args, **__kw):
        return self._decoratedClass(*__args, **__kw)

    def __getattr__(self, name):
        return getattr(self._decoratedClass, name)
On a Class I wish to expose methods from I do the following:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
The class decorator is now very easy to subclass. Here is an example:
class DiscoveryRegisterExposedMethods(RegisterExposedMethods):
    def __init__(self, decoratedClass):
        RegisterExposedMethods.__init__(self,
                                        decoratedClass,
                                        DiscoveryFunctionRegistry())
With that the comment of Alex
Your ExposedMethod instances do not behave as normal instance methods ...
is no longer true, since the method is simply tagged and not wrapped.
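A quick check of that (a sketch, assuming the classes above with a registry wired up):

print(MyClass.myCoolExposedMethod.isExposed)  # -> True: just a tagged, plain function
instance = MyClass()
instance.myCoolExposedMethod()                # called like any normal method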