I apologize for not giving this question a better title; the reason that I am posting it is that I don't even have the correct terminology to know what I am looking for.
I have defined a class with an attribute 'spam':
class SpamClass(object):
def __init__(self, arg):
self.spam = arg
def __str__(self):
return self.spam
I want to create a (sub/sibling?)class that has exactly the same functionality, but with an attribute named 'eggs' instead of 'spam':
class EggsClass(object):
def __init__(self, arg):
self.eggs = arg
def __str__(self):
return self.eggs
To generalize, how do I create functionally-identical classes with arbitrary attribute names? When the class has complicated behavior, it seems silly to duplicate code.
Update: I agree that this smells like bad design. To clarify, I'm not trying to solve a particular problem in this stupid way. I just want to know how to arbitrarily name the (non-magic) contents of an object's __dict__ while preserving functionality. Consider something like the keys() method for dict-like objects. People create various classes with keys() methods that behave according to convention, and the naming convention is a Good Thing. But the name is arbitrary. How can I make a class with a spam() method that exactly replaces keys() without manually substituting /keys/spam/ in the source?
Overloading __getattr__ and friends to reference the generic attribute seems inelegant and brittle to me. If a subclass reimplements these methods, it must accommodate this behavior. I would rather have it appear to the user that there is simply a base class with a named attribute that can be accessed naively.
Actually, I can think of a plausible use case. Suppose that you want a mixin class that confers a special attribute and some closely related methods that manipulate or depend upon this attribute. A user may want to name this special attribute differently for different classes (to match names in the real-world problem domain or to avoid name collisions) while reusing the underlying behavior.
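For concreteness, here is a rough sketch of the kind of thing I mean; special_attr_mixin and everything inside it are names I am making up purely for illustration:
def special_attr_mixin(attr_name):
    # build a mixin whose special attribute gets whatever name the caller chooses
    class SpecialAttrMixin(object):
        def __init__(self, arg):
            setattr(self, attr_name, arg)
        def shouted(self):
            # one of the "closely related methods" that depend on the attribute
            return str(getattr(self, attr_name)).upper()
    return SpecialAttrMixin

class Spam(special_attr_mixin('spam')):
    pass

class Eggs(special_attr_mixin('eggs')):
    pass

print(Spam('lovely').spam)      # lovely
print(Eggs('fried').shouted())  # FRIED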
Here is a way to get the effect I think you want.
Define a generic class with a generic attribute name. Then, in each subclass, follow the advice in http://docs.python.org/reference/datamodel.html#customizing-attribute-access to make the attribute look externally like it is called whatever you want it called.
Your description of what you do feels like it has a "code smell" to me; I'd suggest reviewing your design very carefully to see whether this is really what you want to do. But you can make it work with my suggestion.
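For example, a minimal sketch of that advice for read access (the generic _value and per-subclass attr_name names are mine, not from the docs):
class GenericBase(object):
    attr_name = 'spam'
    def __init__(self, arg):
        self._value = arg
    def __getattr__(self, name):
        # expose the generic value under whatever name the subclass chose
        if name == self.attr_name:
            return self._value
        raise AttributeError(name)
    def __str__(self):
        return self._value

class EggsClass(GenericBase):
    attr_name = 'eggs'

print(EggsClass('over easy').eggs)   # over easy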
You can also create a super-class with all common stuff and then sub-classes with specific attributes.
Or even:
class SuperClass(object):
    specific_attribute = 'unset'
    def __init__(self, arg):
        setattr(self, self.specific_attribute, arg)
    def __str__(self):
        return getattr(self, self.specific_attribute)

class EggClass(SuperClass):
    specific_attribute = 'eggs'
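A quick usage sketch of that pattern, assuming the definitions above:
e = EggClass('fried')
print(e.eggs)   # fried
print(e)        # fried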
Have you considered not overcomplicating things and just creating one class? (since they are identical anyway)
class FoodClass(object):
def __init__(self, foodname, arg):
self.attrs = {foodname: arg}
self.foodname = foodname
def __str__(self):
return self.attrs[self.foodname]
If you want some nice constructors, just create them separately:
def make_eggs(arg):
return FoodClass('eggs', arg)
def make_spam(arg):
return FoodClass('spam', arg)
To create attributes at runtime, just add them to self.__dict__ in the class code, e.g. self.__dict__['foo'] = "I'm foo".
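A quick usage sketch of the above, assuming the FoodClass definition and constructors as written:
eggs = make_eggs('scrambled')
print(eggs)                          # scrambled
eggs.__dict__['note'] = "I'm foo"    # adding an attribute at runtime
print(eggs.note)                     # I'm foo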
Description & What I've tried:
I have seen many posts on Stack Overflow about binding methods to class instances (I'm aware there are a bunch of duplicates already).
However, I haven't found a discussion about binding a method to the class itself. I can think of workarounds, but I'm curious whether there is a simple way to achieve the following:
import types
def quacks(some_class):
def quack(self, number_of_quacks):
self.number_of_quacks = number_of_quacks
setattr(some_class, "quack", types.MethodType(quack, some_class))
return some_class
@quacks
class Duck:
pass
but the above would not work:
d1 = Duck()
d2 = Duck()
d1.quack(1)
d2.quack(2)
print(d2.number_of_quacks)
# 2
print(d1.number_of_quacks)
# 2
because quack is actually modifying the class itself rather than the instance.
There are two workarounds I can think of. Either something like below:
class Duck:
def __init__(self):
setattr(self, "quack", types.MethodType(quack, self))
or something like
class Quacks:
def quack(self, number_of_quacks):
self.number_of_quacks = number_of_quacks
class Duck(Quacks):
pass
Question:
So my question is, is there a simple way to achieve the simple #quacks class decorator I described above?
Why I'm asking:
I intend to create a set of functions to modularly add common methods I use to classes. If I don't quit this project, the list is likely to grow over time, and I would prefer it to look nice in the code. And as a matter of taste, I think option 1 below looks nicer than option 2:
# option 1
@quacks
@walks
@has_wings
@is_white
@stuff
class Duck:
pass
# option 2
class Duck(
Quacks,
Walks,
HasWings,
IsWhite,
Stuff):
pass
If you don't mind changing your desired syntax completely to get the functionality you want, you can dynamically construct classes with type (see second signature).
The first argument is the name of the class, the second is a tuple of superclasses, and the third is a dictionary of attributes to add.
Duck = type("Duck", (), {
    "quack": quack_function,
    "walk": walk_function,
    ...
})
So, instead of decorators that inject the appropriate functionality after creation, you are simply adding the functionality directly at the time of creation. The nice thing about this method is that you can programmatically build the attribute dictionary, whereas with decorators you cannot.
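For instance, a small sketch of building that attribute dictionary programmatically (quack and walk are stand-ins for the real behaviour functions):
def quack(self, number_of_quacks):
    self.number_of_quacks = number_of_quacks

def walk(self, steps):
    self.steps_walked = steps

# collect the behaviours by name and hand them to type() in one go
behaviours = {fn.__name__: fn for fn in (quack, walk)}
Duck = type("Duck", (), behaviours)

d = Duck()
d.quack(3)
print(d.number_of_quacks)   # 3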
I found another workaround; I guess the below would do it for me.
def quacks(some_class):
def quack(self, number_of_quacks):
self.number_of_quacks = number_of_quacks
old__init__ = some_class.__init__
def new__init__(self, *args, **kwargs):
setattr(self, "quack", types.MethodType(quack, self))
old__init__(self, *args, **kwargs)
setattr(some_class, "__init__", new__init__)
return some_class
Feel free to add any other alternatives, or if you see any drawbacks with this approach.
Edit: a less hacky way, inspired by @SethMMorton's answer:
def quack(self, number_of_quacks):
self.number_of_quacks = number_of_quacks
def add_mixin(some_class, some_fn):
new_class = type(some_class.__name__, (some_class,), {
some_fn.__name__: some_fn
})
return new_class
def quacks(some_class):
return add_mixin(some_class, quack)
@quacks
class Duck:
pass
d1 = Duck()
d2 = Duck()
d1.quack(1)
d2.quack(2)
print(d1.number_of_quacks)
print(d2.number_of_quacks)
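The same add_mixin helper can back every decorator in my planned list; walk below is just a second illustrative behaviour (not something I have actually written yet):
def walk(self, steps):
    self.steps_walked = steps

def behaviour(fn):
    # build a class decorator that mixes a single function into a class
    def decorator(some_class):
        return add_mixin(some_class, fn)
    return decorator

walks = behaviour(walk)   # and likewise has_wings, is_white, ...

@quacks
@walks
class Duck:
    pass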
I've seen a few questions with similar titles, but none of them seems suitable for me.
I'd like to create python objects, possibly with methods, on the fly with clean and "pythonic" syntax. What I have so far is:
import types
class QuickObject:
def __init__(self, **kwargs):
self.__dict__.update(kwargs)
def with_methods(self, **kwargs):
for name, method in kwargs.items():
self.__dict__[name] = types.MethodType(method, self)
return self
Possible usages:
ob = QuickObject(x=10, y=20)
print(ob.x, ob.y)
ob = QuickObject(x=10, y=20).with_methods(sum=lambda s: s.x + s.y)
print(ob.x, ob.y, ob.sum())
It works and looks clear, but it seems a little bit bizarre, and I believe there's a better way to achieve this than my definition of the QuickObject class.
I am aware of namedtuple, but it's immutable, doesn't support methods, and isn't that convenient because it has to be defined beforehand for every object with different fields.
Built-in dictionaries also don't provide syntax for method calls, nor do they allow accessing data with dictionary.key syntax without a custom dict subclass.
TLDR: I am looking for a recipe for on-the-fly objects with:
mutable fields,
methods,
nice ClassName(field1=value, field2=value, ...) syntax.
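For comparison (not a complete solution for my list): types.SimpleNamespace from the standard library already covers the mutable fields and the keyword-constructor syntax, but methods still have to be bound by hand:
import types

ns = types.SimpleNamespace(x=10, y=20)              # ClassName(field=value, ...) syntax
ns.x = 15                                           # mutable fields
ns.sum = types.MethodType(lambda s: s.x + s.y, ns)  # manually bound method
print(ns.x, ns.y, ns.sum())                         # 15 20 35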
Further reading:
How to use a dot "." to access members of dictionary?
How to create objects on the fly in python?
Using type (external link)
Say I have a third-party library where a metaclass requires me to implement something. But I want to have an intermediate "abstract" subclass that doesn't. How can I do this?
Consider this to be a very minimal example of what third-party library has:
class ServingMeta(type):
def __new__(cls, name, bases, classdict):
if any(isinstance(b, ServingMeta) for b in bases):
if "name" not in classdict:
# Actual code fails for a different reason,
# but the logic is the same.
raise TypeError(f"Class '{name}' has no 'name' attribute")
return super().__new__(cls, name, bases, classdict)
class Serving(object, metaclass=ServingMeta):
def shout_name(self):
return self.name.upper()
I cannot modify the code above. It's an external dependency (and I don't want to fork it).
The code is meant to be used this way:
class Spam(Serving):
name = "SPAM"
spam = Spam()
print(spam.shout_name())
However, I happen to have a lot of spam, and I want to introduce a base class with the common helper methods. Something like this:
class SpamBase(Serving):
    def thrice(self):
        return " ".join([self.shout_name()] * 3)

class LovelySpam(SpamBase):
    name = "lovely spam"

class WonderfulSpam(SpamBase):
    name = "wonderful spam"
Obviously, this doesn't work and fails with the well-expected TypeError: Class 'SpamBase' has no 'name' attribute. Had the third-party library provided a SpamBase class without a metaclass, I could have subclassed that - but no such luck this time (I've mentioned this inconvenience to the library authors).
I can make it a mixin:
class SpamMixin(object):
def thrice(self):
return " ".join([self.shout_name()] * 3)
class LovelySpam(SpamMixin, Serving):
name = "lovely spam"
class WonderfulSpam(SpamMixin, Serving):
name = "wonderful spam"
However, this makes me and my IDE cringe a little, as it quickly becomes cumbersome to repeat SpamMixin everywhere and also because object has no shout_name attribute (and I don't want to silence analysis tools). In short, I just don't like this approach.
What else can I do?
Is there a way to get a metaclass-less version of Serving? I think of something like this:
ServingBase = remove_metaclass(Serving)
class Spam(ServingBase, metaclass=ServingMeta):
...
But I don't know how to actually implement remove_metaclass, or whether it's even reasonably possible (of course, it must be doable with some introspection, but it could require more arcane magic than I can cast).
Any other suggestions are also welcomed. Basically, I want to have my code DRY (one base class to rule them all), and have my linter/code analysis icons all green.
The mixin approach is the correct way to go. If your IDE "cringes", that is a defect in that tool - just disable the "features" that are obviously tuned incorrectly for a dynamic language like Python.
And this is not even about creating things dynamically; it is merely multiple inheritance, which the language has supported since forever. One of the main uses of multiple inheritance is exactly being able to create mixins just like the one you need.
Another inheritance-based workaround is to make your hierarchy one level deeper, and just introduce the metaclass after you come up with your mixin methods:
class Mixin(object):
def mimixin(self): ...
class SpamBase(Mixin, metaclass=ServingMeta):
name = "stub"
Or just add the mixin in an intermediate subclass:
class Base(metaclass=ServingMeta):
name = "stub"
class MixedBase(Mixin, Base):
name = "stub"
class MyLovingSpam(MixedBase):
name = "MyLovingSpam"
If you don't want to repeat the mixin base name in every class, that is the way to go.
"Removing" a metaclass just for the sake of having a late mixin is way over the top. Really. Broken. The way to do it wol e re-create the class dynamically, just as #vaultah mentions in the other answer, but doing that in an intermediate class is a thing you should not do. Doing that to please the IDE is something you should not do twice: messing with metaclasses is hard enough already. Removing things on inheritance/class creation that the language puts there naturally is something nasty (cf. this answer: How to make a class attribute exclusive to the super class ) . On the other hand, mixins and multiple inheritance are just natural.
Are you still there? I told you not to do so:
Now, onto your question - instead of "suppressing the metaclass" in an intermediate class, it would be more feasible to inherit the metaclass you have there and change its behavior, so that it does not check for the constraints in specially marked classes. Create an attribute for your use, like _skip_checking:
class MyMeta(ServingMeta):
def __new__(metacls, name, bases, namespace):
if namespace.get("_skip_checking", False):
# hardcode call to "type" metaclass:
del namespace["_skip_checking"]
cls = type.__new__(metacls, name, bases, namespace)
else:
cls = super().__new__(metacls, name, bases, namespace)
return cls
# repeat for __init__ if needed.
class Base(metaclass=MyMeta):
_skip_checking = True
# define mixin methods
class LoveSpam(Base):
name = "LoveSpam"
There's really no direct way to remove the metaclass from a Python class, because the metaclass created that class. What you can try is re-create the class using a different metaclass, which doesn't have unwanted behaviour. For example, you could use type (the default metaclass).
In [6]: class Serving(metaclass=ServingMeta):
...: def shout_name(self):
...: return self.name.upper()
...:
In [7]: ServingBase = type('ServingBase', Serving.__bases__, dict(vars(Serving)))
Basically this takes the __bases__ tuple and the namespace of the Serving class, and uses them to create a new class ServingBase. N.B. this means that ServingBase will receive all bases and methods/attributes from Serving, some of which may have been added by ServingMeta.
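A quick usage sketch (mine, not from the answer) to show the intent, assuming the ServingMeta/Serving definitions above:
class SpamBase(ServingBase):              # plain metaclass: no 'name' required here
    def thrice(self):
        return " ".join([self.shout_name()] * 3)

class LovelySpam(SpamBase, Serving):      # the concrete class still goes through ServingMeta
    name = "lovely spam"

print(LovelySpam().thrice())              # LOVELY SPAM LOVELY SPAM LOVELY SPAM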
My attempt at answering this question draws on this question: Cast base class to derived class python (or more pythonic way of extending classes)
I'm writing a mixin class that will add some functionality to an object returned by another module. The code in the other module looks like this:
class Foo(Mixin):
def __new__(cls, *args, **kwargs):
#...handle a bunch of cases
if case1:
return FooTypeA
elif case2:
return FooTypeB
#... etc
class FooTypeA(Mixin):
#...
class FooTypeB(Mixin):
#...
I've written MyMixin, which adds a little bit of functionality to the objects returned by Foo. My attempt at solving the problem is this:
from other_module import Foo, FooTypeA, FooTypeB, ...
class MyFoo(Foo):
    def __new__(cls, *args, **kwargs):
        #...handle most of the same cases
        if case1:
            ret = FooTypeA(*args, **kwargs)
            ret.__class__ = MyFooTypeA
        elif case2:
            ret = FooTypeB(*args, **kwargs)
            ret.__class__ = MyFooTypeB
        #... etc
        return ret
class MyFooTypeA(FooTypeA, MyMixin):
pass
class MyFooTypeB(FooTypeB, MyMixin):
pass
This looks really, really ugly. Is there really no better solution?
If not, why?
EDIT: I thought it would be easier without going into specifics, but the code I'm actually working on is here. The author of this module has written "WebDriverMixin," which mostly provides some nicer syntax for accessing elements on the page that an instance of the selenium webdriver is on. I have "SiteSpecificMixin," which provides some nicer syntax for accessing elements of the particular site that I'm testing.
webdriverplus.WebDriver returns instances of webdriverplus.Firefox, webdriverplus.Chrome, webdriverplus.Ie, etc. webdriverplus.Firefox inherits from webdriverplus.WebDriverMixin and selenium.webdriver.firefox.webdriver.Firefox, webdriverplus.Chrome inherits from webdriverplus.WebDriverMixin and selenium.webdriver.chrome.webdriver.Chrome, etc.
I want to add functionality to the objects that webdriverplus.WebDriver returns, which seems to require making a class, mysite.SiteSpecificDriver, copy+pasting the body of webdriverplus.WebDriver.__new__ into mysite.SiteSpecificDriver.__new__, and then writing mysite.Firefox (which needs to inherit from webdriverplus.Firefox and mysite.SiteSpecificMixin), mysite.Chrome (which needs to inherit from webdriverplus.Chrome and mysite.SiteSpecificMixin), etc., and re-handling all of the browsers inside my own module that the original author handles in his.
I'm handling it now with code like in my example above, and it works. I'm new to OOP, but my understanding of OO techniques is that they're supposed to let you avoid code that has long if-elif-...-else clauses that depend on what type of object you're working with, so I think that I must be doing something wrong.
You can rewrite this in a more dynamic manner:
from other_module import Foo, FooTypeA, FooTypeB
bases = [Foo, FooTypeA, FooTypeB]
class MyMixin(object):
pass
def factory(bases, mixins, name='MyClass'):
return type(name, bases + mixins, {})
new_classes = [factory((c,), (MyMixin,)) for c in bases]
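A small follow-up sketch (the naming scheme is mine, not part of the answer): give the generated classes predictable names and keep them in a dict so dispatch code can look them up:
my_classes = {c.__name__: factory((c,), (MyMixin,), name='My' + c.__name__)
              for c in bases}

MyFooTypeA = my_classes['FooTypeA']       # a subclass of both FooTypeA and MyMixin
assert issubclass(MyFooTypeA, MyMixin)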
I'm seeking advice about design of my code.
Introduction
I have several classes, each represents one file type, eg: MediaImageFile, MediaAudioFile and generic (and also base class) MediaGenericFile.
Each file has two variants: Master and Version, so I created these classes to define their specific behaviour. EDIT: Version represents a resized/cropped/trimmed/etc. variant of the Master file. It's used mainly for previews.
EDIT: The reason why I want to do it dynamically is that this app should be reusable (it's a Django app), and therefore it should be easy to implement another MediaGenericFile subclass without modifying the original code.
What I want to do
First of all, a user should be able to register their own MediaGenericFile subclasses without affecting the original code.
Whether a file is a version or a master is easily recognizable from the filename (one regexp).
/path/to/master.jpg -- master
/path/to/.versions/master_version.jpg -- version
Master/Version classes use some methods/properties of MediaGenericFile, like filename (you need to know the filename to generate a new version).
MediaGenericFile extends LazyFile, which is just lazy File object.
Now I need to put it together…
Used design
Before I started coding the 'versions' feature, I had a factory class MediaFile, which returns the appropriate file type class according to the extension:
>>> MediaFile('path/to/image.jpg')
<<< <MediaImageFile 'path/to/image.jpg'>
Classes Master and Version define new methods which use the methods and attributes of MediaGenericFile, etc.
Approach 1
One approach is to dynamically create a new type which inherits from Master (or Version) and MediaGenericFile (or a subclass).
class MediaFile(object):
def __new__(cls, *args, **kwargs):
... # decision about klass
if version:
bases = (Version, klass)
class_name = '{0}Version'.format(klass.__name__)
else:
bases = (Master, klass)
class_name = '{0}Master'.format(klass.__name__)
new_class = type(class_name, bases, {})
...
return new_class(*args, **kwargs)
Approach 2
The second approach is to create a method 'contribute_to_instance' in Master/Version and call it after creating new_class, but that's trickier than I thought:
class Master(object):
    @classmethod
    def contribute_to_instance(cls, instance):
        methods = (...)
        for m in methods:
            setattr(instance, m, types.MethodType(getattr(cls, m), instance))
class MediaFile(object):
def __new__(cls, *args, **kwargs):
... # decision about new_class
obj = new_class(*args, **kwargs)
if version:
version_class = Version
else:
version_class = Master
version_class.contribute_to_instance(obj)
...
return obj
However, this doesn't work. There are still problems with calling Master/Version's methods.
Questions
What would be a good way to implement this multiple inheritance?
What is this problem called? :) I was trying to find some solutions, but I simply don't know how to name this problem.
Thanks in advance!
Note to answers
ad larsmans
Comparison and instance checks wouldn't be a problem in my case, because:
Comparisons are redefined anyway
class MediaGenericFile(object):
def __eq__(self, other):
return self.name == other.name
I never need to check isinstance(instance, MediaGenericFileVersion). I'm using isinstance(instance, MediaGenericFile) and isinstance(instance, Version), and both work as expected.
Nevertheless, creating a new type per instance sounds like a considerable defect to me.
Well, I could create both variations dynamically in a metaclass and then use them, something like:
>>> MediaGenericFile.version_class
<<< <class MediaGenericFileVersion>
>>> MediaGenericFile.master_class
<<< <class MediaGenericFileMaster>
And then:
class MediaFile(object):
def __new__(cls, *args, **kwargs):
... # decision about klass
if version:
attr_name = 'version_class'
else:
attr_name = 'master_class'
new_class = getattr(klass, attr_name)
...
return new_class(*args, **kwargs)
Final solution
Finally, the design pattern is a factory class. MediaGenericFile subclasses are statically typed, and users can implement and register their own. Master/Version variants are created dynamically (glued together from several mixins) in a metaclass and stored in a 'cache' to avoid the perils mentioned by larsmans.
Thanks to everyone for their suggestions. I finally understand the metaclass concept. Well, at least I think I understand it. Push origin master…
I'd certainly advise against the first approach of constructing classes in __new__. The problem with it is that you create a new type per instance, which causes overhead and worse, causes type comparisons to fail:
>>> Ham1 = type("Ham", (object,), {})
>>> Ham2 = type("Ham", (object,), {})
>>> Ham1 == Ham2
False
>>> isinstance(Ham1(), Ham2)
False
>>> isinstance(Ham2(), Ham1)
False
This violates the principle of least surprise because the classes may seem entirely identical:
>>> Ham1
<class '__main__.Ham'>
>>> Ham2
<class '__main__.Ham'>
You can get approach 1 to work properly, though, if you construct the classes at the module level, outside of MediaFile:
classes = {}
for klass in [MediaImageFile, MediaAudioFile]:
for variant in [Master, Version]:
# I'd actually do this the other way around,
# making Master and Version mixins
bases = (variant, klass)
name = klass.__name__ + variant.__name__
classes[name] = type(name, bases, {})
then, in MediaFile.__new__, look the required class up by name in classes. (Alternatively, set the newly constructed classes on the module instead of in a dict.)
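A minimal sketch of that lookup step (pick_type and is_version are hypothetical helpers standing in for the extension check and the filename regexp described in the question):
class MediaFile(object):
    def __new__(cls, path, *args, **kwargs):
        klass = pick_type(path)                          # e.g. MediaImageFile
        variant = Version if is_version(path) else Master
        new_class = classes[klass.__name__ + variant.__name__]
        return new_class(path, *args, **kwargs)          # fully construct, as in the question's factory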
I'm not sure how dynamic you want it to be, but using a "factory pattern" (here using a class factory), is fairly readable and understandable and may do what you want. This could serve as a base... MediaFactory could be smarter, and you could register multiple other classes, instead of hard-coding MediaFactoryMaster etc...
class MediaFactory(object):
__items = {}
@classmethod
def make(cls, item):
return cls.__items[item]
@classmethod
def register(cls, item):
def func(kls):
cls.__items[item] = kls
return kls
return func
class MediaFactoryMaster(MediaFactory, Master): pass
class MediaFactoryVersion(MediaFactory, Version): pass
class MediaFile(object):
pass
@MediaFactoryMaster.register('jpg') # adapt to take ['jpg', 'gif', 'png'] ?
class MediaFileImage(MediaFile):
pass
@MediaFactoryVersion.register('mp3') # adapt to take ['mp3', 'ogg', 'm4a'] ?
class MediaFileAudio(MediaFile):
pass
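A quick usage sketch of the registry above, using the illustrative 'jpg'/'mp3' keys from the decorators (note that, as written, both factories share the single __items dict defined on MediaFactory):
ImageClass = MediaFactoryMaster.make('jpg')    # looks up the registered class
print(ImageClass is MediaFileImage)            # True
AudioClass = MediaFactoryVersion.make('mp3')
print(AudioClass is MediaFileAudio)            # True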
Another possible MediaFactory.make:
@classmethod
def make(cls, fname):
name, ext = somefunc(fname)
kls = cls.__items[ext]
other = Version if is_version(fname) else Master  # is_version(): hypothetical check for the version-filename pattern
return type('{}{}'.format(kls.__name__,other.__name__), (kls, other), {})
How come you're not using inheritance but are playing around with __new__?
class GenericFile(File):
"""Base class"""
class Master(object):
"""Master Mixin"""
class Versioned(object):
"""Versioning mixin"""
class ImageFile(GenericFile):
"""Image Files"""
class MasterImage(ImageFile, Master):
"""Whatever"""
class VersionedImage(ImageFile, Versioned):
"""Blah blah blah"""
...
It's not clear why you're doing this though. I think there's a weird code smell here. I'd recommend fewer classes with a consistent interface (duck-typing) rather than a dozen classes and isinstance checks throughout the code to make it all work.
Perhaps you can update your question with what you'd like to do in your code and folks can help either identify the real pattern or a suggest a more idiomatic solution.
You do not have to create a new class for each instance. Don't create the new classes in __new__; create them in __metaclass__. Define a metaclass in the base class or in the base module. The two "variant" subclasses are easily saved as class attributes of their generic parent, and then __new__ just looks at the filename according to its own rules and decides which subclass to return.
Watch out for a __new__ that returns a class other than the one "nominated" during the constructor call. You may have to take steps to invoke __init__ from within __new__.
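A small sketch of that caveat (choose_subclass is a hypothetical stand-in for the filename logic): Python only runs __init__ automatically when __new__ returns an instance of the requested class, so otherwise you have to call it yourself:
class MediaFile(object):
    def __new__(cls, path, *args, **kwargs):
        target = choose_subclass(path)              # hypothetical: e.g. MediaImageFileMaster
        obj = super(MediaFile, cls).__new__(target)
        if not isinstance(obj, cls):
            # the returned object is outside this class tree, so __init__
            # will not be invoked automatically - do it here
            obj.__init__(path, *args, **kwargs)
        return obj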
Subclasses will either have to:
"register" themselves with a factory or parent to be found
be imported and then have the parent or factory find them through a recursive search of cls.__subclasses__() (this might have to happen once per creation, but that's probably not a problem for file handling) - see the sketch after this list
be found through the use of "setuptools" entry_points-type tools, but that requires more effort and coordination by the user
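A small sketch of the second option (the recursive __subclasses__ search; the predicate and the extensions attribute are illustrative, not prescribed here):
def find_subclass(base, predicate):
    # depth-first walk over base.__subclasses__(), returning the first match
    for sub in base.__subclasses__():
        if predicate(sub):
            return sub
        found = find_subclass(sub, predicate)
        if found is not None:
            return found
    return None

# hypothetical usage: pick the handler whose 'extensions' attribute matches
# handler = find_subclass(MediaGenericFile,
#                         lambda c: 'jpg' in getattr(c, 'extensions', ()))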
The OOD question you should be asking is "do the various classes of my proposed inheritance share any properties at all?"
The purpose of inheritance is to share common data or methods that the instances naturally have in common. Aside from both being files, what do Image files and Audio files have in common? If you really want to stretch your metaphors, you could conceivably have AudioFile.view() which could present — for example — a visualization of the power spectra of the audio data, but ImageFile.listen() makes even less sense.
I think your question side-steps this language independent conceptual issue in favor of the Python dependent mechanics of an object factory. I don't think you have a proper case of inheritance here, or you've failed to explain what common features your Media objects need to share.