Get PyCharm to know what classes a mixin is for - python

Our application has a set of complex form wizards. To avoid code duplication I created several mixins.
The problem is that PyCharm highlights the mixin methods with an "Unresolved attribute reference" error.
This is correct, as object does not have such methods. But I know that this mixin will only be used with specific classes. Is there any way to tell PyCharm this?
For now I use such approach:
class MyMixin(object):
    def get_context_data(self, **kwargs):
        assert isinstance(self, (ClassToBeExtended, MyMixin))
        # super().get_context_data is still highlighted,
        # as super is considered to be object
        context = super(MyMixin, self).get_context_data(**kwargs)
        context.update(self.get_preview_context())
        return context

    def get_preview_context(self):
        # without this line PyCharm highlights self.initial_data
        assert isinstance(self, (ClassToBeExtended, MyMixin))
        return {'needs': (self.initial_data['needs']
                          if 'type' not in self.initial_data
                          else '%(needs)s %(type)s' % self.initial_data)}
While this works for some cases like autocomplete on self., it fails for others like super. Is there a better approach to achieve the desired behavior?
P.S.: I know that I can disable the reference check for a specific name or the whole class, but I don't want to do that, as it won't help with typo checks and autocomplete.

You can type-hint to PyCharm what kind of classes to expect.
class DictMixin(object):
    def megamethod(
        self,  # type: dict
        key
    ):
        return self.get(key)
It's still not quite on par with other type handling: PyCharm is lazy in evaluating the hint and only does so when it first has to work on self.
Things are also a bit tricky when accessing attributes of the mixin itself - self, # type: dict | DictMixin works for one of my classes, but not in my test code.
In Python 3.5, you should be able to use # type: typing.Union[dict, DictMixin].
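For completeness, here is a minimal sketch of that Union variant, reusing the names from the snippet above:

from typing import Union

class DictMixin(object):
    def megamethod(
        self,  # type: Union[dict, DictMixin]
        key
    ):
        # With the Union hint PyCharm can resolve both dict methods
        # (get) and the mixin's own attributes on self.
        return self.get(key)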

If you are creating a mixin for, let's say, ClassSub, which is a subclass of ClassSuper, you can implement the mixins this way:
class Mixin1(ClassSuper):
    pass

class Mixin2(ClassSuper):
    pass
and then use them like:
class ClassSub(Mixin1, Mixin2):
    pass
That way I use some mixins for models in Django. Also, django-extensions uses a similar pattern (it provides models that are actually mixins). Basically, this way you don't have to inherit from ClassSuper, because it's "included" in every one of your mixins.
Most important - PyCharm works like a charm this way.
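To connect this back to the original form-wizard question, a hypothetical sketch of the same pattern with a concrete base (assuming Django's generic CreateView as the class the mixin is meant for):

from django.views.generic import CreateView

class PreviewMixin(CreateView):
    def get_context_data(self, **kwargs):
        # super() now resolves to CreateView, so PyCharm knows that
        # get_context_data exists and can autocomplete it.
        context = super(PreviewMixin, self).get_context_data(**kwargs)
        context['preview'] = True
        return context

class MyWizardView(PreviewMixin):
    pass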

Related

Is it possible to set a Metaclass globally so it applies to all classes created by default?

I get that a metaclass can be substituted for type and define how a newly created class behaves.
For example:
class NoMixedCase(type):
    def __new__(cls, clsname, bases, clsdict):
        for name in clsdict:
            if name.lower() != name:
                raise TypeError("Bad name. Don't mix case!")
        return super().__new__(cls, clsname, bases, clsdict)

class Root(metaclass=NoMixedCase):
    pass

class B(Root):
    def Foo(self):  # raises TypeError
        pass
However, is there a way of setting NoMixedCase globally, so that any time a new class is created its behavior is defined by NoMixedCase by default, without having to inherit from Root?
So if you did...
class B:
    def Foo(self):
        pass
...it would still check case on method names.
As for your question: no, it is not possible ordinarily - and probably not even with some extraordinary trick - because a lot of CPython's internals are tied to the type class and hardcoded to it.
What you could try, without crashing the interpreter right away, is to write a wrapper for type.__new__ and use ctypes to replace it directly in the type.__new__ slot (an ordinary assignment won't do it). You'd probably still crash things.
So, in real life, if you decide not to go via a linter program with a plug-in and commit hooks as I suggested in the comment above, the way to go is to have a Base class that uses your metaclass, and get everyone in your project to inherit from that Base.

How to tell if a class is abstract in Python 3?

I wrote a metaclass that automatically registers its classes in a dict at runtime. In order for it to work properly, it must be able to ignore abstract classes.
The code works really well in Python 2, but I've run into a wall trying to make it compatible with Python 3.
Here's what the code looks like currently:
def AutoRegister(registry, base_type=ABCMeta):
    class _metaclass(base_type):
        def __init__(self, what, bases=None, attrs=None):
            super(_metaclass, self).__init__(what, bases, attrs)

            # Do not register abstract classes.
            # Note that we do not use `inspect.isabstract` here, as
            # that only detects classes with unimplemented abstract
            # methods - which is a valid approach, but not what we
            # want here.
            # :see: http://stackoverflow.com/a/14410942/
            metaclass = attrs.get('__metaclass__')
            if not (metaclass and issubclass(metaclass, ABCMeta)):
                registry.register(self)

    return _metaclass
Usage in Python 2 looks like this:
# Abstract classes; these are not registered.
class BaseWidget(object): __metaclass__ = AutoRegister(widget_registry)
class BaseGizmo(BaseWidget): __metaclass__ = ABCMeta
# Concrete classes; these get registered.
class AlphaWidget(BaseWidget): pass
class BravoGizmo(BaseGizmo): pass
What I can't figure out, though, is how to make this work in Python 3.
How can a metaclass determine if it is initializing an abstract class in Python 3?
PEP 3119 describes how the ABCMeta metaclass "marks" abstract methods and creates an __abstractmethods__ frozenset that contains all of a class's methods that are still abstract. So, to check whether a class cls is abstract, check whether cls.__abstractmethods__ is empty or not.
I also found this relevant post on abstract classes useful.
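A minimal sketch of that check (the class names here are made up for illustration):

from abc import ABC, abstractmethod

class Base(ABC):
    @abstractmethod
    def run(self):
        ...

class Concrete(Base):
    def run(self):
        return 42

# A class is abstract as long as it still has unimplemented abstract methods.
print(bool(getattr(Base, '__abstractmethods__', frozenset())))      # True
print(bool(getattr(Concrete, '__abstractmethods__', frozenset())))  # False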
I couldn't shake the feeling as I was posting this question that I was dealing with an XY Problem. As it turns out, that's exactly what was going on.
The real issue here is that the AutoRegister metaclass, as implemented, relies on a flawed understanding of what an abstract class is. Python or not, one of the most important criteria of an abstract class is that it is not instantiable.
In the example posted in the question, BaseWidget and BaseGizmo are instantiable, so they are not abstract.
Aren't we just bifurcating rabbits here?
Well, why was I having so much trouble getting AutoRegister to work in Python 3? Because I was trying to build something whose behavior contradicts the way classes work in Python.
The fact that inspect.isabstract wasn't returning the result I wanted should have been a major red flag: AutoRegister is a warranty-voider.
So what's the real solution then?
First, we have to recognize that BaseWidget and BaseGizmo have no reason to exist. They do not provide enough functionality to be instantiable, nor do they declare abstract methods that describe the functionality that they are missing.
One could argue that they could be used to "categorize" their sub-classes, but a) that's clearly not what's going on in this case, and b) quack.
Instead, we could embrace Python's definition of "abstract":
Modify BaseWidget and BaseGizmo so that they define one or more abstract methods.
If we can't come up with any abstract methods, then can we remove them entirely?
If we can't remove them but also can't make them properly abstract, it might be worthwhile to take a step back and see if there are other ways we might solve this problem.
Modify the definition of AutoRegister so that it uses inspect.isabstract to decide if a class is abstract (a sketch of that final implementation follows).
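Such an implementation might look roughly like this (assuming the same registry interface as in the question):

import inspect
from abc import ABCMeta

def AutoRegister(registry, base_type=ABCMeta):
    class _metaclass(base_type):
        def __init__(self, what, bases=None, attrs=None):
            super(_metaclass, self).__init__(what, bases, attrs)

            # Rely on Python's own definition of "abstract": a class
            # that still has unimplemented abstract methods.
            if not inspect.isabstract(self):
                registry.register(self)

    return _metaclass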
That's cool and all, but what if I can't change the base classes?
Or, if you have to maintain backwards compatibility with existing code (as was the case for me), a decorator is probably easier:
@widget_registry.register
class AlphaWidget(object):
    pass

@widget_registry.register
class BravoGizmo(object):
    pass

Are Mixin classes abstract base classes

Are Mixin classes abstract base classes? In the example below, the calls to test_base would fail because Python wouldn't be able to resolve self.assertEqual, for example.
Also, is PyCharm incorrect in flagging Mixin classes like the one below as having unresolved attribute errors?
class TestConverterMixin(object):
    def setUp(self):
        self.alt_hasher = getattr(hash, self.converter.__class__.__name__)

    def test_base(self):
        with self.settings(PASSWORD_HASHERS=[self.hasher, ]):
            load_hashers(settings.PASSWORD_HASHERS)
            for password in PASSWORDS:
                orig = self.alt_hasher.encrypt(password)
                conv = self.converter.from_orig(orig)
                # see if we get a working hash:
                self.assertTrue(check_password(password, conv))
                # convert back and test with passlib:
                back = self.converter.to_orig(conv)
                self.assertEqual(orig, back)
Are Mixin classes AbstractBaseClasses? The most accurate answer for your case is no, but it probably should be.
Your class cannot survive as a stand-alone for the reasons you pointed out. By making it an ABC you explicitly tell anyone looking at your class (like PyCharm) what it needs:
from abc import ABCMeta, abstractmethod

class TestConverterMixin(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def assertEqual(self, other):
        "Need concrete implementation somewhere"

    .... the rest of your code
The problem is that you would need this for all of the other methods (self.assertTrue, self.converter etc.). You could have something else in mind, but this seriously looks like just a subclass of unittest.TestCase to me.
Oh, and was PyCharm wrong? No, it got it right. If you made this an ABC or a subclass of TestCase, it would not have complained. If you used interfaces, like zope.Interface, PyCharm and the like usually get that wrong, since they don't understand the registration and lookup process (it is outside the Python core).
I kept having trouble getting PyCharm to not complain about unresolved attribute reference errors in mixin classes. In particular, I also had mixin classes depending on other mixin classes, for which I couldn't make one inherit from the other. But then I found this almost perfect way to make PyCharm 2017.1 happy:
class Human:
    def is_male(self):
        return True

class BeardMixin:
    _facial_hair = {'length': 7, 'color': 'brown'}

    def has_beard(self):
        return True

class BeardLengthMixin:
    """Mixin for class Human with BeardMixin to provide get_beard_length()"""

    def get_beard_length(self):
        assert isinstance(self, (Human, BeardMixin))
        # PyCharm will now not complain about any of these 3 attributes
        if self.is_male() and self.has_beard():
            return self._facial_hair['length']
The assert statement gives PyCharm the necessary information about which types self could be. There's a drawback, though: the assert statement itself does not do what you might expect - it only checks that self is of either type, not that it's of both. Unfortunately, using two assert statements doesn't work, because the second overrides the first one as far as PyCharm's type deduction is concerned.
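For completeness, a hypothetical composition showing how such a mixin is actually consumed (not part of the original answer):

class BeardedHuman(Human, BeardMixin, BeardLengthMixin):
    pass

print(BeardedHuman().get_beard_length())  # 7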

In Python, when should I use a meta class?

I have gone through this: What is a metaclass in Python?
But can anyone explain more specifically when I should use the metaclass concept and when it's very handy?
Suppose I have a class like below:
class Book(object):
    CATEGORIES = ['programming', 'literature', 'physics']

    def _get_book_name(self, book):
        return book['title']

    def _get_category(self, book):
        for cat in self.CATEGORIES:
            if book['title'].find(cat) > -1:
                return cat
        return "Other"

if __name__ == '__main__':
    b = Book()
    dummy_book = {'title': 'Python Guide of Programming', 'status': 'available'}
    print b._get_category(dummy_book)
For this class, in which situation should I use a metaclass and why is it useful?
Thanks in advance.
You use metaclasses when you want to mutate the class as it is being created. Metaclasses are hardly ever needed; they're hard to debug and difficult to understand - but occasionally they can make frameworks easier to use. In our 600 kLOC code base we've used metaclasses 7 times: ABCMeta once, 4x models.SubfieldBase from Django, and twice a metaclass that makes classes usable as views in Django. As @Ignacio writes, if you don't know that you need a metaclass (and have considered all other options), you don't need a metaclass.
Conceptually, a class exists to define what a set of objects (the instances of the class) have in common. That's all. It allows you to think about the instances of the class according to that shared pattern defined by the class. If every object was different, we wouldn't bother using classes, we'd just use dictionaries.
A metaclass is an ordinary class, and it exists for the same reason; to define what is common to its instances. The default metaclass type provides all the normal rules that make classes and instances work the way you're used to, such as:
Attribute lookup on an instance checks the instance followed by its class, followed by all superclasses in MRO order
Calling MyClass(*args, **kwargs) invokes i = MyClass.__new__(MyClass, *args, **kwargs) to get an instance, then invokes i.__init__(*args, **kwargs) to initialise it
A class is created from the definitions in a class block by making all the names bound in the class block into attributes of the class
Etc
If you want to have some classes that work differently to normal classes, you can define a metaclass and make your unusual classes instances of the metaclass rather than type. Your metaclass will almost certainly be a subclass of type, because you probably don't want to make your different kind of class completely different; just as you might want to have some sub-set of Books behave a bit differently (say, books that are compilations of other works) and use a subclass of Book rather than a completely different class.
If you're not trying to define a way of making some classes work differently to normal classes, then a metaclass is probably not the most appropriate solution. Note that the "classes define how their instances work" is already a very flexible and abstract paradigm; most of the time you do not need to change how classes work.
If you google around, you'll see a lot of examples of metaclasses that are really just being used to go do a bunch of stuff around class creation; often automatically processing the class attributes, or finding new ones automatically from somewhere. I wouldn't really call those great uses of metaclasses. They're not changing how classes work, they're just processing some classes. A factory function to create the classes, or a class method that you invoke immediately after class creation, or best of all a class decorator, would be a better way to implement this sort of thing, in my opinion.
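As a rough sketch of that class-decorator alternative (a toy registry for illustration, not something from the answer; Python 2 syntax to match the examples here):

registry = {}

def register(cls):
    # Runs once, immediately after the class body executes - no metaclass needed.
    registry[cls.__name__] = cls
    return cls

@register
class Book(object):
    pass

print registry  # {'Book': <class '__main__.Book'>}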
But occasionally you find yourself writing complex code to get Python's default behaviour of classes to do something conceptually simple, and it actually helps to step "further out" and implement it at the metaclass level.
A fairly trivial example is the "singleton pattern", where you have a class of which there can only be one instance; calling the class will return an existing instance if one has already been created. Personally I am against singletons and would not advise their use (I think they're just global variables, cunningly disguised to look like newly created instances in order to be even more likely to cause subtle bugs). But people use them, and there are huge numbers of recipes for making singleton classes using __new__ and __init__. Doing it this way can be a little irritating, mainly because Python wants to call __new__ and then call __init__ on the result of that, so you have to find a way of not having your initialisation code re-run every time someone requests access to the singleton. But wouldn't it be easier if we could just tell Python directly what we want to happen when we call the class, rather than trying to set up the things that Python wants to do so that they happen to do what we want in the end?
class Singleton(type):
    def __init__(self, *args, **kwargs):
        super(Singleton, self).__init__(*args, **kwargs)
        self.__instance = None

    def __call__(self, *args, **kwargs):
        if self.__instance is None:
            self.__instance = super(Singleton, self).__call__(*args, **kwargs)
        return self.__instance
Under 10 lines, and it turns normal classes into singletons simply by adding __metaclass__ = Singleton, i.e. nothing more than a declaration that they are a singleton. It's just easier to implement this sort of thing at this level, than to hack something out at the class level directly.
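A hypothetical usage, using the Python 2 __metaclass__ attribute the answer mentions (in Python 3 you would write class Logger(metaclass=Singleton) instead):

class Logger(object):
    __metaclass__ = Singleton

    def __init__(self):
        self.messages = []

a = Logger()
b = Logger()
print a is b  # True - calling the class always returns the same instance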
But for your specific Book class, it doesn't look like you have any need to do anything that would be helped by a metaclass. You really don't need to reach for metaclasses unless you find the normal rules of how classes work are preventing you from doing something that should be simple in a simple way (which is different from "man, I wish I didn't have to type so much for all these classes, I wonder if I could auto-generate the common bits?"). In fact, I have never actually used a metaclass for something real, despite using Python every day at work; all my metaclasses have been toy examples like the above Singleton or else just silly exploration.
A metaclass is used whenever you need to override the default behavior for classes, including their creation.
A class gets created from the name, a tuple of bases, and a class dict. You can intercept the creation process to make changes to any of those inputs.
You can also override any of the services provided by classes:
__call__ which is used to create instances
__getattribute__ which is used to lookup attributes and methods on a class
__setattr__ which controls setting attributes
__repr__ which controls how the class is displayed
In summary, metaclasses are used when you need to control how classes are created or when you need to alter any of the services provided by classes.
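For instance, a minimal sketch of overriding __repr__ at the metaclass level (an illustration, not taken from the answer; Python 2 syntax to match the surrounding examples):

class VerboseMeta(type):
    def __repr__(cls):
        # __repr__ on the metaclass controls how the class object itself prints.
        return "<verbose class %r>" % cls.__name__

class Data(object):
    __metaclass__ = VerboseMeta

print Data  # <verbose class 'Data'>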
If you for whatever reason want to do stuff like Class[x], x in Class etc., you have to use metaclasses:
class Meta(type):
    def __getitem__(cls, x):
        return x ** 2

    def __contains__(cls, x):
        return int(x ** 0.5) == x ** 0.5

# Python 2.x
class Class(object):
    __metaclass__ = Meta

# Python 3.x
class Class(metaclass=Meta):
    pass

print Class[2]
print 4 in Class
Check the link Meta Class Made Easy to learn how and when to use a metaclass.

override method of class in python

I'd like to override a method of a class without creating a subclass / extending from the class.
An example:
from django.contrib import admin

class NewModelAdmin(admin.ModelAdmin):
    def formfield_for_dbfield(self, db_field, **kwargs):
        # some custom stuff
        pass
Now I don't want to change all the classes (aka Models) which extend from admin.ModelAdmin so that they extend NewModelAdmin instead. But I don't want to modify the original Django code either.
Is there some way to accomplish this?
I'm not 100% clear on what you want to do, or why you don't want to create a new subclass or a method with a different name.
But in general in python you can do something like:
class MyClass(object):
    def print_hello(self):
        print "not hello"

def real_print_hello():
    print "hello"

x = MyClass()
x.print_hello()  # "not hello"
setattr(x, "print_hello", real_print_hello)
x.print_hello()  # "hello"
Are you trying to do 'monkey patching'?
http://mail.python.org/pipermail/python-dev/2008-January/076194.html
In order to keep your code maintainable, it's best to go ahead and have your individual ModelAdmin classes inherit from NewModelAdmin. This way, other developers who look at your code (and you, perhaps a year or two later) can clearly see where the custom formfield_for_dbfield behavior originates from so that it can be updated if needed. If you monkey-patch admin.ModelAdmin, it will make it much more difficult to track down issues or change the behavior if needed later.
Chances are good that your problem is solvable without monkey-patching, which often can have unintended consequences.
How are you registering models with the django admin?
If you are using this approach:
admin.site.register(FooModel) #uses generic ModelAdmin
You have the problem of needing to change this to many boilerplate instances of subclasses of NewModelAdmin, which would look like this:
class FooModelAdmin(NewModelAdmin):
    pass  # does nothing except set up inheritance

admin.site.register(FooModel, FooModelAdmin)
This is really wordy and might take a lot of time to implement if you have a lot of models, so do it programmatically by writing a wrapper function:
def my_admin_register(model):
    class _newmodeladmin(admin.ModelAdmin):
        def your_overridden_method(self, *args, **kwargs):
            # do whatever here
            pass

    admin.site.register(model, _newmodeladmin)
Then, you can use this like this:
my_admin_register(FooModel)
You can change a class method using setattr() on the class - aka monkey patching.
If you modify a method in a class you modify behavior for:
all instances which resolve their method to that class
all derived classes which resolve their method to that class
Your requirements are mutually exclusive. You cannot modify the behavior of a class without impacting those objects which resolve their methods to that class.
In order not to modify the behavior of those other objects, you would want to create the method on your instance, so that it doesn't resolve its method in the class.
Another alternative is to rely on Python's duck-typing. You don't need the object to be directly related to the one currently used. You could reimplement the interface and in the code swap out the calls to the old class for your new one.
These techniques have tradeoffs in maintainability and design. In other words don't use them unless you have no other options.
I'm not 100% sure what you are trying to achieve, but I suppose you want to change the behavior of all admins that inherit from admin.ModelAdmin without having to change their declarations. The only solution to achieve this will be monkey-patching the original ModelAdmin class, e.g. something like:
setattr(admin.ModelAdmin, 'formfield_for_dbfield', mymethod)
This is for sure not the most recommendable way, because the code will be harder to maintain, among other things.
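For illustration, a rough sketch of that monkey-patch which keeps the original behaviour available (it assumes the formfield_for_dbfield signature used in the question; a sketch, not a recommendation):

from django.contrib import admin

_original = admin.ModelAdmin.formfield_for_dbfield

def patched_formfield_for_dbfield(self, db_field, **kwargs):
    # some custom stuff, then fall back to the original implementation
    return _original(self, db_field, **kwargs)

admin.ModelAdmin.formfield_for_dbfield = patched_formfield_for_dbfield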
