Reference class from @staticmethod - Python

Let's say I have a class with some utility methods:
class Utils:
    @staticmethod
    def do_stuff():
        # some stuff
        Utils.do_other_stuff()
        # some more stuff

    @staticmethod
    def do_other_stuff():
        # something else
        ...
I don't really like the Utils.do_other_stuff() part.
If it were an instance method, I could reference it via self, but here I have to write out the full class name.
Is this a case where @classmethod is a good idea, or is that overkill? Or is there a cleaner way to write Utils, perhaps as a module?

If you need a reference to the current class (which could be a subclass), then definitely make it a classmethod.
That's not overkill; the amount of work Python does to bind a class method is no different from a static method, or a regular method for that matter.
However, don't use classes here unless you have to. Python is not Java; you don't have to use a class, and functions can live outside of classes just fine.
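For illustration (BetterUtils is a hypothetical subclass, not part of the question), note how cls binds to whichever class the call goes through, so subclass overrides are picked up automatically:
class Utils:
    @classmethod
    def do_stuff(cls):
        cls.do_other_stuff()

    @classmethod
    def do_other_stuff(cls):
        print('base version')

class BetterUtils(Utils):
    @classmethod
    def do_other_stuff(cls):
        print('improved version')

Utils.do_stuff()        # prints "base version"
BetterUtils.do_stuff()  # prints "improved version"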

@classmethod is the way to go:
class Utils:
    @classmethod
    def do_stuff(cls):
        # some stuff
        cls.do_other_stuff()
        # some more stuff

    @classmethod
    def do_other_stuff(cls):
        # something else
        ...
Just a clarification related to Martijn Pieters' comment: I usually avoid @staticmethod and prefer to always use @classmethod, because it lets me refer to the class and its methods. (I don't agree with the suggestions about writing modules with functions… I'm an OOP supporter :P)

It doesn't look like Utils will ever be subclassed or instantiated; it's just a wrapper for static methods. In that case, these methods can all be turned into module-level functions, perhaps in a separate utils module:
# No class!
def do_stuff():
    ...
    do_other_stuff()
    ...

def do_other_stuff():
    ...
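Other modules can then use these functions with a plain import (assuming they live in a file named utils.py, which is just an illustrative name):
import utils

utils.do_stuff()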


Provide object methods into the namespace of a python environment

I am trying to provide wrappers for shortcutting everyday commands, and Python environments (context managers) are very useful for that.
Is it possible to make all methods of an object available in the local namespace within a new environment?
class my_object:
    def method_a():
        ...

class my_environment:
    ...
    def __enter__(self):
        some_object = my_object()
        # something like `from some_object import *` ??
        return some_object
    ...
with my_environment() as some_object:
    # standard syntax:
    some_object.method_a()
    # shortcut:
    method_a()  # how to make this possible?
It would be rather complex and, IMHO, not worth it. The problem is that in Python, local variables are local to a function, not to a block. So what you are asking for would require that:
__enter__ declares nonlocal variables for all of the methods of some_object and saves their previous values, if any
__exit__ restores the previous values of those variables, or deletes them if they did not previously exist
Possible, but not really Pythonic IMHO (which is why I have not proposed any code...). After all, inside a method Python requires the object to be passed explicitly, and requires it to be prepended to any internal method call or attribute access. So my advice is to stick to the standard syntax here.
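To make that bookkeeping concrete, here is a rough sketch that injects into the caller's globals rather than locals, so it only works when the with block runs at module level; my_object is the class from the question and the helper attribute names are purely illustrative:
import inspect

class my_environment:
    def __enter__(self):
        some_object = my_object()
        # Inject the object's public methods into the caller's global
        # namespace, remembering anything we shadow.
        self._globals = inspect.stack()[1].frame.f_globals
        self._saved = {}
        self._injected = []
        for name in dir(some_object):
            if name.startswith('_'):
                continue
            if name in self._globals:
                self._saved[name] = self._globals[name]
            self._globals[name] = getattr(some_object, name)
            self._injected.append(name)
        return some_object

    def __exit__(self, exc_type, exc_value, traceback):
        # Restore shadowed names, delete the rest.
        for name in self._injected:
            if name in self._saved:
                self._globals[name] = self._saved[name]
            else:
                del self._globals[name]
        return False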
What you are looking for is a class hierarchy. Along the way, please be careful with the naming conventions for classes.
class MyObject:
    def method_a():
        ...

class MyEnvironment(MyObject):
    ...
    def __enter__(self):
        return self
    ...

with MyEnvironment() as some_object:
    # standard syntax:
    some_object.method_a()
The shortcut you are looking for doesn't make much sense, because method_a() was defined as a method and should therefore be called on an instance.
Maybe @staticmethod can serve your case better:
class MyEnvironment:
    @staticmethod
    def method_a():
        ...

MyEnvironment.method_a()

Is there a magic method to know that a base class is used through inheritance?

I'd like to describe my problem with code to make it clear:
class MyBaseClass(object):
    def __init__(self):
        print 'foobar'

    def __call__(self):
        print 'spameggs'

    def __is_used__(self):  # This is only a pseudo method
        print 'I\'m being used! - MyBaseClass'

class MySubClass(MyBaseClass):
    def __init__(self):
        print 'monty python'
Now I'd like to know: is there a magic method similar to __is_used__ that lets a class object know it is being used as the parent/base class of another (sub)class?
Example usage:
class_a = MySubClass()
# Output
# monty python
# I'm being used! - MyBaseClass
Use Case
To avoid confusion (I apologize): the best example would be a mixin, for example an S3Mixin.
An S3Mixin has the capability to upload and download files to S3 buckets.
class S3Mixin(object):
    def upload(self):
        ...

    def download(self):
        ...
Now I want to use it in the ImageFile and VideoFile classes.
class ImageFile(S3Mixin):
    ...  # omitted lengthy properties

class VideoFile(S3Mixin):
    ...  # omitted lengthy properties
Now each class has access to the basic S3 functionality. The real problem arises when I try to use another module inside S3Mixin, which causes a circular dependency issue. To avoid it, I currently have to import that module inside each function of S3Mixin. I tried putting the import in the __init__ and __call__ methods, which obviously is not going to work.
I don't want to do that. Instead, I want to know whether there is a method available so I can import the conflicting modules, preferably in a magic method of S3Mixin.
Note:
I'm not asking how to check whether a class is a subclass of another class; that is far from the question. I would like to know whether there is a MAGIC METHOD I can put my own logic in when a base class is used.
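For what it's worth, if Python 3.6+ is an option, the closest built-in hook to the pseudo __is_used__ above is __init_subclass__: it is called on the base class each time a subclass of it is defined (not instantiated). A minimal sketch, reusing the S3Mixin name from the question:
class S3Mixin(object):
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        print("I'm being used! - S3Mixin, by %s" % cls.__name__)
        # Deferred imports could go here instead of inside every method.

    def upload(self):
        ...

    def download(self):
        ...

class ImageFile(S3Mixin):   # prints "I'm being used! - S3Mixin, by ImageFile"
    ...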

Extending a class in Python inside a decorator

I am using a decorator to extend certain classes and add some functionality to them, something like the following:
def useful_stuff(cls):
    class LocalClass(cls):
        def better_foo(self):
            print('better foo')
    return LocalClass

@useful_stuff
class MyClass:
    def foo(self):
        print('foo')
Unfortunately, MyClass is no longer pickleable due to the non-global LocalClass:
AttributeError: Can't pickle local object 'useful_stuff.<locals>.LocalClass'
I need to pickle my classes. Can you recommend a better design?
Considering that there can be multiple decorators on a class, would switching to multiple inheritance by having MyClass inherit all the functionality be a better option?
You need to set the metadata so the subclass looks like the original:
def deco(cls):
    class SubClass(cls):
        ...
    SubClass.__name__ = cls.__name__
    SubClass.__qualname__ = cls.__qualname__
    SubClass.__module__ = cls.__module__
    return SubClass
Classes are pickled by using their module and qualname to record where to find the class. Your class needs to be found in the same location the original class would have been in if it hadn't been decorated, so pickle needs to see the same module and qualname. This is similar to what functools.wraps does for decorated functions.
However, it would probably be simpler and less bug-prone to add the new methods directly to the original class instead of creating a subclass:
def better_foo(self):
    print('better_foo')

def useful_stuff(cls):
    cls.better_foo = better_foo
    return cls
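As a quick sanity check (assuming MyClass from the question is decorated with this version of useful_stuff), instances now survive a pickle round trip, because the class pickle looks up by module and qualname is the original MyClass itself:
import pickle

obj = MyClass()
restored = pickle.loads(pickle.dumps(obj))
restored.better_foo()  # prints "better_foo"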

Automatic/transparent decoration of methods in Python

I'm wondering if it's possible to have a class which automatically wraps all of its methods, without it being necessary to explicitly decorate each of them.
For example, instead of doing this:
class MyClass:
    @mydecorator
    def mymethod1(self):
        ...

    @mydecorator
    def mymethod2(self):
        ...
I'd like to do something like this:
class MyClass(metaclass=DecoratedMethods):
    def mymethod1(self):
        ...

    def mymethod2(self):
        ...
Here I'm hinting at metaclasses, but I'm not sure it's the right solution path.
I've just discovered the __prepare__ protocol. This would allow me to do something naive like decorating all functions or all callables in the class namespace, but that's not really what I want. I only want to decorate methods (class methods and instance methods).
Python has so many metaprogramming facilities, I'd be surprised if there wasn't a way... Or at least a better way than decorating each method manually?
I'm using Python 3.6.
Thanks!
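For reference, here is a rough sketch of the metaclass route hinted at above; mydecorator and DecoratedMethods are the names from the question, the wrapping logic is only a placeholder, and only plain instance methods are handled (classmethods and staticmethods would need their __func__ unwrapped and rewrapped):
import types

def mydecorator(func):
    def wrapper(*args, **kwargs):
        print('calling %s' % func.__name__)
        return func(*args, **kwargs)
    return wrapper

class DecoratedMethods(type):
    def __new__(mcls, name, bases, namespace):
        # Wrap plain functions only; leave dunders and non-callables alone.
        for attr, value in list(namespace.items()):
            if isinstance(value, types.FunctionType) and not attr.startswith('__'):
                namespace[attr] = mydecorator(value)
        return super().__new__(mcls, name, bases, namespace)

class MyClass(metaclass=DecoratedMethods):
    def mymethod1(self):
        ...

MyClass().mymethod1()  # prints "calling mymethod1"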

Python 2.7: infinite loop when super __init__ creates an instance of its own subclass

I have the sense that this must be kind of a dumb question (noob here), so I'm open to an answer of the sort "This is ass-backwards, don't do it, please try this: [proper way]".
I'm using Python 2.7.5.
General Form of the Problem
This causes an infinite loop unless Thesaurus (an app-wide singleton) skips calling Baseclass.__init__():
class Baseclass():
    def __init__(self):
        thes = Thesaurus()
        # do stuff

class Thesaurus(Baseclass):
    def __init__(self):
        Baseclass.__init__(self)
        # do stuff
My Specific Case
I have a base class that virtually every other class in my app extends (just some basic conventions for functionality within the app; perhaps it should just be an interface). This base class is meant to house a singleton of a Thesaurus class that grants some flexibility with user input by inferring synonyms (e.g. {'yes': ['yep', 'ok']}).
But since the subclass calls the superclass's __init__(), which in turn creates another instance of the subclass, loops ensue. Not calling the superclass's __init__() works just fine, but I'm concerned that's merely a lucky coincidence, and that my Thesaurus class may eventually be modified to require its parent's __init__().
Advice?
Well, I'll stop looking at your code and just base my answer on what you say:
I have a base class that virtually every other class in my app extends (just some basic conventions for functionality within the app; perhaps it should just be an interface).
This would be ThesaurusBase in the code below.
This base class is meant to house a singleton of a Thesaurus class that grants some flexibility with user input by inferring synonyms (e.g. {'yes': ['yep', 'ok']}).
That would be ThesaurusSingleton, which you can give a better name and make actually useful.
class ThesaurusBase():
    def __init__(self, singleton=None):
        self.singleton = singleton

    def mymethod1(self):
        raise NotImplementedError

    def mymethod2(self):
        raise NotImplementedError

class ThesaurusSingleton(ThesaurusBase):
    def mymethod1(self):
        return "meaw!"

class Thesaurus(ThesaurusBase):
    def __init__(self, singleton=None):
        ThesaurusBase.__init__(self, singleton)

    def mymethod1(self):
        return "quack!"

    def mymethod2(self):
        return "\\_o<"
Now you can create your objects as follows:
singleton = ThesaurusSingleton()
thesaurus = Thesaurus(singleton)
edit:
Basically, what I've done here is build a "Base" class that is just an interface defining the expected behavior for all its child classes. The class ThesaurusSingleton (I know that's a terrible name) also implements that interface, because you said it had to and I did not want to discuss your design; you may well have good reasons for weird constraints.
And finally, do you really need to instantiate your singleton inside the class that defines the singleton object? Though there may be some hackish way to do so, there's often a better design that avoids the "hackish" part.
What I think is that however you create your singleton, you'd better do it explicitly. That's in the "Zen of Python": explicit is better than implicit. Why? Because people reading your code (and that might be you in six months) will be able to understand what's happening and what you were thinking when you wrote it. If you make things more implicit (like using sophisticated metaclasses and weird self-inheritance), within three weeks you may be wondering what your own code does!
I'm not telling you to avoid those options, just to reach for sophisticated stuff only when you're out of simple ones!
Based on what you said, I think the solution I gave can be a starting point. But as you focus on some obscure, not very useful hackish stuff instead of talking about your design, I can't be sure my example is all that appropriate or give you better hints on the design.
edit2:
There's another way to achieve what you say you want (but be sure that's really the design you want). You may want to use a class method that acts on the class itself (instead of on instances) and thus lets you store a class-wide instance of itself:
>>> class ThesaurusBase:
...     @classmethod
...     def initClassWide(cls):
...         cls._shared = cls()
...
>>> class T(ThesaurusBase):
...     def foo(self):
...         print self._shared
...
>>> ThesaurusBase.initClassWide()
>>> t = T()
>>> t.foo()
<__main__.ThesaurusBase instance at 0x7ff299a7def0>
You can call the initClassWide method at module level, in the module where you declare ThesaurusBase, so whenever you import that module it will have the singleton loaded (the import mechanism ensures that Python modules are run only once).
The short answer is:
Do not instantiate an instance of a subclass from the superclass's constructor.
Longer answer:
If your motive for doing this is that Thesaurus is a singleton, then you'll be better off exposing the singleton through a static method on Thesaurus and calling that method whenever you need the singleton, as in the sketch below.
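A minimal sketch of that advice, keeping the names from the question (the synonym data is just a placeholder): the base class no longer builds a Thesaurus in its constructor; instead everyone asks for the shared instance lazily through a static accessor:
class Baseclass(object):
    def __init__(self):
        # No Thesaurus() here any more; that call is what caused the recursion.
        pass

    def lookup(self, word):
        # Subclasses fetch the shared Thesaurus only when they need it.
        return Thesaurus.instance().synonyms(word)

class Thesaurus(Baseclass):
    _instance = None

    def __init__(self):
        Baseclass.__init__(self)  # now safe: the base no longer self-instantiates
        self._synonyms = {'yes': ['yep', 'ok']}

    def synonyms(self, word):
        return self._synonyms.get(word, [])

    @staticmethod
    def instance():
        if Thesaurus._instance is None:
            Thesaurus._instance = Thesaurus()
        return Thesaurus._instance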
