I want to create an Abstract Factory in order to abstract hardware differences between computers (say a RaspberryPi and an Arduino) in Python 2.7.
I am using the following implementation of an Abstract Factory:
'''
Provide a device-agnostic display interface
'''
from hardware import sysname

class DisplayBase(object):
    def __init__(self):
        pass

    def show(self, message):
        pass

    def __str__(self):
        return "DisplayBase"

    def __repr__(self):
        return self.__str__()

class RPIDisplay(DisplayBase):
    def __new__(cls, *args, **kwargs):
        from rpi_display import writeline
        instance = super(RPIDisplay, cls).__new__(cls, *args, **kwargs)
        return instance

    def __str__(self):
        return "RPIDisplay"

    def show(self, message):
        writeline(message)

class ArduinoDisplay(DisplayBase):
    def __new__(cls, *args, **kwargs):
        import arduino_display
        instance = super(ArduinoDisplay, cls).__new__(cls, *args, **kwargs)
        return instance

    def __str__(self):
        return "ArduinoDisplay"

    def show(self, message):
        return arduino_display.println(message)

class Display(DisplayBase):  # Display Factory
    def __new__(cls, *args, **kwargs):
        platform = sysname()
        if platform == "RaspberryPi":
            return RPIDisplay()
        elif platform == "Arduino":
            return ArduinoDisplay()
        else:
            return MockDisplay()

if __name__ == "__main__":
    display = Display()
    print display
    display.show("hello world")
The instantiation works fine, but when I try to run this, I get:
ArduinoDisplay
Traceback (most recent call last):
File "tt.py", line 56, in <module>
display.show("hello world")
File "tt.py", line 41, in show
return arduino_display.println(message)
NameError: global name 'arduino_display' is not defined
So the import of arduino_display does sorta work, but I cannot find a way to use it within the object.
Conditional imports are needed since different platforms will have different modules installed.
Any idea how to use those conditional imports?
I tried self.arduino_display and ArduinoDisplay.arduino_display but to no avail.
I could obviously catch import errors, as in, add to the top:
try:
    import arduino_display
except:
    pass
...and that would fail on a RPI, which would be fine, but there must be a better way...
The problem is that import only binds names in the scope where it runs. An import inside one function/method does not make the module available in other methods.
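A minimal illustration of that scoping rule (using math only as a stand-in for a platform module):
def configure():
    import math          # binds the name "math" only inside configure()
    print(math.pi)

def use():
    print(math.pi)       # NameError: "math" was never bound in this scope

configure()
use()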
Perform the imports where you actually need them. For example, ArduinoDisplay should import arduino_display where it is used:
class ArduinoDisplay(DisplayBase):
    # no __new__, no import here

    def __str__(self):
        return "ArduinoDisplay"

    def show(self, message):
        # import where needed
        import arduino_display
        return arduino_display.println(message)
Note that import is idempotent -- if the module has already been loaded before, import just binds the name again. This makes such nested import statements fast enough for most cases.
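A quick way to convince yourself (json here is just a stand-in for a platform module):
import sys
import timeit

def show(message):
    import json                 # after the first call this is just a sys.modules lookup
    return json.dumps(message)

show("hello")
print('json' in sys.modules)                                   # True: the module is cached
print(timeit.timeit(lambda: show("hello"), number=100000))     # repeated imports stay cheap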
If your classes need many imports or speed is an issue, isolate classes into separate modules and import the entire module conditionally. You can directly assign the correct class using the common name, instead of having a dummy type that constructs another.
# ## display/arduino.py ##
# other systems accordingly
from .base import DisplayBase

# import once, globally
import arduino_display

class ArduinoDisplay(DisplayBase):
    # no __new__, no import

    def __str__(self):
        return "ArduinoDisplay"

    def show(self, message):
        # module already imported at the top
        return arduino_display.println(message)

# ## display/__init__.py ##
from hardware import sysname

platform = sysname()

# resolve and import the appropriate type once
if platform == "RaspberryPi":
    from .rpi import RPIDisplay as Display
elif platform == "Arduino":
    from .arduino import ArduinoDisplay as Display
else:
    from .mock import MockDisplay as Display
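Client code then keeps the same spelling as in the question (a sketch, assuming the package above is named display):
# ## main.py ##
from display import Display

display = Display()          # already the platform-specific class
print(display)
display.show("hello world")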
Have you tried using from arduino_display import println and then calling println, just like you did with the RPi's writeline?
Edit: missed the obvious...
You can do something like this:
from hardware import sysname

class Display(object):
    def __init__(self, print_function, display_name):
        self._print_function = print_function
        self._display_name = display_name

    def show(self, message):
        self._print_function(message)

    def __str__(self):
        return self._display_name

    def __repr__(self):
        return self.__str__()

def create_display():
    platform = sysname()
    if platform == "RaspberryPi":
        from rpi_display import writeline
        return Display(writeline, "RPIDisplay")
    elif platform == "Arduino":
        from arduino_display import println
        return Display(println, "ArduinoDisplay")
In case you need the classes and the nested factory, you can apply the same principle of storing the function object there too.
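For example, a sketch that keeps the per-platform classes and the Display factory from the question, storing the imported function object on each instance (assuming the same rpi_display/arduino_display modules):
from hardware import sysname

class DisplayBase(object):
    _write = None                      # set per platform in __init__

    def show(self, message):
        self._write(message)

    def __str__(self):
        return type(self).__name__

    def __repr__(self):
        return self.__str__()

class RPIDisplay(DisplayBase):
    def __init__(self):
        from rpi_display import writeline
        self._write = writeline        # keep the function object on the instance

class ArduinoDisplay(DisplayBase):
    def __init__(self):
        from arduino_display import println
        self._write = println

def Display():
    platform = sysname()
    if platform == "RaspberryPi":
        return RPIDisplay()
    elif platform == "Arduino":
        return ArduinoDisplay()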
The problem is that the functions you import aren't saved anywhere for the other methods to see them, so they are only visible locally. Try setting ArduinoDisplay.arduino_display = arduino_display after you import.
Here's some sample code that illustrates my meaning:
class ADisplay:
    def __init__(self):
        from builtins import print as display_write

    def show(self, msg):
        display_write(msg)

disp = ADisplay()
disp.show("mymsg")
This fails with NameError: name 'display_write' is not defined.
Now, bind the import to your class.
class ADisplay:
    def __init__(self):
        from builtins import print as display_write
        ADisplay.display_write = display_write

    def show(self, msg):
        self.display_write(msg)

disp = ADisplay()
disp.show("mymsg")
This works. In fact you can even dispense with show by assigning it directly, if all those hardware print methods only take a string as a parameter and if you don't need to format or modify anything.
class ADisplay:
    def __init__(self):
        # do other init stuff...

        # only bind show the first time.
        if getattr(ADisplay, "show", None):
            return
        from builtins import print as display_write
        ADisplay.show = display_write

disp = ADisplay()
disp.show("mymsg")
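Applied back to the question's class, the same idea looks roughly like this (a sketch, assuming the arduino_display module from the question):
class ArduinoDisplay(DisplayBase):
    def __new__(cls, *args, **kwargs):
        import arduino_display
        cls.arduino_display = arduino_display      # bind the module on the class
        return super(ArduinoDisplay, cls).__new__(cls)

    def __str__(self):
        return "ArduinoDisplay"

    def show(self, message):
        return self.arduino_display.println(message)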
Related
I have classes which require dependencies in order to be instantiated but are otherwise optional. I'd like to lazily import the dependencies and fail to instantiate the class if they aren't available. Note that these dependencies are not required at the package level (otherwise they'd be mandatory via setuptools). I currently have something like this:
class Foo:
    def __init__(self):
        try:
            import module
        except ImportError:
            raise ModuleNotFoundError("...")

    def foo(self):
        import module
Because this try/except pattern is common, I'd like to abstract it into a lazy importer. Ideally if module is available, I won't need to import it again in Foo.foo so I'd like module to be available once it's been imported in __init__. I've tried the following, which populates globals() and fails to instantiate the class if numpy isn't available, but it pollutes the global namespace.
def lazy_import(name, as_=None):
    # Doesn't handle error_msg well yet
    import importlib
    mod = importlib.import_module(name)
    if as_ is not None:
        name = as_
    # yuck...
    globals()[name] = mod

class NeedsNumpyFoo:
    def __init__(self):
        lazy_import("numpy", as_="np")

    def foo(self):
        return np.array([1, 2])
I could instantiate the module outside the class and point to the imported module if import doesn't fail, but that is the same as the globals() approach. Alternatively lazy_import could return the mod and I could call it whenever the module is needed, but this is tantamount to just importing it everywhere as before.
Is there a better way to handle this?
Pandas actually has a function, import_optional_dependency, which may make a good example (link GitHub), as used in SQLAlchemyEngine (link GitHub).
However, it is only used during class __init__, either to raise a meaningful error (raise ImportError(...) by default!) or to warn about missing or outdated dependencies. The warning case is probably the more practical use, since an older or newer dependency may import fine wherever it is imported, yet still not work, not be tested against, or even turn out to be an accidental local import.
I'd consider doing similarly: either don't bother with special handling at all, or do it only in __init__ (and then perhaps only for the few cases where you care about the version, etc.), and otherwise simply import where needed:
class Foo():
    def __init__(self, ...):
        import bar          # only tests for existence

    def usebar(self, value):
        import bar
        bar.baz(value)
Plausibly you could assign to a property of the class, but this may cause some trouble or confusion (as the import should already be available in globals once imported)
class Foo():
    def __init__(self, ...):
        import bar
        self.bar = bar

    def usebar(self, value):
        self.bar.baz(value)
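If you do want a reusable helper in the spirit of pandas' import_optional_dependency, a minimal sketch (my own, not pandas' implementation) can simply return the module instead of touching globals(); optional_import and the messages here are made-up names:
import importlib

def optional_import(name, error_msg=None):
    """Import a module by name, raising a clearer ImportError if it is missing."""
    try:
        return importlib.import_module(name)
    except ImportError as exc:
        raise ImportError(error_msg or f"Optional dependency '{name}' is required for this feature") from exc

class NeedsNumpyFoo:
    def __init__(self):
        # fail fast with a clear message if the dependency is absent
        self._np = optional_import("numpy", "numpy is required to use NeedsNumpyFoo")

    def foo(self):
        return self._np.array([1, 2])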
Gave it a quick test with a wrapper, seems to work fine:
def requires_math(fn):
    def wrapper(*args, **kwargs):
        global math
        try:
            math
        except NameError:
            import math
        return fn(*args, **kwargs)
    return wrapper

@requires_math
def func():
    return math.ceil(5.5)
print(func())
Edit: More advanced one that works with any module, and ensures it is a module in case it's been set to something else.
from types import ModuleType

def requires_import(*mods):
    def decorator(fn):
        def wrapper(*args, **kwargs):
            for mod in mods:
                if mod not in globals() or not isinstance(globals()[mod], ModuleType):
                    globals()[mod] = __import__(mod)
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@requires_import('math', 'random')
def func():
    return math.ceil(random.uniform(0, 10))
print(func())
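One small refinement (my addition, not part of the answer above): wrapping with functools.wraps keeps the decorated function's name and docstring intact, which helps when debugging or introspecting:
import functools
from types import ModuleType

def requires_import(*mods):
    def decorator(fn):
        @functools.wraps(fn)          # preserve fn.__name__ and fn.__doc__
        def wrapper(*args, **kwargs):
            for mod in mods:
                if mod not in globals() or not isinstance(globals()[mod], ModuleType):
                    globals()[mod] = __import__(mod)
            return fn(*args, **kwargs)
        return wrapper
    return decorator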
I'm trying to find a way to dynamically add methods to a class through a decorator.
The decorator I have looks like:
from functools import wraps

def deco(target):
    def decorator(function):
        @wraps(function)
        def wrapper(self, *args, **kwargs):
            return function(*args, id=self.id, **kwargs)
        setattr(target, function.__name__, wrapper)
        return function
    return decorator

class A:
    pass

# in another module
@deco(A)
def compute(id: str):
    return do_compute(id)

# in another module
@deco(A)
def compute2(id: str):
    return do_compute2(id)

# **in another module**
a = A()
a.compute()   # this should work
a.compute2()  # this should work
My hope is that the decorator adds the compute() function to class A, so that any object of A has the compute() method.
However, in my test, this only works if I explicitly import compute in the module where an object of A is created. I think I'm missing something obvious, but don't know how to fix it. Appreciate any help!
I think this is quite a bit simpler using a decorator implemented as a class:
class deco:
    def __init__(self, cls):
        self.cls = cls

    def __call__(self, f):
        setattr(self.cls, f.__name__, f)
        return self.cls

class A:
    def __init__(self, val):
        self.val = val

@deco(A)
def compute(a_instance):
    print(a_instance.val)

A(1).compute()
A(2).compute()
outputs
1
2
But just because you can do it does not mean you should. This can become a debugging nightmare, and will probably give a hard time to any static code analyser or linter (PyCharm for example "complains" with Unresolved attribute reference 'compute' for class 'A')
Why doesn't it work out of the box when we split it into different modules (more specifically, when compute is defined in another module)?
Assume the following:
a.py
print('importing deco and A')
class deco:
    def __init__(self, cls):
        self.cls = cls

    def __call__(self, f):
        setattr(self.cls, f.__name__, f)
        return self.cls

class A:
    def __init__(self, val):
        self.val = val
b.py
print('defining compute')
from a import A, deco
@deco(A)
def compute(a_instance):
    print(a_instance.val)
main.py
from a import A
print('running main')
A(1).compute()
A(2).compute()
If we execute main.py we get the following:
importing deco and A
running main
Traceback (most recent call last):
A(1).compute()
AttributeError: 'A' object has no attribute 'compute'
Something is missing: defining compute is never printed. Even worse, compute is never defined, let alone bound to A.
Why? Because nothing triggered the execution of b.py. Just because it sits there does not mean it gets executed.
We can force its execution by importing it. It feels kind of abusive to me, but it works, because importing a file has a side effect: it executes every piece of code that is not guarded by if __name__ == '__main__', much like importing a package executes its __init__.py file.
main.py
from a import A
import b
print('running main')
A(1).compute()
A(2).compute()
outputs
importing deco and A
defining compute
running main
1
2
I'm trying to figure out how to decorate a test function in a way that makes the information from the decorator available to setUp. The code looks something like this:
import unittest

class MyTest(unittest.TestCase):
    def setUp(self):
        stopService()
        eraseAllPreferences()
        setTestPreferences()
        startService()

    @setPreference("abc", 5)
    def testPreference1(self):
        pass

    @setPreference("xyz", 5)
    def testPreference2(self):
        pass
The goal is for setUp to understand it's running testPreference1 and to know that it needs to set preference "abc" to 5 before starting the service (& similarly regarding "xyz" and testPreference2).
I can of course just use a conditional on the test name (if self._testMethodName == "testPreference1"), but that doesn't feel as maintainable as the number of tests grows (and refactoring is more error-prone). I'm hoping to solve this in setUp rather than overriding the run implementation.
I'm running Python 3.6, although if there are creative solutions that depend on newer Python features, I'm happy to learn about those too.
Decorators work well but there's no real "official" way to get the underlying method so I just did what the unittest source does: method = getattr(self, self._testMethodName)
import functools
import unittest

def setFoo(value):
    def inner(func):
        print(f"Changing foo for function {func}")
        func.foo = value

        @functools.wraps(func)
        def wrapper(self, *args, **kwargs):
            return func(self, *args, **kwargs)
        return wrapper
    return inner

class Foo(unittest.TestCase):
    def setUp(self):
        method = getattr(self, self._testMethodName)
        print(f"Foo = {method.foo}")

    @setFoo("abc")
    def testFoo(self):
        self.assertEqual(self.testFoo.foo, "abc")

    @setFoo("xyz")
    def testBar(self):
        self.assertEqual(self.testBar.foo, "xyz")

if __name__ == "__main__":
    unittest.main()
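Since the wrapper above only forwards the call, and functools.wraps already copies the foo attribute onto it, a simpler variant (my own simplification, not from the answer) just tags the function and returns it unchanged:
def setFoo(value):
    def inner(func):
        func.foo = value      # attach the metadata directly to the test method
        return func
    return inner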
I am writing a class that sends slack messages to users when processes have finished. I thought it would be useful to provide a Jupyter magic so that users can be notified when the cell is executed.
The class already provides a decorator, so I figured I'd just wrap a cell execution in a decorated function.
from IPython.core.magic import register_cell_magic
from IPython import get_ipython
import functools

class MyClass(object):
    def decorate(self, f):
        @functools.wraps(f)
        def wrapped(*args, **kwargs):
            r = f(*args, **kwargs)
            print('Send a message here!')
            return r
        return wrapped

    @register_cell_magic
    def magic(self, line, cell):
        ip = get_ipython()

        @self.decorate
        def f():
            return ip.run_cell(cell)
        return f()
So then I'd do:
obj = MyClass()
# ----- NEW CELL
%%obj.magic
'''do some stuff'''
But I get
>> UsageError: Cell magic `%%obj.magic` not found.
I found out that the magic is registered under its name (above, magic), so %%magic works. But then the arguments are all messed up because there is no self in the mix.
I want the magic to be an instance method so that config (set in __init__) can be used. Is there any way to do this?
Here are a couple hacky solutions I don't want to implement unless I really have to:
Register a regular function with the instance as an argument. I don't want to add that line of code to the notebook, I want to use an instance method.
Register a regular function that constructs an instance on the fly.
This is the best I can come up with, and it's #1 on the list of the things I didn't want to do.
from IPython.core.magic import register_cell_magic
from IPython import get_ipython
import functools

class MyClass(object):
    def decorate(self, f):
        @functools.wraps(f)
        def wrapped(*args, **kwargs):
            r = f(*args, **kwargs)
            print('Send a message here!')
            return r
        return wrapped

    def register_magic(self):
        @register_cell_magic
        def magic(line, cell):
            ip = get_ipython()

            @self.decorate
            def f():
                return ip.run_cell(cell)
            return f()
Then
obj = MyClass()
obj.register_magic()
# ------
%%magic
...
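For what it's worth, IPython also has a documented way to give magics per-instance state: subclass Magics and register an instance of it. A sketch along those lines (NotifyMagics, notifier, and the notify name are made-up; the messaging from MyClass would slot into the marked line):
from IPython.core.magic import Magics, magics_class, cell_magic
from IPython import get_ipython

@magics_class
class NotifyMagics(Magics):
    def __init__(self, shell, notifier):
        super(NotifyMagics, self).__init__(shell)
        self.notifier = notifier              # state normally set up in MyClass.__init__

    @cell_magic
    def notify(self, line, cell):
        result = self.shell.run_cell(cell)
        print('Send a message here!')         # e.g. self.notifier.send(...)
        return result

ip = get_ipython()
ip.register_magics(NotifyMagics(ip, notifier=None))
After that, %%notify works in a cell and still has access to the instance's configuration.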
This works in a script to recognise whether a is an instance of myproject.aa.RefClass:
isinstance(a, myproject.aa.RefClass)
But how could I do it so I do not have to specify the full namespace ? I would like to be able to type:
isinstance(a, RefClass)
How is this done in Python ?
EDIT: let me give more details.
In module aa.referencedatatable.py:
class ReferenceDataTable(object):
    def __init__(self, name):
        self.name = name

    def __call__(self, f):
        self._myfn = f
        return self

def referencedatatable_from_tag(tag):
    import definitions
    defn_lst = [definitions]
    for defn in defn_lst:
        referencedatatable_instance_lst = [getattr(defn, a) for a in dir(defn)
                                           if isinstance(getattr(defn, a), ReferenceDataTable)]
        for referencedatatable_instance in referencedatatable_instance_lst:
            if referencedatatable_instance.name == tag:
                return referencedatatable_instance
    raise Exception("could not find")

def main():
    referencedatatable_from_tag("Example")
In module aa.definitions.py:
from aa.referencedatatable import ReferenceDataTable

@ReferenceDataTable("Example")
def EXAMPLE():
    raise NotImplementedError("not written")
For some reason, calling main from aa.referencedatatable.py throws, as it is not able to recognise the instance of the class. But if I copy this main into another module it works:
import aa.referencedatatable
a = aa.referencedatatable.referencedatatable_from_tag("Example")
print a
This second example works; for some reason, calling the function inside the same module where the class is declared does not.
The 'namespace' is just a module object, and the class is simply an attribute of it. You can always assign the class to a different name:
RefClass = myproject.aa.RefClass
or better yet, import it directly into your own namespace:
from myproject.aa import RefClass
Either way, now you have a global name RefClass that references the class object, so you can do:
isinstance(a, RefClass)
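Put together (a small sketch; handle and the print are just for illustration):
from myproject.aa import RefClass

def handle(a):
    # RefClass is now a global name in this module, so no full path is needed
    if isinstance(a, RefClass):
        print("a is a RefClass instance")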