Dynamic class import and object building in Python

I have a Python package containing multiple modules. In those modules, I have multiple classes inheriting from a Component class. I'd like to load those classes dynamically and build objects from them dynamically.
For example:
package/module1.py
package/module2.py
In module1.py there are multiple classes inheriting from Component, and likewise in module2.py; of course, the number of classes and modules is unknown in advance. The end user defines which objects have to be built in a config file. To iterate through the modules I use pkgutil.iter_modules, which works. From my function in charge of building the components, I do this:
[...]
myPckge = __import__('package.module1', globals(), locals(), ['class1'], -1)
cmpt_object = locals()[component_name](self, component_prefix, *args)
[...]
However, this does not work, as the class is not recognized. The following works but is not dynamic:
cmpt_object = myPckge.class1(self, component_prefix, *args)
Thanks for your replies.

You can use execfile() to load modules on the fly and then use exec() to create new objects from them, but I don't understand why you're doing this! (Note that execfile() only exists in Python 2; in Python 3 you would use exec(open(path).read()) instead.)
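For the record, in modern Python the importlib module covers this directly, with no exec needed: import the module by its dotted name, then pull the class out with getattr. A minimal sketch (load_class is my name for it; 'package.module1'/'class1' are the question's hypothetical names, so the demo below uses a stdlib module instead):

```python
import importlib

def load_class(module_path, class_name):
    """Load a class by dotted module path and class name,
    e.g. load_class('package.module1', 'class1')."""
    module = importlib.import_module(module_path)
    return getattr(module, class_name)

# Demo with a real stdlib module, since 'package.module1' is hypothetical:
cls = load_class('collections', 'OrderedDict')
print(cls.__name__)  # OrderedDict
```

An unknown class name then raises AttributeError, which is easier to report back to the user than a KeyError from a locals() lookup.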

To find the subclasses of a class in a specified module, you can do something like:
import inspect

def find_subclasses(module, parent_cls):
    return [clazz for name, clazz in inspect.getmembers(module)
            if inspect.isclass(clazz)
            and issubclass(clazz, parent_cls)
            and clazz.__module__ == module.__name__  # do not keep imported classes
            and clazz is not parent_cls]
Note that parent_cls doesn't have to be the direct parent of a class for it to be returned.
Then you can dynamically load classes from a module, knowing the module's name and directory, and the parent class of the classes you want.
import imp  # note: imp is deprecated; importlib is the modern replacement

def load_classes(module_name, module_dir, parent_cls):
    fle, path, descr = imp.find_module(module_name, [module_dir])
    if fle:
        module = imp.load_module(module_name, fle, path, descr)
        classes = find_subclasses(module, parent_cls)
        return classes
    return []  # module not found
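As a quick sanity check, find_subclasses can be exercised against a module built in memory, so no files are needed (Component, demo, and the Widget/Gadget classes below are made up for the demonstration):

```python
import inspect
import types

def find_subclasses(module, parent_cls):
    return [clazz for name, clazz in inspect.getmembers(module)
            if inspect.isclass(clazz)
            and issubclass(clazz, parent_cls)
            and clazz.__module__ == module.__name__  # do not keep imported classes
            and clazz is not parent_cls]

class Component:
    pass

# Build a throwaway module in memory instead of loading one from disk.
demo = types.ModuleType('demo')
demo.__dict__['Component'] = Component  # an "imported" name; should be filtered out
exec("class Widget(Component): pass\n"
     "class Gadget(Component): pass\n"
     "class Unrelated: pass\n", demo.__dict__)

print(sorted(c.__name__ for c in find_subclasses(demo, Component)))
# ['Gadget', 'Widget']
```

Note that Component itself and Unrelated are correctly excluded: the first by the __module__ check, the second by the issubclass check.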

Related

Modify function and keep code workability

I'm not a professional coder, but I use Python from time to time for my scientific needs. So I want to learn the most Pythonic way to do the following:
I'm working with an already existing module, and a class there looks like this:
class ATS(Instrument):
    def __init__(self, ..., dll_path: str):
        ...
        self._dll = ctypes.cdll.LoadLibrary(dll_path)
        ...

    def _call_dll(self, func_name: str, *args) -> None:
        func = getattr(self._dll, func_name)
        output = func(*args)
        ...
I found that I need to use different DLLs, each with its own functions (and unfortunately, function names can be the same across DLLs).
The question is: what is the most Pythonic way to modify that _call_dll function so that I can explicitly specify which DLL to use for a particular function, while keeping the rest of the code, which uses the old version of _call_dll, working?
I see several ways to do this, but I'm not sure which is the most professional and well-styled.
Create a separate _call_dll_n function for each dll_n I want to use, but that's neither compact nor nice.
Add a prefix to the function name to specify the DLL, like:
class ATS(Instrument):
    def __init__(self, ..., dll_path, dll_path_1, ...):
        ...
        self._dll = ctypes.cdll.LoadLibrary(dll_path)
        self._dll_1 = ctypes.cdll.LoadLibrary(dll_path_1)
        ...

    def _call_dll(self, pre_func_name: str, *args) -> None:
        if pre_func_name[:5] == 'dll_1':
            dll = self._dll_1
            func_name = pre_func_name[5:]
            func = getattr(dll, func_name)
            ...
        else:
            dll = self._dll  # Default DLL.
            func_name = pre_func_name
Make my own _my_call_dll:
def _my_call_dll(self, func_name: str, *args, dll=None):
    if dll is None:
        self._call_dll(func_name, *args)
    else:
        dll_bckp = self._dll
        self._dll = dll
        self._call_dll(func_name, *args)
        self._dll = dll_bckp
Your help on this particular example is appreciated, but more general ideas about how to work with and modify already existing functions/classes are also very welcome.
You don't need to modify the code; the ATS class, as you presented it, already allows for what you've described. Instead: create multiple instances of ATS, each one specifying which DLL to use.
Your description of the problem entails two parts:
A mapping, from some key to the DLL file path.
An API wrapper, allowing you to specify a key into the above mapping when you call the API.
In Python, the built-in ‘dict’ type is the natural way to implement a mapping. You can use a plain string as the key.
import os.path

# You might get the path to your DLL files some other way,
# for example by reading a process environment variable.
# In this example I just hard-code the path root.
dll_file_path_root = os.path.join('/usr/lib', 'ats')

dll_file_paths = {
    'foo': os.path.join(dll_file_path_root, 'foo.dll'),
    'bar': os.path.join(dll_file_path_root, 'bar_v5.dll'),
    'baz': os.path.join(dll_file_path_root, 'super_baz.dll'),
}
The existing ATS class, as you present it above, already implements an API wrapper. Each instance holds a reference (its internal-use _dll attribute) to the DLL that instance will talk to. The class will initialise each instance of ATS with whatever DLL you specify. So:
# Create an ATS instance that holds a reference to the ‘foo.dll’ library.
foo_ats = ATS(dll_path=dll_file_paths['foo'])
# Create an ATS instance that holds a reference to the ‘bar_v5.dll’ library.
bar_ats = ATS(dll_path=dll_file_paths['bar'])
# Call the ‘lorem’ function in the ‘foo.dll’ library.
foo_ats._call_dll(func_name='lorem')
# Call the ‘lorem’ function in the ‘bar_v5.dll’ library.
bar_ats._call_dll(func_name='lorem')
This is one of the primary benefits of defining classes: They encapsulate the common behaviour of a class of objects, while allowing each object to have individual attributes that differentiate them.
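If a single object really must talk to several libraries, the same mapping idea also works inside the class: keep a dict of handles keyed by a short name, and let _call_dll take an optional key. The sketch below is mine, not the original ATS code, and the ctypes handles are replaced by plain stand-in objects so it runs without any actual DLLs:

```python
class MultiDllCaller:
    """Sketch: hold several library handles, dispatch by key.

    Real code would populate `handles` with ctypes.cdll.LoadLibrary(path)
    results; plain objects stand in here so the sketch runs anywhere.
    """
    def __init__(self, handles, default='main'):
        self._dlls = dict(handles)
        self._default = default

    def _call_dll(self, func_name, *args, dll=None):
        handle = self._dlls[dll if dll is not None else self._default]
        return getattr(handle, func_name)(*args)

# Stand-ins for two loaded libraries that both export a `lorem` function:
class FakeLib:
    def __init__(self, tag):
        self.tag = tag
    def lorem(self, x):
        return f'{self.tag}:{x}'

caller = MultiDllCaller({'main': FakeLib('main'), 'extra': FakeLib('extra')})
print(caller._call_dll('lorem', 7))               # main:7
print(caller._call_dll('lorem', 7, dll='extra'))  # extra:7
```

Existing call sites keep working unchanged, since the dll argument is keyword-only and defaults to the main handle.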

Dynamically import all subclasses

I have an abstract base class with a number of derived classes. I'm trying to achieve the same behaviour that I would get by placing all the derived classes in the same file as the base class, i.e. if my classes are Base, DerivedA, DerivedB, DerivedC in the file myclass.py I can write in another file
import myclass
a = myclass.DerivedA()
b = myclass.DerivedB()
c = myclass.DerivedC()
but with each derived class in its own file. This has to be dynamic, i.e. such that I could e.g. delete derived_c.py and everything would still work, except that I could no longer call myclass.DerivedC; and if I add a derived_d.py, I could use it without touching __init__.py. So simply using from derived_c import DerivedC is not an option.
I've tried placing them all in a subdirectory and in that directory's __init__.py use pkgutil.walk_packages() to import all the files dynamically, but I can't get them to then be directly in the module's namespace, i.e. rather than myclass.DerivedC() I have to call myclass.derived_c.DerivedC() because I can't figure out how (or if it's possible) to use importlib to achieve the equivalent of a from xyz import * statement.
Any suggestions for how I could achieve this? Thanks!
Edit: The solutions for Dynamic module import in Python don't provide a method for automatically importing the classes in all modules into the namespace of the package.
I had to make something quite similar a while back, but in my case I had to dynamically create a list with all subclasses from a base class in a specific package, so in case you find it useful:
Create a my_classes package containing all files for your Base class and all subclasses. You should include only one class in each file.
Set __all__ appropriately in __init__.py to import all .py files except for __init__.py (from this answer):
from os import listdir
from os.path import dirname, basename

__all__ = [basename(f)[:-3] for f in listdir(dirname(__file__))
           if f[-3:] == ".py" and not f.endswith("__init__.py")]
Import your classes using from my_classes import *, since our custom __all__ adds all classes inside the my_classes package to the namespace.
However, this does not allow us direct access to the subclasses yet. You have to access them like this in your main script:
from my_classes import *
from my_classes.base import Base
subclasses = Base.__subclasses__()
Now subclasses is a list containing all classes that derive from Base.
Since Python 3.6 there is a hook for initializing subclasses, __init_subclass__. It runs at class-definition time, so before the rest of your code executes. In it you can simply import the module of the subclass being initialized.
base.py
class Base:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        __import__(cls.__module__)
sub1.py
class Sub1(Base):
    pass
sub2.py
class Sub2(Base):
    pass
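The same hook can also register subclasses as they are defined, which removes the need to search modules at all once they have been imported. A self-contained sketch (all in one file here so it runs as-is; the registry name is mine):

```python
class Base:
    registry = {}

    def __init_subclass__(cls, **kwargs):
        # Runs at class-definition time, before any instance exists.
        super().__init_subclass__(**kwargs)
        Base.registry[cls.__name__] = cls

class Sub1(Base):
    pass

class Sub2(Base):
    pass

print(sorted(Base.registry))   # ['Sub1', 'Sub2']
# Build an instance by name, as a config file might request:
obj = Base.registry['Sub1']()
print(type(obj).__name__)      # Sub1
```

This gives the same name-to-class lookup that the myclass.DerivedC attribute access would, without any __init__.py tricks.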

Defining same method override on lots of classes: DRY?

Suppose I have a large number of classes defined by an import of a large library codebase, which I don't want to hack around with for reasons of maintainability. They all inherit from BaseClass, and BaseClass contains a method which I want to augment. I think the following is a workable solution
class MyMixin(object):
    def method(self, args):
        ...  # 1. a few lines of code copied from BaseClass's def of method
        ...  # 2. some lines of my code that can't go before or after the copied code
        ...  # 3. and the rest of the copied code

class MyAbcClass(MyMixin, AbcClass):
    pass

# many similar lines

class MyZzzClass(MyMixin, ZzzClass):
    pass
The question. Is there a way to take, say, a list of ("MyXxxClass", XxxClass) tuples, and write code that defines the MyXxxClasses? And is it sufficiently comprehensible that it beats the repetition in the above?
Use three-arg type to define the classes, then set them on the module's global dictionary:
todefine = [('MyAbcClass', AbcClass), ...]
for name, base in todefine:
    globals()[name] = type(name, (MyMixin, base), {})
If the names to define follow the fixed pattern you gave ("My" + the base class name), you can repeat yourself even less by dynamically constructing the name to define:
todefine = [AbcClass, ...]
for base in todefine:
    name = "My" + base.__name__
    globals()[name] = type(name, (MyMixin, base), {})
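Put together, here is a self-contained run of that pattern; AbcClass and ZzzClass are stand-ins for the library classes, and MyMixin's method calls the base implementation via super() rather than copying it (which works whenever your extra code goes at the start or end of the base method):

```python
class MyMixin:
    def method(self):
        # Augment the base class behaviour via cooperative super().
        return 'mixin:' + super().method()

# Stand-ins for the library's classes:
class AbcClass:
    def method(self):
        return 'abc'

class ZzzClass:
    def method(self):
        return 'zzz'

for base in (AbcClass, ZzzClass):
    name = 'My' + base.__name__
    globals()[name] = type(name, (MyMixin, base), {})

print(MyAbcClass().method())  # mixin:abc
print(MyZzzClass().method())  # mixin:zzz
```

The MRO puts MyMixin before each base, so the mixin's method wins and can still reach the base's version through super().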
And if you are trying to wrap all the classes from a given module, you can avoid even explicitly listing the classes by introspecting the module to generate todefine programmatically (if you know the module has or lacks __all__ you can just use the appropriate approach instead of trying one and defaulting to the other):
import inspect

try:
    # For modules that define __all__, we want all exported classes,
    # even if they weren't originally defined in the module
    todefine = filter(inspect.isclass,
                      (getattr(somemodule, name) for name in somemodule.__all__))
except AttributeError:
    # If __all__ is not defined, heuristic approach: exclude private names
    # defined with a leading underscore, and objects that were imported from
    # other modules (so if the module does from itertools import chain,
    # we don't wrap chain)
    todefine = (obj for name, obj in vars(somemodule).items()
                if not name.startswith('_') and inspect.isclass(obj)
                and inspect.getmodule(obj) is somemodule)

Adding functions from other files to a Python class

I am having trouble with this setup mainly because I am not sure what I actually want in order to solve this problem.
This is the setup
- main.py
- lib
- __init__.py
- index.py
- test.py
__init__.py has this code
import os

for module in os.listdir(os.path.dirname(__file__) + "/."):
    if module == '__init__.py' or module[-3:] != '.py':
        continue
    __import__(module[:-3], locals(), globals())
del module
main.py has this code as of now
from lib.index import *

print(User.__dict__)
index.py has this code
class User(object):
    def test(self):
        return "hi"
test.py has this code
class User(object):
    def tes2(self):
        return "hello"
When I execute main.py it successfully prints the method test from index.py, but what I am trying to do is figure out a way where I can just create a file in the lib folder where that file has only one function, in the format
class User(object):
    def newFunction(self):
        return abc
and this function should automatically be available for me in main.py
I am sure this is not a hard thing to do, but I honestly don't know what I want (i.e. what to search for to solve this), which is preventing me from researching the solution.
You can use a metaclass to customize class creation and add functions defined elsewhere:
import types
import os
import os.path
import imp

class PluginMeta(type):
    def __new__(cls, name, bases, dct):
        modules = [imp.load_source(filename, os.path.join(dct['plugindir'], filename))
                   for filename in os.listdir(dct['plugindir'])
                   if filename.endswith('.py')]
        for module in modules:
            for attr_name in dir(module):  # don't shadow `name` used below
                function = getattr(module, attr_name)
                if isinstance(function, types.FunctionType):
                    dct[function.__name__] = function
        return type.__new__(cls, name, bases, dct)

class User(metaclass=PluginMeta):
    plugindir = "path/to/the/plugindir"

    def foo(self):
        print("foo")

user = User()
print(dir(user))
Then in the plugin files, just create functions not classes:
def newFunction(self, abc):
    self.abc = abc
    return self.abc
And the metaclass will find them, turn them into methods, and attach them to your class.
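A file-free sketch of the same idea, with the plugin functions supplied in a plain dict instead of being loaded from a directory, so it runs as-is (plugin_functions and the names in it are made up for illustration):

```python
import types

# Stand-ins for functions that would normally be loaded from plugin files:
def greet(self, name):
    return 'hi ' + name

def shout(self, name):
    return 'HI ' + name.upper()

plugin_functions = {'greet': greet, 'shout': shout}

class PluginMeta(type):
    def __new__(cls, name, bases, dct):
        # Same idea as scanning plugindir, but sourced from a dict here.
        for fname, func in plugin_functions.items():
            if isinstance(func, types.FunctionType):
                dct.setdefault(fname, func)  # don't clobber real methods
        return type.__new__(cls, name, bases, dct)

class User(metaclass=PluginMeta):
    def foo(self):
        return 'foo'

user = User()
print(user.greet('dogg'))  # hi dogg
print(user.foo())          # foo
```

Because the functions land in the class dict before type.__new__ runs, they become ordinary bound methods on instances, exactly as if they had been written in the class body.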
Classes are objects, and methods are nothing more than attributes on class-objects.
So if you want to add a method to an existing class, outside the original class block, it is just the problem of adding an attribute to an object, which I would hope you know how to do:
class User(object):
    pass

def newFunction(self):
    return 'foo'

User.newFunction = newFunction
agf's metaclass answer is basically a nifty automatic way of doing this, although it works by adding extra definitions to the class block before the class is created, rather than adding extra attributes to the class object afterwards.
That should be basically all you need to develop a framework in which things defined in one module are automatically added to a class defined elsewhere. But you still need to make a number of design decisions, such as:
If your externally-defined functions need auxiliary definitions, how do you determine what's supposed to get added to the class and what was just a dependency?
If you have more than one class you're extending this way, how do you determine what goes in which class?
At what point(s) in your program does the auto-extension happen?
Do you want to say in your class "this class has extensions defined elsewhere", or say in your extensions "this is an extension to a class defined elsewhere", or neither and somewhere bind extensions to classes externally from both?
Do you need to be able to have multiple versions of the "same" class with different extensions active at the same time?
A metaclass such as proposed by agf can be a very good way of implementing this sort of framework, because it lets you put all the complex code in one place while still "tagging" every class that doesn't work the way classes normally work. It does fix the answers to some of the questions I posed above, though.
Here is working code we used in a project. I'm not sure it's the best way, but it worked, and there is almost no additional code to add to the other files.
cpu.py:
from cpu_base import CPU, CPUBase
import cpu_common
import cpu_ext
cpu_base.py:
__classes__ = []

def getClass():
    return __cpu__

def setClass(CPUClass):
    global __cpu__
    __cpu__ = CPUClass
    __classes__.append(CPUClass)

def CPU(*kw):
    return __cpu__(*kw)

class CPUBase:
    def __init__(self):
        ...  # your init stuff
        # optionally a method classname_constructor to mimic __init__ for each one
        for c in __classes__:
            constructor = getattr(c, c.__name__ + '_constructor', None)
            if constructor is not None:
                constructor(self)

setClass(CPUBase)
cpu_common.py:
from cpu_base import getClass, setClass

class CPUCommon(getClass()):
    def CPUCommon_constructor(self):
        pass

setClass(CPUCommon)
cpu_ext.py:
from cpu_base import getClass, setClass

class CPUExt(getClass()):
    pass

setClass(CPUExt)
To use the class, import CPU from cpu.py.
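Collapsed into a single file so the mechanics are visible and runnable (the real version splits the pieces across cpu_base.py, cpu_common.py, and cpu_ext.py as above; common_ready is a made-up attribute for the demonstration):

```python
# Each setClass() call records the newest subclass; the next class in the
# chain inherits from getClass(), i.e. from whatever was registered last.
__classes__ = []

def getClass():
    return __cpu__

def setClass(cls):
    global __cpu__
    __cpu__ = cls
    __classes__.append(cls)

def CPU(*args):
    return __cpu__(*args)

class CPUBase:
    def __init__(self):
        # Mimic per-class __init__ via optional <ClassName>_constructor methods.
        for c in __classes__:
            constructor = getattr(c, c.__name__ + '_constructor', None)
            if constructor is not None:
                constructor(self)

setClass(CPUBase)

class CPUCommon(getClass()):   # inherits CPUBase
    def CPUCommon_constructor(self):
        self.common_ready = True

setClass(CPUCommon)

class CPUExt(getClass()):      # inherits CPUCommon
    pass

setClass(CPUExt)

cpu = CPU()
print(type(cpu).__name__, cpu.common_ready)  # CPUExt True
```

Calling CPU() always instantiates the most-derived class registered so far, and CPUBase.__init__ runs every per-class constructor hook in registration order.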

How do I extend a python module? Adding new functionality to the `python-twitter` package

What are the best practices for extending an existing Python module – in this case, I want to extend the python-twitter package by adding new methods to the base API class.
I've looked at tweepy, and I like that as well; I just find python-twitter easier to understand and extend with the functionality I want.
I have the methods written already – I'm trying to figure out the most Pythonic and least disruptive way to add them into the python-twitter package module, without changing this module's core.
A few ways.
The easy way:
Don't extend the module, extend the classes.
exttwitter.py
import twitter

class Api(twitter.Api):
    pass
    # override/add any functions here.
Downside: every class in twitter must be in exttwitter.py, even if it's just a stub (as above).
A harder (possibly un-pythonic) way:
Import * from python-twitter into a module that you then extend.
For instance :
basemodule.py
class Ball():
    def __init__(self, a):
        self.a = a

    def __repr__(self):
        return "Ball(%s)" % self.a

def makeBall(a):
    return Ball(a)

def override():
    print("OVERRIDE ONE")

def dontoverride():
    print("THIS WILL BE PRESERVED")
extmodule.py
from basemodule import *
import basemodule

def makeBalls(a, b):
    foo = makeBall(a)
    bar = makeBall(b)
    print(foo, bar)

def override():
    print("OVERRIDE TWO")

def dontoverride():
    basemodule.dontoverride()
    print("THIS WAS PRESERVED")
runscript.py
import extmodule

# code is in extended module
extmodule.makeBalls(1, 2)
# prints Ball(1) Ball(2)

# code is in base module
print(extmodule.makeBall(1))
# prints Ball(1)

# function from extended module overwrites base module
extmodule.override()
# prints OVERRIDE TWO

# function from extended module calls base module first
extmodule.dontoverride()
# prints THIS WILL BE PRESERVED\nTHIS WAS PRESERVED
I'm not sure if the double import in extmodule.py is Pythonic - you could remove it, but then you don't handle the use case of wanting to extend a function that was in the namespace of basemodule.
As far as extended classes, just create a new API(basemodule.API) class to extend the Twitter API module.
Don't add them to the module. Subclass the classes you want to extend and use your subclasses in your own module, not changing the original stuff at all.
Here’s how you can directly manipulate the module table (sys.modules) at runtime. Spoiler alert: you get the module type from the types module:
from __future__ import print_function
import sys
import types
import typing as tx

def modulize(namespace: tx.Dict[str, tx.Any],
             modulename: str,
             moduledocs: tx.Optional[str] = None) -> types.ModuleType:
    """ Convert a dictionary mapping into a legit Python module """
    # Create a new module with a trivially namespaced name:
    namespacedname: str = f'__dynamic_modules__.{modulename}'
    module = types.ModuleType(namespacedname, moduledocs)
    module.__dict__.update(namespace)
    # Inspect the new module:
    name: str = module.__name__
    doc: tx.Optional[str] = module.__doc__
    contents: str = ", ".join(sorted(module.__dict__.keys()))
    print(f"Module name: {name}")
    print(f"Module contents: {contents}")
    if doc:
        print(f"Module docstring: {doc}")
    # Add to sys.modules, as per import machinery:
    sys.modules.update({ modulename : module })
    # Return the new module instance:
    return module
… you could then use such a function like so:
ns = {
    'func': lambda: print("Yo Dogg"),  # these can also be normal non-lambda funcs
    'otherfunc': lambda string=None: print(string or 'no dogg.'),
    '__all__': ('func', 'otherfunc'),
    '__dir__': lambda: ['func', 'otherfunc'],  # usually this’d reference __all__
}

modulize(ns, 'wat', "WHAT THE HELL PEOPLE")
import wat

# Call module functions:
wat.func()
wat.otherfunc("Oh, Dogg!")

# Inspect module:
contents = ", ".join(sorted(wat.__dict__.keys()))
print(f"Imported module name: {wat.__name__}")
print(f"Imported module contents: {contents}")
print(f"Imported module docstring: {wat.__doc__}")
… You could also create your own module subclass, by specifying types.ModuleType as the ancestor of your newly declared class, of course; I have never personally found this necessary to do.
(Also, you don’t have to get the module type from the types module – you can always just do something like ModuleType = type(os) after importing os – I specifically pointed out this one source of the type because it is non-obvious; unlike many of its other builtin types, Python doesn’t offer up access to the module type in the global namespace.)
The real action is in the sys.modules dict, where (if you are appropriately intrepid) you can replace existing modules as well as adding your new ones.
Say you have an older module called mod that you use like this:
import mod
obj = mod.Object()
obj.method()
mod.function()
# and so on...
And you want to extend it without replacing it for your users. Easily done. You can give your new module a different name, newmod.py, or place it under the same name at a deeper path, e.g. /path/to/mod.py. Then your users can import it in either of these ways:
import newmod as mod # e.g. import unittest2 as unittest idiom from Python 2.6
or
from path.to import mod # useful in a large code-base
In your module, you'll want to make all the old names available:
from mod import *
or explicitly name every name you import:
from mod import Object, function, name2, name3, name4, name5, name6, name7, name8, name9, name10, name11, name12, name13, name14, name15, name16, name17, name18, name19, name20, name21, name22, name23, name24, name25, name26, name27, name28, name29, name30, name31, name32, name33, name34, name35, name36, name37, name38, name39
I think the import * will be more maintainable for this use case - if the base module expands its functionality, you'll seamlessly keep up (though you might shadow new objects with the same name).
If the mod you are extending has a decent __all__, it will restrict the names imported.
You should also declare an __all__ and extend it with the extended module's __all__.
import mod

__all__ = ['NewObject', 'newfunction']
__all__ += mod.__all__
# if it doesn't have an __all__, maybe it's not good enough to extend,
# but it could be relying on the convention that import * does not import
# names prefixed with underscores (_like _this)
Then extend the objects and functionality as you normally would.
class NewObject(object):
    def newmethod(self):
        """this method extends Object"""

def newfunction():
    """this function builds on mod's functionality"""
If the new objects provide functionality you intend to replace (or perhaps you are backporting the new functionality into an older code base) you can overwrite the names
May I suggest not reinventing the wheel here? I've been building a >6k-line Twitter client for two months now. At first I checked python-twitter too, but it's lagging a lot behind the recent API changes; development doesn't seem to be very active either, and there was (at least when I last checked) no support for OAuth/xAuth.
So after searching around a bit more I discovered tweepy:
http://github.com/joshthecoder/tweepy
Pros: active development, OAuth/xAuth support, and it's up to date with the API.
Chances are high that what you need is already in there.
So I suggest going with that; it's working for me. The only thing I had to add was xAuth (and that got merged back into tweepy :)
Oh, and a shameless plug: if you need to parse tweets and/or format them to HTML, use my Python version of the twitter-text-* libraries:
http://github.com/BonsaiDen/twitter-text-python
This thing is unit-tested and guaranteed to parse tweets just like Twitter.com does.
Define a new class, and instead of inheriting from the class you want to extend in the original module, add an instance of the original class as an attribute of your new class.
And here comes the trick: intercept all non-existing method calls on the new class and try them on the instance of the old class.
In your NewClass just define new or overridden methods as you like:
import originalmodule

class NewClass:
    def __init__(self, *args, **kwargs):
        self.old_class_instance = originalmodule.create_oldclass_instance(*args, **kwargs)

    def __getattr__(self, methodname):
        """This is a wrapper for the original OldClass class.
        If the called method is not part of this NewClass class,
        the call will be intercepted and replaced by the method
        in the original OldClass instance.
        """
        def wrapper(*args, **kwargs):
            return getattr(self.old_class_instance, methodname)(*args, **kwargs)
        return wrapper

    def new_method(self, arg1):
        """Does stuff with the OldClass instance"""
        thing = self.old_class_instance.get_somelist(arg1)
        # returns the first element only
        return thing[0]

    def overridden_method(self):
        """Overrides an existing method, if OldClass has a method with the same name"""
        print("This message is coming from the NewClass and not from the OldClass")
In my case I used this solution when simple inheritance from the old class was not possible, because an instance had to be created not by its constructor, but with an init script from another class/module. (That is the originalmodule.create_oldclass_instance in the example above.)
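With a small stand-in for the original class, the delegation pattern can be exercised end to end (OldClass and its methods below are made up for the test):

```python
class OldClass:
    def get_somelist(self, arg1):
        return [arg1, arg1 * 2]
    def overridden_method(self):
        return 'old'

class NewClass:
    def __init__(self):
        self.old_class_instance = OldClass()

    def __getattr__(self, methodname):
        # Only invoked for names NOT found on NewClass itself.
        def wrapper(*args, **kwargs):
            return getattr(self.old_class_instance, methodname)(*args, **kwargs)
        return wrapper

    def new_method(self, arg1):
        return self.old_class_instance.get_somelist(arg1)[0]

    def overridden_method(self):
        return 'new'

obj = NewClass()
print(obj.get_somelist(3))      # [3, 6] -- delegated to OldClass
print(obj.new_method(3))        # 3
print(obj.overridden_method())  # new -- NewClass wins
```

One caveat of this approach: isinstance checks against the old class fail, since NewClass is not related to it by inheritance.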
