Polymorphism or inheritance or another suggestion? - Python

I am trying my hand at Python. I am implementing a crypto class that does encryption/decryption. My crypto class requires the user to pass 3 arguments to perform the enc/dec operations. Until now I was reading the key from a file and doing the operations. Now I also want to provide a generate-key function. The problem is that, to call generate keys, I don't want the user to provide any arguments when instantiating the class.
So what I am essentially trying to achieve is that when the crypto class is instantiated without any arguments, I only want to expose the generate_key function, and when all 3 arguments are provided at instantiation, I want to expose all the other enc/dec functions but not the key-gen function.
I am not able to tell whether this is a polymorphism situation, or inheritance, or whether I should just use two different classes: one having generate keys and the other the enc/dec functions.
Please give me some suggestions on how I can handle this situation efficiently.
Example:
class crypto:
    def __init__(self, id, val1):
        self._id = id
        self._val1 = val1
    def encrypt(self):
        """ encryption here """
    def save(self):
        """ save to file """
    def load(self):
        """ load from file"""
    def decrypt(self):
        """ decryption here"""
    def gen_keys(self):
        """ gen key here"""
So now, when this crypto class is instantiated with no arguments, I just want to expose the gen_keys function, and when it is instantiated with id and val1, I want to expose all functions except gen_keys.
I hope this clarifies my question. Please suggest how I can achieve this.
Thanks,
Jon

You want a factory with either inherited or duck-typed objects. For example:
class CryptoBasic(object):
    def __init__(self, *args):
        """Do what you need to do."""
    def basic_method(self, *args):
        """Do some basic method."""

class CryptoExtended(CryptoBasic):
    def __init__(self, *args):
        """Do what you need to do."""
    def extended_method(self, *args):
        """Do more."""

# This is the factory method
def create_crypto(req_arg, opt_arg=None):
    if opt_arg:
        return CryptoExtended(req_arg, opt_arg)
    else:
        return CryptoBasic(req_arg)
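To illustrate how callers would use the factory, here is a minimal usage sketch; "some_id" and "some_val" are placeholder arguments, and the method names stand in for your gen_keys and encrypt/decrypt operations:

crypto = create_crypto("some_id")               # no optional argument
crypto.basic_method()                           # only the basic (key-gen) API is exposed

crypto = create_crypto("some_id", "some_val")   # both arguments supplied
crypto.extended_method()                        # the extended enc/dec API is exposed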

Related

Refactor the Python design patterns

I have a class called resources in which I have defined a method called get_connect. I want to use the data that get_connect returns in other classes. I need at least three classes that use the data from get_connect, and I have to parse that data. To implement this I have written the code below:
class resources:
    @staticmethod
    def get_connect():
        return 1 + 2

class Source1(resources):
    def __init__(self):
        self.response = resources.get_connect()
    def get__details1(self):
        print(self.response)

class Source2(resources):
    def __init__(self):
        self.response = resources.get_connect()
    def get_details2(self):
        print(self.response)

class Source3(resources):
    def __init__(self):
        self.response = resources.get_connect()
    def get__detail3(self):
        print(self.response)

source1 = Source1()
source2 = Source2()
source3 = Source3()
source1.get__details1()
source2.get_details2()
source3.get__detail3()
But the problem with the code is that in every class's __init__ method I am calling the get_connect method. I don't want to repeat that code. I need help avoiding the redundancy, as asked below:
Is there any way I can call get_connect in one place and use it for the other classes, maybe with a decorator or anything? If yes, how?
While creating objects I am also calling each class and each method every time. Is there a way to use a design pattern here?
If anyone can help me with these OOP concepts, it would be useful.
First of all, is there any reason why you are making get_connect a static method?
Because what you can do here is declare it in the parent class:
class resources:
    def __init__(self):
        self.response = self.get_connect()
    def get_connect(self):
        return 1 + 2
This way you do not need to define the __init__ method on every class, as it will be automatically inherited from the parent.
Regarding the second question, it really depends on the context, but you can use a strategy pattern to look up the class you need to call. For this, rename the get-details method to the same name in each of the classes, since they are all used for the same purpose and differ only in each class's implementation:
class Source1(resources):
    def get_details(self):
        print(self.response)

class Source2(resources):
    def get_details(self):
        print(self.response)

class Source3(resources):
    def get_details(self):
        print(self.response)

classes = {
    "source_1": Source1,
    "source_2": Source2,
    "source_3": Source3
}

source_class = classes["source_1"]
source = source_class()
source.get_details()
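If you need all of the sources at once, the same classes mapping also lets you instantiate and call them in one loop instead of naming each object by hand; a small sketch building on the code above:

for name, source_class in classes.items():
    source = source_class()   # __init__ is inherited from resources
    source.get_details()      # each prints 3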
Hope this helped!

Using Configparser to create objects of a class?

I'm a little stumped on how to do this.
Let's say I have a employees.ini file that looks like this:
[amber]
sex=female
age=29
location=usa
income=60000
debt=300
[john]
sex=male
age=19
location=usa
income=19000
debt=nan
I have a for loop to access each piece of information and assign it to a variable.
from configparser import ConfigParser

config=ConfigParser()
config.read('employees.ini')

for section in config.sections():
    name=section
    sex=config[section]['sex']
    age=config[section]['age']
    location=config[section]['location']
    income=config[section]['income']
    debt=config[section]['debt']
I also have a class so that each section can be turned into an object:
class Users:
    def __init__(self, name, sex, age, location, income, debt):
        self.__name=name
        self.__sex=sex
        self.__age=age
        self.__location=location
        self.__income=income
        self.__debt=debt
    def foo(self):
        pass  # do a thing
    def bar(self):
        pass  # do a different thing ...
My hope was to then be able to access amber.foo and john.bar. However, I'm struggling with how to pass the variables out of the for loop into the class before they are overwritten by the next iteration of the loop. I feel like I might be overthinking this one.
My thought was that this would make the code much more user friendly, as the bulk of the code could be left untouched and only the .ini would need to be updated when a new user is needed.
Thanks for any help you could give.
I would add a class method to parse the configuration file data into a new object.
class User:
    def __init__(self, name, sex, age, location, income, debt):
        self.__name=name
        self.__sex=sex
        self.__age=age
        self.__location=location
        self.__income=income
        self.__debt=debt
    @classmethod
    def from_config(cls, name, config):
        return cls(name, config['sex'], config['age'], config['location'],
                   config['income'], config['debt'])
    def foo(self):
        pass  # do a thing
    def bar(self):
        pass  # do a different thing ...
Now the details of how to actually create an instance of User are abstracted away in the class itself; the code that iterates through the configuration need only pass the relevant data to the class method.
from configparser import ConfigParser
config=ConfigParser()
config.read('employees.ini')
users = [User.from_config(section, config[section]) for section in config.sections()]
Since your class uses the key names of the configuration file as parameter names, you could also just unpack the section dict and call __init__ directly instead of defining a class method.
from configparser import ConfigParser
config=ConfigParser()
config.read('employees.ini')
users = [User(section, **config[section]) for section in config.sections()]
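Since the goal was to reach each user by name (the amber and john sections), one small variation, purely as a sketch, is to build a dict keyed by section name instead of a list:

users = {section: User.from_config(section, config[section])
         for section in config.sections()}

users['amber'].foo()
users['john'].bar()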

Python - Abstract class subclassing

I'm trying to find the best way for users of my Python library to implement an abstract class I wrote.
Namely, my abstract class defines an API to access specific values stored in a database, but I would like to let the user choose how to store them (simple text file, JSON, SQLite, etc.).
My problem is: how should I retrieve the class the user creates and use it in my library?
This is the solution I came up with, but I don't find it very graceful and wonder if there is a more Pythonic way.
In my library:
from abc import ABC, abstractmethod

class Database(ABC):
    @abstractmethod
    def get(self, index):
        pass
    @abstractmethod
    def insert(self, data):
        pass

def get_database():
    """call this anywhere I need a concrete database class"""
    return Database.__subclasses__()[-1]
In the user code
class SqliteDatabase(Database):
    def get(self, index):
        # sqlite SELECT and such
        return data
    def insert(self, data):
        # sqlite INSERT INTO
        # return data with index provided
        return data
Of course, I will return a better error than IndexError if there is no subclass defined, but you get the idea.
Thank you in advance!
I finally settled on something else, as Blorgbeard suggested:
_databases = {}

def register(dbname="default"):
    def wrapper(klass):
        _databases[dbname] = klass
        return klass
    return wrapper

def get_db(name="default"):
    return _databases[name]
And the user only needs to declare:
@register()
class SqliteDatabase:
    def __get__(self, index):
        # retrieve data
        if data is None:
            raise KeyError(index)
        return data
This way, anybody can declare as many databases as they want.
If you have improvements over this version, I'll gladly take them.
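One possible refinement, offered only as a sketch on top of the registry above (the backend names and the JsonDatabase class are made up for illustration), is to register each backend under an explicit name so several implementations can coexist and be selected at runtime:

@register("sqlite")
class SqliteDatabase:
    def get(self, index):
        ...  # retrieve data from sqlite

@register("json")
class JsonDatabase:
    def get(self, index):
        ...  # retrieve data from a json file

db_class = get_db("sqlite")   # the library picks the backend by name
db = db_class()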

How to execute BaseClass method before it gets overridden by DerivedClass method in Python

I am almost sure there is a proper term for what I want to do, but since I'm not familiar with it, I will try to describe the whole idea explicitly. What I have is a collection of classes that all inherit from one base class. The classes consist almost entirely of different methods that are relevant within each class only. However, there are several methods that share a similar name, general functionality and some logic, but their implementations are still mostly different. What I want to know is whether it's possible to create a method in the base class that executes the logic common to all of them and then continues execution in the class-specific method. Hopefully that makes sense, but I will try to give a basic example of what I want.
So consider a base class that looks something like this:
class App(object):
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)
    def access(self):
        LOGIC_SHARED
And an example of a derived class:
class App1(App):
    def __init__(self, testName):
        # ...
        super(App1, self).__init__(testName)
    def access(self):
        LOGIC_SPECIFIC
So what I'd like to achieve is that the LOGIC_SHARED part in the base class access method is executed when calling the access method of any App subclass, before the LOGIC_SPECIFIC part, which is (as the name says) specific to each derived class's access method.
If that makes any difference, the LOGIC_SHARED mostly consists of logging and maintenance tasks.
Hope that is clear enough and the idea makes sense.
NOTE 1:
There are class specific parameters which are being used in the LOGIC_SHARED section.
NOTE 2:
It is important to implement that behavior using only Python built-in functions and modules.
NOTE 3:
The LOGIC_SHARED part looks something like that:
try:
    self.localLog.info("Checking the actual link for %s", self.application)
    self.link = self.checkLink(self.application)
    self.localLog.info("Actual link found!: %s", self.link)
except:
    self.localLog.info("No links found. Going to use the default link: %s", self.link)
So, there are plenty of specific class instance attributes that I use and I'm not sure how to use these attributes from the base class.
Sure, just put the specific logic in its own "private" method, which can be overridden by the derived classes, and leave access in the base.
class Base(object):
    def access(self):
        # Shared logic 1
        self._specific_logic()
        # Shared logic 2

    def _specific_logic(self):
        # Nothing special to do in the base class
        pass
        # Or you could even raise an exception instead:
        # raise Exception('Called access on Base class instance')

class DerivedA(Base):
    # overrides Base implementation
    def _specific_logic(self):
        pass  # DerivedA specific logic

class DerivedB(Base):
    # overrides Base implementation
    def _specific_logic(self):
        pass  # DerivedB specific logic

def test():
    x = Base()
    x.access()  # Shared logic 1
                # Shared logic 2

    a = DerivedA()
    a.access()  # Shared logic 1
                # Derived A specific logic
                # Shared logic 2

    b = DerivedB()
    b.access()  # Shared logic 1
                # Derived B specific logic
                # Shared logic 2
The easiest way to do what you want is to simply call the parent class's access method inside the child's access method.
class App(object):
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)
    def access(self):
        LOGIC_SHARED

class App1(App):
    def __init__(self, testName):
        super(App1, self).__init__(testName)
    def access(self):
        App.access(self)
        # or use super:
        # super(App1, self).access()
However, your shared functionality is mostly logging and maintenance. Unless there is a pressing reason to put this inside the parent class, you may want to consider refactoring the shared functionality into a decorator function. This is particularly useful if you want to reuse similar logging and maintenance functionality for a range of methods inside your class.
You can read more about function decorators here: http://www.artima.com/weblogs/viewpost.jsp?thread=240808, or here on Stack Overflow: How to make a chain of function decorators?.
def logged(method):
    def decorated_method(self, *args, **kwargs):
        LOGIC_SHARED
        method(self, *args, **kwargs)
    return decorated_method
Remember that in Python, functions are first-class objects. That means you can take a function and pass it as a parameter to another function. A decorator function makes use of this: it takes another function as a parameter (here called method) and then creates a new function (here called decorated_method) that takes the place of the original function.
Your App1 class then would look like this:
class App1(App):
    @logged
    def access(self):
        LOGIC_SPECIFIC
This really is shorthand for this:
class App1(App):
    def access(self):
        LOGIC_SPECIFIC
    access = logged(access)
I would find this more elegant than adding methods to the superclass to capture shared functionality.
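As a concrete illustration of the decorator approach, here is a minimal, self-contained sketch; the logging setup and class names are placeholders rather than the OP's real App classes, and functools.wraps is an addition that preserves the wrapped method's metadata:

import functools
import logging

logging.basicConfig(level=logging.INFO)

def logged(method):
    @functools.wraps(method)              # keep the wrapped method's name/docstring
    def decorated_method(self, *args, **kwargs):
        self.localLog.info("shared pre-work before %s", method.__name__)
        return method(self, *args, **kwargs)
    return decorated_method

class App(object):
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)

class App1(App):
    @logged
    def access(self):
        self.localLog.info("App1-specific access logic")

App1("demo").access()   # logs the shared line, then the specific line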
If I understand this comment correctly (How to execute BaseClass method before it gets overridden by DerivedClass method in Python), you want additional arguments passed to the parent class's method to be usable in the derived class.
Based on Jonathon Reinhart's answer, here is how you could do it:
class Base(object):
    def access(self,
               param1, param2,  # first common parameters
               *args,           # second positional parameters
               **kwargs         # third keyword arguments
               ):
        # Shared logic 1
        self._specific_logic(param1, param2, *args, **kwargs)
        # Shared logic 2

    def _specific_logic(self, param1, param2, *args, **kwargs):
        # Nothing special to do in the base class
        pass
        # Or you could even raise an exception instead:
        # raise Exception('Called access on Base class instance')

class DerivedA(Base):
    # overrides Base implementation
    def _specific_logic(self, param1, param2, param3):
        pass  # DerivedA specific logic

class DerivedB(Base):
    # overrides Base implementation
    def _specific_logic(self, param1, param2, param4):
        pass  # DerivedB specific logic

def test():
    x = Base()

    a = DerivedA()
    a.access("param1", "param2", "param3")         # Shared logic 1
                                                   # Derived A specific logic
                                                   # Shared logic 2

    b = DerivedB()
    b.access("param1", "param2", param4="param4")  # Shared logic 1
                                                   # Derived B specific logic
                                                   # Shared logic 2
I personally prefer Jonathon Reinhart's answer, but seeing as you seem to want more options, here are two more. I would probably never use the metaclass one, as cool as it is, but I might consider the second one with decorators.
With Metaclasses
This method uses a metaclass for the base class that will force the base class's access method to be called first, without having a separate private function, and without having to explicitly call super or anything like that. End result: no extra work/code goes into inheriting classes.
Plus, it works like maaaagiiiiic </spongebob>
Below is the code that will do this. At http://dbgr.cc/W you can step through the code live and see how it works:
#!/usr/bin/env python

class ForceBaseClassFirst(type):
    def __new__(cls, name, bases, attrs):
        print("Creating class '%s'" % name)

        def wrap_function(fn_name, base_fn, other_fn):
            def new_fn(*args, **kwargs):
                print("calling base '%s' function" % fn_name)
                base_fn(*args, **kwargs)
                print("calling other '%s' function" % fn_name)
                other_fn(*args, **kwargs)
            new_fn.__name__ = "wrapped_%s" % fn_name
            return new_fn

        if name != "BaseClass":
            print("setting attrs['access'] to wrapped function")
            attrs["access"] = wrap_function(
                "access",
                getattr(bases[0], "access", lambda: None),
                attrs.setdefault("access", lambda: None)
            )
        return type.__new__(cls, name, bases, attrs)

class BaseClass(object):
    __metaclass__ = ForceBaseClassFirst
    def access(self):
        print("in BaseClass access function")

class OtherClass(BaseClass):
    def access(self):
        print("in OtherClass access function")

print("OtherClass attributes:")
for k, v in OtherClass.__dict__.iteritems():
    print("%15s: %r" % (k, v))

o = OtherClass()
print("Calling access on OtherClass instance")
print("-------------------------------------")
o.access()
This uses a metaclass to replace OtherClass's access function with a function that wraps a call to BaseClass's access function and a call to OtherClass's access function. See the best explanation of metaclasses here https://stackoverflow.com/a/6581949.
Stepping through the code should really help you understand the order of things.
With Decorators
This functionality could also easily be put into a decorator, as shown below. Again, a steppable/debuggable/runnable version of the code below can be found here http://dbgr.cc/0
#!/usr/bin/env python

def superfy(some_func):
    def wrapped(self, *args, **kwargs):
        # NOTE might need to be changed when dealing with
        # multiple inheritance
        base_fn = getattr(self.__class__.__bases__[0], some_func.__name__,
                          lambda *args, **kwargs: None)
        # bind the parent class' function and call it
        base_fn.__get__(self, self.__class__)(*args, **kwargs)
        # call the child class' function
        some_func(self, *args, **kwargs)
    wrapped.__name__ = "superfy(%s)" % some_func.__name__
    return wrapped

class BaseClass(object):
    def access(self):
        print("in BaseClass access function")

class OtherClass(BaseClass):
    @superfy
    def access(self):
        print("in OtherClass access function")

print("OtherClass attributes")
print("----------------------")
for k, v in OtherClass.__dict__.iteritems():
    print("%15s: %r" % (k, v))
print("")

o = OtherClass()
print("Calling access on OtherClass instance")
print("-------------------------------------")
o.access()
The decorator above retrieves the BaseClass' function of the same name, and calls that first before calling the OtherClass' function.
Maybe this simple approach can help.
class App:
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)
        self.application = None
        self.link = None
    def access(self):
        print('There is something BaseClass must do')
        print('The application is ', self.application)
        print('The link is ', self.link)

class App1(App):
    def __init__(self, testName):
        # ...
        super(App1, self).__init__(testName)
    def access(self):
        self.application = 'Application created by App1'
        self.link = 'Link created by App1'
        super(App1, self).access()
        print('There is something App1 must do')

class App2(App):
    def __init__(self, testName):
        # ...
        super(App2, self).__init__(testName)
    def access(self):
        self.application = 'Application created by App2'
        self.link = 'Link created by App2'
        super(App2, self).access()
        print('There is something App2 must do')
and the test result:
>>>
>>> app = App('Baseclass')
>>> app.access()
There is something BaseClass must do
The application is None
The link is None
>>> app1 = App1('App1 test')
>>> app1.access()
There is something BaseClass must do
The application is Application created by App1
The link is Link created by App1
There is something App1 must do
>>> app2 = App2('App2 text')
>>> app2.access()
There is something BaseClass must do
The application is Application created by App2
The link is Link created by App2
There is something App2 must do
>>>
By adding a combine function we can combine two functions and execute them one after the other, as below.
def combine(*fun):
    def new(*s):
        for i in fun:
            i(*s)
    return new

class base():
    def x(self, i):
        print 'i', i

class derived(base):
    def x(self, i):
        print 'i*i', i*i
    x = combine(base.x, x)

new_obj = derived()
new_obj.x(3)
Output below:
i 3
i*i 9
It need not be a single-level hierarchy; it can have any number of levels or nested classes.
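For instance, here is a small sketch of my own (not from the original answer, still in Python 2 style) chaining a third level the same way:

class derived2(derived):
    def x(self, i):
        print 'i*i*i', i * i * i
    x = combine(derived.x, x)   # derived.x already chains base.x and derived's x

derived2().x(2)
# i 2
# i*i 4
# i*i*i 8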

Python decorators and class inheritance

I'm trying to use decorators in order to manage the way users may or may not access resources within a web application (running on Google App Engine). Please note that I'm not allowing users to log in with their Google accounts, so setting specific access rights to specific routes within app.yaml is not an option.
I used the following resources :
- Bruce Eckel's guide to decorators
- SO : get-class-in-python-decorator2
- SO : python-decorators-and-inheritance
- SO : get-class-in-python-decorator
However I'm still a bit confused...
Here's my code! In the following example, current_user is a @property method which belongs to the RequestHandler class. It returns a User(db.Model) object stored in the datastore, with a level IntProperty().
class FoobarController(RequestHandler):
    # Access decorator
    def requiredLevel(required_level):
        def wrap(func):
            def f(self, *args):
                if self.current_user.level >= required_level:
                    func(self, *args)
                else:
                    raise Exception('Insufficient level to access this resource')
            return f
        return wrap

    @requiredLevel(100)
    def get(self, someparameters):
        pass  # do stuff here...

    @requiredLevel(200)
    def post(self):
        pass  # do something else here...
However, my application uses different controllers for different kinds of resources. In order to use the @requiredLevel decorator within all subclasses, I need to move it to the parent class (RequestHandler):
class RequestHandler(webapp.RequestHandler):
    # Access decorator
    def requiredLevel(required_level):
        ...  # See code above
My idea is to access the decorator in all controller subclasses using the following code :
class FoobarController(RequestHandler):
    @RequestHandler.requiredLevel(100)
    def get(self):
        pass  # do stuff here...
I think I just reached the limit of my knowledge about decorators and class inheritance :). Any thoughts?
Your original code, with two small tweaks, should also work. A class-based approach seems rather heavy-weight for such a simple decorator:
class RequestHandler(webapp.RequestHandler):
    # The decorator is now a class method.
    @classmethod
    def requiredLevel(klass, required_level):
        # Note the 'klass' argument, similar to 'self' on an instance method
        def wrap(func):
            def f(self, *args):
                if self.current_user.level >= required_level:
                    func(self, *args)
                else:
                    raise Exception('Insufficient level to access this resource')
            return f
        return wrap

class FoobarController(RequestHandler):
    @RequestHandler.requiredLevel(100)
    def get(self, someparameters):
        pass  # do stuff here...

    @RequestHandler.requiredLevel(200)
    def post(self):
        pass  # do something else here...
Alternately, you could use a @staticmethod instead:
class RequestHandler(webapp.RequestHandler):
    # The decorator is now a static method.
    @staticmethod  # No default argument required...
    def requiredLevel(required_level):
The reason the original code didn't work is that requiredLevel was assumed to be an instance method, which isn't going to be available at class-declaration time (when you were decorating the other methods), nor will it be available from the class object (putting the decorator on your RequestHandler base class is an excellent idea, and the resulting decorator call is nicely self-documenting).
You might be interested to read the documentation about @classmethod and @staticmethod.
Also, a little bit of boilerplate I like to put in my decorators:
@staticmethod
def requiredLevel(required_level):
    def wrap(func):
        def f(self, *args):
            if self.current_user.level >= required_level:
                func(self, *args)
            else:
                raise Exception('Insufficient level to access this resource')
        # This will maintain the function name and documentation of the wrapped function.
        # Very helpful when debugging or checking the docs from the python shell:
        f.__doc__ = func.__doc__
        f.__name__ = func.__name__
        return f
    return wrap
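As a side note (my addition, not part of the original answer), the standard library's functools.wraps does this metadata copying for you; the same class-body fragment would look roughly like this:

import functools

@staticmethod
def requiredLevel(required_level):
    def wrap(func):
        @functools.wraps(func)   # copies __name__, __doc__, etc. onto f
        def f(self, *args):
            if self.current_user.level >= required_level:
                return func(self, *args)
            raise Exception('Insufficient level to access this resource')
        return f
    return wrap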
After digging through StackOverflow, and carefully reading Bruce Eckel's guide to decorators, I think I found a possible solution.
It involves implementing the decorator as a class in the parent class:
class RequestHandler(webapp.RequestHandler):
    # Decorator class:
    class requiredLevel(object):
        def __init__(self, required_level):
            self.required_level = required_level
        def __call__(self, f):
            def wrapped_f(*f_args):
                if f_args[0].current_user.level >= self.required_level:
                    return f(*f_args)
                else:
                    raise Exception('User has insufficient level to access this resource')
            return wrapped_f
This does the work! Using f_args[0] seems a bit dirty to me; I'll edit this answer if I find something prettier.
Then you can decorate methods in subclasses the following way:
class FooController(RequestHandler):
    @RequestHandler.requiredLevel(100)
    def get(self, id):
        pass  # Do something here

    @RequestHandler.requiredLevel(250)
    def post(self):
        pass  # Do some stuff here

class BarController(RequestHandler):
    @RequestHandler.requiredLevel(500)
    def get(self, id):
        pass  # Do something here
Feel free to comment or propose an enhancement.
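One small enhancement, since the author invites them (this is my suggestion, not from the original post): give the wrapper an explicit first parameter instead of indexing f_args, which avoids the f_args[0] the author found dirty:

class requiredLevel(object):
    def __init__(self, required_level):
        self.required_level = required_level
    def __call__(self, f):
        def wrapped_f(handler, *args, **kwargs):
            # 'handler' is the controller instance ('self' of the decorated method)
            if handler.current_user.level >= self.required_level:
                return f(handler, *args, **kwargs)
            raise Exception('User has insufficient level to access this resource')
        return wrapped_f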
