Python: Importing Libraries as Variables

class A():
    def __init__(self):
        install_package('abc')
        import abc

    def sq(self):
        print(abc.sqrt(5))
Background
I am writing helper classes (all stored in a single helper.py) for my clients, who use Python to run an application (which I send them). These helper classes help the application function. To allow for quicker deployment on the client side, I wrote a function called install_package that implicitly calls pip install.
All clients receive the same helper.py file but a different application. The application I send them usually uses a subset of the classes in helper.py.
Motive
So the reason I use the above class structure is now pretty obvious: I do not want to load all the libraries at the start of helper.py, as each library has a corresponding install_package() call. For a client whose application does not use all the classes, there is no need to install all the unnecessary libraries.
Issue
Now the issue is that the structure above for A() seems feasible, but the package imported inside __init__ does not get imported globally, so sq() cannot use functions from the abc library. What's the best way to fix this? One way would be to store the library in a variable local to the class. How would I go about doing that? Suggestions to change the class structure are also welcome!

What about something like this:
import importlib

class A():
    def __init__(self):
        install_package('abc')
        self.abc = importlib.import_module('abc')

    def sq(self):
        print(self.abc.sqrt(5))
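If you would rather keep using the bare name abc inside sq(), a variation is to publish the lazily imported module into the module's global namespace. A minimal sketch, keeping the question's hypothetical abc package and install_package helper:

import importlib

class A():
    def __init__(self):
        install_package('abc')  # the question's helper
        # binding through globals() makes the bare name `abc` visible to
        # every function and method defined in this module
        globals()['abc'] = importlib.import_module('abc')

    def sq(self):
        print(abc.sqrt(5))  # resolves at call time, after __init__ has run

Since Python caches imports in sys.modules, creating A() repeatedly pays the import cost only once.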

Related

What is the right way to use service methods in Django?

As an example, let's say I am building a REST API using Django Rest Framework. Now, as part of the application, a few methods are common across all views. My approach is that in the root directory I have created a services.py file. Inside that module is a class (CommonUtils) containing all the common utility methods. In the same services.py module I have instantiated an object of CommonUtils.
Now across the application, in the different views.py files I am importing the object from the module and calling the methods on that object. So, essentially I am using a singleton object for the common utility methods.
I feel like this is not a good design approach. So I want an explanation of why this approach is not a good idea, and what would be the best practice or best approach to achieve the same thing, i.e. use a set of common utility methods across all views.py files.
Thanks in advance.
Is this the right design? Why? How to do better?
I feel like this is not a good design approach. So I want an explanation of why this approach is not a good idea, and what would be the best practice or best approach to achieve the same thing, i.e. use a set of common utility methods across all views.py files.
Like @Dmitry Belaventsev wrote above, there is no general rule to solve this problem. This is a typical case of cross-cutting concerns.
Now across the application, in the different views.py files I am importing the object from the module and calling the methods on that object. So, essentially I am using a singleton object for the common utility methods.
Yes, your implementation is actually a singleton, and there is nothing wrong with it. You should ask yourself what you want to achieve or what you really need. There are a lot of solutions, and you can start with the most basic one:
A simple function in a Python module
# file is named utils.py and lives in the root directory
def helper_function_one(param):
    return transcendent_all_evil_of(param)

def helper_function_two(prename, lastname):
    return 'Hello {} {}'.format(prename, lastname)
In Python it is not uncommon to use just plain functions in a module. You can upgrade them to methods (and a class) if that is really necessary and you need the advantages of classes and objects.
You also can use a class with static methods:
# utils.py
class Utils():
    @staticmethod
    def helper_one():
        print('do something')
But as you can see, this is nothing different from the solution with plain functions besides the extra layer of the class; it adds no further value.
You could also write a singleton class, but in my opinion this is not very Pythonic, because you get the same result with a simple object instance in a module.
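For completeness, this is what the module-level-instance variant looks like. It behaves as a singleton because Python caches the module in sys.modules after the first import; a minimal sketch, with hypothetical names:

# services.py
class CommonUtils:
    def greeting(self, prename, lastname):
        return 'Hello {} {}'.format(prename, lastname)

# every `from services import common_utils` gets this same instance
common_utils = CommonUtils()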

Class names and imports as variables in Python

I want a separate Python script where I can define the default .py files that have to be created at the start of a project, depending on which models I want. So when I start a new project, I don't have to copy the code from a different project and adjust class names, etc. So, for instance, I want to automatically create a model_1.py as:
class Model1(object):
    pass  # code
and a model_2.py as:
class Model2(object):
    pass  # code
I want these to be created from another file, where I define which models have to be created. So for instance:
models = ['Model1', 'Model2']
Is it possible to have the class name as a variable? So something like:
class models[0]()
Moreover, is something similar possible for the import part? So
from model_type_x.test import *
where model_type_x is a variable?
What other possibilities are there? Let Python create a text file and turn this into a .py file?
You need the module named cookiecutter. You can keep templates for your projects and have them configured with a prompt that creates a new project from your answers.
First of all, Python files are simply text files; you just have to save them with a .py extension.
What you're trying to achieve is more or less out of the scope of Python. Python by itself doesn't generate code. If you want to generate code, you can use templates in any language that you like; it doesn't really matter much, since the template text itself isn't going to get executed.
Class names and import names cannot be variables; they are syntactic sugar that lets you define types or import code.
If you want to import using a variable name, you can import modules as such:
__import__(module_name)
Where module_name is a variable, so you can import modules at runtime if you can guess what they are called or how they are going to be imported... Even though it's possible to do that, I do not recommend this method, as it's pretty ugly and pretty much useless, since we usually know beforehand what we're importing. You can always use "*", but that's not a particularly good idea either, because some things inside a module won't get exported, and it's usually better to state explicitly what you're importing.
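If you do end up importing by a name held in a variable, importlib.import_module is usually preferable to __import__, especially for dotted names. A quick illustration with a standard-library module:

import importlib

name = 'os.path'                     # module name held in a variable
top = __import__(name)               # returns the top-level package: os
mod = importlib.import_module(name)  # returns os.path itself

print(top.__name__)   # os
print(mod.__name__)   # posixpath (or ntpath on Windows)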
class models[0]()
This is clearly not possible; the keyword class is used to define a type. What you can do, on the other hand, is this:
locals()[models[0]] = type(models[0], parent_tuple, attributes)
But accessing locals() to define a local variable, and using the type constructor to define a new type manually instead of the class keyword (which makes things much easier to read)... There's just no point in doing it that way.
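That said, dynamic class creation with type() does work when you assign through globals() at module level (where, unlike locals() inside a function, assignment is reliable). A minimal sketch using the question's models list:

models = ['Model1', 'Model2']

for name in models:
    # type(name, bases, attrs) builds a class at runtime -- the
    # programmatic equivalent of `class Model1(object): pass`
    globals()[name] = type(name, (object,), {})

instance = Model1()             # the generated classes are now usable
print(type(instance).__name__)  # -> Model1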
The real question here is... What are you trying to achieve? Chances are that you're not looking for the right solution to a problem you don't have.

BaseClass inheritance from a separate file

I am automating an environment that consists of multiple independent applications. At some point I decided that it would make the most sense to define each application as a class and save each one in a separate file.
What I have at the moment:
A directory with *.py files, where each file defines a single class for a single application. So, for example, I have an application App1. It is saved in the file App1.py and looks something like this:
class App1:
    def __init__(self):
        self.link = config.App1_link
        self.retry = config.App1_retry

    def access(self):
        pass

    def logOut(self):
        pass
So each class defines all the operations App1 can perform, like starting/logging in to the application, performing an operation, logging out from the application, etc.
Then, in a separate file, I create application objects and call the objects' methods one by one to build a multiple-step scenario.
What I want to achieve
It all works fine, but there is room for improvement. Many of the class methods are more or less similar (for example, methods like login/logout to/from the application) and clearly could be defined in some kind of base class I can then inherit from. However, since I have all the applications defined in separate files (which seems the logical solution at the moment), I don't know where to define such a base class and how to inherit from it. I have two options in mind, but I'm not sure which one (if any) is the proper way to do it.
Define a base class in a separate file and then import it into each one of the application classes. Not sure if that even makes sense.
Create one very long file with all the application classes and include the base class in the same file. Note: such a file would be really huge and hard to maintain, since each application has its own methods and some of them are fairly big by themselves.
So are these the only options I have and if they are, which one of them makes more sense?
This is exactly what importing is for. Define your base class in a separate module, and import it wherever you need it. Option 1 makes excellent sense.
Note that Python's module naming convention is to use lowercase names; that way you can distinguish between the class and the module much easier. I'd rename App1.py to app1.py.
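A minimal sketch of option 1, with hypothetical file and method names:

# base_app.py -- shared behaviour lives here
class BaseApp:
    link = None  # each subclass sets its own link

    def access(self):
        print('logging in to {}'.format(self.link))

    def logOut(self):
        print('logging out of {}'.format(self.link))

# app1.py -- only the App1-specific parts remain
import config  # assumed to exist, as in the question's snippet
from base_app import BaseApp

class App1(BaseApp):
    def __init__(self):
        self.link = config.App1_link
        self.retry = config.App1_retry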

Proper way of setting classes and constants in python package

I'm writing a small package for internal use and have come to a design problem. I define a few classes and constants (e.g., a server IP address) in some file, let's call it mathfunc.py. Now, some of these classes and constants will be used in other files in the same package. My current setup is like this:
/mypackage
    __init__.py
    mathfunc.py
    datefunc.py
So, at the moment I think I have to import mathfunc.py in datefunc.py to use the classes defined there (or alternatively import both of them all the time). This seems wrong to me, because then I'll be in a lot of pain importing lots of files everywhere. Is this a proper design at all, or is there some other way? Maybe I can put all definitions in some file which will not be a subpackage on its own, but will be used by all other files?
Nope, that's pretty much how Python works. If you want to use objects declared in another file, you have to import from it.
Tips:
You can keep your namespace clean by only importing the things you need, rather than using from foo import *.
If you really need to do a "circular import" (where A needs things in B, and B needs things in A) you can solve that by only importing inside the functions where you need the object, not at the top of a file.
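A minimal sketch of that second tip, with hypothetical names. The deferred import runs only when the function is called, after both modules have finished loading, so the cycle never triggers an ImportError:

# datefunc.py
from mypackage import mathfunc  # datefunc needs mathfunc's constants

def parse_date(raw):
    return '{} (reported by {})'.format(raw, mathfunc.SERVER_IP)

# mathfunc.py
SERVER_IP = '192.0.2.1'  # hypothetical constant

def describe_date(raw):
    # imported inside the function to break the mathfunc <-> datefunc cycle
    from mypackage.datefunc import parse_date
    return parse_date(raw)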

Calling a class from an imported module

Probably a stupid question, and I've read many of the similar threads on here, but I still can't fathom the answer:
in main.py
import webapp2
from userMod import *

class Handler(webapp2.RequestHandler):
    def write(self):
        pass  # some code here etc
in userMod.py
class signup(Handler):
    def get(self):
        pass  # some code here etc
I get an error saying that Handler is not defined. My simple yet clearly stupid question: how can I access classes from a parent script within a loaded module? Or do I simply need to duplicate Handler in each module I create?
Bear in mind I'm very new to Python and trying to make my code more modular by splitting out certain types of functions (in this case, the user login and registration component of the site I'm building).
In userMod.py you need to import main, not the other way around.
Can't say for certain without seeing your code, but in that very basic snippet you basically have the imports reversed. @IgnacioVazquez-Abrams provided a link that describes the whole process in general (and definitely better than I can), but at a basic level, each module exists in its own namespace and isn't aware of other modules unless you tell it about them.
So in your case, when you subclass Handler, the module has no idea what Handler is, because it (1) isn't a built-in, and (2) hasn't been imported. Try this in userMod.py:
import main

class signup(main.Handler):
    def get(self):
        pass  # some code here etc
and see if it does what you want.
