Class names and imports as variables in Python

I want a separate Python script where I can define the default .py files that have to be created at the start of a project, depending on which models I want. That way, when I start a new project, I don't have to copy the code from a different project and adjust class names, etc. So for instance, I want to automatically create a model_1.py as:
class Model1(object):
code
and a model_2.py as:
class Model2(object):
code
I want these to be created from another file, where I define which models have to be created. So for instance:
models = ['Model1', 'Model2']
Is it possible to have the class name as a variable? So something like:
class models[0]()
Moreover, is something similar possible for the import part? So
from model_type_x.test import *
where model_type_x is a variable?
What other possibilities are there? Let Python create a text file and turn this into a .py file?

Take a look at the cookiecutter module. It lets you define templates for your projects and have them configured with prompts, so each new project is generated from the template instead of being copied and adjusted by hand.
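Roughly, a cookiecutter template is a directory whose file names and contents contain Jinja2 placeholders, plus a cookiecutter.json listing the variables to prompt for; the template layout and variable names below are made up for the example, and the project can be generated from the command line or through the Python API:

# my_template/cookiecutter.json
#   {"project_name": "myproject", "model_name": "Model1"}
#
# my_template/{{cookiecutter.project_name}}/model_1.py
#   class {{cookiecutter.model_name}}(object):
#       pass

from cookiecutter.main import cookiecutter

# Generate a project from the template; with no_input=True the values in
# extra_context are used instead of interactive prompts.
cookiecutter("my_template/", no_input=True, extra_context={"model_name": "Model1"})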

First of all, Python files are simply text files. You just have to save them with a .py extension.
What you're trying to achieve is more or less out of the scope of Python itself: Python doesn't generate code by itself. If you want to generate code, you can use templates in any language you like. It doesn't really matter much, since the template itself isn't the code that gets executed.
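For instance, here is a minimal sketch of generating the model files from the question with string.Template; the class body is just a placeholder:

from string import Template

# One possible template for a generated model file.
MODEL_TEMPLATE = Template(
    "class $class_name(object):\n"
    "    pass  # code goes here\n"
)

models = ['Model1', 'Model2']

for number, class_name in enumerate(models, start=1):
    # Writes model_1.py, model_2.py, ... with the class name filled in.
    with open("model_%d.py" % number, "w") as f:
        f.write(MODEL_TEMPLATE.substitute(class_name=class_name))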
Class names and import names cannot be variables. They are syntactic sugar that lets you define types or import code.
If you want to import using a variable name, you can import modules like this:
__import__(module_name)
Where module_name is a variable, so you can import modules at runtime if you can work out what they are called or how they will be imported. Even though it's possible, I don't recommend it: it's pretty ugly and usually unnecessary, since we normally know beforehand what we're importing. You can always use "*", but that's not a particularly good idea either, because some things inside a module won't get exported, and it's usually better to state explicitly what you're importing.
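A minimal sketch of importing by a variable name; importlib.import_module is the usual modern spelling of the same idea, and model_type_x here is just the variable from the question:

import importlib

model_type_x = "model_type_x"   # the module name as a string

# Equivalent in spirit to `from model_type_x import test`, done at runtime.
module = importlib.import_module(model_type_x + ".test")

# Names must then be looked up on the module object.
SomeClass = getattr(module, "SomeClass", None)   # hypothetical attribute name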
class models[0]()
This is clearly not possible: the class keyword is used to define a type. What you can do, on the other hand, is this:
locals()[models[0]] = type(models[0], parent_tuple, attributes)
But reaching into locals() to bind a name, and calling the type constructor to build a new type by hand instead of using the class keyword that makes things much easier to read... there's just no point in doing it that way.
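For completeness, a minimal sketch of building the classes from the models list with type(); binding them through globals() rather than locals() is a deliberate choice here, since writes to locals() are not reliable inside a function:

models = ['Model1', 'Model2']

for name in models:
    # type(name, bases, attributes) creates a class object at runtime.
    cls = type(name, (object,), {'__doc__': 'Generated model class.'})
    globals()[name] = cls          # bind it under the requested name

print(Model1, Model2)              # both classes now exist by name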
The real question here is: what are you trying to achieve? Chances are that you're looking at the wrong solution to the problem you actually have.

Related

Splitting python code into different files

I am a beginner in Python, and I am trying to learn by making a simple game. I started by having everything in one big file (let's call it main.py), but it is getting to the point where it has so many classes and functions that I would like to split this code into more manageable components.
I have some experience with LaTeX (although certainly not an expert there either), and in LaTeX there is a command called \input which allows one to write part of the code in a different file. For example, if I have files main.tex and sub.tex which look like:
main.tex:
Some code here.
\input{sub}
Lastly, some other stuff.
and
sub.tex:
Some more code here
then, when I execute main.tex, it will execute:
Some code here.
Some more code here
Lastly, some other stuff.
I wonder, is there a similar thing in Python?
Note 1: From what I have seen, the most commonly suggested way to go about splitting your code is to use modules. I have found this a bit uncomfortable for a few reasons, which I will list below (of course, I understand that I find them uncomfortable because I am inexperienced, not because this is the wrong way to do things).
Reasons why I find modules uncomfortable:
My main.py file imports some other modules, like Pygame, which need to be imported into all the new modules I create. If for some reason I wanted to import a new module into main.py later in the process, I would then need to import it in every other module I create.
My main.py file has some global variables that are used in the different classes; for example, I have a global variable CITY_SIZE that controls the size of all City instances on the screen. Naturally, CITY_SIZE is used in the definition of the class City. If I were to move the class City to a module classes.py, then I need to define CITY_SIZE in classes.py as well, and if I ever wanted to change the value of CITY_SIZE I would need to change it in classes.py too.
Again, suppose that I add a classes.py module where I store all my classes, like City. Then in main.py I need to write classes.City in my code instead of City. I understand this can be overcome by using from classes import City, but then I need to add a line of code every time I add a new class to classes.py.
Note 2: I would very much appreciate any comments about how to use modules comfortably in Python, but please note that because this is not my question I would not be able to accept those as valid answers (but, again, they would be appreciated!).
If you have all of your modules in the same directory, you can simply use:
import <name of submodule without .py>
For example, if a submodule file was named sub.py, you would import it like this:
import sub
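To address the specific concerns from the question, here is a rough sketch of one common layout (the file names constants.py and classes.py are made up for the example): shared settings live in one module, each file repeats the third-party imports it actually uses, and classes are pulled in explicitly where needed.

# constants.py -- shared settings defined once
CITY_SIZE = 20                     # example value

# classes.py -- each module imports what it needs, including pygame if it uses it
import constants

class City(object):
    def __init__(self):
        # read the setting through the module, so a later change to
        # constants.CITY_SIZE is seen here as well
        self.size = constants.CITY_SIZE

# main.py
from classes import City           # or `import classes` and write classes.City

city = City()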

PyCharm procedural __all__ generation and syntax highlighting

I'm using this decorator to manage __all__ in a DRY manner:
import sys

def export(obj):
    mod = sys.modules[obj.__module__]
    if hasattr(mod, '__all__'):
        mod.__all__.append(obj.__name__)
    else:
        mod.__all__ = [obj.__name__]
    return obj
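For context, the decorator is applied like this (the module and function names below are made up):

# some_module.py
@export
def public_helper():
    pass

# __all__ in some_module is now ['public_helper'], so
# `from some_module import *` only pulls in the decorated names.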
For names imported with import * PyCharm issues an unresolved reference error, which is understandable, since it doesn't run the code before analysis. But it is an obvious inconvenience.
How would you solve it (or maybe already solved)?
My assumptions:
Adding an automatic linter plugin or altering PyCharm's existing inspection code would be fine.
Something that actually edits the .py source is viable, but not ideal.
This method is probably not the best one, so suggesting another convenient technique for dealing with exports is fine too.
You may be interested in an alternative approach to managing __all__:
https://pypi.org/project/auto-all/
This provides start_all() and end_all() functions to place in your module around the items you want to make accessible. This approach works with PyCharm's code inspection.
from auto_all import start_all, end_all

# Imports outside the start and end function calls are not included in __all__.
from pathlib import Path

def a_private_function():
    print("This is a private function.")

# Start defining externally accessible objects.
start_all(globals())

def a_public_function():
    print("This is a public function.")

# Stop defining externally accessible objects.
end_all(globals())
I feel this is a reasonable approach to managing __all__, and one that I've used on more complex packages. The source code for the package is small, so it could easily be included directly in your code if you need to avoid external dependencies.
The reason I use this is that I have some modules where lots of items need to be "exported" and I want to keep imported items out of the export list. I have multiple developers working on the code, and it's easy to add new items and forget to include them in __all__, so automating this helps.

Does a Python dynamic import of a module generate a .pyc?

I would like to have dotted-name strings that I can evaluate. Those dotted names could point to functions in new files the project does not know about (to quickly add new functionality to the project without being part of the development team).
Right now, I resolve and compile the dotted names using a library (Pyramid), then I save the resulting function object somewhere to be able to use it later.
I've seen that importlib let us import a module and it works perfectly fine, like so:
importlib.import_module('my_library')
Still, normally when you import a module, a .pyc is generated so that later imports don't take as long (the module doesn't have to be compiled again).
Do imports using importlib create .pyc files?
If not, would adding it to locals() change anything? (Adding it to globals() did not seem to work for me) Like so:
locals()['my_library'] = importlib.import_module('my_library')
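For reference, a minimal sketch of the kind of dotted-name resolution described above, using only importlib and getattr rather than Pyramid; the module and attribute names are hypothetical:

import importlib

dotted_name = "plugins.extra:handler"            # hypothetical "module:attribute" string

module_name, _, attr_name = dotted_name.partition(":")
module = importlib.import_module(module_name)    # cached in sys.modules like any import
handler = getattr(module, attr_name)             # the function object, ready to call later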

Proper way of setting classes and constants in python package

I'm writing a small package for internal use and have come to a design problem. I define a few classes and constants (e.g., a server IP address) in some file, let's call it mathfunc.py. Now, some of these classes and constants will be used in other files in the same package. My current setup is like this:
/mypackage
__init__.py
mathfunc.py
datefunc.py
So, at the moment I think I have to import mathfunc.py in datefunc.py to use the classes defined there (or alternatively import both of them all the time). This seems wrong to me, because then I'll be in a lot of pain importing lots of files everywhere. Is this a proper design at all, or is there some other way? Maybe I can put all definitions in some file which will not be a subpackage on its own, but will be used by all other files?
Nope, that's pretty much how Python works. If you want to use objects declared in another file, you have to import from it.
Tips:
You can keep your namespace clean by only importing the things you need, rather than using from foo import *.
If you really need to do a "circular import" (where A needs things in B, and B needs things in A) you can solve that by only importing inside the functions where you need the object, not at the top of a file.
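A minimal sketch of both tips, using the module names from the question (the constant, class, and function names are made up):

# mathfunc.py
SERVER_IP = "10.0.0.1"             # shared constant (example value)

class Vector(object):              # hypothetical class used elsewhere in the package
    pass

# datefunc.py -- import only what it actually needs from its sibling
from .mathfunc import SERVER_IP, Vector

def ping_server():
    return "pinging %s" % SERVER_IP

# If mathfunc in turn needed something from datefunc (a circular import),
# that import can be moved inside the function that uses it:
# def log_date():
#     from .datefunc import today_string
#     ...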

Python Superglobal?

Is there a superglobal (like in PHP) in Python? I have certain variables I want to use throughout my whole project, in separate files, classes, and functions, and I don't want to have to keep declaring them in each file.
In theory yes, you can start spewing crud into __builtin__:
>>> import __builtin__
>>> __builtin__.rubbish= 3
>>> rubbish
3
But, don't do this; it's horrible evilness that will give your applications programming-cancer.
classes and functions and I don't want to have to keep declaring
Put them in modules and ‘import’ them when you need to use them.
I have certain variables I want to use throughout my whole project
If you must have unqualified values, just put them in a file called something like “mypackage/constants.py” then:
from mypackage.constants import *
If they really are ‘variables’ in that you change them during app execution, you need to start encapsulating them in objects.
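A minimal sketch of that last point, with made-up names: mutable settings live on one shared object instead of loose module-level globals.

# mypackage/settings.py
class Settings(object):
    def __init__(self):
        self.city_size = 20        # defaults; the attribute names are made up
        self.debug = False

settings = Settings()              # the single shared instance

# anywhere else in the project
from mypackage.settings import settings

settings.debug = True              # the change is visible to every other importer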
Create an empty superglobal.py module.
In your files do:
import superglobal
superglobal.whatever = localWhatever
other = superglobal.other
Even if there were, you should not use such a construct, EVER. Consider using the Borg pattern to hold this kind of stuff.
class Config:
    """
    Borg singleton config object
    """
    __we_are_one = {}
    __myvalue = ""

    def __init__(self):
        # implement the Borg pattern (we are one)
        self.__dict__ = self.__we_are_one

    def myvalue(self, value=None):
        if value:
            self.__myvalue = value
        return self.__myvalue

conf = Config()
conf.myvalue("Hello")
conf2 = Config()
print(conf2.myvalue())
Here we use the Borg pattern to create a singleton-like object. No matter where you use it in the code, and no matter which module or class you instantiate Config in, 'myvalue' will be the same.
in years of practice, i've grown quite disappointed with python's import system: it is complicated and difficult to handle correctly. also, i have to maintain scores of imports in each and every module i write, which is a pita.
namespaces are a very good idea, and they're indispensable---php doesn't have proper namespaces, and it's a mess.
conceptually, part of writing an application consists in defining a suitable vocabulary, the words that you'll use to do the things you want to. yet in the classical way, it's exactly these words that won't come easy, as you have to first import this, import that to obtain access.
when namespaces came into focus in the javascript community, john resig of jquery fame decided that providing a single $ variable in the global namespace was the way to go: it would only affect the global namespace minimally, and provide easy access to everything with jquery.
likewise, i experimented with a global variable g, and it worked to an extent. basically, you have two options here: either have a startup module that must be run prior to any other module in your app, which defines what things should be available in g, so it is ready to go when needed. the other approach which i tried was to make g lazy and to react with custom imports when a new name was required; so whenever you need to g.foo.frob(42) for the first time, the mechanism will try something like import foo; g.foo = foo behind the scenes. that was considerably more difficult to do right.
these days, i've ditched the import system almost completely except for standard library and site packages. most of the time i write wrappers for those libraries, as 90% of those have inanely convoluted interfaces anyhow. those wrappers i then publish in the global namespace, using spelling conventions to keep the risk of collisions to a minimum.
i only tell this to alleviate the impression that modifying the global namespace is something that is inherently evil, which the other answers seem to state. not so. what is evil is to do it thoughtlessly, or be compelled by language or package design to do so.
let me add one remark, as i almost certainly will get some fire here: 99% of all imports done by people who religiously defend namespace purity are wrong. proof? you'll read in the beginning lines of any module foo.py that needs to do trigonometry something like from math import sin. now when you correctly import foo and have a look at that namespace, what are you going to find? something named foo.sin. but that sin isn't part of the interface of foo, it is just a helper, it shouldn't clutter that namespace---hence, from math import sin as _sin or somesuch would've been correct. however, almost nobody does it that way.
i'm sure to arouse some heated comments with these views, so go ahead.
The reason it wasn't obvious to you is that Python intentionally doesn't try to support such a thing. Namespaces are a feature, and using them is to your advantage. If you want something you defined in another file, import it. This means from reading your source code you can figure out where everything came from, and also makes your code easier to test and modify.
