I'm developing an interactive modeling platform called QREDIS. It is composed of a set of interrelated modules, which get imported by a setup script and provide a set of model research functionality to the user.
As part of the setup, I load a class QMod_Template using
exec(open('QMod_Template.py').read())
This class is a base class for user-definable models; i.e. the user can define and save a new model QMod_MyModel(QMod_Template). All of this works perfectly. I don't want to force the user to later reload his model with exec(open('QMod_MyModel.py').read()), so I've created a function LoadModel in a module QREDIS_Model which loads a specified model class file (basically a wrapper around the exec code).
When I execute this function, I get an error NameError: name 'QMod_Template' is not defined. To summarize:
First I load a class from a file:
exec(open('QMod_Template.py').read())
Then I import a module
import QREDIS_Model as QM
Then I try to load another model class file
QM.LoadModel('QMod_MyModel.py')
and get the NameError.
Essentially, QREDIS_Model.LoadModel needs to be able to access the already loaded QMod_Template class, but can't. I have tried declaring QMod_Template as global in both the module and the function, but no luck.
This should be a simple fix I think. What am I missing? I hope this extended question makes my issue much more clear.
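The failure can be reproduced in isolation without any QREDIS files; here is a minimal sketch using a programmatically created module as a stand-in for QREDIS_Model:

```python
import types

# exec() at the top level defines QMod_Template in the *caller's* namespace,
# but a function living in another module resolves names against *that*
# module's globals, where QMod_Template never appears.
main_ns = {}
exec("class QMod_Template:\n    pass\n", main_ns)  # simulates the first load

# Stand-in for the QREDIS_Model module and its LoadModel wrapper
qm = types.ModuleType("QREDIS_Model")
exec(
    "def LoadModel(src):\n"
    "    exec(src)  # resolves names in QREDIS_Model's globals, not the caller's\n",
    qm.__dict__,
)

try:
    qm.LoadModel("class QMod_MyModel(QMod_Template):\n    pass\n")
except NameError as e:
    err = e
    print(err)  # name 'QMod_Template' is not defined
```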
I am working on a Django application that uses the SimpleGmail package to fetch mails from a Gmail inbox and I need to persist them. Normally I'd have written a model for the class, but given it's an external class, I can't figure out how to cleanly persist it.
I have come across solutions such as making it an attribute of a new model that is persisted or multiple inheritance of the desired class, but none of these seem correct to me.
How do I properly register an external class as a model for Django's persistence?
This is too long for a comment, so I will write it as an answer.
One way is to create a model class whose properties can be mapped from the external class's properties.
Another way would be to just import the external class in your application and create an instance of it. I am sorry, I am not a Python guy, so a code implementation will not be provided.
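To illustrate the first approach, here is a sketch of the mapping idea, with a plain stand-in for the SimpleGmail message class (in a real project, PersistedMail would subclass django.db.models.Model and declare CharField/TextField fields instead of plain attributes):

```python
class GmailMessage:
    """Stand-in for the external SimpleGmail message class."""
    def __init__(self, sender, subject, plain):
        self.sender, self.subject, self.plain = sender, subject, plain

class PersistedMail:
    """Would subclass models.Model in Django; fields mirror the external class."""
    def __init__(self, sender, subject, body):
        self.sender, self.subject, self.body = sender, subject, body

    @classmethod
    def from_message(cls, msg):
        # Copy each property of the external object onto the model's fields;
        # the external class itself is never persisted directly.
        return cls(sender=msg.sender, subject=msg.subject, body=msg.plain)

mail = PersistedMail.from_message(GmailMessage("a@b.example", "hi", "hello"))
print(mail.sender, mail.subject)
```

The external object stays a transient value; only the model you own ever touches the database.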
I created a model in a Django application. Let's say I have a model class Car in models.py. This class has a method called refuel() that interacts with the class' properties.
I want to write a Python script that imports models.py, instantiates the class, and calls refuel(), so that I can then debug the class using breakpoints in my IDE.
However, for some reason, I cannot import the model file in another script. When I use this syntax to import the file:
from .models import Car
I get this error:
ModuleNotFoundError: No module named '__main__.models'; '__main__' is not a package
Note that the script I import the model in is located in the same folder as models.py.
What's the way of solving this problem, or, more broadly, what's the best way of debugging my model file?
Thanks.
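For what it's worth, the error is not Django-specific: any file containing a relative import fails when executed directly, because a script run as __main__ has no parent package. A self-contained reproduction (the app package here is hypothetical):

```python
import os
import subprocess
import sys
import tempfile

# Build a throwaway package: app/__init__.py, app/models.py, app/debug.py
with tempfile.TemporaryDirectory() as tmp:
    pkg = os.path.join(tmp, "app")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()
    with open(os.path.join(pkg, "models.py"), "w") as f:
        f.write("class Car:\n    def refuel(self):\n        return 'full'\n")
    # debug.py uses a relative import, mirroring the question
    with open(os.path.join(pkg, "debug.py"), "w") as f:
        f.write("from .models import Car\nprint(Car().refuel())\n")

    # Running the file directly fails: __main__ is not part of a package
    direct = subprocess.run([sys.executable, os.path.join(pkg, "debug.py")],
                            capture_output=True, text=True)
    print("direct run ok:", direct.returncode == 0)

    # Running it as a module of the package works
    as_module = subprocess.run([sys.executable, "-m", "app.debug"],
                               capture_output=True, text=True, cwd=tmp)
    print("module run ok:", as_module.returncode == 0)
```

So one option is to run the debug script with `python -m yourapp.debugscript` from the project root instead of executing the file path directly.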
I initially started a small Python project (Python, Tkinter and PonyORM) that has since grown larger, which is why I decided to split the code (formerly a single file) into several modules (e.g. main, form1, entity, database): main acts as the main controller; form1, as an example, contains a tkinter Frame used as an interface where the user can input data; entity contains the db.Entity mappings; and database holds the pony.Database instance along with its connection details. The problem is that during import I get this error: "pony.orm.core.ERDiagramError: Cannot define entity 'EmpInfo': database mapping has already been generated". Can you point me to any existing code showing how this should be done?
Probably you are importing your modules in the wrong order. Any module that contains entity definitions should be imported before the db.generate_mapping() call.
I think you should call db.generate_mapping() right before entering tk.mainloop() when all imports are already done.
A good approach to avoid this is rather than having your db.generate_mapping() call happening at a module's top-level code, have a function that a module exports that calls db.generate_mapping() after all other modules have been imported.
The pattern I use is to put all of my db.Entity subclasses into a single module named model, and then at the bottom of model.py is:
def setup():
    """ Set up the database """
    db.bind(**database_config, create_db=True)
    db.generate_mapping(create_tables=True)
This function is called by my application's own startup (which is also responsible for setting up database_config). This way the correct import and setup order can be guaranteed.
The db object itself is also owned by this model module; if I need to use it somewhere else I import model and use model.db.
If you want to further separate things out (with different model classes living in different modules) you can have a module that owns db, then your separate model modules, and then a third module that imports db and the models and provides the setup function. For example, your directory structure could look like this:
model/
    __init__.py -- imports all of the model sub-modules and provides a setup function
    db.py -- provides the db object itself and any common entity objects that everyone else needs
    form1.py, form2.py, etc. -- import db and use its database object to define the entities
Then your main app can do something like:
import model
model.setup()
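The ordering constraint itself can be illustrated without Pony: the sketch below uses a minimal stand-in for the Database class (FakeDatabase and its register decorator are hypothetical, standing in for subclassing db.Entity), showing that all entities must be defined before the mapping is generated, exactly once.

```python
class FakeDatabase:
    """Minimal stand-in for pony.orm.Database's define-then-map lifecycle."""
    def __init__(self):
        self.entities = []
        self.mapped = False

    def register(self, cls):
        # Stands in for subclassing db.Entity
        if self.mapped:
            raise RuntimeError("database mapping has already been generated")
        self.entities.append(cls.__name__)
        return cls

    def generate_mapping(self):
        self.mapped = True

db = FakeDatabase()

# "Entity modules" register their classes first...
@db.register
class EmpInfo:
    pass

@db.register
class Department:
    pass

# ...and only after every entity module has been imported does the
# application call generate_mapping(), exactly once.
db.generate_mapping()
print(db.entities)

# Defining another entity afterwards reproduces the error from the question:
try:
    @db.register
    class LateEntity:
        pass
except RuntimeError as e:
    err = e
    print(err)
```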
I created a custom module that overrides the message_new method of the mail.thread model to allow creation of quotations from incoming emails, setting the values of the required fields ("partner_id" etc.) based on the content of the incoming email. Everything worked correctly, with my method being called instead of the original one as expected.
I'm now trying to move that code into another custom module. I placed the Python file within the custom module folder, added the import to the __init__.py file of the custom module, and added "mail" to the depends section of the openerp.py file, as I did before.
But with the new custom module installed, the message_new method isn't being overridden; the original method from mail.thread is called instead. This custom module inherits the sale.order and sale.order.line models, and the changes it makes to those models are being applied, so I have no idea why mail.thread isn't affected, when the only difference between this new custom module and the old one is that the new module inherits and modifies several models at once rather than only the mail.thread model.
Has anyone experienced problems inheriting a model and overriding its methods like this before?
Update:
Based on the answer to this question I guess it's a bug:
How to inherit Mail.Thread AbstractModel and override function from this class in Odoo?
The workaround in that question didn't work exactly for me, but removing the message_new_orig declaration and subsequent call, like so, did:
from openerp.addons.mail.mail_thread import mail_thread

def message_new(self, cr, uid, msg_dict, custom_values=None, context=None):
    # put custom code here
    # ...
    return res_id

# install override
mail_thread.message_new = message_new
I'm having problems structuring classes in the Model part of an MVC pattern in my Python app. No matter how I turn things, I keep running into circular imports. Here's what I have:
Model/__init__.py
should hold all Model class names so I can do a "from Model import User", e.g. from a Controller or a unit test case
Model/Database.py
holds Database class
needs to import all Model classes to do ORM
initialization should be performed on first module import, i.e. no extra init calls or instantiations (all methods on the Database class are @classmethods)
Model/User.py
contains User model class
needs access to Database class to do queries
should inherit from base class common to all Model classes to share functionality (database persistency methods, parameter validation code etc.)
I have yet to see a real world Python app employing MVC, so my approach is probably un-Pythonic (and possibly a language-agnostic mess on top of that...) - any suggestions on how to solve this?
Thanks, Simon
There is an inconsistency in your specification. You say Database.py needs to import all Model classes to do ORM, but then you say the User class needs access to the Database class to do queries.
Think of these as layers of an API. The Database class provides an API (maybe object-oriented) to some physical persistence layer (such as DB-API 2.0). The Model classes, like User, use the Database layer to load and save their state. There is no reason for the Database.py class to import all the Model classes, and in fact you wouldn't want that because you'd have to modify Database.py each time you created a new Model class - which is a code smell.
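A minimal sketch of that layering (all names hypothetical, with a dict standing in for real storage): Database knows nothing about concrete models; models call down into it.

```python
class Database:
    """Generic persistence layer; never imports or names any model class."""
    _store = {}

    @classmethod
    def save(cls, table, key, data):
        cls._store.setdefault(table, {})[key] = data

    @classmethod
    def load(cls, table, key):
        return cls._store[table][key]

class Model:
    """Base class sharing persistence code across all model classes."""
    table = None

    def save(self):
        Database.save(self.table, self.key, vars(self))

class User(Model):
    table = "users"

    def __init__(self, key, name):
        self.key, self.name = key, name

u = User(1, "simon")
u.save()
print(Database.load("users", 1)["name"])
```

Adding a new model class requires no change to Database at all, which removes the circular dependency.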
Generally, we put it all in one file. This isn't Java or C++.
Start with a single file until you get some more experience with Python. Unless your files are gargantuan, it will work fine.
For example, Django encourages this style, so copy their formula for success. One module for the model. A module for each application; each application imports a common model.
Your Database and superclass stuff can be in your __init__.py file, since it applies to the entire package. That may reduce some of the circularity.
I think you have one issue that should be straightened out. Circular references often result from a failure to achieve separation of concerns. In my opinion, the database and model modules shouldn't know much about each other, working against an API instead. In this case, the database shouldn't directly reference any specific model classes, but should instead provide the functionality the model classes need. The model, in turn, should get a database reference (injected or requested) that it uses to query and persist itself.
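A sketch of the injection variant described above (names hypothetical): the model receives its database reference from the outside rather than importing it, so neither module imports the other.

```python
class Database:
    """Knows nothing about models; exposes generic query/persist operations."""
    def __init__(self):
        self.rows = {}

    def query(self, key):
        return self.rows.get(key)

    def persist(self, key, value):
        self.rows[key] = value

class User:
    def __init__(self, db, name):
        self.db = db    # injected reference; User never constructs a Database
        self.name = name

    def save(self):
        self.db.persist(self.name, {"name": self.name})

db = Database()
User(db, "simon").save()
print(db.query("simon"))
```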