I created a custom module that overrides the message_new method of the mail.thread model so that quotations can be created from incoming emails, setting the values of the required fields ("partner_id" etc.) based on the content of the incoming email. Everything worked correctly: my method was called instead of the original one, as expected.
I'm now trying to move that code into another custom module. I placed the Python file within the custom module folder, added the import to the __init__.py file of the custom module and added "mail" to the depends section of the __openerp__.py file, just like I did before.
But with this new custom module installed the message_new method isn't being overridden and the original method from mail.thread is the one being called. The new module also inherits the sale.order and sale.order.line models, and the changes it makes to those models are being applied, so I have no idea why mail.thread isn't being affected. The only difference between the new custom module and the old one is that the new module inherits and modifies several models at once rather than only mail.thread.
Has anyone run into problems inheriting a model and overriding its methods like this before?
Update:
Based on the answer to this question I guess it's a bug:
How to inherit Mail.Thread AbstractModel and override function from this class in Odoo?
The workaround in that question didn't work exactly as written for me, but removing the message_new_orig declaration and its subsequent call, like so, did:
from openerp.addons.mail.mail_thread import mail_thread

def message_new(self, cr, uid, msg_dict, custom_values=None, context=None):
    # put custom code here
    # ...
    return res_id

# install the override by monkeypatching the mail_thread class
mail_thread.message_new = message_new
I am working on a Django application that uses the SimpleGmail package to fetch mails from a Gmail inbox and I need to persist them. Normally I'd have written a model for the class, but given it's an external class, I can't figure out how to cleanly persist it.
I have come across solutions such as making it an attribute of a new model that is persisted or multiple inheritance of the desired class, but none of these seem correct to me.
How do I properly register an external class as a model for Django's persistence?
This is too long for a comment, so I will try to write an answer.
One way is to create a model class of your own whose fields map to the properties of the external class.
Another way would be to just import the external class in your application and create an instance of it. I am sorry, I am not a Python guy, so I cannot provide a working code implementation.
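That said, a very rough, untested sketch of the first approach might look like this (the attribute names read from the external message object are assumptions about the library, not its documented API):
from django.db import models


class PersistedMail(models.Model):
    # Hypothetical fields mirroring the data we want to keep from the external object
    sender = models.CharField(max_length=255)
    subject = models.CharField(max_length=255)
    body = models.TextField(blank=True)

    @classmethod
    def from_external(cls, msg):
        # The attribute names on msg are assumptions about the external class --
        # adjust them to whatever the library actually exposes
        return cls.objects.create(
            sender=msg.sender,
            subject=msg.subject,
            body=msg.plain or "",
        )
The code that fetches the mails would then call PersistedMail.from_external(msg) for each fetched message.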
Django's Model class defines a __repr__ method that combines the model class name with the string representation of the instance, so that a typical object shows up in the shell or in debugging tools in the following format:
<MyClass: string description of instance>
What I want is for all of my object instances to show up with their ids in their __repr__, e.g.
<MyClass 123: string description of instance>
This is for debugging convenience.
Now, it would be easy enough (in principle) to override my __unicode__ methods (which generate the string description of instances) to include the ids, or for that matter to override my __repr__ methods (or to have all my model classes derive from a base class that does so).
However, I'm working with an existing code base and want to avoid changing all the existing model class definitions for this. The quick-and-dirty way to change things is to edit the source code for __repr__ in Django's Model class. But that creates deployment issues as my project always deploys third-party libraries like Django from pip.
So: how can I get Django to include the id in the repr for all object instances without either changing the Django source code or my project model class definitions?
NOTE: I'm thinking some kind of monkey patch to Model.__repr__ should do the trick, but I'm not sure if that would work and if so where in my Django project to do it.
Monkeypatching Model.__repr__ should work. Something like:
from django.db.models import Model

def debug_repr(self):
    # include the primary key alongside the usual string description
    return "<{} {}: {}>".format(self.__class__.__name__, self.pk, self)

Model.__repr__ = debug_repr
This is based on the implementation in the source code.
Where you do this depends on your debugging setup. If you're working in the console you can just type this in directly. If you want to put it in code but not interfere with other deployments, you should probably create a new settings file for your debugging environment and then trigger the monkeypatch that way.
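For example, after the patch is applied in a shell session, a hypothetical Book model whose string description is its title would then show up like this:
>>> Book.objects.get(pk=123)
<Book 123: The Hitchhiker's Guide to the Galaxy>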
I'm working on a Django application connected to an LDAP server. Here's the trick I'm trying to pull off.
I have a model called System containing some information about computers. When I add a new system, the model generates a unique UUID, like an AutoField. The point is that this parameter is generated on save, and only the first time.
After it is saved, I need a function that takes that UUID and creates a new object in my LDAP.
Since I didn't know much about signals, I tried overriding the model's save function in this way:
def save(self):
    # import needed modules
    import ldap
    import ldap.modlist as modlist

    [--OPERATIONS ON THE LDAP--]

    super(System, self).save()
In this way, if I modify an existing system everything works as it should, because its UUID has already been generated. But if I try adding a new system I get an error because the UUID is None, and I can't work with an empty value in LDAP (it would also be useless, don't you think?).
It seems I need to call the function that works on the LDAP after the system is saved, and therefore after a UUID has been generated. I tried to understand how to create a post_save function but I couldn't get it to work.
How can I do that?
Thanks
As you stated yourself, you do need signals; they will allow your code to stay cleaner and keep the logic of the different parts separate.
The usual approach is to place signals just at the end of your models file:
# Signals
from django.db import models
from django.dispatch import receiver

@receiver(models.signals.post_save, sender=YourModel)
def do_something(sender, instance, created, **kwargs):
    ...
In the above example we connect the post_save signal to the do_something function; this is done through the @receiver decorator, whose sender argument points to your model class.
Inside your function you have instance, which holds the current instance of the model, and the created flag, which lets you determine whether this is a new record or an existing one being updated.
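Applied to the System model from the question, a receiver might look roughly like the sketch below; the uuid field name and the LDAP steps are placeholders rather than working code:
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=System)
def create_ldap_entry(sender, instance, created, **kwargs):
    # only touch the LDAP the first time the system is saved,
    # i.e. once the UUID has actually been generated
    if created:
        import ldap
        import ldap.modlist as modlist
        # ... build the modlist from instance.uuid and add the entry here ...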
Signals would be excellent for something like this, but moving the line super(System, self).save() to the top of the save method might work as well. That means you first save the instance, before passing the saved object to the LDAP.
I'm developing an interactive modeling platform called QREDIS. It is composed of a set of interrelated modules which are imported by a setup script and provide model research functionality to the user.
As part of the setup, I load a class QMod_Template using
exec(open('QMod_Template.py').read())
This class is a base class for user-definable models; i.e. the user can define and save a new model QMod_MyModel(QMod_Template). All this works perfectly. I don't want to force the user to later reload his model with exec(open('QMod_MyModel.py').read()), so I've created a function LoadModel in the module QREDIS_Model which loads a specified model class file (basically a wrapper for the exec code).
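Roughly, LoadModel is just a thin wrapper around the same exec call (simplified sketch, not the exact code):
# inside QREDIS_Model.py -- simplified sketch, not the exact code
def LoadModel(filename):
    # exec() here runs against QREDIS_Model's module globals, so names defined
    # in the interactive session (such as QMod_Template) are not visible to it
    with open(filename) as f:
        exec(f.read())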
When I execute this function, I get the error NameError: name 'QMod_Template' is not defined. To summarize:
First I load a class from a file:
exec(open('QMod_Template.py').read())
Then I import a module
import QREDIS_Model as QM
Then I try to load another model class file
QM.LoadModel('QMod_MyModel.py')
and get the NameError.
Essentially, QREDIS_Model.LoadModel needs to be able to access the already loaded QMod_Template class, but can't. I have tried declaring QMod_Template as global in both the module and function, but no luck.
This should be a simple fix, I think. What am I missing? I hope this extended question makes my issue clearer.
In my application I have a requirement to keep logs of all model changes and deletions.
So I have created a base class Audit and extended all my model classes from it.
I have overridden the save and delete methods in it so that the old values are also kept when an update is made.
I want to know whether there is a better way of doing this than extending all classes from the base class, or whether it is all right as it is.
For this use case, you may be able to write a generic function that could be used with django signals.
https://docs.djangoproject.com/en/dev/topics/signals/
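A rough sketch of that idea, assuming a separate AuditLog model of your own for storing the entries (the model and field names below are made up):
from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver

from myapp.models import AuditLog  # hypothetical model storing the log entries


@receiver(post_save)
def log_save(sender, instance, created, **kwargs):
    # skip the log model itself to avoid recursive logging
    if sender is AuditLog:
        return
    AuditLog.objects.create(
        model_name=sender.__name__,
        object_pk=str(instance.pk),
        action="create" if created else "update",
    )


@receiver(post_delete)
def log_delete(sender, instance, **kwargs):
    if sender is AuditLog:
        return
    AuditLog.objects.create(
        model_name=sender.__name__,
        object_pk=str(instance.pk),
        action="delete",
    )
Connecting the receivers without a sender argument makes them fire for every model, which is what keeps the function generic; the guard against AuditLog itself prevents the log entries from logging themselves.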