I am working on a Django application that uses the SimpleGmail package to fetch emails from a Gmail inbox, and I need to persist them. Normally I'd have written a model for the class, but given that it's an external class, I can't figure out how to cleanly persist it.
I have come across solutions such as making it an attribute of a new model that is persisted or multiple inheritance of the desired class, but none of these seem correct to me.
How do I properly register an external class as a model for Django's persistence?
This is too long for a comment, so I will try to write it as an answer.
One way is to create a model class with fields that map to the properties of the external class, and copy the values across when persisting.
Another way would be to simply import the external class into your application and create an instance of it where needed. I'm sorry, I'm not a Python guy, so I won't provide a code implementation.
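For the first approach, a minimal sketch could look like the following. It assumes SimpleGmail's Message object exposes attributes such as id, sender, recipient, subject, date, and plain; check the package's documentation for the exact names your version provides.

```python
# A minimal sketch of the first approach: a Django model that mirrors the
# fields we care about from SimpleGmail's Message object, plus a classmethod
# that copies the values over. The attribute names used here (id, sender,
# recipient, subject, date, plain) are assumptions -- verify them against
# the simplegmail documentation.
from django.db import models


class PersistedEmail(models.Model):
    gmail_id = models.CharField(max_length=128, unique=True)
    sender = models.CharField(max_length=255)
    recipient = models.CharField(max_length=255)
    subject = models.CharField(max_length=998, blank=True)
    date = models.CharField(max_length=64, blank=True)
    body = models.TextField(blank=True)

    @classmethod
    def from_message(cls, message):
        """Create or update a row from a simplegmail Message instance."""
        obj, _created = cls.objects.update_or_create(
            gmail_id=message.id,
            defaults={
                "sender": message.sender,
                "recipient": message.recipient,
                "subject": message.subject or "",
                "date": message.date or "",
                "body": message.plain or "",
            },
        )
        return obj
```

With something like this, the fetch loop just calls PersistedEmail.from_message(msg) for each message it receives, and the external class never needs to be a model itself.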
I'm trying to find a way to build a robust report library that I can use with a Django project.
Essentially, what I'm looking for is a way to access a list of functions (reports) that I can allow an end-user to attach to a Django model.
For example, let's say I have an Employee object. I have a module called reports.py with a growing list of possible reports that take an employee object and output a report, usually in JSON form. There might be a report for the number of timecards submitted, the number of supervisions created, etc.
I want to be able to link those changing report lists to the Employee object via an FK (called job description), so admins can create custom reports per job description.
What I've tried:
Direct model methods: good for some things, but it requires a programmer to call them in a template or via API to generate some output. Since the available reports are changing, I don't want to hard-code anything and would rather allow the end-user to choose from a list of available reports and attach them to a related model (say a JobDescription).
dir(reports): I could offer up a form where the select values are the results from dir(reports), but then I'd get the names of variables/libraries called in the file, not just a list of available reports
Am I missing something? Is there a way to create a custom class from which I can call all methods available? Where would I even start with that type of architecture?
I really appreciate any sort of input regarding the path to take. Even just a 'look in this direction' response would be appreciated.
What I would do is expand on your dir(reports) idea and create a dynamically loaded module system. Have a folder with .py files containing module classes. Here's an example of how you can dynamically load classes in Python.
Each class would have a static function called getReportName() so you can show something readable to the user, and a member function createReport(self, myModel) which takes the model and does its magic on it.
Then just show all the possible reports to the user; the user selects one, and you run createReport on the selected class.
In the future you might think about having different report folders for different models, and this too should be possible via reflection using the model's __name__ attribute.
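A rough sketch of that loader might look like this. It assumes a "reports" package whose modules each define one or more classes following the getReportName()/createReport() convention described above; the package name and method names are just that convention, not anything Django provides.

```python
# Dynamically discover report classes inside a "reports" package. Any class
# that defines getReportName() and createReport() is treated as a report.
import importlib
import inspect
import pkgutil

import reports  # the package holding the report modules


def discover_reports():
    """Return a dict mapping a human-readable name to each report class."""
    found = {}
    for module_info in pkgutil.iter_modules(reports.__path__):
        module = importlib.import_module(f"reports.{module_info.name}")
        for _name, cls in inspect.getmembers(module, inspect.isclass):
            # Only pick up classes that follow the report convention.
            if hasattr(cls, "getReportName") and hasattr(cls, "createReport"):
                found[cls.getReportName()] = cls
    return found


# Usage: show discover_reports().keys() to the user as the select choices,
# store the chosen name on the related model (e.g. JobDescription), then:
# report_cls = discover_reports()[chosen_name]
# output = report_cls().createReport(employee)
```

Storing only the report name on the JobDescription keeps the database decoupled from the code, so adding a new report is just a matter of dropping a new class into the package.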
I am trying to do some pre-processing at Django startup (I put a startup script that runs once in urls.py) and then use the created instance of an object in my views. How would I go about doing that?
Try to use the singleton design pattern.
You can use a Context Processor to add it to your template context.
If you want it in the View, rather than the Template, then you can either have a base View class that has this, or just import the reference into the module your view is in (and access it directly).
Be aware that each django thread may have a different copy of the object in memory, so this should really only be used for read-only access. If you make changes to it, you are likely to find yourself in a world of hurt.
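A minimal sketch of the module-level approach, assuming the expensive work is done by a hypothetical build_expensive_object() helper and the app is called myapp:

```python
# myapp/startup.py -- a module-level "singleton". The module body runs once
# per process on first import, so every view in that process shares the same
# instance. build_expensive_object() is a placeholder for whatever
# pre-processing you do at startup.

def build_expensive_object():
    # ... expensive pre-processing goes here ...
    return {"ready": True}


shared_instance = build_expensive_object()


# myapp/context_processors.py -- expose it to templates if you want it there.
# Add "myapp.context_processors.shared" to the context_processors list in
# settings.TEMPLATES for this to take effect.
def shared(request):
    from myapp.startup import shared_instance
    return {"shared_instance": shared_instance}


# myapp/views.py -- or just import the reference directly in a view:
# from myapp.startup import shared_instance
```

As noted above, treat the shared object as read-only: each worker process gets its own copy, so mutating it will not propagate across processes.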
I have models in different files (blog/models.py, forum/models.py, article/models.py). In each of these files I have defined model classes with an application prefix (BlogPost, BlogTag, ForumPost, ForumThread, Article, ArticleCategory).
I also have a comment application for attaching comments to any model object. For example, I want to comment on a BlogPost, or add a comment referencing a ForumPost. For this I use a ReferenceProperty() without specifying the reference type, so any model can be attached to a comment.
What's the problem? When I show all comments in the administration section, I run into an autoloading problem with ReferenceProperty: I don't know which model type a given comment refers to, so I need to autoload the package containing that model on demand.
Yes, there is a simple solution: import all models from all applications. But that is not a good solution; I want to load only the models that are needed. How do I do this autoloading?
My idea is to detect the kind of the referenced entity and, from the first part of that name, work out which application's models to load. For example, if I have a comment with a reference to the BlogPost model, I extract the application name (blog) and do from blog.models import *.
To implement this I need to understand how to intercept the creation of property instances. When I loop over comments, App Engine automatically creates instances for the properties (thanks, but not in my case).
How do I inject my logic for loading the models before the property instance is created?
Thank you!
This isn't possible in the standard db framework, as there's not enough information present to find your models. The only information the framework has to work with is the kind name, which doesn't include the fully qualified package - so it has no way to figure out what package your model definition might be in.
If you're writing an admin interface, though, you probably want to use the low-level google.appengine.api.datastore interface, instead, which operates on dicts instead of model classes, and doesn't require a model definition.
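A small sketch of what that low-level interface looks like for an admin listing. The kind name 'Comment' and the property names ('reference', 'text') are assumptions; adjust them to match your schema.

```python
# The low-level datastore interface returns entities as dict-like objects,
# so no model classes ever need to be imported to list the comments.
from google.appengine.api import datastore


def list_comments(limit=100):
    query = datastore.Query('Comment')
    for entity in query.Get(limit):
        # entity behaves like a dict of property name -> value; the
        # reference shows up as a datastore Key, whose kind() tells you
        # which model it points at without importing that model.
        ref_key = entity.get('reference')
        yield {
            'comment': entity.get('text'),
            'target_kind': ref_key.kind() if ref_key else None,
        }
```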
I am building an application that will make an API call and save the resulting information, after processing it, in an APIRecord(models.Model) class.
1) Should I build a separate class in such a way that the class does the API call, processes the information (including checking against business rules) and then creates an instance of my APIRecord() class?
Or
2) Should I build a separate class with the appropriate methods for processing, and calling the API, and then in my model, override the APIRecord.save() method to call the separate class's API methods and then save the results?
Or
3) Should I build my model class with the appropriate methods for calling the API and processing the response (including checking for certain values and other business rules)?
I tried #2 and ran into problems with flexibility (but I'm still open to suggestions). I'm leaning towards #1, but I'm not sure of all the negatives yet.
It's a design decision; it depends on your design and your programming preferences.
I've used a combination of the three approaches you describe: if I need information that can be derived from other fields, I create a method on the model class; if I need other database records to do something, I create a function outside the model class; and other, more unusual needs get computed wherever I happen to need them.
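If you lean towards option 1, a minimal sketch of a separate class that calls the API, applies the business rules, and only then creates the APIRecord could look like this. The endpoint URL, the response fields, and the validation check are all placeholders, and APIRecord is assumed to have external_id and status fields.

```python
# Option 1 sketch: a service class owns the API call and the business rules;
# the model stays a plain data container. All names here are illustrative.
import requests

from myapp.models import APIRecord


class APIRecordFetcher:
    ENDPOINT = "https://api.example.com/records/"  # placeholder URL

    def fetch_and_save(self, record_id):
        response = requests.get(f"{self.ENDPOINT}{record_id}", timeout=10)
        response.raise_for_status()
        data = response.json()

        cleaned = self._apply_business_rules(data)
        return APIRecord.objects.create(**cleaned)

    def _apply_business_rules(self, data):
        # Placeholder rule -- reject responses without a status field.
        if "status" not in data:
            raise ValueError("API response missing required 'status' field")
        return {"external_id": data["id"], "status": data["status"]}
```

Keeping the API logic out of save() avoids surprising side effects whenever the model is saved for unrelated reasons, which is one of the flexibility problems you hit with #2.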
I'm having problems structuring classes in the Model part of an MVC pattern in my Python app. No matter how I turn things, I keep running into circular imports. Here's what I have:
Model/__init__.py
should hold all Model class names so I can do a "from Model import User", e.g. from a Controller or a unit test case
Model/Database.py
holds Database class
needs to import all Model classes to do ORM
initialization should be performed on first module import, i.e. no extra init calls or instantiations (all methods on the Database class are @classmethods)
Model/User.py
contains User model class
needs access to Database class to do queries
should inherit from a base class common to all Model classes to share functionality (database persistence methods, parameter validation code, etc.)
I have yet to see a real world Python app employing MVC, so my approach is probably un-Pythonic (and possibly a language-agnostic mess on top of that...) - any suggestions on how to solve this?
Thanks, Simon
There is an inconsistency in your specification. You say Database.py needs to import all Model classes to do ORM, but then you say the User class needs access to the Database to do queries.
Think of these as layers of an API. The Database class provides an API (maybe object-oriented) to some physical persistence layer (such as DB-API 2.0). The Model classes, like User, use the Database layer to load and save their state. There is no reason for the Database.py class to import all the Model classes, and in fact you wouldn't want that because you'd have to modify Database.py each time you created a new Model class - which is a code smell.
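One way to sketch that layering, with the Database class exposing generic persistence calls and the model classes building on top of it. The table names, SQL, and method signatures here are purely illustrative.

```python
# Model/Database.py -- the Database class knows nothing about model classes;
# it only offers generic operations over the physical store (sqlite3 here
# just to keep the sketch self-contained).
import sqlite3


class Database:
    _connection = None

    @classmethod
    def connect(cls, path="app.db"):
        if cls._connection is None:
            cls._connection = sqlite3.connect(path)
        return cls._connection

    @classmethod
    def fetch_one(cls, sql, params=()):
        return cls.connect().execute(sql, params).fetchone()

    @classmethod
    def execute(cls, sql, params=()):
        conn = cls.connect()
        conn.execute(sql, params)
        conn.commit()


# Model/User.py -- the model imports Database, never the other way around,
# so the import graph stays a straight line instead of a circle.
class User:
    def __init__(self, user_id, name):
        self.user_id = user_id
        self.name = name

    @classmethod
    def load(cls, user_id):
        row = Database.fetch_one(
            "SELECT id, name FROM users WHERE id = ?", (user_id,))
        return cls(*row) if row else None

    def save(self):
        Database.execute(
            "INSERT OR REPLACE INTO users (id, name) VALUES (?, ?)",
            (self.user_id, self.name))
```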
Generally, we put it all in one file. This isn't Java or C++.
Start with a single file until you get some more experience with Python. Unless your files are gargantuan, it will work fine.
For example, Django encourages this style, so copy their formula for success. One module for the model. A module for each application; each application imports a common model.
Your Database and superclass stuff can be in your __init__.py file, since it applies to the entire package. That may reduce some of the circularity.
I think you have one issue that should be straightened out. Circular references often result from a failure to achieve separation of concerns. In my opinion, the database and model modules shouldn't know much about each other, and should work against an API instead. In this case the database shouldn't directly reference any specific model classes, but should instead provide the functionality the model classes need. The model, in turn, should get a database reference (injected or requested) that it uses to query and persist itself.
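A small sketch of that injection idea, where the model receives its database reference instead of importing it. The fetch_user/save_user interface shown is made up purely for illustration.

```python
# Injecting the database into the model breaks the circular dependency:
# User never imports Database, it just uses whatever db object it is given.
class User:
    def __init__(self, db, user_id=None, name=""):
        self._db = db          # injected dependency, not an import
        self.user_id = user_id
        self.name = name

    @classmethod
    def load(cls, db, user_id):
        record = db.fetch_user(user_id)   # hypothetical db method
        return cls(db, user_id=record["id"], name=record["name"])

    def save(self):
        self._db.save_user({"id": self.user_id, "name": self.name})
```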