I am trying to import my KML file into a model using GeoDjango's LayerMapping functionality. I've run tests and had no issues when doing regular imports. However, I recently added a foreign key to my model. My model is called PlaceMark and it now has an FK to a model called Layer. I would like to either:
override the import and manually set the value of the foreign key field, or
update my KML file to contain a new element that connects the PlaceMark to the layer via either the pk or name field of Layer.
Here is how I am testing from the shell and the relevant error:
>>> from locator import load
>>> load.run()
...
TypeError: ForeignKey mapping must be of dictionary type.
....
Here is my load.py file:
import os
from django.contrib.gis.utils import LayerMapping
from models import PlaceMark
placemark_mapping = {
    'name': 'Name',
    'description': 'Description',
    # This line below is the one that is suspect #
    'layer': 'Layer',
    'geom': 'POINT25D',
}

placemark_kml = os.path.abspath(os.path.join(os.path.dirname(__file__), 'data/claim.kml'))

def run(verbose=True):
    lm = LayerMapping(PlaceMark, placemark_kml, placemark_mapping,
                      transform=False, encoding='iso-8859-1')
    lm.save(strict=True, verbose=verbose)
KML File:
<?xml version="1.0" encoding="Windows-1252"?>
<kml xmlns="http://earth.google.com/kml/2.1">
  <Folder>
    <description><![CDATA[TankSafe_Claims]]></description>
    <Placemark>
      <name><![CDATA[G2184729A]]></name>
      <description><![CDATA[<br><br><br>
        <table border="1" padding="0">
        <tr><td>Policy_Number</td><td>53645645</td></tr>
        <tr><td>Claim_Number</td><td>2342342234</td></tr>
        <tr><td>Policy_Type</td><td>TSP</td></tr>
        <tr><td>Name</td><td>Al's Total</td></tr>
        <tr><td>Street_Address</td><td>555 109th Avenue</td></tr>
        <tr><td>City</td><td>Pullman</td></tr>
        <tr><td>State</td><td>NY</td></tr>
        <tr><td>Zip_Code</td><td>55555</td></tr>
        <tr><td>County</td><td>Allegan</td></tr>
      ]]></description>
      <visibility>1</visibility>
      <open>0</open>
      <Point>
        <extrude>1</extrude>
        <altitudeMode>relativeToGround</altitudeMode>
        <coordinates>-86.092641,42.483953,0</coordinates>
      </Point>
      <!-- ***Should I add the line below?*** -->
      <Layer><name>claims</name></Layer>
    </Placemark>
  </Folder>
</kml>
My goal is to just get all the PlaceMarks imported with references to the relevant layer. Any ideas?
Thanks!
Larry
layer_mapping = {
    'fk': {'nm_field': 'NAME'},  # foreign key field
    'this_field': 'THIS',
    'that_field': 'THAT',
    'geom': 'POLYGON',
}
The error you're receiving, saying the foreign key field should be a dictionary, is basically requesting an additional mapping to the model that the foreign key relates to.
In the above snippet:
'fk' is the foreign key field name from the model the data is being loaded into (let's call it the 'load model')
'nm_field' is the field name from the model the 'load model' has the foreign key relationship to (let's call it the 'primary model')
'NAME' is the field name from the data being loaded into the 'load model' which holds the relationship to the 'primary model'
More explicitly, imagine the 'primary model' is a dataset of lakes with a field called 'nm_field' that holds the lake name as a string.
Now imagine the 'load model' is a dataset of points representing all the buoys on all the lakes, with a field named 'fk' that is a ForeignKey to the 'primary model' for assigning the lake each buoy belongs to.
Finally, the data you're loading into the 'load model' has a string field called 'NAME' containing the pre-populated name of the lake each buoy belongs to. That string name is the relationship tie: it allows the 'load model' to use that name to identify which lake in the 'primary model' it should establish a foreign key with.
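Applied to the question's models, the mapping could look something like the sketch below. This is an assumption rather than a tested configuration: the dict key 'name' refers to the field on the related Layer model, while 'Layer' is a hypothetical OGR field in the data source holding the layer's name (the KML shown above does not expose such a field yet).
placemark_mapping = {
    'name': 'Name',
    'description': 'Description',
    # dict form: {<field on the related Layer model>: <OGR field in the data source>}
    'layer': {'name': 'Layer'},
    'geom': 'POINT25D',
}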
I tricked the LayerMapper into loading the ForeignKey field as a plain data type after creating the tables:
1. Give USCounty an FK "state" to USState and run manage.py syncdb.
2. Replace the "state" FK with "state_id" and the real data type, usually models.IntegerField, and execute the load.run() LayerMapper import.
3. Return the "state" FK to the USCounty model.
4. Use Django normally.
In my case below, the "state" keys are 2-character FIPS codes.
class USCounty(models.Model):
    state = models.ForeignKey(USState)
    ## state_id = models.CharField(max_length=2)
    ...
    geom = models.MultiPolygonField(srid=4326)
    objects = models.GeoManager()
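For step 2, the temporary model and mapping might look roughly like this; it is only a sketch, and the shapefile field name 'STATEFP' is an assumption rather than something from the original post.
# Temporary model used only while loading: the FK is swapped for a plain field.
class USCounty(models.Model):
    state_id = models.CharField(max_length=2)  # temporarily replaces the "state" FK
    geom = models.MultiPolygonField(srid=4326)
    objects = models.GeoManager()

# The mapping then treats the FIPS code as ordinary character data.
uscounty_mapping = {
    'state_id': 'STATEFP',  # hypothetical shapefile field holding the state FIPS code
    'geom': 'MULTIPOLYGON',
}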
I worked around this by manually adding a temporary pre_save callback. You can connect it just for the record creation, then disconnect as soon as LayerMapping has done its work.
See 'My Solution' here - the 'black box' method I refer to is in fact exactly this use case.
The code that works for me:
from django.db.models.signals import pre_save

def pre_save_callback(sender, instance, *args, **kwargs):
    fkey = some_method_that_gets_the_foreign_key()
    instance.type = fkey

# other mappings defined as usual
mapping = {
    'key1': 'KEY1',
    ...,
}

lm = LayerMapping(models.MyModel, PATH_TO_SHAPEFILE, mapping, transform=True)

# temporarily connect pre_save method
pre_save.connect(pre_save_callback, sender=models.MyModel)
try:
    lm.save(strict=True)
except Exception as exc:
    optional_error_handling()
    raise
finally:
    # disconnect pre_save callback
    pre_save.disconnect(pre_save_callback, sender=models.MyModel)
It doesn't look like there is an easy way to hook into LayerMapping for foreign key fields. I solved this by using a for loop and the get_geoms() call. Thanks to http://invisibleroads.com/tutorials/geodjango-googlemaps-build.html
Here is an example of what I did:
import itertools
import os

from django.contrib.gis.gdal import DataSource

import locator
from locator.models import PlaceMark, Layer

placemark_kml = os.path.abspath(os.path.join(os.path.dirname(locator.__file__), 'data/claim.kml'))
datasource = DataSource(placemark_kml)
lyr = datasource[0]
waypointNames = lyr.get_fields('Name')
waypointDescriptions = lyr.get_fields('Description')
waypointGeometries = lyr.get_geoms()
for waypointName, waypointGeometry, waypointDescription in itertools.izip(waypointNames, waypointGeometries, waypointDescriptions):
    placemark = PlaceMark(name=waypointName, description=waypointDescription, geom=waypointGeometry.wkt)
    placemark.layer = Layer.objects.get(pk=8)
    placemark.save()
Not an answer but hopefully a hint.
The error thrown comes from this part of the code, around line 220 of layermapping.py:
elif isinstance(model_field, models.ForeignKey):
    if isinstance(ogr_name, dict):
        # Is every given related model mapping field in the Layer?
        rel_model = model_field.rel.to
        for rel_name, ogr_field in ogr_name.items():
            idx = check_ogr_fld(ogr_field)
            try:
                rel_model._meta.get_field(rel_name)
            except models.fields.FieldDoesNotExist:
                raise LayerMapError('ForeignKey mapping field "%s" not in %s fields.' %
                                    (rel_name, rel_model.__class__.__name__))
        fields_val = rel_model
    else:
        raise TypeError('ForeignKey mapping must be of dictionary type.')
At the beginning of the for loop, it expects a dict: ogr_name.items().
ogr_name is actually defined as the value part of the mapping dict.
That dict is supposed to map the related model's field name (the key) to the OGR field name in the data source (the value).
If anyone understands the origin of that ogr_name dict, it would be of great use.
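Concretely, for the mapping in the question, ogr_name is the plain string 'Layer', which is what sends execution into the else branch and raises the TypeError. A dict value is what the check is asking for, e.g. (the 'Layer' OGR field name is only illustrative):
broken_mapping = {'layer': 'Layer'}             # ogr_name is a str  -> raises the TypeError
working_mapping = {'layer': {'name': 'Layer'}}  # ogr_name is a dict -> passes the isinstance check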
Related
I have two fields in sale.order; I need to get the value from each one, convert it to text, and display it in another field on a custom model.
The two fields in the sale.order model:
amount_total = fields.Monetary(string="Total", store=True, compute='_compute_amounts', tracking=4)
date_order = fields.Datetime()
And this is my code so far:
from odoo import fields, models, api
from odoo.exceptions import ValidationError
import random

readonly_fields_states = {
    state: [('readonly', True)]
    for state in {'sale', 'done', 'cancel'}
}

class SaleOrder(models.Model):
    _inherit = "sale.order"

    test = fields.Many2one(string="Test",
                           comodel_name='sale.order',
                           default=lambda x: random.randint(1, 10),
                           states=readonly_fields_states,
                           )

    @api.constrains('test')
    def check_test_length(self):
        for rec in self:
            if rec.test:
                if len(rec.test) > 50:
                    raise ValidationError('The text in the "test" field must be shorter than 50 characters!')
            else:
                pass
The goal is to take the data from those fields, convert it to text, and display it in the test field whenever the two sale.order fields change. For now I only get names like S00001, S00002, etc.
I have no working solution. I have tried various functions but none of them seems to work, or I am doing something wrong. I realise my case is a bit unclear; that's because I can't wrap my head around it. So ask me whatever is needed if you want to help.
You can use Odoo's related field attribute.
In your custom model, add a relational field to sale.order and then create the related fields.
sale_order_id = fields.Many2one(comodel_name="sale.order")
sale_order_amount_total = fields.Monetary(related="sale_order_id.amount_total")
sale_order_date_order = fields.Datetime(related="sale_order_id.date_order")
Now in your custom model's views you'll be able to use the created fields.
<field name="sale_order_amount_total" />
<field name="sale_order_date_order" />
Note that these fields will be directly related to your Many2one model instance (the sale.order record).
Good practice is to set them to readonly=True.
If you are going to run queries on your custom model using these related fields, you should consider adding store=True as an attribute.
For example
sale_order_amount_total = fields.Monetary(
    related="sale_order_id.amount_total",
    store=True,
    readonly=True,
)
In developing a website for indexing system documentation I've come across a tough nut to crack regarding data "matching"/relations across databases in Django.
A simplified model for my local database:
from django.db import models

class Document(models.Model):
    name = models.CharField(max_length=200)
    system_id = models.IntegerField()
    ...
The imagined model; the system details are stored in a remote database:
from django.db import models

class System(models.Model):
    name = models.CharField(max_length=200)
    system_id = models.IntegerField()
    ...
The idea is that when creating a new Document entry at my website the ID of the related system is to be stored in the local database. When presenting the data I would have to use the stored ID to retrieve the system name among other details from the remote database.
I've looked into foreign keys across databases, but this seems to be very extensive and I'm not sure if I want relations. Rather I visualize a function inside the Document model/class which is able to retrieve the matching data, for example by importing a custom router/function.
How would I go about solving this?
Note that I won't be able to alter anything on the remote database, and it's read-only. I'm not sure if I should create a model for System as well. Both databases use PostgreSQL; however, my impression is that which database is used isn't really relevant to this scenario.
From the Django documentation on multiple databases (manually selecting a database):
# This will run on the 'default' database.
Author.objects.all()
# So will this.
Author.objects.using('default').all()
# This will run on the 'other' database.
Author.objects.using('other').all()
'default' and 'other' are aliases for your databases.
In your case they could be 'default' and 'remote'.
Of course you can replace .all() with anything you want.
Example: System.objects.using('remote').get(id=123456)
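For this to work, a 'remote' alias has to be defined in settings.DATABASES. The following is only a sketch; the engine and database names are placeholders, not values from the question.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'local_db',  # plus USER/PASSWORD/HOST as appropriate
    },
    'remote': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'remote_system_db',  # the read-only database holding System data
    },
}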
You are correct that foreign keys across databases are a problem in Django ORM, and to some extent at the db level too.
You already have the answer basically: "I visualize a function inside the Document model/class which is able to retrieve the matching data"
I'd do it like this:
class RemoteObject(object):
    def __init__(self, remote_model, remote_db, field_name):
        # assumes remote db is defined in Django settings and has an
        # associated Django model definition:
        self.remote_model = remote_model
        self.remote_db = remote_db
        # name of id field on model (real db field):
        self.field_name = field_name
        # we will cache the retrieved remote model on the instance
        # the same way that Django does with foreign key fields:
        self.cache_name = '_{}_cache'.format(field_name)

    def __get__(self, instance, cls):
        try:
            rel_obj = getattr(instance, self.cache_name)
        except AttributeError:
            system_id = getattr(instance, self.field_name)
            remote_qs = self.remote_model.objects.using(self.remote_db)
            try:
                rel_obj = remote_qs.get(id=system_id)
            except self.remote_model.DoesNotExist:
                rel_obj = None
            setattr(instance, self.cache_name, rel_obj)
        if rel_obj is None:
            raise self.remote_model.DoesNotExist
        else:
            return rel_obj

    def __set__(self, instance, value):
        setattr(instance, self.field_name, value.id)
        setattr(instance, self.cache_name, value)


class Document(models.Model):
    name = models.CharField(max_length=200)
    system_id = models.IntegerField()

    system = RemoteObject(System, 'system_db_name', 'system_id')
You may recognise that the RemoteObject class above implements Python's descriptor protocol, see here for more info:
https://docs.python.org/2/howto/descriptor.html
Example usage:
>>> doc = Document.objects.get(pk=1)
>>> print doc.system_id
3
>>> print doc.system.id
3
>>> print doc.system.name
'my system'
>>> other_system = System.objects.using('system_db_name').get(pk=5)
>>> doc.system = other_system
>>> print doc.system_id
5
Going further you could write a custom db router:
https://docs.djangoproject.com/en/dev/topics/db/multi-db/#using-routers
This would let you eliminate the using('system_db_name') calls in the code by routing all reads for System model to the appropriate db.
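A minimal router sketch, assuming the remote alias is 'system_db_name' and the remote model is called System (only reads are routed, since the remote database is read-only):
class SystemRouter(object):
    def db_for_read(self, model, **hints):
        # send reads of the System model to the remote database
        if model._meta.object_name == 'System':
            return 'system_db_name'
        return None  # everything else falls through to 'default'

    def db_for_write(self, model, **hints):
        return None  # no special write routing; the remote db is read-only

# settings.py
# DATABASE_ROUTERS = ['myapp.routers.SystemRouter']  # hypothetical module path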
I'd go for a method get_system(). So:
class Document(models.Model):
    def get_system(self):
        return System.objects.using('remote').get(system_id=self.system_id)
This is the simplest solution. A possible alternative is to use PostgreSQL's foreign data wrapper feature. By using FDW you can abstract the multi-db handling away from Django and do it inside the database; then you can run queries that need the document -> system relation.
Finally, if your use case allows it, just copying the system data periodically to the local db can be a good solution.
I have a table that looks like this:
class fnx_sr_shipping(osv.Model):
    _name = 'fnx.sr.shipping'
    _description = 'shipping & receiving entries'
    _inherits = {
        'res.partner': 'partner_id',
    }
    _order = 'appointment_date desc, appointment_time asc'
    _columns = {
        .
        .
        .
        'partner_id': fields.many2one(
            'res.partner',
            'Partner',
            required=True,
            ondelete='restrict'),
        .
        .
        .
The required=True is required by OpenERP (if it's not there, OE adds it).
When I use the web interface I am able to create a new shipping record and pick existing partners; however, if I try the same thing using XML-RPC (supplying the partner_ids in the shipping record create call) OpenERP tries to create a new record in res.partner using default settings, which of course fails because some required fields have no default (such as the name).
Here's the XML-RPC code I'm using:
import openerplib

OE = PropertyDict()  # allows attribute-style access for keys
OE.conn = openerplib.get_connection(
    hostname='xxx',
    database='yyy',
    login='zzz',
    password='...',
    )
OE.res_partner = conn.get_model('res.partner')
.
.
.
values['partner_id'] = 77  # or whatever it actually is ;)
OE.fnx_shipping.create(values)
I have verified that the ids being passed are correct.
Is this a bug in my code, or in OpenERP?
In browsing through orm.py I found this:
def name_create(self, cr, uid, name, context=None):
    """Creates a new record by calling :meth:`~.create` with only one
    value provided: the name of the new record (``_rec_name`` field).
    The new record will also be initialized with any default values applicable
    to this model, or provided through the context.

    The usual behavior of :meth:`~.create` applies.
    Similarly, this method may raise an exception if the model has multiple
    required fields and some do not have default values.
    """
So I tried supplying the already existing partner_id in the context dict as
OE.fnx_shipping.create(values, context={'default_partner_id':partner_id})
I consider the whole mess a bug in OpenERP, but at least there is a not-horrible workaround.
I'm trying to set up a database ORM with peewee and am not clear on the use of foreign key relationships.
from peewee import *

db = SqliteDatabase('datab.db')

class datab(Model):
    class Meta:
        database = db

class Collection(datab):
    identifier = CharField()
    title = CharField()

class File(datab):
    identifier = ForeignKeyField(Collection, related_name='files')
    name = CharField()
Later, I do an import of "Collections"
for value in collection:
    Collection(**value).save()
Finally, where I am having trouble is adding the Files to the collections
for value in collectionFiles:
    File(**value).save()
Within the value dict there is a key "identifier" whose value should associate with the identifier field on Collection.
However I get an error message:
ValueError: invalid literal for int() with base 10: 'somevalue'
If I change the identifier field on File(datab) to a plain character field (varchar), it saves the data.
I'm realizing I'm doing it wrong. My assumption was that the unique identifier value in each table would apply the foreign key. After reading the documentation, it looks like the foreign key setup is a bit different. Do I need to do something like
Collections.File.files(**values).save() ? In other words, instead of doing a data import, loading the collection object and then adding the file associated fields through peewee?
The values that make up a File record:
{'crc32': '63bee49d',
'format': 'Metadata',
'identifier': u'somevalue',
'md5': '34104ffce9e4084fd3641d0decad910a',
'mtime': '1368328224',
'name': 'lupi.jpg_meta.txt',
'sha1': '1448ed1159a5d770da76067dd1c53e94d5a01535',
'size': '1244'}
I think the naming of your fields might be part of the confusion. Rather than calling the foreign key from File -> Collection "identifier", you might call it "collection" instead.
class File(datab):
    collection = ForeignKeyField(Collection, related_name='files')
    name = CharField()
Peewee prefers that, when setting the value of a Foreign Key, it be a model instance. For example, rather than doing:
File.create(collection='foobar', name='/secret/password')
It is preferable to do something like this:
collection = Collection.get(Collection.identifier == 'foobar')
File.create(collection=collection, name='/secret/password')
As a final note, if the Collection "identifier" is the unique primary key, you can set it up thus:
class Collection(datab):
    identifier = CharField(primary_key=True)
    title = CharField()
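Putting the two suggestions together, the question's import loop could look roughly like the sketch below; it assumes the field has been renamed to collection as above and that each value dict carries the collection's identifier string and the file's name.
for value in collectionFiles:
    # look up the Collection row whose identifier matches the string in the data
    collection = Collection.get(Collection.identifier == value['identifier'])
    File.create(collection=collection, name=value['name'])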
(I'm not familiar with peewee, but if it's like Django then this should work.)
class File has a ForeignKeyField and a CharField, so you can't simply save a pair of strings with File(**value). You need to convert a string to a key first, like this:
for value in collectionFiles:
    identifier = value['identifier']
    name = value['name']
    collection_entity = Collection.objects.filter(identifier=identifier).get()
    File(identifier=collection_entity, name=name).save()
I've been thinking about this for a while now.
I'm creating a chat application; in chat.models a class Room is specified. However, a Room can be related to anything in my project, since it uses a generic relation in its foreign key.
Is there a way to know which model a Room is related to, knowing only the model's name?
Like:
ctype = 'user'
related_to_user = Room.objects.filter(content_type=ctype)
The problem I'm having is, the code below is in a view:
doc = get_object_or_404(Document, id=id)
# get *or create* a chat room attached to this document
room = Room.objects.get_or_create(doc)
What if I don't want to use the Document model, but instead want a model associated with a string, a string that can be anything, without having to write tons of ifs to get a specific model for a specific string? Is there a way to find a model just by its 'name'?
Thanks
http://docs.djangoproject.com/en/dev/ref/contrib/contenttypes/#methods-on-contenttype-instances
user_type = ContentType.objects.get(app_label="auth", model="user")
user_type = ContentType.objects.get(model="user")
# but this can throw an error if you have 2 models with the same name.
Very similar to django's get_model
from django.db.models import get_model
user_model = get_model('auth', 'user')
To use your example exactly:
ctype = ContentType.objects.get(model='user')
related_to_user = Room.objects.filter(content_type=ctype)
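Tying this back to the view in the question, a rough sketch could look like the following; get_object_for_this_type is part of the ContentType API, while the Room.objects.get_or_create(target) call simply mirrors the custom manager usage shown in the question.
from django.contrib.contenttypes.models import ContentType
from chat.models import Room  # the Room model from the question

def room_for(model_name, obj_id):
    # resolve the model from its lowercase name, e.g. 'document'
    ctype = ContentType.objects.get(model=model_name)
    target = ctype.get_object_for_this_type(id=obj_id)
    # get or create the chat room attached to that object, as in the view above
    return Room.objects.get_or_create(target)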