I have a post(self) handler and I want to add some logic there to store lat and lng (these are computed from Google Maps) in the datastore, as defined in my db model. Should I add them to data, or should I do it some other way, such as via the original model class? What is the best way to do this?
so...
class Company(db.Model):
    company_type = db.StringProperty(required=True, choices=["PLC", "LTD", "LLC", "Sole Trader", "Other"])
    company_lat = db.StringProperty(required=True)
    company_lng = db.StringProperty(required=True)

class CompanyForm(djangoforms.ModelForm):
    company_description = forms.CharField(widget=forms.Textarea(attrs={'rows': '2', 'cols': '20'}))
    company_address = forms.CharField(widget=forms.Textarea(attrs={'rows': '2', 'cols': '20'}))

    class Meta:
        model = Company
        exclude = ['company_lat', 'company_lng']  # each excluded field is a separate string
def post(self):
    data = CompanyForm(data=self.request.POST)
    map_url = ''
    address = self.request.get("company_postcode")
    ...
    lat = response['results'][0]['geometry']['location']['lat']
    lng = response['results'][0]['geometry']['location']['lng']
    ...
    # How do I add these fields lat and lng to my data store?
    # Should I add them to data? if this is possible?
    # Or shall I do it some other way?
Thanks
The djangoforms help page explains how to add extra data to your datastore entity: call the save method with commit=False. It returns the datastore entity, and you can then set additional fields before saving it with put():
def post(self):
    ...
    # This code is after the code above
    if data.is_valid():
        entity = data.save(commit=False)
        entity.company_lat = lat
        entity.company_lng = lng
        entity.put()
It really depends on the types of queries you intend to do. If you want to perform geospatial queries, GeoModel is built for your use case.
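For reference, here is a rough sketch of what the GeoModel route might look like; the import path and the update_location/proximity_fetch helpers are assumptions about the geomodel library's shape, so check them against its documentation:

from google.appengine.ext import db
from geo.geomodel import GeoModel  # assumed import path for the geomodel library

class Company(GeoModel):
    # GeoModel is assumed to supply a `location` GeoPtProperty plus geocell index fields.
    company_type = db.StringProperty(required=True,
                                     choices=["PLC", "LTD", "LLC", "Sole Trader", "Other"])

# After geocoding the address:
company = Company(company_type="LTD", location=db.GeoPt(lat, lng))
company.update_location()  # assumed helper that recomputes the geocell index
company.put()

# Later, a proximity query (assumed signature):
nearby = Company.proximity_fetch(Company.all(), db.GeoPt(lat, lng), max_results=10)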
I am using Python with Flask as a backend, and I am trying to create a route to a table in my database based on user input. During one of my POST requests a table is created in the database; I then want to create a connection to this table, whose name also depends on user input. I want to emphasize that I am already able to create the tables in the database based on user input; what I need now is the route to these tables.
Is there a way to create such a route?
I understand that doing something like this may lead to a security vulnerability, so I am open to suggestions for different approaches.
I am attaching below my current python code for creating routes:
# Destination Class/model
class Destinations(db.Model):
    ID = db.Column(db.Integer, primary_key=True)
    city = db.Column(db.String(100))
    country = db.Column(db.String(100))

    def __init__(self, city, country):
        self.city = city
        self.country = country

# Destinations Schema
class DestinationsSchema(ma.Schema):
    class Meta:
        fields = ('ID', 'city', 'country')

# Init Destinations Schema
destination_schema = DestinationsSchema()
destinations_schema = DestinationsSchema(many=True)

# Create a Destination
@app.route('/destinations', methods=['POST'])
def add_destination():
    city = request.json['city']
    country = request.json['country']
    new_destination = Destinations(city, country)
    db.session.add(new_destination)
    db.session.commit()
    return destination_schema.jsonify(new_destination)

# Get All Destinations
@app.route('/destinations', methods=['GET'])
def get_destinations():
    all_destinations = Destinations.query.all()
    result = destinations_schema.dump(all_destinations)
    return jsonify(result)
This is my answer to what I understood from your question; I hope it helps you.
You can put a variable in the URL, something like this:
@app.route('/destinations/<tablename>', methods=['GET'])
def get_destination(tablename):
    destination = Destinations.query.filter_by(name=tablename)
    result = destinations_schema.dump(destination)
    return jsonify(result)
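If the goal is really to serve tables that were created at runtime, one possible approach (not from the original answer; a sketch assuming Flask-SQLAlchemy with SQLAlchemy 1.4+) is to reflect the table by name inside the view, after validating the user-supplied name:

from sqlalchemy import MetaData, Table, select
from sqlalchemy.exc import NoSuchTableError

@app.route('/tables/<tablename>', methods=['GET'])
def get_table_rows(tablename):
    # Validate the user-supplied name before touching the database, to avoid
    # exposing arbitrary tables.
    if not tablename.isidentifier():
        return jsonify({'error': 'invalid table name'}), 400
    try:
        # Reflect the runtime-created table's definition from the database.
        table = Table(tablename, MetaData(), autoload_with=db.engine)
    except NoSuchTableError:
        return jsonify({'error': 'unknown table'}), 404
    rows = db.session.execute(select(table)).mappings().all()
    return jsonify([dict(row) for row in rows])

One view then covers every table created by the POST handler, and the name check (or an allowlist) addresses the security concern mentioned in the question.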
We have a product model like:
class Product(Model):
    """
    Base Product Model
    """
    shop_id = columns.UUID(primary_key=True, required=True)
    product_id = columns.UUID(primary_key=True, required=True, default=uuid.uuid4)
    wikimart_id = columns.Integer(index=True)  # Convert to user defined type?
    yandex_id = columns.Integer(index=True)
Periodically (once a day) we update the products from a list.
Currently we have to use constructions like:
if Product.filter(wikimart_id=external_id):
    p = Product.get(shop_id=shop_id, wikimart_id=external_id)
    d['product_id'] = p.product_id  # Setting the key in the dict from which the model will be updated
Is this OK for Cassandra, or should we think about creating models that have the external_id as a primary key for updating products?
Like:
class ProductWikimart(Model):
    """
    Wikimart Product Model
    """
    shop_id = columns.UUID(primary_key=True, required=True)
    wikimart_id = columns.Integer(primary_key=True)
    product_id = columns.UUID(index=True)

class ProductYandex(Model):
    """
    Yandex Product Model
    """
    shop_id = columns.UUID(primary_key=True, required=True)
    yandex_id = columns.Integer(primary_key=True)
    product_id = columns.UUID(index=True)
Which way is preferable?
UPD: This question is about generic modelling for NoSQL, not only about Cassandra :)
Maybe this article would be helpful for you.
I don't think product_id is a good candidate for a clustering key due to its relatively frequent changes. So I think the second version of the product model (with ProductWikimart and ProductYandex) would be better. But then you may run into new problems: for instance, how do you match ProductWikimart and ProductYandex product ids?
Speaking of data modeling for Cassandra in general, there is the "model around your queries" rule. So, to say which table structure would be better, we need to know how it will be queried.
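To illustrate the second design: with ProductWikimart keyed by (shop_id, wikimart_id), the daily update becomes a direct primary-key lookup instead of a secondary-index filter. A rough sketch, assuming cqlengine's get/DoesNotExist behaviour:

try:
    # Direct primary-key read on the lookup table.
    link = ProductWikimart.get(shop_id=shop_id, wikimart_id=external_id)
    d['product_id'] = link.product_id  # update the existing product
except ProductWikimart.DoesNotExist:
    # No mapping yet: a new product, so a fresh product_id will be generated.
    pass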
I am creating a sample application that stores user details along with their class information.
The model classes being used are:
Model class for saving the user's class data:
class MyData(ndb.Model):
    subject = ndb.StringProperty()
    teacher = ndb.StringProperty()
    strength = ndb.IntegerProperty()
    date = ndb.DateTimeProperty()
Model class for the user:
class MyUser(ndb.Model):
    user_name = ndb.StringProperty()
    email_id = ndb.StringProperty()
    my_data = ndb.StructuredProperty(MyData, repeated=True)
I am able to successfully store data in the datastore and can also run simple queries on the MyUser entity using filters on email_id and user_name.
But when I try to query MyUser using a filter on a property inside the structured property my_data, I don't get the correct result.
I think I am querying incorrectly.
Here is my query function, which filters on the repeated structured property:
def queryMyUserWithStructuredPropertyFilter():
    shail_users_query = MyUser.query(ndb.AND(MyUser.email_id == "napolean@gmail.com", MyUser.my_data.strength > 30))
    shail_users_list = shail_users_query.fetch(10)
    maindatalist = []
    for each_user in shail_users_list:
        logging.info('NEW QUERY :: The user details are : %s %s' % (each_user.user_name, each_user.email_id))
        # Class data
        myData = each_user.my_data
        for each_my_data in myData:
            templist = [each_my_data.strength, str(each_my_data.date)]
            maindatalist.append(templist)
            logging.info('NEW QUERY :: The class data is : %s %s %s %s' % (each_my_data.subject, each_my_data.teacher, str(each_my_data.strength), str(each_my_data.date)))
    return maindatalist
I want the fetched entity's repeated structured property (my_data) to contain only the items with strength > 30.
Please help me figure out where I am going wrong.
Thanks.
Queries over structured properties return entities for which at least one of the structured values satisfies the conditions. If you want to filter those values, you'll have to do it afterwards.
Something like this should do the trick:
def queryMyUserWithStructuredPropertyFilter():
    shail_users_query = MyUser.query(MyUser.email_id == "napolean@gmail.com", MyUser.my_data.strength > 30)
    shail_users_list = shail_users_query.fetch(10)
    # Here, shail_users_list has at most 10 users with email being
    # 'napolean@gmail.com' and at least one element in my_data
    # with strength > 30
    maindatalist = [
        [[data.strength, str(data.date)] for data in user.my_data if data.strength > 30]
        for user in shail_users_list
    ]
    # Now maindatalist contains ONLY those my_data items with strength > 30
    return maindatalist
From the App Engine NDB documentation:
The NDB API provides persistent storage in a schemaless object datastore. It supports automatic caching, sophisticated queries, and atomic transactions. NDB is well-suited to storing structured data records.
I want to create a structure like the following using NDB, where each instance looks like:
{
    city: 'SFO',
    date: '2013-01-27',
    data: {
        'keyword1': count1,
        'keyword2': count2,
        'keyword3': count3,
        'keyword4': count4,
        'keyword5': count5,
        ....
    }
}
How can I design such a schema-less entity in Google App Engine(GAE) using NDB?
I am new to GAE and not sure how to achieve this.
Thank you
If you don't need to query on the attributes inside data, you can use one of the properties mentioned by @voscausa:
JsonProperty:
class MyModel(ndb.Model):
    city = ndb.StringProperty()
    date = ndb.DateProperty()
    data = ndb.JsonProperty()

my_model = MyModel(city="somewhere",
                   date=datetime.date.today(),
                   data={'keyword1': 3,
                         'keyword2': 5,
                         'keyword3': 1})
StructuredProperty:
class Data(ndb.Model):
    keyword = ndb.StringProperty()
    count = ndb.IntegerProperty()

class MyModel(ndb.Model):
    city = ndb.StringProperty()
    date = ndb.DateProperty()
    data = ndb.StructuredProperty(Data, repeated=True)

my_model = MyModel(city="somewhere",
                   date=datetime.date.today(),
                   data=[Data(keyword="keyword1", count=3),
                         Data(keyword="keyword2", count=5),
                         Data(keyword="keyword3", count=1)])
my_model.put()
The problem here is filtering on structured properties. The properties of Data are viewed as parallel arrays. Doing a query such as:
q = MyModel.query(MyModel.data.keyword == 'keyword1',
                  MyModel.data.count > 4)
would incorrectly include my_model.
https://developers.google.com/appengine/docs/python/ndb/queries#filtering_structured_properties
Using an expando model would work and allow you to query for keywords:
class MyModel(ndb.Expando):
    city = ndb.StringProperty()
    date = ndb.DateProperty()

m = MyModel(city="Somewhere", date=datetime.date.today())
m.keyword1 = 3
m.keyword2 = 5
m.keyword3 = 1
m.put()

q = MyModel.query(ndb.GenericProperty('keyword1') > 2)
https://developers.google.com/appengine/docs/python/ndb/entities#expando
You can use ndb.JsonProperty to store a list, a dictionary, or a string in your model. Have a look at the documentation for more information.
I'm having a problem with the datastore while trying to replicate a left join, to find entities in model A that don't have a matching relation in model B:
class Page(db.Model):
    url = db.StringProperty(required=True)

class Item(db.Model):
    page = db.ReferenceProperty(Page, required=True)
    name = db.StringProperty(required=True)
I want to find any pages that don't have any associated items.
You cannot query for items using a "property is null" filter. However, you can add a boolean property to Page that signals if it has items or not:
class Page(db.Model):
    url = db.StringProperty(required=True)
    has_items = db.BooleanProperty(default=False)
Then override the put method of Item to flip the flag. But you might want to encapsulate this logic in the Page model instead (maybe a Page.add_item(self, *args, **kwargs) helper; a sketch of that follows the code below):
class Item(db.Model):
    page = db.ReferenceProperty(Page, required=True)
    name = db.StringProperty(required=True)

    def put(self):
        if not self.page.has_items:
            self.page.has_items = True
            self.page.put()
        return db.put(self)
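For completeness, the Page.add_item encapsulation mentioned above might look roughly like this; add_item is a hypothetical helper, not part of the db API:

class Page(db.Model):
    url = db.StringProperty(required=True)
    has_items = db.BooleanProperty(default=False)

    def add_item(self, name):
        # Flip the flag the first time an item is attached, then save the item.
        if not self.has_items:
            self.has_items = True
            self.put()
        item = Item(page=self, name=name)
        item.put()
        return item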
Hence, the query for pages with no items would be:
pages_with_no_items = Page.all().filter("has_items =", False)
The datastore doesn't support joins, so you can't do this with a single query. You need to run a query for the entities in model A, then for each one run another query to check whether it has any matching entities in model B.
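A rough sketch of that per-entity approach with the models above; the keys-only inner queries keep the cost of each existence check down:

# Find pages with no associated items: one query for pages, then a cheap
# keys-only existence check per page.
pages_without_items = []
for page in Page.all():
    first_item = Item.all(keys_only=True).filter("page =", page).get()
    if first_item is None:
        pages_without_items.append(page)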
Did you try it like:
Page.all().filter("item_set = ", None)
Should work.