I want to retrieve all of my Class entities' key_names, not their Keys.
Is there a way to do this with gae?
Here is my current code:
entities = db.GqlQuery("SELECT __key_name__ FROM Class").fetch(1000)
logging.info(entities)
which of course doesn't work, because there is no property called key_name. Does anyone know how to get the same effect?
I know how to return the Key property but what I want is the key_name.
The key contains the key name, of course. So when you've got a list of keys, you can call name() on each of them to get the name component.
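A minimal sketch of that, assuming the kind really is called Class and 1000 results is enough (as in your snippet):

    # Fetch keys only, then pull the name component out of each one.
    # Note that name() returns None for entities that have a numeric ID
    # instead of a key name.
    keys = db.GqlQuery("SELECT __key__ FROM Class").fetch(1000)
    key_names = [k.name() for k in keys]
    logging.info(key_names)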
I have the following models:
class Company(ndb.Model):
    name = ndb.StringProperty(indexed=False)
    # some other fields

class User(polymodel.PolyModel):
    company = ndb.KeyProperty(kind=Company)
    # some other fields

class Object(ndb.Model):
    user = ndb.KeyProperty(kind=User)
    # some other fields
Now I have a user and I want to query Objects that are associated with other Users in the same company like this:
Object.query(Object.user.company == user.company)
Of course, this doesn't work, since Object.user is a key and I cannot access anything beyond that.
Is there any way to do this? I only need the company key; I was thinking of a ComputedProperty, but I'm not sure it's the best solution. Ideally, it would also be possible to query based on any field of Company.
You need to denormalize and store redundant information, as the datastore doesn't support joins.
For instance, given your models above, a user can only be a member of one company. If you really need to search for all objects whose user is a member of a particular company, then store the company key on the Object.
Use a computed property if that works best for you.
Alternately, use a factory that always takes the User as an argument and constructs the Object that way.
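A minimal sketch of that denormalization (create_object is a hypothetical factory helper, and Object gains a redundant company property):

    class Object(ndb.Model):
        user = ndb.KeyProperty(kind=User)
        company = ndb.KeyProperty(kind=Company)  # denormalized copy of user.company
        # some other fields

    def create_object(user_key, **props):
        # Hypothetical factory: always copies the company key from the User
        # so that Objects can be filtered by company directly.
        user = user_key.get()
        return Object(user=user_key, company=user.company, **props)

    # The query from the question then becomes:
    # Object.query(Object.company == user.company)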
Say we have the following class:
class Alert(models.Model):
    contact = models.ForeignKey('emails.Contact', null=True, blank=True)
If I wanted to get the foreign key of the contact, I would do somealert.contact.pk or somealert.contact_id. Do these expressions pull down the whole Contact object and then get the key, or does either of them yield just the foreign key without pulling all of the attributes of the instance from the database? I'm worried about performance and would prefer to get just the key itself.
The first one - somealert.contact.pk - will get the Contact object. The second - somealert.contact_id - won't.
You can verify this in the shell by looking at the contents of django.db.connection.queries.
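For example, a quick check in the shell (this sketch assumes DEBUG=True so queries are recorded, and that somealert is an Alert you have already loaded):

    from django.db import connection, reset_queries

    reset_queries()
    pk = somealert.contact_id        # reads the locally cached column, no query
    print(len(connection.queries))   # 0

    pk = somealert.contact.pk        # fetches the related Contact row
    print(len(connection.queries))   # 1 (unless the relation was already cached,
                                     #    e.g. via select_related)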
I have been trying to delete an entity from the GAE datastore for hours now and it doesn't work as it should. I pretty much did the same thing as in "how to delete NDB entity using ID?", but I'm fairly sure the problem is the ancestor relationship.
This is the relevant piece of code:
try:
    ndb.Key('NewsBase', int(self.request.get('delid'))).delete()
When I print out the ndb.Key (self.response.out.write...) I get something like Key('NewsBase', 8008), which is the correct ID (checked in the datastore). In the dashboard I also get the "Decoded entity key", which is
NewsBase: name=mynews > NewsBase: id=8001
I am a little confused on how to include the ancestor information but as far as I can tell from here Using Key in NDB to retrieve an entity I don't need it at all, or do I?
EDIT: How I create keys
def news_key(base_name='mynews'):
    return ndb.Key('NewsBase', base_name)

t = NewsBase(parent=news_key('mynews'))
t.user = user
t.put()
You need the full key, including the ancestor if there is one. That's because the child ID by itself is not necessarily unique: only the full path is, so you need it to identify the particular entity.
In your case, you probably just want ndb.Key('NewsBase', 'mynews', 'NewsBase', 8001).
(I suspect, however, that you are doing something strange to create your keys in the first place: it's unusual for the ancestor, which has a string key name, to be of the same kind as the child, which has a numeric ID.)
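A minimal sketch of the delete in your handler, assuming the parent key name is always 'mynews' as in your news_key function:

    full_key = ndb.Key('NewsBase', 'mynews', 'NewsBase', int(self.request.get('delid')))
    full_key.delete()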
Try using the urlsafe version of the key instead of the ID:
Output the key as:
key.urlsafe() instead of key.id()
and delete it in your request handler as:
ndb.Key(urlsafe=self.request.get('delkey')).delete()
The urlsafe version of the key contains all the necessary ancestor information.
Also, does your news_key function know that the key it's making refers to an entity that exists? You should not store an entity with a parent key that points to an entity that does not exist.
Your news_key should probably be something more like:
def news_key(base_name='mynews'):
    return NewsBase.get_or_insert(base_name).key
Just as an FYI - Deleting the parent does not delete all children. Also, the way you have it shown here, the Parent to your NewsBase entity will be another NewsBase entity.
I do not really understand what key.from_path() does.
Could you explain it a little better and more concisely than it is explained here?
Also, the parent parameter intrigues me.
Every item in the datastore has a key.
k = Key.from_path('User', 'Boris', 'Address', 9876)
You can either create that key and then use it to retrieve the object in the datastore that has that key or you can save an object to the datastore with that key for later retrieval.
address_k = db.Key.from_path('Employee', 'asalieri', 'Address', 1)
address = db.get(address_k)

The first line builds address_k, which is a key. The second line then fetches the datastore object that has that key.
Parent simply says that this object is a child of another object. So when you set the parent it becomes part of the key also.
address = Address(parent=employee)
You could have multiple address objects, all with the same parent, employee. Your employee might have many homes! Read this: https://developers.google.com/appengine/docs/python/datastore/entities#Ancestor_Paths
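A minimal sketch of how the parent ends up in the key path (this assumes Employee and Address are simple db.Model subclasses with no required properties):

    employee = Employee(key_name='asalieri')
    employee.put()

    address = Address(parent=employee)
    address.put()

    # The address's full key path includes its parent, so it is equivalent to:
    rebuilt = db.Key.from_path('Employee', 'asalieri', 'Address', address.key().id())
    assert rebuilt == address.key()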
I need some properties to be unique. How can I achieve this?
Is there something like unique=True?
I'm using Google App Engine for Python.
Google provides a function for that:
http://code.google.com/appengine/docs/python/datastore/modelclass.html#Model_get_or_insert
Model.get_or_insert(key_name, **kwds)
Attempts to get the entity of the model's kind with the given key name. If it exists, get_or_insert() simply returns it. If it doesn't exist, a new entity with the given kind, name, and parameters in kwds is created, stored, and returned.
The get and subsequent (possible) put are wrapped in a transaction to ensure atomicity. This means that get_or_insert() will never overwrite an existing entity, and will insert a new entity if and only if no entity with the given kind and name exists.
In other words, get_or_insert() is equivalent to this Python code:
def txn():
    entity = MyModel.get_by_key_name(key_name, parent=kwds.get('parent'))
    if entity is None:
        entity = MyModel(key_name=key_name, **kwds)
        entity.put()
    return entity

return db.run_in_transaction(txn)
Arguments:
key_name: The name for the key of the entity.
**kwds: Keyword arguments to pass to the model class's constructor if an instance with the specified key name doesn't exist. The parent argument is required if the desired entity has a parent.
Note: get_or_insert() does not accept an RPC object.
The method returns an instance of the model class that represents the requested entity, whether it existed or was created by the method. As with all datastore operations, this method can raise a TransactionFailedError if the transaction could not be completed.
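A hypothetical sketch of using get_or_insert to enforce uniqueness: make the value that must be unique the key name, so two callers can never both create it. Account, its properties, and claim_nickname are assumed names, not something from the question:

    from google.appengine.ext import db

    class Account(db.Model):
        owner = db.UserProperty()
        nickname = db.StringProperty()

    def claim_nickname(nickname, current_user):
        # The nickname doubles as the key name, so only one Account per nickname
        # can ever exist; get_or_insert makes the check-and-create atomic.
        account = Account.get_or_insert(nickname, owner=current_user, nickname=nickname)
        if account.owner != current_user:
            raise ValueError('nickname %r is already taken' % nickname)
        return account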
There's no built-in constraint for making sure a value is unique. You can do this however:
query = MyModel.all(keys_only=True).filter('unique_property', value_to_be_used)
entity = query.get()
if entity:
    raise Exception('unique_property must have a unique value!')
I use keys_only=True because it'll improve the performance slightly by not fetching the data for the entity.
A more efficient method would be to use a separate model with no fields, whose key name is made up of the property name + value. Then you could use get_by_key_name to fetch one or more of these composite key names; if you get one or more not-None values, you know there are duplicate values (and by checking which results were not None, you'll know which values were not unique).
As onebyone mentioned in the comments, these approaches (get first, then put) run the risk of concurrency issues. Theoretically, an entity could be created just after the check for an existing value, and the code after the check would still execute, leading to duplicate values. To prevent this, you will have to use transactions: Transactions - Google App Engine
If you're looking to check for uniqueness across all entities with transactions, you'd have to put all of them in the same group using the first method, which would be very inefficient. For transactions, use the second method like this:
class UniqueConstraint(db.Model):
    @classmethod
    def check(cls, model, **values):
        # Create a pseudo-key for use as an entity group.
        parent = db.Key.from_path(model.kind(), 'unique-values')

        # Build a list of key names to test.
        key_names = []
        for key in values:
            key_names.append('%s:%s' % (key, values[key]))

        def txn():
            result = cls.get_by_key_name(key_names, parent)
            for test in result:
                if test:
                    return False
            for key_name in key_names:
                uc = cls(key_name=key_name, parent=parent)
                uc.put()
            return True

        return db.run_in_transaction(txn)
UniqueConstraint.check(...) will assume that every single key/value pair must be unique to return success. The transaction will use a single entity group for every model kind. This way, the transaction is reliable for several different fields at once (for only one field, this would be much simpler.) Also, even if you've got fields with the same name in one or more models, they will not conflict with each other.
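For instance, a hypothetical usage (UserProfile and its fields are made-up names):

    # check() atomically reserves the UniqueConstraint entries; the actual
    # entity is stored afterwards.
    if UniqueConstraint.check(UserProfile, username='boris', email='boris@example.com'):
        UserProfile(username='boris', email='boris@example.com').put()
    else:
        raise ValueError('username or email is already taken')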