I have been trying to delete an entity from the GAE datastore for hours now and it doesn't work as it should. I did pretty much the same thing as in How to delete NDB entity using ID?, but I'm fairly sure the problem is the ancestor relationship.
This is the relevant piece of code:
try:
    ndb.Key('NewsBase', int(self.request.get('delid'))).delete()
When I print out the ndb.Key (with self.response.out.write...) I get something like Key('NewsBase', 8008), which is the correct ID (checked in the datastore). In the dashboard I also see the "Decoded entity key", which is
NewsBase: name=mynews > NewsBase: id=8001
I am a little confused about how to include the ancestor information, but as far as I can tell from Using Key in NDB to retrieve an entity I don't need it at all, or do I?
EDIT: This is how I create keys:
def news_key(base_name='mynews'):
    return ndb.Key('NewsBase', base_name)

t = NewsBase(parent=news_key('mynews'))
t.user = user
t.put()
You need the full key, including the ancestor if there is one. That's because the child ID by itself is not necessarily unique: only the full path is, so you need it to identify the particular entity.
In your case, you probably just want ndb.Key('NewsBase', 'mynews', 'NewsBase', 8001).
(I suspect, however, that you are doing something strange when creating your keys in the first place: it's unusual for an ancestor with a string key name to be of the same kind as its numerically-keyed child.)
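To sketch what "full path" means here: an NDB key is a flat sequence of alternating kind and identifier pairs, ancestors first. The helper below is illustrative, not part of the ndb API:

```python
def full_key_path(parent_pair, child_pair):
    """Flatten (kind, id) pairs, ancestors first, into ndb.Key's positional args."""
    return parent_pair + child_pair

path = full_key_path(('NewsBase', 'mynews'), ('NewsBase', 8001))
# ndb.Key(*path).delete() would then delete the child entity.
print(path)  # ('NewsBase', 'mynews', 'NewsBase', 8001)
```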
Try using the urlsafe version of the key instead of the ID:
Output the key as:
key.urlsafe() instead of key.id()
and delete it in your request handler as:
ndb.Key(urlsafe=self.request.get('delkey')).delete()
The urlsafe version of the key contains all the necessary ancestor information.
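To illustrate why no separate ancestor handling is needed: a urlsafe token encodes the entire key path. The sketch below simulates the round trip with base64 (the real API is key.urlsafe() and ndb.Key(urlsafe=...); the encoding shown here is illustrative, not NDB's actual wire format):

```python
import base64

def encode_key_path(path):
    # Sketch only: join the full path, ancestors included, and make it websafe.
    raw = '/'.join(str(p) for p in path).encode('utf-8')
    return base64.urlsafe_b64encode(raw).decode('ascii')

def decode_key_path(token):
    # The full path comes back out -- no separate parent info needed.
    raw = base64.urlsafe_b64decode(token.encode('ascii')).decode('utf-8')
    return tuple(raw.split('/'))

token = encode_key_path(('NewsBase', 'mynews', 'NewsBase', '8001'))
print(decode_key_path(token))  # ('NewsBase', 'mynews', 'NewsBase', '8001')
```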
Also, does your news_key function guarantee that the key it's making exists? You should not store an entity with a parent key that points to an entity that does not exist.
Your news_key should probably be something more like:
def news_key(base_name='mynews'):
    return NewsBase.get_or_insert(base_name).key
Just as an FYI - Deleting the parent does not delete all children. Also, the way you have it shown here, the Parent to your NewsBase entity will be another NewsBase entity.
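To illustrate the point about children surviving their parent: an ancestor query matches every entity whose key path starts with the parent's path, so deleting the descendants takes an explicit query. The stand-in below is a sketch; the commented lines show the rough NDB equivalent, assuming the news_key helper from the question:

```python
# Stand-in entities, each identified by its full key path.
entities = [
    ('NewsBase', 'mynews', 'NewsBase', 8001),
    ('NewsBase', 'mynews', 'NewsBase', 8002),
    ('NewsBase', 'othernews', 'NewsBase', 9001),
]

def is_descendant(entity_path, ancestor_path):
    # An ancestor query matches paths that start with the ancestor's path.
    return entity_path[:len(ancestor_path)] == ancestor_path

parent = ('NewsBase', 'mynews')
to_delete = [e for e in entities if is_descendant(e, parent)]
# The NDB equivalent would be roughly:
#   keys = NewsBase.query(ancestor=news_key('mynews')).fetch(keys_only=True)
#   ndb.delete_multi(keys)
print(len(to_delete))  # 2
```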
Related
I have a datastore entity with several properties. Each property is updated using a separate method. However, every so often I find that a method overwrites a property it is not modifying with an old value (None).
For example.
class SomeModel(ndb.Model):
    property1 = ndb.StringProperty()
    property2 = ndb.StringProperty()

def method1(self, entity_key_urlsafe):
    data1 = ndb.Key(urlsafe=entity_key_urlsafe).get()
    data1.property1 = "1"
    data1.put()
The data1 entity now has property1 with a value of "1".
def method2(self, entity_key_urlsafe):
    data1 = ndb.Key(urlsafe=entity_key_urlsafe).get()
    data1.property2 = "2"
    data1.put()
The data1 entity now has property2 with a value of "2".
However, if these methods run too closely in succession, method2 seems to overwrite property1 with its initial value (None).
To get around this issue, I've been using the deferred library; however, it's not reliable (deferred tasks seem to disappear every now and then) or predictable (the _countdown time seems to be guidance at best) enough.
My question is: Is there a way to retrieve and modify only one property of a datastore entity, without overwriting the rest when you call data1.put()? That is, in the case of method2, could I write only to property2 without overwriting property1?
The way to prevent such overwrites is to make sure your updates are done inside transactions. With NDB this is really easy: just attach the @ndb.transactional decorator to your methods:
@ndb.transactional
def method1(self, entity_key_urlsafe):
    data1 = ndb.Key(urlsafe=entity_key_urlsafe).get()
    data1.property1 = "1"
    data1.put()
The documentation on transactions with NDB doesn't give as much background as the (older) DB version, so to familiarise yourself fully with the limitations and options, you should read both.
I say no.
I have never seen a reference to such an operation, or any trick or hack that achieves it, and I think it would be quite difficult for one to exist: when you perform .put() on an entity, the whole entity is serialised and then written.
An entity is an instance of the model class that you can save to or retrieve from the Datastore.
Imagine you had a date property with auto_now: what would have to happen then? Which of the two saves should update that property?
Your problem, though, seems to be different: one of your methods commits first, and the other nullifies its value because it retrieved an outdated copy, not the expected one.
@Greg's answer talks about transactions. You might want to take a look at them.
Transactions are mainly used for concurrent requests, not so much for succession.
Imagine two users pressing the save button to increase a counter at the same time: that is where transactions work.
@ndb.transactional
def increase_counter(entity_key_urlsafe):
    entity = ndb.Key(urlsafe=entity_key_urlsafe).get()
    entity.counter += 1
    entity.put()
Transactions will ensure that the counter is correct.
The first request that tries to commit the above transaction will succeed, and the later one will have to retry if retries are enabled (3 by default).
Succession, though, is something different. That said, @Greg and I advise you to change your logic towards using transactions if the problem you want to solve is something like the counter example.
I want to retrieve all of my Class entities' key_names, not the Keys.
Is there a way to do this with gae?
Here is my current code:
entities = db.GqlQuery("SELECT __key_name__ FROM Class").fetch(1000)
logging.info(entities)
which of course doesn't work, because there is no property key_name. Does anyone know how to achieve the same effect?
I know how to return the Key property but what I want is the key_name.
The key contains the key name, of course. So when you've got a list of keys, you can call name() on each of them to get the name component.
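A sketch of that approach, with a stand-in Key class (the real one is db.Key; the commented line shows the equivalent real query using the __key__ pseudo-property):

```python
class Key(object):
    """Stand-in for db.Key: name() returns the string key name, or None for numeric ids."""
    def __init__(self, kind, name_or_id):
        self._kind = kind
        self._name_or_id = name_or_id

    def name(self):
        return self._name_or_id if isinstance(self._name_or_id, str) else None

# With the real API this list would come from:
#   db.GqlQuery("SELECT __key__ FROM Class").fetch(1000)
keys = [Key('Class', 'alpha'), Key('Class', 'beta')]
names = [k.name() for k in keys]
print(names)  # ['alpha', 'beta']
```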
I do not really understand what key.from_path() does.
Could you explain it a little better and more concisely than the docs do?
Also, the parent parameter intrigues me.
Every item in the datastore has a key.
k = Key.from_path('User', 'Boris', 'Address', 9876)
You can either create that key and then use it to retrieve the object in the datastore that has that key, or you can save an object to the datastore with that key for later retrieval.
After this operation, address_k is a key:
address_k = db.Key.from_path('Employee', 'asalieri', 'Address', 1)
address = db.get(address_k)
The second line then fetches the datastore object that has that key.
Parent simply says that this object is a child of another object, so when you set the parent, it becomes part of the key as well.
address = Address(parent=employee)
You could have multiple address objects, all with the same parent, employee. Your employee might have many homes! Read this: https://developers.google.com/appengine/docs/python/datastore/entities#Ancestor_Paths
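A sketch of how the parent's path prefixes the child's key (plain tuples stand in for db.Key here; from_path behavior is mirrored, not imported):

```python
def key_from_path(*flat):
    # Mirrors db.Key.from_path: alternating kind, identifier pairs.
    return tuple(flat)

employee_path = key_from_path('Employee', 'asalieri')
# Setting parent=employee makes the child's key start with the parent's path:
address_path = employee_path + ('Address', 1)
print(address_path)  # ('Employee', 'asalieri', 'Address', 1)
```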
I'm trying to do:
MyModel({'text': db.Text('text longer than 500 byets')})
But get:
BadValueError: Indexed value fb_education must be at most 500 bytes
I'm thinking this is just a carry-over from this issue with the old db API.
https://groups.google.com/forum/?fromgroups#!topic/google-appengine/wLAwrjtsuks
First, create the entity kind dynamically:
kindOfEntity = "MyTable"

class DynamicEntity(ndb.Expando):
    @classmethod
    def _get_kind(cls):
        return kindOfEntity
Then assign the TextProperty dynamically at run time, as shown below:
dbObject = DynamicEntity()
key = "studentName"
value = "Vijay Kumbhani"
textProperty = ndb.TextProperty(key)
dbObject._properties[key] = textProperty
dbObject._values[key] = value
dbObject.put()
The property is then stored as a TextProperty.
You're trying to use a db.Text, part of the old API, with NDB, which isn't going to work.
To the best of my knowledge, there's no good way to set unindexed properties in an Expando in NDB, currently. You can set _default_indexed = False on your expando subclass, as (briefly) documented here, but that will make the default for all expando properties unindexed.
A better solution would be to avoid the use of Expando altogether; there are relatively few compelling uses for it where you wouldn't be better served by defining a model (or even defining one dynamically).
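A sketch of the "defining one dynamically" option using Python's type() builtin; the TextProperty class below is a stand-in for ndb.TextProperty, just to show the pattern:

```python
class TextProperty(object):
    """Stand-in for ndb.TextProperty, to illustrate the pattern only."""
    def __init__(self, indexed=False):
        self.indexed = indexed

# Dynamic equivalent of:
#   class MyTable(ndb.Model):
#       text = ndb.TextProperty()
MyTable = type('MyTable', (object,), {'text': TextProperty(indexed=False)})
print(MyTable.__name__, MyTable.text.indexed)  # MyTable False
```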
Yeah, I know the question is old, but I also googled for the same solution and found no results.
So here is a recipe that works for me (I expand User() with a "permissions" property):
prop = ndb.GenericProperty("permissions", indexed=False)
prop._code_name = "permissions"
user._properties["permissions"] = prop
prop._set_value(user, permissions)
The previous answer was VERY useful to me... Thanks!!! I just wanted to add that it appears you can also create a specific property type using this technique (if you know the data type you want to create). When the entity is later retrieved, the dynamic property is set to the specific type instead of GenericProperty. This can be handy for ndb.PickleProperty and ndb.JsonProperty values in particular (to get the in/out conversions).
prop = ndb.TextProperty("permissions", indexed=False)
prop._code_name = "permissions"
user._properties["permissions"] = prop
prop._set_value(user, permissions)
I was trying to change just one property of an entity to Text. But when you don't map your properties explicitly, Expando/Model seems to change all properties of an entity to GenericProperty (after a get).
When you put those entities again (to change the desired property), it affects other existing TextProperties, changing them to regular strings.
Only the low-level datastore API seems to work:
https://gist.github.com/feroult/75b9ab32b463fe7f9e8a
You can call this from the remote_api_shell.py:
from yawp import *
yawp(kind).migrate(20, 'change_property_to_text', 'property_name')
I need some properties to be unique. How can I achieve this?
Is there something like unique=True?
I'm using Google App Engine for Python.
Google provides a function to do that:
http://code.google.com/appengine/docs/python/datastore/modelclass.html#Model_get_or_insert
Model.get_or_insert(key_name, **kwds)
Attempts to get the entity of the model's kind with the given key name. If it exists, get_or_insert() simply returns it. If it doesn't exist, a new entity with the given kind, name, and parameters in kwds is created, stored, and returned.
The get and subsequent (possible) put are wrapped in a transaction to ensure atomicity. This means that get_or_insert() will never overwrite an existing entity, and will insert a new entity if and only if no entity with the given kind and name exists.
In other words, get_or_insert() is equivalent to this Python code:
def txn():
    entity = MyModel.get_by_key_name(key_name, parent=kwds.get('parent'))
    if entity is None:
        entity = MyModel(key_name=key_name, **kwds)
        entity.put()
    return entity

return db.run_in_transaction(txn)
Arguments:
key_name
The name for the key of the entity
**kwds
Keyword arguments to pass to the model class's constructor if an instance with the specified key name doesn't exist. The parent argument is required if the desired entity has a parent.
Note: get_or_insert() does not accept an RPC object.
The method returns an instance of the model class that represents the requested entity, whether it existed or was created by the method. As with all datastore operations, this method can raise a TransactionFailedError if the transaction could not be completed.
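A sketch of how get_or_insert gives you uniqueness when the unique value is used as the key name; a plain dict stands in for the datastore here:

```python
_store = {}  # a dict standing in for the datastore

def get_or_insert(key_name, **kwds):
    # The key name is the value that must be unique: a second insert
    # finds the existing entity instead of creating a duplicate.
    if key_name not in _store:
        _store[key_name] = dict(key_name=key_name, **kwds)
    return _store[key_name]

first = get_or_insert('alice', email='a@example.com')
second = get_or_insert('alice', email='b@example.com')
print(second['email'])  # a@example.com -- the original entity wins
```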
There's no built-in constraint for making sure a value is unique. You can do this however:
query = MyModel.all(keys_only=True).filter('unique_property =', value_to_be_used)
entity = query.get()
if entity:
    raise Exception('unique_property must have a unique value!')
I use keys_only=True because it improves performance slightly by not fetching the entity's data.
A more efficient method would be to use a separate model with no fields, whose key name is made up of the property name plus the value. Then you could use get_by_key_name to fetch one or more of these composite key names; if you get one or more not-None values, you know there are duplicates (and by checking which values were not None, you know which ones were not unique).
As onebyone mentioned in the comments, these approaches, by their get-first, put-later nature, run the risk of concurrency issues. Theoretically, an entity could be created just after the check for an existing value, and the code after the check would still execute, leading to duplicate values. To prevent this, you will have to use transactions: Transactions - Google App Engine
If you're looking to check for uniqueness across all entities with transactions, you'd have to put all of them in the same entity group using the first method, which would be very inefficient. For transactions, use the second method, like this:
class UniqueConstraint(db.Model):
    @classmethod
    def check(cls, model, **values):
        # Create a pseudo-key for use as an entity group.
        parent = db.Key.from_path(model.kind(), 'unique-values')

        # Build a list of key names to test.
        key_names = []
        for key in values:
            key_names.append('%s:%s' % (key, values[key]))

        def txn():
            result = cls.get_by_key_name(key_names, parent)
            for test in result:
                if test:
                    return False
            for key_name in key_names:
                uc = cls(key_name=key_name, parent=parent)
                uc.put()
            return True

        return db.run_in_transaction(txn)
UniqueConstraint.check(...) will assume that every single key/value pair must be unique to return success. The transaction will use a single entity group for every model kind. This way, the transaction is reliable for several different fields at once (for only one field, this would be much simpler.) Also, even if you've got fields with the same name in one or more models, they will not conflict with each other.
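For reference, the composite key names that check() builds can be sketched in isolation (this mirrors the '%s:%s' format above; the helper's name is illustrative):

```python
def build_key_names(**values):
    # One composite key name per field/value pair, '%s:%s' as in check();
    # sorted here only to make the output deterministic.
    return sorted('%s:%s' % (k, v) for k, v in values.items())

print(build_key_names(username='bob', email='b@example.com'))
# ['email:b@example.com', 'username:bob']
```

A call like UniqueConstraint.check(MyModel, username='bob', email='b@example.com') would then succeed only if none of these composite names already exist in the pseudo entity group.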