I'm using Google Cloud Datastore in a very simple way, and I'm trying to retrieve an entity by its id. I've read this (it's in Java but seems to follow the same logic).
The definition of my entity is:
class Logs(ndb.Model):
    startDate = ndb.DateTimeProperty()
    endDate = ndb.DateTimeProperty()
    requestedDate = ndb.DateProperty()
    taskName = ndb.StringProperty()
    status = ndb.StringProperty()
Then when I insert a new one, I do:
logs = Logs(startDate=datetime.utcnow(),
            taskName=taskName,
            requestedDate=requestedDate,
            status=u'IN_PROGRESS')
key = logs.put()
id = key.id()  # I use this variable later
And when I want to retrieve it
logs = Logs.get_by_id(id)
But it never returns any entity...
What's wrong with this?
Thanks for helping
According to the documentation, you should be able to call get() directly from the Key object to retrieve the entity from Datastore:
logs_entity = Logs(startDate=datetime.utcnow(),
                   taskName=taskName,
                   requestedDate=requestedDate,
                   status=u'IN_PROGRESS')
# Saves the entity to Datastore and returns its Key
entity_key = logs_entity.put()
# Retrieves the entity from Datastore using the previous Key
result = entity_key.get()
Edit:
If you need to pass the key around as a string in order to rebuild the Key object later, you might try the urlsafe() method, which produces a string safe to embed in a URL:
urlsafe_string = entity_key.urlsafe()
[...]
entity_key = ndb.Key(urlsafe=urlsafe_string)
logs_entity = entity_key.get()
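One more thing worth checking (this is an assumption about the asker's setup, not something stated in the question): get_by_id() silently returns None when the id has the wrong type. A numeric id becomes a string once it travels through a URL or a template, and ndb treats the integer id 12345 and the string id '12345' as two different keys. A minimal sketch:

```python
# Hypothetical scenario: the id was stored or transported as text.
raw_id = '12345'          # e.g. pulled out of a request parameter
entity_id = int(raw_id)   # convert back before looking the entity up
# logs = Logs.get_by_id(entity_id)  # ndb lookup, shown for context only

# The integer and the string are not interchangeable as ids:
print(entity_id == 12345)   # True
print(entity_id == raw_id)  # False
```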
Related
Cannot find an effective way to set 'Date time' for a property when creating an entity in Cloud Datastore using Cloud Functions
I am using a Python 3.7 Cloud Function which takes a JSON file as input. The JSON has an array of objects, which are read one by one to create a datastore entity object, which is then written to the datastore.
In Datastore, all properties are stored as 'String'. I would like the date fields to be stored as the 'Date time' property type of the Cloud Datastore entity.
dsclient = datastore.Client(os.environ.get('project_id'))
client = storage.Client()
bucket = client.get_bucket(event['bucket'])
blob = bucket.blob(event['name'])
input_bytes = blob.download_as_string()
input_string = input_bytes.decode("utf-8")
obj = json.loads(input_string)
length = len(obj[kind])
for i in range(0, length):
    key = obj[kind][i]['Key']
    score = create_score(obj[kind][i], key, ns)
    dsclient.put(score)
def create_score(score, key, ns):
    """creates the score datastore entity"""
    pkey = dsclient.key(kind, key, namespace='test')
    score_entity = datastore.Entity(
        key=pkey,
        exclude_from_indexes=exclude_index_list
    )
    score_entity.update(score)
    return score_entity
The statement score_entity.update(score) creates all properties as Strings. Is there a way I can specify the datatype of each property on creation?
I have seen that this is possible using the Python ndb model on App Engine. But I am not using App Engine to create the entity; it is a Cloud Function.
You need to make sure that the property is a datetime.datetime, not a datetime.date or another type of object (including strings).
You should be able to do something like:
score_entity = datastore.Entity(
    key=pkey,
    exclude_from_indexes=exclude_index_list
)
score_entity["your_datetime"] = datetime.datetime.now()
...
client.put(score_entity)
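Since the dates arrive in the JSON file as strings, they have to be parsed into datetime.datetime objects before being put on the entity; otherwise update() stores them as strings. A minimal sketch, assuming an ISO-like format and a hypothetical match_date field (both names are illustrative, not from the question):

```python
from datetime import datetime

# Hypothetical raw record as it might arrive in the JSON file.
raw_score = {"player": "alice", "match_date": "2019-03-14T09:26:53"}

def coerce_dates(record, date_fields):
    """Return a copy of record with the named fields parsed into
    datetime.datetime objects, so Datastore stores them as timestamps."""
    out = dict(record)
    for field in date_fields:
        out[field] = datetime.strptime(out[field], "%Y-%m-%dT%H:%M:%S")
    return out

entity_dict = coerce_dates(raw_score, ["match_date"])
# entity_dict["match_date"] is now a datetime.datetime, not a str,
# and can be passed to score_entity.update(entity_dict) as above.
```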
I am unable to filter instances by tags and get the list of matching instances. Please help me figure out how to proceed.
import boto3

ec2 = boto3.resource('ec2')

def lambda_handler(event, context):
    # Use the filter() method of the instances collection to retrieve
    # all running EC2 instances.
    filters = [{'Name': 'OS_Name', 'Values': ['Rstudio']}]
    # filter the instances
    instances = ec2.instances.filter(Filters=filters)
    # locate all running instances
    RunningInstances = [instance.id for instance in instances]
    # print the instances for logging purposes
    # print(RunningInstances)
    # make sure there are actually instances to shut down
    if len(RunningInstances) > 0:
        # perform the shutdown
        shuttingDown = ec2.instances.filter(InstanceIds=RunningInstances).stop()
        print(shuttingDown)
    else:
        print("Nothing to see here")
You need to specify the type of filter; in this case it would be a tag filter:
filters = [{'Name': 'tag:OS_Name', 'Values': ['Rstudio']}]
From the boto3 docs:
tag:<key> - The key/value combination of a tag assigned to the resource. Use the tag key in the filter name and the tag value as the filter value. For example, to find all resources that have a tag with the key Owner and the value TeamA, specify tag:Owner for the filter name and TeamA for the filter value.
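To make the difference concrete, here is the corrected filter next to a rough local illustration of the matching EC2 performs on the server side. The matches() helper is hypothetical, written only to show the semantics; it is not part of boto3:

```python
# The corrected filter: the "tag:" prefix tells EC2 to match on a tag key
# rather than on a built-in attribute name.
filters = [{'Name': 'tag:OS_Name', 'Values': ['Rstudio']}]

# Hypothetical helper mimicking the tag matching EC2 does server-side:
# an instance matches if, for every tag filter, it carries a tag with
# that key and one of the listed values.
def matches(instance_tags, filters):
    for f in filters:
        if not f['Name'].startswith('tag:'):
            continue
        key = f['Name'][len('tag:'):]
        if not any(t['Key'] == key and t['Value'] in f['Values']
                   for t in instance_tags):
            return False
    return True

print(matches([{'Key': 'OS_Name', 'Value': 'Rstudio'}], filters))  # True
print(matches([{'Key': 'OS_Name', 'Value': 'Windows'}], filters))  # False
```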
I need your help ordering a list of items.
I am trying to make an app where users can send messages to their friends (just like social feeds). After watching Brett Slatkin's talk about building a microblog, here's my code:
class Message(ndb.Model):
    content = ndb.TextProperty()
    created = ndb.DateTimeProperty(auto_now=True)

class MessageIndex(ndb.Model):
    receivers = ndb.StringProperty(repeated=True)
class BlogPage(Handler):
    def get(self):
        if self.request.cookies.get("name"):
            user_loggedin = self.request.cookies.get("name")
        else:
            user_loggedin = None
        receive = MessageIndex.query(MessageIndex.receivers == user_loggedin)
        receive = receive.fetch()
        message_key = [int(r.key.parent().id()) for r in receive]
        messages = [Message.get_by_id(int(m)) for m in message_key]
        for message in messages:
            self.write(message)
First I do a query to get all the messages that have my name in the receivers. MessageIndex is a child of Message, so I can get the keys of all the messages I receive. Last, I iterate get_by_id over the list of message keys I got.
This works fine, but I want to filter each message by its created datetime, and that's the problem. The final output is a plain list, which can't be ordered using .order or .filter.
Maybe some of you can light me up.
You can use the message keys in an 'IN' clause in the Message query. Note that you will need to use the parent() key value, not the id() in this case.
eg:
# dtStart, dtEnd are datetime values
message_keys = [r.key.parent() for r in receive]
query = Message.query(Message._key.IN(message_keys),
                      Message.created > dtStart,
                      Message.created < dtEnd)
query = query.order(Message.created)  # or -Message.created for desc
messages = query.fetch()
I am unsure if you wish to simply order by the Message created date, or whether you wish to filter using the date. Both options are catered for above.
How do we use the clone_entity() function described in "Copy an entity in Google App Engine datastore in Python without knowing property names at 'compile' time" to copy the values to an entity of a different Kind? (The keys get copied as well, so the clone lands in the same Kind, and the solution at the above link does not work for this particular purpose.)
Tried the following (and other variations, but to no avail):
query = db.GqlQuery("SELECT * FROM OrigKind")
results = query.fetch(10)
for user in results:
    new_entry = models.NewKind()
    new_entry_complete_key = new_entry.put()
    new_entry = clone_entity(user, Key=new_entry_complete_key)
    new_entry.put()
(need to copy all entities from OrigKind to NewKind)
You need a modified version of clone_entity:
There are some pitfalls to the original clone method that are discussed in the answers of the original implementation.
def clone_entity(e, to_klass, **extra_args):
    """Clones an entity, adding or overriding constructor attributes.

    The cloned entity will have exactly the same property values as the
    original entity, except where overridden. By default it will have no
    parent entity or key name, unless supplied.

    Args:
        e: The entity to clone
        to_klass: The target class
        extra_args: Keyword arguments to override from the cloned entity
            and pass to the constructor.

    Returns:
        A cloned, possibly modified, copy of entity e.
    """
    klass = e.__class__
    props = dict((k, v.__get__(e, klass)) for k, v in klass.properties().iteritems())
    props.update(extra_args)
    return to_klass(**props)
# Use the clone method
query = db.GqlQuery("SELECT * FROM OrigKind")
results = query.fetch(10)
for user in results:
    new_entry = clone_entity(user, NewKind)
    new_entry.put()
I would like to add a couple of things to Shay's answer:
Treating the case where to_klass doesn't have properties that e does.
Adapting the clone_entity method to work with ndb.Model
def clone_entity(e, to_klass, **extra_args):
    """Clones an entity, adding or overriding constructor attributes.

    The cloned entity will have exactly the same property values as the
    original entity, except where overridden or missing in to_klass. By
    default it will have no parent entity or key name, unless supplied.

    Args:
        e: The entity to clone
        to_klass: The target class
        extra_args: Keyword arguments to override from the cloned entity
            and pass to the constructor.

    Returns:
        A cloned, possibly modified, instance of to_klass with the same
        properties as e.
    """
    klass = e.__class__
    props = dict((k, v.__get__(e, klass))
                 for k, v in klass._properties.iteritems()
                 if type(v) is not ndb.ComputedProperty)
    props.update(extra_args)
    allowed_props = to_klass._properties
    for key in props.keys():
        if key not in allowed_props:
            del props[key]
    return to_klass(**props)
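To make the whitelisting step concrete, here is the same logic on plain dicts, independent of ndb (all property names below are made up for illustration): copy the source properties, let extra_args override, then drop anything the target class does not declare.

```python
# Plain-dict illustration of the steps in clone_entity above.
source_props = {'title': 'GAE', 'pages': 300, 'internal_rank': 7}
allowed_props = {'title', 'pages', 'owner'}   # what to_klass declares
extra_args = {'owner': 'alice'}

props = dict(source_props)
props.update(extra_args)
props = {k: v for k, v in props.items() if k in allowed_props}
# internal_rank is gone (not declared on the target),
# owner came in from extra_args.
print(props)  # {'title': 'GAE', 'pages': 300, 'owner': 'alice'}
```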
I just wrote a utility to copy entities from one appid to another and to zip the entities of a kind. This utility makes an exact clone, including keys, NDB repeated properties, serving_urls, and the blobs referenced in the kind. To make this work I have to know the property types of the entities. I use Python 2.7 and NDB, but the utility also transfers db.Models.
Here is the code to find all the property types for a kind:
self.kind = 'Books'  # the entities to copy
self.model_mods = {'Books': 'models'}  # modules to import the model from for a kind
module = __import__(self.model_mods[self.kind], globals(), locals(), [self.kind], -1)
self.model_class = getattr(module, self.kind)
entity = self.model_class()  # ndb or db
if isinstance(entity, ndb.Model):
    self.ndb = True
    self.query = self.model_class.query()  # prepare the query to get all the entities
    self.makePage = self._pager(self.ndbPager)  # wrap the ndb pager
elif isinstance(entity, db.Model):
    self.ndb = False
    self.query = self.model_class.all()
    self.makePage = self._pager(self.dbPager)  # wrap the db pager
else:
    raise ValueError('Failed to classify entities of kind: ' + str(self.kind))
logging.info('Entities of kind : %s inherits from class : %s.Model'
             % (self.kind, self.ndb * 'ndb' + (not self.ndb) * 'db'))
self.data_types = {}  # create a dict of property data types
for key in self.model_class._properties:  # the internals of the model_class object
    property_object = getattr(self.model_class, key.split('.')[0])  # strip, so it works for repeated structured properties
    self.data_types[key] = property_object.__class__.__name__  # get the property type
logging.debug(self.data_types)
In the above code I wrap a pager (paged transfer using a cursor) for db or NDB to transfer the entities between the GAE appids.
Based on the property types I can encode and decode the properties to transfer the model. To do this I first create a dict of the entity using NDB's entity.to_dict() or db's entity.to_dict(), and I add the key to the dict. Now I can encode the properties of the entity and pickle the result to transfer the encoded entity:
data = pickle.dumps(entity_dict, 1)
encoded_entity = base64.b64encode(data)
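On the receiving appid the two steps are simply reversed. A minimal, self-contained round trip of that encoding (the entity dict here is a made-up stand-in for entity.to_dict() plus the key):

```python
import base64
import pickle

# Hypothetical stand-in for entity.to_dict() with the key added.
entity_dict = {'key': 'Books/42', 'title': 'NDB in practice', 'pages': 120}

# Sending side, as in the snippet above
data = pickle.dumps(entity_dict, 1)
encoded_entity = base64.b64encode(data)

# Receiving side: reverse the two steps to recover the dict
decoded_entity = pickle.loads(base64.b64decode(encoded_entity))
print(decoded_entity == entity_dict)  # True
```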
I have a list of about 20 objects and for each object I return a list of 10 dictionaries.
I am trying to store the list of 10 dictionaries for each object in the list on GAE; I do not think I am writing the code correctly to store this information to GAE.
Here is what I have:
Before my main request handler I have this class:
class Tw(db.Model):
    tags = db.ListProperty()
    ip = db.StringProperty()
In my main request handler I have the following:
for city in lst_of_cities:  # this is the list of 20 objects
    dict_info = hw12.twitter(city)  # the function that gets the list of 10 dictionaries for each object
    datastore = Tw()  # this is the class defined for db.Model
    datastore.tags.append(dict_info)
    datastore.ip = self.request.remote_addr
    datastore.put()

data = Data.gql("")  # data entities we need to fetch
I am not sure if this code is right at all. If anyone could help, it would be much appreciated.
Welcome to Stack Overflow!
I see a few issues:
Dictionaries are not supported value types for App Engine properties.
You're only storing the last entity; the rest are discarded.
You're using a ListProperty, but instead of appending each element of dict_info, you're doing a single append of the entire list.
Since you can't store a raw dictionary inside a property, you need to serialize it to some other format, like JSON or pickle. Here's a revised example using pickle:
from google.appengine.ext import db
import pickle

class Tw(db.Model):
    tags = db.BlobProperty()
    ip = db.StringProperty()

entities = []
for city in lst_of_cities:
    dict_info = hw12.twitter(city)
    entity = Tw()
    entity.tags = db.Blob(pickle.dumps(dict_info))
    entity.ip = self.request.remote_addr
    entities.append(entity)
db.put(entities)
When you fetch the entity later, you can retrieve your list of dictionaries with pickle.loads(entity.tags).
When I deal with data types that are not directly supported by Google App Engine, like dictionaries or custom data types, I usually adopt the handy PickleProperty:
from google.appengine.ext import db
import pickle

class PickleProperty(db.Property):
    def get_value_for_datastore(self, model_instance):
        value = getattr(model_instance, self.name, None)
        return pickle.dumps(value)

    def make_value_from_datastore(self, value):
        return pickle.loads(value)
Once the PickleProperty class is declared in your commons.py module, you can store your custom data with something like this:
from google.appengine.ext import db
from commons import PickleProperty

class Tw(db.Model):
    tags = PickleProperty()
    ip = db.StringProperty()

entities = []
for city in lst_of_cities:
    dict_info = hw12.twitter(city)
    entity = Tw()
    entity.tags = dict_info
    entity.ip = self.request.remote_addr
    entities.append(entity)
db.put(entities)
To retrieve the data back go with:
entity.tags
Since this was written, App Engine has pushed out its experimental "ndb" Python database model, which contains in particular the JsonProperty, something that pretty much directly implements what you want.
You need to be running the Python 2.7 version of App Engine, which is still not quite ready for production, but it all seems pretty stable these days; GvR himself seems to be writing a lot of the code, which bodes well for the code quality, and I'm intending to use this in production sometime this year...
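For context, a model using it would declare the field as tags = ndb.JsonProperty(), and the serialization JsonProperty relies on is essentially the standard json round trip (the sample data below is made up):

```python
import json

# The kind of structure the question wants to store: a list of dicts.
tags = [{'text': 'hello', 'retweets': 3}, {'text': 'world', 'retweets': 0}]

stored = json.dumps(tags)      # roughly what JsonProperty writes to the datastore
restored = json.loads(stored)  # what you get back when the entity is read
print(restored == tags)  # True
```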