I built a blog with GAE and stored many items in memcache, including the paged entries.
The key used to store these pages is built from the query object and the page index:
@property
def _query_id(self):
    # note: avoid a leading double underscore for the cached attribute;
    # hasattr(self, '__query_id') would never match the name-mangled
    # attribute that self.__query_id actually creates
    if not hasattr(self, '_query_id_cache'):
        hsh = hashlib.md5()
        hsh.update(repr(self.query))
        self._query_id_cache = hsh.hexdigest()
    return self._query_id_cache

def _get_cache_key(self, page):
    return '%s%s' % (self._query_id, page)
It shows up in the admin console like: NDB9:xxxxxx.
Besides these, I store every other item under a key starting with sitename-obj.
In some cases I want to clear only the paged cache, but I don't know how.
Is there a way to delete memcache entries whose key names start with NDB9?
Yes, I've found such a function:
delete_multi(keys, seconds=0, key_prefix='', namespace=None)
but it seems that key_prefix is just added to every key in the first argument, and I want to delete memcache entries by key_prefix alone.
You cannot delete keys by prefix; you can only delete specific keys, or flush all of the cache.
In this case, you'd have to loop over all page ids to produce all possible keys. Pass those to delete_multi().
The key_prefix argument is just a convenience: you can send shorter 'keys' if they all share the same prefix. If all your keys start with NDB9, use that as the key prefix and send the list of keys over without that prefix. The prefix will be added to each key by the memcache server when looking up which keys to delete.
Alternatively, use memcache itself to store the list of all page-cache keys:
keys = [key1, key2, key3, ...]
When you need to delete keys by pattern, fetch this list and pass it to delete_multi().
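A minimal sketch of the loop-over-page-ids workaround, assuming the page keys are built as in the question; only the final memcache call (shown in a comment) is App Engine specific:

```python
import hashlib

def page_cache_keys(query, num_pages):
    """Build the shared NDB9-style prefix plus one short suffix per page."""
    prefix = hashlib.md5(repr(query).encode()).hexdigest()
    # the prefix is supplied once via key_prefix, so the keys stay short
    return prefix, [str(page) for page in range(num_pages)]

prefix, suffixes = page_cache_keys("SELECT * FROM Entry", 3)
# On App Engine you would then clear all paged entries with:
# memcache.delete_multi(suffixes, key_prefix=prefix)
```

The number of pages still has to be known (or over-estimated), since memcache itself cannot enumerate keys by prefix.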
I've got the following problem: I can't figure out how to access the "UID" value within the warning field. I want to iterate over every value inside the document and check each existing UID, so I can tell whether a randomly generated UID already exists. I just can't seem to figure it out.
This is what my MongoDB structure looks like:
https://i.imgur.com/sfKGLnf.png
warnings will be a list once you've got the object in Python code, so you can just iterate over the docs in the warnings list and access their UID keys.
Edited code per the comments:
We can get all the documents in the collection by using find with an empty query dict (I've linked the docs on find below). As your document structure seems to have a random number as a key, and then an embedded doc, we first collect the keys in each document that aren't _id. There may be only one such key, but I didn't want to assume. If there's then a warnings key in the embedded doc, we iterate over each of the warnings and add its "UID" to our warning_uids list.
# this is your collection object
mod_logs_collection = database["modlogs"]
all_mod_docs = mod_logs_collection.find({})
warning_uids = []

for doc in all_mod_docs:
    # filter out the default `_id` key
    doc_keys = [key for key in doc.keys() if key != "_id"]
    # not sure if it's possible for you to have multiple keys in a doc
    # with this structure, but we'll iterate over the top-level doc keys
    # just in case
    for key in doc_keys:
        sub_doc = doc[key]
        if warnings := sub_doc.get("warnings"):
            for warning in warnings:
                warning_uids.append(warning["UID"])
                print(warning["UID"])
pymongo find
mongo docs on querying
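The same traversal can be sketched against plain Python dicts (no database needed), assuming documents shaped like the screenshot, with a numeric string key wrapping an embedded doc; the keys and UIDs below are made up:

```python
# stand-in for what collection.find({}) would yield
docs = [
    {"_id": 1, "123456789": {"warnings": [{"UID": "a1"}, {"UID": "b2"}]}},
    {"_id": 2, "987654321": {"warnings": [{"UID": "c3"}]}},
]

warning_uids = []
for doc in docs:
    # skip the default _id key, walk every embedded doc's warnings list
    for key in (k for k in doc if k != "_id"):
        for warning in doc[key].get("warnings", []):
            warning_uids.append(warning["UID"])

# warning_uids is now ["a1", "b2", "c3"], ready for the duplicate check
```

Checking a candidate UID is then just `candidate in warning_uids`.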
A newbie to DynamoDB and Python in general here. I have a task where I have to retrieve information from a DynamoDB table using some information provided. I've set up the access keys and such, and I've been given a 'hash key' and a table name. I'm looking for a way to use the hash key to retrieve the information, but I haven't been able to find anything specific online.
# Table name
table_name = 'Waypoints'

# DynamoDB client
dynamodb_client = boto3.client('dynamodb')

# Hash key
hash_key = {
    ''
}

# Retrieve item
response = dynamodb_client.get_item(TableName=table_name, Key=hash_key)
Above is what I have written, but it doesn't work. From what I can gather, get_item only returns one item, but I'm not sure what to pass to make it work in the first place.
Any sort of help would be greatly appreciated.
First of all, in get_item() request the key should not be just the key's value, but rather a map with the key's name and value. For example, if your hash-key attribute is called "p", the Key you should pass would be {'p': hash_key}.
Second, is the hash key the entire key in your table? If you also have a sort key, in a get_item() you must also specify that part of the key - and the result is one item. If you are looking for all the items with a particular hash key but different sort keys, then the function you need to use is query(), not get_item().
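As a sketch of the corrected request shape: the attribute name "p" and the key value below are stand-ins (the real partition-key name comes from your table's schema), and note that the low-level boto3 client also expects typed attribute values:

```python
# hypothetical values: the real partition-key attribute name ("p" here)
# and the key value come from your table's schema
table_name = "Waypoints"
hash_key_value = "waypoint-001"

# the low-level client wants Key as a map of attribute name -> typed value
key = {"p": {"S": hash_key_value}}  # "S" marks a string attribute

# with a configured boto3 client you would then call:
# response = dynamodb_client.get_item(TableName=table_name, Key=key)
# item = response.get("Item")  # absent when nothing matches the key
```

If the table also has a sort key, get_item needs both parts inside Key; to fetch all items sharing one hash key, use query() with a KeyConditionExpression instead.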
So I am trying to fetch data from MySQL into a Python dictionary.
Here is my code:
def getAllLeadsForThisYear():
    charges = {}
    cur.execute("select lead_id, extract(month from transaction_date), pid, extract(year from transaction_date) from transaction where lead_id is not NULL and transaction_type='CHARGE' and YEAR(transaction_date)='2015'")
    for i in cur.fetchall():
        lead_id = i[0]
        month = i[1]
        pid = i[2]
        year = str(i[3])
        new = {lead_id: [month, pid, year]}
        charges.update(new)
    return charges

x = getAllLeadsForThisYear()
When I print len(x.keys()) it gives me some number, say 450.
When I run the same query in MySQL it returns 500 rows. I do have some duplicate keys, but shouldn't they still be counted, since I never wrote anything like if i not in charges.keys()? Please correct me if I am wrong.
Thanks
As I said, the problem is that you are overwriting your value at a key every time a duplicate key pops up. This can be fixed two ways:
You can do a check before adding a new value and if the key already exists, append to the already existing list.
For example:
# change these lines
new = {lead_id: [month, pid, year]}
charges.update(new)
# to
if lead_id in charges:
    charges[lead_id].extend([month, pid, year])
else:
    charges[lead_id] = [month, pid, year]
Which gives you a structure like this:
charges = {
    '123': [month1, pid1, year1, month2, pid2, year2, ...]
}
With this approach, you can recover each separate entry by splitting the value at each key into chunks of 3 (this may be useful).
However, I don't really like this approach because it requires you to do that chunking. Which brings me to approach 2.
Use defaultdict from collections, which behaves exactly like a normal dict except that it supplies a default value when you access a key that hasn't been set yet.
For example:
# change
charges = {}
# to
charges = defaultdict(list)

# and change
new = {lead_id: [month, pid, year]}
charges.update(new)
# to
charges[lead_id].append((month, pid, year))
which gives you a structure like this:
charges = {
    '123': [(month1, pid1, year1), (month2, pid2, year2), ...]
}
With this approach, you can now iterate through each list at each key with:
for key in charges:
    for entities in charges[key]:
        print(entities)  # prints `(month, pid, year)` for each separate entry
If you are using this approach, don't forget to from collections import defaultdict. If you don't want to import it, you can mimic the behavior with:
if lead_id in charges:
    charges[lead_id].append((month, pid, year))
else:
    charges[lead_id] = [(month, pid, year)]
Which is incredibly similar to the first approach, but does the explicit "create a list if the key isn't there" that defaultdict would do implicitly.
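Putting the defaultdict approach together on some made-up rows (stand-ins for what cur.fetchall() would return, including a duplicate lead_id):

```python
from collections import defaultdict

# stand-in rows: (lead_id, month, pid, year); lead_id 1 appears twice
rows = [
    (1, 3, 'p1', 2015),
    (2, 5, 'p2', 2015),
    (1, 7, 'p3', 2015),
]

charges = defaultdict(list)
for lead_id, month, pid, year in rows:
    # duplicates accumulate in the list instead of overwriting
    charges[lead_id].append((month, pid, str(year)))

# charges[1] == [(3, 'p1', '2015'), (7, 'p3', '2015')]
```

With the plain-dict version from the question, the second row for lead_id 1 would have silently replaced the first, which is exactly why the dict ends up smaller than the row count.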
Is there a function in Google App Engine to test if a string is valid 'string key' prior to calling memcache.get(key) without using db.get() or db.get_by_key_name() first?
In my case the key is being passed from the user's get request:
obj = memcache.get(self.request.get("obj"))
Somehow I'd like to know if that string is a valid key string without calling the db first, which would defeat the purpose of using memcache.
That is probably the most efficient (and practical) way to determine if the key string is valid. The code is obviously performing that test for you before it attempts to retrieve the entity from memcache/datastore. Even better, Google will update that code if necessary.
try:
    obj = memcache.get(self.request.get("obj"))
except BadKeyError:
    # give a friendly error message here
    pass
Also, consider switching to ndb. Performing a get() on a key automatically uses two levels of cache, local and memcache. You don't need to write separate code for memcache.
A db module key sent to a client should pass through str(the_key), which gives you a URL-safe encoded key. Your templating environment etc. will do this for you just by rendering the key into a template.
On receiving the key back from a client, you should recreate the key with
key = db.Key(encoded=self.request.get("obj"))
At this point it could fail with something like
BadKeyError: Invalid string key "thebadkeystring".
If it doesn't, you have a valid key.
obj = memcache.get(self.request.get("obj")) won't actually raise BadKeyError, because at that point you are just working with a string; you simply get back None or a value.
So at that point, all you know is that a key is missing.
However, you still need memcache.get(self.request.get("obj")) to get the object from memcache, since a db.Key instance is not a valid memcache key.
So you will be constructing a key only to validate the key string. Of course, if the memcache get fails, you can then use the key you just created to fetch the object with db.get(key).
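The validate-then-fetch order can be sketched as below; db.Key and memcache are App Engine APIs, so a stand-in validator is used here to keep the snippet runnable (on GAE the validator would be lambda s: db.Key(encoded=s), which raises BadKeyError for malformed strings):

```python
def is_valid_key_string(key_string, validator):
    """Return True if validator(key_string) does not raise.

    On App Engine the validator would be lambda s: db.Key(encoded=s),
    and the exception to catch would be BadKeyError.
    """
    try:
        validator(key_string)
    except Exception:
        return False
    return True

# stand-in validator: pretend only hex strings are well-formed keys
hex_validator = lambda s: int(s, 16)

valid = is_valid_key_string("deadbeef", hex_validator)     # True
invalid = is_valid_key_string("not-a-key", hex_validator)  # False
```

Only strings that pass validation would then be handed to memcache.get(), with db.get(key) as the fallback on a cache miss.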
Any object is a valid key, provided that the object can be serialized using pickle. If pickle.dumps(key) succeeds, then you shouldn't get a BadKeyError.
After fetching an entity from the datastore, I'd like to save its key into memcache and also pass it as part of a url parameter for a task for referencing it later.
However, since a Key is a composite item, you can't forward it as-is, and when I try to re-construct the key, the values aren't identical.
What's the best approach for passing of a key to reference the entity at a later time?
entity_key = feed_entity.key()
logging.info(entity_key)  # produces a string-like key value

# would like to save a way to reference the key later
memcache.set(entity_key.id_or_name(), some_piece_of_data)
# Will produce the error:
# Key must be a string instance, received datastore_types.Key.from_path
# (u'FeedEntity', u'My_Entity_Name', u'FeedEntity', 2L, _app=u'dev~test_app')

reconstructed_key = Key.from_path('FeedEntity', 'My_Entity_Name', 'FeedEntity', entity_key.id_or_name())
logging.info(reconstructed_key)
# Not the same value as entity_key

params = {"key": entity_key_string_value}  # this would be ideal
task = taskqueue.Task(url='/feed_entity/list', params=params).add(queue_name="feed-gopher")
See http://code.google.com/appengine/docs/python/datastore/keyclass.html#Key
A key can be encoded to a string by passing the Key object to str() (or calling the object's __str__() method). A string-encoded key is an opaque value using characters safe for including in URLs. The string-encoded key can be converted back to a Key object by passing it to the Key constructor (the encoded argument).
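To illustrate the round-trip shape with plain Python (this is not GAE's actual encoding; on App Engine, str(key) produces its own opaque URL-safe string, decoded again by db.Key(encoded=...)):

```python
import base64

def encode_key(kind, name):
    # illustration only: pack a (kind, name) path into a URL-safe token
    raw = "%s|%s" % (kind, name)
    return base64.urlsafe_b64encode(raw.encode()).decode().rstrip("=")

def decode_key(token):
    # restore stripped base64 padding, then unpack the path
    padded = token + "=" * (-len(token) % 4)
    kind, name = base64.urlsafe_b64decode(padded).decode().split("|")
    return kind, name

token = encode_key("FeedEntity", "My_Entity_Name")
# token is safe to put in task params or a URL, and it round-trips:
# decode_key(token) == ("FeedEntity", "My_Entity_Name")
```

The same pattern applies to the question's code: pass str(entity_key) in params, and rebuild the key on the task side with db.Key(encoded=...) rather than re-deriving it via Key.from_path.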