How can I avoid DatabaseSessionIsOver from outside calls - python

I have some model definitions where I have overridden the __repr__ methods. For example, consider the following entities:
class A(db.Entity):
    id = PrimaryKey(int, auto=True)
    name = Required(unicode)
    b = Optional("B")

    def __repr__(self):
        return self.name

class B(db.Entity):
    id = PrimaryKey(int, auto=True)
    name = Required(unicode)
    a = Required("A")

    def __repr__(self):
        return '{n} from a={aname}'.format(n=self.name, aname=self.a)
The DatabaseSessionIsOver exception is raised when I use the search(B, 'aaaa') method from Flask-PonyWhoosh, even though that method is already wrapped with db_session inside the package:
@orm.db_session
def search(model, *arg, **kw):
    return model._wh_.search(*arg, **kw)
The exception is raised only when an entity overrides the __repr__ method in the way I did in the example above.
For now, to avoid the problem, I am using the following statements:
with db_session:
    print(search(A, 'karl'))
So, in short, the question is: is there any way to avoid using with db_session, perhaps by modifying the __repr__ method, or by modifying the methods from the package?
Thanks,
PS: I've been reading about the prefetch method, but it doesn't seem appropriate. I'm not sure.

The DatabaseSessionIsOver exception happens because your __repr__ method accesses a relationship attribute which wasn't loaded from the database (self.a, which in turn tries to return the name attribute of the related A entity).
One way to avoid this exception is to load all the necessary objects before you leave the db_session. In this case, those objects will sit in the Identity Map and no request to the database will be required.
Another way is to wrap all your code with a bigger scope db_session, so when you access the attribute which was not loaded from the database, Pony can do this within the db_session.
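For example, a minimal sketch of the first approach, assuming the A and B entities from the question (prefetch is part of Pony's regular query API):

from pony import orm

with orm.db_session:
    # load every B together with its related A, so __repr__ never
    # needs to hit the database after the session ends
    results = orm.select(b for b in B).prefetch(B.a)[:]
print(results)  # safe: all needed attributes already sit in the Identity Map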
Pony requires using @db_session because it sets the boundaries of the conversation with the database and allows freeing resources:
- Clear the Identity Map cache
- Return the database connection to the connection pool
If we don't clear the cache, then all objects which were loaded from the database will sit in memory until you clear the cache manually or your program ends.
Let's say we introduced a mode where the db_session never ends and you have to clear the cache manually. Do you think it would solve your problem, and would you use it?

Related

Python: Passing parameters by reference down a chain

I am trying to pass a variable (a MySQL connection class instance) down into a method and then into a class. The issue is that it needs to be done 'by reference', as the main class can change the value. The variable in the final class does not update, though:
Application:

def __init__(self, quart_instance) -> None:
    self._db_object: mysql_connection.MySQLConnection = None

def initialise_app(self):
    self.view_blueprint = health_view.create_blueprint(self._db_object)
Health View:

def create_blueprint(db_connector: mysql_connection.MySQLConnection):
    view = View(db_connector)

class View:
    def __init__(self, db_connector):
        self._db_connector = db_connector
When the application establishes the database connection in the background, I was expecting self._db_connector in the view to update. Any help would be appreciated, as I am very confused.
Don't confuse changing the state of an object with changing the value of a variable; the former is visible through all references to that object, the latter only affects that particular variable.
For this to work, the application's _db_object and the view's db_connector must refer to the same object at all times.
There are essentially two solutions:
1. Give MySQLConnection a default state, so you can create one immediately to pass along to View, rather than starting with None and modifying it later, or
2. Wrap MySQLConnection in another object that you can do the same with (see the sketch below).
Both options have benefits and drawbacks.
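A minimal sketch of the wrapper idea; ConnectionHolder is a hypothetical name, and object() stands in for the real MySQLConnection:

class ConnectionHolder:
    # Mutable wrapper: mutating .connection is visible through every
    # reference to this holder object.
    def __init__(self):
        self.connection = None

class View:
    def __init__(self, db_holder):
        self._db_holder = db_holder

    def is_connected(self):
        return self._db_holder.connection is not None

holder = ConnectionHolder()
view = View(holder)           # the view keeps a reference to the holder
print(view.is_connected())    # False
holder.connection = object()  # later: connect in the background
print(view.is_connected())    # True -- same holder, updated state

The application would hold on to the holder and assign holder.connection once the real connection is established; every consumer that received the holder sees the change.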

Django model class can't be serialized due to `default=` field parameter pointing to class method [duplicate]

I have a very basic class that looks something like the following:
class Car(Model):
    name = CharField(max_length=255, unique=True)

    @classmethod
    def create_simple_examples(cls):
        for c in ['Sedan', 'Coupe', 'Van', 'SUV']:
            cls.objects.get_or_create(name=c)

    @classmethod
    def get_default(cls):
        c, _ = cls.objects.get_or_create(name='Sedan')
        return c

    def __unicode__(self):
        return self.name
I am trying to add it to a Django app. The two class methods exist to 1) populate the table quickly, and 2) grab a default instance which will be used often.
When I run
python manage.py makemigrations myapp
I get the following error
ValueError: Cannot serialize: <bound method ModelBase.get_default of <class 'crunch.django.myapp.models.Car'>>
I am not quite sure why it's trying to serialize my get_default function, as that's not really part of the migration of the table itself. Any help would be greatly appreciated.
UPDATE: I think I may have found the source of the problem (still not sure how to fix it, though...)
I have other classes with foreign keys to my new class, and their default uses my get_default method above, something like this:
class OtherClass(Model):
    car = ForeignKey(Car, default=Car.get_default)
It looks like the migration is trying to serialize the function because of this. Any tips on how to get around this?
Add the @deconstructible decorator to the classes which have a classmethod:

from django.utils.deconstruct import deconstructible

@deconstructible
class Car(Model):
    ...
More documentation on deconstructible can be found here
As explained in Django's migrations docs, Django can serialize function and method references, (in Python 3) unbound methods used from within the class body, and a bunch of other stuff, but it can't serialize everything.
In this case, because you've made get_default a @classmethod, Car.get_default is a bound method (i.e., it takes an implicit reference to Car as its first parameter), rather than a plain function or method reference, and Django doesn't know what to do with that.
Try making get_default a @staticmethod instead, or make a free (top-level) function that calls Car.get_default, as sketched below.
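A minimal sketch of both workarounds, assuming the Car model from the question:

# Option 1: a staticmethod is accessed as a plain function,
# which the migration writer can serialize
class Car(Model):
    name = CharField(max_length=255, unique=True)

    @staticmethod
    def get_default():
        c, _ = Car.objects.get_or_create(name='Sedan')
        return c

# Option 2: a module-level function that defers to the original classmethod
def get_default_car():
    return Car.get_default()

class OtherClass(Model):
    car = ForeignKey(Car, default=get_default_car)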

Calling type(dict) functions within classes on class variables (Python 3.4)

I am creating a class and trying to define attributes that hold the result of a function like .keys() or .values() called on another attribute.
For example:
class DATA(object):
    def __init__(self, id, database={}):
        self.id = id
        self.database = database
        self.addresses = database.keys()
        self.data = database.values()
This does not seem to work: when I create an instance of the class
foo = DATA(0, {"a": 1, "b": 2})
and then ask for
print(foo.addresses)
>>> []
it gives back an empty list.
Note: in my actual program I start out with an empty dictionary for each class instance; later on, I use a function to add entries to the dictionary. In that case, calling .database still works but .addresses does not.
Can anyone help me with this problem?
I'm not sure that this is the problem, but using a mutable such as {} as a default argument often leads to bugs. See: "Least Astonishment" and the Mutable Default Argument
This is safer:
def __init__(self, id, database=None):
    if database is None:
        self.database = {}
    else:
        self.database = database
I don't understand the purpose of DATA.addresses and DATA.data. Could you use functions with the property decorator instead, to avoid redundancy?
@property
def addresses(self):
    return self.database.keys()

@property
def data(self):
    return self.database.values()
The issue is that you're calling keys right in your __init__ method, and saving the result. What you want to do instead is to call keys only when you want to access it.
Now, depending on the requirements of your class, you may be able to do this in a few different ways.
If you don't mind changing the calling code a bit, you could make it very simple: just use foo.database.keys() rather than foo.addresses. The latter doesn't need to exist, since all the information it contains is already available via the methods of the database attribute.
Another approach is to save the bound instance method database.keys to an instance variable of your DATA object (without calling it):
class DATA(object):
    def __init__(self, database=None):
        if database is None:
            database = {}
        self.database = database
        self.addresses = database.keys  # don't call keys here!
In the calling code, instead of foo.addresses you'd use foo.addresses() (a function call, rather than just an attribute lookup). This looks like a method call on the DATA instance, though it isn't really; it's calling the already bound method on the database dictionary. This might break if other code replaces the database dictionary completely (rather than just mutating it in place), as the quick check below shows.
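A quick check of this variant (with hypothetical values):

foo = DATA()
foo.database["a"] = 1
print(list(foo.addresses()))  # ['a'] -- note the parentheses: it's a call
foo.database = {"x": 9}       # replacing the dict breaks the saved binding
print(list(foo.addresses()))  # still ['a'], not ['x']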
A final approach is to use a property to request the keys from the database dict when a user tries to access the addresses attribute of a DATA instance:
class DATA(object):
    def __init__(self, database=None):
        if database is None:
            database = {}
        self.database = database
        # don't save anything as "addresses" here

    @property
    def addresses(self):
        return self.database.keys()
This may be best, since it lets the calling code treat addresses just like an attribute. It will also work properly if you completely replace the database object in some other code (e.g. foo.database = {"foo":"bar"}). It may be a bit slower though, since there'll be an extra function call that the other approaches don't need.
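The same quick check shows that the property version survives replacing the dict:

foo = DATA()
foo.database["a"] = 1
print(list(foo.addresses))  # ['a']
foo.database = {"x": 9}     # replacing the whole dict is fine here
print(list(foo.addresses))  # ['x']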

How do I memoize expensive calculations on Django model objects?

I have several TextField columns on my UserProfile object which contain JSON objects. I've also defined a setter/getter property for each column which encapsulates the logic for serializing and deserializing the JSON into python datastructures.
The nature of this data ensures that it will be accessed many times by view and template logic within a single Request. To save on deserialization costs, I would like to memoize the python datastructures on read, invalidating on direct write to the property or save signal from the model object.
Where/How do I store the memo? I'm nervous about using instance variables, as I don't understand the magic behind how any particular UserProfile is instantiated by a query. Is __init__ safe to use, or do I need to check the existence of the memo attribute via hasattr() at each read?
Here's an example of my current implementation:
class UserProfile(Model):
    text_json = models.TextField(default=text_defaults)

    @property
    def text(self):
        if not hasattr(self, "text_memo"):
            self.text_memo = None
        self.text_memo = self.text_memo or simplejson.loads(self.text_json)
        return self.text_memo

    @text.setter
    def text(self, value=None):
        self.text_memo = None
        self.text_json = simplejson.dumps(value)
You may be interested in a built-in Django decorator, django.utils.functional.memoize.
Django uses this to cache expensive operations like URL resolving.
Generally, I use a pattern like this:
def get_expensive_operation(self):
    if not hasattr(self, '_expensive_operation'):
        self._expensive_operation = self.expensive_operation()
    return self._expensive_operation
Then you use the get_expensive_operation method to access the data.
However, in your particular case, I think you are approaching this in slightly the wrong way. You need to do the deserialization when the model is first loaded from the database, and serialize on save only. Then you can simply access the attributes as a standard Python dictionary each time. You can do this by defining a custom JSONField type, subclassing models.TextField, which overrides to_python and get_db_prep_save.
In fact someone's already done it: see here.
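A rough sketch of the idea; the exact hook names depend on the Django version (older releases used SubfieldBase with to_python, newer ones use from_db_value), so treat this as an outline rather than the linked implementation:

import json
from django.db import models

class JSONTextField(models.TextField):
    # deserialize when loading from the database
    def from_db_value(self, value, expression, connection):
        return json.loads(value) if value else None

    # serialize when preparing the value for the database
    def get_prep_value(self, value):
        return json.dumps(value)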
For model methods, you should use django.utils.functional.cached_property.
Since the first argument of a method is self, memoize would maintain a reference to the object and to the results of the function even after you've thrown the object away. This can cause memory leaks by preventing the garbage collector from cleaning up the stale object. cached_property turns Daniel's suggestion into a decorator.
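For example, applied to the UserProfile above (note that cached_property provides no setter; you invalidate by deleting the attribute):

from django.utils.functional import cached_property

class UserProfile(Model):
    text_json = models.TextField(default=text_defaults)

    @cached_property
    def text(self):
        # computed on first access, then stored on the instance
        return simplejson.loads(self.text_json)

# to invalidate after changing text_json:
# del profile.text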

Google App Engine singletons (Python)

The standard way of doing singletons in Python is
class Singleton(object):
    _instance = None

    def __new__(cls, *args, **kwargs):
        if not cls._instance:
            cls._instance = super(Singleton, cls).__new__(cls, *args, **kwargs)
        return cls._instance
However, this doesn't work on App Engine, since there may be many servers and we would get one instance per server. So how would we do it for an App Engine entity?
Something like:
class MySingleton(db.Model):
    def __init__(self, *args, **kwargs):
        all = MySingleton.all()
        if all.count() > 0:
            return all.fetch(1).get()
        super(MySingleton, self).__init__(*args, **kwargs)
This leads to a recursion error, since get() calls __init__.
How we're going to use it:
We just want to represent a configuration file, i.e.:
{ 'sitename': "My site", 'footer': "This page owned by X"}
Singletons are usually a bad idea, and I'd be interested to see what makes this an exception. Typically they're just globals in disguise, and apart from all the old problems with globals (eg. see http://c2.com/cgi/wiki?GlobalVariablesAreBad, in particular the bit at the top talking about non-locality, implicit coupling, concurrency issues, and testing and confinement), in the modern world you get additional problems caused by distributed and concurrent systems. If your app is potentially running across multiple servers, can you meaningfully have both instances of your application operate on the same singleton instance both safely and correctly?
If the object has no state of its own, then the answer is yes, but you don't need a singleton, just a namespace.
But if the object does have some state, you need to worry about how the two application instances are going to keep the details synchronised. If two instances try reading and then writing to the same instance concurrently then your results are likely to be wrong. (eg. A HitCounter singleton that reads the current value, adds 1, and writes the current value, can miss hits this way - and that's about the least damaging example I can think of.)
I am largely unfamiliar with it, so perhaps Google App Engine has some transactional logic to handle all this for you, but that presumably means you'll have to add some extra stuff in to deal with rollbacks and the like.
So my basic advice would be to see if you can rewrite the algorithm or system without resorting to using a singleton.
If you aren't going to store the data in the datastore, why don't you just create a module with variables instead of a db.Model?
Name your file mysettings.py and inside it write:
sitename = "My site"
footer = "This page owned by X"
Then the python module effectively becomes a "singleton". You can even add functions, if needed. To use it, you do something like this:
import mysettings
print mysettings.sitename
That's how Django deals with this with its DJANGO_SETTINGS_MODULE.
Update:
It sounds like you really want to use a db.Model, but with memcache so you only retrieve the object once. You'll have to come up with a way to flush it when you change the data, or give it a timeout so it gets re-fetched occasionally. I'd probably go with the timeout version and do something like this in mysettings.py:
from google.appengine.api import memcache
from google.appengine.ext import db
import logging

class MySettings(db.Model):
    # properties...
    pass

def Settings():
    key = "mysettings"
    obj = memcache.get(key)
    if obj is None:
        obj = MySettings.all().get()  # assume there is only one
        if obj:
            memcache.add(key, obj, 360)  # cache the object for 360 seconds
        else:
            logging.error("no MySettings found, create one!")
    return obj
Or, if you don't want to use memcache, just store the object in a module-level variable and always use the Settings() function to reference it. But then you'll have to implement a way to flush it, since otherwise it persists until the interpreter instance is recycled. I would normally use memcache for this sort of functionality.
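A sketch of that module-level variant; there is no automatic expiry, so the cached object lives until you reset it or the interpreter instance is recycled:

_settings = None

def Settings():
    global _settings
    if _settings is None:
        _settings = MySettings.all().get()  # assume there is only one
    return _settings

def FlushSettings():
    global _settings
    _settings = None  # force a re-fetch on the next Settings() call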
__init__ cannot usefully return anything: just like in the first example, override __new__ instead!
I don't think there's a real "singleton" object you can hold in a distributed environment with multiple instances running. The closest you can come to this is using memcache.
Perhaps it's better to think less in terms of singletons and more in terms of data consistency. For this App Engine provides transactions, which allow you to trap any changes in an entity that might happen while you're working with that entity.
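For example, the classic counter update with the old db API (a self-contained sketch; run_in_transaction retries the function if another request modified the entity concurrently):

from google.appengine.ext import db

class Counter(db.Model):
    count = db.IntegerProperty(default=0)

def increment(key):
    counter = db.get(key)  # read inside the transaction
    counter.count += 1
    counter.put()          # write inside the same transaction

key = Counter().put()  # put() returns the entity's key
db.run_in_transaction(increment, key)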
