SqlAlchemy: Refer to the bound session from an object?

Suppose I am using a declarative object like the following:
class MyObject(DeclarativeBase):
    ...
    other_object_id = Column(Integer, ForeignKey('other_object.id'))
    other_object = relationship("OtherObject")
    ...
Suppose I would like to have a convenience method set_other_object_value, which would modify other_object.value, creating an instance of OtherObject and adding it to the session, if necessary:
def set_other_object_value(self, value):
    if self.other_object is None:
        <create instance of OtherObject and associate with the session>
    self.other_object.value = value
The question is: how can I create OtherObject and associate it with the session? A possibly equivalent question: is it possible, from within an instance of MyObject, to access the Session to which that instance was added?

Use object_session:
from sqlalchemy.orm.session import object_session
# ...
def set_other_object_value(self, value):
    if self.other_object is None:
        self.other_object = OtherObject(...)
        object_session(self).add(self.other_object)
    self.other_object.value = value
But if you have set up your relationships properly, you do not need to add the new object to the session, as SQLAlchemy will figure it out and save it to your database on commit anyway.
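For illustration, a minimal sketch of that cascade behaviour (assuming the MyObject/OtherObject mapping from the question, that OtherObject can be constructed without arguments, and that a session with a loaded MyObject is available; relationship() enables the save-update cascade by default):
obj = session.query(MyObject).first()
obj.other_object = OtherObject()   # no explicit session.add() needed
obj.other_object.value = 42
session.commit()                   # the new OtherObject row is inserted via the cascade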

Related

FactoryBoy is accessing normal DB instead of TEST DB

I'm trying to create some objects in the setUp method of a Django test case. I use FactoryBoy, which helps me create the objects, but it seems that FactoryBoy can't find any objects in the database.
factories.py
class ProductFactory(DjangoModelFactory):
    ...
    market_category = factory.fuzzy.FuzzyChoice(list(MarketplaceCategory.objects.all()))

    class Meta:
        model = Product
tests.py
from django.test import TestCase
from marketplaces.models import MarketplaceCategory
class MyTestCase(TestCase):
    def setUp(self) -> None:
        ...
        self.marketplace_category = MarketplaceCategoryFactory.create()
        print(MarketplaceCategory.objects.first().pk)  # prints 1
        self.product = ProductFactory(created_by=self.user)
As you can see, ProductFactory tries to populate Product.market_category with a random MarketplaceCategory object.
The problem is that the chosen object does not seem to exist, even though I created it beforehand and made sure it is in the DB (it has a pk).
EDIT: It chose a MarketplaceCategory object with pk=25, but there is only one such object in the test DB, and it has pk=1. I think it accesses the Django development DB instead of the test one.
The error:
psycopg2.errors.ForeignKeyViolation: insert or update on table "products_product" violates foreign key constraint "products_product_market_category_id_2d634517_fk"
DETAIL: Key (market_category_id)=(25) is not present in table "marketplaces_marketplacecategory".
Do you have any idea why it behaves this way? It looks like the Factory is accessing the real DB instead of testdb for some reason.
Defining the "market_category" field like that is going to cause issues: the queryset that populates the choices is executed at some arbitrary time, whenever the module is imported, and the instances returned may no longer exist. You should use a SubFactory:
class ProductFactory(DjangoModelFactory):
    market_category = factory.SubFactory(MarketplaceCategoryFactory)

    class Meta:
        model = Product
Alternatively, pass the queryset directly to FuzzyChoice to get a random existing value; don't convert it to a list:
class ProductFactory(DjangoModelFactory):
    market_category = factory.fuzzy.FuzzyChoice(MarketplaceCategory.objects.all())

    class Meta:
        model = Product
This will then create an instance whenever you create a product, but you can pass "market_category" to the factory to override it:
class MyTestCase(TestCase):
    def setUp(self) -> None:
        self.marketplace_category = MarketplaceCategoryFactory.create()
        self.product = ProductFactory(created_by=self.user, market_category=self.marketplace_category)

django: how to specify database dynamically for factory boy

I am setting up a Django application with a lot of databases, and some of them use the same models (they are not replicas). I have already configured my routers and everything works great. The problem appears when writing tests, as I want to use factory_boy.
On a different project I could set up the database inside Meta, but now I have to select dynamically which database an instance should be created on (otherwise I would have to create a DjangoModelFactory for each database, which wouldn't be pretty).
Is there an (easier) way to specify the database dynamically for each creation?
As far as I know factory_boy (version <=2.10.0) doesn't provide anything like that.
Your problem, though, is a perfect use case for a context manager. It lets you set the database dynamically wherever you need it, only within the desired scope, and keeps things DRY:
# factoryboy_utils.py

@classmethod
def _get_manager(cls, model_class):
    return super(cls, cls)._get_manager(model_class).using(cls.database)

class DBAwareFactory(object):
    """
    Context manager to make model factories db aware

    Usage:
        with DBAwareFactory(PersonFactory, 'db_qa') as personfactory_on_qa:
            person_on_qa = personfactory_on_qa()
            ...
    """
    def __init__(self, cls, db):
        # Take a copy of the original cls
        self.original_cls = cls
        # Patch with needed bits for dynamic db support
        setattr(cls, 'database', db)
        setattr(cls, '_get_manager', _get_manager)
        # save the patched class
        self.patched_cls = cls

    def __enter__(self):
        return self.patched_cls

    def __exit__(self, type, value, traceback):
        return self.original_cls
and then, in your tests, you can do something like:
from factoryboy_utils import DBAwareFactory

class MyTest(TestCase):

    def test_mymodel_on_db1(self):
        ...
        with DBAwareFactory(MyModelFactory, 'db1') as MyModelFactoryForDB1:
            mymodelinstance_on_db1 = MyModelFactoryForDB1()
            # whatever you need with that model instance
            ...
        # something else here

    def test_mymodel_on_db2(self):
        ...
        with DBAwareFactory(MyModelFactory, 'db2') as MyModelFactoryForDB2:
            mymodelinstance_on_db2 = MyModelFactoryForDB2()
            # whatever you need with that model instance
            ...
        # something else here

set table attribute value with other attribute values in sqlalchemy

I have this column in a table:
name = Column(String, default='ts-1')
How do I set the default value to increment automatically, so that the names of objects will be 'ts-1', 'ts-2', 'ts-3', ... as they are created?
I tried writing a constructor:
def __init__(self):
    self.name = 'ts-' + str(self.id)
but it always returns None (self.id is not populated until the object has been flushed).
Thanks.
I found an approach to solving this problem using SQLAlchemy ORM events. It turns out to be fairly simple: the idea is to write an event listener on 'init' so that it is triggered whenever an object is created.
Inside the listener the object is added and committed, so the value of its id becomes available.
from sqlalchemy import event

@event.listens_for(SomeClass, 'init')
def change_name(target, args, kwargs):
    session.add(target)
    session.commit()
    target.name = 'sheet#' + str(target.id)
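If committing inside the 'init' listener is too aggressive for your workflow, here is a minimal alternative sketch (not from the original answer; SomeClass and the 'ts-' prefix are taken from the question) using the 'after_insert' mapper event, where the generated id is already available:
from sqlalchemy import event

@event.listens_for(SomeClass, 'after_insert')
def set_name_after_insert(mapper, connection, target):
    # the INSERT has already executed, so target.id is populated;
    # issue the follow-up UPDATE on the event's connection, not the session
    connection.execute(
        SomeClass.__table__.update()
        .where(SomeClass.__table__.c.id == target.id)
        .values(name='ts-' + str(target.id))
    )
Note that the in-memory object keeps its old name until the session refreshes or expires it.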

SQLAlchemy: How can I add common mapping logic to my custom base class (using declarative style mapping)?

I have a project where every table has some common fields, e.g., status, and I'd like to alias all of them. Is it possible to do this without manually adding the alias to each class? E.g., here's what I have now:
from core import foo_table, bar_table, Status
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class CustomBase(object):
    @property
    def status(self):
        return Status(self._status)
    ...

class Foo(Base, CustomBase):
    __table__ = foo_table
    _status = foo_table.c.status
    ...

class Bar(Base, CustomBase):
    __table__ = bar_table
    _status = bar_table.c.status
    ...
Ideally, I'd like to be able to set up my _status alias on CustomBase instead of in Foo and Bar, or set up my project so that the alias is added whenever a class extending CustomBase is loaded. Is this possible, or am I trying to accomplish this in the wrong way? I know I can make it work if I rename the status field in my DB or rename the status property in CustomBase, but I'd prefer to avoid this if possible, since they're both representations of the same thing and there's no need to directly access the enum value from the code.
Thanks!
Your best bet is probably to create a custom column type that adapts Enum to translate to and from your own Status class. See here for a full reference. Below is a draft for your core module; the precise code depends a bit on your situation.
# core module
import sqlalchemy.types as types
from sqlalchemy import Table, MetaData, Column

class DBStatus(types.TypeDecorator):
    impl = types.Enum

    # what should happen with Status objects on the way into the table
    def process_bind_param(self, value, dialect):
        if value is None:
            return value
        return str(value)  # if Status has a __str__ or __repr__ method

    # what should happen with Enum objects on the way out of the table
    def process_result_value(self, value, dialect):
        if value is None:
            return value
        return Status(value)

foo_table = Table(
    'foo',
    MetaData(),
    Column('status', DBStatus('OK', 'Error')),
    # ...
)
After this you don't have to do anything special anymore in the module with the mappings:
# module with the mappings
Base = declarative_base()

class Foo(Base):
    __table__ = foo_table
    # ...
In fact it's so straightforward you might just as well use full declarative mapping, as far as the Status columns are concerned.
# everything in one module
class DBStatus(types.TypeDecorator):
    # same as above

Base = declarative_base()

class Foo(Base):
    status = Column(DBStatus('OK', 'Error'))
    # ...
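For illustration, a hypothetical round trip (it assumes Status can be constructed from the stored string, that a Session is available, and that Foo also has a primary key column not shown above):
session.add(Foo(status=Status('OK')))
session.commit()

foo = session.query(Foo).first()
assert isinstance(foo.status, Status)   # rebuilt by process_result_value on the way out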

Does SQLAlchemy have an equivalent of Django's get_or_create?

I want to get an object from the database if it already exists (based on provided parameters) or create it if it does not.
Django's get_or_create (or source) does this. Is there an equivalent shortcut in SQLAlchemy?
I'm currently writing it out explicitly like this:
def get_or_create_instrument(session, serial_number):
    instrument = session.query(Instrument).filter_by(serial_number=serial_number).first()
    if instrument:
        return instrument
    else:
        instrument = Instrument(serial_number)
        session.add(instrument)
        return instrument
Following the solution of @WoLpH, this is the code that worked for me (simple version):
def get_or_create(session, model, **kwargs):
    instance = session.query(model).filter_by(**kwargs).first()
    if instance:
        return instance
    else:
        instance = model(**kwargs)
        session.add(instance)
        session.commit()
        return instance
With this, I'm able to get_or_create any object of my model.
Suppose my model object is:
class Country(Base):
    __tablename__ = 'countries'
    id = Column(Integer, primary_key=True)
    name = Column(String, unique=True)
To get or create my object I write:
myCountry = get_or_create(session, Country, name=countryName)
That's basically the way to do it; there is no shortcut readily available AFAIK.
You could generalize it, of course:
from sqlalchemy.sql.expression import ClauseElement

def get_or_create(session, model, defaults=None, **kwargs):
    instance = session.query(model).filter_by(**kwargs).one_or_none()
    if instance:
        return instance, False
    else:
        params = {k: v for k, v in kwargs.items() if not isinstance(v, ClauseElement)}
        params.update(defaults or {})
        instance = model(**params)
        try:
            session.add(instance)
            session.commit()
        except Exception:
            # The actual exception depends on the specific database, so we catch all exceptions.
            # This is similar to the official documentation:
            # https://docs.sqlalchemy.org/en/latest/orm/session_transaction.html
            session.rollback()
            instance = session.query(model).filter_by(**kwargs).one()
            return instance, False
        else:
            return instance, True
2020 update (Python 3.9+ ONLY)
Here is a cleaner version using Python 3.9's new dict union operator (|=):
def get_or_create(session, model, defaults=None, **kwargs):
    instance = session.query(model).filter_by(**kwargs).one_or_none()
    if instance:
        return instance, False
    else:
        kwargs |= defaults or {}
        instance = model(**kwargs)
        try:
            session.add(instance)
            session.commit()
        except Exception:
            # The actual exception depends on the specific database, so we catch all exceptions.
            # This is similar to the official documentation:
            # https://docs.sqlalchemy.org/en/latest/orm/session_transaction.html
            session.rollback()
            instance = session.query(model).filter_by(**kwargs).one()
            return instance, False
        else:
            return instance, True
Note:
Similar to the Django version, this will catch duplicate key constraints and similar errors. If your get-or-create is not guaranteed to return a single result, it can still result in race conditions.
To alleviate some of that issue you would need to add another one_or_none()-style fetch right after the session.commit(). This is still no 100% guarantee against race conditions unless you also use with_for_update() or the serializable transaction isolation level (a sketch of where the lock would go follows below).
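As a minimal illustration only (not part of the original answer), this is where a with_for_update() lock would slot into the lookup above; it locks matching rows for the rest of the transaction, but cannot lock rows that do not exist yet:
instance = (
    session.query(model)
    .filter_by(**kwargs)
    .with_for_update()      # emits SELECT ... FOR UPDATE on the matched row, if any
    .one_or_none()
)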
I've been playing with this problem and have ended up with a fairly robust solution:
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm.exc import NoResultFound

def get_one_or_create(session,
                      model,
                      create_method='',
                      create_method_kwargs=None,
                      **kwargs):
    try:
        return session.query(model).filter_by(**kwargs).one(), False
    except NoResultFound:
        kwargs.update(create_method_kwargs or {})
        created = getattr(model, create_method, model)(**kwargs)
        try:
            session.add(created)
            session.flush()
            return created, True
        except IntegrityError:
            session.rollback()
            return session.query(model).filter_by(**kwargs).one(), False
I just wrote a fairly expansive blog post on all the details, but here are a few quick ideas on why I used this.
It unpacks to a tuple that tells you whether the object existed or not. This can often be useful in your workflow.
The function gives the ability to work with @classmethod decorated creator functions (and attributes specific to them).
The solution protects against race conditions when you have more than one process connected to the datastore.
EDIT: I've changed session.commit() to session.flush() as explained in this blog post. Note that these decisions are specific to the datastore used (Postgres in this case).
EDIT 2: I've updated the function to avoid using {} as a default argument value, as this is a typical Python gotcha (illustrated below). Thanks for the comment, Nigel! If you're curious about this gotcha, check out this Stack Overflow question and this blog post.
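For readers unfamiliar with that gotcha, a tiny standalone illustration (not part of the answer's code): a mutable default argument is created once and then shared across calls.
def append_to(item, bucket=[]):   # the same list object is reused on every call
    bucket.append(item)
    return bucket

append_to(1)   # [1]
append_to(2)   # [1, 2] -- not a fresh list, which is usually a surprise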
A modified version of erik's excellent answer
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm.exc import NoResultFound

def get_one_or_create(session,
                      model,
                      create_method='',
                      create_method_kwargs=None,
                      **kwargs):
    try:
        return session.query(model).filter_by(**kwargs).one(), True
    except NoResultFound:
        kwargs.update(create_method_kwargs or {})
        try:
            with session.begin_nested():
                created = getattr(model, create_method, model)(**kwargs)
                session.add(created)
            return created, False
        except IntegrityError:
            return session.query(model).filter_by(**kwargs).one(), True
Use a nested transaction to roll back only the addition of the new item instead of rolling back everything (see this answer for using nested transactions with SQLite).
Move create_method inside the with block. If the created object has relations and it is assigned members through those relations, it is automatically added to the session. E.g. create a book which has user_id and user as the corresponding relationship; doing book.user = <user object> inside create_method will add book to the session. This means that create_method must be inside the with block to benefit from an eventual rollback (sketched below). Note that begin_nested automatically triggers a flush.
Note that if you are using MySQL, the transaction isolation level must be set to READ COMMITTED rather than REPEATABLE READ for this to work. Django's get_or_create (and here) uses the same stratagem; see also the Django documentation.
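To make the relationship point concrete, a hypothetical sketch (the Book/User models and the create classmethod are illustrative, not from the original answer):
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship

class Book(Base):   # assumes a declarative Base and a mapped User model
    __tablename__ = 'book'
    id = Column(Integer, primary_key=True)
    title = Column(String, unique=True)
    user_id = Column(Integer, ForeignKey('user.id'))
    user = relationship('User')

    @classmethod
    def create(cls, user=None, **kwargs):
        book = cls(**kwargs)
        book.user = user   # this assignment already cascades book into the user's session
        return book

# hence create_method must run inside the nested transaction, e.g.:
# get_one_or_create(session, Book, create_method='create',
#                   create_method_kwargs={'user': some_user}, title='Dune')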
This SQLAlchemy recipe does the job nicely and elegantly.
The first thing to do is to define a function that is given a Session to work with, and which associates a dictionary with the Session that keeps track of the current unique keys.
def _unique(session, cls, hashfunc, queryfunc, constructor, arg, kw):
    cache = getattr(session, '_unique_cache', None)
    if cache is None:
        session._unique_cache = cache = {}
    key = (cls, hashfunc(*arg, **kw))
    if key in cache:
        return cache[key]
    else:
        with session.no_autoflush:
            q = session.query(cls)
            q = queryfunc(q, *arg, **kw)
            obj = q.first()
            if not obj:
                obj = constructor(*arg, **kw)
                session.add(obj)
        cache[key] = obj
        return obj
An example of utilizing this function would be in a mixin:
class UniqueMixin(object):
    @classmethod
    def unique_hash(cls, *arg, **kw):
        raise NotImplementedError()

    @classmethod
    def unique_filter(cls, query, *arg, **kw):
        raise NotImplementedError()

    @classmethod
    def as_unique(cls, session, *arg, **kw):
        return _unique(
            session,
            cls,
            cls.unique_hash,
            cls.unique_filter,
            cls,
            arg, kw
        )
And finally creating the unique get_or_create model:
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()
engine = create_engine('sqlite://', echo=True)
Session = sessionmaker(bind=engine)

class Widget(UniqueMixin, Base):
    __tablename__ = 'widget'
    id = Column(Integer, primary_key=True)
    name = Column(String, unique=True, nullable=False)

    @classmethod
    def unique_hash(cls, name):
        return name

    @classmethod
    def unique_filter(cls, query, name):
        return query.filter(Widget.name == name)

Base.metadata.create_all(engine)

session = Session()

w1, w2, w3 = Widget.as_unique(session, name='w1'), \
             Widget.as_unique(session, name='w2'), \
             Widget.as_unique(session, name='w3')

w1b = Widget.as_unique(session, name='w1')

assert w1 is w1b
assert w2 is not w3
assert w2 is not w1

session.commit()
The recipe goes deeper into the idea and provides different approaches but I've used this one with great success.
The closest semantically is probably:
def get_or_create(model, **kwargs):
    """SqlAlchemy implementation of Django's get_or_create."""
    session = Session()
    instance = session.query(model).filter_by(**kwargs).first()
    if instance:
        return instance, False
    else:
        instance = model(**kwargs)
        session.add(instance)
        session.commit()
        return instance, True
not sure how kosher it is to rely on a globally defined Session in sqlalchemy, but the Django version doesn't take a connection so...
The tuple returned contains the instance and a boolean indicating if the instance was created (i.e. it's False if we read the instance from the db).
Django's get_or_create is often used to make sure that global data is available, so I'm committing at the earliest point possible.
I slightly simplified @Kevin's solution to avoid wrapping the whole function in an if/else statement. This way there's only one return, which I find cleaner:
def get_or_create(session, model, **kwargs):
    instance = session.query(model).filter_by(**kwargs).first()
    if not instance:
        instance = model(**kwargs)
        session.add(instance)
    return instance
There is a Python package that has @erik's solution as well as a version of update_or_create(): https://github.com/enricobarzetti/sqlalchemy_get_or_create
Depending on the isolation level you have adopted, none of the above solutions may work reliably.
The best solution I have found is raw SQL in the following form:
INSERT INTO table (f1, f2, unique_f3)
SELECT 'v1', 'v2', 'v3'
WHERE NOT EXISTS (SELECT 1 FROM table WHERE unique_f3 = 'v3')
This is transactionally safe whatever the isolation level and the degree of parallelism are.
Beware: in order to make it efficient, it would be wise to have an INDEX for the unique column.
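For reference, a hedged sketch of issuing such a statement from SQLAlchemy (the instrument table and serial_number column are illustrative, not from the answer):
from sqlalchemy import text

stmt = text(
    "INSERT INTO instrument (serial_number) "
    "SELECT :serial "
    "WHERE NOT EXISTS (SELECT 1 FROM instrument WHERE serial_number = :serial)"
)
session.execute(stmt, {"serial": "ABC-123"})
session.commit()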
One problem I regularly encounter is when a field has a max length (say, String(40)) and you'd like to perform a get-or-create with a longer string; the above solutions will then fail.
Building off of the above solutions, here's my approach:
from sqlalchemy import Column, String
from sqlalchemy.exc import IntegrityError

def get_or_create(self, add=True, flush=True, commit=False, **kwargs):
    """
    Get an entity based on the kwargs, or create an entity with those kwargs.

    Params:
        add: (default True) should the instance be added to the session?
        flush: (default True) flush the instance to the session?
        commit: (default False) commit the session?
        kwargs: key, value pairs of parameters to lookup/create.

    Ex: SocialPlatform.get_or_create(**{'name': 'facebook'})
        returns --> existing record or, will create a new record
    ---------
    NOTE: I like to add this as a classmethod in the base class of my tables, so that
    all data models inherit the base class --> functionality is transmitted across
    all ORM-defined models. `session` is assumed to be a module-level (scoped) session.
    """
    # Truncate values if necessary
    for key, value in kwargs.items():
        # Only use strings
        if not isinstance(value, str):
            continue
        # Only use if it's a column
        my_col = getattr(self.__table__.columns, key, None)
        if not isinstance(my_col, Column):
            continue
        # Skip non strings again here
        if not isinstance(my_col.type, String):
            continue
        # Get the max length
        max_len = my_col.type.length
        if value and max_len and len(value) > max_len:
            # Update the value
            value = value[:max_len]
            kwargs[key] = value
    # -------------------------------------------------
    # Make the query...
    instance = session.query(self).filter_by(**kwargs).first()
    if instance:
        return instance
    else:
        # Max length isn't accounted for here.
        # The assumption is that auto-truncation will happen on the child model
        # or directly in the db.
        instance = self(**kwargs)
        # You'll usually want to add to the session
        if add:
            session.add(instance)
        # Navigate these with caution
        if add and commit:
            try:
                session.commit()
            except IntegrityError:
                session.rollback()
        elif add and flush:
            session.flush()
        return instance
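A hypothetical wiring sketch, not part of the answer: it assumes a declarative Base, an engine, and a module-level scoped session (the `session` name the method relies on), and attaches the helper to a model as a classmethod.
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import scoped_session, sessionmaker

session = scoped_session(sessionmaker(bind=engine))   # the module-level session used by get_or_create

class SocialPlatform(Base):
    __tablename__ = 'social_platform'
    id = Column(Integer, primary_key=True)
    name = Column(String(40), unique=True)

    get_or_create = classmethod(get_or_create)   # attach the helper defined above

platform = SocialPlatform.get_or_create(name='x' * 100)   # the value is truncated to 40 chars before lookup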
