Adding metadata (annotations) to Python variables

I have a sqlalchemy class User which inherits from Model.
class Model:
    @declared_attr
    def __tablename__(self):
        return self.__name__.lower()

    _id = Column(Integer, primary_key=True, autoincrement=True)
    in_utc = Column(BigInteger, default=time())
    out_utc = Column(BigInteger, default=config['MAX_UTC'])

    def to_dict(self):
        return {k: v for k, v in vars(self).items() if not isinstance(v, InstanceState)}
class User(declarative_base(), Model):
    email = Column(String)
    password = Column(String)
    name = Column(String)
The reason for the parent class is to add some columns that are common across the tables, as well as the to_dict() method, which creates a dictionary from the columns.
However, I do not want the password column to be included when calling user.to_dict().
Is there any way to annotate the password Column (like in Java reflection) so that to_dict() knows to ignore it?
For example:
class User(declarative_base(), Model):
    email = Column(String)
    [IgnoredInOutput()]
    password = Column(String)
    name = Column(String)
I've now overridden to_dict in the User class to remove the password column for this model.
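A rough sketch of that workaround (my assumption, not the exact code) would be to override to_dict only on this model:

class User(declarative_base(), Model):
    email = Column(String)
    password = Column(String)
    name = Column(String)

    def to_dict(self):
        d = super().to_dict()
        d.pop('password', None)  # drop the sensitive column for this model only
        return d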

Python 3.9 introduced Annotated type hints (typing.Annotated), which can be used to decorate types with context-specific metadata.
Annotated takes at least two arguments. The first argument is a regular type (e.g. str, int, etc.), and the remaining arguments are metadata. A type checker will only check the first argument, leaving the interpretation of the metadata to the application. A type hint like Annotated[str, 'ignored'] will be treated exactly like str by type checkers.
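As a minimal illustration of how such metadata can be read back at runtime (the class and field names here are made up):

from typing import Annotated, get_type_hints

class Config:
    timeout: Annotated[int, 'seconds'] = 30

hints = get_type_hints(Config, include_extras=True)
print(hints['timeout'].__metadata__)  # ('seconds',)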
For example, this would be a solution using the new Annotated metadata:
from sqlalchemy import Column, Integer, String, BigInteger
from sqlalchemy.ext.declarative import declarative_base, declared_attr
from sqlalchemy.orm.state import InstanceState
from typing import Annotated, _AnnotatedAlias

Base = declarative_base()

class Model:
    @declared_attr
    def __tablename__(self):
        return self.__name__.lower()

    _id = Column(Integer, primary_key=True, autoincrement=True)
    in_utc = Column(BigInteger)
    out_utc = Column(BigInteger)

    def to_dict(self):
        ignored_keys = [var for var, th in self.__annotations__.items()
                        if isinstance(th, _AnnotatedAlias) and 'ignored' in th.__metadata__]
        return {key: value for key, value in vars(self).items()
                if not isinstance(value, InstanceState) and key not in ignored_keys}

class User(Base, Model):
    email: str = Column(String)
    ignored: Annotated[str, 'ignored'] = Column(String)
    ignored_additional: Annotated[str, 'extra', 'ignored'] = Column(String)
    name = Column(String)

if __name__ == "__main__":
    user = User()
    user.name = "username"
    user.ignored = "ignored"
    user.ignored_additional = "ignored with additional metadata"
    user.email = "username@email.com"
    print(user.to_dict())
As you can see, type hints don't have to be included, and ordinary type hints are accepted without affecting the to_dict function. Other metadata besides 'ignored' can also be added.
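With the assignments above, the script should print something like {'name': 'username', 'email': 'username@email.com'}, with the two annotated attributes left out.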

Related

How to create an abstract subclass of SQLAlchemy's Table(Base) class

I am using SQLAlchemy to create tables in my project. I have a requirement where all these tables should have some specific attributes and functions. I want to create a structure such that all tables inherit from an abstract class which includes these attributes and functions.
Here's an example of what I want to achieve:
Base = declarative_base()

# pseudo
class Table(ABC, Base):
    # like @abstractattribute
    some_attribute = list()

    @staticmethod
    def some_func(self):
        pass

class Users(Table):
    __tablename__ = "users"
    user_id = Column(Integer, primary_key=True)
    username = Column(String, nullable=False)
    some_attribute = list()

    @staticmethod
    def some_func():
        do_something()
By doing this, I hope that I can use these classes in something like:
Base.metadata.create_all(engine)
while also being able to call:
Users.some_func()
I understand that this code wouldn't work as is, due to issues like having ABC and Base at the same time, not having an @abstractattribute, and needing to add __tablename__ and a primary-key Column to the class Table.
I am thinking of using a decorator to achieve this, but I am not sure how to implement it correctly. This is the outline of my idea:
class Table(ABC):
    some_attribute = None

    @staticmethod
    def some_func(self):
        pass

# create decorator
def sql_table():
    def decorator(abstract_class):
        class SQLTable(Base):  # How do I name the class correctly?
            __tablename__ = abstract_class.__dict__["__tablename__"]
            some_attribute = abstract_class.__dict__["some_attribute"]
            for name, obj in abstract_class.__dict__.items():
                if isinstance(obj, Column):
                    locals()[name] = obj
            # How do I get the some_func function?

@sql_table
class Users(Table):
    __tablename__ = "users"
    user_id = Column(Integer, primary_key=True)
    username = Column(String, nullable=False)
    some_attribute = "some_val"

    @staticmethod
    def some_func():
        do_something()
Any help or suggestions on how to implement this (not necessarily with decorators) would be greatly appreciated.
Thanks to @snakecharmerb and @ljmc I have found a solution that works for me, although there seem to be many ways one can achieve this.
The solution that works for me is:
from sqlalchemy.ext.declarative import declarative_base, declared_attr
from sqlalchemy import Column, Integer, String

Base = declarative_base()

class Table(Base):
    __abstract__ = True

    @declared_attr
    def __tablename__(cls) -> str:  # so I don't have to specify it any more
        return cls.__name__.lower()

    some_attribute = set()  # this is the default

    @staticmethod
    def some_func():  # define default behavior (or pass)
        do_something()

class Users(Table):
    # define columns as usual
    user_id = Column(Integer, primary_key=True)
    username = Column(String, nullable=False)

    some_attribute = set(["a"])  # overwrite the default

    def some_func():  # overwrite the default behavior
        do_something_else()
Now, this could be improved upon by specifying a type for some_attribute (typing is awesome).
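A rough sketch of what that typing might look like (my addition, not part of the original solution); ClassVar makes it explicit that the attribute is a plain class-level value and not a mapped column:

from typing import ClassVar, Set

class Table(Base):
    __abstract__ = True
    some_attribute: ClassVar[Set[str]] = set()  # typed default shared by subclasses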

More succinct initialization for SQLAlchemy instance

It's my first attempt at SQLAlchemy. I have a JSON file with my user information and I would like to put it into a SQLite3 database file. It works, but I find the instance initialization verbose, since there are many columns in the table, as you can see below.
Is it possible to use a dictionary as input to initialize User()? Something like a = User(usr)?
import json
from sqlalchemy import *
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('sqlite:///tutorial.db', echo=True)
Base = declarative_base()

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    bbs_id = Column(String)
    name = Column(String)
    sex = Column(String, nullable=False)
    city = Column(String)
    state = Column(String)
    class_type = Column(String, nullable=False)
    class_id = Column(String, nullable=False)
    latitude = Column(Float)
    longitude = Column(Float)

    def __repr__(self):
        return "<User(bbs_id='%s', name='%s')>" % (self.bbs_id, self.name)

Base.metadata.create_all(engine)

with open('mydata.json') as fin:
    usrs = json.load(fin)

usr = usrs[0]
a = User(id=usr['id'], bbs_id=usr['bbs_id'], name=usr['name'])
If you know the property names in the JSON object match the column names of the Python model, you can just change:
a = User(id=usr['id'], bbs_id=usr['bbs_id'], name=usr['name'])
to:
a = User(**usr)
Double-star/dict unpacking passes each key/value pair of the dict as if it were an argument being passed by keyword. Since you didn't override __init__ for your model, it already allows and expects the arguments to be optional and passed by keyword, so this lines up perfectly.
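One caveat worth noting (my addition, not part of the original answer): if the JSON objects can contain keys that are not mapped columns, User(**usr) will raise a TypeError, so you may want to filter the dict first:

valid_keys = {c.name for c in User.__table__.columns}
a = User(**{k: v for k, v in usr.items() if k in valid_keys})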

SQLAlchemy polymorphic update

I have a class Parent and two subclasses of it, Child and ChildOne.
I am able to add data to each of the tables and read the data back.
Now I want to move a row between the subclass tables: when I update type from "young" to "old", I want the row to be deleted from the child_one table and added to the child table, but without losing the id value (which is the primary key).
Is there any way SQLAlchemy itself can handle this, or any other idea on how to achieve this?
from sqlalchemy import create_engine, ForeignKey
from sqlalchemy import Column, Integer, String, case, Boolean
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base, declared_attr

engine = create_engine('sqlite:///testing.db', echo=True)
Base = declarative_base()

class Parent(Base):
    __tablename__ = "parent"
    id = Column(Integer, primary_key=True)
    name = Column(String, unique=True)
    type = Column(String)

    def __init__(self, name):
        self.name = name
        self.type = type

    __mapper_args__ = {
        'polymorphic_identity': 'parent',
        'polymorphic_on': case(
            [(type == "young", "child_one"),
             (type == "old", "child")]
        )
    }

class Child(Parent):
    __tablename__ = "child"
    id = Column(Integer, ForeignKey('parent.id'), primary_key=True)
    school = Column(String, default="some high school")

    def __init__(self, name, school):
        self.name = name
        self.type = type
        self.school = school

    __mapper_args__ = {
        'polymorphic_identity': 'child'
    }

class ChildOne(Parent):
    __tablename__ = "child_one"
    id = Column(Integer, ForeignKey('parent.id'), primary_key=True)
    school = Column(String, default="UCLA")
    mode = Column(Boolean, default=1)
    bool = Column(Boolean, default=0)

    def __init__(self, name, type, school, mode, bool):
        self.name = name
        self.type = type
        self.school = school
        self.mode = mode
        self.bool = bool

    __mapper_args__ = {
        'polymorphic_identity': 'child_one'
    }
from sqlalchemy import event

# standard decorator style
@event.listens_for(Child, 'after_update')
def receive_after_update(mapper, connection, target):
    "listen for the 'after_update' event"
    if target.type == "young":
        ChildOne(id=target.id)
But honestly, for something like this you should just create triggers in the database. It doesn't require overly complex SQLAlchemy code.
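If you would rather keep it in Python, a rough sketch of moving the row explicitly with Core statements might look like this (my sketch, assuming SQLAlchemy 1.4+ and a Session bound to the same engine; move_to_child is a made-up helper, not part of the original answer):

from sqlalchemy import select, insert, delete, update
from sqlalchemy.orm import Session

def move_to_child(session: Session, row_id: int):
    # read the child_one row that is about to be moved
    src = session.execute(
        select(ChildOne.__table__).where(ChildOne.__table__.c.id == row_id)
    ).mappings().one()
    # insert a matching child row, reusing the same primary key
    session.execute(insert(Child.__table__).values(id=row_id, school=src["school"]))
    # drop the child_one row and flip the discriminator on the parent row
    session.execute(delete(ChildOne.__table__).where(ChildOne.__table__.c.id == row_id))
    session.execute(
        update(Parent.__table__).where(Parent.__table__.c.id == row_id).values(type="old")
    )
    session.commit()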

How to use descriptors in sqlalchemy.orm.synonym

I have code like this working fine:
def get_timestamp(ts):
    return datetime.utcfromtimestamp(ts)

def set_timestamp(dt):
    return time.mktime(dt.timetuple())

class Group(Base):
    __tablename__ = 'group'
    _created = Column('created', Integer, nullable=False)

    @property
    def created(self):
        return get_timestamp(self._created)

    @created.setter
    def created(self, value):
        self._created = set_timestamp(value)
I want some code like this, but it's not working:
created = synonym('_created',
                  descriptor=property(get_timestamp,
                                      set_timestamp))
That doesn't work, because the descriptor always passes in self as the first parameter.
I'd of course like to use get_timestamp and set_timestamp across my project, so I'm not going to make them methods of the class but standalone functions.
How can I achieve this?
EDIT: I would take Option2, and still open to other answers.
Option-1: the code below should work (you do not need to be inside a class in order to define functions that take self):
def pget_timestamp(self):
    return datetime.utcfromtimestamp(self._created)

def pset_timestamp(self, dt):
    self._created = time.mktime(dt.timetuple())

class Group(Base):
    __tablename__ = 'group'
    id = Column(Integer, primary_key=True)
    _created = Column('created', Integer, nullable=False)

    created = synonym('_created',
                      descriptor=property(pget_timestamp, pset_timestamp),
                      )
Option-2: If you do need the same on many classes, leverage Mixins
from sqlalchemy.ext.declarative import declared_attr

class _CreatedMixin(object):
    _created = Column('created', Integer, nullable=False)

    def pget_timestamp(self):
        return datetime.utcfromtimestamp(self._created)

    def pset_timestamp(self, dt):
        self._created = time.mktime(dt.timetuple())

    @declared_attr
    def created(cls):
        return synonym('_created',
                       descriptor=property(cls.pget_timestamp, cls.pset_timestamp),
                       )

class Group(_CreatedMixin, Base):
    # note: adding _CreatedMixin to the bases defines both the column and the synonym
    __tablename__ = 'group'
    id = Column(Integer, primary_key=True)
Alternatively, if this is for all your classes, you could make _CreatedMixin a base class for all your models:
Base = declarative_base(engine, cls=_CreatedMixin)
class Group(Base):
    __tablename__ = 'group'
    id = Column(Integer, primary_key=True)
Option-3: You could do any of the above using Hybrid Attributes
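A minimal sketch of the hybrid-attribute approach (my addition; this works at instance level only, and an additional @created.expression would be needed to filter on created in queries):

import time
from datetime import datetime
from sqlalchemy.ext.hybrid import hybrid_property

class Group(Base):
    __tablename__ = 'group'
    id = Column(Integer, primary_key=True)
    _created = Column('created', Integer, nullable=False)

    @hybrid_property
    def created(self):
        return datetime.utcfromtimestamp(self._created)

    @created.setter
    def created(self, dt):
        self._created = time.mktime(dt.timetuple())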
Note: keep your get/set functions in sync: either both or neither should use UTC. As currently written (unless your local time zone is UTC), the value you assign to created will not come back as the same value.
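For example, one way to make the pair symmetric (my suggestion, not from the original answer) is to use calendar.timegm, which interprets the time tuple as UTC and therefore round-trips with utcfromtimestamp:

import calendar
from datetime import datetime

def pget_timestamp(self):
    return datetime.utcfromtimestamp(self._created)

def pset_timestamp(self, dt):
    # calendar.timegm is the UTC counterpart of time.mktime
    self._created = calendar.timegm(dt.timetuple())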
I'm now using a different implementation. It's not related to the original title, but in case you need it: use sqlalchemy.types.TypeDecorator (see "Defining a table with sqlalchemy with a mysql unix timestamp").
import time
from datetime import datetime
from sqlalchemy.types import TypeDecorator

class UTCTimestampType(TypeDecorator):
    impl = Integer

    def process_bind_param(self, value, dialect):
        if value is None:
            return None  # support nullability
        elif isinstance(value, datetime):
            return int(time.mktime(value.timetuple()))
        raise ValueError("Can operate only on datetime values. Offending value type: {0}".format(type(value).__name__))

    def process_result_value(self, value, dialect):
        if value is not None:  # support nullability
            return datetime.fromtimestamp(float(value))

class ModelA(Base):
    __tablename__ = 'model_a'
    id = Column(Integer, primary_key=True)
    created = Column(UTCTimestampType, nullable=False)
There are issues with Alembic autogenerate (see "Alembic: How to migrate custom type in a model?"). In the generated migration, manually change the line

sa.Column('created', sa.UTCTimestampType(), nullable=False),

to

sa.Column('created', sa.Integer(), nullable=False),
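Alternatively (my note, not from the original answer), you can import the custom type into the generated migration module and reference it directly instead of falling back to a plain Integer:

# at the top of the generated migration file; the module path here is hypothetical
from myapp.types import UTCTimestampType

sa.Column('created', UTCTimestampType(), nullable=False),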

Validation in SQLAlchemy

How can I get a "required" validator in SQLAlchemy? I just want to be sure the user has filled in all required fields of a form. I use PostgreSQL, but that shouldn't matter, since the tables are created from the objects in my models.py file:
from sqlalchemy import (
    Column,
    Integer,
    Text,
    DateTime,
)
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import (
    scoped_session,
    sessionmaker,
)
from zope.sqlalchemy import ZopeTransactionExtension
from pyramid.security import (
    Allow,
    Everyone,
)

Base = declarative_base()

class Article(Base):
    """ The SQLAlchemy declarative model class for an Article object. """
    __tablename__ = 'article'
    id = Column(Integer, primary_key=True)
    name = Column(Text, nullable=False, unique=True)
    url = Column(Text, nullable=False, unique=True)
    title = Column(Text)
    preview = Column(Text)
    content = Column(Text)
    cat_id = Column(Integer, nullable=False)
    views = Column(Integer)
    popular = Column(Integer)
    created = Column(DateTime)

    def __unicode__(self):
        return unicode(self.name)
So nullable=False doesn't seem to work, because the records get added anyway with empty fields. I can of course set the restriction at the database level by marking name as NOT NULL, for example. But surely there is something for validation in SQLAlchemy, isn't there? I came from the Yii PHP framework, where this isn't a problem at all.
By empty fields I guess you mean an empty string rather than a NULL. A simple method is to add validation, e.g.:
from sqlalchemy.orm import validates

class Article(Base):
    ...
    name = Column(Text, unique=True)
    ...

    @validates('name')
    def validate_name(self, key, value):
        assert value != ''
        return value
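With this in place, assigning an empty string to name (including via the constructor) raises an AssertionError immediately, before anything is flushed to the database. Note that @validates only fires when the attribute is actually set; a field that is never assigned at all is still only caught by nullable=False at flush time.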
To implement it at a database level you could also use a check constraint, as long as the database supports it:
class Article(Base):
    ...
    name = Column(Text, CheckConstraint("name != ''"))
    ...
