Can I use context-sensitive functions in SQLAlchemy with parameters? - python

Here's a little table for holding IP address ranges in the form of a start address, end address, and number of IPs within the range:
class IpRange(Base):
    __tablename__ = 'ip_range'
    ip_range_id = Column(Integer, Sequence('ip_range_id_seq'), primary_key=True)
    start_ip = Column(String(15))
    end_ip = Column(String(15))
    num_ips = Column(Integer)
What I'd love to do is be able to create an object using a variety of styles, and have the class figure out how to populate its own fields.
foo = IpRange(ip='192.168.0.1')
foo = IpRange(ip='192.168.0.0/24')
foo = IpRange(ip='192.168.0.0-192.168.0.255')
It wouldn't be too hard to write a function that could parse various IP address/range notations:
def parseIp(desired_format):
    # ... stuff to parse any valid IP address/network format ...
    if desired_format == 'start':
        return start_ip
    if desired_format == 'end':
        return end_ip
    if desired_format == 'num_ips':
        return num_ips
And I was hoping I could then use the default method for my Columns to get the data that each column needed:
class IpRange(Base):
    __tablename__ = 'ip_range'
    ip_range_id = Column(Integer, Sequence('ip_range_id_seq'), primary_key=True)
    start_ip = Column(String(15), default=parseIp('start'))
    end_ip = Column(String(15), default=parseIp('end'))
    num_ips = Column(Integer, default=parseIp('num_ips'))
However, that doesn't work in SQLAlchemy. The documentation speaks of context-sensitive default functions, but the syntax does not allow parameters to be passed to the function. So even though I could pass the function as default=parseIp, I wouldn't be able to tell it what kind of return value I'm looking for.
Is there a way to do this within Column specification for SQLAlchemy?
Or as an alternative idea, should I turn parseIp into a helper script that just generates the new IpRange object itself and returns it back to the caller? Something like:
def parseIp(ipstring):
    # ... parse data ...
    return IpRange(start_ip=parsed_start_ip, end_ip=parsed_end_ip, num_ips=parsed_num_ips)
>>> ipobj = parseIp('192.168.0.0/24')

Just override the constructor. Example:
class IpRange(Base):
    def __init__(self, ip=None, **kwargs):
        if ip is not None:
            self.start_ip, self.end_ip = my_super_awesome_cidr_parser(ip)
        else:
            super(IpRange, self).__init__(**kwargs)
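For the parsing itself, Python's ipaddress module (stdlib in Python 3; a backport exists for Python 2) covers single addresses and CIDR notation, so my_super_awesome_cidr_parser doesn't have to be written from scratch. A sketch of the idea under those assumptions, with the dashed start-end form handled by hand:
import ipaddress

class IpRange(Base):
    __tablename__ = 'ip_range'
    ip_range_id = Column(Integer, Sequence('ip_range_id_seq'), primary_key=True)
    start_ip = Column(String(15))
    end_ip = Column(String(15))
    num_ips = Column(Integer)

    def __init__(self, ip=None, **kwargs):
        super(IpRange, self).__init__(**kwargs)
        if ip is not None:
            if '-' in ip:
                # '192.168.0.0-192.168.0.255'
                start, end = (ipaddress.ip_address(part.strip()) for part in ip.split('-'))
            else:
                # '192.168.0.1' or '192.168.0.0/24'
                net = ipaddress.ip_network(ip, strict=False)
                start, end = net[0], net[-1]
            self.start_ip = str(start)
            self.end_ip = str(end)
            self.num_ips = int(end) - int(start) + 1

foo = IpRange(ip='192.168.0.0/24')   # start_ip='192.168.0.0', end_ip='192.168.0.255', num_ips=256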

Related

Is it possible to have two or more __init__ in python model?

I want to create two __init__s inside a Python class that would initialize different attributes of that class.
The question was asked because I was trying to implement a partial update to a record in the database using the SQLAlchemy ORM, which maps a Python class and its attributes to a database table and its columns, respectively. I wanted to update the user_score column in the users table, so I created an instance of the User object inside the update function that handles the PATCH request, but it wasn't updating. So I thought I could create another __init__ that would be responsible for handling only that attribute, user_score.
class User(db.Model):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    score = Column(Integer, nullable=False)

    def __init__(self, name, score):
        self.name = name
        self.score = score

    def __init__(self, score):
        self.score = score

    def insert(self):
        db.session.add(self)
        db.session.commit()

    def update(self):
        db.session.commit()

    def format(self):
        return {
            'id': self.id,
            'name': self.name,
            'score': self.score
        }
Only one __init__ is possible, but you can have it do different things like:
def __init__(self, optA: bool = False, optB: int = 0):
    self.optA = optA
    self.optB = optB
    if self.optA:
        # do something different if optA is true (i.e., not the default)
        ...
    if self.optB != 0:
        # do something different if optB is not the default of 0
        ...
    # etc.
So by giving sensible defaults to optional parameters, you can have an __init__ whose behavior differs based on which parameters are supplied and what their values are. Lots of good references on this out there on the internet :)
So, if you just instantiate the object with no parameters, you get the defaults as shown, but if you do supply them, the values you supply override the defaults and trigger the behavior in the if statements inside.
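Applied to the asker's model, a sketch of that idea (making name optional is my assumption; note the column is declared nullable=False, so a row still needs a name before it is inserted):
class User(db.Model):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    score = Column(Integer, nullable=False)

    def __init__(self, score, name=None):
        self.score = score
        if name is not None:
            self.name = name

full = User(score=10, name='alice')   # behaves like the original two-argument __init__
partial = User(score=5)               # the "score only" case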

SQLAlchemy ORM add to session with dynamic column

I have 2 tables (not the true setup of my database).
tableclass.py:
class TableMeaningEN(sqla_base):
    __tablename__ = 'MeaningEN'
    id = sqla.Column(sqla.Integer, primary_key=True)
    meaning = sqla.Column(sqla.String, primary_key=True)

class TableReadingON(sqla_base):
    __tablename__ = 'ReadingON'
    id = sqla.Column(sqla.Integer, primary_key=True)
    reading = sqla.Column(sqla.String, primary_key=True)
Different column names
As you can see, both have a column id, but TableMeaningEN has meaning and TableReadingON has reading.
Normally (assuming you already have a session) you would add something like this:
session.add(TableMeaningEN(id=1, meaning='test'))
However, I want to add entries to the tables dynamically, so I have:
import tableclass as tc

for t_name in ['MeaningEN', 'ReadingON']:
    session.add(getattr(tc, 'Table{}'.format(t_name))(id=1, ??='test'))
Question
How do I solve the problem that ?? is in one table meaning and in the other reading?
Tried:
columns = sqla.inspect(getattr(tc, 'Table{}'.format(overwrite))).columns.keys()
session.add(getattr(tc, 'Table{}'.format(t_name))(id=1, columns[1]='test'))
I tried this, but that's not allowed (a keyword argument can't be an expression).
If the tables really differ by just one attribute, you could give both a constructor that also takes positional arguments, such as:
class TableMeaningEN(sqla_base):
    def __init__(self, id, meaning):
        self.id, self.meaning = id, meaning

# similar __init__ for the other class/table
# then use the following:
session.add(getattr(tc, 'Table{}'.format(t_name))(1, 'test'))
An alternative would be to build the keyword arguments dynamically, assuming there is a naming convention:
import tableclass as tc

for t_name in ['MeaningEN', 'ReadingON']:
    cls = getattr(tc, 'Table{}'.format(t_name))
    fld_name = t_name.lower()[:-2]
    kw = {'id': 1, fld_name: 'test'}
    session.add(cls(**kw))
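If there is no naming convention to rely on, the non-id column name can be discovered by inspection, which is roughly what the attempt above was reaching for. A sketch, assuming each table has exactly one column besides id:
import sqlalchemy as sqla
import tableclass as tc

for t_name in ['MeaningEN', 'ReadingON']:
    cls = getattr(tc, 'Table{}'.format(t_name))
    # pick the first mapped column that isn't 'id'
    fld_name = next(name for name in sqla.inspect(cls).columns.keys() if name != 'id')
    session.add(cls(**{'id': 1, fld_name: 'test'}))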

SQLAlchemy filter always returns false

I have a simple player entity:
class Player(Base):
    __tablename__ = 'player'
    _id = Column('id', SmallInteger, primary_key=True)
    _nickName = Column('nick_name', String)

    def __init__(self, nickName):
        self._nickName = nickName

    @property
    def id(self):
        return self._id

    @property
    def nickName(self):
        return self._nickName.decode(encoding='UTF-8')

    @nickName.setter
    def nickName(self, nickName):
        self._nickName = nickName
When I do:
players = session.query(Player).filter(Player.nickName == 'foo')
and I print the players variable, I get this:
SELECT player.id AS player_id, player.nick_name AS player_nick_name
FROM player
WHERE false
Obviously, when I add .first() at the end of the session query, the result is None.
I have tried with filter_by() and get the same result.
Any help is welcome.
While using @hybrid_property will fix this in the general case, you shouldn't need to be decoding manually at all. Just set the column type to Unicode instead of String and, assuming your server plays nice, you should correctly get back a unicode object.
You also don't need the id property at all.
So all you should need for this class is:
class Player(Base):
    __tablename__ = 'player'
    id = Column(SmallInteger, primary_key=True)
    nickName = Column(Unicode)
(Declarative generates the database column names and a keyword-argument __init__ for you automatically.)
If there's some reason your database isn't handling Unicode correctly, well, that's a different problem that we'd love to help you fix. :)
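With that simplified mapping, the original query should just work; a quick sketch of the expected behaviour:
players = session.query(Player).filter(Player.nickName == u'foo')
print(players)        # now renders a comparison against the nickName column instead of WHERE false
player = players.first()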
You cannot use regular @propertys in query filters. Use a @hybrid_property instead:
from sqlalchemy.ext.hybrid import hybrid_property

    @hybrid_property
    def nickName(self):
        return self._nickName.decode(encoding='UTF-8')

    @nickName.setter
    def nickName(self, nickName):
        self._nickName = nickName
This makes Player.nickName (that is, the attribute on the class itself, not on an instance) usable in SQL expressions.
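If the Python-side getter does something with no SQL equivalent (like the .decode() call here), hybrid_property also lets the class-level side get its own expression; a sketch of that variant (imports and the declarative Base from the question assumed):
from sqlalchemy import Column, SmallInteger, String
from sqlalchemy.ext.hybrid import hybrid_property

class Player(Base):
    __tablename__ = 'player'
    _id = Column('id', SmallInteger, primary_key=True)
    _nickName = Column('nick_name', String)

    @hybrid_property
    def nickName(self):
        return self._nickName.decode(encoding='UTF-8')

    @nickName.setter
    def nickName(self, nickName):
        self._nickName = nickName

    @nickName.expression
    def nickName(cls):
        # at the class level, compare directly against the underlying column
        return cls._nickName

session.query(Player).filter(Player.nickName == 'foo')   # WHERE player.nick_name = :param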

SQLAlchemy Declarative: Adding a static text attribute to a column

I am using: SQLAlchemy 0.7.9 and Python 2.7.3 with Bottle 0.11.4. I am an amateur at python.
I have a class (with many columns) derived from declarative base like this:
class Base(object):
    @declared_attr
    def __tablename__(cls):
        return cls.__name__.lower()

    id = Column(Integer, primary_key=True)

    def to_dict(self):
        serialized = dict((column_name, getattr(self, column_name))
                          for column_name in self.__table__.c.keys())
        return serialized

Base = declarative_base(cls=Base)
class Case(Base):
    version = Column(Integer)
    title = Column(String(32))
    plausible_dd = Column(Text)
    frame = Column(Text)
    primary_task = Column(Text)
    secondary_task = Column(Text)
    eval_objectives = Column(Text)
    ...
I am currently using this 'route' in Bottle to dump out a row/class in json like this:
@app.route('/<name>/:record')
def default(name, record, db):
    myClass = getattr(sys.modules[__name__], name)
    parms = db.query(myClass).filter(myClass.id == record)
    result = json.dumps([parm.to_dict() for parm in parms])
    return result
My first question is: How can I have each column have some static text that I can use as a proper name such that I can iterate over the columns and get their values AND proper names? For example:
class Case(Base):
    version = Column(Integer)
    version.pn = "Version Number"
My second question is: Does the following do what I am looking for? I have seen examples of this, but I don't understand the explanation.
Example from sqlalchemy.org:
id = Column("some_table_id", Integer)
My interpretation of the example:
version = Column("Version Number", Integer)
Obviously I don't want a table column to be created. I just want the column to have an "attribute" in the generic sense. Thank you in advance.
The info dictionary could be used for that. In your model class, define it like this:
class Case(Base):
    version = Column(Integer, info={'description': 'Version Number'})
Then it can be accessed as a property of the table column:
desc = Case.__table__.c.version.info.get('description', '<no description>')
Update
Here's one way to iterate through all the columns in the table and get their names, values and descriptions. This example uses dict comprehension, which is available since Python 2.7.
class Case(Base):
    # Column definitions go here...

    def as_dict(self):
        return {c.name: (getattr(self, c.name), c.info.get('description'))
                for c in self.__table__.c}
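A hypothetical usage sketch, showing the (value, description) pairs as_dict yields per column (columns without an info description fall back to their own name):
case = db.query(Case).filter(Case.id == 1).first()
for name, (value, description) in case.as_dict().items():
    print(name, description or name, value)   # e.g. version  "Version Number"  3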

Python dicts in sqlalchemy

I would like to load/save a dict to/from my sqlite DB, but am having some problems figuring out a simple way to do it. I don't really need to be able to filter, etc., based on the contents so a simple conversion to/from string is fine.
The next-best thing would be foreign keys. Please don't post links to huge examples; my head would explode if I ever set eyes on any of those.
The SQLAlchemy PickleType is meant exactly for this.
class SomeEntity(Base):
    __tablename__ = 'some_entity'
    id = Column(Integer, primary_key=True)
    attributes = Column(PickleType)

# Just set the attribute to save it
s = SomeEntity(attributes={'baked': 'beans', 'spam': 'ham'})
session.add(s)
session.commit()

# If mutable=True on PickleType (the default) SQLAlchemy automatically
# notices modifications.
s.attributes['parrot'] = 'dead'
session.commit()
You can change the serialization mechanism by swapping the pickler out for something else that has dumps() and loads() methods. The underlying storage mechanism can be changed by subclassing PickleType and overriding the impl attribute:
class TextPickleType(PickleType):
    impl = Text

import json

class SomeOtherEntity(Base):
    __tablename__ = 'some_other_entity'
    id = Column(Integer, primary_key=True)
    attributes = Column(TextPickleType(pickler=json))
You can create a custom type by subclassing sqlalchemy.types.TypeDecorator to handle serialization and deserialization to Text.
An implementation might look like:
import json
import sqlalchemy
from sqlalchemy.types import TypeDecorator

SIZE = 256

class TextPickleType(TypeDecorator):
    impl = sqlalchemy.Text(SIZE)

    def process_bind_param(self, value, dialect):
        if value is not None:
            value = json.dumps(value)
        return value

    def process_result_value(self, value, dialect):
        if value is not None:
            value = json.loads(value)
        return value
Example usage:
class SomeModel(Base):
    __tablename__ = 'the_table'
    id = Column(Integer, primary_key=True)
    json_field = Column(TextPickleType())

s = SomeModel(json_field={'baked': 'beans', 'spam': 'ham'})
session.add(s)
session.commit()
This is outlined in an example in the SQLAlchemy docs, which also shows how to track mutations of that dictionary.
This approach should work for all versions of Python, whereas simply passing json as the value to the pickler argument of PickleType will not work correctly, as Alex Grönholm points out in his comment on another answer.
SQLAlchemy has a built-in JSON type that you can use:
attributes = Column(JSON)
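One caveat, if in-place edits matter (the same applies to PickleType without mutability tracking): a plain JSON column will not notice changes made to an already-loaded dict, but wrapping the type with MutableDict makes SQLAlchemy track them. A sketch, assuming a backend with JSON support; the Preferences model here is just an illustration:
from sqlalchemy import Column, Integer, JSON
from sqlalchemy.ext.mutable import MutableDict

class Preferences(Base):
    __tablename__ = 'preferences'
    id = Column(Integer, primary_key=True)
    attributes = Column(MutableDict.as_mutable(JSON))

p = Preferences(attributes={'spam': 'ham'})
session.add(p)
session.commit()
p.attributes['parrot'] = 'dead'   # detected thanks to MutableDict
session.commit()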
If you need to map a 1-N relation and map it as dict rather then list, then read Custom Dictionary-Based Collections
But if you mean a plain field, then you can have a DB field of type string which is mapped to your Python object, and on the same Python object provide a property that acts as a kind of proxy, exposing that mapped string field as a dict.
Code example (not tested):
class MyObject(object):
    # fields (mapped automatically by sqlalchemy using mapper(...))
    MyFieldAsString = None

    def _get_MyFieldAsDict(self):
        if self.MyFieldAsString:
            return eval(self.MyFieldAsString)
        else:
            return {}  # be careful with None and empty dict

    def _set_MyFieldAsDict(self, value):
        if value:
            self.MyFieldAsString = str(value)
        else:
            self.MyFieldAsString = None

    MyFieldAsDict = property(_get_MyFieldAsDict, _set_MyFieldAsDict)
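A small caution on that sketch: eval() will execute whatever is stored in the string, so if the values are plain dicts, json is a safer drop-in for the same property idea; a variant under that assumption:
import json

class MyObject(object):
    MyFieldAsString = None   # mapped string column, as above

    def _get_MyFieldAsDict(self):
        return json.loads(self.MyFieldAsString) if self.MyFieldAsString else {}

    def _set_MyFieldAsDict(self, value):
        self.MyFieldAsString = json.dumps(value) if value else None

    MyFieldAsDict = property(_get_MyFieldAsDict, _set_MyFieldAsDict)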
You can also map the dict keys to ordinary columns and call a save() method (note that save() is not part of core SQLAlchemy; it would come from a base-class mixin or framework layer).
For example
class SomeModel(Base):
    __tablename__ = 'the_table'
    id = Column(Integer, primary_key=True)
    baked = Column(String, nullable=True)
    spam = Column(String, nullable=True)

s = {'baked': 'beans', 'spam': 'ham'}
SomeModel(**s).save()
