I am trying to use the SQLAlchemy ORM to create a class Field that describes all fields in my database:
from sqlalchemy import Column, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Field(Base):
    __tablename__ = 'fields'
    __table_args__ = {'schema': 'SCM'}

    id = Column(String(20), primary_key=True)
The issue is that the table fields describes different fields in different schemas, e.g.
SCM.fields
TDN.fields
...
I need class Field to
be initialized with a fieldset object before records can be read from the db, and
have its schema determined by fieldset.get_schema() before table <schema>.fields is read.
Something like this:
session.query(Field(fieldset)).filter(Field.id == 'some field')
However, adding

def __init__(self, fieldset):
    pass

to class Field results in

__init__() takes 1 positional argument...
I could lump all the fields tables into one schema and add a column 'schema_name', but I would still need Field to have a link to its fieldset.
Can this be done with the SQLAlchemy ORM, or should I switch to SQLAlchemy Core, where I would have more control over object instantiation?
So the problem is solvable, and the solution is documented as Augmenting the Base:
from sqlalchemy import Column, String
from sqlalchemy.ext.declarative import declared_attr, declarative_base

class Field:
    __tablename__ = 'fields'

    @declared_attr
    def __table_args__(cls):
        # schema will be part of the class name, which works for me
        # Example: class FieldSCM --> Field with schema SCM
        return dict(schema=cls.__name__[5:].upper())

    id = Column(String(20), primary_key=True)

Field = declarative_base(cls=Field)

class FieldSet:
    def __init__(self, schema):
        self.fieldtype = type('Field' + schema.upper(), (Field,), {})
Proof of concept:
>>> FieldSet('ork').fieldtype.__table__
Table('fields', MetaData(bind=None), Column('id', String(length=20), table=<fields>, primary_key=True, nullable=False), schema='ORK')
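The core of this trick is Python's built-in three-argument type() call, which needs no SQLAlchemy at all. A pure-stdlib sketch of the same dynamic-subclass pattern (the schema() helper here is made up for illustration; it just mirrors the answer's cls.__name__[5:].upper() convention):

```python
# type(name, bases, namespace) builds a new class at runtime, so each
# FieldSet can manufacture its own Field subclass whose name encodes
# the schema (class FieldSCM -> schema 'SCM').
class Field:
    @classmethod
    def schema(cls):
        # mirrors the answer's cls.__name__[5:].upper() convention
        return cls.__name__[5:].upper()

class FieldSet:
    def __init__(self, schema):
        self.fieldtype = type('Field' + schema.upper(), (Field,), {})

fs = FieldSet('ork')
print(fs.fieldtype.__name__)  # FieldORK
print(fs.fieldtype.schema())  # ORK
```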
I am new to SQLAlchemy and am trying to build a database in Heroku Postgres. I get this error:
ArgumentError: Mapper mapped class Users->users could not assemble any primary key columns for mapped table 'users'
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import (Column, Integer, BigInteger, String, Sequence, TIMESTAMP, Boolean, JSON)
from sqlalchemy import sql
from sqlalchemy import Table, Column, Integer, String, MetaData, Sequence, create_engine

class Users(Base):
    __tablename__ = 'users'

    meta = MetaData()
    users_table = Table(
        'users', meta,
        Column('id', Integer, primary_key=True),
        Column('name', String(50)),
        Column('fullname', String(50)),
        Column('phone', Integer),
    )
    meta.create_all(engine)

    def __repr__(self):
        return "<User(id='{}', fullname='{}', username='{}')>".format(
            self.id, self.full_name, self.username)
The main reason this does not work is that you try to create the table before the class Users is fully defined, and the table you declare is not assigned to the correct attribute, so SQLAlchemy cannot find the id column.
I have tried to outline a more idiomatic declarative approach below:
# This implicitly creates metadata; you do not need to make an explicit one.
Base = declarative_base()

class Users(Base):
    __tablename__ = 'users'

    # Using a Table explicitly is not required here.
    # Just declare the columns as attributes.
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    fullname = Column(String(50))
    phone = Column(Integer)

    # Don't do this here because Users is not done:
    # meta.create_all(engine)

    def __repr__(self):
        return "<User(id='{}', fullname='{}', name='{}')>".format(
            self.id, self.fullname, self.name)

# Do this after classes are declared, and use Base's metadata.
Base.metadata.create_all(engine)
References:
If you really want to set a table you have to set it with __table__ and use Base.metadata:
https://docs.sqlalchemy.org/en/13/orm/extensions/declarative/table_config.html#using-a-hybrid-approach-with-table
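For completeness, a sketch of that hybrid approach (assuming SQLAlchemy 1.4+ and an in-memory SQLite engine for illustration): the Table is assigned to __table__ and built against Base.metadata, so create_all() can find it once the class is fully declared.

```python
from sqlalchemy import Column, Integer, String, Table, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Users(Base):
    # the Table must go on __table__, and must use Base.metadata
    __table__ = Table(
        'users', Base.metadata,
        Column('id', Integer, primary_key=True),
        Column('name', String(50)),
        Column('fullname', String(50)),
        Column('phone', Integer),
    )

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)  # after the class is declared
```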
I have this simple Author-Books model and can't find a way to make firstName and lastName a composite key and use it in a relation. Any ideas?
from sqlalchemy import create_engine, ForeignKey, Column, String, Integer
from sqlalchemy.orm import relationship, sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()
engine = create_engine('mssql://user:pass@library')
engine.echo = True
session = sessionmaker(engine)()

class Author(Base):
    __tablename__ = 'authors'

    firstName = Column(String(20), primary_key=True)
    lastName = Column(String(20), primary_key=True)
    books = relationship('Book', backref='author')

class Book(Base):
    __tablename__ = 'books'

    title = Column(String(20), primary_key=True)
    author_firstName = Column(String(20), ForeignKey('authors.firstName'))
    author_lastName = Column(String(20), ForeignKey('authors.lastName'))
The problem is that you have defined each of the dependent columns as a separate foreign key, when what you actually want is a composite foreign key. SQLAlchemy responds by saying (not very clearly) that it cannot guess which foreign key to use (firstName or lastName).
The solution, declaring a composite foreign key, is a tad clunky in declarative, but still fairly obvious:
from sqlalchemy import ForeignKeyConstraint

class Book(Base):
    __tablename__ = 'books'

    title = Column(String(20), primary_key=True)
    author_firstName = Column(String(20))
    author_lastName = Column(String(20))

    __table_args__ = (
        ForeignKeyConstraint(
            [author_firstName, author_lastName],
            [Author.firstName, Author.lastName]),
    )
The important thing here is that the ForeignKey definitions are gone from the individual columns, and a ForeignKeyConstraint is added to a __table_args__ class variable. With this, the relationship defined on Author.books works just right.
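A self-contained sketch (SQLAlchemy 1.4+, in-memory SQLite; the sample data is made up) exercising the composite ForeignKeyConstraint end to end:

```python
from sqlalchemy import Column, String, ForeignKeyConstraint, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class Author(Base):
    __tablename__ = 'authors'

    firstName = Column(String(20), primary_key=True)
    lastName = Column(String(20), primary_key=True)
    books = relationship('Book', backref='author')

class Book(Base):
    __tablename__ = 'books'

    title = Column(String(20), primary_key=True)
    author_firstName = Column(String(20))
    author_lastName = Column(String(20))

    # one composite FK instead of two independent ones
    __table_args__ = (
        ForeignKeyConstraint(
            [author_firstName, author_lastName],
            [Author.firstName, Author.lastName]),
    )

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

author = Author(firstName='Jane', lastName='Doe')
author.books.append(Book(title='Sketches'))
session.add(author)
session.commit()

print(session.query(Book).one().author.lastName)  # Doe
```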
I have the following database schema:
Parent table:
id - primary key identifier.
type - polymorphic_identity.
name - string data column.
Child table - inherits Parent:
id - primary key identifier.
parent_id - foreignkey to Parent.
category - string data column.
Summing up, I have two tables: Child inherits from Parent and also has a foreign key to it.
UPD: I really need both the inheritance and the foreign key. This example is only a short demo that reproduces the problem.
I used declarative_base to declare the schema:
# -*- coding: utf-8 -*-
from sqlalchemy import Column, String, Integer, ForeignKey
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class Parent(Base):
    __tablename__ = "Parent"

    id = Column(Integer, primary_key=True)
    type = Column(String(250))
    name = Column(String(250))

    __mapper_args__ = {
        'polymorphic_identity': 'Parent',
        'polymorphic_on': type
    }

class Child(Parent):
    __tablename__ = 'Child'

    id = Column(Integer, ForeignKey('Parent.id'), primary_key=True)
    parent_id = Column(ForeignKey("Parent.id"), nullable=True)
    category = Column(String(250))

    __mapper_args__ = {
        'polymorphic_identity': 'Child',
    }

engine = create_engine('postgresql+psycopg2://joe:joe@localhost/alch')
session = sessionmaker()
session.configure(bind=engine)
Base.metadata.create_all(engine)
But when I run the code I get the following error:
sqlalchemy.exc.AmbiguousForeignKeysError: Can't determine join between 'Parent' and 'Child'; tables have more than one foreign key constraint relationship between them. Please specify the 'onclause' of this join explicitly.
I have tried setting a relationship attribute myself, on Parent, on Child, and on both. I also tried the primaryjoin and foreign_keys parameters of relationship. The error stayed the same.
I'm totally confused about this error.
I need help.
I found the solution here.
SQLAlchemy needs a hint in this situation: an inherit_condition entry in Child's __mapper_args__ does the trick.
# -*- coding: utf-8 -*-
from sqlalchemy import Column, String, Integer, ForeignKey
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class Parent(Base):
    __tablename__ = "Parent"

    id = Column(Integer, primary_key=True)
    type = Column(String(250))
    name = Column(String(250))

    __mapper_args__ = {
        'polymorphic_identity': 'Parent',
        'polymorphic_on': type
    }

class Child(Parent):
    __tablename__ = 'Child'

    id = Column(Integer, ForeignKey('Parent.id'), primary_key=True)
    parent_id = Column(ForeignKey("Parent.id"), nullable=True)
    category = Column(String(250))
    parent = relationship(Parent, foreign_keys=[parent_id])

    __mapper_args__ = {
        'polymorphic_identity': 'Child',
        'inherit_condition': id == Parent.id,
    }

engine = create_engine('postgresql+psycopg2://joe:joe@localhost/alch')
session = sessionmaker()
session.configure(bind=engine)
Base.metadata.create_all(engine)
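A condensed, self-contained version of the same fix (SQLite instead of Postgres, sample rows made up), showing that create_all succeeds and the parent relationship resolves once inherit_condition disambiguates the two foreign keys:

```python
from sqlalchemy import Column, Integer, String, ForeignKey, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class Parent(Base):
    __tablename__ = 'Parent'

    id = Column(Integer, primary_key=True)
    type = Column(String(250))
    name = Column(String(250))

    __mapper_args__ = {'polymorphic_identity': 'Parent',
                       'polymorphic_on': type}

class Child(Parent):
    __tablename__ = 'Child'

    id = Column(Integer, ForeignKey('Parent.id'), primary_key=True)
    parent_id = Column(ForeignKey('Parent.id'), nullable=True)
    category = Column(String(250))
    parent = relationship(Parent, foreign_keys=[parent_id])

    __mapper_args__ = {'polymorphic_identity': 'Child',
                       # tells SQLAlchemy which FK is the inheritance join
                       'inherit_condition': id == Parent.id}

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

p = Parent(name='base row')
c = Child(name='child row', category='demo', parent=p)
session.add_all([p, c])
session.commit()

print(session.query(Child).one().parent.name)  # base row
```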
Have you tried removing the foreign key from the Child id field?

id = Column(Integer, ForeignKey('Parent.id'), primary_key=True)
parent_id = Column(ForeignKey("Parent.id"), nullable=True)

You need something like this:

id = Column(Integer, autoincrement=True, primary_key=True)
parent_id = Column(ForeignKey("Parent.id"), nullable=True)
Using Python 3.5 and SQLAlchemy 1.0.14 (ORM).
I have a table of items declared as such:
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative.api import declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = 'items'

    id = Column(Integer, primary_key=True)
    type = Column(String)
    # other non relevant attributes
My Items can be of many different types, with the type identifier stored in type.
For a few of those object types, I need specific methods or attributes to be available.
To achieve that, I tried to use single-table inheritance with several SpecialisedItem subclasses of Item:
class Item(Base):
    __tablename__ = 'items'

    id = Column(Integer, primary_key=True)
    type = Column(String, index=True)
    # other non relevant attributes

    __mapper_args__ = {
        'polymorphic_on': type,
    }

class SpecialisedItem(Item):
    __mapper_args__ = {
        'polymorphic_identity': 'specialitem',
    }

    def specialised_method(self):
        return "I am special"
Now when I load my items, I want all specialised items (those having type == 'specialitem') to be loaded as such, while any other type value results in a plain Item.
That doesn't work: I get AssertionError: No such polymorphic_identity 'normal' is defined when loading the items.
I would like to avoid creating inherited classes that do nothing, just to cover all possible type values; instead, any "unmapped" type should fall back to the parent class Item.
Is there any way to achieve that effect?
Minimal test case for reference:
from sqlalchemy.engine import create_engine
from sqlalchemy.ext.declarative.api import declarative_base
from sqlalchemy.orm.session import sessionmaker
from sqlalchemy.sql.schema import Column
from sqlalchemy.sql.sqltypes import Integer, String

Base = declarative_base()

class Item(Base):
    __tablename__ = 'items'

    id = Column(Integer, primary_key=True)
    type = Column(String, index=True)
    # other non relevant attributes

    __mapper_args__ = {
        'polymorphic_on': type,
    }

class SpecialisedItem(Item):
    __mapper_args__ = {
        'polymorphic_identity': 'special',
    }

    specialAttribute = Column(String)

    def specialised_method(self):
        return "I am special"

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)
session = Session()
session.add(Item(type='normal'))
session.add(Item(type='special'))
session.commit()
# loading only specialised items works
for item in session.query(Item).filter_by(type="special"):
    print(item.specialised_method())
# loading other items fails
for item in session.query(Item):
    print(item.type)
Thanks,
Guillaume
A mapping of "polymorphic identity" identifiers to Mapper instances is stored in the polymorphic_map dict. You can create a custom polymorphic_map that returns the parent class's mapper for undefined polymorphic identities.
from collections import defaultdict

from sqlalchemy import event
from sqlalchemy.engine import create_engine
from sqlalchemy.ext.declarative.api import declarative_base
from sqlalchemy.orm.session import sessionmaker
from sqlalchemy.sql.schema import Column
from sqlalchemy.sql.sqltypes import Integer, String

Base = declarative_base()

class Item(Base):
    __tablename__ = 'items'

    id = Column(Integer, primary_key=True)
    type = Column(String, index=True)
    # other non relevant attributes

    __mapper_args__ = {
        'polymorphic_on': type,
    }

class SpecialisedItem(Item):
    __mapper_args__ = {
        'polymorphic_identity': 'special',
    }

    specialAttribute = Column(String)

    def specialised_method(self):
        return "I am special"

# http://docs.sqlalchemy.org/en/rel_1_1/orm/events.html#sqlalchemy.orm.events.MapperEvents.mapper_configured
@event.listens_for(Item, 'mapper_configured')
def receive_mapper_configured(mapper, class_):
    mapper.polymorphic_map = defaultdict(lambda: mapper, mapper.polymorphic_map)
    # to prevent the 'incompatible polymorphic identity' warning, not mandatory
    mapper._validate_polymorphic_identity = None

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)
session = Session()
session.add(Item(type='normal'))
session.add(Item(type='special'))
session.commit()
# loading only specialised items works
for item in session.query(Item).filter_by(type="special"):
    print(item.specialised_method())
# loading other items now falls back to Item instead of failing
for item in session.query(Item):
    print(item.type)
A reusable decorator solution, based on @r-m-n's answer. The custom class is replaced with collections.defaultdict, which does the same thing.
from collections import defaultdict

from sqlalchemy import event

def receive_mapper_configured(mapper, class_):
    mapper.polymorphic_map = defaultdict(lambda: mapper, mapper.polymorphic_map)
    # to prevent the 'incompatible polymorphic identity' warning, not necessary
    mapper._validate_polymorphic_identity = None

def polymorphic_fallback(mapper_klass):
    event.listens_for(mapper_klass, 'mapper_configured')(receive_mapper_configured)
    return mapper_klass
Then in your code you can just add this decorator to base classes:

@polymorphic_fallback
class Item(Base):
    ...

class SpecificItem(Item):
    ...
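Putting the decorator together with the models from the question gives a runnable sketch (SQLAlchemy 1.4+, in-memory SQLite, sample rows made up). Note the session.expunge_all() before re-querying: it clears the identity map, so the rows are rebuilt through the polymorphic switch instead of being returned as the originally added instances.

```python
from collections import defaultdict

from sqlalchemy import Column, Integer, String, create_engine, event
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

def receive_mapper_configured(mapper, class_):
    # unknown discriminator values fall back to this (base) mapper
    mapper.polymorphic_map = defaultdict(lambda: mapper, mapper.polymorphic_map)
    mapper._validate_polymorphic_identity = None

def polymorphic_fallback(mapper_klass):
    event.listens_for(mapper_klass, 'mapper_configured')(receive_mapper_configured)
    return mapper_klass

@polymorphic_fallback
class Item(Base):
    __tablename__ = 'items'

    id = Column(Integer, primary_key=True)
    type = Column(String, index=True)

    __mapper_args__ = {'polymorphic_on': type}

class SpecialisedItem(Item):
    __mapper_args__ = {'polymorphic_identity': 'special'}

    def specialised_method(self):
        return "I am special"

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([Item(type='normal'), Item(type='special')])
session.commit()
session.expunge_all()  # reload rows through the polymorphic switch

items = session.query(Item).order_by(Item.type).all()
print([type(i).__name__ for i in items])  # ['Item', 'SpecialisedItem']
```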
Models
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, ForeignKey
from sqlalchemy import Integer
from sqlalchemy import Unicode
from sqlalchemy import TIMESTAMP
from sqlalchemy.orm import relationship

BaseModel = declarative_base()

class Base(BaseModel):
    __tablename__ = 'base'

    id = Column(Integer, primary_key=True)
    location = Column(Unicode(12), ForeignKey("locationterrain.location"), unique=True)
    name = Column(Unicode(45))
    ownerid = Column(Integer, ForeignKey("player.id"))
    occupierid = Column(Integer, ForeignKey("player.id"))
    submitid = Column(Integer, ForeignKey("player.id"))
    updateid = Column(Integer, ForeignKey("player.id"))

    owner = relationship("Player",
                         primaryjoin='Base.ownerid==Player.id',
                         join_depth=3,
                         lazy='joined')
    occupier = relationship("Player",
                            primaryjoin='Base.occupierid==Player.id',
                            join_depth=3,
                            lazy='joined')
    submitter = relationship("Player",
                             primaryjoin='Base.submitid==Player.id',
                             join_depth=3,
                             lazy='joined')
    updater = relationship("Player",
                           primaryjoin='Base.updateid==Player.id',
                           join_depth=3,
                           lazy='joined')

class Player(BaseModel):
    __tablename__ = 'player'

    id = Column(Integer, ForeignKey("guildmember.playerid"), primary_key=True)
    name = Column(Unicode(45))
Searching
bases = dbsession.query(Base)
bases = bases.order_by(Base.owner.name)
This doesn't work. I've searched everywhere and read the documentation, but I just don't see how I can sort my Base query by the name on its owner relationship.
It always results in:

AttributeError: Neither 'InstrumentedAttribute' object nor 'Comparator' object has an attribute 'name'

This must be easy... but I don't see it. I also looked into comparators, which seemed logical, but I don't see where the query part for the ORDER BY is generated, or what I should return, since everything is generated dynamically. And writing a comparator for each of my player relationships to do one simple thing seems overcomplicated.
SQLAlchemy wants you to think in terms of SQL. If you do a query for Base, that's:

SELECT * FROM base

Easy. So how, in SQL, would you select the rows from base ordered by the name column of a totally different table, player? You use a join:

SELECT base.* FROM base JOIN player ON base.ownerid=player.id ORDER BY player.name
SQLAlchemy has you use the identical thought process - you join():
session.query(Base).join(Base.owner).order_by(Player.name)
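A minimal runnable version of that join-then-order pattern (SQLite, a stripped-down two-table model; names follow the question, sample rows made up):

```python
from sqlalchemy import Column, Integer, ForeignKey, Unicode, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

BaseModel = declarative_base()

class Player(BaseModel):
    __tablename__ = 'player'

    id = Column(Integer, primary_key=True)
    name = Column(Unicode(45))

class Base(BaseModel):
    __tablename__ = 'base'

    id = Column(Integer, primary_key=True)
    name = Column(Unicode(45))
    ownerid = Column(Integer, ForeignKey('player.id'))
    owner = relationship('Player')

engine = create_engine('sqlite:///:memory:')
BaseModel.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add(Base(name='b1', owner=Player(name='zoe')))
session.add(Base(name='b2', owner=Player(name='amy')))
session.commit()

# join to player, then order by the joined table's column
bases = session.query(Base).join(Base.owner).order_by(Player.name).all()
print([b.name for b in bases])  # ['b2', 'b1']
```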
Use the order_by argument
Note the order_by argument to relationship from sqlalchemy.orm.
Consider the parent-children relationship here as many-to-many.
class Parent(BaseModel):
    __tablename__ = "parent"

    name = Column(String(100), nullable=False)
    children = relationship(
        'Children',
        secondary=chile_parent_mapping,
        backref=backref(
            'parents',
        ),
        order_by="Parent.order_id",
    )
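Since the snippet above references an association table (chile_parent_mapping) that is not shown, here is a hedged, self-contained sketch of order_by on a many-to-many relationship; the table and column names are made up for illustration:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, Table, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

# hypothetical association table standing in for chile_parent_mapping
child_parent_mapping = Table(
    'child_parent_mapping', Base.metadata,
    Column('parent_id', ForeignKey('parent.id'), primary_key=True),
    Column('child_id', ForeignKey('children.id'), primary_key=True),
)

class Parent(Base):
    __tablename__ = 'parent'

    id = Column(Integer, primary_key=True)
    name = Column(String(100), nullable=False)
    children = relationship(
        'Children',
        secondary=child_parent_mapping,
        backref='parents',
        order_by='Children.name',  # children load sorted by name
    )

class Children(Base):
    __tablename__ = 'children'

    id = Column(Integer, primary_key=True)
    name = Column(String(100))

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
p = Parent(name='p', children=[Children(name='b'), Children(name='a')])
session.add(p)
session.commit()  # commit expires p, so children reload with the ORDER BY

print([c.name for c in p.children])  # ['a', 'b']
```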