I'm trying to map an existing DB2 database to new python ORM objects.
I wrote a very simple mapper class:
class Storage(Base):
    __tablename__ = 'T_RES_STORAGE_SUBSYSTEM'

    id = Column(Integer, primary_key=True, name='SUBSYSTEM_ID')
    name = Column(String(255), name='NAME')
    namealias = Column(String(256), name='NAME_ALIAS')
But when I execute a query against it, SQLAlchemy puts DB2ADMIN. in front of the table name in every query, which of course leads to errors. If I execute the query manually, prepending TPC. to the table name instead, everything works without issues.
How can I specify in a table definition which schema to use?
Ok, so after mustaccio's help, I found out that you have to add the schema to __table_args__:
class Storage(Base):
    __tablename__ = 'T_RES_STORAGE_SUBSYSTEM'
    __table_args__ = {'schema': 'TPC'}

    id = Column(Integer, primary_key=True, name='SUBSYSTEM_ID')
    name = Column(String(255), name='NAME')
    namealias = Column(String(256), name='NAME_ALIAS')
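As a quick sanity check (my addition, not part of the original answer): once __table_args__ carries the schema, it becomes part of the table's fully qualified name, which SQLAlchemy then uses when rendering queries. A minimal, self-contained sketch:

```python
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Storage(Base):
    __tablename__ = 'T_RES_STORAGE_SUBSYSTEM'
    __table_args__ = {'schema': 'TPC'}

    id = Column(Integer, primary_key=True, name='SUBSYSTEM_ID')
    name = Column(String(255), name='NAME')

# The schema is now part of the table's fully qualified name
print(Storage.__table__.fullname)  # TPC.T_RES_STORAGE_SUBSYSTEM
```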
I'm having an issue where SQLAlchemy does not recognize the database, even though it is declared with declarative_base. After trying to run a simple query, session.query(AppGeofencing).all(), I get sqlalchemy.exc.OperationalError: (pymysql.err.OperationalError) (1046, 'No database selected').
The table is declared as
Base = declarative_base()
AppBase = declarative_base(metadata=MetaData(schema='app'))

class AppGeofencing(AppBase):
    __tablename__ = 'geofencing'

    id = Column(INTEGER, primary_key=True, autoincrement=True)
    name = Column(VARCHAR(45))
    polygon = Column(Geometry('POLYGON'))

    def __init__(self, name=None, polygon=None):
        self.name = name
        self.polygon = polygon
The case is only with this table, because I have also done similarly for other tables, and they work just fine.
After enabling logging for SQLAlchemy, I can see that it does indeed generate the correct query:
INFO:sqlalchemy.engine.Engine:SELECT app.geofencing.id AS app_geofencing_id, app.geofencing.name AS app_geofencing_name, ST_AsEWKB(app.geofencing.polygon) AS app_geofencing_polygon
FROM app.geofencing
but somehow it cannot determine the database to use?
Does anyone have any idea what could cause such an issue?
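One thing worth checking (my assumption, not part of the original question): in MySQL a "schema" and a "database" are the same thing, and error 1046 typically means the connection URL selects no default database, so any statement that is not fully qualified fails. You can inspect which database a URL selects without connecting:

```python
from sqlalchemy.engine import make_url

# No database after the host: nothing is selected on connect
url_no_db = make_url("mysql+pymysql://user:pass@localhost:3306")
print(url_no_db.database)  # None

# A trailing "/app" selects a default database for the session
url_with_db = make_url("mysql+pymysql://user:pass@localhost:3306/app")
print(url_with_db.database)  # app
```

If url.database comes back None, appending the database name to SQLALCHEMY_DATABASE_URI may resolve the error even when individual tables are schema-qualified.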
I'm trying to set up multiple databases with the same model in Flask-SQLAlchemy.
A sample model looks like below
app.config['SQLALCHEMY_DATABASE_URI'] = 'your_default_schema_db_uri'
app.config['SQLALCHEMY_BINDS'] = {
    # your other db uri
    'other_schema': 'mysql+pymysql://' + UNAME + ':' + PASS + '@' + SERVERURL + ':3306/' + DBNAME,
}
db = flask_sqlalchemy.SQLAlchemy(app)

class TableA(db.Model):
    # This belongs to the default schema; it doesn't need to specify __bind_key__
    ...

class TableB(db.Model):
    # This belongs to other_schema
    __bind_key__ = 'other_schema'
    ...
db.create_all() works fine and creates the tables in their individual schemas.
I was following https://stackoverflow.com/a/34240889/8270017 and wanted to create a single table using:
TableB.__table__.create(db.session.bind, checkfirst=True)
The table gets created in the default bind and not other_schema.
Is there something I'm missing here? How can I fix it so that the table gets created in the other schema?
You need to supply the correct engine to the create function. The correct bind can be retrieved in the following way:
from sqlalchemy.orm import object_mapper
TableB.__table__.create(db.session().get_bind(object_mapper(TableB())), checkfirst=True)
For PG at least I do it like this:
class TableB(db.Model):
    __table_args__ = {"schema": "schema_name"}
I am just trying to get started using SQLAlchemy, but for whatever reason I can't get anything to work.
I installed SQLAlchemy, and the import alone works. I tried to start following the code on this site:
https://www.pythoncentral.io/introductory-tutorial-python-sqlalchemy/
The code is as follows:
import os
import sys
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship
from sqlalchemy import create_engine

Base = declarative_base()

class Person(Base):
    __tablename__ = 'person'
    # Here we define columns for the table person
    # Notice that each column is also a normal Python instance attribute.
    id = Column(Integer, primary_key=True)
    name = Column(String(250), nullable=False)

class Address(Base):
    __tablename__ = 'address'
    # Here we define columns for the table address.
    # Notice that each column is also a normal Python instance attribute.
    id = Column(Integer, primary_key=True)
    street_name = Column(String(250))
    street_number = Column(String(250))
    post_code = Column(String(250), nullable=False)
    person_id = Column(Integer, ForeignKey('person.id'))
    person = relationship(Person)

# Create an engine that stores data in the local directory's
# sqlalchemy_example.db file.
engine = create_engine('sqlite:///sqlalchemy_example.db')

# Create all tables in the engine. This is equivalent to "Create Table"
# statements in raw SQL.
Base.metadata.create_all(engine)
I copied and pasted the code to create a table, and I'm getting the following error:
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to
open database file (Background on this error at:
http://sqlalche.me/e/e3q8)
I went to http://sqlalche.me/e/e3q8, and it suggests that adding pool_pre_ping=True to the engine might help resolve the issue. It mentions a connection issue, but I don't really understand how that can be, since it's just creating the SQLite database.
I would really appreciate any advice on how I can fix this issue.
Edit: I put the specific code into my question.
Also, I tried running the code on PythonAnywhere and it works as expected. Any guidance on what could be wrong with my machine would be appreciated.
So for whatever reason, I needed to specify the absolute path of where the database should be. I updated my engine URL to:
sqlite:///C:\user\file_path\test.db
This allowed it to create the database. However, I'd really prefer that it just create the database in the current directory. If someone knows what I need to do to get that to work, that would be great.
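A likely explanation (my reading of the error, not confirmed by the original poster): the relative path in sqlite:///sqlalchemy_example.db is resolved against the process's current working directory, which may not be writable or may not be the script's directory. One way to pin the file next to the script, sketched below with an example filename, is to build an absolute path first:

```python
import os
from sqlalchemy import create_engine

# Resolve the database file relative to this script, not the working directory
here = os.path.abspath(os.path.dirname(__file__))
db_path = os.path.join(here, "sqlalchemy_example.db")

# Three slashes for the URL prefix; the absolute path follows
engine = create_engine("sqlite:///" + db_path)
```

The file itself is only created on first connect (e.g. when create_all() runs), so constructing the engine alone is cheap.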
I have a sqlalchemy schema containing three tables, (A, B, and C) related via one-to-many Foreign Key relationships (between A->B) and (B->C) with SQLite as a backend. I create separate database files to store data, each of which use the exact same sqlalchemy Models and run identical code to put data into them.
I want to be able to copy data from all these individual databases and put them into a single new database file, while preserving the Foreign Key relationships. I tried the following code to copy data from one file to a new file:
import sqlalchemy
from sqlalchemy.ext import declarative
from sqlalchemy import Column, String, Integer
from sqlalchemy import orm, engine

Base = declarative.declarative_base()
Session = orm.sessionmaker()

class A(Base):
    __tablename__ = 'A'
    a_id = Column(Integer, primary_key=True)
    adata = Column(String)
    b = orm.relationship('B', back_populates='a', cascade='all, delete-orphan', passive_deletes=True)

class B(Base):
    __tablename__ = 'B'
    b_id = Column(Integer, primary_key=True)
    a_id = Column(Integer, sqlalchemy.ForeignKey('A.a_id', ondelete='SET NULL'))
    bdata = Column(String)
    a = orm.relationship('A', back_populates='b')
    c = orm.relationship('C', back_populates='b', cascade='all, delete-orphan', passive_deletes=True)

class C(Base):
    __tablename__ = 'C'
    c_id = Column(Integer, primary_key=True)
    b_id = Column(Integer, sqlalchemy.ForeignKey('B.b_id', ondelete='SET NULL'))
    cdata = Column(String)
    b = orm.relationship('B', back_populates='c')

file_new = 'file_new.db'
resource_new = 'sqlite:///%s' % file_new
engine_new = sqlalchemy.create_engine(resource_new, echo=False)
session_new = Session(bind=engine_new)

file_old = 'file_old.db'
resource_old = 'sqlite:///%s' % file_old
engine_old = sqlalchemy.create_engine(resource_old, echo=False)
session_old = Session(bind=engine_old)

for arow in session_old.query(A):
    session_new.add(arow)  # I am assuming this will somehow copy all the child rows from tables B and C due to the foreign keys.
When run, I get the error "Object '' is already attached to session '2' (this is '1')". Any pointers on how to do this using SQLAlchemy and sessions? I also want to preserve the foreign key relationships within each database.
The use case is where data is first generated locally in non-networked machines and aggregated into a central db on the cloud. While the data will get generated in SQLite, the merge might happen in MySQL or Postgres, although here everything is happening in SQLite for simplicity.
First, the reason you get that error is because the instance arow is still tracked by session_old, so session_new will refuse to deal with it. You can detach it from session_old:
session_old.expunge(arow)
Which will allow you do add arow to session_new without issue, but you'll notice that nothing gets inserted into file_new. This is because SQLAlchemy knows that arow is persistent (meaning there's a row in the db corresponding to this object), and when you detach it and add it to session_new, SQLAlchemy still thinks it's persistent, so it does not get inserted again.
This is where Session.merge comes in. One caveat is that it won't merge unloaded relationships, so you'll need to eager load all the relationships you want to merge:
query = session_old.query(A).options(orm.subqueryload(A.b),
                                     orm.subqueryload(A.b, B.c))
for arow in query:
    session_new.merge(arow)
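To make the mechanics concrete, here is a minimal end-to-end sketch of the merge approach (my own reduction to a single parent/child pair on in-memory SQLite, not the original A/B/C schema):

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy import orm

Base = orm.declarative_base()

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    data = Column(String)
    children = orm.relationship('Child', back_populates='parent')

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))
    parent = orm.relationship('Parent', back_populates='children')

# Two independent databases with identical schemas
engine_old = create_engine('sqlite://')
engine_new = create_engine('sqlite://')
Base.metadata.create_all(engine_old)
Base.metadata.create_all(engine_new)

session_old = orm.Session(engine_old)
session_new = orm.Session(engine_new)

session_old.add(Parent(id=1, data='x', children=[Child(id=1)]))
session_old.commit()

# Eager-load the relationship so merge() sees the children too
for parent in session_old.query(Parent).options(orm.subqueryload(Parent.children)):
    session_new.merge(parent)
session_new.commit()

print(session_new.query(Child).count())  # 1
```

One caveat for the aggregation use case: merge() matches rows by primary key, so rows from different source files that share a key will overwrite one another in the central database; you may need to remap keys before merging.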
SQLAlchemy provides Connection.execution_options.schema_translate_map for changing schemas at execution time, as described in the docs.
The examples show how to use it to perform queries, but I want to know how to use it with create_all().
I'm using Flask-SQLAlchemy and PostgreSQL as the database. Let's say I have this:
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

def create_app():
    app = Flask(...)
    ...
    db.init_app(app)
    ...
    return app

class User(db.Model):
    __tablename__ = 'user'
    __table_args__ = {'schema': 'public'}
    company = db.Column(db.String(10))

class SomePublicModel(db.Model):
    __tablename__ = 'some_public'
    __table_args__ = {'schema': 'public'}
    ...

class SomeModelByDynamicSchema(db.Model):
    __tablename__ = 'some_dynamic'
    __table_args__ = {'schema': 'dynamic'}
    ...
The dynamic schema will be replaced with another value, according to the user's company, at execution time.
Assuming the database already has the schemas public and dynamic, and I want to create a new schema with the tables, I tried something like this:
def create_new():
    user = User(company='foo')
    db.session.execute("CREATE SCHEMA IF NOT EXISTS %s" % user.company)
    db.session.connection().execution_options(schema_translate_map={'dynamic': user.company})
    # I would like to do something of this kind
    db.create_all()
I expected the tables to be created in the foo schema as foo.some_dynamic, but SQLAlchemy still tries to create them in the dynamic schema.
Can someone help me?
When you set execution options, you create a copy of the connection. This means create_all() still runs without the schema_translate_map:
>>> c = Base.session.connection()
>>> w = c.execution_options(schema_translate_map={'dynamic':'kek'})
>>> c._execution_options
immutabledict({})
>>> w._execution_options
immutabledict({'schema_translate_map': {'dynamic': 'kek'}})
To achieve your goal, you could try another approach:
get the tables from the old metadata and adapt them to a new MetaData object.
metadata = MetaData(bind=engine, schema=db_schema)
for table in db.Model.metadata.tables.values():
    table.tometadata(metadata)
metadata.drop_all()
metadata.create_all()
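An alternative worth trying (my suggestion, not from the original answer): execution_options() can also be applied to the Engine, which returns a new engine sharing the same pool, and create_all() accepts that as a bind, so the translate map does take effect. A sketch with in-memory SQLite, mapping the placeholder schema to None so the table is created unqualified:

```python
from sqlalchemy import Column, Integer, MetaData, Table, create_engine, inspect

metadata = MetaData()
# A table declared under the placeholder schema "dynamic"
some_dynamic = Table('some_dynamic', metadata,
                     Column('id', Integer, primary_key=True),
                     schema='dynamic')

engine = create_engine('sqlite://')
# Apply the translate map at the engine level; None strips the qualifier
translated = engine.execution_options(schema_translate_map={'dynamic': None})

metadata.create_all(translated)
print(inspect(engine).get_table_names())  # ['some_dynamic']
```

On PostgreSQL you would map 'dynamic' to the target schema instead of None, e.g. schema_translate_map={'dynamic': user.company}, after creating that schema.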