I'm working with Scrapy. I want to get access to a SQLAlchemy session for a pre-existing table named 'contacts'. Following the docs (http://docs.sqlalchemy.org/en/latest/orm/session_basics.html#getting-a-session) I have created the following:
from scrapy import Spider
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine('sqlite:///data.db')

# create a configured "Session" class
Session = sessionmaker(bind=engine)

# create a Session
session = Session()

class ContactSpider(Spider):
    .......
    def parse(self, response):
        print('hello')
        session.query(contacts).filter_by(name='ed').all()
However, I am not seeing a way to connect to a pre-existing table. How is this done?
You can connect to pre-existing tables via reflection. Unfortunately, your question lacks some of the code setup, so below is a general pseudocode example (assuming your table name is contacts):
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
# Look up the existing tables from database
Base.metadata.reflect(engine)
# Create class that maps via ORM to the database table
Contact = type('Contact', (Base,), {'__tablename__': 'contacts'})
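As a usage sketch (not part of the original answer): assuming the existing contacts table has a primary key and a name column, the reflected Table object can be handed to a declarative class and queried like any other mapped class:
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('sqlite:///data.db')
Base = declarative_base()

# Look up the existing tables from the database
Base.metadata.reflect(engine)

# Map to the already-reflected Table object
class Contact(Base):
    __table__ = Base.metadata.tables['contacts']

Session = sessionmaker(bind=engine)
session = Session()

# 'name' is assumed to be a column of the existing contacts table
contacts_named_ed = session.query(Contact).filter_by(name='ed').all()
session.close()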
Related
I am having trouble writing tables to a postgres database using SQLAlchemy ORM and Python scripts.
I know the problem has something to do with incorrect Session imports because when I place all the code below into a single file, the script creates the table without trouble.
However, when I break the script up into multiple files (necessary for this project), I receive the error "psycopg2.errors.UndefinedTable: relation "user" does not exist".
I have read many posts here on SO and tried reorganising my files, changing the function call order, switching from non-scoped to scoped sessions, removing and adding Base.metadata.create_all(bind=engine) in various spots, and changing how the sessions are organised and created in base.py, among other things, but the script still errors and I'm not sure which code sequence is out of order.
The code currently looks like:
base.py:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import scoped_session, sessionmaker

# SQLAlchemy requires creating an Engine to interact with our database.
engine = create_engine('postgresql://user:pass@localhost:5432/testdb', echo=True)

# Create a configured ORM 'Session' factory to get a new Session bound to this engine
#_SessionFactory = sessionmaker(bind=engine)

# Use scoped session
db_session = scoped_session(
    sessionmaker(
        bind=engine,
        autocommit=False,
        autoflush=False
    )
)

# Create a Base class for our classes definitions
Base = declarative_base()
models.py:
from sqlalchemy import Column, DateTime, Integer, Text
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = 'users'

    id = Column(Integer, primary_key=True)
    email = Column(Text, nullable=False, unique=True)
    name = Column(Text)
    date_last_seen = Column(DateTime(timezone=True))

    def __init__(self, email, name, date_last_seen):
        self.email = email
        self.name = name
        self.date_last_seen = date_last_seen
inserts.py:
from datetime import date

from base import db_session, engine, Base
from models import User

def init_db():
    # Generate database schema based on our definitions in model.py
    Base.metadata.create_all(bind=engine)

    # Extract a new session from the session factory
    #session = _SessionFactory()

    # Create instance of the User class
    alice = User('alice@throughthelooking.glass', 'Alice', date(1865, 11, 26))

    # Use the current session to persist data
    db_session.add(alice)

    # Commit current session to database and close session
    db_session.commit()
    db_session.close()

    print('Initialized the db')
    return

if __name__ == '__main__':
    init_db()
Thank you for any insight you're able to offer!
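A sketch, not part of the original question: one common cause of this exact symptom is that base.py and models.py each call declarative_base(), so the Base whose metadata create_all() uses never learns about the User model. Assuming that is the cause here, models.py would import the single Base defined in base.py instead of creating its own:
# models.py (sketch): reuse the shared Base instead of creating a second one
from sqlalchemy import Column, DateTime, Integer, Text

from base import Base  # the single declarative Base defined in base.py

class User(Base):
    __tablename__ = 'users'

    id = Column(Integer, primary_key=True)
    email = Column(Text, nullable=False, unique=True)
    name = Column(Text)
    date_last_seen = Column(DateTime(timezone=True))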
I'm trying to set up multiple databases with the same model in Flask-SQLAlchemy.
A sample model looks like the one below:
db = flask.ext.sqlalchemy.SQLAlchemy(app)
app.config['SQLALCHEMY_DATABASE_URI'] = 'your_default_schema_db_uri'
app.config['SQLALCHEMY_BINDS'] = {
    'other_schema': 'mysql+pymysql://' + UNAME + ':' + PASS + '@' + SERVERURL + ':3306/' + DBNAME  # your_other_db_uri
}

class TableA(db.Model):
    # This belongs to the default schema; it doesn't need to specify __bind_key__
    ...

class TableB(db.Model):
    # This belongs to other_schema
    __bind_key__ = 'other_schema'
    ...
db.create_all() works fine and creates the tables in their individual schemas.
I was following https://stackoverflow.com/a/34240889/8270017 and wanted to create a single table using:
TableB.__table__.create(db.session.bind, checkfirst=True)
The table gets created in the default bind and not other_schema.
Is there something I'm missing here? How can I fix it so that the table gets created in the other schema?
You need to supply the correct engine to the create function. The correct bind can be retrieved in the following way:
from sqlalchemy.orm import object_mapper
TableB.__table__.create(db.session().get_bind(object_mapper(TableB())), checkfirst=True)
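A shorter variant (a sketch assuming Flask-SQLAlchemy 2.x, where db.get_engine(app, bind=...) returns the engine configured for a bind key) would be:
# Look up the engine registered for the 'other_schema' bind key
other_engine = db.get_engine(app, bind='other_schema')
TableB.__table__.create(bind=other_engine, checkfirst=True)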
For PG at least, I do it like this:
class TableB(db.Model):
    __table_args__ = {"schema": "schema_name"}
I work with a Postgres database which has multiple (similar) schemas. In my codebase, I reflect the different schemas in separate schema_xy.py files. I also have a base.py file which contains a base class with abstract table definitions, for tables that are present in multiple schemas.
My base.py file:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('postgresql+psycopg2://dbuser@dbhost:5432/dbname')
Base = declarative_base(bind=engine)

class User(Base):
    __abstract__ = True

    id = ..
And an example Schema1.py file, which inherits the User table from base but also has a schema-specific table S1Table:
from sqlalchemy import create_engine, MetaData
from sqlalchemy.ext.declarative import declarative_base

from .base import User

engine = create_engine('postgresql+psycopg2://dbuser@dbhost:5432/dbname')
Schema1Base = declarative_base(bind=engine, metadata=MetaData(schema='Schema1'))

class User(Schema1Base, User):
    __tablename__ = "user"

class S1Table(Schema1Base):
    __tablename__ = "s1table"

    foo = ...
My question is, how do I best instantiate sessions for querying and uploading data, which are 'bound' to a specific schema, i.e. how do I make sure that I query/manipulate the User table from the correct schema?
I have found this blog post
http://www.blog.pythonlibrary.org/2010/09/10/sqlalchemy-connecting-to-pre-existing-databases/
which defines a loadSession function for Base like so:
def loadSession():
    metadata = Base.metadata
    Session = sessionmaker(bind=engine)
    session = Session()
    return session
But I don't understand what the unused metadata is supposed to do here exactly.
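For what it's worth, a sketch of how this can work (module and class names are assumed to match the files above; this is not from the blog post): the metadata line in loadSession() really is unused, because a Session only needs an engine. The schema targeting comes from each mapped class's Table, via the MetaData(schema='Schema1') already passed to Schema1Base:
from sqlalchemy.orm import sessionmaker

from base import engine        # shared engine from base.py
from Schema1 import User       # class mapped against MetaData(schema='Schema1')

Session = sessionmaker(bind=engine)
session = Session()

# Rendered as SELECT ... FROM "Schema1"."user", because the schema lives on the
# Table definition, not on the session
schema1_users = session.query(User).all()

session.close()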
I'm trying to get the number of rows from a SQL Server database which consists of many tables, by looping through them to see how much data they contain. However, I'm not sure what should go into the select_from() function, as I currently supply a Unicode string for the table name and it raised:
NoInspectionAvailable: No inspection system is available for object of type <type 'unicode'>
The code that I used was
from sqlalchemy import create_engine
import urllib
from sqlalchemy import inspect
import sqlalchemy
from sqlalchemy import select, func, Integer, Table, Column, MetaData
from sqlalchemy.orm import sessionmaker
connection_string = "DRIVER={SQL Server}; *hidden*"
connection_string = urllib.quote_plus(connection_string)
connection_string = "mssql+pyodbc:///?odbc_connect=%s" % connection_string
engine = sqlalchemy.create_engine(connection_string)
Session = sessionmaker()
Session.configure(bind=engine)
session = Session()
connection = engine.connect()
inspector = inspect(engine)
for table_name in inspector.get_table_names():
    print session.query(func.count('*')).select_from(table_name).scalar()
Typically, you pass in a class name that refers to a class describing the database table, not a plain string.
In the sqlalchemy docs, http://docs.sqlalchemy.org/en/latest/orm/tutorial.html, they have you create a base class using declarative base and then create child classes for each table you want to query. You would then pass that class name into the select_from function unquoted.
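To make that concrete, here is a small sketch (the Customer class is hypothetical, and session is the one created in the question above):
from sqlalchemy import Column, Integer, String, func
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Customer(Base):
    # Hypothetical mapped class, purely for illustration
    __tablename__ = 'customers'
    id = Column(Integer, primary_key=True)
    name = Column(String)

# Pass the mapped class itself (unquoted) into select_from()
row_count = session.query(func.count('*')).select_from(Customer).scalar()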
The Flask-SQLAlchemy extension provides a ready-to-use base class called db.Model, and Django has one called models.Model.
Alternatively, you can also pass queries. I typically use the Flask framework for Python, so I usually initiate queries like this:
my_qry = db.session.query(Cust).filter(Cust.cust == 'lolz')
results = my_qry.all()
On a side note, if you decide to look at .NET they also have nice ORMs. Personally, I favor Entity Framework, but Linq to SQL is out there, too.
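Back to the row-count loop in the question: if you'd rather not declare a mapped class for every table, a sketch that reflects each table into a Table object first (assuming a pre-2.0 SQLAlchemy, to match the Python 2 code above) could look like this:
from sqlalchemy import MetaData, Table, func

metadata = MetaData()

for table_name in inspector.get_table_names():
    # Reflect the table so select_from() receives a selectable, not a plain string
    table = Table(table_name, metadata, autoload=True, autoload_with=engine)
    count = session.query(func.count()).select_from(table).scalar()
    print("%s: %s" % (table_name, count))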
I am just starting with SQLAlchemy and I have been wondering... I am going to have a lot of tables in my model, and I would like to have a separate file for each table in my model.
I am currently using the following code:
from sqlalchemy import MetaData
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.dialects import postgresql
import sqlalchemy as sa
__all__ = ['Session', 'engine', 'metadata']
# SQLAlchemy database engine. Updated by model.init_model()
engine = None
# SQLAlchemy session manager. Updated by model.init_model()
Session = scoped_session(sessionmaker())
# Global metadata. If you have multiple databases with overlapping table
# names, you'll need a metadata for each database
metadata = MetaData()
# declarative table definitions
Base = declarative_base()
Base.metadata = metadata
schema = 'srbam_dev'
The above is in meta.py.
The following is in __init__.py:
"""The application's model objects"""
import sqlalchemy as sa
from sqlalchemy import orm
from models import meta
from models.filers import Filer
from models.vfilers import Vfiler
from models.filer_options import FilerOption
def init_models(engine):
    """Call me before using any of the tables or classes in the model"""
    ## Reflected tables must be defined and mapped here
    #global reflected_table
    #reflected_table = sa.Table("Reflected", meta.metadata, autoload=True,
    #                           autoload_with=engine)
    #orm.mapper(Reflected, reflected_table)
    #
    meta.engine = sa.create_engine(engine)
    meta.Session.configure(bind=meta.engine)

class Basic_Table(object):
    id = sa.Column(
        postgresql.UUID(),
        nullable=False,
        primary_key=True
    )
    created = sa.Column(
        sa.types.DateTime(True),
        nullable=False
    )
    modified = sa.Column(
        sa.types.DateTime(True),
        nullable=False
    )
And then the following in all of my models:
from models.meta import Base
from models.meta import Basic_Table
from models.meta import schema
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
class Filer(Base, Basic_Table):
This works just fine as long as I do not use any foreign keys on the tables. Once I use a ForeignKey, it says:
sqlalchemy.exc.NoReferencedTableError: Foreign key associated with column 't_vfilers.filer_id' could not find table 't_filers' with which to generate a foreign key to target column 'id'
I tried to define the id key directly in the Filer class (and remove Basic_Table from the declaration); however, this does not solve the issue.
My code for creating the database looks like this
#!/usr/bin/python
import ConfigParser
from sqlalchemy.engine.url import URL
from models import *
config = ConfigParser.RawConfigParser()
config.read('conf/core.conf')
db_url = URL(
    'postgresql+psycopg2',
    config.get('database', 'username'),
    config.get('database', 'password'),
    config.get('database', 'host'),
    config.get('database', 'port'),
    config.get('database', 'dbname')
)
init_models(db_url)
meta.Base.metadata.drop_all(bind=meta.engine)
meta.Base.metadata.create_all(bind=meta.engine)
Does anyone have an idea how to fix this issue?
Marek, try defining the foreign key along with the schema name, i.e. 'test.t_vfilers.filer_id' (here 'test' is the schema name); this will solve the problem.
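A sketch of what that could look like (the column type, table names, and __table_args__ usage here are assumptions based on the error message and the schema variable from meta.py, not code from the original post):
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

from models.meta import Base, Basic_Table, schema

class Vfiler(Base, Basic_Table):
    __tablename__ = 't_vfilers'
    __table_args__ = {'schema': schema}

    filer_id = sa.Column(
        postgresql.UUID(),
        # Schema-qualified target; t_filers must also be created with
        # __table_args__ = {'schema': schema} for this to resolve
        sa.ForeignKey(schema + '.t_filers.id'),
        nullable=False
    )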
Have you remembered to import the different modules containing the models?
In my __init__.py I have, at the bottom, a ton of:
from comparty3.model.users import User, UserGroup, Permission
from comparty3.model.pages import PageGroup, Page
etc...
If that's not the issue, then I'm not sure; however, have you tried changing:
metadata = MetaData()
# declarative table definitions
Base = declarative_base()
Base.metadata = metadata
to:
# declarative table definitions
Base = declarative_base()
metadata = Base.metadata
I'm guessing here, but it may be that declarative_base() creates a special metadata object.
This is how it is defined in my Pylons projects (where I'm guessing your code is from too).