I'm using mongoengine with Django.
My web application needs at least two connections: one to a local server for sessions, and another to MongoLab (a MongoDB hosting service) for the application database.
So I need to connect to localhost for sessions while also connecting to a remote server.
How can I implement multiple connections with mongoengine? Some examples would be appreciated.
web application --- connects ---> localhost (sessions)
                --- connects ---> MongoLab (application database)
Use the mongoengine dev branch. It supports multiple db connections via aliases: https://github.com/hmarr/mongoengine/commit/8d2bc444bb64265f78f5bf716f773742dddd56c1
See these tests:
https://github.com/hmarr/mongoengine/blob/dev/tests/document.py#L2584
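For the setup in the question, this boils down to registering two aliased connections; a minimal sketch, where the database names, alias names, and the MongoLab host URI are all placeholders you would replace with your own values:

```python
from mongoengine import connect

# Local MongoDB instance used for sessions (placeholder names).
connect(db='sessions', alias='session-db')

# Remote MongoLab instance for application data (placeholder URI).
# Depending on your mongoengine version you may instead pass
# host/port/username/password as separate keyword arguments.
connect(db='appdata', alias='app-db',
        host='mongodb://user:password@ds012345.mongolab.com:35147/appdata')
```

Documents then pick their connection through meta `db_alias`, as shown in the answer below.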
According to the mongoengine documentation, to use multiple databases you can call connect() with an alias name for each connection. In the background this uses register_connection() to store the data, and you can register all aliases up front if required.
connect(alias='user-db-alias', db='user-db')
connect(alias='book-db-alias', db='book-db')
connect(alias='users-books-db-alias', db='users-books-db')
class User(Document):
    name = StringField()
    meta = {'db_alias': 'user-db-alias'}

class Book(Document):
    name = StringField()
    meta = {'db_alias': 'book-db-alias'}

class AuthorBooks(Document):
    author = ReferenceField(User)
    book = ReferenceField(Book)
    meta = {'db_alias': 'users-books-db-alias'}
You can also use the switch_db() context manager.
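A minimal sketch of switch_db(), which lives in mongoengine.context_managers and temporarily binds an already-defined document class to a different registered alias (reusing the User class and aliases from the example above):

```python
from mongoengine.context_managers import switch_db

# Temporarily point User at another registered alias; 'book-db-alias'
# must already have been registered via connect(alias=...).
with switch_db(User, 'book-db-alias') as BookDbUser:
    BookDbUser(name='Ross').save()  # written to book-db, not user-db
```

Outside the with block, User goes back to its own 'user-db-alias' connection.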
The disconnect() function closes a particular connection, which lets you swap a connection globally:
from mongoengine import connect, disconnect
connect('a_db', alias='db1')
class User(Document):
    name = StringField()
    meta = {'db_alias': 'db1'}
disconnect(alias='db1')
connect('another_db', alias='db1')
Related
I'm developing an API with Flask and I cannot retrieve queries from a MySQL database I've connected with flask-sqlalchemy (not plain sqlalchemy). This is a pre-existing database downloaded from my client's phpMyAdmin, so I haven't run db.create_all(): I simply created the connection string in config.py, then instantiated db = SQLAlchemy() and initialized it (db.init_app(app)) in my factory function (I'm using the factory pattern together with blueprints).
I've already checked that my computer is running the mysql process, that the login credentials provided are correct, and that the database exists on my computer. I'm using MariaDB because I run Manjaro Linux.
This is the connection string, located in config.py:
SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL') or "mariadb+mariadbconnector://dev:dev@localhost/desayunos56"
This is the relevant model. It was created using flask-sqlacodegen and then modified by me to only use the relevant columns within the table. At models.py:
# coding: utf-8
from flask_sqlalchemy import SQLAlchemy
from app import db

# post_id: Order ID
# meta_key: Type of value (client name, billing address)
# meta_value: Value of meta_key (name or address itself)
t_aus_postmeta = db.Table(
    'aus_postmeta',
    # db.Column('meta_id', db.BigInteger, nullable=False),
    db.Column('post_id', db.BigInteger, nullable=False, server_default=db.FetchedValue()),
    db.Column('meta_key', db.String(255, 'utf8mb4_unicode_ci')),
    db.Column('meta_value', db.String(collation='utf8mb4_unicode_ci'))
)
And finally, this is the file with the error, views.py. It's a blueprint already registered in __init__.py. I created it only to check whether I could run queries; I don't actually intend to render anything from Flask:
from flask import render_template
from . import main
from .. import db
from app.models import t_aus_postmeta

@main.route("/", methods=["GET"])
def index():
    result = t_aus_postmeta.query_by(post_id=786).first()
This is the error I get: AttributeError: 'Table' object has no attribute 'query_by'
I think it's noteworthy that, although my linter doesn't complain due to unresolved imports, when I use t_aus_postmeta I don't get any method suggestions.
All the questions I've checked are based on using sqlalchemy instead of flask-sqlalchemy. What could be causing this error? At this point, I'm at a loss.
I don't think that's the right way to create your model. Instead, you should create it as a class that inherits from db.Model, which is what provides the query interface.
models.py
class t_aus_postmeta(db.Model):
    """
    post_id: Order ID
    meta_key: Type of value (client name, billing address)
    meta_value: Value of meta_key (name or address itself)
    """
    __tablename__ = 'aus_postmeta'

    # a mapped class needs a primary key
    post_id = db.Column(db.BigInteger(), primary_key=True, server_default=db.FetchedValue())
    # rest of your columns...
If you do it this way a valid query would look like this:
t_aus_postmeta.query.filter_by(post_id=786).first()
Notice that this includes tutiplain's suggestion. I think you got your method name wrong. It's just query followed by a filter_by!
I can't find the API reference for the "query_by" method you are using. It seems there is no such method. Perhaps you meant "filter_by" instead?
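If you would rather keep the plain db.Table from the question (instead of converting it to a db.Model class), you can still query it, but through the session, since Table objects have no query attribute. A sketch, assuming the t_aus_postmeta table and db object defined in the question:

```python
# A db.Table has no .query (or query_by) attribute; query it through
# the session instead, accessing columns via the table's .c namespace.
row = (db.session.query(t_aus_postmeta)
       .filter(t_aus_postmeta.c.post_id == 786)
       .first())
```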
I have pushed a Python/Django project to Heroku and it works well. In the views.py file of my Django app, I added a function that connects to a local MySQL database to retrieve data. The function in views.py is as follows:
@login_required
def results(request):
    owner = str(request.user)
    db = MySQLdb.connect(user='root', db='aaa', passwd='xxxxx', host='localhost')
    cursor = db.cursor()
    # parameterized to avoid SQL injection
    cursor.execute(
        "SELECT search_content, id, title, author, institute "
        "FROM result_split WHERE username = %s", (owner,))
    data = cursor.fetchall()
    db.close()
    return render(request, "webdevelop/results.html", {"datas": data})
But when I open the page that shows this data on the deployed Heroku site, I get the error: "OperationalError at /results/
(2003, "Can't connect to MySQL server on 'localhost' ([Errno 111] Connection refused)")". How can this Heroku project connect to my local MySQL database? Or should I choose an alternative?
Firstly, ensure that the user and password you're using to connect to MySQL are correct and that the user has sufficient privileges on the selected database.
Then check that mysql is accepting connections on localhost. Note that on Heroku, 'localhost' refers to the dyno running your app, not to your own computer, so a MySQL server on your local machine is unreachable from Heroku unless you expose it publicly (or switch to a hosted database).
As for directly addressing the Connection Refused exception, check things like the mysql socket used to communicate with localhost applications such as your Django project. The socket must exist and be configured in MySQL.
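As a quick first check, you can verify that anything is listening on the MySQL port at all; a small sketch (the default MariaDB/MySQL port 3306 is assumed):

```python
import socket

def tcp_port_open(host="localhost", port=3306, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # covers refused connections, timeouts and DNS failures
        return False

print(tcp_port_open())
```

If this prints False on the machine running your Django code, the problem is reachability, not credentials.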
I also recommend taking a look at something like SQLAlchemy for Python which will help you interact directly with the database using Python objects. For example,
Connecting to the database:
from sqlalchemy import create_engine, Column, Integer, MetaData, VARCHAR
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, scoped_session, mapper

from config import DB_URL

# Database declaration
metadata = MetaData()
Base = declarative_base(name='Base', mapper=mapper, metadata=metadata)
engine = create_engine(DB_URL, pool_recycle=1800)
Session = sessionmaker(bind=engine, autocommit=False, autoflush=True)
session = scoped_session(Session)
You can now use the session variable to perform queries and updates through the methods it inherits from SQLAlchemy's Session class.
SQLAlchemy also includes a declarative model for telling Python what your tables look like. For example,
class Clinic(Base):
    __tablename__ = 'clinic'

    clinic_id = Column(Integer, primary_key=True)
    clinic_name = Column(VARCHAR)
    address = Column(VARCHAR)
    city = Column(VARCHAR)
    zip = Column(VARCHAR)
    phone = Column(VARCHAR)
    user_id = Column(VARCHAR)
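A quick usage sketch with the scoped session and the Clinic model above (the filter value is purely illustrative):

```python
# Query through the scoped session; Clinic attributes map to columns.
clinics = session.query(Clinic).filter(Clinic.city == 'Springfield').all()
for clinic in clinics:
    print(clinic.clinic_name, clinic.phone)
```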
These examples work well for my projects in Flask and should work well enough in Django.
I am looking to find out when a connection is made to my Django database, or when my Django server is restarted. I found the connection_created Django signal, whose description reads:
Sent when the database wrapper makes the initial connection to the database. This is particularly useful if you’d like to send any post connection commands to the SQL backend.
So I think this signal is a good fit for my case: I want to run a function once the connection is made, but I can't find any documentation with usage examples for this signal. connection_created.connect is probably the function to use. It takes several arguments; the relevant ones are receiver, sender and weak. How can I use these arguments and this function to run my own function on each new connection?
Also if anyone has any alternative solutions other than this signal, I'd love to hear them.
I have all my tables distributed among dynamic postgres table schemas, and use the connection signal to set the search path of the connection, since django does not support postgres schemas.
in myapp/apps.py
from django.apps import AppConfig
from django.db.backends.signals import connection_created

class MyappConfig(AppConfig):
    name = 'myapp'

    def ready(self):
        from myapp.schema_manager import new_connection
        connection_created.connect(new_connection)
in myapp/schema_manager.py
def new_connection(sender, connection, **kwargs):
    search_path = ['public'] + get_current_schemas()  # fetch the active schemas
    connection.cursor().execute("SET search_path TO %s;" % ', '.join(search_path))
According to the docs, this signal receives two arguments:

sender
    The database wrapper class, e.g. django.db.backends.postgresql.DatabaseWrapper or django.db.backends.mysql.DatabaseWrapper.
connection
    The database connection that was opened. This can be used in a multiple-database configuration to differentiate connection signals from different databases.
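The connection argument is what lets you react to one specific database from settings.DATABASES; a sketch (the alias name 'default' and the SQL statement are illustrative):

```python
from django.db.backends.signals import connection_created

def on_connection(sender, connection, **kwargs):
    # connection.alias is the key from settings.DATABASES,
    # so you can act only on the database you care about.
    if connection.alias == 'default':
        with connection.cursor() as cursor:
            cursor.execute("SET search_path TO public;")

connection_created.connect(on_connection)
```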
... since django does not support postgres schemas
Django supports postgres schemas:
class MyModel(models.Model):
    id = models.IntegerField(primary_key=True)
    title = models.TextField()

    class Meta:
        db_table = '"schema_name"."table_name"'
I use this notation in all of our projects.
The mapper-level event docs say that Session.add() is not supported inside these events, but when I tried db.session.add(some_object) inside an after_insert event, it worked. Example:
def after_insert_listener(mapper, connection, user):
    global_group = Group.query.filter_by(groupname='global').first()
    a = Association(user, global_group)
    db.session.add(a)

event.listen(User, 'after_insert', after_insert_listener)
Basically any new user should be part of global_group, so I added the association in the after_insert event. I inserted a user, checked my database, and found both the user record and the association record.
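For reference, the reason the docs discourage Session.add() here is that the event fires during a flush, so adding to the session can work by coincidence but is not guaranteed. The documented alternative is to write rows directly on the connection argument; a sketch under the assumption that the Group and Association models above map columns named groupname, user_id and group_id, and that User has an id column:

```python
# Sketch of the docs-sanctioned approach: issue the INSERT on the
# flush's own connection instead of calling session.add().
# Column names (id, groupname, user_id, group_id) are assumptions.
def after_insert_listener(mapper, connection, user):
    groups = Group.__table__
    assoc = Association.__table__
    group_id = connection.execute(
        groups.select().where(groups.c.groupname == 'global')
    ).first().id
    connection.execute(
        assoc.insert().values(user_id=user.id, group_id=group_id)
    )

event.listen(User, 'after_insert', after_insert_listener)
```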
Let's check the differences:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////Users/dedeco/Documents/tmp/testDb.db'
db = SQLAlchemy(app)
>>>type(db.session)
<class 'sqlalchemy.orm.scoping.scoped_session'>
or
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base

some_engine = create_engine('sqlite:////Users/dedeco/Documents/tmp/testDb.db')
Session = sessionmaker(bind=some_engine)
session = Session()
Base = declarative_base()
>>> type(session)
<class 'sqlalchemy.orm.session.Session'>
Basically the difference is:
The first way uses an API developed for the Flask framework, called Flask-SQLAlchemy. It's the right option if you are building a Flask application, because the scope of the Session can be managed automatically by your application. Among other benefits, you get the infrastructure to establish a single Session per request, correctly constructed at the start of the request and torn down at its end.
The second way is pure SQLAlchemy. If you are connecting to a database from anything else, you can use just the SQLAlchemy API: for example, from a command-line script, background daemon, or GUI application.
Either way, you can add objects:
Using a Flask-SQLAlchemy:
class User(db.Model):
    __tablename__ = 'users'

    user_id = db.Column(db.Integer(), primary_key=True)
    user_name = db.Column(db.String(80), unique=True)

    def __init__(self, user_name):
        self.user_name = user_name

>>> db.create_all()
>>> u = User('user1')
>>> db.session.add(u)
>>> db.session.commit()
>>> users = db.session.query(User).all()
>>> for u in users:
...     print(u.user_name)
...
user1
Using just SQLAlchemy:
class User(Base):
    __tablename__ = 'users'

    user_id = Column(Integer(), primary_key=True)
    user_name = Column(String(80), unique=True)

>>> u = User()
>>> u.user_name = 'user2'
>>> session.add(u)
>>> session.commit()
>>> users = session.query(User).all()
>>> for u in users:
...     print(u.user_name)
...
user1
user2
Note that I am connecting to the same database in both cases, just to show that you can add objects in several ways.
server = Flask(__name__)
app = dash.Dash(__name__, server=server, external_stylesheets=[dbc.themes.LITERA], suppress_callback_exceptions=True)
app.server.config["SQLALCHEMY_DATABASE_URI"] = f'postgresql://postgres:.../...'
db = SQLAlchemy(app.server)
I had the same problem of not knowing at what point I should close the database session in my web application. I found the passage below in the link that @GabrielChu shared; what I understood is that in a web app the Session should be torn down when the request ends:
A web application is the easiest case because such an application is already constructed around a single, consistent scope - this is the request, which represents an incoming request from a browser, the processing of that request to formulate a response, and finally the delivery of that response back to the client. Integrating web applications with the Session is then the straightforward task of linking the scope of the Session to that of the request. The Session can be established as the request begins, or using a lazy initialization pattern which establishes one as soon as it is needed. The request then proceeds, with some system in place where application logic can access the current Session in a manner associated with how the actual request object is accessed. As the request ends, the Session is torn down as well, usually through the usage of event hooks provided by the web framework. The transaction used by the Session may also be committed at this point, or alternatively the application may opt for an explicit commit pattern, only committing for those requests where one is warranted, but still always tearing down the Session unconditionally at the end.
Some web frameworks include infrastructure to assist in the task of aligning the lifespan of a Session with that of a web request. This includes products such as Flask-SQLAlchemy, for usage in conjunction with the Flask web framework, and Zope-SQLAlchemy, typically used with the Pyramid framework. SQLAlchemy recommends that these products be used as available
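As a sketch of what that request-scoped lifecycle looks like without Flask-SQLAlchemy (which does the equivalent for you), a plain scoped_session can be removed in Flask's teardown hook; the engine URL and names here are illustrative:

```python
from flask import Flask
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

app = Flask(__name__)
engine = create_engine("sqlite:///:memory:")
Session = scoped_session(sessionmaker(bind=engine))

@app.teardown_appcontext
def remove_session(exc=None):
    # Runs at the end of every request: discards the request's
    # thread-local Session and returns its connection to the pool.
    Session.remove()
```

Within a request, every call to Session() returns the same session object; after remove() a fresh one is created on next use.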
I have a Django 1.5 project using Django models over MySQL, running on an Apache server.
class Person(models.Model):
    first_name = models.CharField(max_length=100)
    last_name = models.CharField(max_length=100)
    birthdate = models.DateField()

class Book(models.Model):
    name = models.CharField(max_length=100)
    author = models.ForeignKey(Person)
I also have a python/django application (using django custom commands) running on a remote computer that must use these models.
Remote application shares the same model definitions with server
Remote application needs read only access to models
Remote application cannot have a full dump of server database, as server must return a queryset based on user rights
Remote application can only connect over http to server
Server can expose the models over REST API (json)
Is there any automated way to transfer models over HTTP? I have tried using django.core.serializers, but I ran into the following issues:
I cannot serialize the related objects in a queryset
The remote application cannot work without a local database
After deserialization, the remote application looks up related objects in the local db (where they do not exist)
Edit:
I managed to serialize models like this:
books = Book.objects.prefetch_related('author').all()
authors = [book.author for book in books]
data = authors + list(books.all())
serialized_data = django.core.serializers.serialize("json", data)
My problem is that the remote application cannot deserialize without having a local database.
I don't think you need to transfer models over HTTP; you just need to connect to the server's database.
In the remote app's settings, choose the db engine (mysql in your case) and database name, specify an appropriate user and password, and enter a valid host and port: the ones your database server is running on.
As for the user: on the server, create a mysql user with read-only rights to the db.
This gives you the ability to use the same db for both the server and the remote app.
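A sketch of what the remote app's settings.py would contain for this approach; the host, names and credentials are placeholders for your server's actual values:

```python
# settings.py of the remote application: point Django at the server's
# MySQL database using the read-only account created on the server.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'server_db_name',        # placeholder
        'USER': 'readonly_user',         # placeholder
        'PASSWORD': 'secret',            # placeholder
        'HOST': 'dbserver.example.com',  # the host MySQL runs on
        'PORT': '3306',
    }
}
```

Note that this only works if the MySQL port is reachable from the remote machine, which may conflict with the "can only connect over http" constraint stated in the question.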
I finally solved this by using sqlite running in RAM on the client side.
In settings.py I used this configuration:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',
    }
}
And the code is like this:
from django.db import connections
from django.core.management.color import no_style
from django.core import serializers
from apps.my_models import Book, Person

connection = connections['default']
cursor = connection.cursor()
sql, references = connection.creation.sql_create_model(Book, no_style(), set())
cursor.execute(sql[0])
sql, references = connection.creation.sql_create_model(Person, no_style(), set())
cursor.execute(sql[0])

serialized_data = get_serialized_data()
for obj in serializers.deserialize("json", serialized_data):
    obj.save()

books = Book.objects.prefetch_related('author').all()