Where to store MongoClient in Django (Python)

I'm using PyMongo to let my Django site save data to MongoDB. Apparently the MongoClient class has connection pooling built in, and it should be instantiated only once when Django starts up, so that every request to my Django site reuses that single MongoClient. I see lots of information on the web stating that this is the way it should be done. However, I cannot find any suggestions on where exactly in Django to put this single instance of MongoClient. Most Django literature says explicitly not to persist global variables across all user sessions.
So where exactly do I create and store this single instance of MongoClient? In views.py? In models.py? Somewhere else? And if there is just a single instance of MongoClient, how exactly does the connection pooling inside it help?

It's a bit late to answer this question, but future searchers may find it useful.
If you're only using MongoDB for a few operations (and thus don't want the full MongoEngine architecture), you can set up your project like this:
# project/settings.py
(place the Mongo connection info here)
# project/__init__.py
(declare a global MongoClient here; it will be shared throughout the app)
# in any app in the project
from project import MY_MONGO_CLIENT_NAME
(use the MongoClient as needed)
A fuller breakdown may be found here: https://gist.github.com/josephmosby/4497f8a4f675170180ab
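In concrete terms, here is a minimal sketch of that layout for the simplest case of a single client (the names MONGO_HOST, MONGO_PORT, and mongo_client are placeholders, not something prescribed by the answer or the gist):

# project/__init__.py
import pymongo

from .settings import MONGO_HOST, MONGO_PORT  # hypothetical settings names

# Created once, when Django first imports the project package;
# MongoClient's internal pooling then serves all requests.
mongo_client = pymongo.MongoClient(MONGO_HOST, MONGO_PORT)

# project/some_app/views.py
from project import mongo_client

def my_view(request):
    doc = mongo_client.mydb.mycollection.find_one({})
    ...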

Further to (and inspired by) josephmosby's answer, I'm using something like the following:
# project/settings.py
MONGO_DB = {
    'default': {
        'HOST': 'localhost',
        'PORT': 27017
    },
    ...
}
# project/__init__.py
gMongoClient = {}
# project/utils/mongo_tool.py
import pymongo

from project import gMongoClient
from project.settings import MONGO_DB

def get_mongo_db(dbname="default"):
    if dbname in gMongoClient:
        return gMongoClient[dbname]
    if dbname in MONGO_DB:
        config = MONGO_DB[dbname]
        # cache the client so later calls reuse the same connection pool
        gMongoClient[dbname] = pymongo.MongoClient(config["HOST"],
                                                   config["PORT"])
    else:
        gMongoClient[dbname] = None
    return gMongoClient[dbname]
# .../views.py
from project.utils import mongo_tool
...
client = mongo_tool.get_mongo_db()
db = client["my_database"]  # substitute the actual database name
results = db["collection"].find(...)
This could be made fancier, e.g. checking whether a user and password are specified in the settings for a particular connection (see the sketch below), but the above captures the essence of the idea.
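For instance, a hedged sketch of that extension, assuming optional USER and PASSWORD keys are added to each MONGO_DB entry in settings (those key names are illustrative):

# project/utils/mongo_tool.py (extended)
def get_mongo_db(dbname="default"):
    if dbname in gMongoClient:
        return gMongoClient[dbname]
    if dbname in MONGO_DB:
        config = MONGO_DB[dbname]
        gMongoClient[dbname] = pymongo.MongoClient(
            host=config["HOST"],
            port=config["PORT"],
            username=config.get("USER"),      # falls back to None if unset
            password=config.get("PASSWORD"),  # falls back to None if unset
        )
    else:
        gMongoClient[dbname] = None
    return gMongoClient[dbname]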

Related

How to set flask routes to work with both test and production database separately? [duplicate]

I have a REST API and I want to separate pytest runs from the actual app. Is it acceptable to set up the MongoDB collection like this? When running tests, I want all my routes to use the testing database; when I run the API normally, I want the routes to use the regular database instead.
config.py
class TestConfig(Config):
    TESTING = True
    DEBUG = True
    MONGO_URI = os.environ.get("MONGO_TEST_URI")
    DBNAME = "test_cve_db"

class DevConfig(Config):
    DEBUG = True
    MONGO_URI = os.environ.get("MONGO_URI")
    DBNAME = "cve_db"
__init__.py
def db_connect(collection_name):
    # db_client is the module-level MongoClient created at startup
    dbname = current_app.config["DBNAME"]
    db = getattr(db_client, dbname)
    collection = getattr(db, collection_name)
    return collection
And then on the top of routes.py:
collection = db_connect("user")
... # and further db operations
This is the only solution I could come up with to separate pytest from the app: at the moment the app instance is created, the DBNAME setting specifies which database to use. So when I run pytest, current_app.config["DBNAME"] resolves to "test_cve_db", and when I run a normal instance of the app, whether from a file or through the Flask CLI, it resolves to "cve_db". All the collections in the test database are named the same as in the normal database.
I know this solution is not the best, but for now I don't really know where to look for something better.
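For what it's worth, one common way to wire this up is an application factory plus a pytest fixture, so each entry point picks its config class explicitly. A sketch under the assumption that the app exposes a create_app factory (the factory, fixture, and file names are illustrative):

# app/__init__.py
from flask import Flask
from pymongo import MongoClient

db_client = None  # module-level client, created once per process

def create_app(config_class):
    global db_client
    app = Flask(__name__)
    app.config.from_object(config_class)
    db_client = MongoClient(app.config["MONGO_URI"])
    return app

# conftest.py -- pytest always builds the app against the test database
import pytest
from app import create_app
from config import TestConfig

@pytest.fixture
def app():
    return create_app(TestConfig)

# run.py -- the real server uses the dev config
from app import create_app
from config import DevConfig

app = create_app(DevConfig)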

Invoke method on SQLite connection in Flask-SQLAlchemy

I'm developing a web app with Flask-SQLAlchemy backed by a SQLite database. I need to call a method (create_collation) right after connecting. Without the SQLAlchemy framework, I can do that like this:
conn = sqlite3.connect(path)
conn.create_collation('my_collate', my_collate)
# ... go on and do fancy "order_by" stuff.
How do I do that in Flask-SQLAlchemy? Based on the API I was thinking of the following, but I get AttributeError: 'Engine' object has no attribute 'create_collation'.
from flask_sqlalchemy import SQLAlchemy

class MySQLAlchemy(SQLAlchemy):
    def create_engine(self, sa_url, engine_opts):
        engine = super().create_engine(sa_url, engine_opts)
        engine.create_collation('my_collate', self.my_collate)
        return engine

    @staticmethod
    def my_collate(string1, string2):
        return string1.locateCompare(string2)
Following the SQLAlchemy docs, I think I need to get the connection rather than the engine, but I can't figure out how.
Also, where should this go specifically in Flask-SQLAlchemy? Which part ultimately "connect"s, and how do I hook into that?
SQLAlchemy has an Events API that allows you to create a function that will be called whenever the connection pool creates a new connection:
from sqlalchemy.event import listens_for
from sqlalchemy.pool import Pool

@listens_for(Pool, "connect")
def my_on_connect(dbapi_con, connection_record):
    dbapi_con.create_collation('my_collate', my_collate)
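For completeness, here is how that listener could sit in a small Flask-SQLAlchemy app. This is a sketch; the case-insensitive comparison is only a stand-in for whatever my_collate should really do (SQLite expects the collation function to return a negative, zero, or positive integer):

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy.event import listens_for
from sqlalchemy.pool import Pool

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///app.db'
db = SQLAlchemy(app)

def my_collate(string1, string2):
    # stand-in collation: case-insensitive ordering
    s1, s2 = string1.lower(), string2.lower()
    return (s1 > s2) - (s1 < s2)

@listens_for(Pool, "connect")
def my_on_connect(dbapi_con, connection_record):
    # dbapi_con is the raw sqlite3 connection, which does have create_collation
    dbapi_con.create_collation('my_collate', my_collate)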

$inc operator not working in my Flask app using MongoDB Atlas

I'm looking to increment the 'views' field by 1 in a document within my collection. I'm using a MongoDB Atlas database with my Flask app. I've included my route here. Any suggestions would be great, thanks.
@app.route('/view_count/<recipe_id>', methods=['POST'])
def view_count(recipe_id):
    mongo.db.recipes.update_one({"_id": ObjectId(recipe_id)}, {"$inc": {'views': 1}})
    return redirect(url_for('view_recipe.html'))
Your queries are correct if you are using PyMongo. Maybe the problem is mongo.db.
Example:
from bson import ObjectId
from pymongo import MongoClient

# connect to the server
client = MongoClient('mongodb://localhost:27017')

# Mongo accepts everything, so the queries below are fine
# NB: client.db means "the database called db" inside Mongo
client.db.recipes.insert_one({'_id': ObjectId(), 'views': 0})
client.db.recipes.find_one({})  # the insertion above worked
client.db.recipes.update_one({}, {'$inc': {'views': 1}})  # only one document, so it gets updated
but if you change:
client = MongoClient('mongodb://localhost:27017')
# to
client = MongoClient('mongodb://localhost:27017').db
# everything keeps working, but now the full path to recipes is db.db.recipes
# (a collection named db.recipes inside the database db)
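Assuming the original route does use Flask-PyMongo (so mongo.db is a real database handle), the $inc part should then work as written; the likely remaining bug is the redirect, since url_for expects an endpoint name rather than a template filename. A sketch of the corrected route (the view_recipe endpoint and its recipe_id argument are assumptions about the rest of the app):

from bson import ObjectId
from flask import redirect, url_for

@app.route('/view_count/<recipe_id>', methods=['POST'])
def view_count(recipe_id):
    mongo.db.recipes.update_one(
        {"_id": ObjectId(recipe_id)},
        {"$inc": {"views": 1}},
    )
    # url_for takes an endpoint name, not a template filename
    return redirect(url_for('view_recipe', recipe_id=recipe_id))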

How to cache SQL Alchemy calls with Flask-Cache and Redis?

I have a Flask app that takes parameters from a web form, queries a DB with SQL Alchemy and returns Jinja-generated HTML showing a table with the results. I want to cache the calls to the DB. I looked into Redis (Using redis as an LRU cache for postgres), which led me to http://pythonhosted.org/Flask-Cache/.
Now I am trying to use Redis + Flask-Cache to cache the calls to the DB. Based on the Flask-Cache docs, it seems like I need to set up a custom Redis cache.
class RedisCache(BaseCache):
    def __init__(self, servers, default_timeout=500):
        pass

def redis(app, config, args, kwargs):
    args.append(app.config['REDIS_SERVERS'])
    return RedisCache(*args, **kwargs)
From there I would need to do something like:
# not sure what to put for args or kwargs
cache = redis(app, config={'CACHE_TYPE': 'redis'})

app = Flask(__name__)
cache.init_app(app)
I have two questions:
1. What do I put for args and kwargs? What do these mean? How do I set up a Redis cache with Flask-Cache?
2. Once the cache is set up, it seems like I would want to somehow "memoize" the calls to the DB so that if the method gets the same query, it returns the cached output. How do I do this? My best guess is to wrap the SQL Alchemy call in a method that could then be given the memoize decorator. That way, if two identical queries were passed to the method, Flask-Cache would recognize this and return the appropriate response. I'm guessing it would look like this:
@cache.memoize(timeout=50)
def queryDB(q):
    return q.all()
This seems like a fairly common use of Redis + Flask + Flask-Cache + SQL Alchemy, but I am unable to find a complete example to follow. If someone could post one, it would be super helpful, both for me and for others down the line.
You don't need to create a custom RedisCache class. The docs are just explaining how you would create new backends that are not available in Flask-Cache. RedisCache is already available in werkzeug >= 0.7, which you probably have installed already since it is one of the core dependencies of Flask.
This is how I got Flask-Cache running with the Redis backend:
import time

from flask import Flask
from flask_cache import Cache

app = Flask(__name__)
cache = Cache(app, config={'CACHE_TYPE': 'redis'})

@cache.memoize(timeout=60)
def query_db():
    time.sleep(5)  # stand-in for a slow DB call
    return "Results from DB"

@app.route('/')
def index():
    return query_db()

app.run(debug=True)
If you're getting "ImportError: redis is not a valid FlaskCache backend", it's probably because you don't have the redis Python package installed, which you can fix with:
pip install redis
Your Redis args would look something like this:
cache = Cache(app, config={
    'CACHE_TYPE': 'redis',
    'CACHE_KEY_PREFIX': 'fcache',
    'CACHE_REDIS_HOST': 'localhost',
    'CACHE_REDIS_PORT': '6379',
    'CACHE_REDIS_URL': 'redis://localhost:6379'
})
Putting @cache.memoize over a method that grabs the info from the DB should work, as sketched below.
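As a sketch of that last point, assuming a SQLAlchemy model named Result (the model and filter are illustrative, not from the question):

@cache.memoize(timeout=300)
def query_db(category):
    # identical arguments produce identical cache keys, so repeated
    # form submissions are answered from Redis instead of the database
    return Result.query.filter_by(category=category).all()

# after inserting or updating rows, drop the stale cached copy
cache.delete_memoized(query_db, 'books')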

Local server address for multiple users on different computers? (Python)

I'm working with some friends to build a PostgreSQL/SQLAlchemy Python app and have the following line:
engine = create_engine('postgresql+pg8000://oldmba@localhost/helloworld')
Newbie question: instead of having to edit in "oldmba" (my username) every time I git pull someone else's code, what's a simple way to make that line work for all users so we don't have to keep editing it? Thanks in advance!
Have a config file with your settings. It can store the data in a Python dictionary or in plain variables.
The config file can import from a local_config.py file, which you add to your .gitignore. That file can contain your individual settings: username, password, database URLs, pretty much anything you need to configure that may differ between environments (production vs. development).
This is how settings in Django projects are usually handled. It allows multiple people to develop on the same project with different settings. You might also want a 'database_url' field or something similar, so that production can point at a different database server while development uses 'localhost'.
# config.py
database = {
    'username': 'production_username',
    'password': 'production_password'
}

try:
    from local_config import *
except ImportError:
    pass

# local_config.py (ignored by git)
database = {
    'username': 'your_username',
    'password': 'your_password'
}

# wherever the engine is created
from config import *

engine = create_engine('postgresql+pg8000://{0}@localhost/helloworld'.format(database['username']))
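The same idea extends to the suggested 'database_url' field, so production can point at a different server while development stays on localhost (the 'host' key here is illustrative):

# config.py
database = {
    'username': 'production_username',
    'password': 'production_password',
    'host': 'db.example.com'
}
...

# app code
engine = create_engine(
    'postgresql+pg8000://{username}:{password}@{host}/helloworld'.format(**database))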
