Sorry for my English.
I want to create a DatabaseWrapper so that I can make a dynamic database connection.
I need a DatabaseWrapper object for the Django querybuilder: http://django-query-builder.readthedocs.org/en/latest/ref/query.html#querybuilder.query.Query
Why?
- Because I don't want to store the database config in the DATABASES variable in settings.py.
I want to create a connection at runtime and close it after my class is used. The reason is that when I run tests, Django creates a test database on the second database. Normally this would not be a problem, but I don't have write privileges on that database, so my tests fail every time.
Related
I am using Postgres and have two databases.
Consider this example: I have to make two versions of the software. One version uses the main DB with real data, and the other uses a second DB as a test version for trainees.
Now, how do I change the default DB according to the logged-in user?
P.S.: I know I can use .using(db_name), but I don't want that. I want to change the default DB in settings.py dynamically.
The purpose is to provide the client with a test version backed by a separate DB.
I am writing a REST API using flask_restful and managing the MySQL database with flask-sqlalchemy. I would like to know the best practice for loading existing data into a table when the app starts.
I am currently calling the db.create_all() method within an endpoint decorated with @app.before_first_request. I would then like to fill one of the created tables with existing data from a CSV file. Should the code that pushes the data live in a separate script, or within that function?
Thanks!
I would separate loading initial database data from application initialization. In my experience, initial data does not change often, loading it can take a while if the file is big, and you usually don't need to reload it into the database every time the application starts.
I think you will almost certainly need database migrations at some point in your application's development, so I would suggest setting up Flask-Migrate to handle them and running its upgrade method on application creation (in the create_app method, if you are using the Flask application factory pattern). This will save you some headache later, when you introduce migrations on a database that is already populated with real data and was initialized with db.create_all().
For populating the database with seed data I would go with the Flask CLI or Flask-Script. In one of my recent projects I used Flask-Script and created a separate manage.py file which, among other application management methods, contained an initial data seeding method that looked something like this:
@manager.command
def seed():
    """Load initial data into the database."""
    db.session.add(...)
    db.session.commit()
It was then run on demand with the following command:
python manage.py seed
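Since the question mentions a CSV file, the seeding step boils down to reading rows and adding them to the session. Here is a self-contained sketch of that step; it uses sqlite3 and an in-memory CSV so it runs standalone, whereas the real seed() command would call db.session.add(...) per row against your model instead. The table and column names are made up for illustration:

```python
import csv
import io
import sqlite3

# In your project this would be open("seed_data.csv"); the in-memory CSV
# here is an assumption so the example runs on its own.
csv_data = io.StringIO("name,price\nwidget,9.99\ngadget,19.50\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (name TEXT, price REAL)")
for row in csv.DictReader(csv_data):
    # With Flask-SQLAlchemy this line becomes db.session.add(Product(...))
    conn.execute("INSERT INTO product VALUES (?, ?)",
                 (row["name"], float(row["price"])))
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM product").fetchone()[0])  # prints 2
```

Keeping the file read inside the command (rather than at import time) means the CSV is only touched when you explicitly run the seed step.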
I'm working with a PostgreSQL database in Django. For licensing reasons I can't use psycopg2, so I'm using the alternative PyGreSQL.
I don't need the Django ORM at all; I simply need a cursor for cur.execute() and cur.fetchall().
Since I can't use the PyGreSQL pgdb module in the DATABASES setting in settings.py, I have to open a connection object manually.
What would be the best practice for doing this? Currently I've simply created the connection object conn = pgdb.connect(params) in views.py outside of all functions, but this seems a bit hacky.
Any tips?
It might be a good idea to create your own PYGRE_CONFIG dictionary in settings.py that holds the server hostname, database name, login name, and so on. You can access it with from django.conf import settings and settings.PYGRE_CONFIG. Then create a separate application, e.g. utils or pygre, in the root of your project directory that manages the connection object (opening and closing it as needed using settings.PYGRE_CONFIG) and stores it in a thread-local variable. Your other applications can import things from this module. Keeping it as a separate app makes it easy to port from project to project.
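A sketch of what that module's connection handling could look like. The pattern is generic: in your project connect_fn would be pgdb.connect and config would be settings.PYGRE_CONFIG (both names are assumptions from the setup above), but nothing here is PyGreSQL-specific:

```python
import threading

# One connection per thread, created lazily on first use.
_local = threading.local()

def get_connection(config, connect_fn):
    """Return this thread's connection, opening one on first use.

    config     -- dict of connection parameters (e.g. settings.PYGRE_CONFIG)
    connect_fn -- DB-API connect function (e.g. pgdb.connect)
    """
    conn = getattr(_local, "conn", None)
    if conn is None:
        conn = connect_fn(**config)
        _local.conn = conn
    return conn

def close_connection():
    """Close and forget this thread's connection, if any."""
    conn = getattr(_local, "conn", None)
    if conn is not None:
        conn.close()
        _local.conn = None
```

Views would call get_connection(...) to obtain the shared per-thread connection and run cur.execute()/cur.fetchall() on its cursor, and something like request-teardown middleware would call close_connection().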
I am using pymongo to connect to MongoDB in my code. I am writing a Google Analytics-style application. My DB structure is such that I create a new database for each new website: when someone registers a website, I create a database with that name, and when they unregister it, I want the database to be deleted. I remove all the collections, but the database still cannot be removed.
As a result, the list of databases is growing very large. When I do:
client = MongoClient(host=MONGO_HOST,port=27017,max_pool_size=200)
client.database_names()
I see a list of more than 1000 databases. Many of them are just empty. Is there a way to remove these MongoDB databases?
Use the drop_database method:
client = MongoClient(host=MONGO_HOST,port=27017,max_pool_size=200)
client.drop_database("database_name")
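If the goal is to clean up the 1000+ leftover empty databases in one pass, a small helper can combine this with database_names(). This is a hedged sketch using the same older pymongo method names as the question's code (newer pymongo versions call these list_database_names and list_collection_names); dropping is irreversible, so the built-in system databases are skipped and you should try it against a test server first:

```python
def drop_empty_databases(client):
    """Drop each non-system database that contains no collections.

    Returns the list of database names that were dropped.
    """
    dropped = []
    for name in client.database_names():
        if name in ("admin", "local", "config"):
            continue  # never drop MongoDB's own databases
        if not client[name].collection_names():
            client.drop_database(name)
            dropped.append(name)
    return dropped
```

Used as drop_empty_databases(client) with the MongoClient shown above.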
I have a database backup script in Python which inserts some data into a MySQL database.
Now, my Django project uses a different database.
How can I access the other database, given that I don't have any models for it in models.py?
I want to display some of its data in the Django interface.
Yes, you can set up multiple databases and access each of them.
You can get a cursor for a specific database connection like this:
from django.db import connections
cursor = connections['my_db_alias'].cursor()
where my_db_alias is the alias of your other database.
Check the docs:
https://docs.djangoproject.com/en/1.3/topics/db/multi-db/
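For completeness, the alias is just a second entry in the DATABASES setting, and the cursor behaves like any DB-API cursor. A sketch, where the alias, credentials, and table name are all made-up placeholders rather than the asker's real configuration:

```python
# settings.py: a second entry alongside "default"
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "django_db",
    },
    "my_db_alias": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "backup_db",       # the database your backup script writes to
        "USER": "backup_user",
        "PASSWORD": "secret",
        "HOST": "localhost",
    },
}

# views.py: raw SQL against the backup database, no models needed
from django.db import connections

def backup_rows():
    cursor = connections["my_db_alias"].cursor()
    cursor.execute("SELECT id, name FROM backup_table ORDER BY id")
    rows = cursor.fetchall()  # list of (id, name) tuples
    cursor.close()
    return rows
```

The rows can then be passed into a template context like any other data.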