Remote access of django models - python

I have a Django 1.5 project using Django models over MySQL, running on an Apache server.
class Person(models.Model):
    first_name = models.CharField(max_length=100)
    last_name = models.CharField(max_length=100)
    birthdate = models.DateField()

class Book(models.Model):
    name = models.CharField(max_length=100)
    author = models.ForeignKey(Person)
I also have a python/django application (using django custom commands) running on a remote computer that must use these models.
Remote application shares the same model definitions with server
Remote application needs read only access to models
Remote application cannot have a full dump of server database, as server must return a queryset based on user rights
Remote application can only connect over http to server
Server can expose the models over REST API (json)
Is there any automated way to transfer models over http? I have tried to use django.core.serializers but I had the following issues:
I cannot serialize the related objects of a queryset.
The remote application cannot work without a local database.
After deserialization, the remote application looks up the related objects in its local database, where they do not exist.
Edit:
I managed to serialize models like this:
books = Book.objects.prefetch_related('author').all()
authors = [book.author for book in books]
data = authors + list(books.all())
serialized_data = django.core.serializers.serialize("json", data)
My problem is that the remote application cannot deserialize without having a local database.

I don't think you need to transfer models over HTTP; you just need to connect to the server's database.
In the remote app's settings, choose the database engine (MySQL in your case) and database name.
Specify an appropriate user and password.
And enter a valid host (and port, if needed): the one your database server is running on.
As for the user: on the server, create a MySQL user with read-only rights to the database.
This gives you the ability to use the same database for both the server and the remote app.
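As a minimal sketch, the remote app's settings.py could then point at the server's MySQL instance like this (the host, database name and the read-only user are placeholders for your own values):
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'server_db',        # the same database the server uses
        'USER': 'readonly_user',    # MySQL user granted SELECT only
        'PASSWORD': 'secret',
        'HOST': 'db.example.com',   # the host your database server runs on
        'PORT': '3306',
    }
}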

Finally I solved it by using SQLite running in RAM on the client side.
In settings.py I used this configuration:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',
    }
}
And the code is like this:
from django.db import connections
from django.core.management.color import no_style
from django.core import serializers
from apps.my_models import Book, Person

connection = connections['default']
cursor = connection.cursor()

# Create the tables for the shared models inside the in-memory SQLite database
sql, references = connection.creation.sql_create_model(Book, no_style(), set())
cursor.execute(sql[0])
sql, references = connection.creation.sql_create_model(Person, no_style(), set())
cursor.execute(sql[0])

# get_serialized_data() returns the JSON payload received from the server
serialized_data = get_serialized_data()
for obj in serializers.deserialize("json", serialized_data):
    obj.save()

books = Book.objects.prefetch_related('author').all()
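get_serialized_data() is not shown above; a minimal sketch of what it could look like, assuming a hypothetical endpoint on the server that returns the serialized queryset as JSON (the URL and credentials are placeholders):
import requests

def get_serialized_data():
    # Fetch the JSON produced by django.core.serializers on the server
    response = requests.get('https://server.example.com/api/books/',
                            auth=('remote_user', 'secret'))
    response.raise_for_status()
    return response.text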

Related

heroku connect to mysql database

I have pushed a Python/Django project to Heroku and it works well. In my Django app's view.py file, I added a function that connects to the local MySQL database to retrieve data. The function in view.py is as follows:
import MySQLdb

from django.contrib.auth.decorators import login_required
from django.shortcuts import render

@login_required
def results(request):
    data = []
    data1 = []
    owner = request.user
    owner = str(owner)
    db = MySQLdb.connect(user='root', db='aaa', passwd='xxxxx', host='localhost')
    cursor = db.cursor()
    cursor.execute("SELECT search_content, id, title, author, institute FROM result_split WHERE username = '%s'" % (owner))
    data = cursor.fetchall()
    db.close()
    return render(request, "webdevelop/results.html", {"datas": data})
But when I try to open the page that shows the data from the MySQL database on the deployed Heroku site, it shows the error: "OperationalError at /results/
(2003, "Can't connect to MySQL server on 'localhost' ([Errno 111] Connection refused)")". How can I get this Heroku project to connect to my local MySQL database? Or should I choose an alternative?
Firstly, you need to ensure that the user and password you're using to connect to MySQL are correct, and that the user has the correct privileges to work with the selected database.
Then you can check that mysql is accepting connections on localhost.
As for directly addressing the Connection Refused exception, check things like the mysql socket used to communicate with localhost applications like your Django project. The socket must exist and be configured in MySQL.
I also recommend taking a look at something like SQLAlchemy for Python which will help you interact directly with the database using Python objects. For example,
Connecting to the database:
from sqlalchemy import *
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, relationship, scoped_session, mapper
from config import DB_URL
"""Database Declaration"""
metadata = MetaData()
Base = declarative_base(name='Base', mapper=mapper, metadata=metadata)
engine = create_engine(DB_URL, pool_recycle=1800)
Session = sessionmaker(bind=engine, autocommit=False, autoflush=True)
session = scoped_session(Session)
You can now use session variable to perform queries and updates using its inherited functions from the SQLAlchemy Session class.
SQLAlchemy also includes a declarative model for telling Python what your tables look like. For example,
class Clinic(Base):
    __tablename__ = 'clinic'

    clinic_id = Column(Integer, primary_key=True)
    clinic_name = Column(VARCHAR)
    address = Column(VARCHAR)
    city = Column(VARCHAR)
    zip = Column(VARCHAR)
    phone = Column(VARCHAR)
    user_id = Column(VARCHAR)
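A query through the scoped session might then look like this (a sketch; the city value is just an example):
# Fetch all clinics in a given city, ordered by name
clinics = (session.query(Clinic)
           .filter(Clinic.city == 'Boston')
           .order_by(Clinic.clinic_name)
           .all())
for clinic in clinics:
    print(clinic.clinic_name)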
These examples work well for my projects in Flask and should work well enough in Django.

Django : Dynamically create a database

I would like to implement a web page in Django which allows a user to dynamically create a database.
The Django model stored will always be the same (a generic user).
Scenario :
The user fills up a form with fields like user.firstname, user.lastname ..
Then after the submit, django creates a brand new DB on the DBMS.
The model is then stored in this new DB.
Django save/store this DB setting for further use.
I found out here that it's possible to set up multiple DBs in Django, but that implies that they have already been created (which is obviously not my case).
To create my DB, I can think of executing a custom SQL query directly on the DBMS, like this:
from django.db import connection

def createNewDB(self, id):
    with connection.cursor() as cursor:
        queryStr = 'CREATE DATABASE "' + id + '"'
        cursor.execute(queryStr)
But then I have no clues about :
How to save the new DB settings in Django DB (see below).
How to migrate and save the model user in the DB
Migrate :
from django.core.management import call_command
call_command("migrate", interactive=False)
Save model : user.save(using='db_id')
I found this post talking about how to add a connection dynamically with:
from django.db import connections
connections.databases['new-alias'] = { ... }
conn = connections['new-alias']
But again, in this case the SQLite DB exists beforehand, so I'm not sure it is what I need.
So is it possible to create a DB, link it to a model, and then add it to Django's database settings?
Or should I review my whole data structure? (e.g. one DB but multiple user model tables)
I am using Django 2.0.5 and PostgreSQL 9.6, but I can change (to SQLite, for example) if the solution is not compatible.
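A minimal sketch of the flow described above, assuming PostgreSQL; the alias, credentials and helper name are placeholders, not a vetted solution:
from django.core.management import call_command
from django.db import connection, connections

def create_user_database(db_name):
    # db_name should be validated/whitelisted before being interpolated
    # 1. Create the database itself (Django's default autocommit mode lets
    #    CREATE DATABASE run outside a transaction block)
    with connection.cursor() as cursor:
        cursor.execute('CREATE DATABASE "%s"' % db_name)

    # 2. Register the new database at runtime under its own alias
    connections.databases[db_name] = {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': db_name,
        'USER': 'myuser',
        'PASSWORD': 'mypassword',
        'HOST': 'localhost',
        'PORT': '5432',
    }

    # 3. Create the model tables in the new database
    call_command('migrate', database=db_name, interactive=False)

# Later: user.save(using=db_name)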

Django how to use connection_created signal

I am looking to find out when a connection is made to my Django database, or when my Django server is restarted. I found the connection_created Django signal. The description is:
Sent when the database wrapper makes the initial connection to the database. This is particularly useful if you’d like to send any post connection commands to the SQL backend.
So I think using this signal will be a good solution for my case. I want to run a function once the connection is made. I can't find any documentation on the use cases of this signal. connection_created.connect is probably the function to use. This function takes in a bunch of arguments, but the ones that are relevant are self, receiver, sender and weak. Does anyone know how I can use these arguments and this function to run my function on a new connection?
Also if anyone has any alternative solutions other than this signal, I'd love to hear them.
I have all my tables distributed among dynamic postgres table schemas, and use the connection signal to set the search path of the connection, since django does not support postgres schemas.
in myapp/apps.py
from django.apps import AppConfig
from django.db.backends.signals import connection_created

class MyappConfig(AppConfig):
    name = 'myapp'

    def ready(self):
        from myapp.schema_manager import new_connection
        connection_created.connect(new_connection)
in myapp/schema_manager.py
def new_connection(sender, connection, **kwargs):
    search_path = ['public'] + get_current_schemas()  # fetch the active schemas
    connection.cursor().execute("SET search_path TO %s;" % ', '.join(search_path))
According to the docs, this signal receives two arguments:
sender
The database wrapper class – i.e. django.db.backends.postgresql.DatabaseWrapper or django.db.backends.mysql.DatabaseWrapper, etc.
connection
The database connection that was opened. This can be used in a multiple-database configuration to differentiate connection signals from different databases.
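For instance, a receiver can use these arguments to react only to a specific backend or database alias (a sketch; the alias and the SQL statement are just examples):
def on_new_connection(sender, connection, **kwargs):
    # Only act on PostgreSQL connections for the 'default' database alias
    if connection.vendor == 'postgresql' and connection.alias == 'default':
        with connection.cursor() as cursor:
            cursor.execute("SET search_path TO public;")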
... since django does not support postgres schemas
Django supports postgres schemas:
class MyModel(models.Model):
    id = models.IntegerField(primary_key=True)
    title = models.TextField()

    class Meta:
        db_table = '"schema_name"."table_name"'
I use this notation in all of our projects.

Specifying Readonly access for Django.db connection object

I have a series of integration-level tests that are being run as a management command in my Django project. These tests are verifying the integrity of a large amount of weather data ingested from external sources into my database. Because I have such a large amount of data, I really have to test against my production database for the tests to be meaningful. What I'm trying to figure out is how I can define a read-only database connection that is specific to that command or connection object. I should also add that these tests can't go through the ORM, so I need to execute raw SQL.
The structure of my test looks like this
import sys
import unittest

from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = 'Runs Integration Tests and Query Tests against Prod Database'

    def handle(self, *args, **options):
        suite = unittest.TestLoader().loadTestsFromTestCase(TestWeatherModel)
        ret = unittest.TextTestRunner().run(suite)
        if len(ret.failures) != 0:
            sys.exit(1)
        else:
            sys.exit(0)

class TestWeatherModel(unittest.TestCase):
    def testCollectWeatherDataHist(self):
        wm = WeatherManager()
        wm.CollectWeatherData()
        self.assertTrue(wm.weatherData is not None)
And the WeatherManager.CollectWeatherData() method would look like this:
def CollectWeatherData(self):
    cur = connection.cursor()
    cur.execute(<Raw SQL Query>)
    self.weatherData = cur.fetchall()
    cur.close()
I want to somehow idiot-proof this, so that someone else (or me) can't come along later and accidentally write a test that would modify the production database.
You can achieve this by hooking into Django's connection_created signal, and
then making the transaction read-only.
The following works for PostgreSQL:
from django.apps import AppConfig
from django.db.backends.signals import connection_created

class MyappConfig(AppConfig):
    def ready(self):
        def connection_created_handler(connection, **kwargs):
            with connection.cursor() as cursor:
                cursor.execute('SET default_transaction_read_only = true;')

        connection_created.connect(connection_created_handler, weak=False)
This can be useful for some specific Django settings (e.g. to run development
code with runserver against the production DB), where you do not want to
create a real read-only DB user.
Alternatively, the same effect can be had without the signal by passing the option directly to PostgreSQL in the database settings:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'USER': 'myusername',
        'PASSWORD': 'mypassword',
        'HOST': 'myhost',
        'OPTIONS': {
            'options': '-c default_transaction_read_only=on'
        }
    }
}
Source: https://nejc.saje.info/django-postgresql-readonly.html
Man, once again, I should read the docs more carefully before I post questions here. I can define a readonly connection to my production database in the settings file, and then straight from the docs:
If you are using more than one database, you can use django.db.connections to obtain the connection (and cursor) for a specific database. django.db.connections is a dictionary-like object that allows you to retrieve a specific connection using its alias:
from django.db import connections
cursor = connections['my_db_alias'].cursor()
# Your code here...
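For completeness, the extra alias in settings.py might look something like this (a sketch; the alias name and credentials are placeholders, and the read-only enforcement comes either from a DB user granted only SELECT or from the default_transaction_read_only option shown in the previous answer):
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'prod_db',
        'USER': 'app_user',
        'PASSWORD': 'secret',
        'HOST': 'prod-db-host',
    },
    'my_db_alias': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'prod_db',
        'USER': 'readonly_user',    # DB user granted SELECT only
        'PASSWORD': 'secret',
        'HOST': 'prod-db-host',
        'OPTIONS': {
            'options': '-c default_transaction_read_only=on'
        },
    },
}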
If you add a serializer for your model, you can specify in the serializer the fields that work in read-only mode:
class AccountSerializer(serializers.ModelSerializer):
    class Meta:
        model = Account
        fields = ('id', 'account_name', 'users', 'created')
        read_only_fields = ('account_name',)
from http://www.django-rest-framework.org/api-guide/serializers/#specifying-read-only-fields

Multiple Connection on mongoengine.. Give me some examples~

I'm using mongoengine with django.
In my web application project, I need to connect to at least two servers: one that is local, for sessions, and another at MongoLab (a MongoDB hosting service company).
So I need to connect to localhost for sessions while also connecting to another, remote server.
How can I implement multiple connections on mongoengine?
give me some examples please.
a Web application --- connecting ---> localhost for session
--- connecting ---> mongolab for application database
Use the mongoengine dev branch.
It supports multiple DB connections with aliases: https://github.com/hmarr/mongoengine/commit/8d2bc444bb64265f78f5bf716f773742dddd56c1
See these tests:
https://github.com/hmarr/mongoengine/blob/dev/tests/document.py#L2584
According to the mongoengine documentation, to use multiple databases you can call connect() and provide an alias name for each connection. In the background this uses register_connection() to store the data, and you can register all aliases up front if required.
connect(alias='user-db-alias', db='user-db')
connect(alias='book-db-alias', db='book-db')
connect(alias='users-books-db-alias', db='users-books-db')

class User(Document):
    name = StringField()
    meta = {'db_alias': 'user-db-alias'}

class Book(Document):
    name = StringField()
    meta = {'db_alias': 'book-db-alias'}

class AuthorBooks(Document):
    author = ReferenceField(User)
    book = ReferenceField(Book)
    meta = {'db_alias': 'users-books-db-alias'}
You can also use the switch_db() context manager, as sketched below.
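A minimal sketch of switch_db(), assuming the User document and the aliases from the example above:
from mongoengine.context_managers import switch_db

# Temporarily read/write User through a different connection alias
with switch_db(User, 'book-db-alias') as UserInBookDb:
    UserInBookDb(name='John').save()   # stored in book-db instead of user-db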
The function disconnect() can be used to disconnect a particular connection. This can be used to change a connection globally:
from mongoengine import connect, disconnect

connect('a_db', alias='db1')

class User(Document):
    name = StringField()
    meta = {'db_alias': 'db1'}

disconnect(alias='db1')

connect('another_db', alias='db1')

Categories

Resources