I am building a Django project that uses a relational DB (SQLite for development purposes) and a non-relational DB (OrientDB). This is my first time using a non-relational DB and I'm having difficulty getting it set up with Django.
The use of OrientDB in my project is solely to keep track of friend relationships and friend-of-friend relationships, while all other user data is being stored in my relational DB.
I know I need to register the DB in my settings file. I am trying to do something like this:
# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'friends': {
        'NAME': 'friends',
        'ENGINE': 'django.db.backends.orientdb',
        'USER': 'root',
        'PASSWORD': 'hello',
        'HOST': '',
        'PORT': '2480',
    }
}
When I do this, however, I get the error:
No module named 'django.db.backends.orientdb'
Is this backend module something I have to create myself, or can I manually connect to the DB in my code whenever I need something specific done? For example, whenever someone creates a new user in my SQLite DB, can I use a post_save signal to
connect to OrientDB,
create a friend instance in OrientDB, and
disconnect from OrientDB?
It seems like there ought to be a much cleaner way of doing this.
This is almost certainly something you'll need to build yourself, though your use case doesn't sound like it requires a whole Django backend. A few manual queries might be enough.
Django officially supports PostgreSQL, MySQL, SQLite, and Oracle. There are third-party backends for SAP SQL Anywhere, IBM DB2, Microsoft SQL Server, Firebird, and ODBC.
There is an abandoned project that attempted to provide an OrientDB backend for Django, but it hasn't been updated in quite a long time and likely needs a lot of love:
This project isn't maintained anymore, feel free to fork and keep it alive.
No matter how you choose to proceed, you should probably take a look at OrientDB's Python library.
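If you go the manual route with signals, a sketch along these lines might be a starting point. This is only a rough illustration using the pyorient driver; the database name, credentials, port, vertex class Friend and its fields are all placeholders, and in production you'd want to reuse a connection rather than open one per save:

# signals.py -- rough sketch only, not a drop-in solution.
# Assumes the pyorient driver, an OrientDB database called "friends"
# and a vertex class named Friend; names, port and credentials are placeholders.
import pyorient
from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=User)
def create_friend_vertex(sender, instance, created, **kwargs):
    if not created:
        return
    client = pyorient.OrientDB("localhost", 2424)  # 2424 = binary protocol port
    try:
        client.db_open("friends", "root", "hello")
        # Mirror the new Django user as a vertex in the friends graph.
        client.command(
            "CREATE VERTEX Friend SET user_id = %d, username = '%s'"
            % (instance.pk, instance.username)
        )
    finally:
        client.db_close()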
Related
I need to develop a new Django project (let's call it new_django) using a SQL Server 2019 database named AppsDB, which already hosts another Django project (let's call it old_django). The two apps are completely separate from each other. Unfortunately, I can't get a new database for each new Django project, so I have to reuse AppsDB. What I don't understand is: how can I tell Django not to overwrite the existing auth_... and django_... tables generated by old_django?
My first idea was to use different schemas for the two projects, but Django doesn't support this with a SQL Server database as far as I know. Some workarounds suggest changing the database's default schema for a given user, like this answer. But I won't get a new user for every project either, and relying on manually changing the DB schema every time before I migrate something will most certainly cause a mess at some point.
I'm stuck with the current setup and would like to know if anyone has come up with a more elegant solution or a different approach to my problem.
Any help is much appreciated!
All you need to do is create a new database on the MSSQL server and then point your Django application at it, like this:
DATABASES = {
    'default': {
        'ENGINE': 'mssql',
        'NAME': 'YOUR_DATABASE_NAME',
        'USER': 'DB_USER',
        'PASSWORD': 'DB_PASSWORD',
        'HOST': 'YOUR_DATABASE_HOST',
        'PORT': '',
        'OPTIONS': {
            'driver': 'ODBC Driver 13 for SQL Server',
        },
    }
}
I'm already deploying my Django app; I'm using PostgreSQL as my database and Heroku for hosting. However, I don't know what I should put in HOST instead of localhost.
note: this works perfectly if I run it locally.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'trilz',
        'USER': 'postgres',
        'PASSWORD': 'franz123',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
You should probably use environment variables (the prior link actually uses databases as an example). Hardcoded values leave you vulnerable to a bunch of different risks. The Django guide also covers connection files.
Once you're using those, you need to figure out where your Postgres database is running. localhost means "my machine" (i.e. the same machine that runs the Django app). If you're using a database-as-a-service, it will expose all the environment variables you need. If you're using something like Heroku, it exposes environment variables to the running service that you can use. If you're running a Kubernetes/Docker setup you control yourself, then you decide where the database runs and HOST needs to point at it.
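For example, a minimal sketch of settings that read the connection details from the environment (the DB_NAME, DB_USER, etc. variable names are just a convention I picked, not anything Django requires):

# settings.py -- read the connection details from the environment instead of
# hard-coding them; the DB_* variable names are arbitrary.
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DB_NAME', 'trilz'),
        'USER': os.environ.get('DB_USER', 'postgres'),
        'PASSWORD': os.environ.get('DB_PASSWORD', ''),
        'HOST': os.environ.get('DB_HOST', 'localhost'),
        'PORT': os.environ.get('DB_PORT', '5432'),
    }
}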
For Heroku
I've used https://pypi.org/project/dj-database-url/ for a hobby project (it's no longer maintained, but it still worked the last time I used it).
My config then looked like this:
DATABASES = {"default": {"ENGINE": "django.db.backends.sqlite3", "NAME": "mydatabase"}}

if "DATABASE_URL" in os.environ:
    logger.info("Adding $DATABASE_URL to default DATABASE Django setting.")
    DATABASES["default"] = dj_database_url.config(conn_max_age=600)
    DATABASES["default"]["init_command"] = "SET sql_mode='STRICT_TRANS_TABLES'"
That gives you a working SQLite fallback if no URL is set; you can use something else if you'd like. Heroku exposes an environment variable called DATABASE_URL that points at the database you configured in Heroku, which the if "DATABASE_URL" in os.environ: check picks up and then uses. Did this provide a sufficient answer?
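For reference, Heroku's DATABASE_URL is a single connection string and dj_database_url just splits it into the pieces Django expects. Roughly like this (the URL below is made up):

import dj_database_url

# A made-up example of the kind of URL Heroku puts in $DATABASE_URL:
url = "postgres://myuser:mypassword@ec2-1-2-3-4.compute-1.amazonaws.com:5432/mydb"

# parse() (config() does the same but reads os.environ["DATABASE_URL"])
# returns a Django-style settings dict, roughly:
# {'ENGINE': 'django.db.backends.postgresql', 'NAME': 'mydb',
#  'USER': 'myuser', 'PASSWORD': 'mypassword',
#  'HOST': 'ec2-1-2-3-4.compute-1.amazonaws.com', 'PORT': 5432,
#  'CONN_MAX_AGE': 600}
print(dj_database_url.parse(url, conn_max_age=600))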
I have a project running in Django and connecting to the SQLAlchemy ORM via sessionmaker, as shown below. It basically handles HTTP methods with a specified API (GET, POST, DELETE) and returns, posts, updates or deletes DB entries.
from sqlalchemy import create_engine
from sqlalchemy.orm.session import sessionmaker

self.session = sessionmaker(
    bind=create_engine('mysql+pymysql://user:pw@127.0.0.1/db')
)
Under myproject.settings I am using defaults like 'ENGINE': 'django.db.backends.sqlite3'.
I would like to test whether the API is working as intended by simply iterating through all the methods and URIs that seem necessary to test. Testing is done with Django's TestCase class and its Client module. That works quite well.
My Problem:
My tests are altering (especially deleting and updating data in) the real DB. Rather than using the test_db that Django's test output says is "created and destroyed", they are using the real DB.
I kinda get why (I am bypassing Django's built-in db-connection with my SQLAlchemy connection), but I am interested in how to fix this, i.e. using a true test_db.
Currently I am using a read-only mysql-user for testing, but that prevents me from testing actual POST and DELETE requests.
I could try to use a different db for testing by mocking, but I would prefer another solution (I would have to create a dummy db from the real one, every time I ran a test)
PS: If you feel I have not provided enough code, give me a hint. But I think people will get the idea of my problem, and the solution is probably the MySQL integration into Django, which I haven't needed to do properly yet. Or, more accurately, which I could never get working whenever I tried.
EDIT: When trying to configure my database to
DATABASES = {
    'default': {
        # 'ENGINE': 'django.db.backends.sqlite3',
        # 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'db',
        'USER': 'user',
        'PASSWORD': 'pw',
        'HOST': '127.0.0.1',
    }
}
I get django.db.utils.OperationalError: (2006, 'SSL connection error: SSL_CTX_set_tmp_dh failed'), which I figure is due to not using pymysql here.
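One direction that might help (a sketch only, assuming the MySQL DATABASES entry from the EDIT above and pymysql installed): build the SQLAlchemy URL from Django's own connection settings at session-creation time instead of hard-coding it. Django's test runner swaps NAME for the test database in the connection settings, so a lazily created engine would follow it:

# sketch: derive the SQLAlchemy URL from Django's connection settings so that
# test runs (which swap NAME for the test database) are picked up automatically
from django.db import connection
from sqlalchemy import create_engine
from sqlalchemy.orm.session import sessionmaker

def make_session():
    cfg = connection.settings_dict  # reflects the test DB during ./manage.py test
    url = 'mysql+pymysql://{user}:{pw}@{host}:{port}/{name}'.format(
        user=cfg['USER'],
        pw=cfg['PASSWORD'],
        host=cfg['HOST'] or '127.0.0.1',
        port=cfg['PORT'] or '3306',
        name=cfg['NAME'],
    )
    return sessionmaker(bind=create_engine(url))()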
I recently switched from a SQLite DB to a PostgreSQL DB for my Django project. I was not far in, so I did no migrations and just started with a clean DB. I followed the instructions found here: https://stackoverflow.com/a/5421511/3681278
Things are going swimmingly, and things updated and added via pgAdmin III are showing up in the admin screen. When I attempt to add models and run syncdb, it does not fail; it executes and seems to work, but nothing in the database is changed.
Also, posting changes via models that would normally add/change/update/delete database entries has no effect.
I have searched high and low for a solution, to no avail.
A hopefully helpful clue:
When I change a model name or delete a model, I am asked if I want to delete the old models. So the models must be generating a table somewhere, but once again there is no effect on the PostgreSQL database.
Here is my settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'RED_DB',
        'USER': 'postgres',
        'PASSWORD': 'MyPass',
        'HOST': ''
    }
}
Thanks in advance!
syncdb isn't a command that you can run after you have modified the models (that calls for migrations); most developers use a tool called South. This is a pluggable app for Django that handles migrations.
EDIT: Since Django 1.7, migrations are supported out of the box; take a look at the documentation: https://docs.djangoproject.com/en/dev/topics/migrations/
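With the built-in migrations (Django 1.7+), the usual workflow after changing a model is roughly the following ("myapp" is a placeholder for your app's name):

python manage.py makemigrations myapp
python manage.py migrate

makemigrations records the model change in a migration file, and migrate applies it to the PostgreSQL database configured above.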
I am writing a Django application where I already have one MySQL backend DB configured in my settings.py.
I know we can add as many DB configurations as we want, but that's hard-coding, which I don't want to do; in fact, I can't possibly do it, as I have to connect ad hoc to roughly 70-80 different remote machines, query them, and fetch the results.
I am planning to connect to those machines via their IP address.
I am comparatively new to Django, so I was wondering if we can somehow make a function which queries a machine by putting in a configuration something like:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbName',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': '',
        'PORT': '3306'
    }
}
So instead of DATABASES and default, I could configure my function to change the configuration, through an Ajax call or something!
Fortunately, every machine I have to connect to uses mysql so no problem with that.
I looked into the MySQL-Python connector, but I'm not sure if I should use it as I already have MySQLdb installed. I also have to run some raw queries :|
Could anyone guide me for what would be the best approach for this situation?
P.S.: I have also looked at this post, which discusses connecting to a remote MySQL machine from a local one, but that's of no help for me :(
I believe there are quite a few paths you can take, three of which are:
Add all your connections in DATABASES, which you said you don't want to do because you have so many connections.
You could connect using Python's MySQL library directly (see the sketch after this list). If you do this, I don't think you'll get to use Django's nice ORM.
Look at how Django wraps connections to allow you to use its ORM. I did some quick searches about manually establishing a connection using the Django ORM but didn't find anything; all the answers are in the source code. I believe you can just instantiate your own connections and interact with your remote database using the ORM. I don't have time to look through it now, but everything is in the source.
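For the second path, here is a minimal sketch with MySQLdb; the credentials, database name and query below are placeholders, and error handling and connection pooling are left out:

# Sketch for option 2: ad-hoc raw queries against many remote MySQL hosts
# using MySQLdb directly. Credentials, db name and query are placeholders.
import MySQLdb

def run_on_host(ip, query, params=None):
    conn = MySQLdb.connect(host=ip, port=3306, user='root',
                           passwd='root', db='dbName')
    try:
        cursor = conn.cursor()
        cursor.execute(query, params)
        return cursor.fetchall()
    finally:
        conn.close()

# e.g. results = {ip: run_on_host(ip, 'SELECT VERSION()') for ip in machine_ips}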