I'm trying to implement a failover strategy when my MySQL backend is down in Celery.
I found in another Stack Overflow answer that failover is possible in SQLAlchemy. However, I couldn't reproduce the same behavior in Celery using sqlalchemy_engine_options:
app.conf.result_backend = 'db+mysql://scott:tiger@localhost/foo'
app.conf.sqlalchemy_engine_options = {
    'connect_args': {
        'failover': [{
            'user': 'root',
            'password': 'password',
            'host': 'other_db.com',
            'database': 'dbname'
        }]
    }
}
What I'm trying to do: if the first backend (scott:tiger) does not respond, switch to the root:password backend.
There is definitely more than one way to achieve failover. You could start with a simple try..except that handles the situation when your preferred backend is not responding; in the simplest (and probably not very Pythonic) way you could try something like this:
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

try:
    engine = create_engine(primary_url)  # initialise your preferred backend
    engine.connect()                     # fails fast if it is down
except OperationalError:
    engine = create_engine(backup_url)   # initialise your backup backend
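The same try..except idea can be factored into a reusable helper that walks a list of candidate connections in priority order; a minimal framework-agnostic sketch (the factory callables and their ordering are assumptions, not part of Celery or SQLAlchemy):

```python
def connect_with_failover(factories):
    """Try each zero-argument connection factory in order.

    Returns the first successful result, e.g. for factories like
    lambda: create_engine(url).connect(). Re-raises the last error
    if every candidate fails.
    """
    last_error = None
    for factory in factories:
        try:
            return factory()
        except Exception as error:
            last_error = error
    raise last_error
```

With SQLAlchemy this could be called as connect_with_failover([lambda: create_engine(primary).connect(), lambda: create_engine(backup).connect()]).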
You could also move backend selection into your infrastructure so it is transparent from the application's perspective, e.g. by using a connection pooling system (I am not a MySQL user, but in the PostgreSQL world we have pgpool).
--- edit ---
I realised you probably want your database session and connection handled by celery itself, so the above very likely does not answer your question directly. In my own simple project I initialise the database connection within the tasks that require it, as in my particular case most tasks do not need the database at all.
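When the connection is initialised inside tasks, it helps to memoise it so each worker process builds its backend only once. A minimal sketch (the dict returned by get_backend is a placeholder standing in for real setup such as sessionmaker(bind=create_engine(url))):

```python
import functools


@functools.lru_cache(maxsize=None)
def get_backend(url):
    # Placeholder for real setup such as
    # sessionmaker(bind=create_engine(url)).
    # lru_cache guarantees one backend object per URL
    # per worker process, created on first use.
    return {"url": url}
```

A task body would then call get_backend(url) instead of building a connection on every invocation.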
I need to develop a new django project (let's call it new_django) using a SQL Server 2019 database named AppsDB which already hosts another django project (let's call it old_django). The two apps are completely separate from each other. Unfortunately, I can't get a new database for each new django project, so I have to reuse AppsDB. What I don't understand is, how can I tell django not to overwrite the existing auth_... and django_... tables generated by old_django?
My first idea was to use different schemas for the two projects, but as far as I know django doesn't support this with a SQL Server database. Some workarounds suggest changing the database default schema for a given user, as in this answer. But I won't get a new user for every project either, and relying on manually changing the db schema every time before I migrate something will most certainly cause a mess at some point.
I'm stuck with the current setup and would like to know if anyone has come up with a more elegant solution or a different approach to my problem.
Any help is much appreciated!
All you need to do is create a new database on the MSSQL server and then point your django application at it, like this:
DATABASES = {
    'default': {
        'ENGINE': 'mssql',
        'NAME': 'YOUR_DATABASE_NAME',
        'USER': 'DB_USER',
        'PASSWORD': 'DB_PASSWORD',
        'HOST': 'YOUR_DATABASE_HOST',
        'PORT': '',
        'OPTIONS': {
            'driver': 'ODBC Driver 13 for SQL Server',
        },
    }
}
I have a project running in Django and connecting to the SQLAlchemy ORM via sessionmaker, as shown below. It basically handles the HTTP methods of a specified API (GET, POST, DELETE) and returns, creates, updates or deletes db entries.
from sqlalchemy import create_engine
from sqlalchemy.orm.session import sessionmaker

self.session = sessionmaker(
    bind=create_engine('mysql+pymysql://user:pw@127.0.0.1/db'))
Under myproject.settings I am using defaults like 'ENGINE': 'django.db.backends.sqlite3'.
I would like to test whether the API works as intended by simply iterating through all the methods and URIs that seem necessary to test. Testing is done with Django's TestCase class and its Client module, and works quite well.
My Problem:
It is altering (especially deleting and updating columns within) the real db. Rather than using the "created and destroyed test_db" that Django's test output indicates, it is using the real db.
I kinda get why (I am bypassing Django's built-in db connection with my SQLAlchemy connection), but I am interested in how to fix this, i.e. how to use a true test_db.
Currently I am using a read-only mysql-user for testing, but that prevents me from testing actual POST and DELETE requests.
I could try to use a different db for testing by mocking, but I would prefer another solution (I would have to create a dummy db from the real one every time I ran a test).
PS: If you feel I have not provided enough code, give me a hint. But I feel people might get the idea of my problem, and the solution is probably the proper mysql integration into Django, which I have not yet needed to do. Or, more accurately, which I could not get working whenever I tried.
EDIT: When trying to configure my database as
DATABASES = {
    'default': {
        # 'ENGINE': 'django.db.backends.sqlite3',
        # 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'db',
        'USER': 'user',
        'PASSWORD': 'pw',
        'HOST': '127.0.0.1',
    }
}
I get django.db.utils.OperationalError: (2006, 'SSL connection error: SSL_CTX_set_tmp_dh failed'), which I figure is due to not using pymysql here.
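One way to make SQLAlchemy follow whatever database the Django test runner sets up is to build the connection URL from django.conf.settings.DATABASES at runtime instead of hard-coding it: during tests, Django swaps NAME for the test database name, so an engine created from that URL connects to the same place. A minimal sketch (the key names follow Django's DATABASES convention; the mysql+pymysql driver prefix is an assumption matching the snippet above):

```python
def sqlalchemy_url(db):
    """Build a SQLAlchemy URL from a Django DATABASES entry.

    During `manage.py test`, Django replaces NAME with the test
    database name, so an engine built from this URL automatically
    targets the test db.
    """
    return "mysql+pymysql://{USER}:{PASSWORD}@{HOST}/{NAME}".format(
        USER=db["USER"], PASSWORD=db["PASSWORD"],
        HOST=db["HOST"], NAME=db["NAME"])
```

The engine would then be created lazily, e.g. create_engine(sqlalchemy_url(settings.DATABASES['default'])) inside the session setup rather than at import time, so the test-time NAME is picked up.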
I am building a Django project that uses a relational DB (for development purposes SQLite) and a non-relational DB (OrientDB). This is my first time using a non-relational DB and I'm having difficulty getting it set up with Django.
The use of OrientDB in my project is solely to keep track of friend relationships and friend-of-friend relationships, while all other user data is being stored in my relational DB.
I know I need to register the DB in my settings file. I am trying to do something like this:
# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'friends': {
        'NAME': 'friends',
        'ENGINE': 'django.db.backends.orientdb',
        'USER': 'root',
        'PASSWORD': 'hello',
        'HOST': '',
        'PORT': '2480',
    }
}
When I do this, however, I get the error:
No module named 'django.db.backends.orientdb'
Is this backend module something I have to create myself, or can I manually connect to the DB in my code whenever I need something specific done? For example, whenever someone creates a new user in my SQLite DB, can I use a post_save signal to:
connect to OrientDB,
create a friend instance in OrientDB, and
disconnect from OrientDB?
It seems like there ought to be a much cleaner way of doing this.
This is almost certainly something you'll need to build yourself, though your use case doesn't sound like it requires a whole Django backend. A few manual queries might be enough.
Django officially supports PostgreSQL, MySQL, SQLite, and Oracle. There are third-party backends for SAP SQL Anywhere, IBM DB2, Microsoft SQL Server, Firebird, and ODBC.
There is an abandoned project that attempted to provide an OrientDB backend for Django, but it hasn't been updated in a long time and likely needs a lot of love. Its README says:
This project isn't maintained anymore, feel free to fork and keep it alive.
No matter how you choose to proceed, you should probably take a look at OrientDB's Python library.
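For the signal-driven approach described in the question, the OrientDB side can stay as a few manual SQL commands sent through the Python client. A minimal sketch of building such a command (the Friend vertex class and user_id property are assumptions about your graph schema; actually executing it through the client is left out):

```python
def create_friend_command(user_id):
    # Build an OrientDB SQL statement that inserts a Friend vertex.
    # int() guards against injection through the interpolated value.
    return "CREATE VERTEX Friend SET user_id = {}".format(int(user_id))
```

A post_save handler would build this string and pass it to the client's command-execution call, keeping the graph side to a handful of explicit queries instead of a full Django backend.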
I would like to get my celery worker process to talk to the django test database.
It's an Oracle database, so I believe the database/user is already created.
I am just trying to figure out what to pass in the Celery app configuration to get it to talk to the TEST database.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.oracle',
        .............
        'TEST': {
            'USER': WIT_TEST_DB_USER,
            'PASSWORD': WIT_TEST_DB_USER,
        }
    }
}
I have seen a Stack Overflow article that talks about passing the settings.conf from the parent test setup() to the worker process. That may be necessary when the test database file is automatically generated, as with SQLite databases.
In my case it's a well-defined Oracle test database that I think is already part of the config/settings files.
So I am looking for a way to directly start the worker process independently of the testrunner/testcase code.
Can someone suggest an approach to doing this?
You are treating your test database as an ordinary database, so I think the best solution would be to define it as the default database under DATABASES in a separate settings file. When running your worker, you can then point it at that settings file like this:
export DJANGO_SETTINGS_MODULE='[python path to your celery specific settings file]'
celery -A yourproject worker --loglevel=info  # 'yourproject' is a placeholder for your Celery app
I am writing a Django application where I already have 1 mysql backend db configured in my settings.py.
I know we can add as many db configurations as we want, but that's hard-coding, which I don't want; rather, I can't possibly do it, as I have to connect ad hoc to about 70-80 different remote machines to query and fetch results.
I am planning to connect to those machines via their IP address.
I am comparatively new to Django, so I was wondering if we can somehow make a function which queries a machine given a configuration something like:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbName',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': '',
        'PORT': '3306'
    }
}
So instead of the static DATABASES and default entries, I could have my function change the configuration, through an Ajax call or something!
Fortunately, every machine I have to connect to uses mysql so no problem with that.
I looked into the mysql-python connector, but I'm not sure if I should use it, as I already have MySQLdb installed. I also have to run some raw queries too :|
Could anyone guide me on the best approach for this situation?
P.S.: I have also looked at this post, which discusses connecting to a remote mysql machine from local. But that's of no help for me :( :(
I believe there are quite a few paths you can take, 3 of which are:
Add all your connections in DATABASES - which you said you don't want to do because you have so many connections
You could connect using Python's mysql library directly. If you do this, I don't think you'll get to use django's nice ORM
Look at how django wraps connections to allow you to use its ORM. I did some quick searches about manually establishing a connection using the django ORM but didn't find anything; the answers are in the source code. I believe you can instantiate your own connections and interact with your remote database through the ORM. I don't have time to look through it now, but everything is in the source
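For the third path, one commonly cited approach (an assumption here, not verified against your Django version) is to register a new connection alias at runtime via django.db.connections.databases['some_alias'] = cfg and then use connections['some_alias'].cursor(). Building the per-machine config dict is plain Python; a minimal sketch:

```python
def mysql_db_config(host, name, user, password, port="3306"):
    # Build a Django DATABASES-style entry for an ad-hoc MySQL machine.
    # The caller keys it under a runtime alias, e.g. the machine's IP.
    return {
        "ENGINE": "django.db.backends.mysql",
        "NAME": name,
        "USER": user,
        "PASSWORD": password,
        "HOST": host,
        "PORT": port,
    }
```

Each of the 70-80 machines then only needs its IP and credentials at call time, with no entry hard-coded in settings.py.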