I recently switched from a SQLite DB to a PostgreSQL DB for my Django project. I was not far in, so I did no migrations and just started with a clean DB. I followed the instructions found here: https://stackoverflow.com/a/5421511/3681278
Things are going swimmingly: rows updated and added via pgAdmin III are showing up in the admin screen. But when I add models and run syncdb, it does not fail; it executes and seems to work, yet nothing in the database is changed.
Also, posting changes via models that would normally add/change/update/delete database entries has no effect.
I have searched high and low for a solution, to no avail.
A hopefully helpful clue:
When I change a model name or delete a model, I am asked if I want to delete the old models. So the models must be generating a table somewhere, but once again there is no effect on the PostgreSQL database.
Here is my settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'RED_DB',
        'USER': 'postgres',
        'PASSWORD': 'MyPass',
        'HOST': ''
    }
}
Thanks in advance!
syncdb isn't a command that you can run after you have modified the models (that's what migrations are for); most developers use a tool called South. This is a pluggable app for Django that handles the migrations.
EDIT: Since Django 1.7, migrations are supported out of the box; take a look at the documentation: https://docs.djangoproject.com/en/dev/topics/migrations/
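With built-in migrations, the usual workflow after changing models is roughly this (a quick sketch, assuming Django 1.7 or later):

python manage.py makemigrations
python manage.py migrate

makemigrations records the model changes as migration files, and migrate applies them to the configured database.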
I need to develop a new Django project (let's call it new_django) using a SQL Server 2019 database named AppsDB, which already hosts another Django project (let's call it old_django). The two apps are completely separate from each other. Unfortunately, I can't get a new database for each new Django project, so I have to reuse AppsDB. What I don't understand is: how can I tell Django not to overwrite the existing auth_... and django_... tables generated by old_django?
My first idea was to use different schemas for the two projects, but Django doesn't support this with a SQL Server database as far as I know. Some workarounds suggest changing the database's default schema for a given user, like this answer. But I won't get a new user for every project either, and relying on manually changing the db schema every time before I migrate something will most certainly cause a mess at some point.
I'm stuck with the current setup and would like to know if anyone has come up with a more elegant solution or a different approach to this problem.
Any help is much appreciated!
All you need to do is create a new database on the MSSQL server and then point your Django application at it, like this:
DATABASES = {
    'default': {
        'ENGINE': 'mssql',
        'NAME': 'YOUR_DATABASE_NAME',
        'USER': 'DB_USER',
        'PASSWORD': 'DB_PASSWORD',
        'HOST': 'YOUR_DATABASE_HOST',
        'PORT': '',
        'OPTIONS': {
            'driver': 'ODBC Driver 13 for SQL Server',
        },
    }
}
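Note that the 'mssql' engine string above is, as far as I know, provided by the third-party mssql-django backend, so that package and a matching ODBC driver need to be installed, for example:

pip install mssql-django pyodbc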
I have a Django application that I wrote 5 years ago, which has been running successfully on Google App Engine - until last month when Google upgraded to Second Generation Cloud SQL.
Currently, I have a settings.py file which includes a database definition that looks like this:
DATABASES = {
    'default': {
        'ENGINE': 'google.appengine.ext.django.backends.rdbms',
        'INSTANCE': 'my-project-name:my-db-instance',
        'NAME': 'my-db-name',
    },
}
Google's upgrade guide tells me that the connection name needs to change from 'my-project-name:my-db-instance' to 'my-project-name:my-region:my-db-instance'. That's simple enough. Changing this leads me to the error:
InternalError at /login/
(0, u'Not authorized to access instance: my-project-name:my-region:my-db-instance')
According to this question, I need to add the prefix '/cloudsql/' to my instance name. So, I changed this (and the ENGINE specification) to give:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'INSTANCE': '/cloudsql/my-project-name:my-region:my-db-instance',
        'NAME': 'my-db-name',
        'USER': 'root',
        'PASSWORD': '******************',
    },
}
I uploaded the modified file to Google (using gcloud app deploy). This time I get a completely different error screen, showing:
Error: Server Error
The server encountered an error and could not complete your request.
Please try again in 30 seconds.
When I look in the Logs, I see:
ImproperlyConfigured: Error loading MySQLdb module: No module named MySQLdb
As Daniel Ocando pointed out in this question, "The rdbms library will not work with an upgraded Second Generation Cloud SQL instance". Connections to the database now need to be made through a Unix domain socket.
Google provides documentation and examples of how to do this. However, I don't find Google's instructions very helpful. For connecting a Python application, they give this example code:
main.py:
db = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL(
        drivername="mysql+pymysql",
        username=db_user,
        password=db_pass,
        database=db_name,
        query={"unix_socket": "/cloudsql/{}".format(cloud_sql_connection_name)},
    ),
)
My application doesn't have a main.py file, so I'm really not sure where to put this code. I have looked at the full example code for this on GitHub, but I am none the wiser about what changes I need to make in my settings.py file, or elsewhere in my application.
My question is: do I really have to go down this route (shifting to the SQLAlchemy library), or can I upgrade my Django app to work with a Second Generation Google Cloud SQL instance simply by making some changes in my settings.py file? And, if so, what changes?
My application uses Django 1.4 and Python 2.7, and I'm not using the Flask framework that Google suggests.
I learned to use Django purely for the purposes of writing this application 5 years ago, but I have not used it since - so I have forgotten pretty much everything I knew about Django and Python.
A few notes here before I start: Django 1.4 and Python 2.7 are no longer supported, so this configuration may or may not work.
You were on the right track up until the part where you introduced the SQLAlchemy instructions. These aren't Django configurations. You should be able to keep your current DATABASES syntax, provided you complete the following steps:
Ensure you add the settings to your app.yaml to allow your new database.
Ensure you have set the Cloud SQL Client (preferred) permissions to the App Engine service account.
Make sure you have mysqlclient as a Python dependency.
The Django on App Engine Flex guide uses PostgreSQL, but it does include some MySQL-specific suggestions that should be useful. There are also sample DATABASES configurations there.
Hope this helps!
Many thanks to glasnt for the helpful suggestions. These put me on the right track, and together with some information I found elsewhere, I got my site back up and running again - much to the delight of my client!
Here are the details of what I had to change:
I updated my app.yaml file and added:
beta_settings:
  cloud_sql_instance: "my-project-name:my-region:my-db-instance"
and also:
libraries:
- name: MySQLdb
  version: "latest"
I updated settings.py, and added:
import MySQLdb
and I updated the DATABASES definition to:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'HOST': '/cloudsql/my-project-name:my-region:my-db-instance',
        'NAME': 'my-db-name',
        'USER': 'root',
        'PASSWORD': 'my-db-password',
    },
}
I found that I did not need to set up any permissions for the App Engine service account. Google's upgrade guide states:
When you upgrade an instance with an authorized App Engine project from First Generation to Second Generation, Cloud SQL creates a special service account that provides the same access as the authorized App Engine project did before the upgrade. Because this service account authorizes access only to a specific instance, rather than the entire project, this service account is not visible in the IAM service account page, and you cannot update or delete it.
So, for a migrated first-gen application, no permissions need to be configured.
I have a project running in Django and connecting to the SQLAlchemy ORM via sessionmaker, as shown below. It basically handles HTTP methods of a specified API (GET, POST, DELETE) and returns, posts, updates or deletes DB entries.
from sqlalchemy import create_engine
from sqlalchemy.orm.session import sessionmaker

self.session = sessionmaker(
    bind=create_engine('mysql+pymysql://user:pw@127.0.0.1/db')
)
Under myproject.settings I am using the defaults, like 'ENGINE': 'django.db.backends.sqlite3'.
I would like to test whether the API is working as intended by simply iterating through all the methods and URIs that seem necessary to test. Testing is done with Django's TestCase class and its Client module. That works quite fine.
My Problem:
It is altering (especially deleting and updating columns within) the real DB. Rather than using the "created and destroyed test_db" that Django's test output indicates, it is using the real DB.
I kinda get why (I am bypassing Django's built-in DB connection with my SQLAlchemy connection), but I am interested in how to fix this, i.e. how to use a true test_db.
Currently I am using a read-only mysql-user for testing, but that prevents me from testing actual POST and DELETE requests.
I could try to use a different db for testing by mocking, but I would prefer another solution (I would have to create a dummy db from the real one, every time I ran a test)
PS: If you feel I have not provided enough code, give me a hint. But I think people will get the idea of my problem, and the solution is probably the MySQL integration into Django, which I have not yet had the need to do properly - or, more accurately, which I could not get working whenever I tried.
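For context, the direction I imagine (a rough, untested sketch) is to build the SQLAlchemy engine from whatever database Django is currently connected to, so that when the test runner swaps in its test database, SQLAlchemy follows automatically. This assumes HOST is filled in (e.g. 127.0.0.1) in the Django settings:

from django.db import connections
from sqlalchemy import create_engine
from sqlalchemy.orm.session import sessionmaker

def make_session():
    # during a TestCase run, settings_dict reflects the test_... database Django created
    db = connections['default'].settings_dict
    url = 'mysql+pymysql://{USER}:{PASSWORD}@{HOST}/{NAME}'.format(**db)
    return sessionmaker(bind=create_engine(url))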
EDIT: When trying to configure my database to
DATABASES = {
    'default': {
        # 'ENGINE': 'django.db.backends.sqlite3',
        # 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'db',
        'USER': 'user',
        'PASSWORD': 'pw',
        'HOST': '127.0.0.1',
    }
}
I get django.db.utils.OperationalError: (2006, 'SSL connection error: SSL_CTX_set_tmp_dh failed'), which I figure is due to not using pymysql here.
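If the goal is to keep using PyMySQL rather than installing mysqlclient, one common recipe (a sketch, untested here) is to register PyMySQL as a drop-in replacement for MySQLdb before Django loads its MySQL backend, e.g. near the top of settings.py or manage.py:

# make Django's 'django.db.backends.mysql' backend use PyMySQL
# instead of the C-based MySQLdb/mysqlclient driver
import pymysql
pymysql.install_as_MySQLdb()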
I have followed the directions in the django-ldapdb README and I cannot seem to get django-ldapdb to act like it's making an LDAP query. The following was set up on a brand-new instance of Django 2.1.2 with Python 3.7:
Changes to settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'ldap': {
        'ENGINE': 'ldapdb.backends.ldap',
        'NAME': 'ldaps://my.server',
        'USER': 'cn=some user',
        'PASSWORD': 'somePassword',
    }
}
DATABASE_ROUTERS = ['ldapdb.router.Router']
New models.py:
import ldapdb.models
from ldapdb.models import fields

class MyPerson(ldapdb.models.Model):
    base_dn = "ou=people,dc=ucsf,dc=edu"
    object_classes = ['person', 'myPerson']

    uid = fields.IntegerField(db_column='uid')
    displayname = fields.CharField(db_column='displayname')
    eid = fields.CharField(db_column='eid')

    def __str__(self):
        return str(self.uid)

    def __unicode__(self):
        return str(self.uid)
The query in my view. First I tried:
result = MyPerson.objects.filter(uid=99894)
Then I tried:
result = MyPerson.objects.using('ldap').filter(uid=99894)
Running the Django dev server in PyCharm's debugger I can see that result receives a QuerySet with a message of:
Unable to get repr for <class 'django.db.models.query.QuerySet'>
What do I mean by "message"? Honestly, I'm not sure, but the debugger shows this:
Also, it seems that although the db member of the QuerySet is 'ldap', the query member shows an SQL query, not an LDAP filter. As I traced the HTTP request through the URL routing, to the view, to the query, and then the result, I never once saw it make any sort of LDAP-related call. For good measure I mangled the LDAP bind password and I don't get a bind error. I'm pretty sure I'm missing something that lets Django know I want to work with LDAP at this point... I just don't know what that is.
As LDAP does not represent a relational database, and its schema is generally created via configuration rather than with queries, it never dawned on me that I needed to run manage.py makemigrations and manage.py migrate. (I'm relatively new to Python, and even more so to Django. Multi-datasource ORMs that I've used and extended for LDAP in the past did not require similar preparations.) On a hunch, I ran the manage.py commands over my LDAP models and then tried my code again. Now it works.
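Concretely, that was just the standard pair of commands (shown here as a reminder; the app label is hypothetical):

python manage.py makemigrations myldapapp
python manage.py migrate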
FWIW - I worked with PHP Symfony for some years and authored ucsf-iam/UcsfLdapOrm. While Symfony also has a DB migration process, LDAP schemas are fairly static, so when I wrote that LDAP ORM I hard-coded part of what Django's migrations take care of on the fly. The rest was handled by PHP annotations in the model classes, similar to how Django has Pythonic field types and relates them to LDAP attribute types. Now that I understand all of this better, I have a deeper appreciation for how Django does ORM setup.
I hope this is instructional for other LDAP developers moving over to Python and Django.
I am building a Django project that uses a relational DB (for development purposes, SQLite) and a non-relational DB (OrientDB). This is my first time using a non-relational DB and I'm having difficulty getting it set up with Django.
The use of OrientDB in my project is solely to keep track of friend relationships and friend-of-friend relationships, while all other user data is being stored in my relational DB.
I know I need to register the DB in my settings file. I am trying to do something like this:
# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'friends': {
        'NAME': 'friends',
        'ENGINE': 'django.db.backends.orientdb',
        'USER': 'root',
        'PASSWORD': 'hello',
        'HOST': '',
        'PORT': '2480',
    }
}
When I do this, however, I get the error:
No module named 'django.db.backends.orientdb'
Is this backend module something I have to create myself, or can I manually connect to the DB in my code whenever I need something specific done? For example, whenever someone creates a new user in my SQLite DB, can I use a post_save signal to
connect to OrientDB,
create a friend instance in OrientDB, and
disconnect from OrientDB?
It seems like there ought to be a much cleaner way of doing this.
This is almost certainly something you'll need to build yourself, though your use case doesn't sound like it requires a whole Django backend. A few manual queries might be enough.
Django officially supports PostgreSQL, MySQL, SQLite, and Oracle. There are third-party backends for SAP SQL Anywhere, IBM DB2, Microsoft SQL Server, Firebird, and ODBC.
There is an abandoned project that attempted to provide an OrientDB backend for Django, but it hasn't been updated in quite a long time and likely needs a lot of love:
This project isn't maintained anymore, feel free to fork and keep it alive.
No matter how you choose to proceed, you should probably take a look at OrientDB's Python library.
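As a rough illustration of the "manual queries" route (a sketch only, assuming the pyorient client library, a running OrientDB server, and a hypothetical Person vertex class; the names are illustrative, not taken from the question):

# signals.py (hypothetical): create a vertex in OrientDB whenever a new Django user is saved
import pyorient
from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=User)
def create_friend_vertex(sender, instance, created, **kwargs):
    if not created:
        return
    # 2424 is OrientDB's binary-protocol port; pyorient does not use the REST port (2480)
    client = pyorient.OrientDB("localhost", 2424)
    client.connect("root", "hello")
    client.db_open("friends", "root", "hello")
    try:
        # store just enough to mirror the relational user; friendships can later become edges
        client.command("CREATE VERTEX Person SET user_id = {}".format(instance.pk))
    finally:
        client.db_close()

Opening and closing a connection per save is obviously not efficient, but it keeps the example self-contained; in practice you would pool or reuse the client.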