I want to create a table, and then a list of data that should be inserted into that table. Does South have the capability to do such a thing? If so, do you have any reference that shows how this can be done?
I want to be able to do this because, at this point, it seems like the only way to end up with the 'exact' same data is to insert it into the database manually ourselves.
I want some 'nice' automated way of inserting rows in a table.
You can! It's called a "data migration".
There are plenty of times you might want to use one: the link above gives a good example, another is the "data migration for every Django project":
from south.v2 import DataMigration
from django.conf import settings

class Migration(DataMigration):

    def forwards(self, orm):
        Site = orm['sites.Site']
        site = Site.objects.get(id=settings.SITE_ID)
        site.domain = settings.DOMAIN_NAME
        site.name = settings.SITE_NAME
        site.save()
(this picks up the domain and site name from settings.py, for use with the sites framework)
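If it helps, the migration skeleton is normally generated with South's datamigration management command and then applied with migrate (the app and migration names below are just placeholders):
./manage.py datamigration myapp populate_initial_data
./manage.py migrate myapp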
You'd want to use fixtures.
Create a fixtures directory in your app's folder.
Create a fixture file in that folder, initial_data.json (or an XML/YAML equivalent)
Populate the file with the data you want to insert (a minimal example is sketched after these steps)
Run manage.py loaddata <fixturename>, where <fixturename> is the name of the fixture file you've created.
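For example, an initial_data.json for a hypothetical myapp.Person model might look like this (the model and field names are placeholders):
[
    {
        "model": "myapp.person",
        "pk": 1,
        "fields": {
            "first_name": "John",
            "last_name": "Lennon"
        }
    }
]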
South handles this pretty much the same way, but it seems like Django's core approach is more documented.
I am new to Django and am trying to create a simple CRUD for an existing table in a database.
The thing is, that database is not local and I do not want Django to create its built-in tables in it -- I want it to create its tables in a local database. The correct way to do that is by using database routers, right?
I created an app myapp, and declared the external table I want to write the CRUD for in its models.py.
Now, I have written two routers (in two separate files) -- one that routes myapp to the external database (the DataRouter), and another one that routes all other requests to the local database (the SystemRouter). Is that the correct way to do that? Where should I place those files? I have tried placing them in multiple different directories inside my project, but can't make the DATABASE_ROUTERS list in settings.py find them.
For example, if I place them in the root directory for the project and make:
DATABASE_ROUTERS = ['DataRouter', 'SystemRouter']
I get:
ImportError: DataRouter doesn't look like a module path
I am really lost. Is that the best way to solve my problem? How do I route the data correctly?
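For reference, a minimal pair of routers along these lines might look like the sketch below (the module path, app label and database aliases are placeholders); note that DATABASE_ROUTERS expects full dotted paths such as 'myproject.routers.DataRouter', not bare class names:
# myproject/routers.py (illustrative only)
class DataRouter(object):
    def db_for_read(self, model, **hints):
        return 'external' if model._meta.app_label == 'myapp' else None

    def db_for_write(self, model, **hints):
        return 'external' if model._meta.app_label == 'myapp' else None

class SystemRouter(object):
    def db_for_read(self, model, **hints):
        return 'default'

    def db_for_write(self, model, **hints):
        return 'default'

# settings.py
DATABASE_ROUTERS = ['myproject.routers.DataRouter', 'myproject.routers.SystemRouter']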
Is it possible to alter the contents of a view/template based on specific user-defined settings in development.ini or production.ini?
As an example, say I am developing a Pyramid web app that lists all the students of a class. The back-end database has only one table, 'student'. Now I develop an optional script that also adds a 'teacher' table to the database. Ideally the web app should be able to run in both cases: if the teacher table is missing, it will not query it and simply print student details; if the teacher table is present, it will print the name of the teacher along with the name of the student.
To my mind this can be accomplished in one of the following ways:
1. Keep separate routes (URLs) for teacher+student and student-only pages. The problem is that you cannot stop people from actually calling the former when you only have student info. This will lead to unnecessary error pages.
2. Use a setting teacher_enabled=true/false in the .ini file. The setting can be accessed in __init__.py through settings['teacher_enabled']. Configure just a single route (say 'home', '/') but map it to different views based on whether the setting variable is true or false. This will not allow for use of the @view_config decorator, and templates for the two cases will have to be separate.
3. Again using the setting variable, pass it to the view somehow. Make only the relevant queries in the view, e.g. if teacher_enabled is True, query the teacher table, else query the student table only. Pass this variable to the templates too and there decide if some details are to be displayed (e.g. teacher name).
So my question is which of these approaches should I use? In case settings variables are to be passed to the views, how can that be done? And is there a standard way of dealing with this problem?
Keep separate routes (URLs) for teacher+student and student only pages. The problem is that you cannot stop people from actually calling the former when you only have student info.
Ah, but you can! Combine it with number 2: Add a teacher_enabled=true/false setting to your .ini file, and then you can use some code similar to this:
from pyramid.view import view_config
from pyramid.threadlocal import get_current_registry
from pyramid.httpexceptions import HTTPFound

# Define some awesome student views here
@view_config(name='student')
def student(request):
    return HTTPFound('Foo!')

# Note: values read from the .ini file are strings; pyramid.settings.asbool()
# can turn 'true'/'false' into a real boolean.
if get_current_registry().settings['teacher_enabled']:
    # Some awesome student and teacher views here
    @view_config(name='student_and_teacher')
    def student_and_teacher(request):
        return HTTPFound('Bar!')
Number 3 is also feasible. Just remember: It's easier to ask for forgiveness than permission. So you could do something like this: (Using SQLAlchemy as an example)
from your_models.teacher import Teacher
from sqlalchemy.exc import NoSuchTableError

try:
    teacher = session.query(Teacher).filter(Teacher.id == 42).first()
except NoSuchTableError:
    teacher = Teacher('Unknown')
I've written some python code to accomplish a task. Currently, there are 4-5 classes that I'm storing in separate files. I'd now like to change this whole thing into a database-backed web app. I've been reading tutorials on Django, and so far I get the impression that I'll need to manually specify the fields and their types for every "model" that I use. This is a little surprising to me, since I was expecting some kind of ORM capability that would just take the existing classes I've already defined, and map them onto a database somehow, in a manner abstracted away from me.
Is this not the case? Am I missing something? It looks like I need to specify all the fields and types in the file 'models.py'.
Okay, now beyond those specifics, does anyone have any general tips on the best way to migrate an object-oriented desktop application to a web application?
Thanks!
That is Django's ORM: it maps classes to tables. What else did you expect? There needs to be some way of specifying what the fields are, though, before you can use them, and that's managed through the models.Model class and the various models.Field subclasses. You can certainly use your classes as mixins in order to use the existing business logic on top of the field definitions.
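For instance, a rough sketch of that mixin idea (the class and field names here are invented):
from django.db import models

class InvoiceLogic(object):
    # An existing plain-Python class from the desktop app, reused unchanged.
    def total_with_tax(self):
        return self.amount * (1 + self.tax_rate)

class Invoice(InvoiceLogic, models.Model):
    # The fields still have to be declared explicitly for the ORM.
    amount = models.DecimalField(max_digits=10, decimal_places=2)
    tax_rate = models.DecimalField(max_digits=4, decimal_places=3)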
If you are thinking about a database-backed web app, you have to specify what fields of the data you want to store and what type of value should be stored.
There is an abstraction that introspects the db to convert it into the Django models.py format. But I don't know of any that introspects a Python class and stores arbitrary data in the db. How would that even work? Would the objects be stored as pickles?
You're going to have to check the output, but you can have Django automatically create models from existing databases through one-time introspection.
Taken from the link below, you would set up your database in settings.py, and then call
python manage.py inspectdb
This will dump the sample models.py file to standard out for your inspection. In order to create the file, simply redirect the output
python manage.py inspectdb > models.py
For more, see:
http://docs.djangoproject.com/en/dev/howto/legacy-databases/?from=olddocs#auto-generate-the-models
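The generated models are ordinary model definitions; for a table named student the output would look roughly like this (the field names and types depend entirely on your schema, so treat this as purely illustrative):
from django.db import models

class Student(models.Model):
    name = models.CharField(max_length=100)

    class Meta:
        db_table = 'student'  # inspectdb keeps the existing table name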
Think of this:
You create a CMS of some sort, which asks you for an application name and a csv file for that application.
Then it automatically creates that app on the fly: it generates the required models.py based on the CSV columns, activates the admin page for it and gives only you full permission to this new table via the Django admin, then it adds the app to urls.py and creates the views.py for it as well.
Then all you'd have to do is upload a CSV, name your app and voila! You have an admin page to play with.
Now, is there any way to create an app, or at least a models.py, out of a CSV file in Django, or is there any Django app that can do this?
Note: Look beyond (./manage.py inspectdb > models.py)
While this does not involve creating an actual models.py and application, you may want to look into dynamically creating Model classes at runtime. You could have "meta" models that store the information on the dynamic models, and then have your CSV view import the data into those models, create the classes, and register them with the admin. Or something like that.
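A rough sketch of that dynamic-model idea, assuming the column names come straight from the CSV header row (the helper name, app label and choice of CharField are all made up):
from django.contrib import admin
from django.db import models

def build_csv_model(name, columns, app_label='csvapp'):
    # Create a Model subclass at runtime and register it with the admin.
    # This only creates the Python class; the underlying table still has
    # to be created separately (e.g. via a migration or raw SQL).
    meta = type('Meta', (), {'app_label': app_label})
    attrs = {'__module__': '%s.models' % app_label, 'Meta': meta}
    for column in columns:
        attrs[column] = models.CharField(max_length=255, blank=True)
    model = type(name, (models.Model,), attrs)
    admin.site.register(model)
    return model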
Creating an actual application directory, with models.py, views.py, and so on, is fairly easy (just create the directory, create the files, and write formatted strings to them based on the CSV data). Editing the project's settings.py and urls.py, and reloading the modules, wouldn't be too difficult either. But, I wouldn't trust automatically generated Django applications without first looking at them.
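For the file-generating route, something along these lines would cover the models.py part (the paths, and the simplification that every CSV column becomes a CharField, are assumptions):
import csv
import os

def write_models_py(app_dir, csv_path, model_name='CsvRow'):
    # Read the header row of the CSV to get the column names.
    with open(csv_path) as f:
        columns = next(csv.reader(f))
    if not os.path.exists(app_dir):
        os.makedirs(app_dir)
    open(os.path.join(app_dir, '__init__.py'), 'w').close()
    with open(os.path.join(app_dir, 'models.py'), 'w') as out:
        out.write('from django.db import models\n\n\n')
        out.write('class %s(models.Model):\n' % model_name)
        for column in columns:
            # Assumes the header values are valid Python identifiers.
            out.write('    %s = models.CharField(max_length=255, blank=True)\n' % column)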
I want to write tests that can show whether or not the database is in sync with my models.py file. Actually I have already written them, only to find out that Django creates a new database, based on the models.py file, each time the tests are run.
Is there any way I can make the models.py test use the existing database schema? The one that's in mysql/postgresql, and not the one that's in /myapp/models.py ?
I don't care about the data that's in the database, I only care about its schema, i.e. I want my tests to notice if a table in the database has fewer fields than the schema in my models.py file.
I'm using the unittest framework (actually the django extension to it) if this has any relevance.
thanks
What we did was override the default test runner so that it wouldn't create a new database to test against. This way, it runs the tests against whatever our current local database looks like. But be very careful if you use this method, because any changes your tests make to data will be permanent. I made sure that all our tests restore any changes back to their original state, and we keep a pristine version of our database backed up on the server.
So to do this you need to copy the run_tests function from django.test.simple to a location in your project -- I put mine in myproject/test/test_runner.py
Then make the following changes to that function:
# change:
old_name = settings.DATABASE_NAME
from django.db import connection
connection.creation.create_test_db(verbosity, autoclobber=not interactive)
result = unittest.TextTestRunner(verbosity=verbosity).run(suite)
connection.creation.destroy_test_db(old_name, verbosity)

# to:
result = unittest.TextTestRunner(verbosity=verbosity).run(suite)
Make sure to do all the necessary imports at the top and then in your settings file set the setting:
TEST_RUNNER = 'myproject.test.test_runner.run_tests'
Now when you run ./manage.py test Django will run the tests against the current state of your database rather than creating a new version based on your current model definitions.
Another thing you can do is create a copy of your database locally, and then do a check in your new run_tests() function like this:
if settings.DATABASE_NAME != 'my_test_db':
    sys.exit("You cannot run tests using the %s database. Please switch DATABASE_NAME to my_test_db in settings.py" % settings.DATABASE_NAME)
That way there's no danger of running tests against your main database.
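Putting those pieces together, a rough sketch of myproject/test/test_runner.py might look like this (it assumes the old function-based TEST_RUNNER interface, and the suite-building code copied from django.test.simple is left out here):
import sys
import unittest

from django.conf import settings
from django.test.utils import setup_test_environment, teardown_test_environment

def run_tests(test_labels, verbosity=1, interactive=True, extra_tests=[]):
    if settings.DATABASE_NAME != 'my_test_db':
        sys.exit("You cannot run tests using the %s database. "
                 "Please switch DATABASE_NAME to my_test_db in settings.py"
                 % settings.DATABASE_NAME)

    setup_test_environment()
    suite = unittest.TestSuite(extra_tests or [])
    # ...build the suite from test_labels, exactly as django.test.simple does...

    # No create_test_db()/destroy_test_db() calls, so the tests run against
    # the current database.
    result = unittest.TextTestRunner(verbosity=verbosity).run(suite)

    teardown_test_environment()
    return len(result.failures) + len(result.errors)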