How do I delete in Django? (mysql transactions) - python

If you are familiar with Django, you know that it has an Authentication system with a User model. Of course, I have many other tables with a foreign key to this User model.
If I want to delete this user, how do I architect a script (or do it through MySQL itself) to delete every row in every table related to this user?
My only worry is that I can do this manually... but if I later add a table and forget to add it to my DELETE operation, then I have rows that link to a deleted, non-existent User.

As far as I understand it, Django does an "on delete cascade" by default:
http://docs.djangoproject.com/en/dev/topics/db/queries/#deleting-objects

You don't need a script for this. When you delete a record, Django will automatically delete all dependent records (thus taking care of its own database integrity).
This is simple to test. In the admin, go to delete a User. On the confirmation page, you'll see a list of all dependent records in the system. You can use this any time as a quick test to see what's dependent on what (as long as you don't actually click Confirm).
If you perform deletions from your view code with .delete(), all dependent objects will be deleted automatically with no option for confirmation.
From the docs:
When Django deletes an object, it emulates the behavior of the SQL constraint ON DELETE CASCADE -- in other words, any objects which had foreign keys pointing at the object to be deleted will be deleted along with it.

Related

When I delete all the items from my query in Django the item ids don't reset [duplicate]

I have been working on an offline version of my Django web app and have frequently deleted model instances for a certain ModelX.
I have done this from the admin page and have experienced no issues. The model has only two fields, name and order, and no relationships to other models.
New instances are given the next available pk which makes sense, and when I have deleted all instances, adding a new instance yields a pk=1, which I expect.
Moving the code online to my actual database, I noticed that this is not the case. I needed to change the model instances, so I deleted them all, but to my surprise the primary keys kept incrementing without resetting back to 1.
Going into the database using the Django API, I checked that the old instances are gone, but adding new instances yields a primary key that picks up where the last deleted instance left off, instead of 1.
Wondering if anyone knows what might be the issue here.
I wouldn't call it an issue. This is default behaviour for many database systems. Basically, the auto-increment counter for a table is persistent, and deleting entries does not affect the counter. The actual value of the primary key does not affect performance or anything, it only has aesthetic value (if you ever reach the 2 billion limit you'll most likely have other problems to worry about).
If you really want to reset the counter, you can drop and recreate the table:
python manage.py sqlclear <app_name> | python manage.py dbshell
Or, if you need to keep the data from other tables in the app, you can manually reset the counter:
python manage.py dbshell
mysql> ALTER TABLE <table_name> AUTO_INCREMENT = 1;
The most probable reason you see different behaviour in your offline and online apps is that the auto-increment value is only stored in memory, not on disk. It is recalculated as MAX(<column>) + 1 each time the database server is restarted (this is how MySQL's InnoDB engine behaves before version 8.0; since 8.0 the counter is persisted). If the table is empty, the counter is completely reset on restart. This probably happens very often in your offline environment, and close to never in your online one.
As others have stated, this is entirely the responsibility of the database.
But you should realize that this is the desirable behaviour. An ID uniquely identifies an entity in your database. As such, it should only ever refer to one row. If that row is subsequently deleted, there's no reason why you should want a new row to re-use that ID: if you did that, you'd create confusion between the now-deleted entity that used to have that ID and the newly-created one that's reused it. There's no point in doing this, and you should not want to.
Did you actually drop them from your database or did you delete them using Django? Django won't change AUTO_INCREMENT for your table just by deleting rows from it, so if you want to reset your primary keys, you might have to go into your db and:
ALTER TABLE <my-table> AUTO_INCREMENT = 1;
(This assumes you're using MySQL or similar).
There is no issue; that's the way databases work. Django has nothing to do with generating ids: it just tells the database to insert a row and gets the id back in response. The id starts at 1 for each table and increments every time you insert a row. Deleting rows doesn't cause the id to go back. You usually shouldn't be concerned with that; all you need to know is that each row has a unique id.
You can of course change the counter that generates the id for your table with a database command and that depends on the specific database system you're using.
If you are using SQLite you can reset the primary key with the following statements (run in the sqlite3 shell):
DELETE FROM your_table;
DELETE FROM sqlite_sequence WHERE name='your_table';
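A quick way to verify this behaviour is with Python's built-in sqlite3 module (hypothetical table name your_table): deleting rows leaves the counter alone, while clearing the sqlite_sequence entry resets it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE your_table "
             "(id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)")
conn.execute("INSERT INTO your_table (name) VALUES ('a')")  # id 1
conn.execute("INSERT INTO your_table (name) VALUES ('b')")  # id 2

conn.execute("DELETE FROM your_table")  # rows gone, counter stays
conn.execute("INSERT INTO your_table (name) VALUES ('c')")
next_id = conn.execute("SELECT id FROM your_table").fetchone()[0]
print(next_id)  # 3, not 1: AUTOINCREMENT remembers the high-water mark

conn.execute("DELETE FROM your_table")
conn.execute("DELETE FROM sqlite_sequence WHERE name='your_table'")
conn.execute("INSERT INTO your_table (name) VALUES ('d')")
reset_id = conn.execute("SELECT id FROM your_table").fetchone()[0]
print(reset_id)  # back to 1 after resetting the sequence
```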
Another solution for Postgres databases is from the UI: select your table, look for the 'sequences' dropdown, open its settings, and adjust the sequence there.
I'm not sure when this was added, but the following management command will delete all data from all tables and will reset the auto increment counters to 1.
./manage.py sqlflush | psql DATABASE_NAME

How to fix, PostgreSQL BEFORE UPDATE Trigger on selected columns executed from Django side on the columns not associated with the Trigger?

Basically, I am creating a Django app with a PostgreSQL database. For a given table in the database I use triggers to prevent updates to selected columns. The trigger works properly from the database side, doing what it is made to do: preventing updates to the selected columns while allowing updates to the columns not associated with it.
From the Django side, however, whenever I try to update fields/columns that are not associated with the trigger, the trigger still fires and blocks the update, and it even fires when I try to add a new record from the Django side, preventing the insert.
Can anyone please help?
Thank you
I found the solution to my problem.
I created a PostgreSQL trigger to prevent updates to a few columns. From the database side I tested it and it works completely fine. The problem was on the Django side.
In the database, a single column can be updated on its own by entering a value; it has no connection to the other columns, so updating that specific column raises no error.
For example, if a row has two fields, one modifiable and the other protected by the trigger, modifying the modifiable field succeeds.
The problem is that Django always writes the whole object. If you have an object with two fields, one modifiable and the other not, but both mapped in Django, then when you save that object Django updates both fields, even if only one of them (or neither) has changed.
Because Django updates all the fields even when only one has changed, the trigger was being invoked.
So I had to look for an option that updates only the specified or changed fields, not the unchanged ones.
I found update_fields: it lets me specify which fields should be updated, instead of updating the whole row.

Is it possible to initiate new entries in Active Directory, or update existing entries in Active Directory, from SQL Server?

I want some user fields in Active Directory to be updated from SQL Server. Is it possible to do that, or is it possible to update the fields using Python? Any pointers would be greatly helpful!
You can use something like Python LDAP to make changes in Active Directory via the LDAP interface. The challenge is knowing what/when data changes in your database table.
In MySQL, you can use triggers to perform actions when INSERT, UPDATE, or DELETE operations are committed. A trigger could be used to populate a second table that is essentially a changelog. Either remove items from the changelog table when processed and updated into AD or maintain a "last change processed" number within your code and retain the changelog data as an audit log.
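A sketch of that changelog pattern is below, using SQLite in place of MySQL/SQL Server for a self-contained demo; the table and trigger names are hypothetical, and the actual push to Active Directory is left as a placeholder comment where a python-ldap call would go:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT);
CREATE TABLE changelog (
    change_id INTEGER PRIMARY KEY AUTOINCREMENT,
    user_id   INTEGER,
    new_email TEXT
);
-- the trigger copies every email change into the changelog
CREATE TRIGGER log_email_update AFTER UPDATE OF email ON users
BEGIN
    INSERT INTO changelog (user_id, new_email) VALUES (NEW.id, NEW.email);
END;
""")

conn.execute("INSERT INTO users VALUES (1, 'old@example.com')")
conn.execute("UPDATE users SET email = 'new@example.com' WHERE id = 1")

# Process the changelog: push each change to AD, then remove it
pending = conn.execute(
    "SELECT change_id, user_id, new_email FROM changelog").fetchall()
for change_id, user_id, new_email in pending:
    # here you would push the change to Active Directory over LDAP
    conn.execute("DELETE FROM changelog WHERE change_id = ?", (change_id,))

print(pending)  # [(1, 1, 'new@example.com')]
```

Deleting each row after processing keeps the changelog as a work queue; retaining the rows instead (with a "last change processed" marker) turns it into an audit log.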

Adding a foreign key constraint back to a Django table

I have a job that does some work on a copy of a table corresponding to a Django model, and then replaces the working table with the copy when done.
The problem is that although the copy of the table picks up all of the indexes and everything else, it's not picking up the foreign key constraints.
Can I just add them back when I swap the table in? Or does South or Django depend on anything in the constraint name?
I'm on MySQL and Django 1.8.
(Let's assume I'm not able to change how the job works)
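Plain ORM queries don't look up constraint names at runtime, so re-adding the constraints after the swap should generally be safe; what that might look like in MySQL (hypothetical myapp_child/myapp_parent table and column names, to be adjusted to match your models):

```sql
-- Hypothetical names; adjust to match your actual schema.
ALTER TABLE myapp_child
  ADD CONSTRAINT myapp_child_parent_id_fk
  FOREIGN KEY (parent_id) REFERENCES myapp_parent (id);
```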

Changing Model's key_Name + App Engine

I'm planning to do this for my application:
Store a unique id in the key_name of a User model.
At any given time, a user is allowed to choose a username once; I intend to replace the model's original key_name with this chosen username.
In my implementation, the User model for a new user is only created when the user is activated.
Given this situation, my question is: which of the following is the better approach?
Upon login, the user must choose a username, so that I can create the User model with key_name set to the chosen username. However, this approach might seem unpleasant to users, as they should be allowed to choose a username anytime they want.
The approach explained in the situation above, except that I would need to use clone_entity. With clone_entity, will reference properties be assigned back to the new cloned entity? Also, performance is a priority: will this be costly in terms of database operations if it involves a lot of users at the same time?
If you're set on having the user_name as the key, either approach should work fine (assuming you have the logic to prevent duplicate usernames).
However, with clone_entity, will reference properties be assigned back to the new cloned entity ?
If the clone is done correctly, reference properties will be copied over without a problem. However, if you have any entities referencing the entity you are cloning, those will not be updated to reference the new clone.
And also, performance is priority, will this be costly in terms of database operations if it involves a lot of users at the same time ?
As long as the clone is implemented efficiently, and assuming you pass in the entity you want to clone, there should be only one database operation per clone call (the put of the newly created entity).
It looks like the clone_entity you linked has an update that avoids excess db calls for reference properties, so you should be good.
