I stumbled upon a weird behaviour: I add a permission to a user object but the permission check fails.
permission = Permission.objects.get_by_natural_key(app_label='myapp', codename='my_codename', model='mymodel')
user.user_permissions.add(permission)
user.has_perm('myapp.my_codename') # this is False!
I found some posts about user permission caching here and here and the solution seems to be to completely reload the object from the database.
# Request new instance of User
user = get_object_or_404(User, pk=user_id)
# Now note how the permissions have been updated
user.has_perm('myapp.my_codename') # now it's True
This seems like overkill to me and very un-django-like. Is there really no way to either clear the permission cache or reload the foreign keys, like you can do for an object with refresh_from_db()?
Thanks in advance!
Ronny
You can force the recalculation by deleting the user object's _perm_cache and _user_perm_cache.
permission = Permission.objects.get_by_natural_key(app_label='myapp', codename='my_codename', model='mymodel')
user.user_permissions.add(permission)
user.has_perm('myapp.my_codename') # returns False
del user._perm_cache
del user._user_perm_cache
user.has_perm('myapp.my_codename') # should be True
But this will essentially hit the database again to fetch the updated permissions. Since these cache keys rely on internal workings of Django and are not part of the public API, they might change in the future, so I would still suggest just fetching the user again. But that's totally up to you.
The caching is only an issue for you because of the has_perm method. If you follow it all the way down the stack, you eventually end up here. You'll see that this method explicitly checks the cache before proceeding.
If you really need to know this user's permissions at this point in time and really don't want to refresh from the DB, then you can check more directly without the has_perm helper, since the .add() call is a standard m2m operation and the relation has already been updated.
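For instance, a minimal sketch of such a direct check against the m2m relation, using the app label and codename from the question (note this only sees permissions assigned directly to the user, not ones inherited through groups):
# Query the m2m relation directly; this bypasses the ModelBackend permission cache.
user.user_permissions.filter(
    codename='my_codename',
    content_type__app_label='myapp',
).exists()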
In practice, it's unlikely you'll check a permission right after it is added, while the object is in scope and while the permissions are cached, since that's a bit redundant. I'm sure the Django devs considered this and decided that caching by default was the right call.
Related
I used to delete a user when they left, but many models have foreign keys to User, which I then have to either set to empty or delete along with the user.
But some of those models, such as Order, would become pointless once the related User is deleted. So instead of deleting the data, I need to set User.is_active (or something similar) to mark the account as invalid.
I think it would be best if I could override User.objects.delete, so I don't need to modify a lot of business functions that rely on it.
The django.contrib.auth.User model already has an is_active field, so you can just set that.
In fact, from the docs linked above:
We recommend that you set this flag to False instead of deleting accounts; that way, if your applications have any foreign keys to users, the foreign keys won’t break.
Yes, technically you can override delete by setting a new Manager, but it's the wrong approach.
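For completeness, a minimal sketch of the recommended approach (deactivate_user is just an illustrative helper name):
from django.contrib.auth import get_user_model

User = get_user_model()

def deactivate_user(user_id):
    # Mark the account as inactive instead of deleting it, so foreign keys stay intact.
    user = User.objects.get(pk=user_id)
    user.is_active = False
    user.save(update_fields=['is_active'])
On recent Django versions the stock ModelBackend won't authenticate inactive users, so this also keeps the account from logging in.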
I have an issue with removing permissions from users in a view, or even in the shell. Let me explain my problem:
I did those tests in the shell:
org = Organisateur.objects.get(user__username__contains="ghj")
content_type = ContentType.objects.get_for_model(Tournoi)
# Tournoi is the name of a model
permission_ecriture = 'ecriture_Palaiseau'
permission = Permission.objects.get(content_type=content_type, codename=permission_ecriture)
org.user.user_permissions.remove(permission)
but when I write:
org.user.has_perm('inscription.ecriture_Palaiseau')
it returns True
but when I rewrite:
org = Organisateur.objects.get(user__username__contains="ghj")
org.user.has_perm('inscription.ecriture_Palaiseau')
it returns False
It is really weird. Why does it work like this?
In my views, it seems that the permissions are not removed even when I write:
org = Organisateur.objects.get(user__username__contains="ghj")
(after removing the permission, the user still has it)
What I want to do is remove a permission from a user and add another permission to the same user immediately after.
But each time I do that, the user still has the "removed" permission...
Thank you very much
I look forward to hearing from you all soon.
This behavior is expected because permissions are cached. From the Django docs:
Permission caching
The ModelBackend caches permissions on the User object after the first time they need to be fetched for a permissions check. This is typically fine for the request-response cycle since permissions are not typically checked immediately after they are added (in the admin, for example). If you are adding permissions and checking them immediately afterward, in a test or view for example, the easiest solution is to re-fetch the User from the database.
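Applied to the code from the question, a minimal sketch of that re-fetch (using the Organisateur and permission names above) would be:
org.user.user_permissions.remove(permission)

# Re-fetch so the cached permissions are discarded along with the old instance
org = Organisateur.objects.get(user__username__contains="ghj")
org.user.has_perm('inscription.ecriture_Palaiseau')  # now False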
Your code is almost right, you have just forgotten to save your user object at the end!
use user.save()
I have a settings app inside my project, and every record in the DB table that this app uses represents a particular setting.
It is important that the settings always stay consistent, so I'm looking for a way to protect them from accidental deletion by admin-panel users or by buggy code.
Other use cases for this feature might be error messages stored in the DB and editable in the admin site, or templates for email messages sent to the site's users.
Possible solutions that I have in mind:
- Store each setting as a table column (or several columns), so the table extends column-wise, not row-wise. Pros: simple, reliable. Cons: ugly.
- DB-side solution.
- Implement some kind of permission system that controls access for CRUD operations based on object ownership, like file-system permissions in Linux. Pros: less ugly, abstracted from the DB. Cons: I have no idea yet how to implement it easily and elegantly in Django.
Does anybody have better ideas?
The short answer is: if you don't want someone to have certain database abilities, don't grant them. It appears that you are thinking there are only admin-panel superusers and everyone else.
Django allows much more fine grained control over Users, Permissions, Authorization, and even Admin Panel privileges. Indeed, too much control to elaborate here when the documentation does such a good job of it.
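As a rough sketch of what that fine-grained control can look like, assuming a Setting model in an app called settings and a Django version that creates the default add/change/delete/view model permissions:
from django.contrib.auth.models import Group, Permission

# A group for admin-site users who may edit settings but never delete them.
editors, _ = Group.objects.get_or_create(name='settings-editors')
editors.permissions.set(
    Permission.objects.filter(
        content_type__app_label='settings',
        codename__in=['add_setting', 'change_setting', 'view_setting'],
    )
)
# delete_setting is deliberately not granted, so the delete action and the
# delete button in the admin are unavailable to members of this group.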
I'm not sure I completely understood your question, but here it goes:
I see two ways of protecting a model from being deleted:
Override the delete() method and make it check a set of rules that enforce the consistency you require; e.g. if one of the rules fails, raise an exception to be handled by the caller (see the sketch after this answer).
The other is through authorization, aka permissions. You can manage the permissions users have to delete particular models, as explained in this answer.
Note that Django's default permissions API does not support per-object permissions, only permissions applied to models. However, there are third-party apps that provide this, such as this one.
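A minimal sketch of the first approach, assuming a hypothetical Setting model with a protected flag:
from django.db import models

class Setting(models.Model):
    key = models.CharField(max_length=100, unique=True)
    value = models.TextField()
    protected = models.BooleanField(default=True)

    def delete(self, *args, **kwargs):
        # Refuse to delete settings that are marked as protected.
        if self.protected:
            raise models.ProtectedError(
                "Protected settings cannot be deleted.", [self]
            )
        return super().delete(*args, **kwargs)
Keep in mind that queryset-level deletes (QuerySet.delete() and cascades) bypass Model.delete(), so this only guards single-object deletion such as the admin's per-object delete button.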
In Django there is no real built-in way (that I am aware of) to prevent "accidental deletion". If you are using the admin, it does provide confirmation pages whenever you want to delete a record, which helps curb the problem. As #msw mentioned, the user authentication system is designed to help you enforce permissions, but it cannot prevent accidental deletions by someone who has the proper permission to begin with...
...an alternative strategy would be to prevent deletions at the web application level entirely. You can give the "illusion" of a delete from the user's perspective by flagging "deleted" records and filtering them out of what your users see. That way, restoring information is as simple as resetting the flag on the record. You would have to override the proper deletion signals as well.
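A minimal sketch of that flag-and-filter pattern (model and manager names are only illustrative):
from django.db import models

class ActiveManager(models.Manager):
    def get_queryset(self):
        # Hide soft-deleted rows from all ordinary queries.
        return super().get_queryset().filter(is_deleted=False)

class Record(models.Model):
    name = models.CharField(max_length=100)
    is_deleted = models.BooleanField(default=False)

    objects = ActiveManager()        # default manager: only "live" rows
    all_objects = models.Manager()   # escape hatch: includes soft-deleted rows

    def delete(self, *args, **kwargs):
        # Flag instead of removing the row; restoring is just resetting the flag.
        self.is_deleted = True
        self.save(update_fields=['is_deleted'])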
I have a custom user model where I want to track the number of failed login attempts and take action based on that. I am wondering where this logic would best live.
Following are the two options I have in mind for updating the failed_attempts field on the User model:
The authenticate method in the backend.
The check_password method on the User model. I have overridden this method from AbstractBaseUser.
And the basic logic (it does not cover all cases) is like this:
If authentication fails, check the time of the previous failed login attempt.
If that was recent, increment the failed login count.
If the count reaches the maximum number of attempts, lock the account for a few minutes (or do something else).
My question is what would be a better place for writing this logic and why.
Using only the details you list, I would say the authenticate method is more appropriate, if only because it would be very confusing if check_password updated fields on the model.
Why, though, do you have both an 'authenticate method in the backend' and a check_password method in the model?
What: I would actually implement that logic in an Authentication Backend.
How: Use a specific, separate, Model to track login attempts, or, use the solution suggested by miko (fail2ban).
Why: You de-couple authentication from users. Bonus: if you want to take advantage of the upcoming pluggable User models in Django, that's a good idea.
On a side note, there probably is a way you can achieve an even "neater" solution by wrapping existing authentication backends to provide the required functionality.
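A rough sketch of such a wrapping backend, assuming a hypothetical LoginAttempt model (with username and timestamp fields) and illustrative thresholds:
from datetime import timedelta

from django.contrib.auth.backends import ModelBackend
from django.utils import timezone

from myapp.models import LoginAttempt  # hypothetical: stores username + timestamp

MAX_ATTEMPTS = 5
LOCKOUT_MINUTES = 15

class LockoutBackend(ModelBackend):
    def authenticate(self, request, username=None, password=None, **kwargs):
        window_start = timezone.now() - timedelta(minutes=LOCKOUT_MINUTES)
        recent_failures = LoginAttempt.objects.filter(
            username=username, timestamp__gte=window_start
        ).count()
        if recent_failures >= MAX_ATTEMPTS:
            return None  # account temporarily locked

        user = super().authenticate(request, username=username,
                                    password=password, **kwargs)
        if user is None:
            # Record the failed attempt; successful logins simply fall through.
            LoginAttempt.objects.create(username=username)
        return user
The backend would then be listed in AUTHENTICATION_BACKENDS in place of (or alongside) the default ModelBackend.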
Ultimately, my goal is to extend Django's ModelAdmin to provide field-level permissions—that is, given properties of the request object and values of the fields of the object being edited, I would like to control whether or not the fields/inlines are visible to the user. I ultimately accomplished this by adding a can_view_field() method to the ModelAdmin and modifying the built-in get_form() and get_fieldset() methods to remove/exclude fields+inlines that the user does not have permissions (as determined by can_view_field()) to see. If you'd like to see the code, I placed it in a pastebin, since it's long and only somewhat relevant.
It works great...almost. I appear to have run into some sort of thread-safety or caching issue, where the state of the ModelAdmin object is being leaked from one request to another in a reproducible manner.
I'll illustrate the problem with a simple example. Suppose that I have a model whose ModelAdmin I have extended with the field-level permissions code. This model has two fields:
- public_field, which can be seen/edited by any staff member
- secret_field, which can only be seen/edited by superusers
In this case, the can_view_field() method would look like this:
def can_view_field(self, request, obj, field_name):
    """
    Returns a boolean indicating whether the user has the necessary
    permissions to view the passed field.
    """
    if obj is None:
        # No object yet (e.g. the add view): fall back to the model-level
        # change permission.
        return request.user.has_perm('%s.change_%s' % (
            self.opts.app_label,
            self.opts.model_name,
        ))
    else:
        if field_name == "public_field":
            return True
        if field_name == "secret_field" and request.user.is_superuser:
            return True
        return False
Test case 1: with a fresh server restart, if you first view the change form as a superuser, you see the form as expected, with both public_field and secret_field visible. If you then log out and view it as a staff member (but not a superuser), you only see public_field.
Test case 2: with a fresh server restart, if you log in as a staff member first, you still only see public_field. However, if you then log out and view as a superuser, you do not see secret_field. This is 100% reproducible.
I've done some basic thread-safety diagnostics:
At the end of get_form(), I've printed out the memory address of the ModelForm object. As it should be, it is unique with each request. Therefore, the ModelForm object is not the problem.
Immediately before the admin registration, I tried printing the memory address of the ModelAdmin object. In test case 1, it is unique with both requests. However with test case 2, it does not print at all on the second request.
At this point, I'm clueless. My next point of research will be the admin registration system (which I admittedly know nothing about). The state resets with a server restart, so it seems that the ModelAdmin must be cached? Or is it a thread-safety issue? If I turn it into a factory and return a deepcopy() of the ModelAdmin, would it serve a fresh ModelAdmin with each request? I'm clueless and would appreciate any thoughts. Thanks!
I'm confused about why you think ModelAdmin should be a new instance on each request. The admin objects are instantiated by the admin.site.register(Model) calls in each admin.py, which in turn is called from admin.autodiscover() in urls.py. In other words, this happens on process startup. Given the dynamic multi-process nature of most web serving environments, you may or may not get a new process with any particular request - certainly you won't get one every single time.
Because of this, it's not wise to store or alter state on a global object like ModelAdmin. I haven't looked through your linked code properly, but there was at least one case where you were altering an attribute on self as a result of a method call. Don't do that - you'll need to find some other way of passing dynamic values between methods.
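A rough sketch of the safer pattern, building on the can_view_field() method from the question: compute the exclusions per request and hand them to get_form() through kwargs instead of writing them onto self:
from django.contrib import admin

class FieldPermissionAdmin(admin.ModelAdmin):
    def get_form(self, request, obj=None, **kwargs):
        # Build the exclusion list per request; never store it on self,
        # because this ModelAdmin instance is shared across requests.
        excluded = [
            f.name for f in self.model._meta.fields
            if not self.can_view_field(request, obj, f.name)
        ]
        kwargs['exclude'] = excluded
        return super().get_form(request, obj, **kwargs)
Note that passing exclude this way replaces the exclude list get_form() would otherwise compute from the ModelAdmin's own settings; the point is only to illustrate keeping per-request state off the shared instance.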