I've been configuring and troubleshooting some Django auth issues with a custom backend.
One thing I have noticed is that once a session's expiry date has passed (confirmed via Session.objects.all()), the session remains in the table.
When I have to reauthenticate, Django creates another entry, so a single user can end up with tons of sessions in the table rather than just one.
Is there a simple way of getting Django to clear these out at the point they expire?
Thanks,
From the official documentation:
Django does not provide automatic purging of expired sessions. Therefore, it’s your job to purge expired sessions on a regular basis. Django provides a clean-up management command for this purpose: clearsessions. It’s recommended to call this command on a regular basis, for example as a daily cron job.
Use something like this:
python manage.py clearsessions
...and schedule it to run regularly.
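For example, a daily crontab entry along these lines (the paths are placeholders for your environment):

0 4 * * * /path/to/venv/bin/python /path/to/project/manage.py clearsessions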
I recently started developing a desktop Python application and I would like to know how more experienced people would handle this issue.
I used to develop web applications (about 5-10 years ago) using PHP + MySQL. There, since the code is located on the server where the user has no access (except to the web pages), I could simply store the user/group permissions in database tables, say users, users_groups, users_permissions, and so on. I would then check on every page load whether the user had the right to access that page / update that record in the database.
With a desktop application, where the user has access to the executable (which, being written in Python, can relatively easily be decompiled to source code), the approach will likely be quite different.
Since MariaDB has forked off from MySQL and the latter no longer seems so actively developed, PostgreSQL looked promising to start with. I thought about creating different users at the PostgreSQL level and letting PostgreSQL handle the permissions (instead of my application handling them directly).
However, this only allows tuning permissions down to the table level: a user can be allowed to create/delete/update records in a table, but no finer control is available. AFAIK you cannot say "let this user only update his own records", "this user can only delete the records from this group", or "users from group X can only update their own records while users from group Y can update everybody's records".
My understanding of how to handle this kind of issue is to put some kind of middleware application between the user and the database, located on the server, such as:
Desktop application <-----> Server-side application permissions handler <-----> Database
Here the server-side permission handler could be as simple as adding a "WHERE user=..." to each query, or much more advanced (first check the user's permissions stored in the database, and based on that decide whether to execute the query or reject it). I think this is a common problem for all desktop applications, and I would therefore expect that such a server-side application already exists. Am I missing something obvious, or does PostgreSQL perhaps allow more fine-grained tuning?
Thank you for all your help ;)
Your intuition is right. It is never a good idea to have a client access a database directly. Take a look at Django https://www.djangoproject.com and https://www.django-rest-framework.org
These would be the basis for your server side. You would handle business logic, authentication and authorization there. The client should basically present the data in the UI and delegate all the decision making to the server.
Here you can find a step-by-step tutorial on how to implement a REST API with user authentication in Django: https://wsvincent.com/django-rest-framework-authentication-tutorial/
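To make the per-record rules above concrete, here is a minimal sketch of an object-level permission in Django REST Framework; the owner field name is an assumption about your model:

# permissions.py - a sketch; "owner" is an assumed field on your model
from rest_framework import permissions

class IsOwnerOrReadOnly(permissions.BasePermission):
    def has_object_permission(self, request, view, obj):
        # Read-only methods (GET, HEAD, OPTIONS) are allowed for everyone.
        if request.method in permissions.SAFE_METHODS:
            return True
        # Writes are allowed only when the requesting user owns the record.
        return obj.owner == request.user

You would then set permission_classes = [permissions.IsAuthenticated, IsOwnerOrReadOnly] on the view, which expresses exactly the "only update his own records" rule that table-level grants cannot.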
I am doing some tasks with tens of thousands of Active Directory objects that can take several minutes to load.
To speed things up, I'd like to just refresh this data into the SQLite database in the middle of the night (since there's no need for it to be current).
Is there a typical way to approach this type of problem? Perhaps have Django periodically run a function somehow?
You can write a Django management command and use cron or at to execute it (see the sketch below).
Or just use a Django cron lib:
django-cron
django-crontab
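A minimal sketch of such a management command (the app name, command name, and refresh helper are assumptions):

# yourapp/management/commands/refresh_directory.py - a sketch; names are assumptions
from django.core.management.base import BaseCommand

from yourapp.directory import refresh_from_ad  # hypothetical helper that does the actual sync

class Command(BaseCommand):
    help = "Refresh the local SQLite cache of Active Directory objects."

    def handle(self, *args, **options):
        count = refresh_from_ad()  # pull the AD objects into the local tables
        self.stdout.write(self.style.SUCCESS("Refreshed %d objects" % count))

A crontab line such as 0 3 * * * /path/to/venv/bin/python /path/to/manage.py refresh_directory would then run it nightly.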
Current flow:
1) mysql> CREATE DATABASE db_name [create_specification];
2) change database info in settings.py
3) python manage.py syncdb (assuming we have ready-made models)
Isn't there a way to do the same without step 1? Maybe by putting the database name and specifications somewhere in settings.py, so that I don't have to manually configure the DB every time I move this project to some server.
EDIT -
WHY I want to dodge the first step:
In my case, different columns of different tables have different collation types. So during development, whenever I recreate the database, I need to manually change the configuration of individual columns/tables, which is frustrating.
All you need is a database user/password with the CREATE DATABASE privilege; all other data is in settings. You can connect a custom command to the pre_syncdb signal to gather this data.
Take a look at createsuperuser, which is triggered on the post_syncdb signal, to learn how this works.
EDIT
syncdb is no longer available. Since Django 1.9 you should use the pre_migrate signal.
I don't think it's possible to dodge step one, at least if you're using a database backend other than SQLite in Django.
A note from the docs: https://docs.djangoproject.com/en/dev/intro/tutorial01/#database-setup
If you’re using PostgreSQL or MySQL, make sure you’ve created a database by this point. Do that with “CREATE DATABASE database_name;” within your database’s interactive prompt.
If you’re using SQLite, you don’t need to create anything beforehand - the database file will be created automatically when it is needed.
If you move your project to a platform like Heroku, database creation is automated, for example when using PostgreSQL.
I'm wondering why you want to dodge the first step; however, if you're desperate, you might still want to try going the direct Python way via psycopg2:
Creating a postgresql DB using psycopg2
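A minimal sketch of that approach (credentials and the database name are placeholders; note that CREATE DATABASE cannot run inside a transaction, hence the autocommit setting):

# create_db.py - a sketch; credentials and the database name are placeholders
import psycopg2
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

conn = psycopg2.connect(dbname="postgres", user="postgres",
                        password="secret", host="localhost")
conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)  # CREATE DATABASE refuses to run in a transaction
cur = conn.cursor()
cur.execute("CREATE DATABASE myproject")
cur.close()
conn.close()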
Granting the overall application database creation permissions to spare a minor manual initialization at deployment time sounds like a bad idea.
You should rather create a separate script that automates deployment for you. Fabric seems to have become the standard tool for that.
Done via django_extensions:
1) ./manage.py sqlcreate | ./manage.py dbshell
2) ./manage.py migrate
If the user in settings.py is already able to connect to the database server and has creation permissions, this should work.
To get that, I suggest providing a DATABASE_URL environment variable and using dj_database_url.
At least it's working for me on Postgres.
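For reference, the settings.py side of that could look like this (the fallback URL is only an example):

# settings.py - a sketch; the fallback URL is only an example
import dj_database_url

DATABASES = {
    "default": dj_database_url.config(
        default="postgres://user:secret@localhost:5432/myproject"
    ),
}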
You can use SQLite while you are still in the development stage.
I'm working on a page in Django where users can set custom reminders for different dates (max. 3 per date). The reminders should be sent via e-mail. It's similar to Google Calendar, where you can set multiple reminders for each event, x minutes, x hours or x days before the event starts.
I wonder how I can solve this in Django, since there will be a lot of users and dates and it should of course perform well.
Should I do this with a cron job? Is there a python way?
The other traditional way is to use django-celery: http://pypi.python.org/pypi/django-celery/
You can use the celerybeat command to run periodic tasks. You can also start pending tasks from a Django view.
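As a rough sketch of such a periodic task (the Reminder model and its fields are assumptions about your schema):

# tasks.py - a sketch; the Reminder model and its fields are assumptions
from celery import shared_task
from django.core.mail import send_mail
from django.utils import timezone

from myapp.models import Reminder  # hypothetical model

@shared_task
def send_due_reminders():
    # Pick up every reminder whose send time has passed and that hasn't gone out yet.
    for reminder in Reminder.objects.filter(send_at__lte=timezone.now(), sent=False):
        send_mail(reminder.subject, reminder.body,
                  "noreply@example.com", [reminder.user.email])
        reminder.sent = True
        reminder.save()

Scheduling this to run every minute via celerybeat keeps the view side simple: creating a reminder is just a database insert, and the beat process does the sending.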
You can use a cron job. To create a management command, refer to the Django documentation on custom management commands.
You can also implement the email generation as a queue-based, distributed implementation for better performance. The django-mailer app can be used for that.
I'm using sessions in Django to store logged-in user information as well as some other information. I've been reading through the Django session documentation and still have a few questions.
From the Django website:
By default, Django stores sessions in your database (using the model django.contrib.sessions.models.Session). Though this is convenient, in some setups it’s faster to store session data elsewhere, so Django can be configured to store session data on your filesystem or in your cache.
Also:
For persistent, cached data, set SESSION_ENGINE to django.contrib.sessions.backends.cached_db. This uses a write-through cache – every write to the cache will also be written to the database. Session reads only use the database if the data is not already in the cache.
Is there a good rule of thumb for which one to use? cached_db seems like it would always be a better choice: best case, the data is in the cache; worst case, it's in the database, where it would be anyway. The one downside is that I have to set up memcached.
By default, SESSION_EXPIRE_AT_BROWSER_CLOSE is set to False, which means session cookies will be stored in users' browsers for as long as SESSION_COOKIE_AGE. Use this if you don't want people to have to log in every time they open a browser.
Is it possible to have both: the session expiring at browser close AND having an age?
If value is an integer, the session will expire after that many seconds of inactivity. For example, calling request.session.set_expiry(300) would make the session expire in 5 minutes.
What is considered "inactivity"?
If you're using the database backend, note that session data can accumulate in the django_session database table and Django does not provide automatic purging. Therefore, it's your job to purge expired sessions on a regular basis.
So that means even if the session has expired, there are still records in my database. Where exactly would one put code to "purge the db"? I feel like you would need a separate thread to just go through the db every once in a while (every hour?) and delete any expired sessions.
Is there a good rule of thumb for which one to use?
No.
Cached_db seems like it would always be a better choice ...
That's fine.
In some cases, there are many Django (and Apache) processes querying a common database. mod_wsgi allows a lot of scalability this way. The cache doesn't help much because the sessions are distributed randomly among the Apache (and Django) processes.
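For reference, enabling the write-through backend is a settings change plus a cache definition; a sketch, where the memcached address is an assumption (this uses the python-memcached binding):

# settings.py - a sketch; the memcached address is an assumption
SESSION_ENGINE = "django.contrib.sessions.backends.cached_db"

CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.MemcachedCache",
        "LOCATION": "127.0.0.1:11211",
    },
}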
Is it possible to have both: the session expiring at browser close AND having an age?
Don't see why not.
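One way to read "both", as a sketch: make the cookie a browser-session cookie globally, while the session record in the database still carries an age, so the session dies at whichever comes first:

# settings.py - a sketch combining both behaviours
SESSION_EXPIRE_AT_BROWSER_CLOSE = True  # the cookie is dropped when the browser closes
SESSION_COOKIE_AGE = 1800  # the session record still expires server-side after 30 minutes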
What is considered "inactivity"?
I assume you're kidding. "activity" is -- well -- activity. You know. Stuff happening in Django. A GET or POST request that Django can see. What else could it be?
Where exactly would one put code to "purge the db"?
Put it in crontab or something similar.
I feel like you would need a separate thread to just go through the db every once in a while (every hour?)
Forget threads (please). It's a separate process. Once a day is fine. How many sessions do you think you'll have?