Python beginner's question: repository, or object persisting itself?

I am a seasoned .NET developer trying to write some Python code. On one of the projects I am contributing to, we have a services layer, which is a set of classes that abstract away functionality, and a Django web app which consumes these in-process services (which are just classes).
I had created a repository layer and ensured that all interaction with the database from the services layer happens through this repository. We have a document-oriented database, and thus we do not have the usual object-relational muck.
During a recent code review, one developer who is supposedly seasoned with Python balked at this and commented that this was not the Python way of doing things. He remarked that Python developers are used to having save and delete methods on the object instance itself (and do not use the repository pattern as much), and this would confuse Python devs looking to contribute to our OSS project. Python devs, your views? Would you be confused?
Edit: This is not Django code, but code that will be called by the Django app (it's an in-process service layer).

Maybe that is a Django pattern, but it is not a Python one by any means.
That said, if the target audience of your module is Django developers, I would advise you to follow the Django philosophy and its associated patterns as much as possible.

Django's ORM provides save() and delete() methods on the object itself. SQLAlchemy, on the other hand, has a so-called session to which you add or delete objects.
Both libraries are very popular, so I'd say the two approaches are roughly equal in terms of adoption. However, in the context of a Django application, going with the Django convention is probably preferable unless you have a good reason not to.
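To make the contrast concrete, here is a minimal sketch of the two styles; the Article model/class and the SQLAlchemy engine are assumed to be defined elsewhere (they are not from the question):

    # --- Django ORM (active-record style): the object persists itself ---
    article = Article(title="Hello")   # Article: an assumed Django model
    article.save()                     # INSERT (or UPDATE on later saves)
    article.delete()

    # --- SQLAlchemy (unit-of-work style): persistence goes through a session ---
    from sqlalchemy.orm import sessionmaker

    Session = sessionmaker(bind=engine)   # engine: assumed, configured elsewhere
    session = Session()
    article = Article(title="Hello")      # here Article is a mapped SQLAlchemy class
    session.add(article)
    session.commit()
    session.delete(article)
    session.commit()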

To the best of my recollection, Django's models include save() and delete() methods so you can deal exclusively with objects rather than interacting with a database connection object. I don't know that this is necessarily the Python way of doing things, but I'm pretty sure it's a pervasive Django pattern.
If I was told "this is Django code" but the code diverged from how Django does things, that might be confusing.

Don't Repeat Yourself. If all the data stored in the database is meant to be accessible through Django (e.g., it is defined in the Django models.py), there's a Django ORM that is already designed to access the database for you safely (no SQL injection) and easily via save() and delete(). There are also helpful wrappers for transactions (e.g., the @transaction.commit_on_success decorator to group actions together). You can use the ORM in Python scripts outside of the running Django web app, too: e.g., create a Django management command, or run a script from the Django shell (./manage.py shell).
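For example, a management command is a minimal way to reuse the ORM outside the web app. A rough sketch, where the Report model and the app name are made up:

    # myapp/management/commands/cleanup_reports.py
    # A sketch only: Report is a hypothetical model in a hypothetical app "myapp".
    from django.core.management.base import BaseCommand
    from django.db import transaction

    from myapp.models import Report


    class Command(BaseCommand):
        help = "Delete processed reports through the ORM, outside the web app."

        def handle(self, *args, **options):
            # commit_on_success is the older API mentioned above;
            # on Django 1.6+ use transaction.atomic() instead.
            with transaction.commit_on_success():
                stale = Report.objects.filter(processed=True)
                count = stale.count()
                stale.delete()
            self.stdout.write("Deleted %d reports\n" % count)

You would then run it with ./manage.py cleanup_reports, just like any built-in command.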
I definitely agree that having another repository layer creates confusion and potentially leads to major issues for people using your code. E.g., sometimes you have model validation that goes beyond the database validation, and if you save outside of Django that validation never runs. Or maybe every time a specific model is saved, extra behavior should occur (like creating a complementary object, or generating a task) that would be skipped if save() is never called, because the pre_save/post_save signals are never sent.
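As a hedged illustration of that second point: a side effect wired to post_save simply never fires if save() is bypassed. The Order and Invoice models below are hypothetical:

    # A sketch: this handler runs only when Order.save() goes through the ORM.
    from django.db.models.signals import post_save
    from django.dispatch import receiver

    from myapp.models import Invoice, Order   # hypothetical models


    @receiver(post_save, sender=Order)
    def create_invoice(sender, instance, created, **kwargs):
        if created:
            # Skipped entirely if the Order row is written by a separate
            # repository layer that bypasses the Django ORM.
            Invoice.objects.create(order=instance)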
Granted, you said this is a document-oriented database (e.g., MongoDB/CouchDB), and AFAIK Django does not support these sorts of NoSQL DBs, so in that case ignore what I said.

Related

Using custom Python classes alongside a Django app

In a school project, my team and I have to create a shopping website with a very specific server-side architecture. We agreed to use Python and turned towards Django, since it seemed to offer more functionality than other possible frameworks. Be aware that none of us has ever used Django before. We aren't masters at deploying applications on the web either (we are all learning).
Here's my problem: two weeks into the project, our teacher told us that we were not allowed to use any ORM. To me, this meant goodbye to Django models, and that we would have to create everything on our own.
Here are my questions: as we have already created all our Python classes, is there any way for us to use them alongside our Django app? I have not seen any example online of people using their own Python classes within a Django app. If it is possible, where should we instantiate all our objects? Would it be easier to just go with another framework (I am thinking about Flask)? Am I just missing important information about how Django works and asking a dumb question?
We have completed 4 weeks and have 6 more to go before finishing our project. I often see "use Flask before using Django" online, since Flask is simpler to use. We decided on Django because the project description recommended Django, but not Flask.
Thanks for the help.
Without being an absolute Django expert, here is my opinion.
The Django ORM is far from being the only feature this framework has to offer (URL routing, the test client, user session variables, etc.), but it is surely one of the main components you will want to use while working with Django, since it is often directly linked to other core features of Django.
If using the ORM is completely forbidden, a lot of out-of-the-box features won't be available to you. The main one I can think of is the admin interface: you won't be able to use it if the ORM is not an option.
So, in my opinion, you should go for another framework like Flask, mainly because without the ORM much of Django's value is gone.
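As a rough sketch of that route, assuming a hypothetical Cart class standing in for the classes you already wrote, plugging plain Python objects into Flask looks roughly like this:

    # A sketch only: Cart stands in for your existing plain Python classes.
    from flask import Flask, jsonify


    class Cart(object):
        def __init__(self):
            self.items = []

        def add(self, name, price):
            self.items.append({"name": name, "price": price})

        def total(self):
            return sum(item["price"] for item in self.items)


    app = Flask(__name__)


    @app.route("/cart/demo")
    def cart_demo():
        cart = Cart()                  # instantiate your own classes inside the view
        cart.add("notebook", 3.50)
        return jsonify(total=cart.total())


    if __name__ == "__main__":
        app.run(debug=True)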
Hope it helps!

Should I save the user's backend in the database in Django?

I'm creating a Django website that supports both a local login backend and LDAP login (through django-auth-ldap), and maybe more in the future.
I'm getting into Django login and backend stuff and have a couple of questions, mainly: is there any reason Django doesn't keep the user's creation backend in the database? Shouldn't user A be linked (and by linked I mean via a field on the User model) to the backend django.contrib.auth.backends.ModelBackend, for safety/convenience reasons?
I'm getting around to creating a custom user model and was thinking about adding such a field. The ability to unambiguously know which backend was/is used to create/log in the user sounds logical to me, but the fact that Django doesn't have that by default, and that I can't find anything similar on the Internet, has me worried that there's a really good reason for doing it the way it is that I haven't thought of.
Thanks in advance,
Paweł
Django doesn't need that info. Once the user is authenticated and Django has a User object, it doesn't care which backend authenticated it. The User model data is stored in one place. The User model (whether the default or a custom one) is consistent and has the same attributes, functionality and behaviour across the entire Django project and schema. Nothing in out-of-the-box Django deals with different user models.
You may extend this with AbstractBaseUser, but managing genuinely different kinds of users across the same project, especially with the core Django modules, is a stretch.
Django uses the User model a lot, and you would have to manually locate each place it does and provide your own routing to the correct backend. There is no API for this (unlike, say, database routers, which have one), so it's going to be a mess of hacks that will probably get even messier with each upgrade.
Django does support, in addition to the custom user model, "authentication backends". Some of the functionality you are looking for is available and exposed through this option, in a formal API, so you probably want to stick with that.
See: https://docs.djangoproject.com/en/1.9/topics/auth/customizing/#authentication-backends
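For reference, an authentication backend is just a small class with authenticate() and get_user(). A minimal sketch, where the token field is purely hypothetical and the authenticate() signature is the pre-1.11 one (no request argument):

    # A sketch of a custom authentication backend, per the docs linked above.
    from django.contrib.auth.models import User


    class TokenBackend(object):
        """Authenticate against a hypothetical per-user token."""

        def authenticate(self, username=None, token=None):
            try:
                user = User.objects.get(username=username)
            except User.DoesNotExist:
                return None
            # Hypothetical check; replace with whatever your backend verifies.
            if token and getattr(user, "api_token", None) == token:
                return user
            return None

        def get_user(self, user_id):
            try:
                return User.objects.get(pk=user_id)
            except User.DoesNotExist:
                return None

It would then be listed in AUTHENTICATION_BACKENDS in settings.py alongside ModelBackend and the LDAP backend.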

Where to instantiate shared thread objects in Django, equiv of pyramid registry?

I'm plugging some framework agnostic code of mine into Django instead of Pyramid. It uses SQLAlchemy and has a customized session factory object for getting db sessions. In pyramid, I instantiate this at server start up in the main app method and attach it to the registry so that all other parts of my app can get at it. I'd like to know what the "correct" way of instantiating and making available a shared factory is in Django. Is there somewhere canonical for putting something like that so that Django users will find it easily and the code will be readable to people used to Django patterns?
thanks
I place my SQLAlchemy/SQLSoup connections in models.py, because they are related to persistence (and to the "model" layer of Model-View-Whatever).
You can even replace the Django ORM with SQLAlchemy, as long as you are not using applications that rely on it, such as django.contrib.admin or django.contrib.auth.
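A minimal sketch of what that can look like in models.py; the connection URL is a placeholder:

    # myapp/models.py -- a shared SQLAlchemy session factory (sketch only).
    from sqlalchemy import create_engine
    from sqlalchemy.orm import scoped_session, sessionmaker

    engine = create_engine("postgresql://user:password@localhost/mydb")

    # scoped_session gives each thread its own session, which suits a
    # multi-threaded Django server.
    Session = scoped_session(sessionmaker(bind=engine))

Other code can then do from myapp.models import Session; it is common to call Session.remove() at the end of each request, e.g. from middleware or a request_finished signal handler.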

Designing an API for internal and remote usage in a Django app

I am starting a web project that should be very flexible and modular, and will definitely grow a lot in the future. As we plan to provide an API to other developers, I started thinking that maybe it is a good idea to implement all the methods as an API and make them usable both remotely and internally.
For instance, say we want to extract all registered users. So we design a method in the API, like get_all_users, which may be available via REST or internal invocation. The problem is that I cannot figure out how to distinguish internal usage from remote usage, as I should also take into consideration the speed of internal invocation, code reuse, and the checking of user permissions (secret keys for API access, etc.). What are the best practices? How should such an API be organized?
To build the API, use Tastypie or Piston. These let you write 'resources' (which are basically API versions of your views): instead of returning HttpResponses you just return objects, which Piston/Tastypie convert into JSON or XML or YAML or whatever your favorite data language is.
Securing the access: consider decorating any resources that contain confidential information with the @login_required or @staff_member_required decorator, as appropriate.
You might even want to go a step further and write a decorator that checks that a staff user is connecting over HTTPS (whereas you might allow a normal user to use any connection type).
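A rough sketch of such a decorator, assuming an older Django where is_authenticated is still a method (all names are illustrative):

    from functools import wraps

    from django.http import HttpResponseForbidden


    def staff_https_required(view_func):
        """Allow the view only for logged-in staff connecting over HTTPS."""
        @wraps(view_func)
        def _wrapped(request, *args, **kwargs):
            if not (request.user.is_authenticated() and request.user.is_staff):
                return HttpResponseForbidden("Staff only")
            if not request.is_secure():
                return HttpResponseForbidden("HTTPS required")
            return view_func(request, *args, **kwargs)
        return _wrapped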
If you have not considered it yet, I recommend using Tastypie to set up an API with Django.
In order to distinguish between your internal and remote usage, maybe you can simply design a different URL scheme.
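For instance, a sketch of a split URL scheme (the module names are placeholders):

    # urls.py -- separate prefixes for the public API and internal-only endpoints.
    from django.conf.urls import include, url

    urlpatterns = [
        # Public, key-authenticated API for other developers.
        url(r'^api/v1/', include('myproject.api.public_urls')),
        # Internal endpoints, e.g. restricted to staff or to the local network.
        url(r'^internal/api/', include('myproject.api.internal_urls')),
    ]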

Best practice: How to persist simple data without a database in django?

I'm building a website that doesn't require a database because a REST API "is the database". (Except you don't want to be putting site-specific things in there, since the API is used mostly by mobile clients.)
However, there are a few things that would normally be put in a database, for example the "jobs" page. You have the master list view and the detail views for each job, and it should be easy to add new job entries (not necessarily via a CMS, but that would be awesome).
e.g. example.com/careers/ and example.com/careers/77/
I could just hardcode this stuff in templates, but that's not DRY: you have to update the master template and the detail template every time.
What do you guys think? Maybe a YAML file? Or any better ideas?
Thx
Why not still keep it in a database? Your remote REST store is all well and funky, but if you've got local data, there's nothing (unless there's a spec saying so) to stop you storing some stuff in a local db. It doesn't have to be anything very glamorous: it could be SQLite, or you could have some fun with Redis, etc.
You could use Memcachedb via the Django cache interface.
For example:
Set the cache backend to memcached in your Django settings, but install/use memcachedb instead.
Django can't tell the difference between the two because they provide the same interface (at least, last time I checked).
Memcachedb is persistent, safe for multithreaded Django servers, and won't lose data during server restarts, but it's just a key-value store, not a complete database.
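A minimal sketch of the settings side, assuming a memcachedb instance is listening where LOCATION points (the address and port are illustrative):

    # settings.py -- point Django's memcached cache backend at memcachedb.
    CACHES = {
        'default': {
            'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
            'LOCATION': '127.0.0.1:21201',
        }
    }

Values are then read and written through django.core.cache.cache.get() and cache.set() as usual.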
Some alternatives built into the Python standard library are listed in the Data Persistence chapter of the documentation. Still another option is JSON.
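As a rough illustration of the JSON route (the file name and fields are made up):

    # A sketch: persist the jobs list as a JSON file instead of a database.
    import json

    JOBS_FILE = "jobs.json"


    def load_jobs():
        try:
            with open(JOBS_FILE) as fh:
                return json.load(fh)
        except IOError:
            return []


    def save_jobs(jobs):
        with open(JOBS_FILE, "w") as fh:
            json.dump(jobs, fh, indent=2)


    jobs = load_jobs()
    jobs.append({"id": 77, "title": "Backend developer", "location": "Remote"})
    save_jobs(jobs)

The views for example.com/careers/ and example.com/careers/77/ can then just read the file (or a cached copy) instead of hitting a database.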
