How do I enforce unique user names in Flask?

I'm a complete beginner to Flask and I'm starting to play around with making web apps.
I'm having a hard time figuring out how to enforce unique user names. I'm thinking about how to do this in SQL, maybe with something like user_name text unique on conflict fail, but then how do I catch the error back in Python?
Alternatively, is there a way to manage this that's built into Flask?

That entirely depends on your database layer. Flask deliberately is not bundled with a specific ORM, though SQLAlchemy is recommended. The good news is that SQLAlchemy has a unique constraint.
Here's how it might work:
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.exc import IntegrityError
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

engine = create_engine('sqlite:///mydb.db')  # your engine here
Session = sessionmaker(bind=engine)
session = Session()
Base = declarative_base()

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(String, unique=True)

Base.metadata.create_all(engine)

# then later...
user = User()
user.name = 'Frank'
session.add(user)
try:
    session.commit()
    print('welcome to the club Frank')
except IntegrityError:
    session.rollback()  # keep the session usable after the failed commit
    print('You are not Frank. Impostor!!!')
Run the part after "then later" twice. The first time you'll get a welcome message; the second time you won't.
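If you'd rather keep the raw on conflict fail approach from the question, the violation surfaces in Python as sqlite3.IntegrityError. A minimal sketch with the stdlib sqlite3 module (the in-memory connection and table here are illustrative):

import sqlite3

conn = sqlite3.connect(':memory:')  # illustrative; use your database file
conn.execute('CREATE TABLE users (user_name TEXT UNIQUE ON CONFLICT FAIL)')

conn.execute("INSERT INTO users VALUES ('frank')")
try:
    conn.execute("INSERT INTO users VALUES ('frank')")
except sqlite3.IntegrityError as exc:
    print('duplicate user name:', exc)  # UNIQUE constraint failed: users.user_name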
Addendum: The closest thing that Flask has to a default authentication framework (the authdigest module's RealmDigestDB) simply stores users in a dict keyed by username. The way to enforce uniqueness there is to check manually, e.g.
if username in digest_db:
    raise Exception('HEY! "{}" already exists! '
                    'You can\'t do that'.format(username))
digest_db.add_user(username, password)
or overriding RealmDigestDB to make sure that it checks before adding:
class FlaskRealmDigestDB(authdigest.RealmDigestDB):
    def add_user(self, user, password):
        if user in self:
            raise AttributeError('HEY! "{}" already exists! '
                                 'You can\'t do that'.format(user))
        super(FlaskRealmDigestDB, self).add_user(user, password)

    def requires_auth(self, f):
        # yada yada
        ...
or overriding RealmDigestDB and making its backing store a mapping which does not allow duplicate assignment, e.g.
class ClosedDict(dict):
    def __setitem__(self, name, val):
        if name in self and val != self[name]:
            raise AttributeError('Cannot reassign {} to {}'.format(name, val))
        super(ClosedDict, self).__setitem__(name, val)

class FlaskRealmDigestDB(authdigest.RealmDigestDB):
    def newDB(self):
        return ClosedDict()

    def requires_auth(self, f):
        # yada yada
        ...
I put this here as an addendum because that class does not persist data in any way; if you're planning on extending authdigest.RealmDigestDB anyway, you should use something like SQLAlchemy as above.

You can use SQLAlchemy. It's available for Flask as a plug-in (the Flask-SQLAlchemy extension).
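For example, a minimal sketch using Flask-SQLAlchemy (the model, database URI, and names here are hypothetical):

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy.exc import IntegrityError

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///users.db'  # hypothetical URI
db = SQLAlchemy(app)

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), unique=True, nullable=False)

with app.app_context():
    db.create_all()
    db.session.add(User(name='Frank'))
    try:
        db.session.commit()
    except IntegrityError:
        db.session.rollback()  # a second 'Frank' would end up here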

Related

Flask Mongoengine ValidationError Field is required on .save() but fields already exist in db

Problem: I get a ValidationError on .save() after appending a value to an EmbeddedDocumentListField, because required fields that already exist on the document are reported as missing.
Note that at this point the User document has already been created as part of the signup process, so it already has an email and password in the DB.
My classes:
class User(gj.Document):
    email = db.EmailField(required=True, unique=True)
    password = db.StringField(required=True)
    long_list_of_thing_1s = db.EmbeddedDocumentListField("Thing1")
    long_list_of_thing_2s = db.EmbeddedDocumentListField("Thing2")

class Thing1(gj.EmbeddedDocument):
    some_string = db.StringField()

class Thing2(gj.EmbeddedDocument):
    some_string = db.StringField()
Trying to append a new EmbeddedDocument to the EmbeddedDocumentListField in my User class in the Thing2 Resource endpoint:
class Thing2(Resource):
    def post(self):
        try:
            body = request.get_json()
            user_id = body["user_id"]
            user = UserModel.objects.only("long_list_of_thing_2s").get(id=user_id)

            some_string = body["some_string"]
            new_thing_2 = Thing2Model()
            new_thing_2.some_string = some_string

            user.long_list_of_thing_2s.append(new_thing_2)
            user.save()
            return 201
        except Exception as exception:
            raise InternalServerError
On hitting this endpoint I get the following error on the user.save()
mongoengine.errors.ValidationError: ValidationError (User:603e39e7097f3e9a6829f422) (Field is required: ['email', 'password'])
I think this is because of the .only("long_list_of_thing_2s")
But I am specifically using UserModel.objects.only("long_list_of_thing_2s") because I don't want to be inefficient by bringing the entire UserModel into memory when I only want to append something to the long_list_of_thing_2s.
Is there a different way I should be going about this? I am relatively new to Flask and Mongoengine so I am not sure what all the best practices are when going about this process.
You are correct, this is due to the .only and is a known "bug" in MongoEngine.
Unless your Model is really large, using .only() will not make a big difference, so I'd recommend using it only if you observe actual performance issues.
If you do have to keep the .only() for whatever reason, you should be able to make use of the push atomic operator. An advantage of the push operator is that in case of race conditions (concurrent requests) it gracefully combines the different updates; that is not the case with a regular .save(), which will overwrite the list.
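For example, a minimal sketch reusing the names from the question's endpoint (UserModel, user_id, new_thing_2):

# push appends server-side, so the document is neither fully loaded
# nor re-validated, and concurrent pushes don't overwrite each other:
UserModel.objects(id=user_id).update_one(
    push__long_list_of_thing_2s=new_thing_2
)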

Peewee retrieves data in python console but not in app

I have entities designed with peewee in Python. Before implementing the real database, I ran several tests with in-memory databases. When I started to implement the real database functionality, I ran into a strange problem: my queries return empty results, and what's more, the behavior depends on whether I run a script or use the Python console.
First of all, let me show that the logic is correct. When I use the Python console, everything is OK:
>>> from Entities import *
>>> print (RouterSettings.select().where(RouterSettings.name=='RUT00').get().name)
RUT00
As you can see, everything is correct: the query executes and returns a result. Now the same in a script:
from Entities import *
print (RouterSettings.select().where(RouterSettings.name=='RUT00').get().name)
This one raises an exception: instance matching query does not exist.
  print(RouterSettings.select().where(RouterSettings.name=='RUT00').get().name)
  File "C:\Users\Kamil\AppData\Local\Programs\Python\Python37-32\lib\site-packages\peewee.py", line 5975, in get
    (clone.model, sql, params))
Entities.RouterSettingsDoesNotExist: instance matching query does not exist:
SQL: SELECT "t1"."id", "t1"."name", "t1"."ip", "t1"."username", "t1"."password", "t1"."model", "t1"."phone_num", "t1"."provider", "t1"."location" FROM "routersettings" AS "t1" WHERE ("t1"."name" = ?) LIMIT ? OFFSET ?
Params: ['RUT00', 1, 0]
When I was trying to debug, I found that the database behaved as if it had not been created: among the variables shown in the debugger, the database object was null (None).
Do you have any ideas what's going on?
My Entities are defined as follows:
from peewee import *

class EnumField(IntegerField):
    def __init__(self, *argv):
        super().__init__()
        self.enum = []
        for label in argv:
            self.enum.append(label)

    def db_value(self, value):
        try:
            return self.enum.index(value)
        except ValueError:
            raise EnumField.EnumValueDoesnExistError(
                "Value doesn't exist in enum set.\nMaybe you forgot to add "
                "that one: " + value + "?")

    def python_value(self, value):
        try:
            return self.enum[value]
        except IndexError:
            raise EnumField.EnumValueDoesnExistError(
                'No value for given id')

    class EnumValueDoesnExistError(Exception):
        pass

class ModelField(EnumField):
    def __init__(self):
        super().__init__('RUT955_Q', 'RUT955_H', 'GLiNet300M')

class ProviderField(EnumField):
    def __init__(self):
        super().__init__('Orange', 'Play', 'Virgin')

class BaseModel(Model):
    class Meta:
        database = SqliteDatabase('SIMail.db', pragmas={'foreign_keys': 1})

class RouterSettings(BaseModel):
    name = CharField(unique=True)
    ip = CharField(unique=True)
    username = CharField()
    password = CharField()
    model = ModelField()
    phone_num = IntegerField(unique=True)
    provider = ProviderField()
    location = CharField()
You are probably using a relative path to the database file, so depending on the current working directory when you run your app vs. the console, it's using a different database file.
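One way to rule that out is to build the path from the module's own location. A minimal sketch against the Entities module above (DB_PATH is a name introduced here):

import os

from peewee import Model, SqliteDatabase

# Anchor the database file next to Entities.py so it no longer depends
# on the current working directory:
DB_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'SIMail.db')

class BaseModel(Model):
    class Meta:
        database = SqliteDatabase(DB_PATH, pragmas={'foreign_keys': 1})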

"Matching"/relations data across databases in Django

In developing a website for indexing system documentation I've come across a tough nut to crack regarding data "matching"/relations across databases in Django.
A simplified model for my local database:
from django.db import models

class Document(models.Model):
    name = models.CharField(max_length=200)
    system_id = models.IntegerField()
    ...
An imagined model; the system details are stored in a remote database:
from django.db import models

class System(models.Model):
    name = models.CharField(max_length=200)
    system_id = models.IntegerField()
    ...
The idea is that when creating a new Document entry at my website the ID of the related system is to be stored in the local database. When presenting the data I would have to use the stored ID to retrieve the system name among other details from the remote database.
I've looked into foreign keys across databases, but this seems to be quite involved and I'm not sure I want relations at all. Rather, I visualize a function inside the Document model/class which is able to retrieve the matching data, for example by importing a custom router/function.
How would I go about solving this?
Note that I won't be able to alter anything on the remote database, and it's read-only. I'm not sure if I should create a model for System as well. Both databases use PostgreSQL; however, my impression is that which database engine is used isn't really relevant to this scenario.
From the Django documentation on multiple databases (manually selecting a database):
# This will run on the 'default' database.
Author.objects.all()

# So will this.
Author.objects.using('default').all()

# This will run on the 'other' database.
Author.objects.using('other').all()
'default' and 'other' are aliases for your databases; in your case they could be 'default' and 'remote'. Of course, you can replace .all() with anything you want, for example:
System.objects.using('remote').get(id=123456)
You are correct that foreign keys across databases are a problem in Django ORM, and to some extent at the db level too.
You already have the answer basically: "I visualize a function inside the Document model/class which is able to retrieve the matching data"
I'd do it like this:
class RemoteObject(object):
    def __init__(self, remote_model, remote_db, field_name):
        # assumes remote db is defined in Django settings and has an
        # associated Django model definition:
        self.remote_model = remote_model
        self.remote_db = remote_db
        # name of id field on model (real db field):
        self.field_name = field_name
        # we will cache the retrieved remote model on the instance
        # the same way that Django does with foreign key fields:
        self.cache_name = '_{}_cache'.format(field_name)

    def __get__(self, instance, cls):
        try:
            rel_obj = getattr(instance, self.cache_name)
        except AttributeError:
            system_id = getattr(instance, self.field_name)
            remote_qs = self.remote_model.objects.using(self.remote_db)
            try:
                rel_obj = remote_qs.get(id=system_id)
            except self.remote_model.DoesNotExist:
                rel_obj = None
            setattr(instance, self.cache_name, rel_obj)
        if rel_obj is None:
            raise self.remote_model.DoesNotExist
        else:
            return rel_obj

    def __set__(self, instance, value):
        setattr(instance, self.field_name, value.id)
        setattr(instance, self.cache_name, value)


class Document(models.Model):
    name = models.CharField(max_length=200)
    system_id = models.IntegerField()

    system = RemoteObject(System, 'system_db_name', 'system_id')
You may recognise that the RemoteObject class above implements Python's descriptor protocol; see here for more info:
https://docs.python.org/2/howto/descriptor.html
Example usage:
>>> doc = Document.objects.get(pk=1)
>>> doc.system_id
3
>>> doc.system.id
3
>>> doc.system.name
'my system'
>>> other_system = System.objects.using('system_db_name').get(pk=5)
>>> doc.system = other_system
>>> doc.system_id
5
Going further you could write a custom db router:
https://docs.djangoproject.com/en/dev/topics/db/multi-db/#using-routers
This would let you eliminate the using('system_db_name') calls in the code by routing all reads for the System model to the appropriate db.
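A minimal router sketch along those lines, assuming the 'system_db_name' alias used above (the router class and its module path are hypothetical):

class SystemRouter:
    """Route all reads of the System model to the remote database."""

    def db_for_read(self, model, **hints):
        if model.__name__ == 'System':
            return 'system_db_name'
        return None  # anything else falls through to 'default'

    def db_for_write(self, model, **hints):
        return None  # the remote db is read-only; never route writes to it

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if db == 'system_db_name':
            return False  # never migrate the read-only remote db
        return None

It would then be registered in settings with DATABASE_ROUTERS = ['path.to.SystemRouter'] (path hypothetical).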
I'd go for a method get_system(). So:
class Document(models.Model):
    def get_system(self):
        return System.objects.using('remote').get(system_id=self.system_id)
This is the simplest solution. Another possibility is to use PostgreSQL's foreign data wrapper feature. By using FDW you can abstract the multi-db handling away from Django and do it inside the database; then you can run queries that traverse the document -> system relation.
Finally, if your use case allows it, just copying the system data periodically to the local db can be a good solution.

Python data structures -- do I need to use SQL plugin... if so, how?

I have the following class with associated attributes:
class Company(object):
    def __init__(self):
        self.ticker = None         # string
        self.company = None        # string
        self.creator = None        # string
        self.link = None           # string
        self.prices = []           # list of tuples (could be many hundreds of entries long)
        self.creation_date = None  # date entry
I populate individual companies and then store a list of companies into class Companies:
class Companies(object):
    def __init__(self):
        self.companies = []

    def __len__(self):
        return len(self.companies)

    def __getitem__(self, key):
        return self.companies[key]

    def __repr__(self):
        return 'Company list of length %i' % (self.__len__())

    def add(self, company):
        self.companies.append(company)
I want to be able to easily perform queries such as Companies.find(creator="someguy") and have the class return a list of all companies created by someguy. I also want to be able to run a query such as Companies.find(creation_date > x) and return a list of all entries created after a certain date.
I have a little bit of experience of doing similar work with Django's built in SQL functionality and found that to be pretty convenient. However, this project isn't using Django and I don't know if there are any other, smaller packages that provide this functionality. I'd like to keep the interfacing with the SQL server to a minimum because I don't have much experience with the language.
Here are my questions:
To do the above, do I need to use an external database program? Or does a package exist that will do the above and also let me easily save the data (pickling or otherwise)? I feel, perhaps unjustifiably, that things start becoming messier and more complicated once SQL is involved.
If a database is not necessary for the above, how do you evaluate when you have enough data that you will want to incorporate an external database?
What packages, e.g. Django, exist to minimize the SQL legwork?
All this being said, what do you recommend I do?
SQLAlchemy is a full-featured Python ORM (Object Relational Mapper). It can be found at http://www.sqlalchemy.org/.
Example usage to define a new object:
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = 'users'

    id = Column(Integer, primary_key=True)
    name = Column(String)
    fullname = Column(String)
    password = Column(String)

    def __init__(self, name, fullname, password):
        self.name = name
        self.fullname = fullname
        self.password = password
To create a new User:
>>> ed_user = User('ed', 'Ed Jones', 'edspassword')
>>> ed_user.name
'ed'
>>> ed_user.password
'edspassword'
To write to the database (after creating a new session):
ed_user = User('ed', 'Ed Jones', 'edspassword')
session.add(ed_user)
session.commit()
And to query:
>>> our_user = session.query(User).filter_by(name='ed').first()
>>> our_user
<User('ed','Ed Jones', 'edspassword')>
More details can be found at http://docs.sqlalchemy.org/en/rel_0_8/orm/tutorial.html (taken from docs).
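Applied to the question's use case, the two lookups would look roughly like this (a sketch; it assumes Company has been mapped as a model in the same way as User above, with creator and creation_date columns):

by_creator = session.query(Company).filter_by(creator='someguy').all()
recent = session.query(Company).filter(Company.creation_date > x).all()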

Disabling committing object changes in SQLAlchemy

I'm using SQLAlchemy in a project that is not a web application. It is a server application that loads a number of different objects from a database and modifies them locally, but I don't want to save those updates to the database each time a commit is issued. I previously worked with the Django ORM for some web projects and found it better suited to what I'm trying to achieve. In the Django ORM I could .save() each object whenever I wanted, without saving other things I might not want to save. I understand why SQLAlchemy works the way it does, but I wonder how I could do this in the Django-like way?
Update:
To make it easier to understand what I'm trying to achieve, I'll provide you an example.
This is how it actually works:
a = MyModel.query.get(1)
b = MyModel.query.get(1)
a.somefield = 1
b.somefield = 2
# this will save both of the changed models
session.commit()
This is how I want it to work:
a = MyModel.query.get(1)
b = MyModel.query.get(1)
a.somefield = 1
b.somefield = 2
a.save()
# I don't want to save b; b's changes aren't committed
I want to have greater control over what is actually saved. I want to save the changes to each object every 5 minutes or so.
I use something like:
class BaseModel(object):
    def save(self, commit=True):
        # this part can be optimized.
        try:
            db.session.add(self)
        except FlushError:
            # In case of an update operation.
            pass
        if commit:
            db.session.commit()

    def delete(self, commit=True):
        db.session.delete(self)
        if commit:
            db.session.commit()
and then I define my models as:
class User(db.Model, BaseModel):
    ...
So, now I can do:
u = User(username='foo', password='bar')
u.save()
Is this what you were planning to achieve?
I am not sure I understand your predicament.
In Django,
foo = MyModel(field1='value1', field2='value2')
foo.save()
or alternatively
foo = MyModel.objects.create(field1='value1', field2='value2')
In SQLAlchemy,
foo = MyModel(field1='value1', field2='value2')
session.add(foo)
At this point you have only added the object to the session; the transaction has not yet been committed. You need to commit only after you have made whatever changes were required:
session.commit()
Take a look at this link. I think it will make the transition from the Django ORM to SQLAlchemy easier.
UPDATE
For such a situation, you could use multiple sessions.
from sqlalchemy import create_engine, Column, Integer, String, MetaData
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

engine = create_engine("postgresql+psycopg2://user:password@localhost/test")
metadata = MetaData(bind=engine)
Session = sessionmaker(bind=engine)
session1 = Session()
session2 = Session()
Base = declarative_base()

class User(Base):
    __tablename__ = 'users'

    id = Column(Integer, primary_key=True)
    name = Column(String)
    age = Column(Integer)

    def __init__(self, name, age):
        self.name = name
        self.age = age

    def __repr__(self):
        return "<User('%s','%s')>" % (self.name, self.age)

Base.metadata.create_all(engine)
This creates a table 'users' in the 'test' db. Two session objects, session1 and session2, have also been initialized.
a = User('foo','10')
b = User('bar', '20')
session1.add(a)
session1.add(b)
session1.commit()
The table users will now have 2 records:
1: foo, 10
2: bar, 20
Fetch the 'foo' record using session1 and the 'bar' record using session2:
foo = session1.query(User).filter(User.name == "foo").first()
bar = session2.query(User).filter(User.name == "bar").first()
Make changes to the 2 records:
foo.age = 11
bar.age = 21
Now, if you want the changes of foo alone to carry over,
session1.commit()
and for bar,
session2.commit()
Not to stir up an old post, but
You say:
I want to save changes of each object every 5 minute or so.
So why not use a scheduler like Celery? (I use pyramid_celery.)
With this you can save each object every 5 minutes, i.e. you can add a decorator:
@periodic_task(run_every=crontab(minute="*/5"))
def somefunction():
    # your code here
    ...
This works great, especially when you need to update your database to make sure it is up to date (in the case that many users are using your system). Hope this helps someone with the saving-every-5-minutes part.
