I am writing a Django app that keeps track of which email addresses are allowed to post content to a user's account. The user can whitelist and blacklist addresses as they like.
Any addresses that aren't specified can either be handled per message or just default to whitelist or blacklist (again user specified).
Here are the Django models I wrote... do you think this is a good way to do it, or should I add a whitelist and blacklist field to each user's profile model?
class knownEmail(models.Model):
    # The user who set this address' permission, NOT
    # the user who the address belongs to...
    relatedUser = models.ManyToManyField(User)
    email = models.EmailField()

class whiteList(knownEmail):
    pass

class blackList(knownEmail):
    pass
Then I could do something like:
def checkPermission(user, emailAddress):
    "Check if 'emailAddress' is allowed to post content to 'user's profile"
    if whiteList.objects.filter(relatedUser=user, email=emailAddress):
        return True
    elif blackList.objects.filter(relatedUser=user, email=emailAddress):
        return False
    else:
        return None
Is there a better way?
I would restructure it so both lists were contained in one model.
class PermissionList(models.Model):
    setter = models.ManyToManyField(User)
    email = models.EmailField(unique=True)  # don't want conflicting results
    permission = models.BooleanField()
Then, your lists would just be:
# whitelist
PermissionList.objects.filter(permission=True)
# blacklist
PermissionList.objects.filter(permission=False)
To check a particular address, you just add a couple of functions to the model:
class PermissionList(...):
    ...

    @classmethod
    def is_on_whitelist(cls, email):
        return cls.objects.filter(email=email, permission=True).exists()

    @classmethod
    def is_on_blacklist(cls, email):
        return cls.objects.filter(email=email, permission=False).exists()

    @classmethod
    def has_permission(cls, email):
        if cls.is_on_whitelist(email):
            return True
        if cls.is_on_blacklist(email):
            return False
        return None
Having everything in one place is a lot simpler, and you can make more interesting queries with less work.
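For example, per-user and cross-user queries fall out of simple filters; here is a sketch assuming the PermissionList model above (user, my_whitelist and popular_blocks are illustrative names):

from django.db.models import Count

# a given user's whitelist
my_whitelist = PermissionList.objects.filter(setter=user, permission=True)

# addresses that more than one user has blacklisted
popular_blocks = (PermissionList.objects
                  .filter(permission=False)
                  .annotate(num_setters=Count('setter'))
                  .filter(num_setters__gt=1))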
[Please start All Class Names With Upper Case Letters.]
Your code doesn't make use of your class distinction very well.
Specifically, your classes don't have any different behavior. Since both classes have all the same methods, it isn't clear why these are two different classes in the first place. If they have different methods, then your solution is good.
If, however, they don't have different methods, you might want to look at providing a customized manager for each of the two subsets of KnownEmail
# LIST_CHOICES is assumed here; the values match what the managers filter on.
LIST_CHOICES = (
    ('W', 'Whitelist'),
    ('B', 'Blacklist'),
)

class WhiteList(models.Manager):
    def get_query_set(self):
        return super(WhiteList, self).get_query_set().filter(status='W')

class BlackList(models.Manager):
    def get_query_set(self):
        return super(BlackList, self).get_query_set().filter(status='B')

class KnownEmail(models.Model):
    relatedUser = models.ForeignKey(User)
    email = models.EmailField()
    status = models.CharField(max_length=1, choices=LIST_CHOICES)

    objects = models.Manager()  # default manager shows all entries
    whiteList = WhiteList()     # KnownEmail.whiteList.all() is the whitelist subset
    blackList = BlackList()     # KnownEmail.blackList.all() is the blacklist subset
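A quick usage sketch with these managers; the three-state permission check from the question then becomes a pair of filtered lookups (function and variable names are illustrative):

def check_permission(user, email_address):
    if KnownEmail.whiteList.filter(relatedUser=user, email=email_address).exists():
        return True
    if KnownEmail.blackList.filter(relatedUser=user, email=email_address).exists():
        return False
    return None  # unknown address: fall back to the user's chosen default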
This class compares an email address with a blacklist of email domains. If you prefer, you can install this module with pip install django-email-blacklist.
from django.conf import settings
import re

class DisposableEmailChecker():
    """
    Check if an email is from a disposable
    email service
    """

    def __init__(self):
        self.emails = [line.strip() for line in open(settings.DISPOSABLE_EMAIL_DOMAINS)]

    def chunk(self, l, n):
        return (l[i:i + n] for i in range(0, len(l), n))

    def is_disposable(self, email):
        for email_group in self.chunk(self.emails, 20):
            regex = "(.*" + ")|(.*".join(email_group) + ")"
            if re.match(regex, email):
                return True
        return False
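A minimal usage sketch, assuming settings.DISPOSABLE_EMAIL_DOMAINS points at a text file with one blocked domain per line:

checker = DisposableEmailChecker()

if checker.is_disposable("someone@mailinator.com"):
    print("Rejecting disposable address")  # replace with whatever your app should do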
I have a model (Event) that has a ForeignKey to the User model (the owner of the Event).
This User can invite other Users, using the following ManyToManyField:
invites = models.ManyToManyField(
    User, related_name="invited_users",
    verbose_name=_("Invited Users"), blank=True
)
This invites field generates a simple through table, containing the id, event_id and user_id columns.
In case the Event owner deletes his profile, I don't want the Event to be deleted, but instead to pass the ownership to the first user that was invited.
So I came up with this function:
def get_new_owner():
    try:
        invited_users = Event.objects.get(id=id).invites.order_by("-id").filter(is_active=True)
        if invited_users.exists():
            return invited_users.first()
        else:
            Event.objects.get(id=id).delete()
    except ObjectDoesNotExist:
        pass
This finds the Event instance, and returns the active invited users ordered by the Invite table ID, so I can get the first item of this queryset, which corresponds to the first user invited.
In order to run the function when a User gets deleted, I used on_delete=models.SET:
owner = models.ForeignKey(User, related_name='event_owner', verbose_name=_("Owner"), on_delete=models.SET(get_new_owner()))
Then I ran into some problems:
It can't access the ID of the field I'm passing
I couldn't find a way to use it as a classmethod or something, so I had to put the function above the model. Obviously this meant that it could no longer access the class below it, so I tried to pass the Event model as a parameter of the function, but could not make it work.
Any ideas?
First we can define a deletion strategy for the owner field that will call the function with the object that is being updated. We can define such a strategy in, for example, the app_name/deletion.py file:
# app_name/deletion.py

def SET_WITH(value):
    if callable(value):
        def set_with_delete(collector, field, sub_objs, using):
            for obj in sub_objs:
                collector.add_field_update(field, value(obj), [obj])
    else:
        def set_with_delete(collector, field, sub_objs, using):
            collector.add_field_update(field, value, sub_objs)
    set_with_delete.deconstruct = lambda: ('app_name.deletion.SET_WITH', (value,), {})
    return set_with_delete
You should pass the callable itself, not the result of calling it, so you implement this as:
from django.conf import settings
from django.db.models import Q

from app_name.deletion import SET_WITH

def get_new_owner(event):
    invited_users = event.invites.order_by(
        'eventinvites__id'
    ).filter(~Q(pk=event.owner_id), is_active=True).first()
    if invited_users is not None:
        return invited_users
    else:
        event.delete()
class Event(models.Model):
    # …
    owner = models.ForeignKey(
        settings.AUTH_USER_MODEL,
        related_name='owned_events',
        verbose_name=_('Owner'),
        on_delete=SET_WITH(get_new_owner)
    )
Here we will thus look at the invites to find a user to transfer the object to. Note that in get_new_owner we exclude the current .owner of the event from the collection of .invites.
As @AbdulAzizBarkat says, it is better to work with a CASCADE than to explicitly delete the Event object, since that avoids infinite recursion where a User delete triggers an Event delete that might trigger a User delete: at the moment this is not possible, but if extra logic is implemented later one might end up in such a case. In that case we can work with:
from django.db.models import CASCADE

def SET_WITH(value):
    if callable(value):
        def set_with_delete(collector, field, sub_objs, using):
            for obj in sub_objs:
                val = value(obj)
                if val is None:
                    CASCADE(collector, field, [obj], using)
                else:
                    collector.add_field_update(field, val, [obj])
    else:
        def set_with_delete(collector, field, sub_objs, using):
            collector.add_field_update(field, value, sub_objs)
    set_with_delete.deconstruct = lambda: ('app_name.deletion.SET_WITH', (value,), {})
    return set_with_delete
and rewrite the get_new_owner to:
def get_new_owner(event):
    invited_users = event.invites.order_by(
        'eventinvites__id'
    ).filter(~Q(pk=event.owner_id), is_active=True).first()
    if invited_users is not None:
        return invited_users
    else:  # strictly speaking not necessary, but explicit over implicit
        return None
We're using django-mptt for tree items and I have to create functionality for automatic email notifications when a tree item has been modified. That is absolutely straightforward using save signals, but if a parent item is modified, then n (n being the number of child items plus the parent item) emails are sent instead of just one (because the parent's changes are automatically applied to the children). Is there a good way to prevent that and basically tell Django "Please send only one email per model"?
The best I've come up with is a kind of hacky way (and a pretty bad hack): add a model to the database that is updated when the parent item is modified, and with children always check that model before sending email. I.e. if HiddenModel exists, do not send email, else send email. And after about five seconds (probably in another thread) remove/undo the modifications to that HiddenModel object. It would probably work, but performance-wise it's just bad, with all those database queries.
EDIT: Model: (shortened version for SO, probably has some errors in this form)
from django.core.mail import EmailMessage
from django.db import models
from mptt.models import MPTTModel, TreeForeignKey

class TreeItem(MPTTModel):
    parent = TreeForeignKey('self', null=True, blank=True, related_name='children')
    users = models.ManyToManyField('accounts.User', blank=True)
    name = models.CharField(max_length=255)

    @property
    def tree_path(self):
        if self.is_root_node():
            return []
        tree = self.get_ancestors(include_self=False, ascending=False)
        return list(map(lambda item: item.name, tree))

    def save(self, *args, **kwargs):
        is_new = self.pk is None
        if not is_new:
            # evaluate now, so we have a snapshot from before the save
            prev_allowed_users = set(TreeItem.objects.get(pk=self.pk).users.all())
        super().save(*args, **kwargs)
        new_allowed_users = set(self.users.all())
        if is_new:
            self.send_access_email(list(new_allowed_users))  # TODO
        else:
            if prev_allowed_users != new_allowed_users:
                send_to = self.get_new_access_users(prev_allowed_users, new_allowed_users)  # TODO
                self.send_access_email(send_to)

    def get_new_access_users(self, prev_users, new_users):
        send_to = []
        for user in new_users:
            if user not in prev_users:
                send_to.append(user)
        return send_to

    def send_access_email(self, recipient_list):
        email_content = 'example'
        email = EmailMessage(
            'subject',
            email_content,
            'ex@ample.com',
            recipient_list)
        print("Sending email to %s users" % str(len(recipient_list)))
You need some sort of registry to figure out if your object is being saved externally or as part of a tree traversing cascade. It seems that using a class variable might be a more elegant way of doing this. Here is some pseudo-code to illustrate what I mean.
class TreeItem(MPTTModel):
    _save_registry = set()

    ... your model fields go here ...

    def is_save_locked(self):
        # with django-mptt, get_ancestors() gives the parent chain
        return any(p.id in TreeItem._save_registry for p in self.get_ancestors())

    def save_lock(self):
        TreeItem._save_registry.add(self.id)

    def save_release(self):
        TreeItem._save_registry.remove(self.id)

    def save(self, *args, **kwargs):
        if not self.is_save_locked():
            self.save_lock()
            ... do things only original save should do ...
            self.save_release()
        ... do things every save should do ...
        return super(TreeItem, self).save(*args, **kwargs)
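A hypothetical walk-through of the intended flow, assuming the child items are re-saved while the parent's id is still in the registry (the exact timing depends on how your cascade is wired up):

parent = TreeItem.objects.get(name="root")
parent.name = "renamed root"
parent.save()
# While the parent holds the lock, any child save sees is_save_locked() == True
# (an ancestor id is in _save_registry) and skips the "original save only" work,
# e.g. sending the notification email, so only one email goes out.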
The question:
What are the advantages of using Factory Boy in the following situation? I don't really see why I shouldn't just deliver my own custom objects. If I am wrong please show me why.
I am using Factory Boy to make user instances during my tests, which creates a UserProfile object dynamically (a standard recipe from the factory_boy documentation).
The Data class creates data that will be delivered to forms during a post (other methods I'm using deliver data for self.client.post calls that log in, register, and activate users). Unless I'm missing something, I'd have to build a separate DjangoModelFactory subclass for each situation in order to use ClassName.attributes() where the data requirements differ. The other reason I went in this direction is that UserProfile has a User foreign key, so I wasn't able to call UserProfileFactory.attributes() directly, only UserFactory.attributes(). Why not just make my own like I'm doing?
# factories.py

IMAGE_PATH = os.path.join(os.path.dirname(__file__),
                          '../../test_files/test_images/image.jpeg')

class UserProfileFactory(DjangoModelFactory):
    FACTORY_FOR = UserProfile

    user = factory.SubFactory('portal.factories.UserFactory', profile=None)
    first_name = factory.Sequence(lambda n: "Joe_%d" % n)
    last_name = factory.Sequence(lambda n: "Schmoe_%d" % n)
    nickname = factory.Sequence(lambda n: "JoeBlow_%d" % n)
    profile_image = factory.LazyAttribute(lambda t: File(open(IMAGE_PATH)))

class UserFactory(DjangoModelFactory):
    FACTORY_FOR = User

    username = factory.Sequence(lambda n: "user_%d" % n)
    password = make_password("password")
    email = factory.Sequence(lambda n: "user_%d@gmail.com" % n)
    profile = factory.RelatedFactory(UserProfileFactory, 'user')

    @classmethod
    def _generate(cls, create, attrs):
        models.signals.post_save.disconnect(user_post_save, sender=User)
        user = super(UserFactory, cls)._generate(create, attrs)
        models.signals.post_save.connect(user_post_save, sender=User)
        return user

class Data(object):
    def __init__(self):
        self.IMAGE_PATH = os.path.join(os.path.dirname(__file__),
                                       '../../test_files/test_images/image.jpeg')
        self.profile_image = File(open(self.IMAGE_PATH))

    def get_profile_update(self, user):
        return {'first_name': 'Jeff',
                'last_name': 'Lebowski',
                'nickname': 'The Dude',
                'profile_image': self.profile_image,
                'user': user.pk}

    def and_so_on(self):
        pass  # continues...
Then I am using data like this in the following context during my integration tests:
class PortalTestCase(TestCase):
    """Shortened and simplified"""

    def test_edit_profile_post(self):
        user = UserFactory.create()
        login_bool = self.client.login(username=user.username,
                                       password=self.data.get_password())
        data = self.data.get_profile_update(user)
        response = self.client.post(reverse(self.get_edit_profile()),
                                    data=data,
                                    follow=True)
        success_url = 'http://testserver%s' % reverse(self.get_portal())
        template_name = self.get_portal_template()
        content_text_img = 'src="/' + user.get_profile().profile_image.url + '"'
        self.assertRedirects(response, success_url)
        self.assertTemplateUsed(response, template_name)
        self.assertContains(response, content_text_img)
You are correct that for this single test case Factory Boy may be overkill. However, think about the application as a whole. More often than not you will need another UserFactory object somewhere else, and that footwork has already been done. The reason I choose Factory Boy is consistent reusability.
As the documentation lays out, but doesn't exactly spell out, this tool is meant to grow with your application, not to replace a single isolated test case.
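A short sketch of the kind of reuse this buys you; the factory names follow the ones above, and the overrides are illustrative:

user = UserFactory()                           # generated user plus related profile
named_user = UserFactory(username="the_dude")  # override just the field under test
many_users = UserFactory.create_batch(5)       # e.g. for list or pagination tests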
I am learning Django, and want to retrieve all objects that DON'T have a relationship to the current object I am looking at.
The idea is a simple Twitter copycat.
I am trying to figure out how to implement get_non_followers.
from django.db import models

RELATIONSHIP_FOLLOWING = 1
RELATIONSHIP_BLOCKED = 2
RELATIONSHIP_STATUSES = (
    (RELATIONSHIP_FOLLOWING, 'Following'),
    (RELATIONSHIP_BLOCKED, 'Blocked'),
)

class UserProfile(models.Model):
    name = models.CharField(max_length=200)
    website = models.CharField(max_length=200)
    email = models.EmailField()
    relationships = models.ManyToManyField('self', through='Relationship',
                                           symmetrical=False,
                                           related_name='related_to')

    def __unicode__(self):
        return self.name

    def add_relationship(self, person, status):
        relationship, created = Relationship.objects.get_or_create(
            from_person=self,
            to_person=person,
            status=status)
        return relationship

    def remove_relationship(self, person, status):
        Relationship.objects.filter(
            from_person=self,
            to_person=person,
            status=status).delete()
        return

    def get_relationships(self, status):
        return self.relationships.filter(
            to_people__status=status,
            to_people__from_person=self)

    def get_related_to(self, status):
        return self.related_to.filter(
            from_people__status=status,
            from_people__to_person=self)

    def get_following(self):
        return self.get_relationships(RELATIONSHIP_FOLLOWING)

    def get_followers(self):
        return self.get_related_to(RELATIONSHIP_FOLLOWING)

    def get_non_followers(self):
        # How to do this?
        return

class Relationship(models.Model):
    from_person = models.ForeignKey(UserProfile, related_name='from_people')
    to_person = models.ForeignKey(UserProfile, related_name='to_people')
    status = models.IntegerField(choices=RELATIONSHIP_STATUSES)
This isn't particularly glamorous, but it gives correct results (just tested):
def get_non_followers(self):
    return UserProfile.objects.exclude(to_people=self,
                                       to_people__status=RELATIONSHIP_FOLLOWING).exclude(id=self.id)
In short, use exclude() to filter out all UserProfiles following the current user, which will leave the user themselves (who probably shouldn't be included) and all users not following them.
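For completeness, a hypothetical call site, assuming the UserProfile model above:

alice = UserProfile.objects.get(name="Alice")
strangers = alice.get_non_followers()  # profiles not following alice, excluding alice herself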
I've been searching for a method or some other way to do that for about an hour, but I found nothing.
There is a way to do it, though: you can simply use a for loop to iterate through all the objects and remove the ones that have a particular attribute value.
Here is some sample code:
# work on a plain list, since a queryset cannot be modified in place
all_objects = list(className.objects.all())
for obj in list(all_objects):  # iterate over a copy while removing
    if obj.some_attribute == "some_value":
        all_objects.remove(obj)
Solution to the implementation of get_non_followers:
def get_non_following(self):
    return UserProfile.objects.exclude(to_person__from_person=self, to_person__status=RELATIONSHIP_FOLLOWING).exclude(id=self.id)
This answer was posted as an edit to the question Finding objects without relationship in django by the OP Avi Meir under CC BY-SA 3.0.
current_userprofile = current_user.get_profile()
rest_of_users = set(UserProfile.objects.exclude(pk=current_userprofile.pk))

follow_relationships = Relationship.objects.filter(from_person=current_userprofile)
followers = set()
for follow in follow_relationships:
    followers.add(follow.to_person)

non_followers = rest_of_users.difference(followers)
Here non_followers is the set of UserProfiles you want; current_user is the user whose non-followers you are trying to find.
I haven't tested this out, but I think it should do what you want.
def get_non_followers(self):
    return self.related_to.exclude(
        from_people__to_person=self)
I am trying to figure out how to use proxy classes in Django. I want to receive a queryset where each object belongs to a proxy class of a common super class so that I can run custom sub-classed methods with the same name and my controller logic doesn't need to know or care about which kind of Proxy model it is working with. One thing I don't want to do is to store the information in multiple tables because I want to have unified identifiers for easier reference/management.
I am pretty new to django/python so I would be happy to hear alternative ways to accomplish what I am trying to do.
Here is what I have:
import pickle

from django.db import models

TYPES = (
    ('aol', 'AOL'),
    ('yhoo', 'Yahoo'),
)

class SuperConnect(models.Model):
    name = models.CharField(max_length=90)
    type = models.CharField(max_length=45, choices=TYPES)
    connection_string = models.TextField(null=True)

class ConnectAOL(SuperConnect):
    class Meta:
        proxy = True

    def connect(self):
        conn_options = self.deconstruct_constring()
        # do special stuff to connect to AOL

    def deconstruct_constring(self):
        return pickle.loads(self.connection_string)

class ConnectYahoo(SuperConnect):
    class Meta:
        proxy = True

    def connect(self):
        conn_options = self.deconstruct_constring()
        # do special stuff to connect to Yahoo

    def deconstruct_constring(self):
        return pickle.loads(self.connection_string)
Now what I want to do is this:
connections = SuperConnect.objects.all()
for connection in connections:
    connection.connect()
    connection.dostuff
I've looked around and found some hacks but they look questionable and may require me to go to the database for each item in order to retrieve data I probably already have...
Somebody please rescue me :) or I am going to go with this hack:
from django.db.models.query import QuerySet

class MixedQuerySet(QuerySet):
    def __getitem__(self, k):
        item = super(MixedQuerySet, self).__getitem__(k)
        if item.atype == 'aol':
            return ConnectAOL.objects.get(id=item.id)
        elif item.atype == 'yhoo':
            return ConnectYahoo.objects.get(id=item.id)
        else:
            raise NotImplementedError

    def __iter__(self):
        for item in super(MixedQuerySet, self).__iter__():
            if item.atype == 'aol':
                yield ConnectAOL.objects.get(id=item.id)
            elif item.atype == 'yhoo':
                yield ConnectYahoo.objects.get(id=item.id)
            else:
                raise NotImplementedError

class MixManager(models.Manager):
    def get_query_set(self):
        return MixedQuerySet(self.model)

TYPES = (
    ('aol', 'AOL'),
    ('yhoo', 'Yahoo'),
)

class SuperConnect(models.Model):
    name = models.CharField(max_length=90)
    atype = models.CharField(max_length=45, choices=TYPES)
    connection_string = models.TextField(null=True)

    objects = MixManager()

class ConnectAOL(SuperConnect):
    class Meta:
        proxy = True

    def connect(self):
        conn_options = self.deconstruct_constring()
        # do special stuff to connect to AOL

    def deconstruct_constring(self):
        return pickle.loads(self.connection_string)

class ConnectYahoo(SuperConnect):
    class Meta:
        proxy = True

    def connect(self):
        conn_options = self.deconstruct_constring()
        # do special stuff to connect to Yahoo

    def deconstruct_constring(self):
        return pickle.loads(self.connection_string)
As you mentioned in your question, the problem with your solution is that it generates an SQL query for every object instead of using a single SQL IN (id1, id2, ...) query. Proxy models cannot contain additional database fields, so there is no need for extra SQL queries.
Instead, you can convert a SuperConnect object to the appropriate type in SuperConnect.__init__, using the __class__ attribute:
class SuperConnect(models.Model):
    name = models.CharField(max_length=90)
    type = models.CharField(max_length=45, choices=TYPES)
    connection_string = models.TextField(null=True)

    def __init__(self, *args, **kwargs):
        super(SuperConnect, self).__init__(*args, **kwargs)
        if self.type == 'aol':
            self.__class__ = ConnectAOL
        elif self.type == 'yhoo':  # matches the 'yhoo' value in TYPES
            self.__class__ = ConnectYahoo
There is no need for custom managers or querysets, the correct type is set when the SuperConnect object is initialized.
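A quick sketch of what this buys you, assuming the proxy classes from the question; a single queryset now dispatches to the right connect() without extra queries:

for connection in SuperConnect.objects.all():  # one SQL query
    connection.connect()  # runs ConnectAOL.connect or ConnectYahoo.connect as appropriate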
How about putting all the logic in one class? Something like this:
class SuperConnect(models.Model):
    # ... fields as above ...

    def connect(self):
        return getattr(self, "connect_%s" % self.type)()

    def connect_aol(self):
        pass  # AOL stuff

    def connect_yhoo(self):  # named to match the 'yhoo' type value
        pass  # Yahoo! stuff
In the end you have your type field and you should be able to do most (if not all) of the things that you can do with separate proxy classes.
If this approach doesn't solve your specific use cases, please clarify.
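And a hypothetical call site for this variant; the lookup is illustrative and the dispatch happens entirely inside connect():

conn = SuperConnect.objects.get(name="my AOL account")
conn.connect()  # getattr resolves to connect_aol because conn.type == 'aol'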