Python Django Multiple Database Committing Objects with Foreign Key Relations - python

I am using Django with multiple databases. I have a 'preview' database that holds messages uploaded by users; these have to be previewed by an admin and 'accepted', at which point they are committed to the 'default' production database. The following view is supposed to do that, but I'm getting an error. Each SentenceModel has a foreign key to a MessageSegment, and each MessageSegment has a foreign key to a Message. I want to move each item to the new database if the admin accepts the content, and then delete the old entries in the preview database. Please help! Thanks -
Here is the error:
instance is on database "preview", value is on database "default"
The Error message occurs at this line:
newMessageSegment.msg = newMessage # Setup the foreign key to the msg itself
Here is the View function:
## This view is used when the admin approves content and clicks on the "accept content" button when reviewing
## a recent upload - it saves the data to the production database
def accept_content(request, msg_id=None):
    if msg_id == None: # If for some reason we got a None, then it's not a valid page to accept so redirect home
        return HttpResponseRedirect("/") # Redirect home
    msgList = Message.objects.using('preview').all() # Get all Messages
    msgSegmentList = MessageSegment.objects.using('preview').all() # Get all MessageSegment objects
    sentenceModels = SentenceModel.objects.using('preview').all() # Get all SentenceModels
    for msgs in msgList: # Iterate all msgs
        if int(msgs.id) != int(msg_id): # Don't care if it is not the msg needing review
            continue # Short circuit
        msgPrimaryKey = msgs.pk # Extract the primary key from this msg to restore later
        msgs.pk = None # Erase the primary key so we can migrate databases properly
        newMessage = msgs # This is the msg to transfer to the new database
        newMessage.save(using='default') # Save the item to the production database
        for msgSegment in msgSegmentList: # Iterate all msg segments for this msg
            if msgSegment.msg_id == msgPrimaryKey: # Check the foreign keys on the msg segment to msg connection
                newMessageSegment = msgSegment # Define a new msg segment
                msgSegment.pk = None # Erase the primary key so we can assign it properly
                newMessageSegment.pk = None # Erase the primary key so we can assign it properly
                newMessageSegment.msg = newMessage # Set up the foreign key to the msg itself
                newMessageSegment.save(using='default') # Save the item to the production database
                for sentenceModel in sentenceModels: # Iterate all sentences for this msg segment
                    if sentenceModel.msg_segment_id == msgSegment.id: # Determine which sentences are for this msg segment
                        newSentenceModel = sentenceModel # Define the newSentenceModel
                        newSentenceModel.msg_segment = newMessageSegment # Set up the foreign key to the msg segment
                        newSentenceModel.save(using='default') # Save the item to the production database
                        sentenceModel.delete(using='preview') # Delete the item from the review database
                msgSegment.delete(using='preview') # Delete the item from the review database
        msgs.pk = msgPrimaryKey # Restore the key so we can delete it properly
        msgs.delete(using='preview') # Delete the item from the review database
    return HttpResponseRedirect("/")

Django remembers which database an object was loaded from, so each newMessageSegment is still affiliated with the preview database until you save it to default, and Django correctly disallows the cross-database FK assignment. This is untested, but it might work to assign to the underlying msg_id field instead:
newMessageSegment.msg_id = newMessage.id
Failing that, you could create a new copy of newMessageSegment rather than just creating a new reference to it. I think you could automate that by iterating over msgSegment._meta.fields, but I might be overlooking a subtlety of inheritance or something. And any many-to-many fields would be a pain.
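A rough sketch of that copy-by-fields idea (untested; copy_to_default is just an illustrative helper name, it skips the primary key and ignores many-to-many fields):
def copy_to_default(obj, **overrides):
    # Copy raw column values (attname gives e.g. msg_id rather than the related object),
    # skipping the primary key so a new row is inserted on save
    data = {f.attname: getattr(obj, f.attname) for f in obj._meta.fields if not f.primary_key}
    data.update(overrides)  # e.g. msg_id=newMessage.id, pointing at the row already saved to 'default'
    new_obj = obj.__class__(**data)
    new_obj.save(using='default')
    return new_obj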
Or, if you just want to hack it, edit the internal state that tracks which database the object belongs to. I wouldn't generally recommend that, but it's going to be overwritten when you save anyway.
newMessageSegment._state.db = "default"
newMessageSegment.msg = newMessage
newMessageSegment.save(using="default")

Related

How to access data in documents from realtime listener python

Apologies if some of this doesn't make sense, I'm struggling to understand realtime listeners completely.
I'm trying to add a realtime listener to the chat part of my app, so I can add new messages to the screen as they come into the database. In the below code I load all current messages to the screen when the user opens the page and then I (try to) add the realtime listener in so any new messages can be added to the screen.
However, the doc_snapshot is just a list of the document ids rather than the message_dict I have been using above. How do I access the data for each document in doc_snapshot, rather than just the id?
Or am I doing it completely wrong? Should I skip the one-time load of the messages when the screen is opened and just use a realtime listener to load the messages and listen for new ones?
self.local_id is the id of the user who has logged in, doc_id is the id of the person they're messaging.
def move_to_chat(self, doc_id):
    group_id = self.local_id + ":" + doc_id
    doc_ref = self.my_firestore.db.collection(u'messages').document(group_id)
    doc = doc_ref.get()
    if doc.exists: # Check if the document exists. If it does, load the messages to the screen
        get_messages = self.my_firestore.db.collection(u'messages').document(group_id).collection(group_id).order_by(u'Timestamp').limit(20)
        messages = get_messages.stream()
        for message in messages:
            message_dict = message.to_dict()
            try:
                if message_dict['IdFrom'] == self.local_id:
                    pass # Add label to left of screen
                else:
                    pass # Add label to right of screen
            except:
                pass
    else: # If it doesn't, create it
        self.my_firestore.db.collection(u'messages').document(group_id).set({
            u'GroupId': group_id
        })
        add_to_doc = self.my_firestore.db.collection(u'messages').document(group_id).collection(group_id).document()
        add_to_doc.set({
            u'Timestamp': datetime.datetime.now()
        })
    # Watch for new messages
    self.query_watch = self.my_firestore.db.collection(u'messages').document(group_id).collection(group_id)
    # Watch the document
    self.query_watch.on_snapshot(self.on_snapshot)

def on_snapshot(self, doc_snapshot, changes, read_time):
    for doc in doc_snapshot:
        pass # Here's where I'd like to access data from the documents, to find the message that has been added.
The Google Firestore documentation on the Snapshot method explains the classes used for representing documents in the Google Cloud Firestore API. You can refer to that documentation to confirm the returned values.
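As a sketch (assuming the standard google-cloud-firestore client, where the callback receives DocumentSnapshot objects), you can read each document's data the same way you already do in move_to_chat:
def on_snapshot(self, doc_snapshot, changes, read_time):
    for doc in doc_snapshot:
        message_dict = doc.to_dict()  # Full document data, not just the id
        print(doc.id, message_dict)
        # ...add the new message to the screen here, e.g. based on message_dict['IdFrom']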

Python Heroku Scheduler DB commit doesn't change values - DetachedInstanceError

I am using the following code in the free Heroku Scheduler add-on to send emails to certain users. After the email is sent, a value in the DB must be changed; to be more precise:
setattr(user, "stats_email_sent", True)
Somehow db_session.commit() is executed but doesn't save the new value. Here is the code:
all_users = User.query.all()
for user in all_users:
    if user.stats_email_sent is False and user.number_of_rooms > 0:
        if date.today() <= user.end_offer_date and date.today() >= user.end_offer_date - relativedelta(days=10):
            print user.id, user.email, user.number_of_rooms, user.bezahlt
            if user.bezahlt is True:
                with app.app_context():
                    print "app context true", user.id, user.email, user.number_of_rooms, user.bezahlt
                    html = render_template('stats_email_once.html', usersname=user.username)
                    subject = u"Update"
                    setattr(user, "stats_email_sent", True)
                    #send_email(user.email, subject, html, None)
            else:
                with app.app_context():
                    print "app context false", user.id, user.email, user.number_of_rooms, user.bezahlt
                    html = render_template('stats_email_once.html', usersname=user.username)
                    subject = u"Update"
                    setattr(user, "stats_email_sent", True)
                    #send_email(user.email, subject, html, None)
print "executing commit"
db_session.commit()
I tried moving db_session.commit() right after the setattr; then it works, but only for one user (the first user).
setattr(user, "stats_email_sent", True)
db_session.commit()
And it gives me this in the logs:
sqlalchemy.orm.exc.DetachedInstanceError: Instance <User at 0x7f5c04d462d0> is not bound to a Session; attribute refresh operation cannot proceed
I also found some topics on detaching an instance, but I don't need to detach anything here, do I?
EDIT:
I now also tried adding db_session.expire_on_commit = False. Sadly, this had no effect.
I also looked at bulk updates, which also didn't work.
I even tried to ignore and pass the DetachedInstanceError.
I can't believe that updating multiple rows at once is such an issue. I am running out of ideas here; any help is appreciated. Everything I try either has no effect or runs into DetachedInstanceError.
EDIT
I solved the issue; I had a similar one yesterday. I assume the query and its variables were consumed, which is why it didn't work.
To solve this I created a list, client_id_list = [], and appended the ids of all users who got the email and whose value needs to be changed.
Then I created a completely new query and ran essentially the same code with the same logic, but here the variables were not consumed, I guess? I won't post this as an answer to my own question, because I am not sure whether that explanation is true. Here is the code which I appended to the code above and which changes the values:
all_users_again = User.query.all()
for user in all_users_again:
    if user.id in client_id_list:
        setattr(user, "stats_email_sent", True)
db_session.commit()
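The question notes that bulk updates did not work in this setup, but for reference, the usual shape of a bulk UPDATE with SQLAlchemy (a sketch assuming the same User model, db_session, and client_id_list as above) is:
# Update all matching rows in one statement, without loading or re-attaching individual instances
db_session.query(User).filter(User.id.in_(client_id_list)).update(
    {User.stats_email_sent: True}, synchronize_session=False)
db_session.commit()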

Django - request.session not being saved

I have a pretty simple utility function that gets an open web order if there is a session key called 'orderId', and will create one if there is no session key and the parameter 'createIfNotFound' is equal to True. Stepping through it with my debugger I can see that the piece of code that sets the session key after an order has been created does get hit with no exceptions, but when I check the HTTP request object's session field, it does not have that attribute.
Utility
def get_open_web_order(request, createIfNotFound=False):
    # Check for orderId in session
    order_id = request.session.get('orderId')
    web_order = None
    if None != order_id:
        try:
            web_order = WebOrder.objects.get(id=order_id, status='O')
            logging.info('Found open web order')
        except WebOrder.DoesNotExist:
            logging.info('Web order not found')
    if (None == web_order) and (createIfNotFound == True):
        logging.info('Creating new web order')
        web_order = WebOrder()
        web_order.status = 'O'
        web_order.save()
        request.session['orderId'] = web_order.id
        # Assign logged in user and default billing and shipping
        if request.user.is_authenticated() and hasattr(request.user, 'customer'):
            customer = request.user.customer
            web_order.customer = customer
            web_order.set_defaults_from_customer()
            web_order.save()
    return web_order
In some cases you need to explicitly tell the session that it has been modified.
You can do this by adding request.session.modified = True to your view, after changing something in the session.
You can read more on this here - https://docs.djangoproject.com/en/1.10/topics/http/sessions/#when-sessions-are-saved
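Applied to the utility above, that would look roughly like this (a minimal sketch):
request.session['orderId'] = web_order.id
request.session.modified = True  # Explicitly mark the session as changed so Django saves it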
I had a similar issue, turns out I had set SESSION_COOKIE_DOMAIN in settings.py to the incorrect domain so it would not save any of my new session data. If you are using SESSION_COOKIE_DOMAIN, try checking that!
For example, if I am running the server on my localhost but I have in my settings SESSION_COOKIE_DOMAIN = "notlocalhost", then nothing I change in request.session will save.

Django Channels

I have a little question about Django Channels, WebSockets, and chat applications. Searching with Google gets me to chat rooms where people can connect and start a chat, but I don't know how one user can send another user an instant message.
For example:
1) I add John to my friends and want to start a chat.
2) On the server side I can generate a Room object, with me and John as members.
3) When I send a message via WebSocket to this room, I know who the message is for, but I don't know how to get John's channel:
@channel_session_user_from_http
def ws_connect(message):
    rooms_with_user = Room.objects.filter(members=message.user)
    for r in rooms_with_user:
        Group('%s' % r.name).add(message.reply_channel)

@channel_session_user
def ws_receive(message):
    prefix, label = message['path'].strip('/').split('/')
    try:
        room = Room.objects.get(name=label)
    except Exception, e:
        room = Room.objects.create(name=get_random_string(30))
        for u in message.chmembers:
            room.members.add(u)
            # here could be something like this
            # try
            reply_channel = Channels.objects.get(online=True, user=u)
            Group('%s' % r.name).add(reply_channel)
    Group('%s' % room.name).send({
        "text": "%s : %s" % (message.user.username, message['text']),
    })

@channel_session_user
def ws_disconnect(message):
    prefix, label = message['path'].strip('/').split('/')
    Group(label).discard(message.reply_channel)
Simply make "automatic unique rooms" for user pairs. The rest stays the same. For example like this
def get_group_name(user1, user2):
    return 'chat-{}-{}'.format(*sorted([user1.id, user2.id]))
Give it two user objects and it returns a unique group name for that pair of users, ordered by User.id: something like "chat-1-2" for the users with User.id 1 and 2.
That way, a user can connect with more than one logged-in device and still get the messages sent between the two users.
You can get the authenticated user's object from message.user.
For the receiving User object, I'd just send the username along with the message. Then you can unpack it from message['text'] the same way you unpack the actual message.
payload = json.loads(message.content['text'])
msg = payload['msg']
sender = message.user
receiver = get_object_or_404(User, username=payload['receiver'])
# ... here you could check if they have required permission ...
group_name = get_group_name(sender, receiver)
response = {'msg': msg}
Group(group_name).send({'text': json.dumps(response)})
# ... here you could persist the message in a database ...
So with that, you can drop all the "room" things from your example, including the Room table, because group names are created on the fly whenever a message is sent between two users.
Another important thing: one user will connect later than the other and may miss the initial messages. So when a user connects, you probably want to check a "chat_messages" database table, fetch the last 10 or 20 messages between the user pair, and send those back so users can catch up on their past conversation.
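A minimal sketch of that catch-up step, assuming a hypothetical ChatMessage model with sender, receiver, and created fields (not part of the code above; adjust to your own schema):
from django.db.models import Q

def get_recent_messages(user1, user2, limit=20):
    # The last `limit` messages exchanged between the two users, oldest first
    qs = ChatMessage.objects.filter(
        Q(sender=user1, receiver=user2) | Q(sender=user2, receiver=user1)
    ).order_by('-created')[:limit]
    return list(reversed(list(qs)))
You could call this in ws_connect and send the results back over message.reply_channel before live messages start arriving.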

App Engine Python db.delete not working as expected

Please help me understand what I must be doing wrong here.
I run a small video game on Google App Engine, and within the game we have an internal messaging service. Each message has a list of status keys that keep track of whether a player has read a message or not.
My problem is that when I attempt to delete a list of keys, only one of the entities is removed from the datastore, regardless of the number of keys in the list.
class Game_Message(db.Model):
    sender = db.StringProperty()
    recipients = db.ListProperty(str)
    status_keys = db.ListProperty(db.Key)
    payload = db.TextProperty()

    def add_status_keys(self):
        self.status_keys = []
        for user_id in self.recipients:
            gms = Game_Message_Status.create(self, user_id)
            self.status_keys.append(gms.key())

    def remove_status_keys(self):
        db.delete(self.status_keys)
What I have found is that calling db.delete multiple times does delete all the entities, but I don't understand why.
For example, this works correctly:
def remove_status_keys(self):
    db.delete(self.status_keys)
    db.delete(self.status_keys)
