I am building a Django application which will be contacted by multiple external applications. The Django application is supposed to provide a UI and populate the database with the data received from the external applications.
My first idea was to use django-rest-framework, but this seemed like creating a tightly coupled system, because every external app would have to contact the Django app via a REST call.
My other idea is best described with a picture: http://imgur.com/vakZvQs Several publishers would create messages on a RabbitMQ queue, and my Django app would consume those and create the appropriate models in the DB.
Is something like this possible? I've used the async examples from the pika library for the publisher and consumer, and the messages flow as expected. Throwing Django into the mix produces errors such as:
RuntimeError: Model class django.contrib.contenttypes.models.ContentType doesn't declare an explicit app_label
django.core.exceptions.ImproperlyConfigured: Requested setting LOGGING_CONFIG, but settings are not configured. You must either define the environment variable DJANGO_SETTINGS_MODULE or call settings.configure() before accessing settings.
Code excerpts:
# pika consumer
def on_message(self, unused_channel, basic_deliver, properties, body):
    # invoking the view function
    from myapp.views import create_one_foo
    create_one_foo()
    self.acknowledge_message(basic_deliver.delivery_tag)

# views.py
from .models import Foo

def create_one_foo():
    foo = Foo()
    foo.bar = "bar"
    foo.save()
I had a similar issue; it was solved by calling these lines before importing any models:

import os

import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "admin.settings")
django.setup()

and then

from .models import Foo

I am still learning Django; if I find a detailed explanation I will edit my answer.
Use this article for creating the consumer:
import json
from os import environ
from sys import path

import django
import pika

# Add the project directory (the one containing the Likes package) to
# sys.path, then point Django at the settings module.
path.append('/home/john/Dev/SECTION/Likes')
environ.setdefault('DJANGO_SETTINGS_MODULE', 'Likes.settings')
django.setup()

from likes.models import Quote

connection = pika.BlockingConnection(
    pika.ConnectionParameters('localhost', heartbeat=600,
                              blocked_connection_timeout=300))
channel = connection.channel()
channel.queue_declare(queue='likes')

def callback(ch, method, properties, body):
    print("Received in likes...")
    data = json.loads(body)
    print(data)
    if properties.content_type == 'quote_created':
        # objects.create() already saves; no extra save() call needed
        Quote.objects.create(id=data['id'], title=data['title'])
        print("quote created")
    elif properties.content_type == 'quote_updated':
        quote = Quote.objects.get(id=data['id'])
        quote.title = data['title']
        quote.save()
        print("quote updated")
    elif properties.content_type == 'quote_deleted':
        quote = Quote.objects.get(id=data)
        quote.delete()
        print("quote deleted")

channel.basic_consume(queue='likes', on_message_callback=callback, auto_ack=True)
print("Started Consuming...")
channel.start_consuming()
Look at Celery: http://www.celeryproject.org. It's a framework that helps you create RabbitMQ-based workers.
Run a Celery worker service on the host where your Django app lives. If you need to change the state of the Django DB, just import the Django models and have the worker write the data to the database. Alternatively, you can run Celery workers inside the Django app.
Related
I am trying to use a global configuration when defining an Elasticsearch DSL model, which is more or less a regular Python class, i.e. a service class.
"""Define models"""
from elasticsearch_dsl import Document, Text
from flask import current_app
class Greeting(Document):
"""Define Greeting model"""
message = Text()
class Index:
name = current_app.config['GREETINGS_INDEX']
def save(self, ** kwargs):
return super().save(** kwargs)
Unfortunately, if my import statement is at the top of the view file, I get this error message:
RuntimeError: Working outside of application context.
This typically means that you attempted to use functionality that needed
to interface with the current application object in some way. To solve
this, set up an application context with app.app_context(). See the
documentation for more information.
The only way to get things to work is if I import the model/service class inside the request like this:
from elasticsearch_dsl import Search
from flask import Blueprint, current_app
# from .models import Greeting  ### this will throw the application context error

greetings = Blueprint(
    'greetings',
    __name__,
    url_prefix='/greetings/'
)

...

@greetings.route("/elasticsearch/new/")
def new_greeting_using_elasticsearch():
    from .models import Greeting  ### this avoids the application context error
    Greeting.init()
    greeting = Greeting(message="hello, elastic")
    greeting.save()
    return (
        "a greeting was saved; "
        "it is viewable from https://localhost:5000/greetings/elasticsearch/"
    )
This seems like a code smell. Is there another way to reuse configuration that keeps the import statement at the top of the file?
These questions/answers seem to suggest that this is the only way:
How to access config value outside view function in flask
Flask - RuntimeError: Working outside of application context
Am I missing something? Should I re-architect my application to avoid this? Or is this just the Flask way of doing things?
Thank you for your help 🙏
Other questions/answers/articles that did not help me:
"RuntimeError: Working outside of application context " with Python Flask app ( Sending gmail using scheduler )
https://flask.palletsprojects.com/en/0.12.x/appcontext/#creating-an-application-context
Access config values in Flask from other files
RuntimeError: working outside of application context
Python #property in Flask configs?
Reading properties from config file with Flask and Python
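One commonly suggested pattern for this situation is to defer reading the config until the application factory runs, by giving the model module an init function that create_app() calls. Below is a sketch using plain classes, since the idea is framework-agnostic; with elasticsearch_dsl the same thing can be done by assigning the Document's index name at startup. The names init_models and "greetings-default" are made up for illustration.

```python
# models.py -- no current_app access at import time
class Greeting:
    index_name = "greetings-default"  # placeholder until configured

def init_models(config):
    # Called from create_app(), where the real config is available, so
    # importing this module never needs an application context.
    Greeting.index_name = config["GREETINGS_INDEX"]

# inside create_app(), after the app object exists:
init_models({"GREETINGS_INDEX": "greetings"})
print(Greeting.index_name)  # -> greetings
```

This keeps `from .models import Greeting` safely at the top of view files, because nothing app-dependent runs at import time.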
I'm trying to create a simple Twitter bot that receives info from different APIs and posts tweets based on that info. I'm using Django for this (not sure if it's completely needed, but I'm just trying to learn the framework), so I created two different apps: one that receives the information from the API, creates an object (so far this object is just a quote), and sends it to another app which handles the Twitter publication. The general idea is that my quote app will generate a signal which will be sent to the posting app, where I created a function with a receiver decorator so it will be listening all the time. Nevertheless, for some reason the decorator is not working, and after sending the signal I get no response.
This is the creation of the signal:
from django.dispatch import Signal
new_quote = Signal(providing_args=['author', 'content'])
This is the sending of the signal:
quote = Quote.objects.create(author=author, content=content)
new_quote.send_robust(sender=quote, author=author, content=content)
The object is being created with no problem; I've already checked that.
And this is the catching of the signal.
from .models import Post
from django.dispatch import receiver
from quotes.signals import new_quote
from quotes.models import Quote

@receiver(new_quote, sender=Quote)
def post_tweet(sender, **kwargs):
    print('here')
    auth = kwargs['author']
    content = kwargs['content']
    Post.objects.create(title=auth, content=content)
The print is just there to check whether the function actually runs.
The post creation also works fine; I already checked that too.
I'm just learning Django; I've already read the documentation and followed the steps of the tutorial, but there must be something I'm not seeing.
EDIT:
app file of the post creator:

from django.apps import AppConfig

class PostsConfig(AppConfig):
    name = 'posts'

    def ready(self):
        import posts.signals

app file of quotes:

from django.apps import AppConfig

class QuotesConfig(AppConfig):
    name = 'quotes'

    def ready(self):
        pass
The sender is quote, which is an instance of the Quote class, while the receiver is registered with sender=Quote, i.e. the Quote class itself. So the senders don't match. According to the documentation on sending signals, you should pass sender=Quote instead of sender=quote when you call send_robust.
I would like to use my Django application as a relay for a session-based online service and share this session among all users. For this, I've configured a python-requests Session object.
I would like to initialise this Session when Django starts up and keep it alive forever. My idea is to have all requests to my Django application share the session object by allowing the view for the particular request to access the session object.
In Flask setting this up (for experimental purposes) is fairly easy:
from flask import Flask, jsonify
from requests import Session

app = Flask(__name__)
session = Session()
session.post()  # Set up the session by logging in

@app.route("/")
def use_session():
    reply = session.get()  # Get resource from the web service
    return jsonify(reply)
Here session would be created when starting and can be accessed by use_session().
I struggle to set up the same in Django though. Where would be the preferred place to create the session?
The equivalent of your Flask code in Django would be to put the same logic in a views.py file:
# yourapp/views.py
from django.http import HttpResponse
from requests import Session

session = Session()
session.post('https://httpbin.org/post')  # Set up the session by logging in

def use_session(request):
    reply = session.get('https://example.com')  # Get resource from web service
    return HttpResponse(reply.content, status=reply.status_code)

# yourproject/urls.py
from django.urls import path
from yourapp.views import use_session

urlpatterns = [
    path('', use_session),
]
The object will get created as you start the server.
One problem with this approach is that in a real-world deployment you normally run multiple copies of your app (regardless of whether it's Flask, Django, or another framework) to avoid thread blocking and to utilize multiple CPU cores or multiple servers. Each copy will then end up with its own session, which might work fine but might also be problematic.
In a situation like this, you can share a Python object1 across multiple processes by serializing it using the pickle module and storing the byte data in a database. A good place for the data would be something like Redis or Memcached but Django's ORM can be used as well:
# yourapp/models.py
from django.db import models

class Session(models.Model):
    pickled = models.BinaryField()

# yourapp/views.py
import pickle

from django.http import HttpResponse
from requests import Session

from . import models

def use_session(request):
    # Load the pickled session from the database
    session_db_obj = models.Session.objects.first()
    if session_db_obj:
        # Un-pickle the session
        session = pickle.loads(session_db_obj.pickled)
    else:
        # Create a new session
        session = Session()
        session.post('https://httpbin.org/post')  # Set up the session by logging in
        # Create the database object, pickle the session and save it
        session_db_obj = models.Session()
        session_db_obj.pickled = pickle.dumps(session)
        session_db_obj.save()
    reply = session.get('https://example.com')  # Get resource from web service
    return HttpResponse(reply.content, status=reply.status_code)
1: Not every object can be pickled and unpickled reliably; be careful!
Probably the best place to set this up is in settings.py since it's called before the application is initialized and you can easily import your code from there.
That being said, you may want to look into connection pooling, or put a wrapper on top of your session that can recreate it in case of failure. Networks are not as reliable as they seem, and if you plan to keep the session alive for a long time, the chances increase that it will be dropped at some point.
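Such a wrapper could be sketched like this; the SharedSession name, the login step, and the single-retry policy are all assumptions, not a prescribed API:

```python
import threading

import requests

class SharedSession:
    """A process-wide requests.Session that rebuilds itself on failure."""

    def __init__(self, login_url):
        self._login_url = login_url
        self._lock = threading.Lock()  # views may run in multiple threads
        self._session = None

    def _build(self):
        session = requests.Session()
        # Log in here, e.g.: session.post(self._login_url, data={...})
        return session

    def get(self, url, **kwargs):
        with self._lock:
            if self._session is None:
                self._session = self._build()
            session = self._session
        try:
            return session.get(url, **kwargs)
        except requests.ConnectionError:
            # Drop the broken session and retry once with a fresh one.
            with self._lock:
                self._session = self._build()
                session = self._session
            return session.get(url, **kwargs)
```

Views would then call a single module-level SharedSession instance instead of a bare requests.Session.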
I've run into a scenario where I have to override a common middleware in CKAN. In CKAN's default plugins/interface.py:
class IMiddleware(Interface):
    u'''Hook into CKAN middleware stack

    Note that methods on this interface will be called two times,
    one for the Pylons stack and one for the Flask stack (eventually
    there will be only the Flask stack).
    '''
    def make_middleware(self, app, config):
        u'''Return an app configured with this middleware

        When called on the Flask stack, this method will get the actual Flask
        app so plugins wanting to install Flask extensions can do it like
        this::

            import ckan.plugins as p
            from flask_mail import Mail

            class MyPlugin(p.SingletonPlugin):

                p.implements(p.I18nMiddleware)

                def make_middleware(app, config):
                    mail = Mail(app)
                    return app
        '''
        return app
This shows that I have to define a "MyMiddleware" class under a plugin that I want to implement in an extension. However, as the example shows, the actual middleware (Mail) is imported from a different class. I want to override TrackingMiddleware, especially the __call__(self, environ, start_response) method, where environ and start_response are passed in when make_pylons_stack is invoked during the configuration phase. If I want to override TrackingMiddleware, should I create another config/middleware/myTrackingMiddleware.py under ckanext-myext/, and then implement the following in plugin.py?
from myTrackingMiddleware import MyTrackingMiddleware

class MyPlugin(plugins.SingletonPlugin):
    plugins.implements(plugins.IMiddleware, inherit=True)

    def make_middleware(self, app, config):
        tracking = MyTrackingMiddleware(app, config)
        return app
Update:
I tried to put myTrackingMiddleware in a hierarchy and imported it in plugin.py, but I didn't receive any requests to '/_tracking' in myTrackingMiddleware.
I have implemented a set of processes, and it works for me. Basically, I kept what I had done, as mentioned in my own question. Then, if your middleware conflicts with a CKAN default middleware, you probably have to completely disable the default one. I discussed this with some major contributors of CKAN here: https://github.com/ckan/ckan/issues/4451. After I disabled ckan.tracking_enabled in dev.ini, I had the flexibility to get values from environ and handle tracking with my customized logic.
I'd like to have these lines of code executed on server startup (both development and production):
from django.core import management
management.call_command('syncdb', interactive=False)
Putting it in settings.py doesn't work, as it requires the settings to be loaded already.
Putting them in a view and accessing that view externally doesn't work either, as there are some middlewares that use the database and those will fail and not let me access the view.
Putting them in a middleware would work, but that would get called each time my app is accessed. A possible solution might be to create a middleware that does all the work and then removes itself from MIDDLEWARE_CLASSES so it's not called anymore. Can I do that without too much monkey-patching?
Write a middleware that does this in __init__ and afterwards raise django.core.exceptions.MiddlewareNotUsed from __init__; Django will then remove the middleware for all requests :). By the way, __init__ is called at startup, not at the first request, so it won't block your first user.
There is talk about adding a startup signal, but that won't be available soon (a major open question, for example, is when this signal should be sent).
Related Ticket: https://code.djangoproject.com/ticket/13024
Update: Django 1.7 includes support for this. (Documentation, as linked by the ticket)
In Django 1.7+, if you want to run startup code and:
1. avoid running it in migrate, makemigrations, shell sessions, ...
2. avoid running it twice or more
a solution would be:
file: myapp/apps.py

from django.apps import AppConfig

def startup():
    # startup code goes here
    pass

class MyAppConfig(AppConfig):
    name = 'myapp'
    verbose_name = "My Application"

    def ready(self):
        import os
        if os.environ.get('RUN_MAIN'):
            startup()

file: myapp/__init__.py

default_app_config = 'myapp.apps.MyAppConfig'
This post uses suggestions from @Pykler and @bdoering.
If you are using Apache/mod_wsgi for both, use the WSGI script file described in:
http://blog.dscpl.com.au/2010/03/improved-wsgi-script-for-use-with.html
Add what you need after language translations are activated.
Thus:
import sys
sys.path.insert(0, '/usr/local/django/mysite')
import settings
import django.core.management
django.core.management.setup_environ(settings)
utility = django.core.management.ManagementUtility()
command = utility.fetch_command('runserver')
command.validate()
import django.conf
import django.utils
django.utils.translation.activate(django.conf.settings.LANGUAGE_CODE)
# Your line here.
django.core.management.call_command('syncdb', interactive=False)
import django.core.handlers.wsgi
application = django.core.handlers.wsgi.WSGIHandler()
You can create a custom management command and write your code in its handle function; details are here: https://docs.djangoproject.com/en/dev/howto/custom-management-commands/
Then you can create a startup script that runs the Django server and then executes your new custom command.
If you are using mod_wsgi, you can put it in the WSGI startup script.
Here is how I work around the missing startup signal for Django:
https://github.com/lsaffre/djangosite/blob/master/djangosite/models.py
The code being called there is specific to my djangosite project, but the trick of getting it called by writing a special app (based on an idea by Ross McFarland) should work in other environments.
Luc