Get image full url on Django Channel Response - python

I have created a socket with Django Channels that returns the serialized data of a Category object. But in the response there is no full URL (the IP address is not there). This problem is similar to this question: Django serializer Imagefield to get full URL. The difference is that I am calling the serializer from a Consumer (Django Channels), whereas in the link, the serializer is called from a View. In a Consumer there is no request object as mentioned in that solution. The Django Channels docs say that scope in Consumers is similar to request in Views. So how can I get the full image URL in this case?

The Django Channels docs say that scope in Consumers is similar to request in Views.
Correct; how you handle it depends on how you set up your events in the AsyncConsumer. It would help if you could share more of your code or a dummy example, but in general:
Import the serializer in the consumer and pass the incoming data to it, as shown below.
from <app_name>.serializers import <desired_serializer_name>Serializer
from channels.db import database_sync_to_async

@database_sync_to_async
def serializer_checking_saving_data(self, data):
    serializer = <desired_serializer_name>Serializer(data=data)
    serializer.is_valid(raise_exception=True)
    x = serializer.create(serializer.validated_data)  # this will create the value in the DB
    return <desired_serializer_name>Serializer(x).data
To fetch data from a websocket request:
Set up a receive event (i.e. the channel layer will receive the data) that dispatches to a particular handler based on the event type (in this example, one that simply displays the data).
# write this inside the AsyncWebsocketConsumer
async def receive_json(self, content, **kwargs):
    """
    All the events received by the server are evaluated here.
    Based on the event type in the payload, the matching
    handler function is executed.
    """
    message_type = content.get('type')
    if message_type == 'start.sepsis':
        await self.display_the_data(content)

async def display_the_data(self, data):
    message = data.get('payload')
    print(f"The data sent to the channel/socket is \n {data}")
You can make the websocket request in the following way (in a new python file):
import json
import asyncio
import websocket

async def making_websocket_request():
    ws = websocket.WebSocket()
    ws.connect('ws://localhost:8000/<ws-router-url>/')
    await asyncio.sleep(2)  # it might take a couple of seconds to connect to the server
    ws.send(json.dumps({
        'type': 'display.the_data',
        # Channels converts "display.the_data" to "display_the_data";
        # since "display_the_data" is an event defined above, it will be called
        'payload': {<can-be-any-json-data>}
        # this payload is passed as a parameter when the handler is called
    }))
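To answer the original question directly: a serializer normally builds the full URL from `request`, but in a consumer you only have `scope`. Since the scope carries the client's `Host` header, you can reconstruct the base URL from it and hand it to the serializer via `context`. This is a minimal sketch of that idea; `CategorySerializer` and the `base_url` context key are hypothetical names, not part of the question's code.

```python
def base_url_from_scope(scope):
    """Build 'http(s)://host[:port]' from an ASGI scope dict."""
    headers = dict(scope.get("headers", []))  # scope headers are (name, value) byte pairs
    host = headers.get(b"host", b"localhost").decode()
    scheme = "https" if scope.get("scheme") in ("https", "wss") else "http"
    return f"{scheme}://{host}"

# Inside the consumer you could then do something like:
#   serializer = CategorySerializer(category,
#                                   context={"base_url": base_url_from_scope(self.scope)})
# and in the serializer's method field, prefix image.url with context["base_url"].
print(base_url_from_scope({"scheme": "ws",
                           "headers": [(b"host", b"192.168.1.5:8000")]}))
# → http://192.168.1.5:8000
```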

Related

IB_insync - Sanic error after one successful order preventing any further orders

I'm writing an API using ib_insync, Sanic and ngrok to forward webhook signals from Tradingview onto Interactive Brokers. It works on only the first attempt and the following error is thrown preventing any further orders:
[ERROR] Exception occurred while handling uri: 'http://url.ngrok.io/webhook'
Traceback (most recent call last):
  File "handle_request", line 103, in handle_request
    "_future_listeners",
sanic.exceptions.ServerError: Invalid response type None (need HTTPResponse)
The code is as follows:
from datetime import datetime
from sanic import Sanic
from sanic import response
from ib_insync import *

# Create Sanic object
app = Sanic(__name__)
app.ib = None

# Create root
@app.route('/')
async def root(request):
    return response.text('online')

# Listen for signals and execute orders
@app.route('/webhook', methods=['POST'])
async def webhook(request):
    if request.method == 'POST':
        await checkIfReconnect()
        # Parse alert data
        alert = request.json
        order = MarketOrder(alert['action'], alert['quantity'], account=app.ib.wrapper.accounts[0])
        # Submit market order
        stock_contract = Stock('NVDA', 'SMART', 'USD')
        app.ib.placeOrder(stock_contract, order)

# Reconnect if needed
async def checkIfReconnect():
    if not app.ib.isConnected() or not app.ib.client.isConnected():
        app.ib.disconnect()
        app.ib = IB()
        app.ib.connect('127.0.0.1', 7496, clientId=1)

# Run app
if __name__ == '__main__':
    # Connect to IB
    app.ib = IB()
    app.ib.connect('127.0.0.1', 7496, clientId=1)
    app.run(port=5000)
You are seeing this error because you have forgotten to send a response to the first POST request. All HTTP requests need a corresponding response, even if the request is only there to trigger an action. I.e., change your webhook code to this:
@app.route('/webhook', methods=['POST'])
async def webhook(request):
    if request.method == 'POST':
        await checkIfReconnect()
        # Parse alert data
        alert = request.json
        order = MarketOrder(alert['action'], alert['quantity'], account=app.ib.wrapper.accounts[0])
        # Submit market order
        stock_contract = Stock('NVDA', 'SMART', 'USD')
        app.ib.placeOrder(stock_contract, order)
        return HTTPResponse("ok", 200)  # <-- This line added
    return HTTPResponse("", 405)  # <-- else return this
I'm working with the same code; using breakpoints, every POST triggers correctly. I see exactly what you mean about only one order being placeable each time the app is started. I tried using ib.qualifyContract(Stock) but it creates an error within the loop for the webhook. I wonder if you can move your order placement outside any loop functions. I'll try when I get some time and report back.
I'm using almost the same script as you.
I'm guessing your problem is not the HTTP response mentioned before (I don't use it).
The thing is that each order sent to IB has to have a unique identifier, and I'm not seeing that applied in your code.
You can read about it here: https://interactivebrokers.github.io/tws-api/order_submission.html
I found a way of doing it, but it's complicated for me to explain here.
Basically you'll have to add the order id to each and every order sent, and from there there are two ways to go:
connect properly to the IB API and check for the current available unique order ID
use an ID of your own (numeric), for example in the form of a loop, and reset the sequence in TWS on each restart of the script, as shown in the link I added.
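The second option (an ID of your own) can be sketched as a small local sequencer; the class name and starting value here are my own invention, and with the real API you would instead ask TWS for the next valid id at connect time:

```python
import itertools

class OrderIdSequencer:
    """Hand out unique, ascending order ids, seeded from a starting value.

    Illustrative sketch of the 'use an ID of your own' option; remember
    to reset the sequence in TWS when restarting the script, as the
    linked docs describe.
    """
    def __init__(self, start=1):
        self._counter = itertools.count(start)

    def next_id(self):
        return next(self._counter)

seq = OrderIdSequencer(start=100)
print(seq.next_id(), seq.next_id(), seq.next_id())  # → 100 101 102
```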
I encountered the same issue (_future_listeners) while working on the same code. I've been looking around for a solution, but none of them has worked so far. I am sharing this to see if you were able to fix it.
I tried (or intended to try) these solutions:
1. I used asynchronous connect (app.ib.connectAsync) instead of app.ib.connect in both places. But it returned the await warning error. The second app.ib.connectAsync is outside an async function, so it cannot be awaited. You can run the code, but it gives another error: "list index out of range" for the MarketOrder function.
2. I added app.ib.qualifyContracts, but it did not resolve the issue either. When I used it, not even the first order was sent to TWS.
3. Adding the unique order id. I have not tried it because I am not sure it works. I printed the orders; it seems like they already have ids assigned.
I started out with the same widely distributed boilerplate code you are using. After making the changes outlined below, my code works. I lack the expertise to explain why it does--but it does.
Assuming you have the following installed:
Python 3.10.7_64
Sanic 22.6.2
ib_insync 0.9.71
(1) pip install --upgrade sanic-cors sanic-plugin-toolkit
ref: IB_insync - Sanic error after one successful order preventing any further orders
(not sure if necessary)
(2) Add:
import time (see clientId note below)
(3) Add:
import nest_asyncio
nest_asyncio.apply()
# ref: RuntimeError: This event loop is already running in python
(4) Use async connect in the initial connect at the bottom (but not in the reconnect area):
app.ib.connectAsync('127.0.0.1', 7497, clientId=see below)  # 7496=live, 7497=paper
ref: IB_insync - Sanic error after one successful order preventing any further orders
(5) Use time to derive a unique, ascending clientId in both connect statements:
clientId=int(int((time.time()*1000)-1663849395690)/1000000)
This helps avoid a "socket in use" condition on reconnect.
(6) Add the HTTPResponse statements as suggested above.
(7) Per Ewald de Wit, the ib_insync author:
"There's no need to set the orderId, it's issued automatically when the
order is placed."
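The clientId derivation in step (5) can be wrapped in a small function; the function name and the optional now_ms parameter (added so the behavior can be checked deterministically) are my own, while the offset constant comes from the answer above:

```python
import time

EPOCH_OFFSET_MS = 1663849395690  # fixed epoch offset used in step (5)

def derive_client_id(now_ms=None):
    """Derive an ascending clientId from the current time, per step (5).

    Pass now_ms explicitly to make the result deterministic; by default
    the current wall-clock time in milliseconds is used.
    """
    if now_ms is None:
        now_ms = time.time() * 1000
    return int(int(now_ms - EPOCH_OFFSET_MS) / 1000000)

print(derive_client_id(now_ms=EPOCH_OFFSET_MS + 5_000_000))  # → 5
```

Because the offset is fixed, later calls always yield equal-or-larger ids, which is what avoids the "socket in use" collision on reconnect.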
There is an alternative: return response.json({}) at the end of the async function webhook.
........
from sanic import response
......
# Listen for signals and execute orders
@app.route('/webhook', methods=['POST'])
async def webhook(request):
    if request.method == 'POST':
        await checkIfReconnect()
        # Parse alert data
        alert = request.json
        order = MarketOrder(alert['action'], alert['quantity'], account=app.ib.wrapper.accounts[0])
        # Submit market order
        stock_contract = Stock('NVDA', 'SMART', 'USD')
        app.ib.placeOrder(stock_contract, order)
    return response.json({})  # return an empty JSON

How to send a Starlette FormData data structure to a FastAPI endpoint via python request library

My system architecture currently sends a form data blob from the frontend to the backend, both hosted on localhost on different ports. The form data is received in the backend via the FastAPI library as shown.
@app.post('/avatar/request')
async def get_avatar_request(request: Request, Authorize: AuthJWT = Depends()):
    form = await request.form()
    return run_function_in_jwt_wrapper(get_avatar_output, form, Authorize, False)
Currently, I am trying to relay the form data unmodified to another FastAPI endpoint from the backend using the requests library, as follows:
response = requests.post(models_config["avatar_api"], data = form_data, headers = {"response-type": "blob"})
While the destination endpoint does receive the Form Data, it seemed to not have parsed the UploadFile component properly. Instead of getting the corresponding starlette UploadFile data structure, I instead receive the string of the classname, as shown in this error message:
FormData([('avatarFile', '<starlette.datastructures.UploadFile object at 0x7f8d25468550>'), ('avatarFileType', 'jpeg'), ('background', 'From Image'), ('voice', 'en-US-Wavenet-B'), ('transcriptKind', 'text'), ('translateToLanguage', 'No translation'), ('transcriptText', 'do')])
How should I handle this problem?
UploadFile is a python object; you'd need to serialize it somehow before using requests.post(), then deserialize it before actually getting the content out of it via content = await form["upload-file"].read(). I don't think you'd want to serialize an UploadFile object though (even if it is possible); rather, you'd read the content of the form data and then post that.
Even better, if your other FastAPI endpoint is part of the same service, you might consider just calling a function instead and avoid requests altogether (maybe use a controller function that the route function calls in case you also need this endpoint to be callable from outside the service, then just call the controller function directly, avoiding the route and the need for requests). This way you can pass whatever you want without needing to serialize it.
If you must use requests, then I'd read the content of the form and create a new post with that form data, e.g.
form = await request.form()  # starlette.datastructures.FormData
upload_file = form["upload_file"]  # starlette.datastructures.UploadFile - not used here, but just for illustrative purposes
filename = form["upload_file"].filename  # str
contents = await form["upload_file"].read()  # bytes
content_type = form["upload_file"].content_type  # str
...
data = {k: v for k, v in form.items() if k != "upload_file"}  # the form data except the file
files = {"upload_file": (filename, contents, content_type)}  # the file
requests.post(models_config["avatar_api"], files=files, data=data, headers={"response-type": "blob"})
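The data/files split above can be factored into a small helper; this is a self-contained sketch where DummyUpload stands in for starlette's UploadFile (with real form data you would read contents with `await upload.read()` instead of an attribute):

```python
class DummyUpload:
    """Stand-in for starlette.datastructures.UploadFile, for illustration only."""
    def __init__(self, filename, contents, content_type):
        self.filename = filename
        self.contents = contents
        self.content_type = content_type

def split_form(form, file_key):
    """Split a form mapping into requests-style (data, files) arguments."""
    upload = form[file_key]
    data = {k: v for k, v in form.items() if k != file_key}
    files = {file_key: (upload.filename, upload.contents, upload.content_type)}
    return data, files

form = {"avatarFileType": "jpeg",
        "avatarFile": DummyUpload("a.jpg", b"\xff\xd8", "image/jpeg")}
data, files = split_form(form, "avatarFile")
print(data)   # → {'avatarFileType': 'jpeg'}
print(files)  # → {'avatarFile': ('a.jpg', b'\xff\xd8', 'image/jpeg')}
```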

API request to already opened django channels consumer

I've got a django channels consumer communicating with a client. I've got a view from an external API that wants something from the client. From this view I want then to tell that consumer to ask a request to the client through his socket.
I'm currently exploring django rest framework but I can't find a way for now to directly ask anything to that consumer.
Well I've got an idea but it involves creating another socket and communicate through channels' channel. But I wish I could get rid of this overload.
From your response in the comments, it seems you want to send a message to the client through the consumer from your DRF view. You can check out the answer to a similar question.
First, you need to have a method in your consumer that sends a message back to the client:
...
async def send_alert(self, event):
    # Send message to WebSocket (text_data must be a string, hence json.dumps)
    await self.send(text_data=json.dumps({
        'type': 'alert',
        'details': 'An external API api.external.com needs some data from you'
    }))
...
So now you can send a message to this method. Assuming the client is connected to channel1, you can do this in your view:
from channels.layers import get_channel_layer
from asgiref.sync import async_to_sync
...
channel_layer = get_channel_layer()
async_to_sync(channel_layer.send)("channel1", {
    "type": "send.alert"
})
...
async_to_sync usage

Flask + RabbitMQ + SocketIO - forwarding messages

I'm facing a problem emitting messages from RabbitMQ to the user via SocketIO.
I have a Flask application with SocketIO integration.
The current user flow looks like this:
The problem is I'm not able to set up a RabbitMQ listener which forwards messages to the browser via SocketIO. Every time I get a different error. Mostly it's that the connection is closed, or that I'm working outside of the application context.
I tried many approaches; here is my last one.
# callback
def mq_listen(uid):
    rabbit = RabbitMQ()
    def cb(ch, method, properties, body, mq=rabbit):
        to_return = [0]  # mutable
        message = Message.load(body)
        to_return[0] = message.get_message()
        emit('report_part', {"data": to_return[0]})
    rabbit.listen('results', callback=cb, id=uid)

# this is the page the user reaches
@blueprint.route('/report_result/<uid>', methods=['GET'])
def report_result(uid):
    thread = threading.Thread(target=mq_listen, args=(uid,))
    thread.start()
    return render_template("property/report_result.html", socket_id=uid)
where rabbit.listen method is abstraction like:
def listen(self, queue_name, callback=None, id=None):
    if callback is not None:
        callback_function = callback
    else:
        callback_function = self.__callback
    if id is None:
        self.channel.queue_declare(queue=queue_name, durable=True)
        self.channel.basic_qos(prefetch_count=1)
        self.consumer_tag = self.channel.basic_consume(callback_function, queue=queue_name)
        self.channel.start_consuming()
    else:
        self.channel.exchange_declare(exchange=queue_name, type='direct')
        result = self.channel.queue_declare(exclusive=True)
        exchange_name = result.method.queue
        self.channel.queue_bind(exchange=queue_name, queue=exchange_name, routing_key=id)
        self.channel.basic_consume(callback_function, queue=exchange_name, no_ack=True)
        self.channel.start_consuming()
which resulted into
RuntimeError: working outside of request context
I will be happy for any tip or example of usage.
Thanks a lot
I had a similar issue. At the end of the day, it happens because when you make a request, Flask passes the request context along. But the solution is NOT to add with app.app_context(). That is hacky and will definitely produce errors, since you're not natively sending the request context.
My solution was to create a redirect so that the request context is maintained:
def sendToRedisFeed(eventPerson, type):
    eventPerson['type'] = type
    requests.get('http://localhost:5012/zmq-redirect', json=eventPerson)
This is my redirect function, so whenever there is an event I'd like to push to my PubSub it goes through this function, which then pushes to that localhost endpoint.
from flask_sse import sse
app.register_blueprint(sse, url_prefix='/stream')

@app.route('/zmq-redirect', methods=['GET'])
def send_message():
    try:
        sse.publish(request.get_json(), type='greeting')
        return Response('Sent!', mimetype="text/event-stream")
    except Exception as e:
        print(e)
        pass
Now, whenever an event is pushed to my /zmq-redirect endpoint, it is redirected and published via SSE.
And now finally, just to wrap everything up, the client:
var source = new EventSource("/stream");
source.addEventListener(
    "greeting",
    function(event) {
        console.log(event)
    }
)
The error message suggests that it's a Flask issue. While handling requests, Flask sets a context, but because you're using threads this context is lost. By the time it's needed, it is no longer available, so Flask gives the "working outside of request context" error.
A common way to resolve this is to provide the context manually. There is a section about this in the documentation: http://flask.pocoo.org/docs/1.0/appcontext/#manually-push-a-context
Your code doesn't show the socketio part. But I wonder if using something like flask-socketio could simplify some stuff... (https://flask-socketio.readthedocs.io/en/latest/). I would open up the RabbitMQ connection in the background (preferably once) and use the emit function to send any updates to connected SocketIO clients.
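Building on the flask-socketio suggestion: the extension's server-level SocketIO.emit() can be called from a background thread without a request context, which sidesteps the error entirely. A minimal sketch, assuming the RabbitMQ message body is JSON (the helper name and event name 'report_part' mirror the question's code, the rest is my assumption):

```python
import json

# With flask_socketio you would wire this up roughly as:
#   socketio = SocketIO(app)
#   def cb(ch, method, properties, body):
#       event, payload = body_to_event(body)
#       socketio.emit(event, payload)  # server-level emit, no request context needed
def body_to_event(body):
    """Decode an AMQP message body into an (event, payload) pair for SocketIO."""
    message = json.loads(body)
    return 'report_part', {"data": message}

event, payload = body_to_event(b'{"status": "done"}')
print(event, payload)  # → report_part {'data': {'status': 'done'}}
```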

How to push notification from server (django) to client (socketio)?

I want to emit a message from the server to the client.
I have looked at this, but cannot use it because I cannot create a namespace instance:
How to emit SocketIO event on the serverside
My use case is:
I have a database of product prices. A lot of users are currently surfing my website; some of them are viewing product X.
On the server side, the admin can edit the price of a product. If he edits the price of X, all the clients must see a notification that X's price changed (e.g. a simple js alert).
My client javascript now:
var socket = io.connect('/product');
// notify server that this client is viewing product X
socket.emit("join", current_product.id);
// upon receiving a msg from server
socket.on('notification', function (data) {
    alert("Price change");
});
My server code (socket.py):
@namespace('/products')
class ProductsNamespace(BaseNamespace, ProductSubscriberMixin):
    def initialize(self, *args, **kwargs):
        _connections[id(self)] = self
        super(ProductsNamespace, self).initialize(*args, **kwargs)

    def disconnect(self, *args, **kwargs):
        del _connections[id(self)]
        super(ProductsNamespace, self).disconnect(*args, **kwargs)

    def on_join(self, *args):
        print "joining"

    def emit_to_subscribers(self): pass
I use the runserver_socketio.py as in this link.
(Thanks to Calvin Cheng for this excellent up-to-date example.)
I don't know how to call emit_to_subscribers, since I have no instance of the namespace.
As I read from this doc:
Namespaces are created only when some packets arrive that ask for the namespace.
But how can I send a packet to that namespace from my code? If I can only create the instance when a client emits a message to the server, then when no one is surfing the site right after the admin finishes editing the price, the system will fail.
I am really confused about the namespace and its instance. If you have any clearer docs, please help me.
Thanks a lot!
This is my current state of understanding; hopefully it will be helpful to someone. Building up further from How to emit SocketIO event on the serverside, you now have a dictionary with ProductsNamespace objects as values. You can iterate through this dictionary to find the desired socket object. For example, if you set a socket identifier upon connection, as described in the Django and Flask example apps using the on_nickname method, then you can retrieve the socket like so:
for key in _connections:
    socket = _connections[key]
    if 'nickname' in socket.session and socket.session['nickname'] == unicode('uniqueName'):
        socket.emit('eventTag', 'message from server')
Similarly socket.session['rooms'] can be used to emit to all members of the room, and if there are multiple SocketIO namespaces, socket.ns_name can be used.
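The room variant mentioned above can be sketched with stub sockets; StubSocket is a stand-in for the real namespace object, and the room name is invented for illustration:

```python
class StubSocket:
    """Stand-in for a gevent-socketio namespace object, for illustration only."""
    def __init__(self, rooms):
        self.session = {'rooms': rooms}
        self.sent = []
    def emit(self, event, message):
        self.sent.append((event, message))

def emit_to_room(connections, room, event, message):
    """Emit to every connected socket whose session lists the given room."""
    for socket in connections.values():
        if room in socket.session.get('rooms', ()):
            socket.emit(event, message)

_connections = {
    1: StubSocket(rooms={'product_42'}),
    2: StubSocket(rooms={'product_7'}),
}
emit_to_room(_connections, 'product_42', 'notification', 'Price change')
print(_connections[1].sent)  # → [('notification', 'Price change')]
print(_connections[2].sent)  # → []
```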
