I'm working on a WebSocket client to get market data (e.g. BTCUSDT) from FTX, since on ftx.com you can trade crypto without paying fees and I have a minimal budget. I want to pull some data and build my own bot, but I'm stuck on how to connect. I followed a Binance tutorial video and tried the same approach, but I never receive any message from "wss://ftx.com/ws/", and I don't quite understand the documentation they provide.
My question is: how can I connect to the data stream, for example to get the JSON messages for BTCUSDT or BULLUSDT?
This is the documentation:
https://docs.ftx.com/#websocket-api
Thank you
My code
import websocket

SOCKET = "wss://ftx.com/ws/"

def on_open(ws):
    print('opened connection')

def on_close(ws):
    print('closed connection')

def on_message(ws, message):
    print("got message")

ws = websocket.WebSocketApp(SOCKET, on_open=on_open, on_close=on_close, on_message=on_message)
ws.run_forever()
This works fine with Binance.
Using the example code here, modified to accept API keys as arguments, here is an example of grabbing ticker data:
if __name__ == '__main__':
    # rest = client.FtxClient(api_key=key, api_secret=secret)
    ws = ws_client.FtxWebsocketClient(api_key=key, api_secret=secret)
    ws.connect()
    for i in range(1, 10):
        print(ws.get_ticker(market='BTC-PERP'))
        time.sleep(1)
As Chev_603 says, you can copy the two files into your directory. Then import them at the beginning of your app and use:
from client import FtxWebsocketClient
from websocket_manager import WebsocketManager
import time

# your own api_key and api_secret must be written into client.py, lines 20 and 21
if __name__ == '__main__':
    ws = FtxWebsocketClient()  # keys are read from client.py
    for i in range(1, 1000):
        print(ws.get_ticker(market='BTC-PERP'))
        time.sleep(0.2)
JSON message:
import websocket
import json

this = json.dumps({'op': 'subscribe', 'channel': 'trades', 'market': 'BTC-PERP'})

def on_open(wsapp):
    wsapp.send(this)

def on_message(wsapp, message):
    print(message)

wsapp = websocket.WebSocketApp("wss://ftx.com/ws/", on_message=on_message, on_open=on_open)
wsapp.run_forever()
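For reference, subscribing to more than one market with the message format above just means sending one subscribe payload per market on open. A minimal sketch (the BTC/USD and BULL/USD market names are assumptions based on the pairs the question mentions):

```python
import json

def subscribe_messages(channel, markets):
    """Build one FTX subscribe payload per market."""
    return [json.dumps({"op": "subscribe", "channel": channel, "market": m})
            for m in markets]

def on_open(wsapp):
    # Send every subscription as soon as the socket opens.
    for msg in subscribe_messages("trades", ["BTC/USD", "BULL/USD"]):
        wsapp.send(msg)
```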
I want to combine REST with MQTT, using fastapi_mqtt and fastapi.
I want to handle messages from specific topics (my code is below; some methods are copied from the fastapi-mqtt documentation, https://sabuhish.github.io/fastapi-mqtt/mqtt/). I realized that none of the if statements in the on_message handler actually fire; only the print that is outside of on_message works. Do you have any idea why, and how do I manage a custom on_message handler?
import logging
import requests
from fastapi import FastAPI
from fastapi_mqtt import FastMQTT, MQTTConfig

url = '127.0.0.1:8000'  # I'm testing on my local broker

settings = get_settings()
app = FastAPI(title="System Controller")

mqtt_config = MQTTConfig(
    host=settings.mqtt_host,
    port=settings.mqtt_port,
    username=settings.mqtt_user,
    password=settings.mqtt_password
)
mqtt = FastMQTT(config=mqtt_config)
mqtt.init_app(app)
async def publish():
    mqtt.publish("/mqtt", "Hello from Fastapi")  # publishing mqtt topic
    return {"result": True, "message": "Published"}

@mqtt.on_connect()
def connect(client, flags, rc, properties):
    # subscribing mqtt topic
    if rc == 0:
        print('Connected to MQTT Broker')
        mqtt.client.subscribe("#")
        print("Connected: ", client, flags, rc, properties)
    else:
        print('Failed to connect, return code %d\n', rc)
def test_events_handler(client, topic, payload):
    # topic structure: /test/{sub_topic}/{device_id}
    device_id = topic.split('/')[2]
    sub_topic = topic.split('/')[1]
    msg = payload.decode()
    if sub_topic == '1':
        logging.debug(f'Received level message from {device_id}')
        try:
            PARAMS = {'level': msg}
            r = requests.get(url=f'{url}/test/msg', params=PARAMS)
            data = r.json()
            print(data)
        except Exception:
            print('Level value is not correct')
    elif sub_topic == '2':
        print(f'Received dose message from user. Send to {device_id}')
    else:
        print('There is no handler to that topic')
def topic_handler(client, topic, payload):
    if topic.startswith('/test'):
        # handling topics related to feeder device
        test_events_handler(client, topic, payload)
        print("Received message: ", topic, payload.decode())
    elif topic.startswith('/nexttest'):
        pass

@mqtt.on_message()
async def message(client, topic, payload, qos, properties):
    print(topic)
    topic_handler(client, topic, payload)

@mqtt.subscribe("#")
async def message_to_topic(client, topic, payload, qos, properties):
    print("Received message to specific topic: ", topic, payload.decode(), qos, properties)

@mqtt.on_disconnect()
def disconnect(client, packet, exc=None):
    print("Disconnected")

@mqtt.on_subscribe()
def subscribe(client, mid, qos, properties):
    print("subscribed", client, mid, qos, properties)

@app.get('/')
async def func():
    return {'result': True, 'message': 'Published'}
I want this program to work like this: it subscribes to MQTT topics, say '/test/#' and '/nexttest/#', and on a message from one of those topics it checks which topic the message came from, reads the rest of the topic, and, depending on the remaining part, sends a specific GET request to FastAPI.
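The dispatch described above can be prototyped independently of MQTT, which also makes the if statements easy to test on their own. A minimal sketch of that routing logic (the function name and return tuples are illustrative, not part of fastapi-mqtt):

```python
def route(topic, payload):
    """Dispatch a message based on its topic prefix and sub-topic."""
    parts = topic.strip("/").split("/")  # e.g. "/test/1/dev42" -> ["test", "1", "dev42"]
    if parts[0] == "test" and len(parts) == 3:
        sub_topic, device_id = parts[1], parts[2]
        if sub_topic == "1":
            return ("level", device_id, payload)  # would trigger the GET /test/msg request
        if sub_topic == "2":
            return ("dose", device_id, payload)
        return ("unhandled", device_id, payload)
    if parts[0] == "nexttest":
        return ("nexttest", None, payload)
    return ("unknown", None, payload)
```

Once this kind of pure function behaves as expected, the real on_message handler only has to decode the payload and call it.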
I have an async receive method for Event Hub developed in Python, with a checkpoint alongside it. The code was taken from the official Event Hub samples GitHub repo.
Problem: with the code below, I receive only 20-35 messages per minute even when I keep the receiver on all day, whereas my Event Hub ingests a lot of stream data (~200 messages per minute). The enqueued time at the Event Hub is now lagging 90 minutes behind due to poor throughput at the receiver's end, meaning data enqueued at minute X gets pulled out at minute X+90.
Investigation: I looked at the receive method in the Event Hub Python SDK and came across a prefetch parameter (line 318), which defaults to 300. If it is already 300, I should be able to pull more than 30-35 messages by default.
Any idea how I can increase my pull capacity? I'm stuck at this point with no direction forward; any help is highly appreciated.
EDIT 1:
I'm attaching my Python code below.
import asyncio
import json
import logging
import os
import sys
import time
from datetime import date
import requests
from azure.eventhub.aio import EventHubConsumerClient
from azure.eventhub.extensions.checkpointstoreblobaio import BlobCheckpointStore
import log_handler
import threading
import traceback

try:
    ## Set env variables
    CONNECTION_STR = os.environ["ECS"].strip()
    EVENTHUB_NAME = os.environ['EN'].strip()
    EVENTHUB_CONSUMER = os.environ["EC"].strip()
    API = os.environ['API_variable'].strip()
    AZURE_BLOB_CONNECTION_STR = os.environ["ACS"].strip()
    BLOB_CONTAINER_NAME = os.environ["BCN"].strip()
    BLOB_ACCOUNT_URL = os.environ["BAU"].strip()
    PREFETCH_COUNT = int(os.environ["PREFETCH_COUNT"])
    MAX_WAIT_TIME = float(os.environ["MAX_WAIT_TIME"])
except Exception as exception:
    logging.debug(traceback.format_exc())
    logging.warning(
        "*** Please check the environment variables for {}".format(str(exception)))
    sys.exit()
def API_CALL(event_data):
    """
    Sends the request to the API
    """
    try:
        url = event_data['image_url']
        payload = {"url": url}
        ## API call to the server
        service_response = requests.post(API, json=payload)
        logging.info(f"*** service_response.status_code : {service_response.status_code}")
        cloud_response = json.loads(
            service_response.text) if service_response.status_code == 200 else None
        today = date.today()
        response_date = today.strftime("%B %d, %Y")
        response_time = time.strftime("%H:%M:%S", time.gmtime())
        response_data = {
            "type": "response_data",
            "consumer_group": EVENTHUB_CONSUMER,
            'current_date': response_date,
            'current_time': response_time,
            'image_url': url,
            'status_code': service_response.status_code,
            'response': cloud_response,
            'api_response_time': int(service_response.elapsed.total_seconds()*1000),
            "eventhub_data": event_data
        }
        logging.info(f"*** response_data {json.dumps(response_data)}")
        logging.debug(f"*** response_data {json.dumps(response_data)}")
    except Exception as exception:
        logging.debug(traceback.format_exc())
        logging.error(
            "**** RaiseError: Failed request url %s, Root Cause of error: %s", url, exception)
async def event_operations(partition_context, event):
    start_time = time.time()
    data_ = event.body_as_str(encoding='UTF-8')
    json_data = json.loads(data_)
    ## forming data payload
    additional_data = {
        "type": "event_data",
        "consumer_group": EVENTHUB_CONSUMER,
        "image_name": json_data["image_url"].split("/")[-1]
    }
    json_data.update(additional_data)
    logging.info(f"*** Data fetched from EH : {json_data}")
    logging.debug(f"*** Data fetched from EH : {json_data}")
    API_CALL(json_data)
    logging.info(f"*** time taken to process an event(ms): {(time.time()-start_time)*1000}")

def between_callback(partition_context, event):
    """
    Loop to create threads
    """
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(event_operations(partition_context, event))
    loop.close()

async def on_event(partition_context, event):
    """
    Put your code here.
    Do some sync or async operations.
    If the operation is i/o intensive, async will have better performance.
    """
    t1 = time.time()
    _thread = threading.Thread(target=between_callback, args=(partition_context, event))
    _thread.start()
    logging.info(f"*** time taken to start a thread(ms): {(time.time()-t1)*1000}")
    logging.info("*** Fetching the next event")
    ## Update checkpoint per event
    t2 = time.time()
    await partition_context.update_checkpoint(event)
    logging.info(f"*** time taken to update checkpoint(ms): {(time.time()-t2)*1000}")
async def main(client):
    """
    Run the on_event method for each event received

    Args:
        client: Azure Event Hub listener client
    """
    try:
        async with client:
            # Call the receive method. Only read current data (#latest)
            logging.info("*** Listening to event")
            await client.receive(on_event=on_event,
                                 prefetch=PREFETCH_COUNT,
                                 max_wait_time=MAX_WAIT_TIME)
    except KeyboardInterrupt:
        print("*** Stopped receiving due to keyboard interrupt")
    except Exception as err:
        logging.debug(traceback.format_exc())
        print("*** some error occurred :", err)
if __name__ == '__main__':
    ## Checkpoint initialization
    checkpoint_store = BlobCheckpointStore(
        blob_account_url=BLOB_ACCOUNT_URL,
        container_name=BLOB_CONTAINER_NAME,
        credential=AZURE_BLOB_CONNECTION_STR
    )
    ## Client initialization
    client = EventHubConsumerClient.from_connection_string(
        CONNECTION_STR,
        consumer_group=EVENTHUB_CONSUMER,
        eventhub_name=EVENTHUB_NAME,
        checkpoint_store=checkpoint_store,  # COMMENT TO RUN WITHOUT CHECKPOINT
        logging_enable=True,
        on_partition_initialize=on_partition_initialize,
        on_partition_close=on_partition_close,
        idle_timeout=10,
        on_error=on_error,
        retry_total=3
    )
    logging.info("Connecting to eventhub {} consumer {}".format(
        EVENTHUB_NAME, EVENTHUB_CONSUMER))
    logging.info("Registering receive callback.")
    loop = asyncio.get_event_loop()
    try:
        loop.run_until_complete(main(client))
    except KeyboardInterrupt as exception:
        pass
    finally:
        loop.stop()
Execution flow: main() --> on_event() --> Thread(between_callback --> API_CALL) --> update_checkpoint
Change the receiver function to the format below, which gets the events and runs them until they complete.
import asyncio
from azure.eventhub.aio import EventHubConsumerClient
from azure.eventhub.extensions.checkpointstoreblobaio import BlobCheckpointStore

async def on_event(partition_context, event):
    # Print the event data.
    print("Received the event: \"{}\" from the partition with ID: \"{}\"".format(event.body_as_str(encoding='UTF-8'), partition_context.partition_id))
    # Update the checkpoint so that the program doesn't read the events
    # that it has already read when you run it next time.
    await partition_context.update_checkpoint(event)

async def main():
    # Create an Azure blob checkpoint store to store the checkpoints.
    checkpoint_store = BlobCheckpointStore.from_connection_string("AZURE STORAGE CONNECTION STRING", "BLOB CONTAINER NAME")
    # Create a consumer client for the event hub.
    client = EventHubConsumerClient.from_connection_string("EVENT HUBS NAMESPACE CONNECTION STRING", consumer_group="$Default", eventhub_name="EVENT HUB NAME", checkpoint_store=checkpoint_store)
    async with client:
        # Call the receive method. Read from the beginning of the partition (starting_position: "-1")
        await client.receive(on_event=on_event, starting_position="-1")

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    # Run the main method.
    loop.run_until_complete(main())
Also, as suggested in the comments, call the API on an interval basis.
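One way to act on that suggestion without blocking the receiver: have on_event do nothing but enqueue, and let a separate worker drain the queue and call the API on its own schedule. A minimal sketch of the pattern, with a plain queue.Queue standing in for the Event Hub callback path and the actual API request replaced by a caller-supplied function:

```python
import queue
import threading
import time

events = queue.Queue()

def on_event(event):
    # Receiver path: enqueue and return immediately so receive() keeps up.
    events.put(event)

def api_worker(handle, interval, stop):
    """Drain the queue every `interval` seconds and make one call per batch."""
    while not stop.is_set():
        batch = []
        while not events.empty():
            batch.append(events.get())
        if batch:
            handle(batch)  # one API request for the whole batch
        time.sleep(interval)
```

This decouples checkpointing speed from API latency; the queue depth also gives a direct measure of how far the receiver is falling behind.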
Goal: I'm trying to handle POST requests from Trello webhooks, then send an embed with relevant data on a discord guild.
Current Progress: I got handling POST requests done, and I know how to send an embed and such. However, when it came time to put the two together, I realized it wouldn't be possible to just run py trelloHandler.py and have both start running. I did some research and found someone asking a similar question. To most people that would be helpful, but I'm pretty new to Python (I like working on projects to learn) and not sure how I'd implement threading. I did manage to find a guide over on realpython.com, but I'm failing to understand it.
My Question (TL;DR): How can I run an HTTP server that will listen to post requests in the same program as a discord.py bot (more specifically, using threading)?
My Code ("bot_token_here" is replaced with my discord token):
import discord
import json
from http.server import HTTPServer, BaseHTTPRequestHandler
from discord.ext import commands

client = commands.Bot(command_prefix="~")

class requestHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Interpret and process the data
        content_len = int(self.headers.get('content-length', 0))
        post_body = self.rfile.read(content_len)
        data = json.loads(post_body)
        # Action and Models data
        action = data['action']
        actionData = action['data']
        model = data['model']
        # Board and card data
        board = action['data']['board']
        card = action['data']['card']
        # Member data
        member = action['memberCreator']
        username = member['username']
        # Keep at end of do_POST
        self.send_response(204)
        self.send_header('content-type', 'text/html')
        self.end_headers()

    def do_HEAD(self):
        self.send_response(200)
        self.end_headers()

@client.event
async def on_ready():
    print("Bot is online, and ready to go! (Listening to {} servers!)".format(len(list(client.guilds))))

def main():
    PORT = 9090
    server_address = ('localhost', PORT)
    server = HTTPServer(server_address, requestHandler)
    server.serve_forever()  # blocks forever, so the next line is never reached
    client.run("bot_token_here")

if __name__ == '__main__':
    main()
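On the TL;DR, the usual pattern is to start the HTTP server on a daemon thread and let client.run(...) own the main thread, since serve_forever() never returns. The sketch below shows just the threading half with the standard library; the handler class and port are illustrative, and client.run("bot_token_here") would replace the comment in the main block:

```python
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler

class PingHandler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

def start_server(port):
    """Run an HTTPServer on a daemon thread and return it immediately."""
    server = HTTPServer(("localhost", port), PingHandler)
    # daemon=True so the thread dies with the main program
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_server(9090)
    # client.run("bot_token_here") would go here: it blocks the main
    # thread while the HTTP server keeps serving on its own thread.
```

Because start_server returns right away, whatever you run next on the main thread (the Discord bot, in this case) coexists with the listener.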
I have the following code for a Sanic hello world, based on combining different endpoints from here:
https://sanic.readthedocs.io/en/latest/sanic/response.html
https://sanic.readthedocs.io/en/latest/sanic/websocket.html
Code is:
from sanic import Sanic
from sanic import response
from sanic.websocket import WebSocketProtocol

app = Sanic()

@app.route("/")
async def test(request):
    return response.json({"hello": "world"})

@app.route('/html')
async def handle_request(request):
    return response.html('<p>Hello world!</p>')

@app.websocket('/feed')
async def feed(request, ws):
    while True:
        data = 'hello!'
        print('Sending: ' + data)
        await ws.send(data)
        data = await ws.recv()
        print('Received: ' + data)

@app.route('/html2')
async def handle_request(request):
    return response.html("""<html><head><script>
var exampleSocket = new WebSocket("wss://0.0.0.0:8000/feed", "protocolOne");
exampleSocket.onmessage = function (event) {
console.log(event.data)};</script></head><body><h1>Hello socket!</h1><p>hello</p></body></html>""")

app.run(host="0.0.0.0", port=8000)
# app.run(host="0.0.0.0", port=8000, protocol=WebSocketProtocol)  # ws
The routes "/" and "/html" work fine, but
http://0.0.0.0:8000/feed
produces:
Error: Invalid websocket request
and "/html2" renders the page fine, but doesn't log to console, showing in the debugger:
Firefox can’t establish a connection to the server at wss://0.0.0.0:8000/feed.
What should I change or add to make a viable websocket endpoint that plays nicely with the http ones, too?
Using 0.0.0.0 as your endpoint within your client HTML doesn't make sense, and since you're not using SSL you want ws:// rather than wss://. In other words:
from sanic import Sanic
from sanic import response
from sanic.websocket import WebSocketProtocol

app = Sanic()

@app.websocket('/feed')
async def feed(request, ws):
    while True:
        data = 'hello!'
        print('Sending: ' + data)
        await ws.send(data)
        data = await ws.recv()
        print('Received: ' + data)

@app.route('/html2')
async def handle_request(request):
    return response.html("""<html><head><script>
var exampleSocket = new WebSocket("ws://" + location.host + '/feed');
exampleSocket.onmessage = function (event) {
console.log(event.data)};</script></head><body><h1>Hello socket!</h1><p>hello</p></body></html>""")

app.run(host="0.0.0.0", port=8000)
I am trying to get live data in Python 2.7.13 from Poloniex through the push API.
I read many posts (including How to connect to poloniex.com websocket api using a python library) and arrived at the following code:
from autobahn.twisted.wamp import ApplicationSession
from autobahn.twisted.wamp import ApplicationRunner
from twisted.internet.defer import inlineCallbacks
import six

class PoloniexComponent(ApplicationSession):
    def onConnect(self):
        self.join(self.config.realm)

    @inlineCallbacks
    def onJoin(self, details):
        def onTicker(*args):
            print("Ticker event received:", args)
        try:
            yield self.subscribe(onTicker, 'ticker')
        except Exception as e:
            print("Could not subscribe to topic:", e)

def main():
    runner = ApplicationRunner(six.u("wss://api.poloniex.com"), six.u("realm1"))
    runner.run(PoloniexComponent)

if __name__ == "__main__":
    main()
Now, when I run the code, it looks like it runs successfully, but I don't know where the data is going. I have two questions:
1. I would really appreciate it if someone could walk me through the process of subscribing and getting ticker data in Python, from step 0. I'm running the program in Spyder on Windows. Am I supposed to activate Crossbar somehow?
2. How do I quit the connection? I simply killed the process with Ctrl+C, and now when I try to run it again I get the error: ReactorNonRestartable.
I ran into a lot of issues using Poloniex with Python 2.7 but finally came to a solution that hopefully helps you.
I found that Poloniex has pulled support for the original WAMP socket endpoint, so I would stray from that method altogether. Maybe that alone is the answer you need, but if not, here is an alternate way to get ticker information.
The code that ended up working best for me is actually from the post you linked above, but there was some info regarding currency-pair ids that I found elsewhere.
import websocket
import thread
import time
import json

def on_message(ws, message):
    print(message)

def on_error(ws, error):
    print(error)

def on_close(ws):
    print("### closed ###")

def on_open(ws):
    print("ONOPEN")

    def run(*args):
        # ws.send(json.dumps({'command':'subscribe','channel':1001}))
        ws.send(json.dumps({'command':'subscribe','channel':1002}))
        # ws.send(json.dumps({'command':'subscribe','channel':1003}))
        # ws.send(json.dumps({'command':'subscribe','channel':'BTC_XMR'}))
        while True:
            time.sleep(1)
        ws.close()
        print("thread terminating...")
    thread.start_new_thread(run, ())

if __name__ == "__main__":
    websocket.enableTrace(True)
    ws = websocket.WebSocketApp("wss://api2.poloniex.com/",
                                on_message=on_message,
                                on_error=on_error,
                                on_close=on_close)
    ws.on_open = on_open
    ws.run_forever()
I commented out the lines that pull data you don't seem to want, but for reference here is some more info from that previous post:
1001 = trollbox (you will get nothing but a heartbeat)
1002 = ticker
1003 = base coin 24h volume stats
1010 = heartbeat
'MARKET_PAIR' = market order books
Now you should get some data that looks something like this:
[121,"2759.99999999","2759.99999999","2758.00000000","0.02184376","12268375.01419869","4495.18724321",0,"2767.80020000","2680.10000000"]]
This is also annoying because the "121" at the beginning is the currency-pair id, which is undocumented and also unanswered in the other Stack Overflow question referred to here.
However, if you visit this url: https://poloniex.com/public?command=returnTicker it seems the id is shown as the first field, so you could create your own mapping of id->currency pair or parse the data by the ids you want from this.
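Building that id-to-pair map is straightforward once you have a returnTicker response in hand. The sketch below uses a hard-coded two-entry sample shaped like that JSON instead of making a live request, so the ids shown are only illustrative:

```python
def build_id_map(ticker):
    """Map Poloniex currency-pair id -> pair name from a returnTicker response."""
    return {int(info["id"]): pair for pair, info in ticker.items()}

# Hard-coded sample shaped like the returnTicker JSON (ids are illustrative)
sample = {
    "BTC_XMR": {"id": 114, "last": "0.00263498"},
    "USDT_BTC": {"id": 121, "last": "2759.99999999"},
}
id_to_pair = build_id_map(sample)
```

With that map, the leading number in a push message resolves directly to its pair name, so you can filter for only the pairs you care about.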
Alternatively, something as simple as:
import urllib
import urllib2
import json
ret = urllib2.urlopen(urllib2.Request('https://poloniex.com/public?command=returnTicker'))
print json.loads(ret.read())
will return the data you want, but you'll have to put it in a loop to get constantly updated information. Not sure of your needs once the data is received, so I will leave the rest up to you.
Hope this helps!
With the help of other posts, I made the following code to get the latest data using Python 3.x. I hope this helps you:
# TO SAVE THE HISTORICAL DATA (X MINUTES/HOURS) OF EVERY CRYPTOCURRENCY PAIR IN POLONIEX:
from poloniex import Poloniex
import pandas as pd
from time import time
import os

api = Poloniex(jsonNums=float)

# Obtains the pairs of cryptocurrencies traded in poloniex
pairs = [pair for pair in api.returnTicker()]

i = 0
while i < len(pairs):
    # Available candle periods: 5min(300), 15min(900), 30min(1800), 2hr(7200), 4hr(14400), and 24hr(86400)
    raw = api.returnChartData(pairs[i], period=86400, start=time()-api.YEAR*10)
    df = pd.DataFrame(raw)
    # adjust dates format and set dates as index
    df['date'] = pd.to_datetime(df["date"], unit='s')
    df.set_index('date', inplace=True)
    # Saves the historical data of every pair in a csv file
    path = r'C:\x\y\Desktop\z\folder_name'
    df.to_csv(os.path.join(path, r'%s.csv' % pairs[i]))
    i += 1