Execute a specific method inside SGX using Gramine (Python)

I have an application that uses gRPC, with a client.py and a server.py. I want to use Gramine to execute the service inside SGX.
How can I run a specific method, rather than the whole script, inside SGX using Gramine?
client.py:
"""The Python implementation of the GRPC helloworld.Greeter client."""
from __future__ import print_function
import logging
import grpc
import helloworld_pb2
import helloworld_pb2_grpc
def run():
    # NOTE(gRPC Python Team): .close() is possible on a channel and should be
    # used in circumstances in which the with statement does not fit the needs
    # of the code.
    print("Will try to greet world ...")
    with grpc.insecure_channel('localhost:50051') as channel:
        stub = helloworld_pb2_grpc.GreeterStub(channel)
        response = stub.SayHello(helloworld_pb2.HelloRequest(name='you'))
        print("Greeter client received: " + response.message)

if __name__ == '__main__':
    logging.basicConfig()
    run()
and server.py:
from concurrent import futures
import logging
import grpc
import helloworld_pb2
import helloworld_pb2_grpc
class Greeter(helloworld_pb2_grpc.GreeterServicer):
    def SayHello(self, request, context):
        return helloworld_pb2.HelloReply(message='Hello, %s!' % request.name)

def serve():
    port = '50051'
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    helloworld_pb2_grpc.add_GreeterServicer_to_server(Greeter(), server)
    server.add_insecure_port('[::]:' + port)
    server.start()
    print("Server started, listening on " + port)
    server.wait_for_termination()

if __name__ == '__main__':
    logging.basicConfig()
    serve()
Let's say I want to execute SayHello inside SGX when I run client.py.
Currently I am running gramine-sgx ./python client.py. Is that going to execute only the client inside SGX, or is it also going to run SayHello from server.py inside SGX?

Related

How to listen to a TCP stream on my FastAPI server

I am working on a project building an API that can send the live location of vehicles to a frontend.
I get this location data by subscribing to a ZMQ stream in a while loop. This all works: if I just run my stream as a script, I can print all kinds of information to the terminal (I'll store it in a database later on).
I also have the FastAPI server up and running.
Now what I'd like to do is:
- at startup, start the server so I can make API calls
- start the while loop and begin receiving data from the ZMQ stream
What happens instead seems to be either/or:
- I can import a function containing the while loop, but it blocks the server from starting up
- or I can run the server with no means to start the stream
Here is my code:
# General FastAPI Imports
from fastapi import Depends, FastAPI, Request

from data_collection.livestream import enable_data_stream
from client_service import client_api

app = FastAPI()
app.include_router(client_api.router, prefix="/API/V1")

@app.get('/')
def read_root(request: Request):
    return {"Hello": "World"}
The Stream:
from gzip import GzipFile
from io import BytesIO
import zmq
import xml.etree.ElementTree as ET

context = zmq.Context()
subscriber = context.socket(zmq.SUB)
subscriber.connect("tcp://SERVER")
subscriber.setsockopt(zmq.SUBSCRIBE, b'')  # subscribe to all topics

while True:
    multipart = subscriber.recv_multipart()
    address = multipart[0]
    try:
        contents = GzipFile('', 'r', 0, BytesIO(multipart[1])).read()
        root = ET.fromstring(contents)
        print("Updates Received:")
        # Gets the timestamp
        print('time', root[3].text)
        print('X Coord: ', root[4][0][12].text)
        print('Y Coord: ', root[4][0][13].text)
    except Exception as e:
        print(e)
I tried looking into the multiprocessing and threading implementations for Python, but I'm unsure how those tie in with starting the FastAPI process (since that's handled by Uvicorn).
In the example below, the server and worker are started in separate processes because the while loop never returns. It seems you were on the right track. In my example these functions live in one file, but nothing stops you from breaking them out into their own files:
import uvicorn
import multiprocessing
import time
import zmq
import xml.etree.ElementTree as ET
from gzip import GzipFile
from io import BytesIO
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def root():
    return {"message": "Hello World"}

def server():
    uvicorn.run(app, host="localhost", port=8000)

def worker():
    context = zmq.Context()
    subscriber = context.socket(zmq.SUB)
    subscriber.connect("tcp://SERVER")
    subscriber.setsockopt(zmq.SUBSCRIBE, b'')  # subscribe to all topics
    while True:
        multipart = subscriber.recv_multipart()
        address = multipart[0]
        try:
            contents = GzipFile('', 'r', 0, BytesIO(multipart[1])).read()
            root = ET.fromstring(contents)
            print("Updates Received:")
            # Gets the timestamp
            print('time', root[3].text)
            print('X Coord: ', root[4][0][12].text)
            print('Y Coord: ', root[4][0][13].text)
        except Exception as e:
            print(e)
            print("Error: %s" % multipart[1])
            break

if __name__ == '__main__':
    # Run server and worker in separate processes
    p1 = multiprocessing.Process(target=server)
    p1.start()
    time.sleep(1)  # Wait for the server to start
    p2 = multiprocessing.Process(target=worker)
    p2.start()
    p1.join()
    p2.join()
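If separate processes are undesirable (for example, because the worker must share state with the server), the same separation can be sketched with threads. This stdlib-only sketch uses placeholder functions, both names my own assumptions: serve stands in for the blocking uvicorn.run(...) call and stream_worker for the ZMQ subscribe loop; the point is that daemon threads let both blocking loops run side by side:

```python
import queue
import threading
import time

results = queue.Queue()
stop = threading.Event()

def serve(stop_event):
    # placeholder for uvicorn.run(app, ...): blocks until asked to stop
    while not stop_event.is_set():
        time.sleep(0.01)

def stream_worker(stop_event, out):
    # placeholder for the subscriber.recv_multipart() while-loop
    for i in range(3):
        out.put("update %d" % i)
    stop_event.set()

server_thread = threading.Thread(target=serve, args=(stop,), daemon=True)
worker_thread = threading.Thread(target=stream_worker, args=(stop, results), daemon=True)
server_thread.start()
worker_thread.start()
worker_thread.join()
server_thread.join(timeout=1)
print(results.qsize())  # updates arrived while the "server" was running
```

With real FastAPI code, the thread would typically be started from a startup event handler so Uvicorn keeps ownership of the main thread.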

Closing a Flask-SocketIO server programmatically

I am new to server development, so please be kind...
I am developing a test application that starts a Flask-SocketIO server and, after interacting with clients, needs to shut down and open another instance. However, this is not possible; I get this error:
File "C:\Python39\lib\site-packages\eventlet\convenience.py", line 78, in listen
    sock.bind(addr)
OSError: [WinError 10048] Only one usage of each socket address (protocol/network address/port) is normally permitted
How can I programmatically shut down the server?
I looked at the answers in How to stop flask application without using ctrl-c, and using a separate process does indeed do the trick.
But I don't really want a separate process, because sharing variables between processes is too tricky.
I also didn't understand from that post how to send a request from the server to the server itself in order to shut down the Flask application.
This is an example of my code
import socketio
import eventlet
import eventlet.wsgi
from flask import Flask, render_template
import socket
import threading
import time

ip_addr = socket.gethostbyname(socket.gethostname())
appFlask = Flask(__name__)
sio = socketio.Server()  # engineio_logger=True, logger=True
# wrap Flask application with engineio's middleware
app = socketio.Middleware(sio, appFlask)

@sio.on('connect')
def connect(sid, environ):
    print('connect ', sid)

@sio.on('message')
def message(sid, data):
    print('message ' + data, data)

@sio.on('disconnect')
def disconnect(sid):
    print('disconnect ', sid)

@sio.on('result')
def result(sid, data):
    print('result ', sid)

def worker1():
    socket_port = 3000
    eventlet.wsgi.server(eventlet.listen((ip_addr, socket_port)), app)

if __name__ == '__main__':
    sio.start_background_task(worker1)
    # do some stuff and interact with the client
    sio.sleep(2)
    # how can I close the server so that I can do the following?
    sio.start_background_task(worker1)
EDITED with Flask-SocketIO functionality:
import socketio
import eventlet
import eventlet.wsgi
from flask import Flask, render_template
import socket
import threading
import time
import requests
from flask import request
from flask_socketio import SocketIO

ip_addr = socket.gethostbyname(socket.gethostname())
socket_port = 3000
app = Flask(__name__)
app.config['SECRET_KEY'] = 'secret!'
sio = SocketIO(app)

@app.route('/stop')
def stop():
    sio.stop()

@sio.on('connect')
def connect(sid, environ):
    print('connect ', sid)

@sio.on('message')
def message(sid, data):
    print('message ' + data, data)

@sio.on('disconnect')
def disconnect(sid):
    print('disconnect ', sid)

@sio.on('result')
def result(sid, data):
    print('result ', sid)

def worker1():
    eventlet.wsgi.server(eventlet.listen((ip_addr, socket_port)), app)

if __name__ == '__main__':
    eventlet_thr = sio.start_background_task(worker1)
    # do some stuff and interact with the client
    sio.sleep(2)
    # now wait until the server is stopped
    # invoke, from a different process, a request to stop
    eventlet_thr.join()
    # how can I close the server so that I can do the following?
    sio.start_background_task(worker1)
It seems you are using the eventlet web server, so the question is really how to stop the eventlet web server; Flask-SocketIO has nothing to do with it.
As a convenience, Flask-SocketIO provides the stop() method, which you have to call from inside a handler. I'm not sure whether that will work when the server runs on a thread other than the main thread, though; you'll have to test that.
So basically what you need to do is add an endpoint that forces the server to exit, maybe something like this:
@app.route('/stop')
def stop():
    sio.stop()
    return ''
So then you can start and stop the server as follows:
if __name__ == '__main__':
    thread = sio.start_background_task(worker1)
    # do some stuff and interact with the client
    requests.get('http://localhost:5000/stop')
    thread.join()
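The start-then-stop-via-request pattern can be demonstrated with only the standard library. This sketch (plain http.server, not Flask-SocketIO) shuts the server down from inside a handler the way the /stop endpoint above does; the shutdown call is handed to a helper thread because calling shutdown() synchronously from the thread running serve_forever() would deadlock:

```python
import http.server
import threading
import urllib.request

class StopHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        if self.path == "/stop":
            # shutdown() blocks until serve_forever() exits, so it must
            # not run on the handler's own thread -- hand it off
            threading.Thread(target=self.server.shutdown).start()

    def log_message(self, *args):
        pass  # keep the sketch quiet

server = http.server.HTTPServer(("localhost", 0), StopHandler)  # port 0: any free port
t = threading.Thread(target=server.serve_forever)
t.start()

port = server.server_address[1]
urllib.request.urlopen("http://localhost:%d/stop" % port)
t.join(timeout=2)
print(t.is_alive())  # serve_forever() has returned
```

After the join, the port is released and a fresh server instance can be started, which is the behavior the question is after.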

Running asyncio and Socket.IO at the same time in Python

I want to run two websocket servers in my program: one communicates with my Raspberry Pi (via websockets) and the other with my browser via Flask-SocketIO.
Everything runs fine when I run them in two different Python files, but I can't get them to run in the same file.
This is the Flask-SocketIO Server:
from flask import Flask, render_template
from flask_socketio import SocketIO

app = Flask(__name__)
app.config['SECRET_KEY'] = 'secret!'
socketio = SocketIO(app)

@app.route('/hello')
def hello():
    return 'Hello'

def main():
    print("abc")

if __name__ == '__main__':
    socketio.run(app, debug=True, port=5000)
    print("defg")
Output:
* Restarting with stat
* Debugger is active!
* Debugger PIN: ...
(12444) wsgi starting up on http://...
Nothing gets printed.
This is the websockets server (you can find this piece of code by googling "OCPP Python" on The Mobility House's GitHub):
import asyncio
import logging
import websockets
from datetime import datetime

from ocpp.routing import on
from ocpp.v201 import ChargePoint as cp
from ocpp.v201 import call_result

logging.basicConfig(level=logging.INFO)

class ChargePoint(cp):
    # One function was left out to declutter the question
    pass

async def on_connect(websocket, path):
    try:
        requested_protocols = websocket.request_headers[
            'Sec-WebSocket-Protocol']
    except KeyError:
        logging.info("Client hasn't requested any Subprotocol. "
                     "Closing Connection")
        return await websocket.close()
    if websocket.subprotocol:
        logging.info("Protocols Matched: %s", websocket.subprotocol)
    else:
        # In the websockets lib, if no subprotocols are supported by the
        # client and the server, it proceeds without a subprotocol,
        # so we have to manually close the connection.
        logging.warning('Protocols Mismatched | Expected Subprotocols: %s,'
                        ' but client supports %s | Closing connection',
                        websocket.available_subprotocols,
                        requested_protocols)
        return await websocket.close()

    charge_point_id = path.strip('/')
    cp = ChargePoint(charge_point_id, websocket)
    await cp.start()

async def main():
    server = await websockets.serve(
        on_connect,
        '0.0.0.0',
        9000,
        subprotocols=['ocpp2.0.1']
    )
    logging.info("WebSocket Server Started")
    print("456")
    await server.wait_closed()

if __name__ == '__main__':
    asyncio.run(main())
    print("123")
Output
INFO:root:WebSocket Server Started
456
I tried pasting it all into the same file and just doing this:

if __name__ == '__main__':
    asyncio.run(main())
    print("123")
    socketio.run(app, debug=True, port=5000)
    print("456")
But this just runs the first asyncio.run(main()) and doesn't print 123, etc.
If I swap them, it likewise just runs the first .run and then stops.
I tried threading, but that had the same results.
Does anyone know how I can run these two at the same time?
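One approach (my suggestion, not from the question) is to keep the asyncio server in the main thread and push the blocking socketio.run() call into a background thread. This stdlib-only sketch uses placeholder functions, run_blocking_server and run_async_server, both hypothetical names, to show that the two blocking calls then run concurrently; note that Flask's debug=True reloader only works on the main thread, so it would have to be disabled:

```python
import asyncio
import threading
import time

done = []

def run_blocking_server():
    # placeholder for socketio.run(app, port=5000), a blocking call
    time.sleep(0.05)
    done.append("flask")

async def run_async_server():
    # placeholder for websockets.serve(...) + await server.wait_closed()
    await asyncio.sleep(0.05)
    done.append("ws")

t = threading.Thread(target=run_blocking_server, daemon=True)
t.start()
asyncio.run(run_async_server())
t.join()
print(sorted(done))  # both servers ran
```

Putting the two blocking .run() calls one after another, as in the question, can never work: the second only starts once the first returns.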

Flask-SocketIO: having the server wait for a client callback

I am building a pure Python application with Flask-SocketIO. Currently, I am trying to have the server emit an event to a specific client and wait for the callback before moving on.
This is my Server.py
import socketio
import eventlet

sio = socketio.Server(async_handlers=False)
app = socketio.WSGIApp(sio)

@sio.event
def connect(sid, environ):
    nm = None

    def namee(name):
        nonlocal nm
        print(name)  # this has the value and is trying to assign it to nm
        nm = name

    sio.emit('name_', "name plz", callback=namee)
    print(nm)  # this shouldn't be None, but it is
    print(sid, "in lobby")

@sio.event
def disconnect(sid):
    print('disconnect', sid)

if __name__ == '__main__':
    eventlet.wsgi.server(eventlet.listen(('', 5000)), app)
And this is my client.py
import sys
import socketio

sio = socketio.Client()

@sio.event
def connect():
    print("you have connected to the server")

@sio.event
def connect_error(data):
    print("The connection failed!")

@sio.event
def disconnect():
    print("You have left the server")

@sio.event
def name_(data):
    print("name asked for")
    return "test"

def main():
    sio.connect('http://localhost:5000')
    print('Your sid is', sio.sid)

if __name__ == '__main__':
    main()
I tried using time.sleep(), but that delayed the whole process. I also tried a busy-wait loop:

while nm is None:
    pass

but that kicked the client off the server and, a while later, the server crashed.
You can use the call() method instead of emit(). It emits the event, then blocks until the client's callback arrives and returns its value:

nm = sio.call('name_', "name plz", to=sid)
print(nm)
I recommend that you move this logic outside of the connect handler, though. That handler is just meant to accept or reject the connection; it is not supposed to block for a long period of time. Do this in a regular event handler that runs after the connection has been established.
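For intuition, call() is essentially emit() plus a wait on the callback. This stdlib sketch (FakeTransport is a made-up stand-in for the Socket.IO layer, not the real API) shows the underlying pattern: register a callback that stores the reply and sets an Event, then block on that Event:

```python
import threading

class FakeTransport:
    # Hypothetical stand-in for the Socket.IO server: it "delivers" the
    # event and invokes the callback with the client's return value.
    def emit(self, event, data, callback):
        threading.Timer(0.01, callback, args=("test",)).start()

def call(transport, event, data, timeout=2.0):
    result = {}
    ready = threading.Event()

    def on_reply(value):
        result["value"] = value
        ready.set()  # wake the waiting caller

    transport.emit(event, data, callback=on_reply)
    if not ready.wait(timeout):
        raise TimeoutError("no callback received")
    return result["value"]

nm = call(FakeTransport(), 'name_', "name plz")
print(nm)  # the client's return value
```

This is also why a bare busy-wait loop fails: without yielding to the server's event loop, the callback never gets a chance to run.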

Adding custom module via RPyC

I'm trying to add a new module to a connection.
I have the following files:
main.py
UpdateDB.py
In UpdateDB:
def UpdateDB():
    ...
In main.py:
import UpdateDB
import rpyc

conn = rpyc.classic.connect(...)
rpyc.utils.classic.upload_package(conn, UpdateDB)
conn.modules.UpdateDB.UpdateDB()
But I can't figure out how to invoke the UpdateDB() function.
I get:
AttributeError: 'module' object has no attribute 'UpdateDB'
Perhaps I'm trying to do it wrong. So let me explain what I'm trying to do:
I want to create a connection to the server and run on it a function from the UpdateDB.py file.
Not sure how to do that in classic mode (and not sure why you'd use it), but here is how to accomplish the task with the newer RPyC service mode.
Script run as the server:

import rpyc
from rpyc.utils.server import ThreadedServer

class MyService(rpyc.Service):
    def exposed_printSomething(self, a):
        print(a)
        print("printed on server!")
        return 'printed on client!'

if __name__ == '__main__':
    server = ThreadedServer(MyService, port=18812)
    server.start()
Script run as the client:

import rpyc

if __name__ == '__main__':
    conn = rpyc.connect("127.0.0.1", port=18812)
    print(conn.root.printSomething("passed to server!"))
Result on Server:
passed to server!
printed on server!
Result on Client:
printed on client!
