I'm building a simple web app in Python as part of an online course, and I'm currently trying to implement some CRUD operations. The app is a simple restaurant menu server that connects to an SQLite database (using SQLAlchemy) and executes some CRUD operations on it.
Part of the app serves a simple HTML form to modify a restaurant name (using the POST method). While implementing this operation, an error causes the server to return an empty response; looking at the log, no POST request was recorded, yet the do_POST body was executed 3 times.
Here's the body of the do_POST method.
def do_POST(self):
    try:
        if self.path.endswith(....):
            (.....)
        if self.path.endswith('/edit'):
            print('inside post ' + self.path)  # debugging message
            ctype, pdict = cgi.parse_header(self.headers.getheader('content-type'))
            if ctype == 'multipart/form-data':
                fields = cgi.parse_multipart(self.rfile, pdict)
                messageContent = fields.get('newRestaurantName')
            r_id = self.path.split('/')[2]
            # if I remove .one() from the instruction below, the error goes away.
            # 'session' is an SQLAlchemy DBSession defined above.
            restaurant = session.query(Restaurant).get(r_id).one()
            restaurant.name = messageContent[0]
            session.add(restaurant)
            session.commit()
            self.send_response(301)
            self.send_header('Content-type', 'text/html')
            self.send_header('Location', '/restaurants')
            self.end_headers()
            return
    except:
        (....)
And this is the server output when trying to perform the POST method.
Server running at port 8080
10.0.2.2 - - [25/Oct/2018 23:17:27] "GET /restaurants HTTP/1.1" 200 -
10.0.2.2 - - [25/Oct/2018 23:17:30] "GET /restaurants/1/edit HTTP/1.1" 200 -
inside post /restaurants/1/edit
inside post /restaurants/1/edit
inside post /restaurants/1/edit
After removing the .one() from the query expression, the operation executes without issues.
What I'm trying to understand is why the debugging message is printed three times when I only performed one (unsuccessful) POST request. Even if there was an exception, shouldn't the do_POST function only execute once?
(the server is running from inside a Vagrant VM)
Thank you!
Newbie here. I'm working on Stripe payments using Flask, and it all works well on my local machine, but when I deploy my code to the server and listen for webhook events in the Stripe dashboard, I get this error: "No signatures found matching the expected signature for payload". I've already tried so many solutions but nothing worked. Any help will be appreciated.
def webhook_received(self, user_id):
    payload = request.data
    endpoint_secret = 'my_secret_key'
    sig_header = request.headers.get('stripe-signature')
    try:
        event = stripe.Webhook.construct_event(
            json.loads(payload), sig_header, endpoint_secret
        )
        data = event['data']
    except Exception as e:
        return str(e)
    event_type = event['type']
    if event_type == 'checkout.session.completed':
        self.handle_checkout_session(data, user_id)
    elif event_type == 'invoice.paid':
        pass
Okay, I think I see the problem, but I'll try to cover both potential issues.
(Most Likely): Stripe requires the raw, unmodified request body to form the webhook signature. In your try: block you are using json.loads(payload) which converts it to a Python dict object. Try using the raw payload data instead.
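As a self-contained illustration of why the raw bytes matter (this re-implements Stripe's documented v1 signing scheme with the standard library purely for demonstration; 'whsec_test' is a made-up secret, and in real code you should keep using stripe.Webhook.construct_event):

```python
import hashlib
import hmac
import json
import time

def verify_stripe_signature(payload, sig_header, secret):
    """Recompute Stripe's v1 HMAC over the exact bytes received.

    Any re-serialization of the body (e.g. json.loads + json.dumps)
    can change the bytes and break verification.
    """
    pairs = dict(item.split('=', 1) for item in sig_header.split(','))
    signed_payload = pairs['t'].encode() + b'.' + payload
    computed = hmac.new(secret.encode(), signed_payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(computed, pairs['v1'])

# Simulate the signature Stripe would attach to a delivery.
secret = 'whsec_test'  # placeholder, not a real signing secret
raw_body = b'{"type":"checkout.session.completed","data":{}}'
t = str(int(time.time()))
sig = hmac.new(secret.encode(), t.encode() + b'.' + raw_body, hashlib.sha256).hexdigest()
header = 't={},v1={}'.format(t, sig)

print(verify_stripe_signature(raw_body, header, secret))  # True: raw bytes match

# Round-tripping through json.loads/json.dumps changes the bytes:
reserialized = json.dumps(json.loads(raw_body)).encode()
print(verify_stripe_signature(reserialized, header, secret))  # False
```

So pass request.data (the untouched bytes) as the first argument to construct_event, and only parse the JSON after the event has been verified.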
If the problem only occurs when you deploy your code to a remote server then the most likely problem is with the endpoint_secret value. I would add some logging in your webhook_received() function to log the value after it's loaded and make sure the value matches the webhook signing secret you can view in your Stripe dashboard.
Lastly, it's important to return proper responses to avoid webhook delivery retries. I know Flask does some stuff implicitly (a pet peeve of mine) but I'm not seeing a 200 or 500 response being returned here. You'll want to make sure you respond appropriately to avoid headaches later. You can check the best practices here. There's also a handy webhook builder here so you can check your implementation against Stripe's Flask code.
Hey guys I'm having a really strange issue with ngrok and Twilio StudioFlow.
For some reason, the ngrok setup (running "ngrok http 5000" on the command line and then copy-pasting the link into the StudioFlow HTTP widgets) has stopped working, even though it was working fine earlier. Now, when the widgets are reached in the flow, I get these:
127.0.0.1 - - [11/Feb/2021 02:18:10] "POST /reminders HTTP/1.1" 404 -
127.0.0.1 - - [11/Feb/2021 02:18:10] "POST /week2 HTTP/1.1" 404 -
The Flask routes are right (/reminders and /week2) and I am pasting them into the widgets like so (http://8b5dba64ef0c.ngrok.io/reminders), so I am really not sure why this is happening all of a sudden. Why can't Twilio find my ngrok tunnel?
For context, I've tried running the Flask app on different ports, but that doesn't solve the issue either. That leads me to believe there's something wrong with how the link is being read by either Twilio or my app, but I haven't been able to find the problem. An example of my code:
@app.route('/reminders', methods=['POST', 'GET'])
def remmy():
    # Start our empty response
    resp = MessagingResponse()
    if request.method == 'POST':
        number = request.form['To']
        print(type(number))
        part = Part.query.filter_by(phone_num=number).first()

        def base_reminder(num):
            rem = '[#####]: Don\'t forget to complete your baseline survey to receive an incentive of $35.'
            message = client.messages.create(from_='+1###########',
                                             to=num,
                                             body=rem)
            resp = MessagingResponse()
            msg = resp.message
            return str(resp)

        if not part.baseline:
            base_reminder(part.phone_num)
    return str(resp)
I'm facing a problem emitting messages from RabbitMQ to the user via SocketIO.
I have a Flask application with SocketIO integration.
The current user flow looks like this:
The problem is that I'm not able to set up a RabbitMQ listener which forwards messages to the browser via SocketIO. Every time I get a different error, mostly that the connection is closed or that I'm working outside of the application context.
I tried many approaches; here is my last one.
# callback
def mq_listen(uid):
    rabbit = RabbitMQ()

    def cb(ch, method, properties, body, mq=rabbit):
        to_return = [0]  # mutable
        message = Message.load(body)
        to_return[0] = message.get_message()
        emit('report_part', {"data": to_return[0]})

    rabbit.listen('results', callback=cb, id=uid)

# this is the page which the user reaches
@blueprint.route('/report_result/<uid>', methods=['GET'])
def report_result(uid):
    thread = threading.Thread(target=mq_listen, args=(uid,))
    thread.start()
    return render_template("property/report_result.html", socket_id=uid)
where the rabbit.listen method is an abstraction like:
def listen(self, queue_name, callback=None, id=None):
    if callback is not None:
        callback_function = callback
    else:
        callback_function = self.__callback

    if id is None:
        self.channel.queue_declare(queue=queue_name, durable=True)
        self.channel.basic_qos(prefetch_count=1)
        self.consumer_tag = self.channel.basic_consume(callback_function, queue=queue_name)
        self.channel.start_consuming()
    else:
        self.channel.exchange_declare(exchange=queue_name, type='direct')
        result = self.channel.queue_declare(exclusive=True)
        exchange_name = result.method.queue
        self.channel.queue_bind(exchange=queue_name, queue=exchange_name, routing_key=id)
        self.channel.basic_consume(callback_function, queue=exchange_name, no_ack=True)
        self.channel.start_consuming()
which resulted in:
RuntimeError: working outside of request context
I'd be happy for any tip or usage example.
Thanks a lot
I had a similar issue; at the end of the day, it's because when you make a request, Flask passes the request context to the client. But the solution is NOT to add with app.app_context(). That is hacky and will definitely lead to errors, as you're not natively sending the request context.
My solution was to create a redirect so that the request context is maintained, like:
def sendToRedisFeed(eventPerson, type):
    eventPerson['type'] = type
    requests.get('http://localhost:5012/zmq-redirect', json=eventPerson)
This is my redirect function, so whenever there is an event I'd like to push to my PubSub it goes through this function, which then pushes to that localhost endpoint.
from flask_sse import sse

app.register_blueprint(sse, url_prefix='/stream')

@app.route('/zmq-redirect', methods=['GET'])
def send_message():
    try:
        sse.publish(request.get_json(), type='greeting')
        return Response('Sent!', mimetype="text/event-stream")
    except Exception as e:
        print(e)
        pass
Now, whenever an event is pushed to my /zmq-redirect endpoint, it is redirected and published via SSE.
And now finally, just to wrap everything up, the client:
var source = new EventSource("/stream");
source.addEventListener(
    "greeting",
    function(event) {
        console.log(event)
    }
)
The error message suggests that this is a Flask issue. While handling requests, Flask sets up a context, but because you're using threads, this context is lost. By the time it's needed, it is no longer available, so Flask raises the "working outside of request context" error.
A common way to resolve this is to provide the context manually. There is a section about this in the documentation: http://flask.pocoo.org/docs/1.0/appcontext/#manually-push-a-context
Your code doesn't show the SocketIO part, but I wonder if using something like flask-socketio (https://flask-socketio.readthedocs.io/en/latest/) could simplify things. I would open the RabbitMQ connection in the background (preferably once) and use the emit function to send any updates to connected SocketIO clients.
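To sketch that idea with standard-library stand-ins only (queue.Queue plays the role of the RabbitMQ channel and a plain callback plays socketio.emit; this is not the pika or flask-socketio API, just the shape of the pattern):

```python
import queue
import threading

def start_background_listener(message_source, emit):
    """Consume messages on a background thread and forward each one via
    'emit' (a stand-in for socketio.emit). A None message stops the loop."""
    def consume():
        while True:
            body = message_source.get()  # blocks, like basic_consume would
            if body is None:
                break
            emit('report_part', {'data': body})

    worker = threading.Thread(target=consume, daemon=True)
    worker.start()
    return worker

# Demo: push two "RabbitMQ" messages and collect what gets emitted.
received = []
source = queue.Queue()
worker = start_background_listener(
    source, lambda event, payload: received.append((event, payload)))
source.put('part 1')
source.put('part 2')
source.put(None)  # shut the listener down
worker.join()
print(received)
```

In the real app the consume loop would be started once (e.g. via flask-socketio's start_background_task) rather than per request, which also avoids spawning a new thread for every visitor.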
We are having an issue with a Cross-Origin Resource Sharing (CORS) implementation in a RESTful web service on web2py.
We tried to implement CORS on the server side in web2py as suggested here: https://groups.google.com/forum/#!msg/web2py/kSUtyNcUQGI/qfiIqfUiWLwJ
We added the following to models/0.py (so that the response headers are updated before the actual RESTful API handler in the controller):
if request.env.http_origin:
    response.headers['Access-Control-Allow-Origin'] = request.env.http_origin
    response.headers['Access-Control-Allow-Origin'] = "*"
    response.headers['Access-Control-Allow-Credentials'] = 'true'
    response.headers['Access-Control-Max-Age'] = 86400
    if request.env.request_method == 'OPTIONS':
        if request.env.http_access_control_request_method:
            print request.env.http_access_control_request_method
            response.headers['Access-Control-Allow-Methods'] = request.env.http_access_control_request_method
        if request.env.http_access_control_request_headers:
            response.headers['Access-Control-Allow-Headers'] = request.env.http_access_control_request_headers
RESTful POST & GET are now working, but PUT and DELETE aren't, because the preflight HTTP OPTIONS request is rejected by web2py as "400 BAD REQUEST".
So, for example, when calling the RESTful web service via an AJAX call from a local web page, we get the following error messages in the NetBeans log:

Failed to load resource: the server responded with a status of 400 (BAD REQUEST) (10:46:36:182 | error, network) at 127.0.0.1:8000/test/default/api/entries/2.json
Failed to load resource: Origin localhost:8383 is not allowed by Access-Control-Allow-Origin. (10:46:36:183 | error, network) at 127.0.0.1:8000/test/default/api/entries/2.json
XMLHttpRequest cannot load 127.0.0.1:8000/test/default/api/entries/2.json. Origin localhost:8383 is not allowed by Access-Control-Allow-Origin. (10:46:36:183 | error, javascript) at www/page/test.html
You can add the following line:
response.headers["Access-Control-Allow-Methods"] = "POST, GET, OPTIONS"
This is a really old question, but I just managed to solve the exact same problem. In my case the issue was with the controllers; I had to add the following wrapper before any action:
def CORS(f):
    """
    Enables CORS for any action
    """
    def wrapper(*args, **kwds):
        if request.env.http_origin and request.env.request_method == 'OPTIONS':
            response.view = 'generic.json'
            return dict()
        return f(*args, **kwds)
    return wrapper
Then in your controller just write:

@CORS
def whatever():
    do_stuff
    return dict(stuff)
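To see what the wrapper actually does, here is a runnable sketch with stand-in request/response objects (the Fake* classes are hypothetical; in web2py these globals are provided by the framework):

```python
class FakeEnv:
    def __init__(self, http_origin=None, request_method='GET'):
        self.http_origin = http_origin
        self.request_method = request_method

class FakeRequest:
    def __init__(self, env):
        self.env = env

class FakeResponse:
    view = None

request = FakeRequest(FakeEnv())
response = FakeResponse()

def CORS(f):
    """Short-circuit a CORS preflight: answer OPTIONS with an empty JSON
    body instead of running the wrapped action."""
    def wrapper(*args, **kwds):
        if request.env.http_origin and request.env.request_method == 'OPTIONS':
            response.view = 'generic.json'
            return dict()
        return f(*args, **kwds)
    return wrapper

@CORS
def whatever():
    return dict(stuff='stuff')

normal_result = whatever()  # ordinary GET runs the action
print(normal_result)

# Simulate a preflight request:
request.env = FakeEnv(http_origin='http://localhost:8383', request_method='OPTIONS')
preflight_result = whatever()  # short-circuits before the action runs
print(preflight_result)
```

The preflight call returns an empty dict rendered through generic.json, which is exactly what lets the browser's OPTIONS probe succeed before the real PUT or DELETE is sent.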
GitHub offers to send post-receive hooks to a URL of your choice when there's activity on your repo.
I want to write a small Python command-line/background application (i.e. no GUI or webapp) running on my computer (later on a NAS), which continually listens for those incoming POST requests and, once a POST is received from GitHub, processes the JSON information contained within. Processing the JSON as soon as I have it is no problem.
The POST can come from a small number of IPs given by GitHub; I plan/hope to specify a port on my computer where it should get sent.
The problem is, I don't know enough about web technologies to deal with the vast number of options you find when searching: do I use Django, Requests, sockets, Flask, microframeworks...? I don't know what most of the terms involved mean, and most sound like they offer too much / are too big to solve my problem. I'm simply overwhelmed and don't know where to start.
Most tutorials about POST/GET I could find seem to be concerned with either sending data or directly requesting it from a website, but not with continually listening for it.
I feel the problem is not really a difficult one and will boil down to a couple of lines once I know where to go/how to do it. Can anybody offer pointers/tutorials/examples/sample code?
First things first: the web is request-response based. Something will send a request to your URL, and you will respond accordingly. Your server application will be continuously listening on a port; you don't have to worry about that part.
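In fact, the standard library alone is enough for a minimal listener. A sketch using http.server (the payload below just mimics the shape of GitHub's push event; it is not a real delivery):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class HookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        payload = json.loads(self.rfile.read(length))
        self.server.last_payload = payload  # process the JSON here
        self.send_response(200)
        self.send_header('Content-Type', 'text/plain')
        self.end_headers()
        self.wfile.write(b'OK')

    def log_message(self, fmt, *args):
        pass  # keep the demo output quiet

server = HTTPServer(('127.0.0.1', 0), HookHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Stand-in for GitHub's delivery: POST a JSON body to the listener.
body = json.dumps({'commits': [{'author': {'name': 'Chris Wanstrath'}}]}).encode()
req = Request('http://127.0.0.1:{}/'.format(server.server_port), data=body,
              headers={'Content-Type': 'application/json'})
reply = urlopen(req).read()
print(reply)  # b'OK'
print(server.last_payload['commits'][0]['author']['name'])  # Chris Wanstrath
server.shutdown()
```

A framework buys you routing, logging, and nicer error handling on top of this, which is why the answers below reach for one.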
Here is the similar version in Flask (my micro framework of choice):
from flask import Flask, request
import json

app = Flask(__name__)

@app.route('/', methods=['POST'])
def foo():
    data = json.loads(request.data)
    print "New commit by: {}".format(data['commits'][0]['author']['name'])
    return "OK"

if __name__ == '__main__':
    app.run()
Here is a sample run, using the example from github:
Running the server (the above code is saved in sample.py):
burhan@lenux:~$ python sample.py
* Running on http://127.0.0.1:5000/
Here is a request to the server, basically what github will do:
burhan@lenux:~$ http POST http://127.0.0.1:5000 < sample.json
HTTP/1.0 200 OK
Content-Length: 2
Content-Type: text/html; charset=utf-8
Date: Sun, 27 Jan 2013 19:07:56 GMT
Server: Werkzeug/0.8.3 Python/2.7.3
OK # <-- this is the response the client gets
Here is the output at the server:
New commit by: Chris Wanstrath
127.0.0.1 - - [27/Jan/2013 22:07:56] "POST / HTTP/1.1" 200 -
Here's a basic web.py example for receiving data via POST and doing something with it (in this case, just printing it to stdout):
import web

urls = ('/.*', 'hooks')
app = web.application(urls, globals())

class hooks:
    def POST(self):
        data = web.data()
        print
        print 'DATA RECEIVED:'
        print data
        print
        return 'OK'

if __name__ == '__main__':
    app.run()
I POSTed some data to it using hurl.it (after forwarding 8080 on my router), and saw the following output:
$ python hooks.py
http://0.0.0.0:8080/
DATA RECEIVED:
test=thisisatest&test2=25
50.19.170.198:33407 - - [27/Jan/2013 10:18:37] "HTTP/1.1 POST /hooks" - 200 OK
You should be able to swap out the print statements for your JSON processing.
To specify the port number, call the script with an extra argument:
$ python hooks.py 1234
I would use:
https://github.com/carlos-jenkins/python-github-webhooks
You can configure a web server to use it or, if you just need a process running without a web server, you can launch the integrated server:
python webhooks.py
This will allow you to do everything you said you need. It does, nevertheless, require a bit of setup in your repository and in your hooks.
Late to the party and shameless self-promotion, sorry.
If you are using Flask, here's a very minimal code to listen for webhooks:
from flask import Flask, request, Response

app = Flask(__name__)

@app.route('/webhook', methods=['POST'])
def respond():
    print(request.json)  # Handle webhook request here
    return Response(status=200)
And the same example using Django:
import json

from django.http import HttpResponse
from django.views.decorators.http import require_POST

@require_POST
def example(request):
    print(json.loads(request.body))  # Handle webhook request here (Django has no request.json, so parse the raw body)
    return HttpResponse('Hello, world. This is the webhook response.')
If you need more information, here's a great tutorial on how to listen for webhooks with Python.
If you're looking to watch for changes in any repo...

1. If you own the repo that you want to watch
- On your repo page, go to Settings
- Click Webhooks, then add a new webhook (top right)
- Give it your IP/endpoint and set everything up to your liking
- Use any server to get notified

2. Not your repo
- Take the URL you want, i.e. https://github.com/fire17/gd-xo/
- Add /commits/master.atom to the end, such as: https://github.com/fire17/gd-xo/commits/master.atom
Use any library you want to get that page's content and filter out the parts you want, for example the <updated> element:

response = requests.get("https://github.com/fire17/gd-xo/commits/master.atom").text
response.split("<updated>")[1].split("</updated>")[0]
# '2021-08-06T19:01:53Z'
Make a loop that checks this every so often; if the string has changed, you can initiate a clone/pull or do whatever you like.
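That polling loop might look like the sketch below (watch_feed and the fake fetcher are illustrative names; in practice fetch would be something like lambda: requests.get(url).text):

```python
import time

def watch_feed(fetch, on_change, interval=60, polls=None):
    """Poll an Atom feed and call on_change when <updated> changes.

    'fetch' returns the feed text; 'polls' bounds the number of
    iterations (None means poll forever).
    """
    last = None
    count = 0
    while polls is None or count < polls:
        text = fetch()
        updated = text.split("<updated>")[1].split("</updated>")[0]
        if last is not None and updated != last:
            on_change(updated)  # e.g. trigger a git pull here
        last = updated
        count += 1
        if polls is None or count < polls:
            time.sleep(interval)

# Demo with a fake fetcher instead of the live GitHub feed:
feeds = iter([
    "<feed><updated>2021-08-06T19:01:53Z</updated></feed>",
    "<feed><updated>2021-08-07T10:00:00Z</updated></feed>",
])
changes = []
watch_feed(lambda: next(feeds), changes.append, interval=0, polls=2)
print(changes)  # ['2021-08-07T10:00:00Z']
```

Splitting on the <updated> tag is fragile but matches the one-liner above; an XML parser (e.g. xml.etree.ElementTree) would be the sturdier choice.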