Given an aiohttp.web application with views like this one:
async def hello(request):
    return web.Response(body=b"Hello, world")
I'm trying to understand how to properly unit-test them.
I normally use Django's own test client when writing Django apps, and was looking for something similar for aiohttp.web. I'm not sure whether this is the right approach.
TL;DR: How do I simulate a request to an aiohttp.web app in a unittest?
You may create a web server and perform real HTTP requests against it, without any simulation.
See https://github.com/KeepSafe/aiohttp/blob/master/tests/test_client_functional.py for example.
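aiohttp also ships test utilities in aiohttp.test_utils that start a real server on a free port and give you a client bound to it. A minimal sketch, assuming aiohttp 3.x (the route and expected body mirror the hello view above):

```python
import asyncio

from aiohttp import web
from aiohttp.test_utils import TestClient, TestServer

async def hello(request):
    return web.Response(body=b"Hello, world")

async def run_test():
    app = web.Application()
    app.router.add_get('/', hello)

    # TestServer binds the app to a random free port;
    # TestClient wraps an aiohttp client session pointed at it.
    client = TestClient(TestServer(app))
    await client.start_server()
    try:
        resp = await client.get('/')
        body = await resp.text()
        return resp.status, body
    finally:
        await client.close()

status, body = asyncio.run(run_test())
print(status, body)
```

In a test suite you would wrap this in aiohttp's AioHTTPTestCase (or the pytest-aiohttp plugin) instead of calling asyncio.run yourself.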
Related
What I need is simple: a piece of code that will receive a GET request, process some data, and then generate a response. I'm completely new to Python web development, so I've decided to use DRF for this purpose because it seemed like the most robust solution, but every example I found online consisted of CRUD apps with models and views, and I figure what I need is something simpler (since the front-end is already created).
Could anyone provide an example on how to do this on DRF? (or even some other viable solution, having in mind that it needs to be robust enough to take on multiple requests at the same time in production)
The simplest way to do what you want is to use Django REST Framework's APIView (or the @api_view decorator).
Here is an example of it in the docs: https://www.django-rest-framework.org/api-guide/views/.
Besides the code on that page, you would need to register your view on an appropriate route, which is covered here: https://www.django-rest-framework.org/api-guide/routers/
Django and Django REST Framework are pretty heavy products out-of-the-box.
If you want something more lightweight that can handle many incoming requests, you could create a simple Express server using Node.js. This would result in very few lines of code on your end.
Sample Node server:
var express = require('express')
var app = express()

app.get('/', (req, res) => {
  res.send('hello world')
});

app.listen(8000);
For DRF:
https://www.django-rest-framework.org/tutorial/quickstart/
Another viable option:
Flask:
https://flask.palletsprojects.com/en/1.1.x/quickstart/
@app.route("/")
def start():
    # will do some task
    return 'completed'
In the above program, the 2nd request only executes after the 1st request finishes. But I want to make a server that will accept, execute, and respond to multiple requests at the same time, in parallel, using Flask or anything else.
How can I do this?
For multi-request handling and production deployment, use gunicorn, Apache (mod_wsgi), or gevent.
http://flask.pocoo.org/docs/0.11/deploying/
The same approach applies to other Python web frameworks, such as Django.
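For example, with gunicorn (the module name myapp and the worker/thread counts here are illustrative; each worker process handles requests independently, so blocking work in one request no longer stalls the others):

```shell
pip install gunicorn
# myapp.py is assumed to contain "app = Flask(__name__)"
# 4 processes x 2 threads = up to 8 requests served concurrently
gunicorn --workers 4 --threads 2 --bind 0.0.0.0:8000 myapp:app
```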
You can use the Klein module, which can handle multiple requests at a time. Please refer to the following link, which gives a clear explanation of Flask's limitations in this area:
Comparison between Flask and Klein
After reading it I switched from Flask to Klein. Hope it helps you too.
I'm making a Flask app and I was wondering if I could render a template for a route, but redirect the user after a function is complete. Currently using Python 2.7. Here is my example:
@app.route('/loading/matched')
def match():
    time_match()
    return render_template('matched.html')

def time_match():
    # match two users based on time
    sleep(3)  # pretend to be doing work
    return redirect('/loading/generation')
I don't know where to begin. Is there a library I should use?
This sounds more like a client-side thing to me? Do you want something like a loading bar?
You could provide an ajax route which initiates the heavy workload on the server side, while the client shows some progress. Once the workload is finished, you render a template which then gets loaded via ajax.
For async workloads you could look into Celery, which is a great library for that. It can even do the work on a separate server...
More sources: Integration in Flask
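A framework-agnostic sketch of that pattern (the in-memory jobs dict and all the names here are made up; a real app would use Celery with a result backend): one function starts the heavy work in a background thread, another reports status for the ajax poll.

```python
import threading
import time

jobs = {}  # job_id -> status; stands in for Celery's result backend

def heavy_work(job_id):
    jobs[job_id] = "running"
    time.sleep(0.1)          # stand-in for the slow matching work
    jobs[job_id] = "done"

def start_job(job_id):
    # what the "start" ajax route would call; returns immediately
    worker = threading.Thread(target=heavy_work, args=(job_id,))
    worker.start()
    return worker

def poll_job(job_id):
    # what the "progress" ajax route would return to the client
    return jobs.get(job_id, "unknown")

worker = start_job("match-1")
worker.join()                # only to make this demo deterministic
print(poll_job("match-1"))   # -> done
```

The two plain functions map directly onto two Flask routes: one the browser hits to kick off the work, one it polls until the status flips, at which point the client navigates to the next page.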
I currently build a web application using flask, sqlalchemy and jinja2.
To get a proper web interface, I build my views as follows:
@app.route('/mydata/', methods=['GET'])
@login_required
def mydata_list():
    # build data here...
    return render_template('mydata/index.html', data=data)
Now, if I need to build a REST API, I am supposed to terminate with
return jsonify(data)
So, how to handle this to avoid code duplication? Is it a good practice to add a ?api=True to my url, test it in my view, then return appropriate answer?
There is really no right or wrong way to do this, more so with Flask, which is a framework that imposes so few rules on the developer.
If you want my opinion, I think using the same set of view functions for web site and API leads to code that is harder to maintain, because there are a few significant differences between the two, for example:
Authentication: this is typically done in very different ways for web vs. API.
Content: for the API you just return data, but for a web page the view function may need to do more work and obtain extra data that is only needed for rendering the template.
Request methods: APIs use more HTTP request methods than web apps. For example, to delete a resource through an API the client normally sends a DELETE request. A web application running on a web browser needs to do everything with GET and POST requests. Also, the POST request method has different usages in APIs vs. web apps.
My recommendation is that you make your view functions for both APIs and web apps very thin and put the business logic of your application in common classes that both sets of view functions can invoke.
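A sketch of that layering (UserService and both wrapper functions are made-up names): the business logic lives in one plain class, and each view only adapts its result to the medium, jsonify(...) on the API side and render_template(...) on the web side.

```python
class UserService:
    """Business logic shared by the web views and the API views."""

    def __init__(self):
        self._users = [{"id": 1, "name": "alice"}]

    def list_users(self):
        return list(self._users)

service = UserService()

def api_users():
    # the API view body: would be "return jsonify(users=...)" in Flask
    return {"users": service.list_users()}

def web_users():
    # the web view body: would be render_template('users.html', ...) in Flask
    names = [u["name"] for u in service.list_users()]
    return ", ".join(names)

print(api_users())  # -> {'users': [{'id': 1, 'name': 'alice'}]}
print(web_users())  # -> alice
```

With this split, changing how users are fetched touches one class, and the duplicated part of each view shrinks to a single formatting line.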
If you want to use the same endpoint for serving a template as well as JSON data, you can test whether this is an AJAX request with request.is_xhr (note that is_xhr was removed in Werkzeug 1.0, so on newer stacks check the X-Requested-With header yourself). For example:
@app.route('/numbers/')
def numbers():
    data = [1, 2, 3]
    if request.is_xhr:
        return jsonify(data=data)
    return render_template('numbers.html', data=data)
I am looking for a LAMPish/WAMPish experience.
Something very transparent. Write the script, hit F5, and see the results. Very little abstraction, if any.
SQLAlchemy and (maybe) some simple templating engine will be used.
I need simple access to the environment - similar to the PHP way. Something like the COOKIE, SESSION, POST, GET objects.
I don't want to write a middleware layer just to get some web serving up and running. And I do not want to deal with specifics of CGI.
This is not meant for a very complex project and it is for beginning programmers and/or beginning Python programmers.
An MVC framework is not out of the question. ASP.NET MVC is nicely done IMO. One thing I liked is that POSTed data is automatically cast to data model objects if so desired.
Can you help me out here?
Thanks!
PS: I did not really find anything matching these criteria in older posts.
CherryPy might be what you need. It transparently maps URLs onto Python functions and handles all the cookie and session stuff (and of course the POST/GET parameters) for you.
It's not a full-stack solution like Django or Rails. On the other hand, that means that it doesn't lump you with a template engine or ORM you don't like; you're free to use whatever you like.
It includes a WSGI compliant web server, so you don't even need Apache.
For low barrier to entry, web.py is very very light and simple.
Features:
easy (dev) deploy... copy web.py folder into your app directory, then start the server
regex-based url mapping
very simple class mappings
built-in server (most frameworks have this of course)
very thin (as measured by lines of code, at least) layer over python application code.
Here is its hello world:
import web

urls = (
    '/(.*)', 'hello'
)
app = web.application(urls, globals())

class hello:
    def GET(self, name):
        if not name:
            name = 'world'
        return 'Hello, ' + name + '!'

if __name__ == "__main__":
    app.run()
As much as I like Werkzeug conceptually, writing WSGI plumbing in the Hello, World! example is deeply unpleasant, and totally gets in the way of actually demoing an app.
That said, web.py isn't perfect, and for big jobs, it's probably not the right tool, since:
routes style systems are (imho) better than pure regex ones
integrating web.py with other middlewares might be adventurous
What you're describing most resembles Pylons, it seems to me. However, the number of web frameworks in/for Python is huge -- see this page for an attempt to list and VERY briefly characterize each and every one of them!-)
Have you looked into the Django web framework? It's an MVC framework written in Python, and it's relatively simple to set up and get started with. You can run it with nothing but Python, since it can use SQLite and its own development server, or you can set it up to use MySQL and Apache if you'd like.
Pylons is another framework that supports SQLAlchemy for models. I've never used it but it seems promising.
Look at:
WSGI, the standard Python API for HTTP servers to call Python code.
Django, a popular, feature-rich, well documented Python web framework
web.py, a minimal Python web framework
Don't forget Bottle. It is a single-file micro web framework with no dependencies and is very easy to use. Here is a "Hello world" example:
from bottle import route, run

@route('/')
def index():
    return 'Hello World!'

run(host='localhost', port=8080)
And here is an example of accessing POST variables (cookies and GET vars are similar):
from bottle import route, request

@route('/submit', method='POST')
def submit():
    name = request.POST.get('name', 'World')
    return 'Hello %s!' % name
Check out web2py. It runs out of the box with no configuration - even from a USB stick. The template language is pure Python and you can develop your entire app through the browser editor (although I find vim faster ;)