Mocking an external API for testing with Python

Context
I am trying to write tests for functions that query an external API. These functions send requests to the API, get responses and process them.
In my tests, I want to simulate the external API with a mock server that is run locally.
So far, the mock server runs successfully and responds to custom GET queries.
The problem
The external API responds with objects of type <class 'dict'>, while apparently all I can get from my mock server is a response of type <class 'bytes'>. The mock server fetches pre-defined data from disk and returns it through a stream. Since I cannot faithfully simulate the external API, my tests throw errors because of the wrong response types.
Following are snippets of my code with some explanations.
1. setUp() function:
The setUp function is run at the beginning of the test suite. It is responsible for configuring and starting the server before the tests run:
def setUp(self):
    self.factory = APIRequestFactory()
    # Configure the mock server
    self.mock_server_port = get_free_port()
    self.mock_server = HTTPServer(('localhost', self.mock_server_port), MockServerRequestHandler)
    # Run the mock server in a separate thread
    self.mock_server_thread = Thread(target=self.mock_server.serve_forever)
    self.mock_server_thread.setDaemon(True)
    self.mock_server_thread.start()
2. The MockServerRequestHandler:
class MockServerRequestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if re.search(config.SYSTEM_STATUS_PATTERN, self.path):
            # Response status code
            self.send_response(requests.codes.ok)
            # Response headers
            self.send_header("Content-Type", "application/json; charset=utf-8")
            self.end_headers()
            # Pull the pre-defined response from a file and serve it
            with open('/path/to/my-json-formatted-file') as data_file:
                response_content = json.dumps(json.load(data_file))
            # Write to a stream that needs bytes-like input
            # https://docs.python.org/2/library/basehttpserver.html
            self.wfile.write(response_content.encode('utf-8'))
            return
From my understanding of the official documentation, a BaseHTTPRequestHandler can only serve the content of its response by writing to a predefined stream (wfile), which needs to be given (and I am quoting an error message) a bytes-like object.
So my questions are:
Is there a way to make my mock server respond with content types other than bytes (JSON, Python dicts, ...)?
Is it safe to write, in the functions that I test, code that converts the bytes to Python dicts just so I can test them with my mock server? Or does this violate some principle of testing?
Is there another way of writing a server that responds with JSON and Python dicts?

In the comments, it sounds like you solved your main problem, but you're interested in learning how to mock out the web requests instead of launching a dummy web server.
Here's a tutorial on mocking web API requests, and the gory details are in the documentation. If you're using legacy Python, you can install the mock module as a separate package from PyPI.
Here's a snippet from the tutorial:
from unittest.mock import Mock, patch  # on legacy Python: from mock import Mock, patch
from nose.tools import assert_list_equal

from project.services import get_todos


@patch('project.services.requests.get')
def test_getting_todos_when_response_is_ok(mock_get):
    todos = [{
        'userId': 1,
        'id': 1,
        'title': 'Make the bed',
        'completed': False
    }]
    # Configure the mock to return a response with an OK status code. Also, the mock should have
    # a `json()` method that returns a list of todos.
    mock_get.return_value = Mock(ok=True)
    mock_get.return_value.json.return_value = todos
    # Call the service, which will send a request to the server.
    response = get_todos()
    # If the request is sent successfully, then I expect a response to be returned.
    assert_list_equal(response.json(), todos)
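For context, the service under test in that tutorial is roughly the following (a sketch reconstructed from the tutorial; check it for the full version, and note that the URL and module layout may differ in your project):
# project/services.py
import requests

TODOS_URL = 'https://jsonplaceholder.typicode.com/todos'


def get_todos():
    # In production this sends a real request; in the test above, requests.get is patched.
    response = requests.get(TODOS_URL)
    if response.ok:
        return response
    return None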

Related

What's the equivalent of PHP Laravel's "Http::fake()" in Python / Django / DRF / pytest?

Laravel's Http::fake() method allows you to instruct the HTTP client to return stubbed / dummy responses when requests are made. How can I achieve the same using the Django REST Framework APIClient in tests?
I tried requests_mock but it didn't yield the result I was expecting. It only mocks requests made within the test function, not anywhere else within the application or project you're testing.
When you use pytest-django you can import the admin_client fixture and then make requests like this:
def test_get_project_list(admin_client):
    resp = admin_client.get("/projects/")
    assert resp.status_code == 200
    resp_json = resp.json()
    assert resp_json == {"some": "thing"}
The equivalent of Laravel's Http::fake() in Django is requests_mock.
You must use the Python requests module to call the external APIs.
Then use requests_mock to fake the external APIs.
The mocker is a loading mechanism to ensure the adapter is correctly in place to intercept calls from requests. Its goal is to provide an interface that is as close to the real requests library interface as possible.
Let me know if you need me to post an example.
You can read more on the requests_mock module's official website.
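For illustration, here is a minimal sketch of that pattern with requests_mock (the URL, payload, and get_user function are hypothetical, not from the question):
import requests
import requests_mock


def get_user(user_id):
    # Code under test: it must use the requests library internally.
    return requests.get(f"https://external.example.com/users/{user_id}").json()


def test_get_user():
    with requests_mock.Mocker() as m:
        # Register a fake response for the external endpoint.
        m.get("https://external.example.com/users/123", json={"id": 123, "name": "UserA"})
        assert get_user(123) == {"id": 123, "name": "UserA"}
Note that the Mocker patches requests at the transport adapter level, so calls made anywhere in the code under test through requests are intercepted, not just calls made directly inside the test function.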

Contract testing with Kafka in Python environment?

I am working with multiple applications that communicate asynchronously using Kafka. These applications are managed by several departments and contract testing is appropriate to ensure that the messages used during communication follow the expected schema and will evolve according to the contract specification.
It sounded like the pact library for Python is a good fit because it helps create contract tests for HTTP and message integrations.
What I wanted to do is to send an HTTP request and to listen on the appropriate, dedicated Kafka topic immediately after. But it seems that the test is forcing me to specify an HTTP status code, even though what I am expecting is a message from a queue, which has no HTTP status code. Furthermore, it seems that the HTTP request is being sent before the consumer is listening. Here is some sample code.
import atexit
import unittest

from pact.consumer import Consumer as p_Consumer
from pact.provider import Provider as p_Provider
from confluent_kafka import Consumer as k_Consumer

pact = p_Consumer('Consumer').has_pact_with(p_Provider('Provider'))
pact.start_service()
atexit.register(pact.stop_service)

config = {'bootstrap.servers': 'server', 'group.id': 0, 'auto.offset.reset': 'latest'}
consumer = k_Consumer(config)
consumer.subscribe(['usertopic'])


def user():
    while True:
        msg = consumer.poll(timeout=1)
        if msg is None:
            continue
        else:
            return msg.value().decode()


class ContractTesting(unittest.TestCase):
    def test_user_message(self):
        expected = {
            'username': 'UserA',
            'id': 123,
            'groups': ['Editors']
        }
        (pact
         .given('UserA exists and is not an administrator')
         .upon_receiving('a request for UserA')
         .with_request(method='GET', path='/user/')
         .will_respond_with(200, body=expected))
        with pact:
            result = user()
            self.assertEqual(result, expected)
How would I carry out contract testing in Python using Kafka? It feels like I am going through a lot of hoops to carry out this test.
With Pact messages, you write tests against a different API. You don't use the standard HTTP one; in fact, the transport itself is ignored altogether and it's just the payload, the message, that we're interested in capturing and verifying. This allows us to test any queue without having to build specific interfaces for each one.
See this example: https://github.com/pact-foundation/pact-python/blob/02643d4fb89ff7baad63e6436f6a929256c6bf12/examples/message/tests/consumer/test_message_consumer.py#L65
You can read more about message pact testing here: https://docs.pact.io/getting_started/how_pact_works#non-http-testing-message-pact
And finally here are some Kafka examples for other languages that may be helpful: https://docs.pactflow.io/docs/examples/kafka/js/consumer
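For orientation, a minimal sketch of a message pact consumer test in pact-python might look like this (the names, payload, and handle_user_message handler are illustrative, modeled on the linked example, not a drop-in solution):
from pact import MessageConsumer, Provider

# Contract between the message consumer and the producing service.
pact = MessageConsumer('UserConsumer').has_pact_with(
    Provider('UserProvider'),
    pact_dir='pacts',
)

expected = {
    'username': 'UserA',
    'id': 123,
    'groups': ['Editors'],
}

(pact
 .given('UserA exists and is not an administrator')
 .expects_to_receive('a user message')
 .with_content(expected)
 .with_metadata({'Content-Type': 'application/json'}))

with pact:
    # Invoke the code that would normally handle msg.value() from Kafka;
    # handle_user_message is a hypothetical handler in your application.
    handle_user_message(expected)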

Is my API a REST architecture?

I'm currently designing a web service in Python with Flask. I have become very confused about whether it is a RESTful service or just a regular web service. I've been reading quite a few sources about RESTful services, but I'm still not able to say whether my service is a REST architecture or not.
The requests to my API are stateless.
Here is what I have:
import json

from flask import Flask, request

app = Flask(__name__)

# appLogger, saveToMongo, validateJSON and FailedRequest are defined elsewhere in the project.


@app.route('/', methods=['POST'])
def add():
    """
    This function is mapped to the POST request of the REST interface
    """
    # Check if a JSON object is declared in the header
    if request.headers['Content-Type'] == 'application/json; charset=UTF-8' and request.data:
        try:
            data = json.dumps(request.json)
            # Check if the received JSON object is valid according to the scheme
            # if (validateJSON(data)):
            try:
                saveToMongo(data)
                appLogger.info("Record saved to MongoDB")
                return "JSON Message saved in MongoDB"
            except:
                appLogger.error("Could not write to MongoDB")
        except:
            appLogger.error("Received invalid JSON")
    else:
        appLogger.error("Content-Type not defined or empty content")
        raise FailedRequest


if __name__ == "__main__":
    appLogger.info("RestFul service initialized and started")
    app.run(host="0.0.0.0", port=int("80"), debug=True)
In none of the possible responses do I return JSON (which would be the payload of the response); it's always a regular HTTP response with custom text as the result description.
Is it right that, because of this, it is not a RESTful service, and that if I want to call it a RESTful service I would need to return a JSON object? Or am I completely wrong? Is my API just simple RPC?
I see only one resource / to which a POST request can be made. There is no way to GET a collection of objects or a single object saved in this way.
One could argue that such a trivial system does not violate any REST principle. But I think this is not enough to call a system RESTful. It is a trivial RPC system with a single anonymous 'save' method.
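To illustrate the point, a more resource-oriented interface would expose the saved objects as resources you can also GET; the following is only a sketch (the endpoints and Mongo helpers are hypothetical, not from the question):
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route('/messages/', methods=['POST'])
def create_message():
    data = request.get_json()
    message_id = save_to_mongo(data)          # hypothetical persistence helper
    return jsonify({'id': message_id}), 201   # return the created resource's id


@app.route('/messages/', methods=['GET'])
def list_messages():
    return jsonify(load_all_from_mongo())     # hypothetical: the collection of saved objects


@app.route('/messages/<message_id>', methods=['GET'])
def get_message(message_id):
    return jsonify(load_from_mongo(message_id))  # hypothetical: a single saved object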

python jsonrpc2 client example connecting to remote hello world example using httplib?

I am trying to create a jsonrpc2 server that will accept JSON over HTTP, process the data and return JSON back to the requesting client.
I am quite new to RPC servers and WSGI and have only used WSGI as part of a web framework like Django.
I am attempting to follow the example given with the jsonrpc2 documentation. The first step is creating a file hello.py
def greeting(name):
    return dict(message="Hello, %s!" % name)
The next step involves starting the service
runjsonrpc2 hello
runserver :8080
I know the service is working since, when I use a browser on a remote machine and browse to http://myip.dydns.org:8080, it responds with "405 Method Not Allowed" and I see debug information in my server shell:
DEBUG:root:jsonrpc
DEBUG:root:check method
The next step is what I am having a hard time understanding. I want to know how to create a Python client to send JSON to the service and get a response.
What I tried is:
>>> from httplib import HTTPConnection
>>> h = HTTPConnection("myip.dydns.org:8080")
>>> from json import JSONEncoder
>>> call_values = {'jsonrpc':'2.0', 'method':'greeting', 'id':'greeting'}
What are the steps involved to get the response from the web service using Python?
Sadly, the jsonrpc2 documentation only uses a TestApp from the webtest library to test on localhost.
I could not find any sample Python code that creates a client from a remote machine and gets a response for the greeting function.
Can someone help me get started?
Edit: I got a little further, but I still cannot get the contents of the response:
>>> from httplib import HTTPConnection
>>> con = HTTPConnection("myip.dyndns.org:8080")
>>> import json
>>> con.request('POST', '/', json.dumps({"jsonrpc": "2.0", "method": "casoff_jsonrpc2.greeting", "id":1.0,"params":{"name":"harijay"}},ensure_ascii=False).encode('utf-8'), {'Content-Type': 'application/json;charset=utf-8'})
I then see the server echo to its shell:
DEBUG:root:jsonrpc
DEBUG:root:check method
DEBUG:root:check content-type
DEBUG:root:response {"jsonrpc": "2.0", "id": 1.0, "result": {"message": "Hello, harijay!"}}
But on the client, I don't know how to get the result.
Edit 2: I finally solved this.
All I had to do was
>>> con.getresponse().read()
Try the excellent requests package.
If you intend to do anything with HTTP clients in Python, I would highly recommend learning requests: it is an order of magnitude easier to learn and use than any other HTTP-related module in Python, and for me it became a sort of Swiss army knife for experimenting over HTTP.
An example of how to use it for JSON-RPC is here:
https://stackoverflow.com/a/8634905/346478
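As a minimal sketch (reusing the endpoint and method names from the question; adjust them to your setup), the same call with requests could look like this:
import requests

payload = {
    "jsonrpc": "2.0",
    "method": "casoff_jsonrpc2.greeting",
    "id": 1,
    "params": {"name": "harijay"},
}

# requests sets the Content-Type header and serializes the payload when json= is used.
resp = requests.post("http://myip.dyndns.org:8080/", json=payload)
print(resp.json())  # e.g. {'jsonrpc': '2.0', 'id': 1, 'result': {'message': 'Hello, harijay!'}}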

How to access wsgi params from a request inside a middleware and a flask request without side effect?

I need to read some values from the WSGI request before my Flask app is loaded. If I only read the URL from the WSGI request, I can access the file without any issues once the Flask app is loaded (after the middleware runs).
But if I attempt to access the params, it seems to remove the POST data once the Flask app is loaded. I even went to the extreme of wrapping the WSGI request with a special WebOb Request to prevent this "read once" problem.
Does anyone know how to access values from the WSGI request in middleware without any side effects on the request, so you can still get POST data / file data in the Flask app?
from webob import Request


class SomeMiddleware(object):
    def __init__(self, environ):
        self.request = Request(environ)
        self.orig_environ = environ

    def apply_middleware(self):
        print self.request.url     # will not do any harm
        print self.request.params  # will cause me to lose data
Here is my Flask view:
@app.route('/')
def hello_world():
    from flask import request
    the_file = request.files['file']
    print "and the file is", the_file
From what I can tell, this is a limitation of the way that WSGI works. The stream only needs to be consumable once (PEP 333 and 3333 only require that the stream support read* calls; tell does not need to be supported). Once the stream is exhausted, it cannot be re-streamed to other WSGI applications further "inward". Take a look at these two sections of Werkzeug's documentation for more information:
http://werkzeug.pocoo.org/docs/request_data/
http://werkzeug.pocoo.org/docs/http/#module-werkzeug.formparser
The way to avoid this issue is to wrap the input stream (wsgi.input) in an object that implements the read and readline methods. Then, only when the final application in the chain actually attempts to exhaust the stream will your methods be run. See Flask's documentation on generating a request checksum for an example of this pattern.
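As a rough sketch of that pattern (the class and middleware names here are hypothetical; Flask's checksum example is the authoritative reference):
class ObservableStream(object):
    """Wraps wsgi.input so the body can be observed as the downstream app reads it."""

    def __init__(self, stream):
        self._stream = stream
        self.seen = b''  # bytes observed so far, available to the middleware afterwards

    def read(self, *args):
        data = self._stream.read(*args)
        self.seen += data
        return data

    def readline(self, *args):
        data = self._stream.readline(*args)
        self.seen += data
        return data


class SomeMiddleware(object):
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        # Replace the input stream; nothing is consumed here, so Flask can
        # still parse form/file data when the view runs.
        environ['wsgi.input'] = ObservableStream(environ['wsgi.input'])
        return self.app(environ, start_response)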
That being said, are you sure a middleware is the best solution to your problems? If you need to perform some action (dispatch, logging, authentication) based on the content of the body of the request you may be better off making it a part of your application, rather than a stand-alone application of its own.
