I'm trying to mock two identical GET requests but with different results. My goal is to test the second exception handler (except requests.exceptions.RequestException), but I can't make it work so that the first requests.get passes and the second requests.get fails to "connect", thereby reaching the second exception handler. Is that even possible? Thanks!
try:
    tenants = requests.get('https://hostname_1')
    for tenant in tenants:
        try:
            a = requests.get('https://host_2')
            try:
                some_function(arguments)
            except Exception as e:
                print(e)
        except requests.exceptions.RequestException as e:
            print(e)
except Exception as e:
    print(e)
Here is what I tried:
#patch("argo_probe_poem.poem_cert.requests.get")
#patch("argo_probe_poem.poem_cert.requests.get")
def test_raise_request_exception(self, mock_requests1, mock_requests2):
mock_requests1.side_effect = pass_web_api
mock_requests2.side_effect = requests.exceptions.RequestException
with self.assertRaises(SystemExit) as e:
utils_func(self.arguments)
self.assertEqual(e.exception.code, 2)
You can make the Mock object return different values and/or raise different exceptions on different calls by specifying an iterable as the side_effect attribute.
An excerpt from the documentation:
If you pass in an iterable, it is used to retrieve an iterator which
must yield a value on every call. This value can either be an
exception instance to be raised, or a value to be returned from the
call to the mock...
So your test code should roughly look like:
@patch("argo_probe_poem.poem_cert.requests.get")
def test_raise_request_exception(self, mock_requests):
    mock_requests.side_effect = pass_web_api, requests.exceptions.RequestException
    with self.assertRaises(requests.exceptions.RequestException) as e:
        utils_func(self.arguments)
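For illustration, here is a minimal, self-contained sketch of the iterable side_effect behavior (the fake response and the exception message are stand-ins, not your actual pass_web_api or utils_func):

from unittest.mock import patch, Mock
import requests

fake_response = Mock(status_code=200)

with patch("requests.get") as mock_get:
    # First call returns the fake response, second call raises.
    mock_get.side_effect = [fake_response, requests.exceptions.RequestException("boom")]

    first = requests.get("https://hostname_1")   # returns fake_response
    try:
        requests.get("https://host_2")           # raises RequestException
    except requests.exceptions.RequestException as error:
        print(error)                             # -> boom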
I have always wondered who takes control of the program after an exception has been thrown. I was looking for a clear answer but did not find one. I have the following functions, each of which executes an API call involving a network request, so I need to handle any possible errors with a try/except and possibly an else block (the JSON responses must be parsed/decoded as well):
# This function runs first; if this fails, none of the other functions will run. Should return a JSON.
def get_summary():
    pass

# Gets executed after get_summary. Should return a string.
def get_block_hash():
    pass

# Gets executed after get_block_hash. Should return a JSON.
def get_block():
    pass

# Gets executed after get_block. Should return a JSON.
def get_raw_transaction():
    pass
I wish to implement a kind of retry functionality in each function, so that if it fails due to a timeout error, connection error, JSON decode error, etc., it keeps retrying without compromising the flow of the program:
def get_summary():
    try:
        response = requests.get(API_URL_SUMMARY)
    except requests.exceptions.RequestException as error:
        logging.warning("...")
        #
    else:
        # Once response has been received, JSON should be
        # decoded here wrapped in a try/except/else
        # or outside of this block?
        return response.text

def get_block_hash():
    try:
        response = requests.get(API_URL + "...")
    except requests.exceptions.RequestException as error:
        logging.warning("...")
        #
    else:
        return response.text

def get_block():
    try:
        response = requests.get(API_URL + "...")
    except requests.exceptions.RequestException as error:
        logging.warning("...")
        #
    else:
        #
        #
        #
        return response.text

def get_raw_transaction():
    try:
        response = requests.get(API_URL + "...")
    except requests.exceptions.RequestException as error:
        logging.warning("...")
        #
    else:
        #
        #
        #
        return response.text
if __name__ == "__main__":
    # summary = get_summary()
    # block_hash = get_block_hash()
    # block = get_block()
    # raw_transaction = get_raw_transaction()
    # ...
I want to keep the outermost part (the block after if __name__ == "__main__":) clean; I mean, I don't want to fill it with confusing try/except blocks, logging, etc.
I tried having each function call itself when an exception was thrown, but then I read about the recursion limit and figured it was a bad idea; there should be a better way to handle this.
requests already retries by itself N times when I call the get method, where N is a constant in the source code (it is 100). But when the number of retries reaches 0, it throws an error I need to catch.
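(In case it matters, here is a rough sketch of how the retry count could be made explicit on a Session, using urllib3's Retry mounted on an HTTPAdapter; the numbers below are placeholders, not what I actually use:)

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(total=5, backoff_factor=0.5,
              status_forcelist=[500, 502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retry))

# session.get() now retries transparently before finally raising a RequestException
response = session.get(API_URL_SUMMARY)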
Where should I decode the JSON response? Inside each function, wrapped in another try/except/else block, or in the main block? How can I recover from an exception and keep retrying the function that failed?
Any advice would be appreciated.
You could keep those in an infinite loop (to avoid recursion) and return once you get the expected response:
def get_summary():
    while True:
        try:
            response = requests.get(API_URL_SUMMARY)
        except requests.exceptions.RequestException as error:
            logging.warning("...")
            #
        else:
            # As winklerrr points out, try to return the transformed data as soon
            # as possible, so you should be decoding the JSON response here.
            try:
                json_response = json.loads(response.text)
            except ValueError as error:  # ValueError will catch any error when decoding the response
                logging.warning(error)
            else:
                return json_response
This function keeps executing until it receives the expected result (i.e. reaches return json_response); otherwise it keeps trying again and again.
You can do the following:
def my_function(iteration_number=1):
    try:
        response = requests.get(API_URL_SUMMARY)
    except requests.exceptions.RequestException:
        if iteration_number < iteration_threshold:
            return my_function(iteration_number + 1)
        else:
            raise
    except Exception:  # for all other exceptions, raise
        raise
    return json.loads(response.text)

my_function()
Where should I decode JSON response?
Inside each function and wrapped by another try/catch/else block or in the main block?
As a rule of thumb: try to transform data as soon as possible into the format you want it to be in. It makes the rest of your code easier if you don't have to extract everything from a response object again all the time. So just return the data you need, in the easiest format you need it to be in.
In your scenario: you call that API in every function with the same call to requests.get(). Normally all the responses from an API have the same format. This means you could write an extra function which makes that API call for you and directly loads the response into a proper JSON object.
Tip: For working with JSON make use of the standard library with import json
Example:
import json

def call_api(api_sub_path):
    response = requests.get(API_BASE_URL + api_sub_path)
    json_response = json.loads(response.text)

    # you could verify your result here already, e.g.
    if json_response["result_status"] == "successful":
        return json_response["result"]

    # or maybe throw an exception here, depends on your use case
    return json_response["some_other_value"]
How can I recover from an exception and keep retrying the function that failed?
You could use a while loop for that:
def main(retries=100):  # default value if no value is given
    result = functions_that_could_fail(retries)
    if result:
        logging.info("Finished successfully")
        functions_that_depend_on_result_from_before(result)
    else:
        logging.info("Finished without result")

def functions_that_could_fail(retry):
    while retry:  # is True as long as retry is bigger than 0
        try:
            # call all functions here so you just have to write one try-except block
            summary = get_summary()
            block_hash = get_block_hash()
            block = get_block()
            raw_transaction = get_raw_transaction()
        except Exception:
            retry -= 1
            if retry:
                logging.warning("Failed, but trying again...")
        else:
            # else gets only executed when no exception was raised in the try block
            logging.info("Success")
            return summary, block_hash, block, raw_transaction

    logging.error("Failed - won't try again.")
    return None

def functions_that_depend_on_result_from_before(result):
    # [use result here ...]
    pass
So with the code from above, you (and maybe also other people who use your code) could start your program with:
if __name__ == "__main__":
    main()
    # or when you want to change the number of retries
    main(retries=50)
I use some Google API as follows:
def get_address(lat, lng):
    url = "https://maps.googleapis.com/maps/api/geocode/json?{}".\
        format(urllib.parse.urlencode(args))
    ...
    try:
        r = requests.get(url)
        ...
        return r
    except OSError as e:
        raise NetException(e.message, 400)
I try to use a try-except helper to handle the exception when the network fails.
The try-except expression is from here:
def try_except(success, failure, *exceptions):
    try:
        return success()
    except exceptions or Exception:
        return failure() if callable(failure) else failure
But when I try to use this helper, I always get the failure result for the HTTP call, even though I get a successful result if I simply run the success function:
>>> re=get_address(-33.865, 151.2094)
>>> re
'Sydney'
>>> r=try_except(get_address(-33.865, 151.2094),"")
>>> r
''
How can I make sure that a successful call yields the correct string result, while only a failed HTTP request yields the failure result?
You have to pass the function itself as the success argument. Currently, in
r = try_except(get_address(-33.865, 151.2094), "")
you are passing the result of get_address(-33.865, 151.2094), which is 'Sydney'. The actual error is raised when the helper tries to call success(), which translates to 'Sydney'() - something like TypeError: 'str' object is not callable.
The proper call would be:
r = try_except(lambda: get_address(-33.865, 151.2094), '')
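To make the difference concrete, here is a small self-contained sketch of the helper with a stand-in function (flaky is hypothetical, just to simulate a network failure):

def try_except(success, failure, *exceptions):
    try:
        return success()
    except exceptions or Exception:
        return failure() if callable(failure) else failure

def flaky():
    # stand-in for a request that fails at the network level
    raise OSError("network down")

print(try_except(lambda: "Sydney", ""))   # 'Sydney' - success() is only called inside the helper
print(try_except(flaky, ""))              # ''       - the exception is swallowed, failure is returned
print(try_except(flaky, "", OSError))     # ''       - only OSError is caught here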
I was wondering if it's possible to write exception handling with two or more except clauses that perform different tasks.
I'm using Django==1.6.1 and Python 2.7.
try:
    foo_instance = foo.objects.get(field_name='unknown')
except foo.DoesNotExist:
    new_rec = foo.objects.create(field_name='unknown')
    new_rec.save()
    foo_instance = foo.objects.get(field_name='unknown')
except foo.MultipleObjectsReturned:
    foo_list = foo.objects.filter(field_name='unknown')
    for record in foo_list[1:]:
        print 'Deleting foo id: ', record.id
        record.delete()
    foo_instance = foo.objects.get(field_name='unknown')
You could use multiple except clauses like that, but in your current scenario, why don't you use get_or_create?
A try/except can also catch all remaining errors under Exception; the syntax for that is:
except Exception as e:
get_or_create(defaults=None, **kwargs)
A convenience method for looking up an object with the given kwargs
(may be empty if your model has defaults for all fields), creating one
if necessary.
Returns a tuple of (object, created), where object is the retrieved or
created object and created is a boolean specifying whether a new
object was created.
This reduces your code above to:
obj, created = foo.objects.get_or_create(field_name='unknown')
if created:
    obj.save()
I think get_or_create can raise IntegrityError or MultipleObjectsReturned; to handle those, simply wrap it in a try:
try:
    obj, created = foo.objects.get_or_create(field_name='unknown')
    if created:
        obj.save()
except IntegrityError:
    # do something
    pass
except MultipleObjectsReturned:
    # do something else
    pass
except Exception as e:
    raise e
If I catch a KeyError, how can I tell what lookup failed?
def poijson2xml(location_node, POI_JSON):
    try:
        man_json = POI_JSON["FastestMan"]
        woman_json = POI_JSON["FastestWoman"]
    except KeyError:
        # How can I tell what key ("FastestMan" or "FastestWoman") caused the error?
        LogErrorMessage("POIJSON2XML", "Can't find mandatory key in JSON")
Catch the current exception (I bind it as e in this case); for a KeyError, the first argument is the key that raised the exception. Therefore we can do:
except KeyError as e:  # One would write it as 'except KeyError, e:' in Python 2.
    cause = e.args[0]
With that, you have the offending key stored in cause.
Expanding your sample code, your log might look like this:
def poijson2xml(location_node, POI_JSON):
    try:
        man_json = POI_JSON["FastestMan"]
        woman_json = POI_JSON["FastestWoman"]
    except KeyError as e:
        LogErrorMessage("POIJSON2XML",
                        "Can't find mandatory key '" + e.args[0] + "' in JSON")
It should be noted that e.message works in Python 2 but not Python 3, so it shouldn't be used.
Not sure if you're using any modules to assist you - if the JSON is coming in as a dict, one can use dict.get() towards a useful end.
def POIJSON2DOM(location_node, POI_JSON):
    man_JSON = POI_JSON.get("FastestMan", 'No Data for fastest man')
    woman_JSON = POI_JSON.get("FastestWoman", 'No Data for fastest woman')
    # work with the answers as you see fit
dict.get() takes two arguments - the first being the key you want, the second being the value to return if that key does not exist.
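A quick illustration with made-up data (the keys and values here are just placeholders):

poi = {"FastestMan": {"name": "example runner"}}

poi.get("FastestMan", "No Data for fastest man")      # -> {'name': 'example runner'}
poi.get("FastestWoman", "No Data for fastest woman")  # -> 'No Data for fastest woman'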
If you import the sys module you can get exception info with sys.exc_info()
like this:
import sys

def POIJSON2DOM(location_node, POI_JSON):
    try:
        man_JSON = POI_JSON["FastestMan"]
        woman_JSON = POI_JSON["FastestWoman"]
    except KeyError:
        # you can inspect these variables for error information
        err_type, err_value, err_traceback = sys.exc_info()
        REDI.LogErrorMessage("POIJSON2DOM", "Can't find mandatory key in JSON")
I've created a little class for parsing websites.
It handles a URLError exception:
def visit(self, url, referer=None, data=None):
    (...)
    # Return BeautifulSoup instance.
    try:
        return BeautifulSoup(self.opener.open(self.request))
    # URLError.
    except urllib.error.URLError as error:
        return error
Everything works okay, but I need to create a wrapper for this function.
def get_links(self, *args, **kwargs):
    # Get links with BeautifulSoup.
    self.links = self.visit(*args, **kwargs).find_all('a')
The get_links function also works well until there is a URLError (403, 404, whatever...). How can I solve this problem? Is there something like exception inheritance?
Your visit() function catches the exception and returns you a URLError object, on which you're then calling find_all(), which it doesn't have.
Something along the lines of:
self.links = self.visit(*args, **kwargs)
if not isinstance(self.links, urllib.error.URLError):
    self.links = self.links.find_all('a')
else:
    # Do something with an HTTP error
    pass
That should give you an idea of the flow. You can't catch that exception in your outer get_links() because it's already caught by visit() and is simply returned.
If you want to catch it in get_links(), change
return error
to
raise error
in your visit() method, although then you'll be re-raising the exception you just caught; I'm not sure whether that is the behavior you want.
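For instance, a minimal sketch of what that could look like (assuming you want get_links() to handle the error itself; the empty-list fallback is just one possible choice):

def visit(self, url, referer=None, data=None):
    # ...
    try:
        return BeautifulSoup(self.opener.open(self.request))
    except urllib.error.URLError as error:
        raise error  # re-raise instead of returning the error object

def get_links(self, *args, **kwargs):
    try:
        self.links = self.visit(*args, **kwargs).find_all('a')
    except urllib.error.URLError as error:
        # Handle (or log) the HTTP error here.
        self.links = []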