This is a test script to request data from the Rovi API, based on the sample provided by the API itself.
test.py
import requests
import time
import hashlib
import urllib


class AllMusicGuide(object):

    api_url = 'http://api.rovicorp.com/data/v1.1/descriptor/musicmoods'
    key = 'my key'
    secret = 'secret'

    def _sig(self):
        timestamp = int(time.time())
        m = hashlib.md5()
        m.update(self.key)
        m.update(self.secret)
        m.update(str(timestamp))
        return m.hexdigest()

    def get(self, resource, params=None):
        """Take a dict of params, and return what we get from the api"""
        if not params:
            params = {}
        params = urllib.urlencode(params)
        sig = self._sig()
        url = "%s/%s?apikey=%s&sig=%s&%s" % (self.api_url, resource, self.key, sig, params)
        resp = requests.get(url)
        if resp.status_code != 200:
            # THROW APPROPRIATE ERROR
            print ('unknown err')
        return resp.content
From another script I import the module:
from roviclient.test import AllMusicGuide
and create an instance of the class inside a mood function:
def mood():
    test = AllMusicGuide()
    print (test.get('[moodids=moodids]'))
According to the documentation, the following is the syntax for requests:
descriptor/musicmoods?apikey=apikey&sig=sig [&moodids=moodids] [&format=format] [&country=country] [&language=language]
But running the script I get the following error:
unknown err
<h1>Gateway Timeout</h1>:
What is wrong?
"504, try once more. 502, it went through."
Your code is fine; this is a network issue. "Gateway Timeout" is a 504: the intermediate host handling your request was unable to complete it. It made its own request to another server on your behalf in order to handle yours, but that request took too long and timed out. Usually this is due to network congestion on the backend; if you try a few more times, does it sometimes work?
In any case, I would talk to your network administrator. There could be any number of reasons for this, and they should be able to help you fix it.
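If the 504s are intermittent, a client-side retry can also paper over them while the backend issue gets investigated. Here is a minimal sketch using urllib3's Retry through a requests adapter; the retry counts, backoff, and the placeholder URL are illustrative and not part of the original script:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry transient gateway errors (502/503/504) a few times with backoff.
retries = Retry(total=3, backoff_factor=1, status_forcelist=[502, 503, 504])
session = requests.Session()
session.mount('http://', HTTPAdapter(max_retries=retries))
session.mount('https://', HTTPAdapter(max_retries=retries))

# Build the URL the same way test.py does; this one is only a placeholder.
url = 'http://api.rovicorp.com/data/v1.1/descriptor/musicmoods/...'
resp = session.get(url)
print(resp.status_code)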
I'm trying to "wrap" Google Python Client for AI Platform (Unified) into a Cloud Function.
import json

from google.cloud import aiplatform
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value


def infer(request):
    """Responds to any HTTP request.
    Args:
        request (flask.Request): HTTP request object.
    Returns:
        The response text or any set of values that can be turned into a
        Response object using
        `make_response <http://flask.pocoo.org/docs/1.0/api/#flask.Flask.make_response>`.
    """
    request_json = request.get_json()

    project = "simple-1234"
    endpoint_id = "7106293183897665536"
    location = "europe-west4"
    api_endpoint = "europe-west4-aiplatform.googleapis.com"

    # The AI Platform services require regional API endpoints.
    client_options = {"api_endpoint": api_endpoint}
    # Initialize client that will be used to create and send requests.
    # This client only needs to be created once, and can be reused for multiple requests.
    client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)

    # for more info on the instance schema, please use get_model_sample.py
    # and look at the yaml found in instance_schema_uri
    endpoint = client.endpoint_path(
        project=project, location=location, endpoint=endpoint_id
    )

    instance = request.json["instances"]
    instances = [instance]
    parameters_dict = {}
    parameters = json_format.ParseDict(parameters_dict, Value())

    try:
        response = client.predict(endpoint=endpoint, instances=instances, parameters=parameters)
        if 'error' in response:
            return (json.dumps({"msg": 'Error during prediction'}), 500)
    except Exception as e:
        print("Exception when calling predict: ", e)
        return (json.dumps({"msg": 'Exception when calling predict'}), 500)

    print(" deployed_model_id:", response.deployed_model_id)
    # See gs://google-cloud-aiplatform/schema/predict/prediction/tables_classification.yaml for the format of the predictions.
    predictions = response.predictions
    for prediction in predictions:
        print(" prediction:", dict(prediction))

    return (json.dumps({"prediction": response['predictions']}), 200)
When calling client.predict() I'm getting a 400 error:
{"error": "Required property Values is not found"}
What am I doing wrong?
I believe your parameters variable is not correct. In the documentation example, that variable is set like this, as an example:
parameters = predict.params.ImageClassificationPredictionParams(
    confidence_threshold=0.5, max_predictions=5,
).to_value()
This is probably why the error says the properties are not found. You will have to set your own parameters and then call the predict method.
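For completeness, here is how the documented samples wire that up for an AutoML image classification endpoint. This is only a sketch built from the snippet above: the threshold values are examples, client, endpoint and instances are the ones already defined in the Cloud Function, and if your model is a different type you should check the matching sample for the parameters it expects.

from google.cloud.aiplatform.gapic.schema import predict

# Build a typed parameters object and convert it to a protobuf Value,
# instead of parsing an empty dict with json_format.ParseDict.
parameters = predict.params.ImageClassificationPredictionParams(
    confidence_threshold=0.5,
    max_predictions=5,
).to_value()

# client, endpoint and instances come from the Cloud Function in the question.
response = client.predict(endpoint=endpoint, instances=instances, parameters=parameters)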
I'm working with a pretty old server which, for some reason, requires cookies to be sent in the following format (RFC 2109 section 4.4). Raw Cookie header:
Cookie: $Version="1"; temp="1234567890";$Path="/";$Domain=".example.com"; session="abcdefgh";$Path="/";$Domain=".example.com"; id="00001111";$Path="/";$Domain=".example.com"
I know that there's a way to implement that formatting manually using a prepared request, but maybe there's some other method?
I'm using Python requests.
Code which obviously doesn't work as expected:
from requests import Session
from datetime import datetime, timedelta
from http.cookiejar import DefaultCookiePolicy
sess = Session()
sess.cookies.set_policy(DefaultCookiePolicy(rfc2109_as_netscape=True))
sess.cookies.set(name="temp", value="1234567890", domain=".httpbin.org", path="/", expires=int((datetime.now() + timedelta(days=365)).timestamp()))
sess.cookies.set(name="session", value="abcdefgh", domain=".httpbin.org", path="/", expires=int((datetime.now() + timedelta(days=365)).timestamp()))
sess.cookies.set(name="id", value="00001111", domain=".httpbin.org", path="/", expires=int((datetime.now() + timedelta(days=365)).timestamp()))
resp = sess.get("https://httpbin.org/headers")
print(resp.json())
Update:
I've tried setting the cookie policy to this old standard, but it changed nothing.
Unfortunately, I haven't found any other way except using a prepared request. Sharing my code for future readers.
Code:
from requests import Session, Request


def dumb_cookies_request(*args, **kwargs):
    session = kwargs.pop('session') if 'session' in kwargs else Session()
    req = Request(*args, **kwargs)
    prepped = session.prepare_request(req)
    if len(session.cookies) > 0:
        cookies = ['$Version="1"']
        for c in session.cookies:
            cookies.append(f'{c.name}="{c.value}";$Path="{c.path}";$Domain="{c.domain}"')
        prepped.headers['Cookie'] = '; '.join(cookies)
    return session.send(prepped)
Usage:
sess = Session()
# some actions
resp = dumb_cookies_request("GET", "https://httpbin.org/headers", session=sess)
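A variation on the same idea, if you don't want to wrap every call: build the header once and put it in the session's default headers. This is only a sketch and assumes the session's own cookie jar stays empty, because requests regenerates the Cookie header from the jar when it prepares a request:

from requests import Session

sess = Session()

# Example cookie values from the question; keep sess.cookies empty so
# requests doesn't overwrite this header with one built from the jar.
cookies = [
    ('temp', '1234567890'),
    ('session', 'abcdefgh'),
    ('id', '00001111'),
]
parts = ['$Version="1"']
for name, value in cookies:
    parts.append(f'{name}="{value}";$Path="/";$Domain=".example.com"')
sess.headers['Cookie'] = '; '.join(parts)

resp = sess.get("https://httpbin.org/headers")
print(resp.json())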
I am building a custom responder for Cortex/TheHive, and I have been unable to get my request's response to convert properly into JSON. When testing in the local development environment, the code in my getCount function works flawlessly, but when I add the Cortex responder wrapper to it, my code fails.
Since the responder runs from Cortex, I do not receive an error message beyond "input: null", so I had to write to an error log. Using this log, I determined that the error stems from the line data = json.loads(response.text). I tried using simplejson, regex-ing the desired value out of response.text, changing encoding methods, and banging my head on the keyboard from the sheer stupidity of it not working.
CODE:
import requests
import json
from cortexutils.responder import Responder


class Search(Responder):
    def __init__(self):
        # Debug
        with open('/xxx/xxx/xxx/xxx/error.log', 'a') as error_log:
            error_log.write('Starting: \n')
            error_log.write('\n\n')
        # End Debug
        Responder.__init__(self)
        self.apiuser = self.get_param('config.api_user', None)
        self.apikey = self.get_param('config.api_key', None)
        self.url = self.get_param('config.api_url', None)
        self.ioc_type = self.get_param('data.dataType', None)
        self.ioc = self.get_param('data.data', None)
        # Debug
        with open('/xxx/xxx/xxx/xxx/error.log', 'a') as error_log:
            error_log.write('User ID: \n')
            error_log.write(self.apiuser)
            error_log.write('\n\nSecret: \n')
            error_log.write(self.apikey)
            error_log.write('\n\n')
            error_log.write('IOC Type: \n')
            error_log.write(self.ioc_type)
            error_log.write('\n\n')
            error_log.write('Value: \n')
            error_log.write(self.ioc)
            error_log.write('\n\n')
        # End Debug

    def getCount(self):
        with open('/xxx/xxx/xxx/xxx/error.log', 'a') as error_log:
            error_log.write('Starting Count: \n')
            error_log.write('\n\n')
        url = self.url
        headers = {'Content-Type': 'application/json'}
        params = {'type': self.ioc_type, 'value': self.ioc}
        response = requests.request("GET", url, headers=headers, params=params, auth=(self.apiuser, self.apikey))
        data = json.loads(response.text)
        with open('/xxx/xxx/xxx/xxx/error.log', 'a') as error_log:
            error_log.write('Response: ')
            error_log.write(data)
        deviceCount = data['resources'][0]['device_count']
        self.count = deviceCount

    def run(self):
        Responder.run(self)
        self.getCount()
        self.operations()

    def operations(self):
        return [self.build_operation('AddTagToCase', tag=self.count)]


if __name__ == '__main__':
    Search().run()
Results from response.text:
{"meta":{"query_time":0.014920091,"trace_id":"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"},
"resources":[{"id":"sha256:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"type":"sha256",
"value":"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"device_count":11}],"errors":[]}
Error Logging results:
Starting:
User ID:
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Secret:
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
IOC Type:
sha256
Value:
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
run starting:
Starting Count:
self.count should be equal to device_count, but json.loads fails to parse this response. This can be seen in my error log results, where getCount() starts but abruptly ends before any data is written.
If you could provide any insight into why this fails to parse the response properly, please shed some light.
Thank you
This is my standard way of getting a JSON response from an API using the requests module:
response = requests.get(url, headers=headers, params=params, auth=(self.apiuser, self.apikey)).json()
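Applied to the responder in the question, getCount could look roughly like this. This is only a sketch: it keeps the question's field names and log path, and serializes the parsed dict before writing it, since file.write() expects a string rather than a dict:

import json
import requests

def getCount(self):
    response = requests.get(
        self.url,
        headers={'Content-Type': 'application/json'},
        params={'type': self.ioc_type, 'value': self.ioc},
        auth=(self.apiuser, self.apikey),
    )
    data = response.json()  # parsed straight from the response body
    with open('/xxx/xxx/xxx/xxx/error.log', 'a') as error_log:
        error_log.write('Response: ')
        error_log.write(json.dumps(data))  # dicts must be serialized before writing
        error_log.write('\n\n')
    self.count = data['resources'][0]['device_count']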
I recently started using Flask in one of my projects to provide data via a simple route. So far I return a JSON file containing the data and some other information. When running my Flask app I see the status code of each request in the terminal. I would like to return that status code as part of my final JSON. Is it possible to catch the same code I see in the terminal?
A simple example might look like this:
from flask import Flask
from flask import jsonify

app = Flask(__name__)


@app.route('/test/<int1>/<int2>/')
def test(int1, int2):
    int_sum = int1 + int2
    return jsonify({"result": int_sum})


if __name__ == '__main__':
    app.run(port=8082)
And in terminal I get:
You are the one who sets the response code (by default 200 on a successful response); you can't catch this value before the response is emitted. But if you know the result of your operation, you can put it in the final JSON.
@app.route('/test/<int1>/<int2>/')
def test(int1, int2):
    int_sum = int1 + int2
    response_data = {
        "result": int_sum,
        "success": True,
        "status_code": 200
    }
    # make sure the status_code in your json and in the return statement match.
    return jsonify(response_data), 200  # <- the status code displayed in the console
By the way, if you access this endpoint from a request library, the response object exposes the status_code and all the related HTTP data, plus the JSON you need.
Python requests library example
import requests

req = requests.get('your.domain/test/3/3')
print(req.url)          # your.domain/test/3/3
print(req.status_code)  # 200
print(req.json())       # {'result': 6, 'status_code': 200, 'success': True}
You can send an HTTP status code as follows:
@app.route('/test')
def test():
    status_code = 200
    return jsonify({'name': 'Nabin Khadka'}), status_code  # notice the status code as the second element of the returned tuple
This way you can control what status code to return to the client (typically a web browser).
I am using Python's requests library in one method of my application. The body of the method looks like this:
def handle_remote_file(url, **kwargs):
    response = requests.get(url, ...)
    buff = StringIO.StringIO()
    buff.write(response.content)
    ...
    return True
I'd like to write some unit tests for that method; however, what I want to do is pass a fake local URL such as:
class RemoteTest(TestCase):

    def setUp(self):
        self.url = 'file:///tmp/dummy.txt'

    def test_handle_remote_file(self):
        self.assertTrue(handle_remote_file(self.url))
When I call requests.get with a local URL, I get the KeyError exception below:
requests.get('file:///tmp/dummy.txt')
/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/requests/packages/urllib3/poolmanager.pyc in connection_from_host(self, host, port, scheme)
76
77 # Make a fresh ConnectionPool of the desired type
78 pool_cls = pool_classes_by_scheme[scheme]
79 pool = pool_cls(host, port, **self.connection_pool_kw)
80
KeyError: 'file'
The question is: how can I pass a local URL to requests.get?
PS: I made up the above example. It possibly contains many errors.
As @WooParadog explained, the requests library doesn't know how to handle local files. However, the current version allows you to define transport adapters.
Therefore you can simply define your own adapter which will be able to handle local files, e.g.:
import os

import requests
from requests_testadapter import Resp


class LocalFileAdapter(requests.adapters.HTTPAdapter):
    def build_response_from_file(self, request):
        file_path = request.url[7:]
        with open(file_path, 'rb') as file:
            buff = bytearray(os.path.getsize(file_path))
            file.readinto(buff)
            resp = Resp(buff)
            r = self.build_response(request, resp)
            return r

    def send(self, request, stream=False, timeout=None,
             verify=True, cert=None, proxies=None):
        return self.build_response_from_file(request)


requests_session = requests.session()
requests_session.mount('file://', LocalFileAdapter())
requests_session.get('file://<some_local_path>')
I'm using the requests-testadapter module in the above example.
Here's a transport adapter I wrote which is more featureful than b1r3k's and has no additional dependencies beyond Requests itself. I haven't tested it exhaustively yet, but what I have tried seems to be bug-free.
import requests
import os, sys

if sys.version_info.major < 3:
    from urllib import url2pathname
else:
    from urllib.request import url2pathname


class LocalFileAdapter(requests.adapters.BaseAdapter):
    """Protocol Adapter to allow Requests to GET file:// URLs

    @todo: Properly handle non-empty hostname portions.
    """

    @staticmethod
    def _chkpath(method, path):
        """Return an HTTP status for the given filesystem path."""
        if method.lower() in ('put', 'delete'):
            return 501, "Not Implemented"  # TODO
        elif method.lower() not in ('get', 'head'):
            return 405, "Method Not Allowed"
        elif os.path.isdir(path):
            return 400, "Path Not A File"
        elif not os.path.isfile(path):
            return 404, "File Not Found"
        elif not os.access(path, os.R_OK):
            return 403, "Access Denied"
        else:
            return 200, "OK"

    def send(self, req, **kwargs):  # pylint: disable=unused-argument
        """Return the file specified by the given request

        @type req: C{PreparedRequest}
        @todo: Should I bother filling `response.headers` and processing
               If-Modified-Since and friends using `os.stat`?
        """
        path = os.path.normcase(os.path.normpath(url2pathname(req.path_url)))
        response = requests.Response()

        response.status_code, response.reason = self._chkpath(req.method, path)
        if response.status_code == 200 and req.method.lower() != 'head':
            try:
                response.raw = open(path, 'rb')
            except (OSError, IOError) as err:
                response.status_code = 500
                response.reason = str(err)

        if isinstance(req.url, bytes):
            response.url = req.url.decode('utf-8')
        else:
            response.url = req.url

        response.request = req
        response.connection = self

        return response

    def close(self):
        pass
(Despite the name, it was completely written before I thought to check Google, so it has nothing to do with b1r3k's.) As with the other answer, follow this with:
requests_session = requests.session()
requests_session.mount('file://', LocalFileAdapter())
r = requests_session.get('file:///path/to/your/file')
The easiest way seems to be using requests-file.
https://github.com/dashea/requests-file (available through PyPI too)
"Requests-File is a transport adapter for use with the Requests Python library to allow local filesystem access via file:// URLs."
This in combination with requests-html is pure magic :)
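Usage is just a couple of lines (a sketch following the requests-file README; install it with pip install requests-file):

import requests
from requests_file import FileAdapter

s = requests.Session()
s.mount('file://', FileAdapter())

# Any file:// URL now goes through the adapter instead of urllib3.
resp = s.get('file:///path/to/your/file')
print(resp.text)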
packages/urllib3/poolmanager.py pretty much explains it. Requests doesn't support local URLs:
pool_classes_by_scheme = {
    'http': HTTPConnectionPool,
    'https': HTTPSConnectionPool,
}
In a recent project, I had the same issue. Since requests doesn't support the "file" scheme, I decided to patch our code to load the content locally. First, I define a function to replace requests.get:
import six


def local_get(self, url):
    "Fetch a stream from local files."
    p_url = six.moves.urllib.parse.urlparse(url)
    if p_url.scheme != 'file':
        raise ValueError("Expected file scheme")

    filename = six.moves.urllib.request.url2pathname(p_url.path)
    return open(filename, 'rb')
Then, somewhere in test setup or decorating the test function, I use mock.patch to patch the get function on requests:
@mock.patch('requests.get', local_get)
def test_handle_remote_file(self):
    ...
This technique is somewhat brittle -- it doesn't help if the underlying code calls requests.request or constructs a Session and calls that. There may be a way to patch requests at a lower level to support file: URLs, but in my initial investigation, there didn't seem to be an obvious hook point, so I went with this simpler approach.
To load a file from a local URL, e.g. an image file, you can do this:
import urllib.request

from PIL import Image

Image.open(urllib.request.urlopen('file:///path/to/your/file.png'))
I think a simple solution for this is to create a temporary HTTP server using Python and use it.
Put all your files in a temporary folder, e.g. tempFolder.
Go to that directory and start a temporary HTTP server in terminal/cmd, as per your OS, with the command python -m http.server 8000 (note: 8000 is the port number).
This will give you a link to the HTTP server. You can access it at http://127.0.0.1:8000/
Open your desired file in the browser and copy that link to use as your URL; see the sketch below.
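For tests you can drive the same idea entirely from Python, serving the temporary folder in a background thread. This is a sketch assuming Python 3.7+ (for the directory argument); the folder name and port are illustrative:

import functools
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler

import requests

# Serve ./tempFolder on localhost:8000 in a background thread.
handler = functools.partial(SimpleHTTPRequestHandler, directory='tempFolder')
server = HTTPServer(('127.0.0.1', 8000), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Plain requests now works against the local files.
resp = requests.get('http://127.0.0.1:8000/dummy.txt')
print(resp.status_code, resp.text)

server.shutdown()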