Viewing POST data with Python Requests Module

I have this example program, but currently it doesn't show the POST data.
import requests
r = requests.post('https://requestb.in/12p8nqo1',data={'key':'value'})
print(r.text)
The output is just:
'ok'
Why doesn't it print "key:value"?
Thanks

To access the request data you must deal with the underlying PreparedRequest object, like so:
import requests
r = requests.post('https://requestb.in/12p8nqo1',data={'key':'value'})
print(r.request.body)
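For reference, here is a quick sketch of what else you can inspect on that prepared request. requestb.in is used in the question, but any endpoint that accepts a POST will do; httpbin.org is used below purely as a stand-in:
import requests

# Inspect the outgoing request after it has been sent.
r = requests.post('https://httpbin.org/post', data={'key': 'value'})

print(r.request.body)     # the form-encoded payload, e.g. 'key=value'
print(r.request.headers)  # headers that were actually sent
print(r.text)             # whatever the server chose to echo back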

Related

What parameters need to be put in the code in order for the requests.post function to execute correctly?

What parameters do I need to pass for this site (www.pyszne.pl) so that the request executes properly? I need a URL that leads to the restaurants available under a specific postcode.
Here is my code:
import requests
payload = {'myvaluestring':'30-529'}
r = requests.post('https://www.pyszne.pl', data=payload)
print(r.url)
I only receive the main page URL https://www.pyszne.pl/ back.
It's a GET situation, not POST. Try this:
In [1]: import requests
In [2]: r = requests.get("https://www.pyszne.pl/30-529")
In [3]: r.url
Out[3]: 'https://www.pyszne.pl/restauracja-krakow-krakow-podgorze-30-529'
I recommend searching the web for "what's the difference between HTTP POST and GET".
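To make the GET/POST difference concrete, here is a minimal sketch against httpbin.org (a neutral test endpoint, not pyszne.pl):
import requests

# GET: parameters travel in the URL query string
get_r = requests.get('https://httpbin.org/get', params={'q': '30-529'})
print(get_r.url)   # https://httpbin.org/get?q=30-529

# POST: the payload travels in the request body, not the URL
post_r = requests.post('https://httpbin.org/post', data={'q': '30-529'})
print(post_r.url)  # https://httpbin.org/post  (no query string)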

Accessing Elasticsearch with Python 3

I want to use the Python 3 module urllib to access an Elasticsearch database at localhost:9200. My script gets a valid request (generated by Kibana) piped to STDIN in JSON format.
Here is what I did:
import json
import sys
import urllib.parse
import urllib.request
er = json.load(sys.stdin)
data = urllib.parse.urlencode(er)
data = data.encode('ascii')
uri = urllib.request.Request('http://localhost:9200/_search', data)
with urllib.request.urlopen(uri) as response:
    response.read()
(I understand that my response.read() doesn't make much sense by itself, but I just wanted to keep it simple.)
When I execute the script, I get an
HTTP Error 400: Bad request
I am very sure that the JSON data I'm piping to the script is correct, since I printed it and fed it via curl to Elasticsearch, and got back the documents I expected.
Any ideas where I went wrong? Am I using urllib correctly? Am I maybe messing up the JSON data in the urlencode line? Am I querying Elasticsearch correctly?
Thanks for your help.
With requests you can do one of two things:
1) Either you create the string representation of the JSON object yourself and send it off like so:
payload = {'param': 'value'}
response = requests.post(url, data=json.dumps(payload))
2) Or you have requests do it for you like so:
payload = {'param': 'value'}
response = requests.post(url, json=payload)
Which one fits depends on what actually comes out of sys.stdin. Since Kibana would be sending a string representation of a JSON object if the target were Elasticsearch (the equivalent of calling json.dumps on a dictionary), option 1 is the likely fit, but you might have to adjust a bit depending on the output of sys.stdin.
My guess is that your code could work by just doing so:
import sys
import requests
payload = sys.stdin
response = requests.post('http://localhost:9200/_search', data=payload)
And if you then want to do some work with the response in Python, requests has built-in support for this too. You just call:
json_response = response.json()
Hope this helps you on the right track. For further reading on json.dumps/loads, this answer has some good material on it.
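Putting that together with the stdin part of the question, a minimal sketch might look like this (it assumes stdin holds a single JSON document, as described in the question):
import json
import sys
import requests

# Parse the JSON query piped in on stdin into a Python dict, then let
# requests serialize it again via the json= parameter, which also sets
# the Content-Type: application/json header.
query = json.load(sys.stdin)
response = requests.post('http://localhost:9200/_search', json=query)
print(response.json())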
For anyone who doesn't want to use requests (for example if you're using IronPython, where it's not supported):
import urllib2
import json
req = urllib2.Request(url, json.dumps(data), headers={'Content-Type': 'application/json'})
response = urllib2.urlopen(req)
Where url can be something like this (the example below searches within an index):
http://<elasticsearch-ip>:9200/<index-name>/_search/
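As a self-contained sketch of that urllib2 approach, assuming a placeholder index called my-index and a simple match_all query:
import json
import urllib2

# Placeholder index name and query body -- adjust for your own cluster.
url = 'http://localhost:9200/my-index/_search'
data = {'query': {'match_all': {}}}

req = urllib2.Request(url, json.dumps(data),
                      headers={'Content-Type': 'application/json'})
response = urllib2.urlopen(req)
print json.loads(response.read())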

When calling a REST API from Python 2.7 requests, it responds with "reason" but I don't see that in my API

When I call the Socialcast API from Python 2.7 using requests, I get a response of "reason", but I don't see that text in my actual API. Here's my code:
import requests
parameters = {"username": "myUsername", "password": "myPassword"}
response = requests.get("https://hub.sas.com/api/groups/808/messages.json", parameters)
response.json()
The beginning of the JSON that I'm passing through is this:
{"messages":[{"id":126433,"user":{"id":4468,"name":
So I would expect something else to come back, but what it returns is:
{u'reason': u''}
Is this an error or is there something I'm not understanding?
I solved my problem by using this code:
import requests
from requests.auth import HTTPBasicAuth
r = requests.get('https://hub.sas.com/api/groups/808/messages.json', auth=HTTPBasicAuth('username', 'password'))
data = r.json()
for message in data['messages']:
    print(message['user']['name'])
I'm not sure whether the from requests.auth import HTTPBasicAuth or the data = r.json() lines were necessary, but it ended up working for me, so I left them in.
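As a side note, the explicit HTTPBasicAuth import is optional: requests also accepts a plain (username, password) tuple for basic auth, so a trimmed-down sketch of the same call would be:
import requests

r = requests.get('https://hub.sas.com/api/groups/808/messages.json',
                 auth=('username', 'password'))
data = r.json()  # parses the JSON body into a Python dict
for message in data['messages']:
    print(message['user']['name'])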

How do I display an API's data with Python?

I am quite new to using APIs and working with Python. What I want to achieve is to display the data I receive from my JSON request as HTML on my site with Python. How do I go about displaying the data? For now I have only generated a request to the API and received a JSON response.
# Import the modules
import requests
import json
# Get the feed
rtrans = requests.get("https://42matters.com/api/1/apps/top_google_charts.json?list_name=topselling_free&cat_key=TRANSPORTATION&country=DK&limit=10&access_token=f033114ffaa48a2d31139bd1eb55d9fc54ed6729")
rtrans.text
# Convert it to a Python dictionary
datatransportation = json.loads(rtrans.text)
print datatransportation
My code looks like this currently.
I usually use the urllib2 library when I access API data, but the requests library is very similar.
Here is the code I would use with the urllib2 library:
import urllib2
import json
access_token = "<YOUR ACCESS TOKEN>"
url_address = "https://42matters.com/api/1/apps/top_google_charts.json?list_name=topselling_free&cat_key=TRANSPORTATION&country=DK&limit=10&access_token=" + access_token
url_content_as_text = urllib2.urlopen(url_address).read()
url_content_as_json = json.loads(url_content_as_text)
print url_content_as_json
Here is the code I would use with the requests library:
import requests
access_token = "<YOUR ACCESS TOKEN>"
url_address = "https://42matters.com/api/1/apps/top_google_charts.json?list_name=topselling_free&cat_key=TRANSPORTATION&country=DK&limit=10&access_token=" + access_token
url_content = requests.get(url_address)
print url_content.json()
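Since the question also asks about displaying the data as HTML, one minimal sketch is to loop over the parsed JSON and build an HTML list by hand. The key names used here ('appList', 'title') are assumptions about the 42matters response shape and may need adjusting:
import requests

access_token = "<YOUR ACCESS TOKEN>"
url_address = ("https://42matters.com/api/1/apps/top_google_charts.json"
               "?list_name=topselling_free&cat_key=TRANSPORTATION"
               "&country=DK&limit=10&access_token=" + access_token)
data = requests.get(url_address).json()

# 'appList' and 'title' are assumed field names -- print the raw dict
# first to see what the API actually returns.
items = "".join("<li>%s</li>" % app.get("title", "")
                for app in data.get("appList", []))
print "<ul>%s</ul>" % items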

Connecting to API with python 2.7 and urllib2

How can I connect to an API using Python 2.7? I recently tried urllib2.urlopen('pastedUrl with APIkey') and it is not working. When I try this, nothing happens; it just freezes.
import urllib2
import json
# api key is not a real api key
locu_api = '12345'
url = 'https://api.locu.com/v1_0/venue/search/?has_menu=TRUE&locality=Austin&api_key=locu_api'
json_obj = urllib2.urlopen(url)
data = json.load(json_obj)
print data
Update 12/20/15:
I didn't want to put my API key in there, so I made a variable called "locu_api". But here it is; this is exactly what I have in my code:
import urllib2
import json
locu_api = '6252bab312fd63a8b43f273bbbc5b8ae973d982'
url = 'https://api.locu.com/v1_0/venue/search/?has_menu=TRUE&locality=Austin&api_key=6252bab312fd63a8b43f273bbbc5b8ae973d982'
json_obj = urllib2.urlopen(url)
data = json.load(json_obj)
print data
The problem with your code is that you are not using your API key anywhere; the URL should look like:
url = 'https://api.locu.com/v1_0/venue/search/?has_menu=TRUE&locality=Austin&api_key={}'.format(locu_api)
With your current request you should get an HTTP Error 401. If your application really freezes, there is a problem with your connection.
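Putting the fix together, and adding a timeout so the call fails fast instead of appearing to freeze, a sketch of the corrected snippet might look like this:
import urllib2
import json

locu_api = '12345'  # your real key here
url = ('https://api.locu.com/v1_0/venue/search/'
       '?has_menu=TRUE&locality=Austin&api_key={}'.format(locu_api))

# timeout (in seconds) makes urlopen raise instead of hanging indefinitely
json_obj = urllib2.urlopen(url, timeout=10)
data = json.load(json_obj)
print data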
