Translating a cURL call to a Python script

I'm having trouble converting a simple cURL request (run on Windows) to a Python script.
The cURL command is:
curl -X POST -d "{\"query\": \"NEW YORK\"}" http://192.168.0.106:8080/parser
I get the output:
[{"label":"state","value":"new york"}]
The Python script is
from urllib.parse import urlencode
from urllib.request import Request, urlopen
url = 'http://192.168.0.106:8080/parser' # Set destination URL here
post_fields = {"query": "NEW YORK"} # Set POST fields here
request = Request(url, urlencode(post_fields).encode())
json = urlopen(request).read().decode()
print(json)
The output is just [], basically nothing.

Just use requests:
import requests
data = '{"query": "NEW YORK"}'
response = requests.post('http://192.168.0.106:8080/parser', data=data)
Full documentation can be found at Requests: HTTP for Humans
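For completeness: the original script fails because urlencode() form-encodes the dict (the body becomes query=NEW+YORK), while curl -d posts the raw JSON string. If you prefer to stay with the standard library, a minimal sketch that sends the same raw body (the Content-Type header is an assumption; curl's default here is application/x-www-form-urlencoded, and the parser evidently reads the raw body either way):
import json
from urllib.request import Request, urlopen
url = 'http://192.168.0.106:8080/parser'
body = json.dumps({"query": "NEW YORK"}).encode()  # raw JSON bytes, like curl -d
request = Request(url, data=body, headers={'Content-Type': 'application/json'})
print(urlopen(request).read().decode())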

Related

Unable to get response from FreshDesk API in proxy mode {"code" : "invalid_credentials", "message" : "You need to be logged in to perform action"}

I'm using the Freshdesk API and can get a response with the requests module, but whenever I go through a proxy server I can't get a response.
import base64
import requests
import os
from requests.auth import HTTPBasicAuth
import ssl
method = "get"
url = "https://mycompanydomain.freshdesk.com/api/v2/tickets"
apiKey = "XXXXXXXXX"
secret = "x"
os.environ["REQUESTS_CA_BUNDLE"] = "Path to CA Certs"
auth = HTTPBasicAuth(apiKey, secret)
rsp = requests.request(method, url, headers=None, auth=auth)
print(rsp.text)
But whenever I use my organization's proxy server, I get the error message {"code":"invalid_credentials","message":"You have to be logged in to perform this action."}
This is the code I'm using with the proxy server:
import base64
import requests
import http.client
import urllib.parse
method = "get"
apiKey = "XXXXXXXX"
secret = "x"
url = "https://mycompanydomain.freshdesk.com/api/v2/tickets"
cred = '{}:{}'.format(apiKey, secret)
cred = base64.b64encode(cred.encode('utf-8')).decode('utf-8')
authorization_headers = {
    'Proxy-Authorization': 'Basic {}'.format(cred)
}
conn = http.client.HTTPSConnection("11.125.250.121", 3128)
conn.set_tunnel("mycompanydomain.freshdesk.com", headers=authorization_headers)
headers = {'Content-Type': 'application/json'}
conn.request("GET", "/api/v2/tickets", headers=headers)
res = conn.getresponse()
data = res.read()
print(data.decode("utf-8"))
The Freshdesk API docs give this example for calling their API:
curl -v -u abcdefghij1234567890:X -H "Content-Type: application/json" -X GET 'https://domain.freshdesk.com/api/v2/tickets'
Any possible way to resolve this error?
Here is a bit of a guess; I haven't actually tried it, but I do use Freshdesk and really should be using their API.
Here is a link to the Freshdesk API:
https://support.freshdesk.com/support/solutions/articles/216548-create-and-update-tickets-with-custom-fields-using-api
I would try taking out the -H "Content-Type: application/json" header to better match the suggested call. In most cases I would only add a Content-Type for POSTs, not GETs, unless the API specifically calls for it. Try it and let us know how it works.
curl -u API_KEY:X -X GET https://domain.freshdesk.com/api/v2/ticket_fields
Base64-encoding the API key together with the x (for example someapikey:x) helps.
See link:
How do I encode and decode a base64 string?
Also see FreshDesk api doc:
https://developers.freshdesk.com/api/#authentication
See the note there, which says to encode the key if the plain API key does not work.
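Another thing worth checking: in the http.client version the Basic credentials are only sent as Proxy-Authorization on the CONNECT request, so the tunnelled GET to Freshdesk carries no Authorization header at all, which would match the invalid_credentials response. A minimal sketch that keeps requests and simply routes it through the proxy (proxy address taken from the question, and assuming the proxy itself needs no credentials):
import requests
from requests.auth import HTTPBasicAuth
api_key = "XXXXXXXX"
url = "https://mycompanydomain.freshdesk.com/api/v2/tickets"
# Tunnel both schemes through the corporate proxy from the question
proxies = {
    "http": "http://11.125.250.121:3128",
    "https": "http://11.125.250.121:3128",
}
# Freshdesk Basic auth: API key as the username, "x" as the password
rsp = requests.get(url, auth=HTTPBasicAuth(api_key, "x"), proxies=proxies)
print(rsp.status_code)
print(rsp.text)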

Converting curl with --form to python requests

I have a curl request like this:
curl -X POST http://mdom-n-plus-1.nonprod.appsight.us:8081/mesmerdom/v1/getByScreen -F "data={\"screen\":{\"screen-id\":\"57675\"}}"
I am trying to convert it to python by using something like this:
import requests
import json
url = "http://mdom-n-plus-1.nonprod.appsight.us:8081/mesmerdom/v1/getByScreen"
payload = {"data": json.dumps({"screen":["screen-id", "57675"]})}
req = requests.post(url, data=payload)
print (req.text)
but I get the following error:
io.finch.Error$NotPresent: Required param 'data' not present in the request.
What would be the best way to convert the bash curl call to python request in this case?
Welcome to stackoverflow.com.
The -F switch of curl sends the data as multipart/form-data. Passing only data= to requests produces Content-Type: application/x-www-form-urlencoded, but the server apparently expects multipart/form-data, so a files= argument is needed to force multipart encoding. Since the server also looks for the actual value as an ordinary form field, the payload is passed through data= as well.
So this should work:
import requests
url = "http://mdom-n-plus-1.nonprod.appsight.us:8081/mesmerdom/v1/getByScreen"
payload = {'data': '{"screen": {"screen-id": "57675"}}'}
req = requests.post(url, files=payload, data=payload)
print(req.text)
hope this helps.
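If it still fails, one way to see exactly what requests puts on the wire (and compare it against curl -v) is to prepare the request without sending it; a small sketch using the same payload:
import requests
url = "http://mdom-n-plus-1.nonprod.appsight.us:8081/mesmerdom/v1/getByScreen"
payload = {'data': '{"screen": {"screen-id": "57675"}}'}
# Build, but do not send, the request so the generated multipart body can be inspected
prepared = requests.Request('POST', url, files=payload, data=payload).prepare()
print(prepared.headers['Content-Type'])  # multipart/form-data; boundary=...
print(prepared.body)                     # shows the 'data' parts and their headers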

API response with a proxy is working with the curl, but nothing is returned with python

I am trying to access data from an API that sits behind a proxy server. It works if I use a curl command like the one below:
curl --proxy http://MY_PROXY_SERVER:PORT --header "Accept: application/csv" http://WEB_SERVER_ADDRESS/data/CHANNEL?start=1470011400
I get the data which is expected.
When I try to access the same URL from Python, with either requests or urllib2, I am not able to get the data back. This is the code:
from __future__ import print_function
import requests
s = requests.Session()
s.proxies = {"http": "http://MY_PROXY_SERVER:PORT"}
headers = {'Accept': 'application/csv'}
url = "http://WEB_SERVER_ADDRESS/data/CHANNEL?start=1470011400"
r = s.get(url, headers=headers)
print(r.text)
I don't get any error and the request goes through successfully from Python, but the output is an empty list. I also tried other media types supported by the API, like 'json', and the issue persists.
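One way to narrow this down is to dump the headers requests actually sends and compare them against the curl -v output for the working call; a minimal sketch with the same session setup:
from __future__ import print_function
import requests
s = requests.Session()
s.proxies = {"http": "http://MY_PROXY_SERVER:PORT"}
r = s.get("http://WEB_SERVER_ADDRESS/data/CHANNEL?start=1470011400",
          headers={'Accept': 'application/csv'})
print(r.request.headers)  # headers that were actually sent, for comparison with curl -v
print(r.status_code, len(r.content))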

How to POST a local file using urllib2 in Python?

I am a complete Python noob and am trying to run cURL equivalents using urllib2. What I want is a Python script that, when run, will do the exact same thing as the following cURL command in Terminal:
curl -k -F docfile=@myLocalFile.csv http://myWebsite.com/extension/extension/extension
I found the following template on a tutorial page:
import urllib
import urllib2
url = "https://uploadWebsiteHere.com"
data = "{From: 'sender#email.com', To: 'recipient#email.com', Subject: 'Postmark test', HtmlBody: 'Hello dear Postmark user.'}"
headers = { "Accept" : "application/json",
"Conthent-Type": "application/json",
"X-Postmark-Server-Token": "abcdef-1234-46cc-b2ab-38e3a208ab2b"}
req = urllib2.Request(url, data, headers)
response = urllib2.urlopen(req)
the_page = response.read()
but I am completely lost on the 'data' and 'headers' vars. The urllib2 documentation (https://docs.python.org/2/library/urllib2.html) defines the 'data' input as "a string specifying additional data to send to the server" and the 'headers' input as "a dictionary". I am totally out of my depth in trying to follow this documentation and do not see why a dictionary is necessary when I could accomplish this same task in terminal by only specifying the file and URL. Thoughts, please?
The data you are posting doesn't appear to be valid JSON. Assuming the server is expecting valid JSON, you should change that.
Your curl invocation does not pass any optional headers, so you shouldn't need to provide much in the request. If you want to verify the exact headers you could add -vi to the curl invocation and directly match them in the Python code. Alternatively, this works for me:
import urllib2
url = "http://localhost:8888/"
data = '{"From": "sender@email.com", "To": "recipient@email.com", "Subject": "Postmark test", "HtmlBody": "Hello dear Postmark user."}'
headers = {
    "Content-Type": "application/json"
}
req = urllib2.Request(url, data, headers)
response = urllib2.urlopen(req)
the_page = response.read()
It probably is in your best interest to switch over to using requests, but for something this simple the standard library urllib2 can be made to work.
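For the -k part specifically: that flag tells curl to skip certificate verification. If you stay on urllib2 and the endpoint is https with a self-signed certificate, the rough equivalent (Python 2.7.9+) is to pass an unverified SSL context; a sketch under that assumption:
import ssl
import urllib2
url = "https://myWebsite.com/extension/extension/extension"  # endpoint from the question
data = '{"From": "sender@email.com", "To": "recipient@email.com"}'
headers = {"Content-Type": "application/json"}
insecure_ctx = ssl._create_unverified_context()  # equivalent of curl -k
req = urllib2.Request(url, data, headers)
response = urllib2.urlopen(req, context=insecure_ctx)
print(response.read())
This only covers -k; the multipart file upload itself is easiest with requests, as the example below shows.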
What I want is a Python script that, when run, will do the exact same thing as the following cURL command in Terminal:
$ curl -k -F docfile=@myLocalFile.csv https://myWebsite.com/extension...
curl -F sends the file using the multipart/form-data content type. You can reproduce it easily using the requests library:
import requests # $ pip install requests
with open('myLocalFile.csv', 'rb') as input_file:
    r = requests.post('https://myWebsite.com/extension/...',
                      files={'docfile': input_file}, verify=False)
verify=False is to emulate curl -k.

Sending JSON request with Python

I'm new to web services and am trying to send the following JSON based request using a python script:
http://myserver/emoncms2/api/post?apikey=xxxxxxxxxxxxx&json={power:290.4,temperature:19.4}
If I paste the above into a browser, it works as expected. However, I am struggling to send the request from Python. The following is what I am trying:
import json
import urllib2
data = {'temperature':'24.3'}
data_json = json.dumps(data)
host = "http://myserver/emoncms2/api/post"
req = urllib2.Request(host, 'GET', data_json, {'content-type': 'application/json'})
response_stream = urllib2.urlopen(req)
json_response = response_stream.read()
How do I add the apikey data into the request?
Thank you!
Instead of using urllib2, you can use requests. This new python lib is really well written and it's easier and more intuitive to use.
To send your json data you can use something like the following code:
import json
import requests
data = {'temperature':'24.3'}
data_json = json.dumps(data)
payload = {'json': data_json, 'apikey': 'YOUR_API_KEY_HERE'}
r = requests.get('http://myserver/emoncms2/api/post', params=payload)
You can then inspect r to obtain an http status code, content, etc
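Here params= is what puts the fields into the query string of the GET, and the 'json' key matches the parameter name in the URL that worked in the browser. A quick way to confirm the request looks right:
print(r.status_code)
print(r.url)  # should match the browser URL, apikey and json included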
Even though this doesn't exactly answer the OP's question, it should be mentioned here that the requests module has a json option that can be used like this:
import requests
requests.post(
    'http://myserver/emoncms2/api/post?apikey=xxxxxxxxxxxxx',
    json={"temperature": "24.3"}
)
which would be equivalent to the curl:
curl 'http://myserver/emoncms2/api/post?apikey=xxxxxxxxxxxxx' \
-H 'Content-Type: application/json' \
--data-binary '{"temperature":"24.3"}'
Maybe the problem is that json.dumps adds double quotes, while the JSON you put in the URL has none.
For example:
data = {'temperature':'24.3'}
print json.dumps(data)
prints:
{"temperature": "24.3"}
and not:
{temperature: 24.3}
like you put in your url.
One way of working around this (which is trouble-prone) is:
json.dumps(data).replace('"', '')
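A minimal sketch of that approach with urllib2, building the same kind of URL that worked in the browser (parameter names taken from the question; urlencode handles the percent-encoding):
import json
import urllib
import urllib2
data = {'temperature': '24.3'}
stripped = json.dumps(data).replace('"', '')  # {temperature: 24.3}, as in the browser URL
query = urllib.urlencode({'apikey': 'xxxxxxxxxxxxx', 'json': stripped})
response = urllib2.urlopen('http://myserver/emoncms2/api/post?' + query)
print(response.read())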
