I am looking at doing a "proof of life" test for some sites my team is developing by just confirming the status code; I do not actually need the document body. According to the Python Requests documentation, stream is False by default, so both the headers and the body are pulled down. By setting stream to True, only the headers are fetched and the body download is deferred until the content is accessed. My concern is the possibility of false positives.
I am trying something like the following:
url = random.choice(app.conf['TEST_SITES'])
ua = random.choice(app.conf['USER_AGENTS'])
proxies = {
    'http': 'http://{0}:{1}@{2}:{3}'.format(proxy_user, proxy_pass, proxy_ip, proxy_port),
    'https': 'http://{0}:{1}@{2}:{3}'.format(proxy_user, proxy_pass, proxy_ip, proxy_port)
}
headers = {'user-agent': ua}
proxy_session = requests.Session()
proxy_session.max_redirects = app.conf['MAX_REDIRECTS']
# go through the session so max_redirects applies, and pass the target url
response = proxy_session.get(url, headers=headers, proxies=proxies, stream=True, timeout=5)
ret_code = response.status_code
response.close()  # close explicitly since the body is never consumed
# Do stuff based on status code #
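As a side note, on reasonably recent versions of requests the response object is a context manager, which releases the connection even if an exception fires before close() is reached; with stream=True the body is never downloaded as long as response.content is not touched. A minimal sketch reusing the variables above:
with proxy_session.get(url, headers=headers, proxies=proxies, stream=True, timeout=5) as response:
    ret_code = response.status_code  # headers only; the body is never pulled down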
Related
I'm building a small script to test certain proxies against the API.
It seems that the actual request isn't triggered under the provided proxy. For example, the following request will succeed and I will get a response from the API.
import requests
r = requests.post("https://someapi.com", data=request_data,
                  proxies={"http": "http://999.999.999.999:1212"}, timeout=5)
print(r.text)
How come I get the response even if the proxy provided was invalid?
Your URL uses https, but your proxies dict only contains an http key; requests picks the proxy by the URL scheme, finds no https entry, and so connects directly. You can define the proxies like this:
import requests
pxy = "http://999.999.999.999:1212"
proxyDict = {
    'http': pxy,
    'https': pxy,  # this is the entry requests uses for an https:// URL
    'ftp': pxy,
    'SOCKS4': pxy
}
r = requests.post("https://someapi.com", data=request_data,
                  proxies=proxyDict, timeout=5)
print(r.text)
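To confirm traffic is really going through the proxy rather than falling back to a direct connection, one option is to compare the origin IP reported with and without the proxy. A small sketch, assuming outbound access to httpbin.org (used here only as an example echo service); note that with the 999.999.999.999 placeholder this will raise requests.exceptions.ProxyError, which itself proves the proxy is being attempted:
import requests

# httpbin.org/ip reports the IP address the request arrived from
direct = requests.get("https://httpbin.org/ip", timeout=5).json()["origin"]
proxied = requests.get("https://httpbin.org/ip", proxies=proxyDict, timeout=5).json()["origin"]
print(direct, proxied)  # different values mean the proxy is really in the path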
I am trying to post some information to an API in their recommended format. When I use Postman (a tool for testing APIs), I see that the response has the isSuccess flag set to true. However, when I write the same code in Python using the requests library, I get the isSuccess flag as false.
As mentioned above, I verified the headers and the JSON data object; both are the same, yet the results differ.
import requests

data = {
    "AccountNumber": "100007777",
    "ActivityID": "78",
    "ActivityDT": "2019-08-07 12:00:00",
    "ActivityValue": "1"
}
url = "http://<IP>/<API_PATH>"
headers = {
    "X-Tenant": "Default",
    "Content-Type": "application/json"
}
response = requests.post(url, data=data, headers=headers)
print(response.content)
This code should successfully post the data, and I should get an isSuccess:true in my response variable.
Can anyone help me figure out what might be wrong?
Can you try changing
response = requests.post(url, data=data, headers=headers)
to
response = requests.post(url, json=data, headers=headers)
or, since your Content-Type header is already application/json, serializing the payload yourself (requires import json):
response = requests.post(url, data=json.dumps(data), headers=headers)
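The difference matters: data= sends a dict form-encoded (application/x-www-form-urlencoded), while json= serializes it to JSON and sets the Content-Type for you. Since the header here is already forced to application/json, the server receives form-encoded bytes it cannot parse as JSON. A quick way to see the two bodies side by side (example.com is just a placeholder):
import requests

payload = {"AccountNumber": "100007777", "ActivityID": "78"}
form = requests.Request("POST", "http://example.com", data=payload).prepare()
js = requests.Request("POST", "http://example.com", json=payload).prepare()
print(form.body)  # AccountNumber=100007777&ActivityID=78  (form-encoded)
print(js.body)    # b'{"AccountNumber": "100007777", "ActivityID": "78"}'  (JSON)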
I have gone through a number of similar posts about firing GET requests with Basic Auth (e.g. Python, HTTPS GET with basic authentication), but still can't figure out the problem. I keep getting the error requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url.
I tried the same request with the same credentials and headers in Postman and it works as expected. I verified that the base64-encoded value of api_key:password is exactly the same as the value used in Postman, so I don't think it's an encoding or resource-access permission problem.
python -V
Python 3.6.4 :: Anaconda, Inc.
Approach 1
import json
import requests
from base64 import b64encode

api_key = 'some_api_key'
password = 'some_password'
headers = {'accept': 'application/json'}
url = 'https://test.access.com/this/url'

api_key_password = "%s:%s" % (api_key, password)
b64_encoded = b64encode(bytes(api_key_password, 'utf-8')).decode("ascii")
headers['authorization'] = 'Basic %s' % b64_encoded

response = requests.get(url, headers=headers)
if response.ok:
    json_data = json.loads(response.content)
    print(json_data)
else:
    print(response)
    response.raise_for_status()
Approach 2
import json
import requests

api_key = 'some_api_key'
password = 'some_password'
url = 'https://test.access.com/this/url'
headers = {'accept': 'application/json'}

response = requests.get(url, headers=headers, auth=(api_key, password))
print(response.ok)
if response.ok:
    json_data = json.loads(response.content)
    print(json_data)
else:
    print(response)
    response.raise_for_status()
Can you please provide some pointers?
I had a similar issue (although in .NET Framework).
In my case the reason was that I was using the URL without a trailing slash, which the API apparently does not support.
So https://test.access.com/this/url
throws 401 Unauthorized,
but
https://test.access.com/this/url/
returns 200 OK.
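One way to see whether the no-slash URL is being redirected (and whether the auth header survives the hop) is to inspect response.history, which lists any intermediate responses. A minimal check, reusing the variables from Approach 2:
response = requests.get(url, headers=headers, auth=(api_key, password))
for hop in response.history:  # empty when no redirect happened
    print(hop.status_code, hop.url)
print(response.status_code, response.url)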
Older post but I had a similar issue. Postman will cache your JSESSIONID. Be sure you are clearing out that cookie while testing. If you are hitting an API that requires a login API call to establish a session before you can make subsequent API calls, this Postman behavior can produce a false sense of security.
In this situation with Python requests, it can be handled with code similar to what I've provided below:
import json
import requests
import urllib3

loginAPI = "https://myapi.myco.com/someuri/someuri/users/login"
someHTTPGetAPI = "https://myapi.myco.com/someuri/someuri/someservice"
username = "myuser"
password = "mypass"
headers = {
    "Content-Type": "application/json",
    "login": username,
    "password": password
}

# certificate verification is disabled below, so silence the warning it triggers
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
verify = False

session = requests.Session()  # the session keeps the login cookie for later calls
sessionResponse = session.get(url=loginAPI, headers=headers, verify=verify)
if sessionResponse.status_code == 200:
    getResponse = session.get(url=someHTTPGetAPI)
    if getResponse.status_code == 200:
        responseJSON = getResponse.json()
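The key point is that requests.Session persists cookies across calls, so the JSESSIONID set by the login response is automatically sent on the follow-up request; that mirrors what Postman was doing for you implicitly by caching the cookie.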
I am trying to build a simple webbot in Python, on Windows, using MechanicalSoup. Unfortunately, I am sitting behind a (company-enforced) proxy. I could not find a way to provide a proxy to MechanicalSoup. Is there such an option at all? If not, what are my alternatives?
EDIT: Following Eytan's hint, I added proxies and verify to my code, which got me a step further, but I still cannot submit a form:
import mechanicalsoup
proxies = {
    'https': 'my.https.proxy:8080',
    'http': 'my.http.proxy:8080'
}
url = 'https://stackoverflow.com/'
browser = mechanicalsoup.StatefulBrowser()
front_page = browser.open(url, proxies=proxies, verify=False)
form = browser.select_form('form[action="/search"]')
form.print_summary()
form["q"] = "MechanicalSoup"
form.print_summary()
browser.submit(form, url=url)
The code hangs in the last line, and submit doesn't accept proxies as an argument.
It seems that proxies have to be specified on the session level. Then they are not required in browser.open and submitting the form also works:
import mechanicalsoup
proxies = {
    'https': 'my.https.proxy:8080',
    'http': 'my.http.proxy:8080'
}
url = 'https://stackoverflow.com/'
browser = mechanicalsoup.StatefulBrowser()
browser.session.proxies = proxies # THIS IS THE SOLUTION!
front_page = browser.open(url, verify=False)
form = browser.select_form('form[action="/search"]')
form["q"] = "MechanicalSoup"
result = browser.submit(form, url=url)
result.status_code
returns 200 (i.e. "OK").
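For context: StatefulBrowser wraps a requests.Session (exposed as browser.session), so anything set there (proxies, headers, cookies) applies to every request the browser makes, including the one that submit fires internally. That is presumably why the per-call proxies argument was not enough here.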
According to their docs, this should also work:
browser.get(url, proxies=proxies)
Try passing the proxies argument to your individual requests.
I am trying to send a GET request through a proxy with authentication.
I have the following existing code:
import httplib
username = 'myname'
password = '1234'
proxyserver = "136.137.138.139"
url = "http://google.com"
c = httplib.HTTPConnection(proxyserver, 83, timeout = 30)
c.connect()
c.request("GET", url)
resp = c.getresponse()
data = resp.read()
print data
When running this code, I get an answer from the proxy saying that I must provide authentication, which is correct.
In my code, I don't use the login and password. My problem is that I don't know how to use them!
Any idea?
You can refer to this code if you specifically want to use httplib:
https://gist.github.com/beugley/13dd4cba88a19169bcb0
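The standard mechanism for proxy authentication is a Proxy-Authorization header carrying base64-encoded credentials. A minimal Python 2 sketch of that idea, reusing the variables from the question (untested against your proxy; the port 83 is taken from your snippet):
import base64
import httplib

# Basic credentials for the proxy itself, not the target site
auth = base64.b64encode('%s:%s' % (username, password))
headers = {'Proxy-Authorization': 'Basic ' + auth}

c = httplib.HTTPConnection(proxyserver, 83, timeout=30)
# for plain http, the full URL goes in the request line and the
# proxy reads the credentials from the header
c.request("GET", url, headers=headers)
resp = c.getresponse()
print resp.status, resp.reason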
But you could also use the easier requests module.
import requests
proxies = {
    "http": "http://username:password@proxyserver:port/",
    # "https": "https://username:password@proxyserver:port/",
}
url = 'http://google.com'
data = requests.get(url, proxies=proxies)
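As a quick sanity check on the result:
print(data.status_code)  # 407 means the proxy rejected the credentials
A 200 means the authenticated proxy accepted the credentials and passed the request through.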