I am trying to log in to a site that requires NTLM authentication, using HttpNtlmAuth from requests_ntlm. Here is the code; I have not shown the actual URL (url1) as it belongs to my company.
import requests
from requests_ntlm import HttpNtlmAuth

url1 = "http://url"
payload = {"ref": "B72048061"}
header_data = {'User-agent': 'Mozilla/5.0'}
auth = HttpNtlmAuth(User, Password)
r1 = requests.get(url1, params=payload, headers=header_data, auth=auth, proxies=proxies, verify=False)
r1
I am getting the error below.
<Response [401]>
Here are more details.
print(r1.url)
print(r1.headers)
print(r1.text)
Output:
http://url?ref=B72048061
{'Date': 'Mon, 19 Dec 2022 11:55:05 GMT', 'Set-Cookie': 'SMSESSION=8WEKGZ5xQuxJ9Uefebxe+YIv6HDZhltVtmL9gdrSvmguTuaIH6zZ40TW9+y5ZSTQzEKJLpRta26Fj1txU2lHrnnU7F8GSuviGsLPrKoh0xtlxhkrdogKgOtFuaw9k2e9cusAWjtBE+as2i1Qd0D2cPRUGm/4fJez+itFVyAzJ0eP7pW/ggGyIFoZmU7XjgaqliRc9tuqU5SufXrEPztO4yZSEUgaXUz/ul6XK0NZ/QO211cCmxanYkmGHvCdAhT/z9p2V8Xq5wlRRvpgupVbuxgrvu+OQOANdhuLWOj+KZFiZmyyECKe0QQnK08tFiV7GbZUHNRMK+8lm/zYOkSm/w9NXtIluAIBEzuClzk1cmfesjxCmDXjHuZ90jtwQDxuSRwIcXzdZqvl4L6k8oBIPZVJD3QFAkmudxnXZTddHRFx+YaKvF2Yjhq8vmef+ucanEavekMktpoo9r226og31Zd6uV3C1mZT/8zcN4PVdoJR0XZMLSUKwxLh28EQjIkDX0RhXQkOkVaX3BAiMqubPqJsjlZDRVjNdIqa05qbLBmAG9tzNIv1PNzXltJ6/zptiy81Hms8W3IZrNQLbe8Ry2tAqCCbmkdP+KJ1N20OHGbcUiyyigW2YN24u3/JFx8XTEd4DkSpzFk/kVzBWw/zDlNAI3N0rMNcbfZ/MdXO54lLCa35+znwyb9fSxRt3n4tNCnfpaf6okIXJjYFoEZl/moICdXmEXXF+u3LJAWSc5hZM78S7kM4r0rrxz1qfZZSK0cVPCyc4KfAksnrIgq8ciCEZuieVcXiYbYic9d0B9XcOEArkDxoremnavk0RZAxisyaYR5UJcKtQTKfr5r2lzgEwD2mHIU8sjiDG0lNchlE05uJfYTrnas6oNOA1ZLuKHHRJ0kx8U6QSi4yHbuPeJ4Gubqbpiq/a8E6x5j00hwJl/ZDp8CB77HzhgWvACnYD8TrNwzEpjOYD70PyAq1QHeoUsmZVgggPHs0PtX05UhL+77aIVT/STdZCKj4ZdDbPHbTgz81vh9jfpR6lv6FXCWEWO/b+DxDsBuaPlDid36gVxCcvj4r8YnLIclWEVcznrZfulE2wTIB86Ckk+3lVnRw5FWPWLhvgdcbeiQFho7u8B6x4H7zTqWinCpvVH8mVEu6fPA7swECLMii+YrbC2v67Uyc1qdGj5+HO7qJYEcMMA2yI5ygaQK7kMYeeLOPj2U6Qw7ni4u989WFvV24ZdQE9TW6hMDWYz82MEnJkI7spDO3JKxfiMStYY1/TdaeVsGrc3KonDBwNPjfjn03zkI+kX+CtJML; path=/; domain=xxx.corp, SMCHALLENGE=YES; Path=/; Domain=xxx.corp, SMONDENIEDREDIR=NO; Expires=Thu, 01-Dec-94 16:00:00 GMT; Path=/; Domain=xxx.corp', 'X-Powered-By': 'Servlet/3.0', 'WWW-Authenticate': 'Basic realm="REP_F370_TAI-root [12:55:5:945] "', '$WSEP': '', 'Expires': 'Thu, 01 Dec 1994 16:00:00 GMT', 'Cache-Control': 'no-cache="set-cookie, set-cookie2"', 'X-OneAgent-JS-Injection': 'true', 'X-ruxit-JS-Agent': 'true', 'Server-Timing': 'dtSInfo;desc="0", dtRpid;desc="-521266895"', 'Keep-Alive': 'timeout=10, max=99', 'Connection': 'Keep-Alive', 'Transfer-Encoding': 'chunked', 'Content-Type': 
'text/html;charset=ISO-8859-1', 'Content-Language': 'en-US'}
Error 401: ACCESS-DENIED
Taking a look into the response history:
for resp in r1.history:
    print(resp.url)
    print(resp.text)
    print('Status Code:', resp.status_code)
    print(resp.headers)
    print('\n')
Output:
http:////url?ref=B72048061
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>302 Found</title>
</head><body>
<h1>Found</h1>
<p>The document has moved here.</p>
<hr>
<address>IBM_HTTP_Server at xxx-pdm-services.eu.xxx.corp Port 1080</address>
</body></html>
Status Code: 302
{'Date': 'Mon, 19 Dec 2022 12:25:06 GMT', 'Cache-Control': 'no-store', 'Location': 'https://winssor12-vip.xxx.corp:443/siteminderagent/ntlm/creds.ntc?CHALLENGE=&SMAGENTNAME=-SM-9xex6NsJ4%2b587a54UsP5CW4QMeT35xwiKJoZua8FNdPb8Uvg%2bW%2fjUV2leieKYjCz&TARGET=-SM-HTTP%3a%2f%2fxxx--pdm--services%2eeu%2exxx%2ecorp%3a1080%2faps--web%2fP%2fview%2fO_IP%3fref%3dV92B72048061', 'Server-Timing': 'dtSInfo;desc="0", dtRpid;desc="-139048060"', 'Set-Cookie': 'dtCookie=v_4_srv_9_sn_9A3A0571C2E35B5E0D7C5AB97C0604EB_perc_100000_ol_0_mul_1_app-3Afb71c72ef431f887_1; Path=/; Domain=.xxx.corp', 'Content-Length': '570', 'Keep-Alive': 'timeout=10, max=100', 'Connection': 'Keep-Alive', 'Content-Type': 'text/html; charset=iso-8859-1'}
https://winssor12-vip.xxx.corp:443/siteminderagent/ntlm/creds.ntc?CHALLENGE=&SMAGENTNAME=-SM-9xex6NsJ4%2b587a54UsP5CW4QMeT35xwiKJoZua8FNdPb8Uvg%2bW%2fjUV2leieKYjCz&TARGET=-SM-HTTP%3a%2f%2fxxx--pdm--services.eu.xxx.corp%3a1080%2faps--web%2fP%2fview%2fO_IP%3fref%3dB72048061
Status Code: 302
{'Via': 'proxy A', 'Date': 'Mon, 19 Dec 2022 12:25:06 GMT', 'Server': 'Microsoft-IIS/10.0', 'Location': 'HTTP://xxx-pdm-services.eu.xxx.corp:1080/aps-web/P/view/O_IP?ref=B72048061', 'Connection': 'Keep-Alive', 'set-cookie': 'SMCHALLENGE=NTC_CHALLENGE_DONE; path=/; domain=xxx.corp, SMSESSION=H5Ya4mHFxMf5o5wXubpUuiKrZM09+RY8Na0i+Q2kxE1aor1MQhybJlg4WgFLB5Iw8lRNFhr1qvPUElsa4g+M0obXRSvP6jt0XwQPQOHc6pnmhXYyDB6L2lAIrgccnPvZsFxYF6Cig+coLbE0DIDeWFuDXD47LXBRzTqY1KRwCOkX+oSuC0AdvhGoeIOljgthwm8KLvHEzVq9fvUSfJAPeRPOyoRRtt0JA6Vg4BgCvsiNT1KHV1/9UNugmWp668juLilZHM6ZJZZt5i//5zFMTlgxSH91z7PkQUzKBwHrxZ0BPXIC4dlVsirfaMgRcZ33T5Cj5jLuDWk628Ce3ps6JbxCopaxKiD5xQFxLGbkK0juKvqBPAlJqLG8Rvv3vEGuujPt0G8mGmysGjHBjIYup7xFzCfD4mDkB2haZAOJrIhEJ7NBERoeemak2XmeUfeM8WxdXlAAgZmcxhH5+04RovujA7Qwwv4LW22jO3nPVqBQi81sN+5faW/ZPgCXXZ4Pzd7icTri6yJYVT5SBcaMhXbxkfNwNSk7YgFrU+5BDj7rDJNjVCiGAiapEQjU+3GxrRc9Ql7Rf6+gKWEEcNrWKS68O70+0uZftnldkA8a5MmJuTQ0SyApSWMPIvICQvKzBRrhBPE5R1LWxZRZIQ3ysDhG92fg3s8Arau5dFwuFYFMvQXAAYgjSNZ4c114yfWOjcV2edtDRYH9SekqpX6QCxX0UBex0LeKjbrSmNiY2IPdmBOoxHHRA9KreHgebugh4wEX9Xrp+oUm9MjxhQee+/lqiAqcuh81crWYttu3LkaXkj0BUoRmm6rqhm3GuaSwQSAN35LpcStN/+IgEEYFsIswd1OiWNY6IfARN5nLpqQgupMhVa3G/kZpadSfDmt0/u+G07M1EjjpaJB2eoDCHeDBFVZAJhgNxtzj2Na6T0SE5JYmCSp1XdZj0oeOC33eDX1M9q9xDYOCHLhmcCcFP7KE5mi0Q3KZ7N//ws95uPTdgPwMdxxKVykNebdbEPQjMltN09Uj80ws5N4FRu/9Cyq+4L+jS0gEaZz0OIR4l0v4nhqWqtoq3s1pkTlGU8z52uJE1IKH7pO5gG5iz9+/HYpA1JIdQH/HuL2Nu6SOh14FTdrATU5Id9Ip9Wvyi62G/FgLXHh/BYHC9yxrZYQX0M3NYQSjCuDRtbhGLPKkmYvMSDl8ZGmjlLapRdaQn66J6ILPmLKnifoydT8FJizsAzVtdGhNo5yszH5iCK6Y0976oVWqYfIadT+I6RdBC+/iVVzxHYZEMxsDOLBKAKPLDR5tyK2hilM4; path=/; domain=airbus.corp', 'X-Powered-By': 'ASP.NET', 'Cache-Control': 'no-store', 'Content-Length': '0', 'Persistent-Auth': 'false'}
So, as can be seen in the history, the NTLM authentication completed and the request was redirected back to the required url1. However, due to some issue I have not been able to figure out, access is denied with error code 401, as shown above. Another thing I noticed is that SMCHALLENGE in the response's Set-Cookie header is 'YES' and an SMONDENIEDREDIR cookie is present. When I inspected a successful login in the browser's network tab, SMCHALLENGE in the set cookies remained 'NO' and there was nothing like SMONDENIEDREDIR. I am now wondering whether the NTLM authentication was actually successful, and how to solve this issue.
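For what it's worth, those two markers can be checked programmatically, which makes it easier to tell a SiteMinder denial apart from a genuine NTLM failure. A minimal sketch (the helper name is mine; the markers are the ones observed in the headers above):

```python
def siteminder_denied(headers):
    """Heuristic: True if SiteMinder's Set-Cookie markers indicate a
    denied or re-challenged request (SMCHALLENGE=YES or any
    SMONDENIEDREDIR cookie)."""
    set_cookie = headers.get('Set-Cookie', '')
    return 'SMONDENIEDREDIR' in set_cookie or 'SMCHALLENGE=YES' in set_cookie
```

Since a successful browser login showed SMCHALLENGE=NO and no SMONDENIEDREDIR at all, this returning True would suggest SiteMinder rejected the session rather than the NTLM handshake itself failing. It may also be worth retrying with a persistent `requests.Session()` (setting `session.auth = HttpNtlmAuth(User, Password)`) so that the SMSESSION cookie issued during the redirect chain is reused on subsequent calls.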
I created a Slack app, added a Bot and an Incoming Webhook to it, and posted some messages with the Bot. Now I would like to find out the timestamp of a Slack message in order to delete it later with the chat.delete method.
I found that I can use the channels.history method.
Here is how I tried to use it. I used the token found under OAuth Access Token, since per the docs I cannot use a Bot token with the channels.history method.
from slackclient import SlackClient
slack_token_user_token = 'xoxp-long_string_of_integers'
sc_user_token = SlackClient(slack_token_user_token)
sc_user_token.api_call(
    "channels.history",
    channel="CHXXXXXXX")
I got back the following error:
{'error': 'missing_scope',
'headers': {'Access-Control-Allow-Headers': 'slack-route, x-slack-version-ts',
'Access-Control-Allow-Origin': '*',
'Access-Control-Expose-Headers': 'x-slack-req-id',
'Cache-Control': 'private, no-cache, no-store, must-revalidate',
'Connection': 'keep-alive',
'Content-Encoding': 'gzip',
'Content-Length': '108',
'Content-Type': 'application/json; charset=utf-8',
'Date': 'Fri, 05 Apr 2019 18:18:11 GMT',
'Expires': 'Mon, 26 Jul 1997 05:00:00 GMT',
'Pragma': 'no-cache',
'Referrer-Policy': 'no-referrer',
'Server': 'Apache',
'Strict-Transport-Security': 'max-age=31536000; includeSubDomains; preload',
'Vary': 'Accept-Encoding',
'Via': '1.1 f0f1092b2ad1f0e573a4fcbefe4fb621.cloudfront.net (CloudFront)',
'X-Accepted-OAuth-Scopes': 'channels:history',
'X-Amz-Cf-Id': 'fSm6uo2H88E43JCvqd2h5mohnzA6z0B3kmdsG3u9nW0PJNrsrpK7mg==',
'X-Cache': 'Miss from cloudfront',
'X-Content-Type-Options': 'nosniff',
'X-OAuth-Scopes': 'identify,bot,incoming-webhook',
'X-Slack-Req-Id': 'c158668d-ddc9-4bbc-9a7d-6b9a9011d2dc',
'X-Via': 'haproxy-www-yfr6',
'X-XSS-Protection': '0'},
'needed': 'channels:history',
'ok': False,
'provided': 'identify,bot,incoming-webhook'}
If this is permission issue, how do I find out proper token to use?
According to the error message you posted, the token used is lacking the required scope.
'needed': 'channels:history'
It looks like you provided the bot token, which cannot work.
'provided': 'identify,bot,incoming-webhook'
Provide the user access token instead, and make sure you first add the channels:history scope and reinstall the app for the change to take effect.
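The needed/provided fields in that error payload can also be compared directly to see which scopes are still missing; a small sketch (the helper name is mine):

```python
def missing_scopes(resp):
    """From a Slack API error response, return the scopes listed in
    'needed' that are absent from 'provided'."""
    if resp.get('ok', False):
        return []
    needed = {s for s in resp.get('needed', '').split(',') if s}
    provided = {s for s in resp.get('provided', '').split(',') if s}
    return sorted(needed - provided)
```

With the response shown above, this returns `['channels:history']`, confirming the scope to add before reinstalling the app.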
I'm currently trying to access the Location field in the response headers from a GET request to the URL https://dbr.ee/aUJA/d?. So far, I have been able to view the headers through this Python code:
import requests
r = requests.get('hhttps://dbr.ee/aUJA/d?', allow_redirects=False, headers={'User-Agent': 'Mozilla/5.0'})
print(r.headers)
But the output contains the wrong Location field:
{'Status': '302 Found', 'X-Request-Id':
'9e968067-1bee-4cc9-9305-19d45d5cb6ea', 'X-XSS-Protection': '1;
mode=block', 'X-Content-Type-Options': 'nosniff', 'Transfer-Encoding':
'chunked', 'Set-Cookie':
'__cfduid=d21c538fd46c153a046bf461ca281978d1499637583; expires=Mon,
09-Jul-18 21:59:43 GMT; path=/; domain=.dbr.ee; HttpOnly,
ahoy_visitor=f4f1c08c-add3-45c0-8325-675b1caf3048; path=/;
expires=Tue, 09 Jul 2019 21:59:44 -0000,
ahoy_visit=cdbb4ca8-3272-473c-8562-03596d88ec0f; path=/; expires=Mon,
10 Jul 2017 01:59:44 -0000, ahoy_track=true; path=/, SERVERID=;
Expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/', 'X-Runtime':
'0.006820', 'Server': 'cloudflare-nginx', 'Connection': 'keep-alive',
'Location': 'hhttps://dbr.ee/aUJA', 'Cache-Control': 'no-cache',
'Date': 'Sun, 09 Jul 2017 21:59:44 GMT', 'X-Frame-Options':
'SAMEORIGIN', 'Content-Type': 'text/html; charset=utf-8', 'CF-RAY':
'37be8d52fdc83822-ATL'}
Which is:
'Location': 'hhttps://dbr.ee/aUJA'
While on the site, the actual response header is this (viewed through Chrome Developer Tools)
cache-control:no-cache cf-ray:37be8bacacb437d4-ATL
content-type:text/html; charset=utf-8 date:Sun, 09 Jul 2017 21:58:36
GMT
location:hhttps://s.dbr.ee/sffc/python%2Dlogo%2Dmaster%2Dv3%2DTM.png.zip?temp_url_sig=41ebabb749293a6fe3f3ec82c5ab8ec01b0ed053&temp_url_expires=1499637816&filename=python-logo-master-v3-TM.png.zip;&attachment
server:cloudflare-nginx
set-cookie:ahoy_visit=f7d15e42-155c-443f-a637-22c3681863a5; path=/;
expires=Mon, 10 Jul 2017 01:58:36 -0000
set-cookie:_dbree_session=U2x6akdCbUJ4c28wdW9MeUFYOXo1QUVxLzV3ZVNxcGtTWW1jbVdkWEdPOWZPMWFiOEl4M0VWY1dOWGNYTjNubEJoVWJHejRCTlQwQlkwL0UrM09QallTMzhFZlU3RFBBTDZxaW9xcGRMeXNlQS9mZFByYTZQWTM0ZlBHMU50ekhhTkt1bjZENXJHRnc2a3dWeGY2d3BBPT0tLVNKOTJnL0Q3SjloWEc0MTZqTnRPNFE9PQ%3D%3D--2dd8f3e77a673f385c9a231af426b55f1d1f71c0;
domain=dbr.ee; path=/; HttpOnly set-cookie:SERVERID=; Expires=Thu,
01-Jan-1970 00:00:01 GMT; path=/ status:302 status:302 Found
x-content-type-options:nosniff x-frame-options:SAMEORIGIN
x-request-id:f57f3ca7-c7aa-4449-a2d7-7b5014010d0f x-runtime:0.015892
x-xss-protection:1; mode=block
where Location is
location:hhttps://s.dbr.ee/sffc/python%2Dlogo%2Dmaster%2Dv3%2DTM.png.zip?temp_url_sig=41ebabb749293a6fe3f3ec82c5ab8ec01b0ed053&temp_url_expires=1499637816&filename=python-logo-master-v3-TM.png.zip;&attachment
Which is the download link I am trying to scrape in Python. This appears in Developer Tools after clicking the Direct Download button.
How can I get the header to show me the correct Location field in Python?
*links have been modified with h in front of http because of not allowing me to post more than 2 links, but are necessary for context of the question
Looks like the issue was the missing Referer header. Once I add that to your code, I get the appropriate 302 redirect response with the correct Location header:
import requests
r = requests.get('https://dbr.ee/aUJA/d?', allow_redirects=False, headers={
    'Referer': 'https://dbr.ee/aUJA'
})
print(r.headers)
Which produces:
{'Date': 'Sun, 09 Jul 2017 23:44:55 GMT', 'Content-Type': 'text/html; charset=utf-8', 'Transfer-Encoding': 'chunked', 'Connection': 'keep-alive', 'Set-Cookie': '__cfduid=d071cba66cc515ca7f2bc620362c6d46d1499643895; expires=Mon, 09-Jul-18 23:44:55 GMT; path=/; domain=.dbr.ee; HttpOnly, ahoy_visitor=64d9f580-781e-4037-8951-ce57b73df720; path=/; expires=Tue, 09 Jul 2019 23:44:55 -0000, ahoy_visit=802132cc-4e0e-4089-9be5-49f05223f567; path=/; expires=Mon, 10 Jul 2017 03:44:55 -0000, SERVERID=; Expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/', 'Status': '302 Found', 'Cache-Control': 'no-cache', 'X-XSS-Protection': '1; mode=block', 'X-Request-Id': '14a0d0df-c14d-477d-b87c-b6edb823619c', 'Location': 'https://s.dbr.ee/sffc/python%2Dlogo%2Dmaster%2Dv3%2DTM.png.zip?temp_url_sig=084b2b71c8c12df993d528e991a5b44e46e974ef&temp_url_expires=1499644195&filename=python-logo-master-v3-TM.png.zip;&attachment', 'X-Runtime': '0.006968', 'X-Frame-Options': 'SAMEORIGIN', 'X-Content-Type-Options': 'nosniff', 'Server': 'cloudflare-nginx', 'CF-RAY': '37bf2769fda80fa5-YYZ'}
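Once the 302 comes back with the right Location, the download URL can be pulled out of the headers before deciding whether to follow it. A small sketch (the helper name is mine, keyed on the Status/Location fields this server includes in its headers, as shown above):

```python
def redirect_target(headers):
    """Return the Location header from a 302 response, else None."""
    if headers.get('Status', '').startswith('302'):
        return headers.get('Location')
    return None
```

`redirect_target(r.headers)` would then give the s.dbr.ee download link, which a second `requests.get` can fetch. In general, `r.status_code == 302` is the more reliable check, since not every server echoes a Status header.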
I want to make a web crawler to compile statistics about the most popular server software among Bulgarian sites, such as Apache, nginx, etc. Here is what I came up with:
import requests
r = requests.get('http://start.bg')
print(r.headers)
Which returns the following:
{'Debug': 'unk',
'Content-Type': 'text/html; charset=utf-8',
'X-Powered-By': 'PHP/5.3.3',
'Content-Length': '29761',
'Connection': 'close',
'Set-Cookie': 'fbnr=1; expires=Sat, 13-Feb-2016 22:00:01 GMT; path=/; domain=.start.bg',
'Date': 'Sat, 13 Feb 2016 13:43:50 GMT',
'Vary': 'Accept-Encoding',
'Server': 'Apache/2.2.15 (CentOS)',
'Content-Encoding': 'gzip'}
Here you can easily see that it runs on Apache/2.2.15, and you can get this value simply with r.headers['Server']. I tried that with several Bulgarian websites and they all had the Server key.
However, when I request the header of a more sophisticated website, such as www.teslamotors.com, I get the following info:
{'Content-Type': 'text/html; charset=utf-8',
'X-Cache-Hits': '9',
'Cache-Control': 'max-age=0, no-cache, no-store',
'X-Content-Type-Options': 'nosniff',
'Connection': 'keep-alive',
'X-Varnish-Server': 'sjc04p1wwwvr11.sjc05.teslamotors.com',
'Content-Language': 'en',
'Pragma': 'no-cache',
'Last-Modified': 'Sat, 13 Feb 2016 13:07:50 GMT',
'X-Server': 'web03a',
'Expires': 'Sat, 13 Feb 2016 13:37:55 GMT',
'Content-Length': '10290',
'Date': 'Sat, 13 Feb 2016 13:37:55 GMT',
'Vary': 'Accept-Encoding',
'ETag': '"1455368870-1"',
'X-Frame-Options': 'SAMEORIGIN',
'Accept-Ranges': 'bytes',
'Content-Encoding': 'gzip'}
As you can see, there isn't a 'Server' key in this dictionary (although there are X-Server and X-Varnish-Server, which I'm not sure about, their values are not server names like Apache).
So I'm thinking there must be another request I could send that would yield the desired server information, or perhaps they run their own specific server software (which sounds plausible for Facebook).
I also tried other .com websites, such as https://spotify.com, and it does have a 'Server' key.
So is there a way to find out which servers Facebook and Tesla Motors use?
That has nothing to do with Python; most well-configured web servers will not return information in the Server HTTP header because of its security implications.
No sane developer would want to let you know that they are running an unpatched version of some product.
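Given that, a crawler of this kind simply has to tolerate the missing key. A sketch of tallying whatever the sites do reveal (the function name is mine):

```python
from collections import Counter

def tally_servers(header_dicts):
    """Count Server header values across responses, keeping only the
    product name (version strings are often stripped anyway) and
    bucketing absent headers as 'undisclosed'."""
    return Counter(
        h.get('Server', 'undisclosed').split('/')[0].split(' ')[0]
        for h in header_dicts
    )
```

Feeding it the two responses above would count one 'Apache' and one 'undisclosed', which is arguably the honest statistic: sites that hide the header are their own category.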
After several attempts and repeated failures, I am posting my code excerpt here. I keep getting an authentication failure. Can somebody point out what it is that I am doing wrong here?
import requests
fileToUpload = {'file': open('/home/pinku/Desktop/Test_Upload.odt', 'rb')}
res = requests.post('https://upload.backupgrid.net/add', fileToUpload)
print res.headers
cookie = {'PHPSESSID': 'tobfr5f31voqmtdul11nu6n9q1'}
requests.post('https://upload.backupgrid.net/add', cookie, fileToUpload)
By print res.headers, I get the following:
CaseInsensitiveDict({'content-length': '67',
'access-control-allow-methods': 'OPTIONS, HEAD, GET, POST, PUT,
DELETE', 'x-content-type-options': 'nosniff', 'content-encoding':
'gzip', 'set-cookie': 'PHPSESSID=ou8eijalgpss204thu7ht532g1; path=/,
B100Serverpoolcookie=4281246842.1.973348976.502419456; path=/',
'expires': 'Thu, 19 Nov 1981 08:52:00 GMT', 'vary': 'Accept-Encoding',
'server': 'Apache/2.2.15 (CentOS)', 'pragma': 'no-cache',
'cache-control': 'no-store, no-cache, must-revalidate', 'date': 'Mon,
09 Sep 2013 09:13:08 GMT', 'access-control-allow-origin': '*',
'access-control-allow-headers': 'X-File-Name, X-File-Type,
X-File-Size', 'content-type': 'text/html; charset=UTF-8'})
It contains the cookies also. Am I passing the cookies correctly? Please help!
You are not passing the cookies correctly; it should be:
requests.post('https://upload.backupgrid.net/add',
              files=fileToUpload,
              cookies=cookie)
See also documentation:
Cookies
POST a Multipart-Encoded File
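As a side note, the PHPSESSID hard-coded in the question can instead be read out of the first response's headers; a sketch of parsing it (the helper name is mine), although a `requests.Session` would store and re-send the cookie automatically, making the manual step unnecessary:

```python
def phpsessid(set_cookie):
    """Pull the PHPSESSID value out of a raw Set-Cookie header string."""
    for chunk in set_cookie.replace(',', ';').split(';'):
        chunk = chunk.strip()
        if chunk.startswith('PHPSESSID='):
            return chunk.split('=', 1)[1]
    return None

# With a Session, none of this is needed: the cookie set by the first
# POST is re-sent on the second one.
# s = requests.Session()
# s.post('https://upload.backupgrid.net/add', files=fileToUpload)
# s.post('https://upload.backupgrid.net/add', files=fileToUpload)
```

Applied to the 'set-cookie' value printed above, this returns 'ou8eijalgpss204thu7ht532g1'.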
I am trying to authenticate users of my Django application with Facebook via the oauth2 Python package.
import oauth2
from django.conf import settings
from django.http import HttpResponse

def myView(request):
    consumer = oauth2.Consumer(
        key=settings.FACEBOOK_APP_ID,
        secret=settings.FACEBOOK_APP_SECRET)

    # Request token URL for Facebook.
    request_token_url = "https://www.facebook.com/dialog/oauth/"

    # Create client.
    client = oauth2.Client(consumer)

    # The OAuth Client request works just like httplib2 for the most part.
    resp, content = client.request(request_token_url, "GET")

    # Return a response that prints out the Facebook response and content.
    return HttpResponse(str(resp) + '\n\n ------ \n\n' + content)
However, when I go to this view I am directed to a page that contains an error. The error includes this response from Facebook:
{'status': '200', 'content-length': '16418', 'x-xss-protection': '0',
'content-location': u'https://www.facebook.com/dialog/oauth/?oauth_body_hash=2jmj7l5rSw0yVb%2FvlWAYkK%2FYBwk%3D&oauth_nonce=53865791&oauth_timestamp=1342666292&oauth_consumer_key=117889941688718&oauth_signature_method=HMAC-SHA1&oauth_version=1.0&oauth_signature=XD%2BZKqhJzbOD8YBJoU1WgQ4iqtU%3D',
'x-content-type-options': 'nosniff',
'transfer-encoding': 'chunked',
'expires': 'Sat, 01 Jan 2000 00:00:00 GMT',
'connection': 'keep-alive',
'-content-encoding': 'gzip',
'pragma': 'no-cache',
'cache-control': 'private, no-cache, no-store, must-revalidate',
'date': 'Thu, 19 Jul 2012 02:51:33 GMT',
'x-frame-options': 'DENY',
'content-type': 'text/html; charset=utf-8',
'x-fb-debug': 'yn3XYqMylh3KFcxU9+FA6cQx8+rFtP/9sJICRgj3GOQ='}
Does anyone see anything awry in my code? I have tried concatenating arguments as strings to request_token_url to no avail. I am sure that my Facebook app ID and secret string are correct.
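One thing worth checking: Facebook's /dialog/oauth endpoint speaks OAuth 2.0, while the oauth2 package implements OAuth 1.0 request signing, so the dialog most likely ignores the oauth_* parameters and just serves its HTML page (the 200 with text/html above is consistent with that). A sketch of building the OAuth 2.0 authorization URL instead (the function name and redirect URI are placeholders of mine):

```python
from urllib.parse import urlencode

def fb_auth_url(app_id, redirect_uri):
    """Build the authorization URL Facebook's OAuth 2.0 dialog expects;
    the user is sent here, then redirected back to redirect_uri with
    a ?code= parameter to exchange for an access token."""
    params = {
        'client_id': app_id,
        'redirect_uri': redirect_uri,
        'response_type': 'code',
    }
    return 'https://www.facebook.com/dialog/oauth/?' + urlencode(params)
```

The view would then redirect to `fb_auth_url(settings.FACEBOOK_APP_ID, ...)` rather than fetching the dialog server-side with a signed client.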