HTTPCookieProcessor not serving cookies - python

I am trying to access a website that requires cookies. Using urllib2 and cookielib I am able to get a response from the site. The HTML printout informs me that I am not getting access with the line:
<h2>Cookies Disabled</h2>
<p class="share-prompt"><strong>Cookies must be enabled.</strong></p>
I cannot understand where I am going wrong. Code below:
import urllib2, cookielib
cookieJar = cookielib.CookieJar()
opener = urllib2.build_opener(
    urllib2.ProxyHandler({'http': "http://216.208.156.69:3128"}),
    urllib2.HTTPCookieProcessor(cookieJar))
request = urllib2.Request("[website]")
response = opener.open(request)
print response.read()
Can anyone see where I have gone wrong?
Cheers,

The code looks good. For example, the output from this
import urllib, urllib2, cookielib
cookieJar = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cookieJar))
params = urllib.urlencode({'cookie_name': 'cookie_value'})
request = urllib2.Request('http://httpbin.org/cookies/set?' + params)
opener.open(request)
request = urllib2.Request('http://httpbin.org/cookies')
response = opener.open(request)
print response.read()
is
{
  "cookies": {
    "cookie_name": "cookie_value"
  }
}
Without seeing the actual URL you use, not much more can be done.
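For what it's worth, you can also check locally what lands in the jar without hitting the real site by handing the jar a canned response yourself. A sketch in Python 3 spelling (http.cookiejar and urllib.request are the renamed cookielib and urllib2); the FakeResponse helper is made up for the demo:

```python
import email.message
import http.cookiejar
import urllib.request

class FakeResponse:
    """Made-up stand-in for a urllib response carrying a Set-Cookie header."""
    def __init__(self, set_cookie):
        self._msg = email.message.Message()
        self._msg['Set-Cookie'] = set_cookie
    def info(self):
        # CookieJar reads response headers through .info()
        return self._msg

jar = http.cookiejar.CookieJar()
request = urllib.request.Request("http://example.com/")
# Feed the jar a canned Set-Cookie header as if the server had sent it
jar.extract_cookies(FakeResponse("sid=abc; Path=/"), request)

for cookie in jar:
    print(cookie.name, cookie.value)
```

If nothing prints here either, the problem is in how the site sets its cookies (e.g. JavaScript-set cookies, which urllib2 never sees), not in your jar setup.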

Related

python httplib: connect through proxy with authentication

I am trying to send a GET request through a proxy with authentication.
I have the following existing code:
import httplib
username = 'myname'
password = '1234'
proxyserver = "136.137.138.139"
url = "http://google.com"
c = httplib.HTTPConnection(proxyserver, 83, timeout = 30)
c.connect()
c.request("GET", url)
resp = c.getresponse()
data = resp.read()
print data
When running this code, I get a response from the proxy saying that I must provide authentication, which is correct.
My code doesn't use the login and password anywhere. My problem is that I don't know how to use them!
Any ideas?
You can refer to this code if you specifically want to use httplib:
https://gist.github.com/beugley/13dd4cba88a19169bcb0
But you could also use the easier requests module:
import requests

proxies = {
    "http": "http://username:password@proxyserver:port/",
    # "https": "https://username:password@proxyserver:port/",
}
url = 'http://google.com'
response = requests.get(url, proxies=proxies)
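If you do want to stay with httplib, a Basic Proxy-Authorization header can be added by hand. A minimal sketch, shown with Python 3's http.client (the renamed httplib) and assuming the proxy accepts Basic authentication:

```python
import base64
import http.client  # Python 2's httplib was renamed http.client in Python 3


def proxy_get(proxyserver, port, url, username, password, timeout=30):
    """GET `url` through a proxy that requires Basic authentication."""
    credentials = base64.b64encode(
        ('%s:%s' % (username, password)).encode()).decode()
    headers = {'Proxy-Authorization': 'Basic ' + credentials}
    conn = http.client.HTTPConnection(proxyserver, port, timeout=timeout)
    # The auth header rides along with the GET sent to the proxy
    conn.request("GET", url, headers=headers)
    return conn.getresponse()


# The header value is just the base64 of "username:password":
credentials = base64.b64encode(b'myname:1234').decode()
print('Proxy-Authorization: Basic ' + credentials)
```

The same header-building idea is what the linked gist and the requests proxy URL do under the hood.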

how to send cookies inside a POST request

I am trying to send a POST request with the cookies obtained from a previous GET request.
#! /usr/bin/python
import re #regex
import urllib
import urllib2
#get request
x = urllib2.urlopen("http://www.example.com")  # GET request
cookies = x.headers['set-cookie']  # to get the cookies from the GET request
url = 'http://example'  # to know the values, type any password to see the cookies
values = {"username": "admin",
          "passwd": password,
          "lang": "",
          "option": "com_login",
          "task": "login",
          "return": "aW5kZXgucGhw"}
data = urllib.urlencode(values)
req = urllib2.Request(url, data)
response = urllib2.urlopen(req)
result = response.read()
cookies = response.headers['set-cookie']  # to get the last cookies from the POST request
Then I searched Google for how to send cookies inside the same POST request and found:
opener = urllib2.build_opener()  # send the cookies
opener.addheaders.append(('Cookie', cookies))  # send the cookies
f = opener.open("http://example")
but I don't know exactly where I should put this in my code.
What I need to do exactly is:
send a GET request, put the cookies from the response in a variable, then make a POST request with the cookies that I got from the GET request.
If anyone knows the answer, I need an edit on my code.
Just create an HTTP opener with a cookiejar handler, so cookies are retrieved and passed along to the next request automatically. See:
import urllib2 as net
import cookielib
import urllib
cookiejar = cookielib.CookieJar()
cookiejar.clear_session_cookies()
opener = net.build_opener(net.HTTPCookieProcessor(cookiejar))
data = urllib.urlencode(values)
request = net.Request(url, data)  # data is already urlencoded; don't encode it twice
response = opener.open(request)
As the opener handles every request, the cookies set by a previous response will be sent with the next request (POST or GET) automatically.
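In Python 3 spelling the same pattern looks like this; installing the opener makes it the process-wide default, so even plain urlopen() calls share the jar (a minimal sketch):

```python
import http.cookiejar
import urllib.request

# Python 3 names for the same pieces: cookielib became http.cookiejar
# and urllib2 became urllib.request.
cookiejar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(cookiejar))

# Install the opener globally so every urllib.request.urlopen() call
# from now on shares the same cookie jar.
urllib.request.install_opener(opener)
```

After install_opener, code elsewhere in the program does not need a reference to the opener to benefit from the shared cookies.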
You should really look into the requests library Python has to offer. All you need to do is make a dictionary for your cookie key/value pairs and pass it in as an argument.
Your entire code could be replaced by:
import requests

url = 'http://example'  # to know the values, type any password to see the cookies
values = {"username": "admin",
          "passwd": password,
          "lang": "",
          "option": "com_login",
          "task": "login",
          "return": "aW5kZXgucGhw"}
session = requests.Session()
response = session.get(url, data=values)
cookies = session.cookies.get_dict()
response = requests.post(url, data=values, cookies=cookies)
The second piece of code is probably what you want, but it depends on the format of the response.

How to get cookies from urllib.request?

How do I get the cookies from an urllib.request?
import urllib.request
import urllib.parse
data = urllib.parse.urlencode({
    'user': 'user',
    'pass': 'pass'
})
data = data.encode('utf-8')
request = urllib.request.urlopen('http://example.com', data)
print(request.info())
request.info() returns the cookies, but not in a very usable way.
response.info() is a Message object that can be queried like a dict, so you can parse out any info you need. Here is a demo written in Python 3:
from urllib import request
from urllib.error import HTTPError
# declare url, header_params first
req = request.Request(url, data=None, headers=header_params, method='GET')
try:
    response = request.urlopen(req)
    cookie = response.info().get_all('Set-Cookie')
    content_type = response.info()['Content-Type']
except HTTPError as err:
    print("err status: {0}".format(err))
You can now parse the cookie variable as your application requires.
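The raw strings that get_all('Set-Cookie') returns can be split into names, values, and attributes with the standard library's http.cookies.SimpleCookie; a minimal sketch with a made-up header value:

```python
from http.cookies import SimpleCookie

# A made-up header string of the kind get_all('Set-Cookie') returns
raw = 'sessionid=abc123; Path=/; HttpOnly'

cookie = SimpleCookie()
cookie.load(raw)  # parse the raw header string into Morsel objects

for name, morsel in cookie.items():
    print(name, morsel.value)  # the cookie's name and bare value
    print(morsel['path'])      # individual attributes are indexable
```

Each parsed cookie is a Morsel, so both the value and attributes like Path or Expires are directly accessible without manual string splitting.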
I just used the following code to get the cookie from Python Challenge #17; hope it helps (Python 3.8 being used):
import http.cookiejar
import urllib.request

cookiejar = http.cookiejar.CookieJar()
cookieproc = urllib.request.HTTPCookieProcessor(cookiejar)
opener = urllib.request.build_opener(cookieproc)
response = opener.open(url)  # url is the page that sets the cookie
for cookie in cookiejar:
    print(cookie.name, cookie.value)
I think using the requests package is a much better choice these days. Try this sample code, which shows Google setting cookies when you visit:
import requests

url = "http://www.google.com"
r = requests.get(url, timeout=5)
if r.status_code == 200:
    for cookie in r.cookies:
        print(cookie)  # Use "print cookie" if you use Python 2.
Gives:
Cookie NID=67=n0l3ME1Jl3-wwlH7oE5pvxJ_CfU12hT5Kh65wh21bvE3hrKFAo1sJVj_UcuLCr76Ubi3yxENROaYNEitdgW4IttL43YZGlf8xAPl1IbzoLG31KP5U2tiP2y4DzVOJ2fA for .google.se/
Cookie PREF=ID=ce66d1288fc0d977:FF=0:TM=1407525509:LM=1407525509:S=LxQv7q8fju-iHJPZ for .google.se/

Python client for multipart form with CAS

I am trying to write a Python script to POST a multipart form to a site that requires authentication through CAS.
There are two approaches that both solve part of the problem:
The Python requests library works well for submitting multipart forms.
There is caslib, with a login function. It returns an OpenerDirector that can presumably be used for further requests.
Unfortunately, I can't figure out how to get a complete solution out of what I have so far.
These are just some ideas from a couple of hours of research; I am open to just about any solution that works.
Thanks for the help.
I accepted J.F. Sebastian's answer because I think it was closest to what I'd asked, but I actually wound up getting it to work by using mechanize, a Python library for web browser automation.
import argparse
import mechanize
import re
import sys
# (SENSITIVE!) Authentication info
username = r'username'
password = r'password'
# Command line arguments
parser = argparse.ArgumentParser(description='Submit lab to CS 235 site (Winter 2013)')
parser.add_argument('lab_num', help='Lab submission number')
parser.add_argument('file_name', help='Submission file (zip)')
args = parser.parse_args()
# Go to login site
br = mechanize.Browser()
br.open('https://cas.byu.edu/cas/login?service=https%3a%2f%2fbeta.cs.byu.edu%2f~sub235%2fsubmit.php')
# Login and forward to submission site
br.form = br.forms().next()
br['username'] = username
br['password'] = password
br.submit()
# Submit
br.form = br.forms().next()
br['labnum'] = list(args.lab_num)
br.add_file(open(args.file_name), 'application/zip', args.file_name)
r = br.submit()
for s in re.findall('<h4>(.+?)</?h4>', r.read()):
    print s
You could use poster to prepare the multipart/form-data. Try passing poster's opener to caslib and using caslib's opener to make requests (not tested):
import urllib2
import caslib
import poster.encode
import poster.streaminghttp
opener = poster.streaminghttp.register_openers()
r, opener = caslib.login_to_cas_service(login_url, username, password,
opener=opener)
params = {'file': open("test.txt", "rb"), 'name': 'upload test'}
datagen, headers = poster.encode.multipart_encode(params)
response = opener.open(urllib2.Request(upload_url, datagen, headers))
print response.read()
You could write an authentication handler for requests using caslib. Then you could do something like:
auth = CasAuthentication("url", "login", "password")
response = requests.get("http://example.com/cas_service", auth=auth)
Or if you're making tons of requests against the website:
s = requests.session()
s.auth = auth
s.post('http://casservice.com/endpoint', data={'key': 'value'}, files={'filename': '/path/to/file'})

count cookies in python

I am struggling with Python. I want to write a Python script that creates a cookie and counts how many times the cookie is called during a session.
Here is what I have tried so far:
import Cookie
import os

def getCookie(initialvalues):
    if os.environ.has_key('HTTP_COOKIE'):
        cookie = Cookie.SimpleCookie(os.environ['HTTP_COOKIE'])
    else:
        cookie = Cookie.SimpleCookie()
    for key in initialvalues.keys():
        if not cookie.has_key(key):
            cookie[key] = initialvalues[key]
    return cookie

if __name__ == '__main__':
    c = getCookie({'counter': 0})
    c['counter'] = int(c['counter'].value) + 1
    print c
But I know it is wrong, can someone help me to write down the script?
Any help would be appreciated
I'm confused by your question. What I believe you want to do is request some webpage and count how many times your cookie was found. You can gather cookies using a CookieJar:
import urllib, urllib2, cookielib

url = "http://example.com/cookies"
form_data = {'username': '', 'password': ''}
jar = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(jar))
form_data = urllib.urlencode(form_data)
# data returned from this page contains redirection
resp = opener.open(url, form_data)
print resp.read()
for cookie in jar:
    # Look for your cookie
    print cookie.name, cookie.value
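Coming back to the counter in the question itself, here is a minimal sketch in Python 3 spelling (the Cookie module became http.cookies), with the HTTP_COOKIE environment variable standing in for the browser's return visit:

```python
import os
from http.cookies import SimpleCookie

def get_cookie(initial_values):
    """Load the cookie from the CGI environment, filling in missing defaults."""
    cookie = SimpleCookie(os.environ.get('HTTP_COOKIE', ''))
    for key, value in initial_values.items():
        if key not in cookie:
            cookie[key] = value
    return cookie

# Simulate the browser sending back the cookie from a previous request
os.environ['HTTP_COOKIE'] = 'counter=2'

c = get_cookie({'counter': 0})
c['counter'] = int(c['counter'].value) + 1
print(c)  # emits the Set-Cookie header with the incremented count
```

Each request reads the counter sent back by the browser, increments it, and re-emits it as a Set-Cookie header, which is all the "counting during a session" amounts to.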
