Session error when using POST with requests or urllib2 - Python

I am trying to authenticate to a website and fill out a form using the requests library:
import requests

# Log in first so the session cookie is stored on the Session object.
payload = {"name": "someone", "password": "somepass", "submit": "Submit"}
s = requests.Session()
s.post("https://someurl.com", data=payload)

# Then submit the second form through the same session.
next_payload = {"item1": "something", "item2": "something", "submit": "Submit"}
r = s.post("https://someurl.com", data=next_payload)
print(r.text)
Authentication works, and I have verified that I can post to other forms, but this one gives: "The action could not be completed, perhaps because your session had expired. Please try again."
I attempted the same thing in urllib2 with the same result, so I don't think it is a cookie issue.
I am wondering if JavaScript on this page has something to do with the session error? The other form page doesn't have any JavaScript.
Thanks for your input.
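One common cause of this kind of "session expired" error is a hidden token field (for example a CSRF token) that the server renders into the form and expects back with the POST; a form without JavaScript may simply not use one. A minimal sketch of collecting hidden inputs with the standard library, using illustrative markup (the field name csrf_token and its value are made up):

```python
from html.parser import HTMLParser

# Illustrative form markup; the real page would come from s.get(form_url).text
# fetched through the same requests.Session used for the login.
FORM_HTML = """
<form action="/submit" method="post">
  <input type="hidden" name="csrf_token" value="abc123">
  <input type="text" name="item1">
  <input type="submit" name="submit" value="Submit">
</form>
"""

class HiddenInputCollector(HTMLParser):
    """Collect the name/value pairs of every <input type="hidden">."""
    def __init__(self):
        super().__init__()
        self.hidden = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "input" and a.get("type") == "hidden":
            self.hidden[a.get("name")] = a.get("value", "")

collector = HiddenInputCollector()
collector.feed(FORM_HTML)

# Merge the hidden fields into the payload before posting, so the server
# sees the token it rendered into the page.
payload = {"item1": "something", "item2": "something", "submit": "Submit"}
payload.update(collector.hidden)
print(payload["csrf_token"])  # abc123
```

If the hidden values change on every page load, fetch the form and post the merged payload within the same session.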

Related

Log into a website using Requests module in Python

I am new to Python and web scraping, but I keep learning. I have managed to get some exciting results using the BeautifulSoup and Requests libraries, and my next goal is to log into a website that allows remote access to my heating system, to do some web scraping and maybe extend its capabilities further.
Unfortunately, I got stuck. I used Mozilla's Web Dev Tools to see the URL that the form posts to and the name attributes of the username and password fields. The webpage URL is https://emodul.pl/login and the request payload looks as follows:
{"username":"my_username","password":"my_password","rememberMe":false,"languageId":"en","remote":false}
I am using a requests.Session() instance to make a POST request to the login URL with the above-mentioned payload:
import requests

url = 'https://emodul.pl/login'
payload = {'username':'my_username','password':'my_password','rememberMe':False,'languageId':'en','remote':False}
with requests.Session() as s:
    p = s.post(url, data=payload)
    print(p.text)
Apparently I'm doing something wrong because I'm getting the "error":"Sorry, something went wrong. Try again." response.
Any advice will be much appreciated.
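One thing to check: the payload captured in the dev tools contains bare false values, which suggests the endpoint expects a JSON body rather than form-encoded data. A quick way to see the difference between what data=payload and json=payload would send, using only the standard library (no request is made):

```python
import json
from urllib.parse import urlencode

# Payload and field names taken from the question.
payload = {'username': 'my_username', 'password': 'my_password',
           'rememberMe': False, 'languageId': 'en', 'remote': False}

# What requests sends with data=payload: form-encoded, booleans stringified.
form_body = urlencode(payload)

# What the browser sent (and what json=payload produces): real JSON booleans.
json_body = json.dumps(payload)

print(form_body)  # ...&rememberMe=False&... (the string "False", not a boolean)
print(json_body)  # ... "rememberMe": false ...
```

If the JSON body matches what the browser sends, p = s.post(url, json=payload) is the likely fix; json= also sets the Content-Type: application/json header automatically.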

Get cookies from selenium to requests

I can log in to a website with Selenium, and I can retrieve all the cookies.
But then I have to submit a request to the site quickly, and Selenium is too slow for that.
That's why I want to grab the cookies with Selenium and send the requests via the requests module.
My Selenium code (first I log in to the website and retrieve all the cookies with the code below):
# browser is an already-configured Selenium webdriver instance
browser.get('https://www.example.com/login')
cookiem1 = browser.get_cookies()
print(cookiem1)
In the second stage, I go to another page of the website and make a request:
s = requests.Session()
for cookie in cookiem1:
    s.cookies.set(cookie['name'], cookie['value'])
r = s.get("https://example.com/postcomment")
print(r.content)
I pass the cookies this way, but when I send the request via the requests module, the site does not authorize my user.
My error:
"errorMessage": "Unauthorized user",\r\n "errorDetails": "No cookie"
Probably the site does not recognize my session with this code.
Thanks in advance
Try this:
import requests

ck = browser.get_cookies()
s = requests.Session()
# Copy each Selenium cookie into the requests session's cookie jar.
for c in ck:
    s.cookies.set(c['name'], c['value'])
response = s.get("https://example.com/postcomment")
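If the server still rejects the session, the cookie metadata may be the problem: Selenium's get_cookies() returns dicts that also carry domain and path, and dropping those can make requests scope the cookie to the wrong host. A sketch that carries the metadata over (the cookie values below are illustrative; in practice ck comes from browser.get_cookies()):

```python
import requests

# Illustrative Selenium-style cookie dicts; in practice: ck = browser.get_cookies()
ck = [
    {"name": "sessionid", "value": "abc123", "domain": "example.com", "path": "/"},
]

s = requests.Session()
for cookie in ck:
    # Keep domain and path so requests sends the cookie to the right host.
    s.cookies.set(cookie["name"], cookie["value"],
                  domain=cookie.get("domain"), path=cookie.get("path", "/"))

# Many sites also check the User-Agent; copy the browser's real UA string here.
s.headers["User-Agent"] = "Mozilla/5.0"

print(s.cookies.get("sessionid"))  # abc123
```

Mirroring the browser's User-Agent on the session is often worth trying as well, since some sites refuse requests that carry a cookie but an unfamiliar UA.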

Trouble logging in to website using python requests module

I'm trying to log in to the Starbucks website (login URL: https://app.starbucks.com/account/signin?ReturnUrl=https%3A%2F%2Fapp.starbucks.com%2Fprofile) with no success.
I used the Firefox inspect tool to find the URL I am supposed to send a POST request to and what the payload data should look like. I found that the request URL is "https://www.starbucks.com/bff/account/signin" and the payload is something like {"username":"my_username","password":"my_password"}, so here's my code:
import requests

url = 'https://www.starbucks.com/bff/account/signin'
uname = "my_username"
pwd = "my_password"
payload = {"username":uname, "password":pwd}
with requests.Session() as s:
    p = s.post(url, data=payload)
    print(p.status_code)
The status code that is printed is always 200, which is strange, because whenever I type invalid credentials manually, I see a 400 response code in the network tab of the inspect tool. Also, whenever I do print(p.content) instead of printing the status code, the content is always the same (for both wrong and correct credentials).
Can somebody help me out?
Thanks in advance
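The payload shown in the dev tools is a JSON document, but data=payload sends a form-encoded body, and many JSON endpoints answer such requests with a generic page regardless of the credentials, which would explain the constant 200. One way to see exactly what requests would put on the wire, without sending anything, is to prepare both variants (URL and field names taken from the question):

```python
import requests

url = "https://www.starbucks.com/bff/account/signin"
payload = {"username": "my_username", "password": "my_password"}

# data= produces a form-encoded body ...
form = requests.Request("POST", url, data=payload).prepare()
# ... while json= produces a JSON body and sets the Content-Type header.
as_json = requests.Request("POST", url, json=payload).prepare()

print(form.body)                        # username=my_username&password=my_password
print(form.headers["Content-Type"])     # application/x-www-form-urlencoded
print(as_json.headers["Content-Type"])  # application/json
```

If the endpoint expects JSON, s.post(url, json=payload) is the likely fix; after posting, also check p.url and p.history to see whether the request was redirected to a generic page.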

How can I add a cookie to the headers?

I want to build an automated testing tool using an API.
First, I log in to the site and get a cookie.
My code is Python 3:
import urllib3
from bs4 import BeautifulSoup

url = 'http://ip:port/api/login'
http = urllib3.PoolManager()
r = http.request('POST', url, fields={'userName': 'id', 'password': 'password'})
soup = BeautifulSoup(r.data.decode('utf-8'), 'lxml')

# Extract the JSESSIONID cookie from the Set-Cookie response header.
str1 = r.getheaders().get('Set-Cookie')
str2 = 'JSESSIONID' + str1.split('JSESSIONID')[1]
str2 = str2[0:-2]
print(str2)
-- JSESSIONID=df0010cf-1273-4add-9158-70d817a182f7; Path=/; HttpOnly
Then I add the cookie to the headers for another API on the same site.
But it is not working!
url2 = 'http://ip:port/api/notebook/job/paragraph'
r2 = http.request('POST',url2)
r2.headers['Set-Cookie']=str2
r2.headers['Cookie']=str2
http.request('POST',url2, headers=r2.headers)
Why is it not working? It shows another cookie.
If you understand this situation, please explain it to me.
The error contents are:
HTTP ERROR 500
Problem accessing /api/login;JSESSIONID=b8f6d236-494b-4646-8723-ccd0d7ef832f.
Reason: Server Error
Caused by: javax.servlet.ServletException: Filtered request failed.
ProtocolError: ('Connection aborted.', BadStatusLine('<html>\n',))
thanks a lot!
Use the requests module in Python 3.x. You have to create a session, which you are not doing now; that is why you are facing problems.
import requests

s = requests.Session()
url = 'http://ip:port/api/login'
r = s.get(url)
dct = s.cookies.get_dict()  # returns any cookies set so far as a dict
Take whichever cookie the server wants, plus all the headers it requires, and pass them in the request headers:
jid = dct["JSESSIONID"]
head = {'Cookie': 'JSESSIONID=' + jid}  # add any other required headers here
payload = {'userName': 'id', 'password': 'password'}
r = s.post(url, data=payload, headers=head)
r = s.get('whatever url after login')
To find out which specific headers you have to pass and all the parameters required for the POST:
Open the link in Google Chrome.
Open the Developer Console (Fn + F12).
There, search for the login request in the Network tab (if you cannot find it, enter wrong details and submit).
You will get info about the request headers and POST parameters.
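As an alternative to slicing the Set-Cookie string by hand, the standard library can parse it. A sketch using http.cookies with the header value captured in the question:

```python
from http.cookies import SimpleCookie

# The Set-Cookie value captured in the question.
raw = "JSESSIONID=df0010cf-1273-4add-9158-70d817a182f7; Path=/; HttpOnly"

cookie = SimpleCookie()
cookie.load(raw)

# Build the value to send back in the Cookie request header; attributes
# like Path and HttpOnly are metadata and are not sent back to the server.
header_value = "; ".join(f"{name}={morsel.value}" for name, morsel in cookie.items())
print(header_value)  # JSESSIONID=df0010cf-1273-4add-9158-70d817a182f7
```

The resulting string can then be passed as headers={'Cookie': header_value} on the follow-up request.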

Python - login to website using requests

I have to admit I am completely clueless about this: I need to log in to this site, https://segreteriaonline.unisi.it/Home.do, and then perform some actions. The problem is that I cannot find the form to use in the source of the webpage, and I have basically never tried to log in to a website via Python.
This is the simple code I wrote:
import requests

url_from = 'https://segreteriaonline.unisi.it/Home.do'
url_in = 'https://segreteriaonline.unisi.it/auth/Logon.do'
data = {'form': '1', 'username': 'myUser', 'password': 'myPass'}
s = requests.Session()
s.get(url_from)
r = s.post(url_in, data)
print(r)
Obviously, what I get is:
<Response [401]>
Any suggestions?
Thanks in advance.
You need to use requests' authentication support. For example:
from requests.auth import HTTPBasicAuth
requests.get('https://api.github.com/user', auth=HTTPBasicAuth('user', 'pass'))
<Response [200]>
That site appears not to have a login form; instead it uses HTTP Basic auth (which causes the browser to prompt for the username and password). requests supports that via the auth argument to get, so you should be able to do something like this:
s.get(url_in, auth=('myUser', 'myPass'))
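Putting the two answers together: with Basic auth attached to the session, every request made through it carries an Authorization header. A sketch that prepares (but does not send) a request to show that header, using the placeholder credentials and URL from the question:

```python
import base64

import requests
from requests.auth import HTTPBasicAuth

s = requests.Session()
s.auth = HTTPBasicAuth("myUser", "myPass")  # credentials from the question

# Prepare the request without sending it, to inspect what Basic auth adds.
req = requests.Request("GET", "https://segreteriaonline.unisi.it/auth/Logon.do")
prepared = s.prepare_request(req)

# Basic auth is just "Basic " + base64("user:pass") in the Authorization header.
expected = "Basic " + base64.b64encode(b"myUser:myPass").decode()
print(prepared.headers["Authorization"] == expected)  # True
```

Setting s.auth once means later calls like s.get(...) in the same session authenticate automatically, without repeating the auth argument.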
