How to use a Python Requests session for a long time? - python

I don't know exactly how to ask this question, but I will explain it.
I use Requests to log in to a website like this:
import requests

URL = 'http://test.dev/api/login'
with requests.session() as s:
    s.get(URL)
    login_data = {'username': 'test', 'password': 'testtest'}
    ra = s.post(URL, data=login_data)
    # Now that I have the session and cookie I can go to other pages, like
    r = s.get('http://test.dev/api/user/1')
When the login succeeds, the server sends the session and cookie back.
But if I go outside the "with" block, do I have to log in again?
The question is: how can I go to another page without the "with" block?
EDIT 1:
I tried to use s = requests.Session() at first, but it errored like this:
Python Requests trying to post data to Laravel
EDIT 2:
I tried to use s = requests.session() again and now it works, I don't know why.
Thank you all.

To expand on Daniel's answer, the with block is a context manager. It opens a new context for you, in your case your call to requests.session(). Once the block is completed, the context is closed.
In your example, once the line r = s.get(...) is completed there is no more code inside the context, so it is closed. For example, these two blocks achieve the same result:
with open('file.txt', 'w') as fh:
    fh.write('Hello, world')
and
fh = open('file.txt', 'w')
fh.write('Hello, world')
fh.close()
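Strictly speaking, the second form only matches the with block if close() still runs when write() raises an error; a closer equivalent (a small sketch, using the same file name as above) wraps the work in try/finally:
fh = open('file.txt', 'w')
try:
    fh.write('Hello, world')
finally:
    fh.close()  # runs even if write() raised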

If you don't want to close the session at the end, don't use a with block. Just do
s = requests.session()
and pass s around as necessary.

Expanding on this a little more:
The " Python Requests trying to post data to Laravel " error simply means there is a problem with the Laravel server - not a python issue.
s=requests.session()
should work just fine and you can use that "s" session object wherever you wish. You can even pass the "s" object as a parameter to other functions.
Using the
with requests.session() as s:
merely limits the scope of "s" object to the indented block of code under the "with"... You can pass the "s" to functions within that block, but coding outside of that block leaves the context of that "s" object.
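As an illustration, here is a minimal sketch of creating a session without a with block, passing it to other functions, and closing it explicitly when you are done (the URLs and credentials reuse the question's examples; the function names are made up for the sketch):
import requests

def login(s):
    # hypothetical login step, reusing the question's endpoint and form data
    s.post('http://test.dev/api/login', data={'username': 'test', 'password': 'testtest'})

def get_user(s, user_id):
    # the same session, including its cookies, is reused here
    return s.get(f'http://test.dev/api/user/{user_id}')

s = requests.session()
login(s)
r = get_user(s, 1)
print(r.status_code)
s.close()  # close the session yourself once you no longer need it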

Related

[Python][requests.get] Wait before getting the page

I want to be able to scrape data from a website. I use the requests.get function to do so. Everything works out fine, except that the website in question takes time to fully load, so when I download it, some parts are not fully loaded.
I tried to use the timeout and stream arguments of the get function, but it doesn't work.
Here is my code:
import requests

acc = open(r'C:\Users\axelg\.spyder-py3\accueil.html', 'w', encoding="utf-8")
with requests.Session() as s:
    url = 'http://localhost/mysiste.php'
    s.get(url)
    login_data = {'log': 'myLog', 'pwd': 'MyPwd'}
    s.post(url, data=login_data)
    r = s.get('http://localhost/location/', stream=True)
    acc.write(r.text)
Thank you for your answers!
You might be using the timeout argument the wrong way
r = s.get('http://localhost/location/',timeout=(5, 20))
The first value of the timeout tuple is the connect timeout (how long to wait while establishing the connection) and the second is the read timeout (how long to wait for the server to send a response). Try changing the second value to suit your requirements.
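As a small sketch (assuming the same s session and acc file handle from the question), you can also catch the case where the server exceeds those limits:
import requests

try:
    # 5 seconds to establish the connection, 20 seconds for the server to send a response
    r = s.get('http://localhost/location/', timeout=(5, 20))
    acc.write(r.text)
except requests.exceptions.Timeout:
    print("The server took too long to respond")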

How to send cookie (header) using Python requests library

Hi, I am new to python requests and would like some help.
When I try to use python requests and get the session cookie, I use the following command:
session_req = requests.session()
result = session_req.get(get_url)
After executing the GET with requests, I read the '.cookies' property and the respective key I want to send in the POST header. I get the value successfully, but the POST action is not working.
session_req.cookies['IFCSHOPSESSID']
But when I make the same API request via POSTMAN and look at the cookie property (exporting the code as python requests), I find some differences, and if I use this same cookie exported from POSTMAN it works.
POSTMAN EXAMPLE
'cookie': 'IFCSHOPSESSID=hrthhiqdeg0dvf4ecooc83lui3; nikega=GA1.4.831513767.1599354095; nikega_gid=GA1.4.1839484382.1599354095; _ga=GA1.3.831513767.1599354095; _gid=GA1.3.733956911.1599354099; chaordic_browserId=0-fv_3j6NdVlbNFFwPRzUGQVse7e1bbqga-3OS1599354098234702; chaordic_anonymousUserId=anon-0-fv_3j6NdVlbNFFwPRzUGQVse7e1bbqga-3OS1599354098234702; chaordic_testGroup=%7B%22experiment%22%3Anull%2C%22group%22%3Anull%2C%22testCode%22%3Anull%2C%22code%22%3Anull%2C%22session%22%3Anull%7D; user_unic_ac_id=bec863cf-4e06-0ab1-d881-b566595d3e8f; _gcl_au=1.1.1305519862.1599354100; _fbp=fb.2.1599354100232.504934336; smeventsclear_16df2784b41e46129645c2417f131191=true; smViewOnSite=true; __pr.cvh=4ftsyf8x16; _gaexp=GAX1.3.tupm6REJTMeD-piAakRDMA.18557.0; blueID=75a502b6-e7c2-4eb3-8442-75aea5d95fdc; _cm_ads_activation_retry=false; sback_client=5816989a58791059954e4c52; sback_partner=false; sb_days=1599356617672; sback_refresh_wp=no; smClickOnSite=true; smClickOnSite_652c0aaee02549a3a6ea89988778d3fc=true; _rtbhouse_source_=socialminer; RKT=false; dedup=socialminer; lmd_cj=socialminer; advcake_url=https%3A%2F%2Fwww.nike.com.br%2Flancamentos%3Futm_source%3Dsocialminer%26utm_medium%3Dsocialminer_onsitedesktop%26utm_campaign%3Dsocialminer_onsitedesktop_lancamentos_desk%26smid%3D3-17; advcake_trackid=dd7e2ef0-dd50-889a-aeea-559a0d8bcd22; advcake_utm_content=socialminer_onsitedesktop_lancamentos_desk; advcake_utm_campaign=socialminer; Campanha=; Parceiro=; Midia=; AMCVS_F0935E09512D2C270A490D4D%40AdobeOrg=1; s_cc=true; lmd_orig=direct; SIZEBAY_SESSION_ID=0AC1A70CB19F4f03610665d04bb088ef3b9af0942fc8; sback_customer_w=true; sback_browser=0-87718800-1599408894bff13e290b9fee5fc2b430382f639b87dd9cf25112334287575f550afed62983-14051381-17920887216,13017640152-1599408894; sback_access_token=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJhcGkuc2JhY2sudGVjaCIsImlhdCI6MTU5OTQwODg5NSwiZXhwIjoxNTk5NDk1Mjk1LCJhcGkiOiJ2MiIsImRhdGEiOnsiY2xpZW50X2lkIjoiNTgxNjk4OWE1ODc5MTA1OTk1NGU0YzUyIiwiY2xpZW50X2RvbWFpbiI6Im5pa2UuY29tLmJyIiwiY3VzdG9tZXJfaWQiOiI1ZjU0M2VjODA5ZjFkMDkzMmQzMjQ2OTUiLCJjdXN0b21lcl9hbm9ueW1vdXMiOmZhbHNlLCJjb25uZWN0aW9uX2lkIjoiNWY1NDNlYzgwOWYxZDA5MzJkMzI0Njk2IiwiYWNjZXNzX2xldmVsIjoiY3VzdG9tZXIifX0.K6FYVBasHjMg_PLbT1yZfrnIp97USqijoMObF4eUSms.WrWrDrHeHezRqBiYiYHeDr; sback_customer=$2gSxATWYdVYOVGMI10bUdkW2pWeoZERU1kc1YWWhd1SNR0aMJ0QUVzTHpHdJZERnpVS6FTSkRUTOBjMys2bUdnT2$12; sback_pageview=false; ak_bmsc=B6177778CB59637165F7EC43342C1559C9063147DA220000234E555F8D78F831~plACNrc4cNxoHZNcO7aF4o+U0KQNKjzPECGSfb42NdayPvdNkBWwUT9QOhGjuLJJ3vStuFIRkiI/35wsHEyUE3/h2guphhaEy71BnfekvDtb/6F84hS+fWhPxxVG5RAlph8WzGpYMn6NZESNVcgnZYfH4HoZ/IzBPR6AMG9UGn6W4xm/j/j9kOfef8v/fZf2pXw4mxJuiN5Cxc7g2sV4nCdoEW98Q4AgqplzxWZjpamZk=; bm_sz=6586256DDAFC895D740341E4214D0D40~YAAQRzEGybYT7yN0AQAAfDw5ZQnXjJtKI2SxkwQFV9vLZpF5mACXNUtUFDSkidKuYM2fac5sQgRozU9fA3+017dht/PUtH+wtibATtTmoVOlpKnW+V76+1rySk3HK6q83Q9rtQc/LaaQ8VYtK/tDi0VOc7/0wLyKy/+Z4OLtgUpySYZZcEX4k8/46no8rFD6OQ==; AMCV_F0935E09512D2C270A490D4D%40AdobeOrg=359503849%7CMCIDTS%7C18512%7CMCMID%7C56897587165425478193529762442649463163%7CMCAAMLH-1600030892%7C4%7CMCAAMB-1600030892%7CRKhpRz8krg2tLO6pguXWp5olkAcUniQYPHaMWWgdJ3xzPWQmdj0y%7CMCOPTOUT-1599433292s%7CNONE%7CMCSYNCSOP%7C411-18519%7CvVersion%7C5.0.1; sback_total_sessions=3; sback_session=5f554e3c73a63da56d739d87; lmd_traf=direct-1599402359608&direct-1599408890286&direct-1599414284313&direct-1599427194077; chaordic_realUserId=2962653; chaordic_session=1599429266491-0.4343169041143473; _st_ses=49222273669791505; _st_cart_script=helper_nike.js; 
_st_cart_url=/; _sptid=1592; _spcid=1592; _st_id=cnVkc29ucmFtb25AZ21haWwuY29t; _st_idb=cnVkc29ucmFtb25AZ21haWwuY29t; lx_sales_channel=%5B%222%22%5D; sback_cart=5f555ba24f507d767721c387; CSRFtoken=1ac8a198f88ac1ccc1f8555ab41c8a95; gpv_v70=nikecombr%3Echeckout%3Eaddress; pv_templateName=CHECKOUT; gptype_v60=checkout%3Aaddress; stc119288=env:1599429270%7C20201007215430%7C20200906222939%7C5%7C1088071:20210906215939|uid:1599354102799.1149977977.6705985.119288.1871143352:20210906215939|srchist:1088071%3A1599429270%3A20201007215430:20210906215939|tsa:1599429270805.1898407973.364911.7635034620790062.2:20200906222939; bm_sv=C9C3A8C6B2F6CB232317BB794ADC0497~ZnoksXquh4Yrh4uN87gycXdh+ixzU+xMFsb94sO9uE5JMLyZz9eJPp5odX7vx944KIXG1nvOxuq8pdrQUDjBrchRJLC4yiD1yWX0h4BjWhZwbfHPtnzaT3ASbIZnf2Ts1TRt+ZAescJJwrNPs4oV2If7vyiWi2AYILFvCstCTS8=; _uetsid=a9a0bfd4fe4e4db52bcd4ca66850a785; _uetvid=9ba47ed116a48f496f6b1a9844e21c95; __udf_j=f08aeb668454efbf6ddc83dd9d4b7a8385abde9f9fbd92526f1de0441da2126ec40330dfc36d0b9c3eae98557c94447d; _spl_pv=40; s_sq=lojanike-new-production%252Clojanike-nikebr%3D%2526c.%2526a.%2526activitymap.%2526page%253Dnikecombr%25253Echeckout%25253Eaddress%2526link%253DSeguir%252520para%252520pagamento%2526region%253Didentificacao-form%2526pageIDType%253D1%2526.activitymap%2526.a%2526.c%2526pid%253Dnikecombr%25253Echeckout%25253Eaddress%2526pidt%253D1%2526oid%253DSeguir%252520para%252520pagamento%2526oidt%253D3%2526ot%253DSUBMIT; RT="z=1&dm=nike.com.br&si=92b42534-25ee-4155-aa1a-e7d127581869&ss=kermvxyl&sl=9&tt=17e8&bcn=%2F%2F173e2544.akstat.io%2F"; _abck=F6E1C280C3F9D735A2B1AB62443DB479~-1~YAAQVjEGycno+iJ0AQAAmtRxZQT8kxLFalTup4dkYT5+cq/PavPcY4/0zAeJv4GoSQQwYVj4EWydkfxbJR3Rgaa4k6ma+5O72J/lsiajATrx0oaZJuB5b/FIP6RymanPRVGlb3kLJXpBQDkCmVv62kkxLKxySrlAYDCg0ORCpSXlTCbFBVEchC9ih5t094egSeVdM6VjfQSO9uDKISBoP4923qkJMTpbk9B1nOoiylKK+y+FGFu8pzEpQqZYj7tIMTJVpqe0OpXaQ8m8nPyp0K+PmBcAndIHcBMTZUEqma9/72Enx8yvGbKXrYbAzNDw6ZtKY9OAbNuVeqprza/Af0aUkinm0l3JqxjTH1LpglNxNN4=~-1~-1~-1; CSRFtoken=20a208bad599aa3ead0bbe944b27a368; bm_sv=C9C3A8C6B2F6CB232317BB794ADC0497~ZnoksXquh4Yrh4uN87gycXdh+ixzU+xMFsb94sO9uE5JMLyZz9eJPp5odX7vx944KIXG1nvOxuq8pdrQUDjBrchRJLC4yiD1yWX0h4BjWhbSXhHWWrgkUsOTt9033P5Wxu1qmo5M6w0VAWeAzBaCN7yZC2Ll7DiGq0CwpjxlOW4=; _abck=F6E1C280C3F9D735A2B1AB62443DB479~-1~YAAQVjEGyRKO+iJ0AQAA+4U9ZQSNIWTEz/60Uk5gz2tnzVtbMbX0hpaMbkbeJxSYSMD1xo7TTedXnJ0UuTLxxcHhLVrRRCrZfSjZ+yH00Ld6FLIajmYFefKPehzA6GgwjnLyucI1O6nDw2ZU1CV0WJLeWGgcmX7sinsLr3DVtmoGJyNR1Q9EWpvq71/W1Ys4Bqhq1628YKEz/0Z1Ic1bWMujcG03064ZZYYXTSTz9jrkxHKaEoJQNQgyUg9NXQhv4EFoMSESy/AIKRy+hVCULLJscbkpH8WakuvYQ1raghVfheks/Xra9AmiUoOqAbWAPXOij1nWQ9PSV2hxQZfkibD0+YP14pTXPoCAUA9jCQHRJIw=~0~-1~-1'
session_req.cookies['IFCSHOPSESSID'] EXAMPLE
qnabtagl4pu7gm2jg3sij03cu6
Another curious thing is that when I use the '.cookies' property, my POST call returns success even though it does not update the cart, where it should be inserting a new record.
As I am trying to develop a site bot, I would like to generate this same cookie via python requests code. Can anyone help me with it?
This is an example with Python 3. You can customize it.
import requests

data = "param_1=value_1&param_2=value_2&.....&param_n=value_n"  # your request parameters
cookies = {"cookie_name": "xxxxxxxx"}  # cookies must be a dict (or a CookieJar), not a raw header string
url_endpoint = "https://........."  # your url endpoint

# send the cookies along with the request
resp = requests.get(url_endpoint, data=data.encode('utf-8'), cookies=cookies)
if resp.status_code == 200:
    print("success")
else:
    print("error")

While loop makes the browser freeze with Brython

I'm trying to get the response of an API request made with ajax.ajax(); the response is stored into ['apiResponse'] in the HTML5 Local Storage (but the rest of the Python function proceeds without waiting for it to be put into localStorage).
Because of this, I need to wait for it before reading the response, and I thought I could do what I did below to make the program wait before proceeding.
Unfortunately the browser seems to freeze every time I use a while loop...
Does someone know how to stop Brython and the browser from freezing, or another way to do what I want to do?
(It would really help me, as it's the only step left before I succeed in getting Spotify API responses.)
from browser import ajax  # to make requests
from browser.local_storage import storage as localStorage  # to access HTML5 Local Storage
import json  # to convert a json-like string into a Python dict

# Request to the API
def on_complete(req):
    if req.status == 200 or req.status == 0:
        localStorage['apiResponse'] = req.text
    else:
        print("An error occurred while asking Spotify for data")

def apiRequest(requestUrl, requestMethod):
    req = ajax.ajax()
    req.bind('complete', on_complete)
    req.open(requestMethod, requestUrl, True)
    req.set_header('Authorization', localStorage['header'])
    req.send()

def response():
    while localStorage['apiResponse'] == '':
        continue
    print('done')
    return json.loads(localStorage['apiResponse'])
Thanks in advance!
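For context, a minimal sketch of the non-blocking pattern Brython expects: continue the work inside the on_complete callback instead of busy-waiting on localStorage. The ajax calls below follow the question's own code; handle_response is a hypothetical continuation function.
from browser import ajax
from browser.local_storage import storage as localStorage
import json

def handle_response(data):
    # hypothetical continuation: whatever needs the API response goes here
    print(data)

def on_complete(req):
    if req.status == 200 or req.status == 0:
        # hand the parsed response straight to the next step instead of polling localStorage
        handle_response(json.loads(req.text))
    else:
        print("An error occurred while asking Spotify for data")

def apiRequest(requestUrl, requestMethod):
    req = ajax.ajax()
    req.bind('complete', on_complete)  # runs asynchronously when the response arrives
    req.open(requestMethod, requestUrl, True)
    req.set_header('Authorization', localStorage['header'])
    req.send()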

Do requests sessions end when the program is killed?

Is there any way to pick up a previous session when starting a Python program?
I've set session as a global variable so that it can be accessed from any method that needs it. However, I'm guessing that when I start the program again the session variable is reset.
Is there a way to come back to a previous session when starting the program?
session = requests.Session()

def auth():
    session = self.session
    url = 'this url has auth'
    session.post(url, data=data)
    # Now authenticated, so let's grab the data
    call_data(sessions)

def call_data(session):
    url = 'this url has the data'
    session.post(url, data=data)

def check_data():
    url = 'this url does a specific call on data elements'
    self.session.post(url, data=data)
When I load up my program a second time I only want to use the check_data method; I'd prefer not to require auth every time I start the program, or perhaps I'm just curious to see if it can be done ;)
EDIT
I've updated my solution with the accepted answer.
def auth():
    session = self.session
    session.cookies = LWPCookieJar("cookies.txt")
    url = 'this url has auth'
    session.post(url, data=data)
    # Now authenticated, so let's grab the data
    call_data(sessions)
    session.cookies.save()  # Save auth cookie

def some_other_method():
    if not cookie:
        session.cookies.load()
    # do stuff now that we're authed
The code obviously doesn't show the proper accessors for the other methods, but the idea works fine.
I would be interested to know if this is the only way to remain authed.
Sessions are tracked in HTTP via cookies. You can save them between program restarts by storing them in an http.cookiejar.LWPCookieJar.
At the beginning of your program you have to set the cookie jar to this FileCookieJar and load the existing cookies, if any:
import requests
from http.cookiejar import LWPCookieJar
session = requests.Session()
session.cookies = LWPCookieJar("storage.jar")
session.cookies.load()
Before closing your program you have to save them to the file:
session.cookies.save()
Note that by default this behaves like a browser: session cookies, which are not marked as persistent, are not saved across restarts. If you want a different behaviour, pass ignore_discard=True to the save() method, like this:
session.cookies.save(ignore_discard=True)
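Putting it together, a small sketch (the storage.jar filename follows the answer; the existence check is added because load() raises an OSError when the file does not exist yet):
import os
import requests
from http.cookiejar import LWPCookieJar

session = requests.Session()
session.cookies = LWPCookieJar("storage.jar")
if os.path.exists("storage.jar"):
    # only load cookies saved by a previous run
    session.cookies.load(ignore_discard=True)

# ... authenticate only if the saved cookies are missing or expired, then use the session ...

# persist the cookies (including session cookies) for the next run
session.cookies.save(ignore_discard=True)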
It's not clear what kind of session you are trying to establish. Django? Flask? Something different?
Be aware also that there seems to be a misspelling in call_data(sessions), where only session (without the s) is defined.

Unexpected behaviour with Urllib in Python

My system is not behind any proxy.
import urllib

params = urllib.urlencode({'search': "August Rush"})
f = urllib.urlopen("http://www.thepiratebay.org/search/query", params)
This goes into an infinite loop (or just hangs). I could obviously get rid of this and use FancyURLopener, building the query myself rather than passing parameters, but I think the way I'm doing it now is a better and cleaner approach.
Edit: This turned out to be a networking problem; my Ubuntu workstation was configured with a different proxy. I had to make some changes and it worked. Thank you!
The posted code works fine for me, with Python 2.7.2 on Windows.
Have you tried using an HTTP-debugging tool, like Fiddler2, to see the actual conversation going on between your program and the site?
If you run Fiddler2 on port 8888 on localhost, you can do this to see the request and response:
import urllib
proxies = {"http": "http://localhost:8888"}
params = urllib.urlencode({'search':"August Rush"})
f = urllib.urlopen("http://www.thepiratebay.org/search/query", params, proxies)
print len(f.read())
This works for me:
import urllib
params = urllib.urlencode({'q': "August Rush", 'page': '0', 'orderby': '99'})
f = urllib.urlopen("http://www.thepiratebay.org/s/", params)
with open('text.html', 'w') as ff:
    ff.write('\n'.join(f.readlines()))
I opened http://www.thepiratebay.org in Google Chrome with the network inspector enabled, put "August Rush" into the search field and pressed 'Search'. Then I analyzed the headers that were sent and wrote the code above.
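For reference, urllib.urlopen(url, data) issues a POST in Python 2; a rough equivalent with the requests library on Python 3 is sketched below, assuming the site still accepts the same q, page and orderby parameters (which may no longer be the case):
import requests

params = {'q': "August Rush", 'page': '0', 'orderby': '99'}
r = requests.post("http://www.thepiratebay.org/s/", data=params)

with open('text.html', 'w', encoding='utf-8') as ff:
    ff.write(r.text)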
