Python: get JSON data from an HTTPS URL using sockets

I need to get some JSON data from an API:
import requests
url = 'https://example.com/api/some-info/'
response = requests.get(url)
print(response.text)  # here is the JSON I need
Everything works fine, except that I need to make such requests very often, and the API provider says:
You'll be banned if you make more than 5 requests per second, so use
sockets
So, how can I make this work via sockets?
Thanks in advance for any advice.
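For what it's worth, going "via sockets" means speaking HTTP and TLS by hand. A minimal sketch of what that involves, using only the standard library (example.com stands in for the real host; a production client would also need to handle chunked transfer encoding and redirects):

```python
import json
import socket
import ssl

def build_request(host, path):
    """Compose a minimal HTTP/1.1 GET request by hand."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    ).encode("ascii")

def fetch_json(host, path):
    """Open a TLS socket, send the request, and parse the body as JSON."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(build_request(host, path))
            raw = b""
            while chunk := tls.recv(4096):
                raw += chunk
    # split the header block from the body at the first blank line
    headers, _, body = raw.partition(b"\r\n\r\n")
    return json.loads(body)

print(build_request("example.com", "/api/some-info/"))
```

Note that raw sockets produce the same requests as the requests library does, so the rate limit still applies; staying under 5 requests per second is what actually avoids the ban, regardless of transport.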


Access Instagram user data without logging in

I want to get the data returned by this API:
https://www.instagram.com/api/v1/users/web_profile_info/?username=kateannedesigns
When you search for a user, you can see basic profile data without even logging in, but when I make a request to this API (which actually fetches that data), it fails with a 400 response.
This is the request shown in the browser: there is no session ID, yet it still works there.
I want to do the same in Python with requests.
Unfortunately, Instagram banned my IP because I sent too many requests, so I can't test further. I did find out that the API expects an X-IG-App-ID header containing a number; I'm not sure whether it actually needs to be 15 digits. Try this code:
import requests
headers = {'X-IG-App-ID': '743845927482847'} # random 15 digit number
response = requests.get('https://www.instagram.com/api/v1/users/web_profile_info/?username=kateannedesigns', headers=headers)
json_data = response.json()
print(response.status_code)
print(json_data)

Python: send an API request and get a JSON response

I want to call an API from Python, get a response as JSON, and send it to a KNIME workflow to work with it.
This is my Python script; the problem is that it raises json.decoder.JSONDecodeError:
import requests
url = 'https://api.edination.com/v2/edifact/read'
headers = {'Ocp-Apim-Subscription-Key': '3ecf6b1c5cf34bd797a5f4c57951a1cf'}
files = {'file': open('C:\\Users\\hcharafeddine\\Desktop\\EDI\\Interchange_1654767219416.edi', 'rb')}
r = requests.post(url, headers=headers, files=files)  # the subscription key must be sent as a header
print(r.json())
We'll need more info to help further. I understand if you don't want to share the content of the EDI message, so here are a few things to try:
The EDINation website allows you to paste an EDI message in, and it'll show you the JSON output that the API will return.
It also has a sample EDIFACT document you can select and save locally to run through your Python script; you can then share the results here.
You can also use HTTP Toolkit to inspect the API request and response to troubleshoot further.
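As a general note, json.decoder.JSONDecodeError on r.json() usually means the body wasn't JSON at all, for example an error page returned for a rejected request. A small sketch of checking the response before parsing (the helper name is mine, not from the thread):

```python
import json

def parse_json_response(status_code, content_type, body):
    """Parse a response body as JSON only when it plausibly is JSON.

    Hypothetical helper: mirrors checking r.status_code and
    r.headers['Content-Type'] before calling r.json().
    """
    if status_code != 200:
        raise RuntimeError(f"API returned {status_code}: {body[:200]!r}")
    if "application/json" not in content_type:
        raise RuntimeError(f"expected JSON, got {content_type!r}")
    return json.loads(body)

# With requests, the equivalent checks would be:
#   r = requests.post(url, headers=headers, files=files)
#   data = parse_json_response(r.status_code,
#                              r.headers.get("Content-Type", ""), r.text)

print(parse_json_response(200, "application/json", '{"ok": true}'))
```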

Python: stream an HTTP response from Google Cloud Functions

I have a Google Cloud Function that fetches data from a few websites with HTTP requests.
I want to send the data back as it comes in, not wait for all of it.
I was thinking streaming the data should work, but I have no idea how to do it, on either the server or the client side.
I've already tried about 20 examples from Google and none worked for me.
Tried:
response.write
response.send
and
import json
import requests

r = requests.get('https://httpbin.org/stream/20', stream=True)
for line in r.iter_lines():
    # filter out keep-alive new lines
    if line:
        decoded_line = line.decode('utf-8')
        print(json.loads(decoded_line))
I'm not sure which side I'm getting wrong, the client or the server, but this isn't working.
Does anyone know how to stream data in Python from an HTTP request to an HTTP response?
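The client snippet above assumes the server sends newline-delimited JSON, one document per line, which is what https://httpbin.org/stream/20 does. Stripped of any particular framework, the pattern on both sides looks like this (the producer/consumer names are made up for illustration):

```python
import json

def produce_lines(records):
    """Server side: yield one JSON document per line as each becomes ready."""
    for record in records:
        yield json.dumps(record) + "\n"

def consume_lines(chunks):
    """Client side: parse each complete line as it arrives."""
    for line in chunks:
        line = line.strip()
        if line:  # skip keep-alive blank lines
            yield json.loads(line)

# Wire one side to the other to see items flow through incrementally:
records = [{"site": "a", "n": 1}, {"site": "b", "n": 2}]
for item in consume_lines(produce_lines(records)):
    print(item)
```

On the server side, the generator would be handed to whatever streaming mechanism the framework offers; on the client side, requests' iter_lines plays the role of consume_lines.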

Can't log in using requests

How do I log in to a website using the requests library? I've watched a lot of tutorials, but they all seem to have a 302 POST request in the Network tab of the inspector, while I only see GET requests in mine when I log in. A friend of mine said cookies, but I'm really a beginner and don't know how to log in.
Also, I'd like to know the range of things I can do with this library, and any helpful source of information from which I can begin learning it.
import requests
r = requests.get("https://example.com")
I want to make a POST request; the same friend told me that I would need API access to that website to proceed further. Is that true?
Depending on the site you're trying to log in to, it may be necessary to log in via a Chrome browser (Selenium) and from there extract and save the cookies for later injection and use within the requests module.
To extract cookies from Selenium and save them to a JSON file:
import json
cookies = driver.get_cookies()
with open('file_you_want_to_save_the_cookies_to.json', 'w') as f:
    json.dump(cookies, f)
To then use these cookies with requests:
cookies = {
    'cookie_name': 'cookie_value'
}
with requests.Session() as s:
    r = s.get(url, cookies=cookies)
I could help further if you mention which site you're trying to do this on.
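To connect the two snippets above: the JSON file written from Selenium is a list of cookie dicts, which can be converted into the plain name-to-value mapping that requests expects. A sketch (the helper name is mine):

```python
import json

def load_cookies(path):
    """Convert Selenium's get_cookies() dump into the dict requests expects.

    Selenium saves a list of dicts, each with at least 'name' and 'value'
    keys; requests just wants {name: value}.
    """
    with open(path) as f:
        selenium_cookies = json.load(f)
    return {c["name"]: c["value"] for c in selenium_cookies}

# Usage (hypothetical file name):
#   with requests.Session() as s:
#       r = s.get(url, cookies=load_cookies("cookies.json"))
```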
Well, best practice is to create a little project that uses the library. For requests, that can be accessing some of the free APIs on the internet. For example, I made this one a while ago using a free API:
import requests

##########################################
#                                        #
#               INSULT API               #
#                                        #
##########################################
def insult_api(name):
    params = {'who': name}
    response = requests.get(
        'https://insult.mattbas.org/api/en/insult.json',
        params=params)
    return response.json()
A helpful source is (obviously) the official documentation, plus basically any YouTube video or SO post here; just look for the thing you want to do.
Anyway, if you're looking to log in to a website without API access, you can use the Selenium library.
The requests module is extremely powerful if used properly and with the necessary information sent in the request. First, analyse the network traffic with tools like the Network tab in Chrome DevTools, then try to replicate the request with requests in Python.
Usually, you will need to send headers and data:
headers = {
    # <your headers here>
}
data = {}  # <data here>
req = requests.post("https://www.examplesite.com/api/login", data=data, headers=headers)
Everything should be easily found in the network packets, unless the site has some security measure like CSRF tokens, which need to be sent along with the login request. In that case, you need to send a GET request to obtain the token, then send a POST request that includes it.
If you could provide the site you're trying to use, that would be pretty helpful too. Best of luck!
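The GET-then-POST flow for a CSRF-protected login can be sketched as follows. Everything here is illustrative: the field name, URL, and regex are assumptions about a typical site, not taken from the thread.

```python
import re

def extract_csrf_token(html, field="csrf_token"):
    """Pull a hidden-input CSRF token out of a login page.

    Naive regex sketch; a real scraper would use an HTML parser, and the
    field name varies from site to site.
    """
    match = re.search(rf'name="{field}"\s+value="([^"]+)"', html)
    if not match:
        raise ValueError("no CSRF token found")
    return match.group(1)

# The full flow with requests would look like (not executed here):
#   with requests.Session() as s:          # the Session keeps cookies between steps
#       token = extract_csrf_token(s.get(login_url).text)
#       s.post(login_url, data={"email": "...", "password": "...",
#                               "csrf_token": token})

sample = '<input type="hidden" name="csrf_token" value="abc123">'
print(extract_csrf_token(sample))  # abc123
```

Using a Session for both steps matters: the token the server hands out is usually tied to the session cookie set on the GET, so sending it from a fresh connection would be rejected.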

Send form data with requests in Python not working

I'm starting to learn Python with the requests library, and I'm using Facebook as an example.
This is my code:
import requests
get_response = requests.get(url='https://www.facebook.com/login/identify?ctx=recover')
post_data = {'email': 'mycorrectemailaddress'}
post_response = requests.post(url='https://www.facebook.com/login/identify?ctx=recover/POST', data=post_data)
print(post_response.text)
My script doesn't get to the next page, and I don't know where my mistake is.
You didn't pass all the required data.
payload = {
    'lsd': 'AVpg3ZvY',
    'email': 'your#email.com',
    'did_submit': 'Search',
    '__user': '0',
    '__a': '1',
    '__dyn': '7AzHK4GgN1t2u6XgmwCwRAKGzEy4S-C11xG12wAxu13wIwHx27QdwPG2iuUG4XzEa8uwh9UcU88lwIwHwJwnoCcxG48hwv9FovgeFUuzUhws82BxCqUkguy99UK',
    '__af': 'iw',
    '__req': '5',
    '__be': '-1',
    '__pc': 'PHASED:DEFAULT',
    '__rev': '2929740',
}
Sadly, I don't know what these arguments are, but you could parse them from a GET request on the page, or use Selenium to gather the information, pass its cookies to the requests module, and then make the POST request.
