Proxy server doesn't change public IP with Python Requests - python

I'm running this script:
import requests
proxyDict = {"http" : 'http://81.93.73.28:8081'}
r = requests.get('http://ipinfo.io/ip', proxies=proxyDict)
r.status_code
r.headers['content-type']
r.encoding
print(r.text)
I've tried my own proxy server as well as several public servers. It still prints my current ip.
What am I doing wrong?

The problem seems to be with the proxy itself. I tried a random free one with your code. Your code also has a small issue: you access attributes like r.status_code and r.encoding without using them, so those lines do nothing. Try the code and proxy below; it worked for me.
proxyDict = {"http" : 'http://162.14.18.11:80'}
r = requests.get('http://ipinfo.io/ip', proxies=proxyDict, )
print(r.status_code)
print(r.text)
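One thing to keep in mind: requests chooses the proxy by the URL scheme, so a dict with only an "http" key is ignored for https:// URLs (and for redirects to HTTPS). A minimal sketch, using the same placeholder proxy address, that lists the proxy for both schemes:

import requests

# Placeholder proxy address; substitute a live proxy of your own.
proxy = 'http://162.14.18.11:80'
proxyDict = {"http": proxy, "https": proxy}

# Requesting the HTTPS endpoint directly makes it obvious whether the proxy is used.
r = requests.get('https://ipinfo.io/ip', proxies=proxyDict)
print(r.text)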


Why is the request ignoring my proxy parameter and returning my IP to me?

It all started when I reinstalled PyCharm and Python on my computer.
For example, this is ordinary code that has always worked:
import os
import requests
proxies = {'https': 'https://181.232.190.130:999'}
s = requests.Session()
s.proxies = proxies
r = s.get(url = 'http://wtfismyip.com/text', verify=False)
ip = r.text
print ('Your IP is ' + ip)
os.system("pause")
Of course, the proxies are up-to-date and work.
The problem is that the request returns me my real IP. As if it just ignores this parameter.
I am sure the problem is not in the code but in something else, I just have no idea where to look! I spent a whole day on this and could not get anywhere.
There is nothing wrong with your code; I believe requests/urllib contains a bug here. Here is a modified version of your code.
Don't use https:// as the proxy scheme, it will throw version errors, and set the proxy for both protocols, http and https. Only these two lines need to change:
proxy = 'http://198.59.191.234:8080'
session.proxies = {"http":proxy, "https": proxy}
import os
import requests

session = requests.Session()
proxy = 'http://198.59.191.234:8080'
session.proxies = {"http": proxy, "https": proxy}

res = session.get(url='http://ipecho.net/plain', verify=False)
print('Your IP is', res.text)
os.system("pause")
Output:
Your IP is 198.59.191.243
Press any key to continue . . .
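Because verify=False is passed, requests will also print an InsecureRequestWarning on every call. If you want to keep certificate checking disabled while testing, one option (a sketch, not a recommendation for production) is to silence that specific warning:

import urllib3

# Suppress only the warning that verify=False triggers; TLS verification itself stays off.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)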

What is the proper way to use proxies with requests in Python

Requests is not honoring the proxies flag.
There is something I am missing about making a request over a proxy with python requests library.
If I enable the OS system proxy it works, but if I make the request with just the requests module's proxies setting, the remote machine does not see the proxy set in requests; it sees my real IP, as if no proxy were set at all.
The example below shows this effect. At the time of this post the listed proxy is alive, but any working proxy should replicate it.
import requests

proxy = {
    'http:': 'https://143.208.200.26:7878',
    'https:': 'http://143.208.200.26:7878'
}
data = requests.get(url='http://ip-api.com/json', proxies=proxy).json()
print('Ip: %s\nCity: %s\nCountry: %s' % (data['query'], data['city'], data['country']))
I also tried changing the proxy_dict format:
proxy = {
    'http:': '143.208.200.26:7878',
    'https:': '143.208.200.26:7878'
}
But it still has no effect.
I am using:
- Windows 10
- Python 3.9.6
- urllib3 1.25.8
Many thanks in advance for any response to help sort this out.
OK, it is working now!
Credit for solving this goes to Olvin Rogh. Thanks, Olvin, for your help and for pointing out my problem: I was adding a colon ":" inside the keys.
This code is working now.
import json
import requests

PROXY = {'https': 'https://143.208.200.26:7878',
         'http': 'http://143.208.200.26:7878'}

with requests.Session() as session:
    session.proxies = PROXY
    r = session.get('http://ip-api.com/json')
    print(json.dumps(r.json(), indent=2))
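One more thing worth ruling out when requests seems to ignore the proxies setting: by default a Session also picks up proxy settings from the environment (HTTP_PROXY/HTTPS_PROXY and the OS system proxy on Windows). A small sketch, reusing the same placeholder proxy, that disables that behaviour so only the explicit dict is used:

import requests

PROXY = {'https': 'https://143.208.200.26:7878',
         'http': 'http://143.208.200.26:7878'}

with requests.Session() as session:
    session.trust_env = False  # ignore HTTP_PROXY/HTTPS_PROXY and the OS proxy settings
    session.proxies = PROXY
    r = session.get('http://ip-api.com/json')
    print(r.json()['query'])   # the IP address the remote side sees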

Python auth returns 401

I have a Python script that downloads a snapshot from a camera, using auth to log in. For older-style cameras it works with no issues, but with the new one it doesn't. I have tested the link and credentials by copying them out of my Python script, and they work, yet the script still can't log in and I am not sure why. The commented-out URL is the one that works; the Uniview one doesn't. I have replaced the password with the correct one, and I also tested the link in Chromium, where it works.
import requests

# hikvision old cameras
#url = 'http://192.168.100.110/ISAPI/Streaming/channels/101/picture'
# uniview
url = 'http://192.168.100.108:85/images/snapshot.jpg'

r = requests.get(url, auth=('admin', 'password'))
if r.status_code == 200:
    with open('/home/pi/Desktop/image.jpg', 'wb') as out:
        for bits in r.iter_content():
            out.write(bits)
else:
    print(r.status_code)
    print(r.content)
Below is the response I get
b'{\r\n"Response": {\r\n\t"ResponseURL": "/images/snapshot.jpg",\r\n\t"ResponseCode": 3,\r\n \t"SubResponseCode": 0,\r\n \t"ResponseString": "Not Authorized",\r\n\t"StatusCode": 401,\r\n\t"StatusString": "Unauthorized",\r\n\t"Data": "null"\r\n}\r\n}\r\n'
So it looks like Hikvision uses Basic access authentication while Uniview uses Digest access authentication, so according to the docs you need to change your request to:
from requests.auth import HTTPDigestAuth
r = requests.get(url, auth=HTTPDigestAuth('admin','password'))
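If you are unsure which scheme a given camera expects, one quick check (a sketch, using the Uniview URL above as the assumed endpoint) is to make an unauthenticated request and read the WWW-Authenticate header of the 401 response:

import requests

url = 'http://192.168.100.108:85/images/snapshot.jpg'

# No credentials on purpose: the 401 response advertises the expected auth scheme.
r = requests.get(url)
print(r.status_code)                      # expect 401
print(r.headers.get('WWW-Authenticate'))  # e.g. 'Digest realm=...' or 'Basic realm=...'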

Python Requests Proxy

I am trying to use the Requests module to send a POST request through a proxy.
My code is this:
import requests
proxyList = {'http' : 'http://202.134.202.226:80'}
dataDict = {'input' : 'test'}
r = requests.post('ThePlaceIWantToSendItTo', data=dataDict, proxies=proxyList)
print(r.text)
but when I do this my webserver says I'm still on my home IP, meaning the proxy did not work. What am I doing wrong?
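The same scheme-matching rule applies here: with only an 'http' key, any https:// URL (or an HTTPS redirect) goes out without the proxy. A sketch, keeping the placeholder destination URL from the question, that lists the proxy for both schemes and checks which IP the other side sees before sending the POST:

import requests

proxy = 'http://202.134.202.226:80'  # placeholder proxy from the question
proxyList = {'http': proxy, 'https': proxy}
dataDict = {'input': 'test'}

# Sanity check: this should print the proxy's IP, not your home IP.
print(requests.get('http://ipinfo.io/ip', proxies=proxyList).text)

r = requests.post('ThePlaceIWantToSendItTo', data=dataDict, proxies=proxyList)
print(r.text)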

Using urllib2 via proxy

I am trying to use urllib2 through a proxy; however, after trying just about every variation of passing my credentials with urllib2, I either get a request that hangs forever and returns nothing, or I get 407 errors. I can connect to the web fine using my browser, which connects to a proxy PAC file and redirects accordingly; however, I can't seem to do anything from the command line with curl, wget, urllib2, etc., even when I use the proxies that the PAC file redirects to. I tried setting my proxy to each of the proxies from the PAC file using urllib2, and none of them work.
My current script looks like this:
import urllib2 as url
proxy = url.ProxyHandler({'http': 'username:password@my.proxy:8080'})
auth = url.HTTPBasicAuthHandler()
opener = url.build_opener(proxy, auth, url.HTTPHandler)
url.install_opener(opener)
url.urlopen("http://www.google.com/")
which throws HTTP Error 407: Proxy Authentication Required. I also tried:
import urllib2 as url
handlePass = url.HTTPPasswordMgrWithDefaultRealm()
handlePass.add_password(None, "http://my.proxy:8080", "username", "password")
auth_handler = url.HTTPBasicAuthHandler(handlePass)
opener = url.build_opener(auth_handler)
url.install_opener(opener)
url.urlopen("http://www.google.com")
which hangs like curl or wget timing out.
What do I need to do to diagnose the problem? How is it possible that I can connect via my browser but not from the command line on the same computer using what would appear to be the same proxy and credentials?
Might it be something to do with the router? If so, how can it distinguish between browser HTTP requests and command-line HTTP requests?
Frustrations like this are what drove me to use Requests. If you're doing significant amounts of work with urllib2, you really ought to check it out. For example, to do what you wish to do using Requests, you could write:
import requests
from requests.auth import HTTPProxyAuth
proxy = {'http': 'http://my.proxy:8080'}
auth = HTTPProxyAuth('username', 'password')
r = requests.get('http://www.google.com/', proxies=proxy, auth=auth)
print(r.text)
Or you could wrap it in a Session object and every request will automatically use the proxy information (plus it will store & handle cookies automatically!):
s = requests.Session()
s.proxies = proxy
s.auth = auth
r = s.get('http://www.google.com/')
print(r.text)
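Requests also accepts the credentials embedded directly in the proxy URL, which avoids HTTPProxyAuth entirely. A minimal sketch, with the same placeholder host, username, and password as above:

import requests

# username:password@host:port is the standard proxy-URL form that requests understands.
proxy = {'http': 'http://username:password@my.proxy:8080'}

r = requests.get('http://www.google.com/', proxies=proxy)
print(r.status_code)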
