I am trying to use the Requests module to send a POST request through a proxy.
My code is this:
import requests
proxyList = {'http' : 'http://202.134.202.226:80'}
dataDict = {'input' : 'test'}
r = requests.post('ThePlaceIWantToSendItTo', data=dataDict, proxies=proxyList)
print(r.text)
But when I do this, my web server says I'm still on my home IP, meaning the proxy did not work. What am I doing wrong?
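For reference, one way to check whether a proxy is actually being applied is to compare the IP an echo service reports with and without it. This is only a debugging sketch; httpbin.org is an example echo service (not from the original question), and the proxy address is the possibly dead one from the code above:
import requests
proxyList = {'http': 'http://202.134.202.226:80'}
# httpbin echoes the caller's IP in the 'origin' field,
# so a difference here means the proxy was actually used
direct = requests.get('http://httpbin.org/ip', timeout=10).json()['origin']
via_proxy = requests.get('http://httpbin.org/ip', proxies=proxyList, timeout=10).json()['origin']
print('direct:', direct)
print('via proxy:', via_proxy)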
It all started when I reinstalled PyCharm on my computer and reinstalled Python.
For example, here is ordinary code that has always worked:
import os
import requests
proxies = {'https': 'https://181.232.190.130:999'}
s = requests.Session()
s.proxies = proxies
r = s.get('http://wtfismyip.com/text', verify=False)
ip = r.text
print('Your IP is ' + ip)
os.system("pause")
Of course, the proxies are current and working.
The problem is that the request returns my real IP, as if it simply ignores the proxies parameter.
I am sure the problem is not in the code but in something else, yet I have no idea where to look! I spent a whole day on this and got nowhere.
There is nothing wrong with your code; I believe there is a bug in requests/urllib3.
Here is a modified version of the code.
Don't use an https:// URL for your proxy; it will throw version errors. Also, set the same proxy for both protocols, http and https. Only these two lines need to change:
proxy = 'http://198.59.191.234:8080'
session.proxies = {"http":proxy, "https": proxy}
import os
import requests
session = requests.Session()
proxy = 'http://198.59.191.234:8080'
session.proxies = {"http": proxy, "https": proxy}
res = session.get('http://ipecho.net/plain', verify=False)
print('Your IP is', res.text)
os.system("pause")
Output:
Your IP is 198.59.191.243
Press any key to continue . . .
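One caveat about the verify=False used above: it disables TLS certificate verification, and urllib3 will emit an InsecureRequestWarning on every such request. If you accept that risk, a minimal sketch to silence the warning explicitly:
import urllib3
# verify=False triggers an InsecureRequestWarning per request; suppress it deliberately
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)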
I'm trying to load JSON data through a tor session.
URL_1 works fine, but URL_2 does not (Example I). I don't know why.
Here is the type of error I get:
IP: <Response [403]>
I can retrieve the JSON API data without a Tor session just fine (Example II), but as soon as I introduce requests.Session() for URL_2, it no longer works.
Example I
# open Tor browser before
import requests
from stem.control import Controller
from stem import Signal
import json
def get_tor_session():
    # initialize a requests Session
    session = requests.Session()
    # set the proxy for both http & https to localhost:9150;
    # this requires a running Tor client listening on port 9150
    # (the Tor Browser default; a standalone tor service listens on 9050 instead)
    session.proxies = {"http": "socks5://localhost:9150", "https": "socks5://localhost:9150"}
    return session
#url_1 = "https://api.cryptowat.ch/markets/kraken/eurusd/ohlc"
url_2 = 'https://api.1inch.exchange/v1.1/quote?fromTokenSymbol=USDT&toTokenSymbol=KAI&amount=1000000'
s = get_tor_session()
ip = s.get(url_2).text
#ipJ = json.loads(ip)
print(ip)
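Side note: the stem imports above are never used. They are usually there to request a fresh Tor circuit. A minimal sketch, assuming Tor's ControlPort is enabled on 9051 (Tor Browser uses 9151) and authentication is configured:
from stem import Signal
from stem.control import Controller
# ask the running Tor process for a new circuit, which usually means a new exit IP;
# assumes ControlPort 9051 is enabled and cookie or password auth is set up
with Controller.from_port(port=9051) as controller:
    controller.authenticate()  # pass password=... if Tor uses HashedControlPassword
    controller.signal(Signal.NEWNYM)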
Example II
import requests
import json
url_1 = 'https://api.cryptowat.ch/markets/kraken/eurusd/ohlc'
url_2 = 'https://api.1inch.exchange/v1.1/quote?fromTokenSymbol=USDT&toTokenSymbol=KAI&amount=1000000'
rA = requests.get(url_2)
jsA = json.loads(rA.content)
print(jsA)
I have already tried adding a header to the request, but it does not help.
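(For reference, the header attempt mentioned above would look something like the sketch below; the User-Agent value is only an illustrative placeholder, since some APIs reject requests without a browser-like one:)
# hypothetical header attempt through the same Tor session
headers = {'User-Agent': 'Mozilla/5.0'}
ip = s.get(url_2, headers=headers).text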
Thank you for your help.
I'm running this script:
import requests
proxyDict = {"http" : 'http://81.93.73.28:8081'}
r = requests.get('http://ipinfo.io/ip', proxies=proxyDict)
r.status_code
r.headers['content-type']
r.encoding
print(r.text)
I've tried my own proxy server as well as several public servers. It still prints my current ip.
What am I doing wrong?
The problem seems to be with the proxy. I tried a random free one with your code. Also, your code has a few issues: you access attributes like r.status_code without using them, so those lines do nothing. Try the code and proxy below; it worked for me.
proxyDict = {"http" : 'http://162.14.18.11:80'}
r = requests.get('http://ipinfo.io/ip', proxies=proxyDict)
print(r.status_code)
print(r.text)
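Free proxies die quickly, so it also helps to fail fast instead of hanging on a dead one. A small sketch with a timeout and explicit error handling (the proxy address is a placeholder):
import requests
proxyDict = {'http': 'http://162.14.18.11:80'}  # placeholder; substitute a live proxy
try:
    # a short timeout plus raise_for_status surfaces dead proxies immediately
    r = requests.get('http://ipinfo.io/ip', proxies=proxyDict, timeout=5)
    r.raise_for_status()
    print(r.text)
except (requests.exceptions.ProxyError, requests.exceptions.Timeout) as exc:
    print('proxy failed:', exc)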
I have the code below. If I take out proxies=proxies and connect to the intranet, it works: I get a response back.
However, if I try an external site with the proxies left in (as in my example below), I get a 407 error.
import requests
from requests_ntlm import HttpNtlmAuth
proxies = {'http': 'http://myproxy.local:9090'}
ntlm_auth = HttpNtlmAuth('DomainName\\MyUsername','MyPassword')
res = requests.get("https://bbc.co.uk",auth=ntlm_auth, proxies=proxies)
print(res.content)
Am I doing something obviously incorrect? When I go to IE and look at the proxy settings there, this is exactly what I am using.
UPDATE
import requests
from requests_ntlm import HttpNtlmAuth
ntlm_auth = HttpNtlmAuth("DomainName\\MyUsername","MyPassword")
proxies = {'http': 'http://myproxy.local:9090'}
s = requests.Session()
s.proxies = proxies
s.auth = ntlm_auth
res = s.get("http://bbc.co.uk")
print(res.content)
I get the following: when I output the value of auth_header_value, I get negotiate.
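Worth noting while diagnosing: a 407 comes from the proxy itself, not from the target site, so the HttpNtlmAuth attached to the request never comes into play. Out of the box, requests only speaks Basic auth to proxies, typically by embedding the credentials in the proxy URL. A sketch with placeholder credentials (an NTLM-only proxy would instead need a local relay such as CNTLM):
import requests
# placeholder credentials embedded in the proxy URL (Basic auth only)
proxies = {
    'http': 'http://MyUsername:MyPassword@myproxy.local:9090',
    'https': 'http://MyUsername:MyPassword@myproxy.local:9090',
}
res = requests.get('https://bbc.co.uk', proxies=proxies)
print(res.status_code)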
I am trying to use urllib2 through a proxy; however, after trying just about every variation of passing my credentials with urllib2, I either get a request that hangs forever and returns nothing, or I get 407 errors. I can connect to the web fine using my browser, which loads a proxy PAC file and redirects accordingly; however, I can't seem to do anything from the command line with curl, wget, urllib2, etc., even if I use the proxies that the PAC file redirects to.
My current script looks like this:
import urllib2 as url
proxy = url.ProxyHandler({'http': 'http://username:password@my.proxy:8080'})
auth = url.HTTPBasicAuthHandler()
opener = url.build_opener(proxy, auth, url.HTTPHandler)
url.install_opener(opener)
url.urlopen("http://www.google.com/")
which throws HTTP Error 407: Proxy Authentication Required. I also tried:
import urllib2 as url
handlePass = url.HTTPPasswordMgrWithDefaultRealm()
handlePass.add_password(None, "http://my.proxy:8080", "username", "password")
auth_handler = url.HTTPBasicAuthHandler(handlePass)
opener = url.build_opener(auth_handler)
url.install_opener(opener)
url.urlopen("http://www.google.com")
which hangs, like curl or wget, until it times out.
What do I need to do to diagnose the problem? How is it possible that I can connect via my browser but not from the command line on the same computer using what would appear to be the same proxy and credentials?
Might it be something to do with the router? If so, how can it distinguish between browser HTTP requests and command-line HTTP requests?
Frustrations like this are what drove me to use Requests. If you're doing significant amounts of work with urllib2, you really ought to check it out. For example, to do what you wish to do using Requests, you could write:
import requests
from requests.auth import HTTPProxyAuth
proxy = {'http': 'http://my.proxy:8080'}
auth = HTTPProxyAuth('username', 'password')
r = requests.get('http://www.google.com/', proxies=proxy, auth=auth)
print(r.text)
Or you could wrap it in a Session object and every request will automatically use the proxy information (plus it will store & handle cookies automatically!):
# Session() takes no constructor arguments; set proxies and auth as attributes
s = requests.Session()
s.proxies = proxy
s.auth = auth
r = s.get('http://www.google.com/')
print(r.text)
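Equivalently, the proxy credentials can be embedded directly in the proxy URL, which avoids the separate auth object (username and password are placeholders, as above):
import requests
# requests parses user:pass out of the proxy URL and sends them as Proxy-Authorization
proxy = {'http': 'http://username:password@my.proxy:8080'}
r = requests.get('http://www.google.com/', proxies=proxy)
print(r.text)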