Python Requests ignoring timeouts - python

I'm trying to make a kind of scanner with Python (just for fun).
It will send a GET request to a random IP and see if there is any answer.
The problem is that every time the connection fails, the program stops running.
This is the code:
import time
import requests

ips = open("ip.txt", "r")
for ip in ips:
    r = requests.get(url="http://" + ip + "/", verify=False)
    print(r.status_code)
    time.sleep(0.5)
This is what I get when trying just a random IP:
requests.exceptions.ConnectionError: HTTPConnectionPool(host='177.0.0.0', port=80): Max retries exceeded with url:

This is throwing an error. To protect against this, use a try/except statement:
for ip in ips:
    try:
        r = requests.get(url="http://" + ip + "/", verify=False)
        print(r.status_code)
    except requests.exceptions.RequestException as e:
        print('Connecting to ip ' + ip + ' failed.', e)
    time.sleep(0.5)
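Note that without a timeout, a request to an unresponsive host can hang for a long time. A minimal sketch combining the try/except with an explicit timeout (the 2-second value is an arbitrary choice, and .strip() removes the newline each line read from the file carries):

for ip in ips:
    ip = ip.strip()  # drop the trailing newline from the file line
    try:
        # timeout=2 (arbitrary) keeps the scanner moving past silent hosts
        r = requests.get(url="http://" + ip + "/", verify=False, timeout=2)
        print(r.status_code)
    except requests.exceptions.RequestException as e:
        print('Connecting to ip ' + ip + ' failed.', e)
    time.sleep(0.5)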

Related

Set max retries on requests.post

I want to set a max retry limit on my script to eliminate these errors:
requests.exceptions.ConnectionError: HTTPConnectionPool(host='173.180.119.132', port=8080): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x03F9E2E0>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))
I can't find a way to send a POST request with max retries.
This is my code:
import requests
from requests.adapters import HTTPAdapter

requests.adapters.DEFAULT_RETRIES = 2

f = open("hosts.txt", "r")
payload = {
    'inUserName': 'ADMIN',
    'inUserPassword': '1234'
}

i = 0
for line in f:
    i += 1
    print(i)
    r = requests.post("http://" + line, data=payload)
    if "401 - Unauthorized" in r:
        pass
    else:
        if r.status_code != 200:
            pass
        else:
            with open("output.txt", "a+") as output_file:
                output_file.write(line)
This error
requests.exceptions.ConnectionError: HTTPConnectionPool(host='173.180.119.132', port=8080): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x03F9E2E0>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))
is caused by sending too many requests to the server, and the only way to detect it is by the server's response, i.e. there is no way of knowing on the client side when this error will be thrown.
There are a few ways in which you can get around this error.
You can catch the error and break out of the loop.
try:
    page1 = requests.get(ap)
except requests.exceptions.ConnectionError:
    # r.status_code = "Connection refused"
    break
You can also simply add a sleep(seconds) call to your code to put a gap between each request made to the server. This often gets around the max-retries error.
from time import sleep

sleep(5)  # pause for 5 seconds between requests
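To answer the question as asked, requests can be given an explicit retry policy by mounting an HTTPAdapter on a Session; the adapter accepts a urllib3 Retry object. A minimal sketch, reusing the line and payload from the question's loop (the retry count and backoff value here are arbitrary):

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# total=2 caps the number of retries; backoff_factor inserts an
# exponentially growing pause between attempts.
retry_policy = Retry(total=2, backoff_factor=0.5)
session.mount("http://", HTTPAdapter(max_retries=retry_policy))
session.mount("https://", HTTPAdapter(max_retries=retry_policy))

r = session.post("http://" + line, data=payload)

Once the retries are exhausted the session still raises ConnectionError, so a try/except like the one above is still needed as a last resort.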

How to redo a try statement within a certain amount of tries

Currently I have a program that takes a list of proxies and, through each one, makes a request to get my IP address back in JSON.
An example response is this:
Got Back: {'ip': '91.67.240.45', 'country': 'Germany', 'cc': 'DE'}
I want my program to try to make the request, and if it fails because the proxy is down, to retry up to 5 times before moving on to the next IP address.
I thought this except block would do that, but it is not breaking out of the loop when the 5 iterations are over, and I am not sure why.
My program does work when the proxy is up on the first try, as it breaks after the first attempt and then moves on to the next IP address.
Here is what I currently have:
import requests
import time

proxies = [
    "95.87.220.19:15600",
    "91.67.240.45:3128",
    "85.175.216.32:53281",
    "91.236.251.131:8118",
    "91.236.251.131:8118",
    "88.99.10.249:1080",
]

def sol(ip):
    max_tries = 5
    for i in range(1, max_tries + 1):
        try:
            print(f"Using Proxy: {ip}")
            r = requests.get('https://api.myip.com', proxies={"https": ip})
            print(f"Got Back: {r.json()}")
            break
        except OSError:
            time.sleep(5)
            print(f"Retrying...: {i}")
            break

for i in proxies:
    sol(i)
for i in proxies:
sol(i)
How can I make it so my loop makes 5 tries before moving on to the next IP address?
My program does work when the proxy is up on the first try, as it breaks after the first attempt and then moves on to the next IP address.
It does this unconditionally, because you have an unconditional break at the end of the except block. Execution continues past a try/except once the except block has run, assuming the block doesn't have an abnormal exit of its own (another exception, a return, etc.).
So,
I thought this except block would do that, but it is not breaking out of the loop when the 5 iterations are over, and I am not sure why.
It doesn't break out "after the 5 iterations are over" because it breaks out after the first iteration, whether or not that attempt was successful.
If I understand correctly, you can just remove the break from the last line. A loop whose body ends in an unconditional break will only ever run one iteration.
def sol(ip):
    max_tries = 5
    for i in range(1, max_tries + 1):
        try:
            print(f"Using Proxy: {ip}")
            r = requests.get('https://api.myip.com', proxies={"https": ip})
            print(f"Got Back: {r.json()}")
            break
        except OSError:
            time.sleep(5)
            print(f"Retrying...: {i}")
            # break  <---- remove this line
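If you also want a message once all the attempts have failed, a for/else fits naturally here: the else suite runs only when the loop finished without hitting break. A small sketch of that variation (the wording of the final message is my own):

def sol(ip):
    max_tries = 5
    for i in range(1, max_tries + 1):
        try:
            print(f"Using Proxy: {ip}")
            r = requests.get('https://api.myip.com', proxies={"https": ip})
            print(f"Got Back: {r.json()}")
            break
        except OSError:
            time.sleep(5)
            print(f"Retrying...: {i}")
    else:
        # Only reached if the loop never hit `break`, i.e. all tries failed.
        print(f"Giving up on {ip} after {max_tries} tries")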
Alternatively, the retrying library can handle the retry loop for you; using it would look something like the following:
from retrying import retry
import requests

@retry(stop_max_attempt_number=5)
def bad_host():
    print('trying get from bad host')
    return requests.get('https://bad.host.asdqwerqweo79ooo/')

try:
    bad_host()
except IOError as ex:
    print(f"Couldn't connect to bad host because: {str(ex)}")
...which gives the following output:
trying get from bad host
trying get from bad host
trying get from bad host
trying get from bad host
trying get from bad host
Couldn't connect to bad host because: HTTPSConnectionPool(host='bad.host.asdqwerqweo79ooo', port=443): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x10b6bd910>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known'))
Getting fancy
If you want to get fancy, you could also add things like exponential backoff and selectively retrying certain exceptions.
Here's an example:
import random
import time

from retrying import retry

def retry_ioerror(exception):
    return isinstance(exception, IOError)

@retry(
    wait_exponential_multiplier=100,
    wait_exponential_max=1000,
    retry_on_exception=retry_ioerror,
    stop_max_attempt_number=10)
def do_something():
    t = time.time()
    print(f'trying {t}')
    r = random.random()
    if r > 0.9:
        return 'yay!'
    if r > 0.8:
        raise RuntimeError('Boom!')
    else:
        raise IOError('Bang!')
try:
    result = do_something()
    print(f'Success! {result}')
except RuntimeError as ex:
    print(f"Failed: {str(ex)}")

read text file with ip address list and make a curl connection

I'm trying to create a Python script that reads a text file with a list of IP addresses, then sends a curl-like request to port 80 with a user name and password, to see if I can log into the web interface and return whatever the web page displays. Any help is most appreciated.
import sys
import requests

f = open('APs.txt', 'r')
c = f.read()
for i in c:
    r = requests.get('http://' + i, auth=('user', 'pass'))
    print(i, r.status_code)
f.close()
I think this is working as expected. Does anyone have a better way?
import sys
import requests

with open(r'APs.txt', 'r') as ips:
    for line in ips:
        ip = line.strip()
        r = requests.get('http://' + ip, auth=('user', 'pass'))
        print(ip, ',', r.status_code)
This is what I have now, but I am getting an error; I am trying to handle it with try/except but having issues.
import sys
import requests

with open(r'APs.txt', 'r') as ips:
    for line in ips:
        try:
            ip = line.strip()
            r = requests.get('http://' + ip, auth=('user', 'pass'), timeout=2)
            print(ip, ',', r.status_code)
        except ConnectTimeout:
            print(ip, ',', 'error')
Here is the error:
requests.exceptions.ConnectTimeout: HTTPConnectionPool(host='1.1.1.1', port=80): Max retries exceeded with url: / (Caused by ConnectTimeoutError(, 'Connection to 1.1.1.1 timed out. (connect timeout=1)'))
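The traceback points at the likely fix: ConnectTimeout is not a built-in name, so it must be imported before the except clause can reference it. A minimal correction of the snippet above; catching requests.exceptions.RequestException instead would also cover refused connections and other failures:

import requests
from requests.exceptions import ConnectTimeout

with open(r'APs.txt', 'r') as ips:
    for line in ips:
        ip = line.strip()
        try:
            r = requests.get('http://' + ip, auth=('user', 'pass'), timeout=2)
            print(ip, ',', r.status_code)
        except ConnectTimeout:
            print(ip, ',', 'error')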

Python Requests timeout parameter is being ignored

I'm using Python 2.7. I want every request to time out after a few seconds, but the requests time out almost immediately. The following is my code.
requestsTimeout = 5
link = 'http://' + IP + '/api/v1.0/system/info'

while (1):
    try:
        return requests.get(link, timeout=requestsTimeout)
    except requests.exceptions.RequestException as e:
        log._print0(_printID, 'getting DAQ Info', str(e))  # just printing
        time.sleep(0.1)
Now if I disconnect my Wi-Fi, I should get a printout of the timeout exception every 5 seconds, but I'm getting prints at a very fast rate (multiple times in one second).

When the host is unreachable, a ConnectionError is raised immediately, without waiting for the time set by timeout. You can work around this by handling that exception separately:
requestsTimeout = 5
link = 'http://' + IP + '/api/v1.0/system/info'

while True:
    try:
        return requests.get(link, timeout=requestsTimeout)
    except requests.exceptions.ConnectionError as e:
        time.sleep(requestsTimeout)
    except requests.exceptions.RequestException as e:
        log._print0(_printID, 'getting DAQ Info', str(e))  # just printing
        time.sleep(0.1)
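If the goal is a steady pace of one attempt per requestsTimeout seconds regardless of how an attempt failed, another option is to time each attempt and sleep only for the remainder of the interval. A minimal sketch (the fetch_info name and its parameters are mine, and time.monotonic needs Python 3; on the question's 2.7, time.time would serve):

import time
import requests

def fetch_info(url, interval=5):
    while True:
        started = time.monotonic()
        try:
            return requests.get(url, timeout=interval)
        except requests.exceptions.RequestException:
            # Sleep only for whatever part of the interval is left,
            # so fast failures and slow timeouts pace identically.
            elapsed = time.monotonic() - started
            time.sleep(max(0.0, interval - elapsed))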

Adding timeout while fetching server certs via python

I am trying to fetch a list of server certificates, using the Python standard ssl library to accomplish this. This is how I am doing it:
import ssl
from socket import *

urls = [i.strip().lower() for i in open("urls.txt")]
for urls in url:
    try:
        print ssl.get_server_certificate((url, 443))
    except error:
        print "No connection"
However, for some URLs there are connectivity issues and the connection just times out, but only after waiting out the default SSL timeout (which is quite long). How do I specify a timeout in the ssl.get_server_certificate method? I have specified timeouts for sockets before, but I am clueless as to how to do it for this method.
From the docs:
SSL sockets provide the following methods of Socket Objects:
gettimeout(), settimeout(), setblocking()
So it should just be as simple as:
import ssl
from socket import *

settimeout(10)

urls = [i.strip().lower() for i in open("urls.txt")]
for urls in url:
    try:
        print ssl.get_server_certificate((url, 443))
    except (error, timeout) as err:
        print "No connection: {0}".format(err)
This version runs for me with Python 3.9.12 (hat tip @bchurchill):
import ssl
import socket

socket.setdefaulttimeout(2)

urls = [i.strip().lower() for i in open("urls.txt")]
for url in urls:
    try:
        certificate = ssl.get_server_certificate((url, 443))
        print(certificate)
    except Exception as err:
        print(f"No connection to {url} due to: {err}")
