How do I ping a website or IP address with Python?
See this pure Python ping by Matthew Dixon Cowles and Jens Diemer. Also, remember that creating raw ICMP (i.e. ping) sockets requires root privileges on Linux.
# Python 2 example using the pure-Python "ping" module mentioned above
import ping, socket

try:
    ping.verbose_ping('www.google.com', count=3)
    delay = ping.Ping('www.wikipedia.org', timeout=2000).do()
except socket.error as e:
    print "Ping Error:", e
The source code itself is easy to read; see the implementations of verbose_ping and of Ping.do for inspiration.
Depending on what you want to achieve, it is probably easiest to call the system ping command.
Using the subprocess module is the best way of doing this, although you have to remember the ping command is different on different operating systems!
import subprocess
host = "www.google.com"
ping = subprocess.Popen(
["ping", "-c", "4", host],
stdout = subprocess.PIPE,
stderr = subprocess.PIPE
)
out, error = ping.communicate()
print out
You don't need to worry about shell-escape characters. For example..
host = "google.com; `echo test`
..will not execute the echo command.
Now, to actually get the ping results, you could parse the out variable. Example output:
round-trip min/avg/max/stddev = 248.139/249.474/250.530/0.896 ms
Example regex:
import re
matcher = re.compile(r"round-trip min/avg/max/stddev = (\d+\.\d+)/(\d+\.\d+)/(\d+\.\d+)/(\d+\.\d+)")
print matcher.search(out).groups()
# ('248.139', '249.474', '250.530', '0.896')
Again, remember that the output will vary depending on the operating system (and even the version of ping). This isn't ideal, but it will work fine in many situations (where you know the machines the script will be running on).
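If the script has to run on more than one OS, a minimal sketch like the following picks the count flag per platform before any parsing (the flag choice is an assumption about typical Windows vs. Unix ping builds, and the host is just an example):

import platform
import subprocess

def ping_once(host):
    # Windows ping uses "-n <count>", most Unix-like systems use "-c <count>"
    count_flag = "-n" if platform.system().lower() == "windows" else "-c"
    proc = subprocess.Popen(["ping", count_flag, "1", host],
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, _ = proc.communicate()
    # Return code 0 means at least one reply; keep the raw output for parsing
    return proc.returncode == 0, out.decode(errors="replace")

alive, raw_output = ping_once("www.google.com")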
You may find Noah Gift's presentation Creating Agile Commandline Tools With Python useful. In it he combines subprocess, Queue and threading to develop a solution that is capable of pinging hosts concurrently, speeding up the process. Below is a basic version before he adds command-line parsing and some other features. The code for this version and others can be found here
#!/usr/bin/env python2.5
from threading import Thread
import subprocess
from Queue import Queue
num_threads = 4
queue = Queue()
ips = ["10.0.1.1", "10.0.1.3", "10.0.1.11", "10.0.1.51"]
#wraps system ping command
def pinger(i, q):
"""Pings subnet"""
while True:
ip = q.get()
print "Thread %s: Pinging %s" % (i, ip)
ret = subprocess.call("ping -c 1 %s" % ip,
shell=True,
stdout=open('/dev/null', 'w'),
stderr=subprocess.STDOUT)
if ret == 0:
print "%s: is alive" % ip
else:
print "%s: did not respond" % ip
q.task_done()
#Spawn thread pool
for i in range(num_threads):
worker = Thread(target=pinger, args=(i, queue))
worker.setDaemon(True)
worker.start()
#Place work in queue
for ip in ips:
queue.put(ip)
#Wait until worker threads are done to exit
queue.join()
He is also the author of Python for Unix and Linux System Administration.
It's hard to say what your question is, but there are some alternatives.
If you mean to literally execute a request using the ICMP ping protocol, you can get an ICMP library and execute the ping request directly. Google "Python ICMP" to find things like this icmplib. You might want to look at scapy, also.
This will be much faster than using os.system("ping " + ip ).
If you mean to generically "ping" a box to see if it's up, you can use the echo protocol on port 7.
For echo, you use the socket library to open the IP address and port 7. You write something on that port, send a carriage return ("\r\n") and then read the reply.
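A minimal sketch of that echo-protocol check (assuming the target actually runs the echo service on TCP port 7, which many modern hosts do not):

import socket

def echo_check(host, timeout=3):
    # Connect to the echo service (TCP port 7), send a line, and expect it back
    with socket.create_connection((host, 7), timeout=timeout) as s:
        s.sendall(b"hello\r\n")
        return s.recv(64).startswith(b"hello")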
If you mean to "ping" a web site to see if the site is running, you have to use the http protocol on port 80.
For properly checking a web server, you use urllib2 to open a specific URL (/index.html is always popular) and read the response.
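For Python 3, the same idea looks roughly like this (urllib.request replaces urllib2; the URL is a placeholder):

from urllib.request import urlopen
from urllib.error import URLError

def site_up(url="http://www.example.com/index.html", timeout=5):
    try:
        with urlopen(url, timeout=timeout) as response:
            return response.status < 400   # the server answered with a usable status
    except URLError:
        return False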
There are still more potential meanings of "ping", including "traceroute" and "finger".
I did something similar this way; use it as inspiration:
import urllib
import threading
import time
def pinger_urllib(host):
    """
    helper function timing the retrieval of index.html
    TODO: should there be a 1MB bogus file?
    """
    t1 = time.time()
    urllib.urlopen('http://' + host + '/index.html').read()
    return (time.time() - t1) * 1000.0
def task(m):
"""
the actual task
"""
delay = float(pinger_urllib(m))
print '%-30s %5.0f [ms]' % (m, delay)
# parallelization
tasks = []
URLs = ['google.com', 'wikipedia.org']
for m in URLs:
t = threading.Thread(target=task, args=(m,))
t.start()
tasks.append(t)
# synchronization point
for t in tasks:
t.join()
Here's a short snippet using subprocess. The check_call method either returns 0 for success, or raises an exception. This way, I don't have to parse the output of ping. I'm using shlex to split the command line arguments.
import subprocess
import shlex
command_line = "ping -c 1 www.google.comsldjkflksj"
args = shlex.split(command_line)
try:
subprocess.check_call(args,stdout=subprocess.PIPE,stderr=subprocess.PIPE)
print "Website is there."
except subprocess.CalledProcessError:
print "Couldn't get a ping."
Most simple answer is:
import os
os.system("ping google.com")
I developed a library that I think could help you. It is called icmplib (unrelated to any other code of the same name that can be found on the Internet) and is a pure implementation of the ICMP protocol in Python.
It is completely object oriented and has simple functions such as the classic ping, multiping and traceroute, as well as low level classes and sockets for those who want to develop applications based on the ICMP protocol.
Here are some other highlights:
Can be run without root privileges.
You can customize many parameters such as the payload of ICMP packets and the traffic class (QoS).
Cross-platform: tested on Linux, macOS and Windows.
Fast and requires few CPU / RAM resources unlike calls made with subprocess.
Lightweight and does not rely on any additional dependencies.
To install it (Python 3.6+ required):
pip3 install icmplib
Here is a simple example of the ping function:
from icmplib import ping

host = ping('1.1.1.1', count=4, interval=1, timeout=2, privileged=True)
if host.is_alive:
print(f'{host.address} is alive! avg_rtt={host.avg_rtt} ms')
else:
print(f'{host.address} is dead')
Set the "privileged" parameter to False if you want to use the library without root privileges.
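For example, several hosts can be checked at once with the library's multiping helper (a sketch based on the documented API; verify the parameter names against the documentation linked below):

from icmplib import multiping

# privileged=False keeps this usable without root, as described above
hosts = multiping(['8.8.8.8', '1.1.1.1', '10.0.0.250'], count=2, privileged=False)
for host in hosts:
    state = 'alive' if host.is_alive else 'unreachable'
    print(f'{host.address} is {state} (avg_rtt={host.avg_rtt} ms)')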
You can find the complete documentation on the project page:
https://github.com/ValentinBELYN/icmplib
Hope you will find this library useful.
Read a file name; the file contains one URL per line, like this:
http://www.poolsaboveground.com/apache/hadoop/core/
http://mirrors.sonic.net/apache/hadoop/core/
Use the command:
python url.py urls.txt
to get results like:
Round Trip Time: 253 ms - mirrors.sonic.net
Round Trip Time: 245 ms - www.globalish.com
Round Trip Time: 327 ms - www.poolsaboveground.com
source code(url.py):
import re
import sys
import urlparse
from subprocess import Popen, PIPE
from threading import Thread
class Pinger(object):
def __init__(self, hosts):
for host in hosts:
hostname = urlparse.urlparse(host).hostname
if hostname:
pa = PingAgent(hostname)
pa.start()
else:
continue
class PingAgent(Thread):
def __init__(self, host):
Thread.__init__(self)
self.host = host
def run(self):
p = Popen('ping -n 1 ' + self.host, stdout=PIPE)
m = re.search('Average = (.*)ms', p.stdout.read())
if m: print 'Round Trip Time: %s ms -' % m.group(1), self.host
else: print 'Error: Invalid Response -', self.host
if __name__ == '__main__':
with open(sys.argv[1]) as f:
content = f.readlines()
Pinger(content)
import subprocess as s
ip=raw_input("Enter the IP/Domain name:")
if(s.call(["ping",ip])==0):
print "your IP is alive"
else:
print "Check ur IP"
If you want something actually in Python, that you can play with, have a look at Scapy:
from scapy.all import *
request = IP(dst="www.google.com")/ICMP()
answer = sr1(request)
That's, in my opinion, much better (and fully cross-platform) than some funky subprocess calls. Also you can have as much information about the answer (sequence ID, ...) as you want, since you have the packet itself.
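For instance, once sr1() returns, the reply can be inspected like any other Scapy packet (a sketch; it assumes a reply actually arrived):

if answer is not None:
    print(answer.summary())
    # Individual layers and fields are available directly
    print("reply from", answer[IP].src,
          "ICMP type", answer[ICMP].type,
          "id", answer[ICMP].id,
          "seq", answer[ICMP].seq)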
Using the system ping command to ping a list of hosts:
import re
from subprocess import Popen, PIPE
from threading import Thread
class Pinger(object):
def __init__(self, hosts):
for host in hosts:
pa = PingAgent(host)
pa.start()
class PingAgent(Thread):
def __init__(self, host):
Thread.__init__(self)
self.host = host
def run(self):
p = Popen('ping -n 1 ' + self.host, stdout=PIPE)
m = re.search('Average = (.*)ms', p.stdout.read())
if m: print 'Round Trip Time: %s ms -' % m.group(1), self.host
else: print 'Error: Invalid Response -', self.host
if __name__ == '__main__':
hosts = [
'www.pylot.org',
'www.goldb.org',
'www.google.com',
'www.yahoo.com',
'www.techcrunch.com',
'www.this_one_wont_work.com'
]
Pinger(hosts)
You can find an updated version of the mentioned script that works on both Windows and Linux here
Using the subprocess ping command, and decoding the output because the response is binary:
import subprocess
ping_response = subprocess.Popen(["ping", "-a", "google.com"], stdout=subprocess.PIPE).stdout.read()
result = ping_response.decode('utf-8')
print(result)
You might use socket to get the IP of the site and use scapy to execute an ICMP ping against that IP.
import gevent
from gevent import monkey
# monkey.patch_all() should be executed before importing any library
# that uses the standard library
monkey.patch_all()
import socket
from scapy.all import IP, ICMP, sr1
def ping_site(fqdn):
ip = socket.gethostbyaddr(fqdn)[-1][0]
print(fqdn, ip, '\n')
icmp = IP(dst=ip)/ICMP()
resp = sr1(icmp, timeout=10)
    # True if the host answered the ICMP echo request
    return (fqdn, resp is not None)
sites = ['www.google.com', 'www.baidu.com', 'www.bing.com']
jobs = [gevent.spawn(ping_site, fqdn) for fqdn in sites]
gevent.joinall(jobs)
print([job.value for job in jobs])
On Python 3 you can use ping3.
from ping3 import ping, verbose_ping

ip_host = '8.8.8.8'
if not ping(ip_host):
    raise ValueError('{} is not available.'.format(ip_host))
If you only want to check whether a machine on an IP is active or not, you can just use python sockets.
import socket
s = socket.socket()
try:
s.connect(("192.168.1.123", 1234)) # You can use any port number here
except Exception as e:
print(e.errno, e)
Now, according to the error message displayed (or the error number), you can determine whether the machine is active or not.
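As a rough sketch of that interpretation (errno semantics vary slightly between operating systems): a refused connection means something answered, so the machine is up even though the port is closed, while a timeout usually means the host is down or filtered.

import socket

def probe(host, port=1234, timeout=3):
    s = socket.socket()
    s.settimeout(timeout)
    try:
        s.connect((host, port))
        return "up, port open"
    except ConnectionRefusedError:
        return "up, port closed"        # the host replied with a RST
    except (socket.timeout, OSError):
        return "down or filtered"       # no answer at all
    finally:
        s.close()

print(probe("192.168.1.123"))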
Use this; it's tested on Python 2.7 and works fine. It returns the ping time in milliseconds on success and returns False on failure.
import platform, subprocess, re

def Ping(hostname, timeout):
    if platform.system() == "Windows":
        command = "ping " + hostname + " -n 1 -w " + str(timeout * 1000)
    else:
        command = "ping -i " + str(timeout) + " -c 1 " + hostname
    process = subprocess.Popen(command.split(), stdout=subprocess.PIPE)
    # Matches both "time=12ms" (Windows) and "time=12.3 ms" (Linux) output
    matches = re.match(r'.*time[=<]([0-9]+\.?[0-9]*) ?ms.*', process.stdout.read(), re.DOTALL)
    if matches:
        return matches.group(1)
    else:
        return False
I want to see if I can access an online API, but for that, I need to have Internet access.
How can I see if there's a connection available and active using Python?
Perhaps you could use something like this:
import urllib2
def internet_on():
try:
urllib2.urlopen('http://216.58.192.142', timeout=1)
return True
except urllib2.URLError as err:
return False
Currently, 216.58.192.142 is one of the IP addresses for google.com. Change http://216.58.192.142 to whatever site can be expected to respond quickly.
This fixed IP will not map to google.com forever. So this code is
not robust -- it will need constant maintenance to keep it working.
The reason the code above uses a fixed IP address instead of a fully qualified domain name (FQDN) is that an FQDN would require a DNS lookup. When the machine does not have a working internet connection, the DNS lookup itself may block the call to urllib.request.urlopen for more than a second. Thanks to @rzetterberg for pointing this out.
If the fixed IP address above is not working, you can find a current IP address for google.com (on unix) by running
% dig google.com +trace
...
google.com. 300 IN A 216.58.192.142
If we can connect to some Internet server, then we indeed have connectivity. However, for the fastest and most reliable approach, all solutions should comply with the following requirements, at the very least:
Avoid DNS resolution (we will need an IP that is well-known and guaranteed to be available for most of the time)
Avoid application layer connections (connecting to an HTTP/FTP/IMAP service)
Avoid calls to external utilities from Python or other language of choice (we need to come up with a language-agnostic solution that doesn't rely on third-party solutions)
To comply with these, one approach could be to check whether one of Google's public DNS servers is reachable. The IPv4 addresses for these servers are 8.8.8.8 and 8.8.4.4. We can try connecting to any of them.
A quick Nmap of the host 8.8.8.8 gave the result below:
$ sudo nmap 8.8.8.8
Starting Nmap 6.40 ( http://nmap.org ) at 2015-10-14 10:17 IST
Nmap scan report for google-public-dns-a.google.com (8.8.8.8)
Host is up (0.0048s latency).
Not shown: 999 filtered ports
PORT STATE SERVICE
53/tcp open domain
Nmap done: 1 IP address (1 host up) scanned in 23.81 seconds
As we can see, 53/tcp is open and non-filtered. If you are a non-root user, remember to use sudo or the -Pn argument for Nmap to send crafted probe packets and determine if a host is up.
Before we try with Python, let's test connectivity using an external tool, Netcat:
$ nc 8.8.8.8 53 -zv
Connection to 8.8.8.8 53 port [tcp/domain] succeeded!
Netcat confirms that we can reach 8.8.8.8 over 53/tcp. Now we can set up a socket connection to 8.8.8.8:53/tcp in Python to check connection:
import socket
def internet(host="8.8.8.8", port=53, timeout=3):
"""
Host: 8.8.8.8 (google-public-dns-a.google.com)
OpenPort: 53/tcp
Service: domain (DNS/TCP)
"""
try:
socket.setdefaulttimeout(timeout)
socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((host, port))
return True
except socket.error as ex:
print(ex)
return False
internet()
Another approach could be to send a manually crafted DNS probe to one of these servers and wait for a response. But, I assume, it might prove slower in comparison due to packet drops, DNS resolution failure, etc. Please comment if you think otherwise.
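For completeness, here is what such a hand-built probe could look like (a sketch: a minimal A-record query per RFC 1035 sent over UDP, where any reply inside the timeout is taken as connectivity):

import socket
import struct

def dns_probe(server="8.8.8.8", name="example.com", timeout=3):
    # Header: ID, flags (recursion desired), QDCOUNT=1, ANCOUNT/NSCOUNT/ARCOUNT=0
    header = struct.pack("!HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # Question: length-prefixed labels, terminating zero, QTYPE=A, QCLASS=IN
    qname = b"".join(bytes([len(label)]) + label.encode() for label in name.split("."))
    question = qname + b"\x00" + struct.pack("!HH", 1, 1)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(header + question, (server, 53))
        sock.recvfrom(512)      # any response at all means the server is reachable
        return True
    except socket.timeout:
        return False
    finally:
        sock.close()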
UPDATE #4: This listing of public nameservers is a good reference for IPs to test against.
UPDATE #3: Tested again after the exception handling change:
defos.py
True
00:00:00:00.410
iamaziz.py
True
00:00:00:00.240
ivelin.py
True
00:00:00:00.109
jaredb.py
True
00:00:00:00.520
kevinc.py
True
00:00:00:00.317
unutbu.py
True
00:00:00:00.436
7h3rAm.py
True
00:00:00:00.030
UPDATE #2: I did quick tests to identify the fastest and most generic implementation of all valid answers to this question. Here's the summary:
$ ls *.py | sort -n | xargs -I % sh -c 'echo %; ./timeit.sh %; echo'
defos.py
True
00:00:00:00.487
iamaziz.py
True
00:00:00:00.335
ivelin.py
True
00:00:00:00.105
jaredb.py
True
00:00:00:00.533
kevinc.py
True
00:00:00:00.295
unutbu.py
True
00:00:00:00.546
7h3rAm.py
True
00:00:00:00.032
And once more:
$ ls *.py | sort -n | xargs -I % sh -c 'echo %; ./timeit.sh %; echo'
defos.py
True
00:00:00:00.450
iamaziz.py
True
00:00:00:00.358
ivelin.py
True
00:00:00:00.099
jaredb.py
True
00:00:00:00.585
kevinc.py
True
00:00:00:00.492
unutbu.py
True
00:00:00:00.485
7h3rAm.py
True
00:00:00:00.035
True in the above output signifies that all these implementations from respective authors correctly identify connectivity to the Internet. Time is shown with milliseconds resolution.
UPDATE #1: Thanks to #theamk's comment, timeout is now an argument and initialized to 3s by default.
It will be faster to just make a HEAD request so no HTML will be fetched.
try:
    import httplib  # Python 2
except ImportError:
    import http.client as httplib
def have_internet() -> bool:
conn = httplib.HTTPSConnection("8.8.8.8", timeout=5)
try:
conn.request("HEAD", "/")
return True
except Exception:
return False
finally:
conn.close()
As an alternative to unutbu's/Kevin C's answers, I use the requests package like this:
import requests
def connected_to_internet(url='http://www.google.com/', timeout=5):
try:
_ = requests.head(url, timeout=timeout)
return True
except requests.ConnectionError:
print("No internet connection available.")
return False
Bonus: this can be extended to this function that pings a website.
def web_site_online(url='http://www.google.com/', timeout=5):
try:
req = requests.head(url, timeout=timeout)
# HTTP errors are not raised by default, this statement does that
req.raise_for_status()
return True
except requests.HTTPError as e:
print("Checking internet connection failed, status code {0}.".format(
e.response.status_code))
except requests.ConnectionError:
print("No internet connection available.")
return False
Just to update what unutbu said, for new code in Python 3.2:
import urllib.request
import urllib.error

def check_connectivity(reference):
    try:
        urllib.request.urlopen(reference, timeout=1)
        return True
    except urllib.error.URLError:
        return False
And, just to note, the input here (reference) is the URL that you want to check. I suggest choosing something that connects fast where you live -- for example, I live in South Korea, so I would probably set reference to http://www.naver.com.
You can just try to download data, and if the connection fails you will know that something with the connection isn't fine.
Basically, you can't check whether the computer is connected to the internet. There can be many reasons for failure, like wrong DNS configuration, firewalls, or NAT. So even if some tests succeed, you have no guarantee that you will have a connection to your API until you try.
import urllib
def connected(host='http://google.com'):
try:
urllib.urlopen(host)
return True
except:
return False
# test
print( 'connected' if connected() else 'no internet!' )
For python 3, use urllib.request.urlopen(host)
Try the operation you were attempting to do anyway. If it fails, Python will throw an exception to let you know.
Trying some trivial operation first to detect a connection would introduce a race condition. What if the internet connection is valid when you test but goes down before you need to do the actual work?
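In practice that just means wrapping the real call and handling the failure where it happens, for example with a small retry loop (requests.get here is only a stand-in for whatever API call you actually make):

import time
import requests

def call_api(url, attempts=3, wait=5):
    for attempt in range(attempts):
        try:
            return requests.get(url, timeout=5)    # the real work
        except requests.RequestException:
            if attempt == attempts - 1:
                raise                              # out of retries, let the caller handle it
            time.sleep(wait)                       # transient failure: wait and try again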
Here's my version
import requests
try:
if requests.get('https://google.com').ok:
print("You're Online")
except:
print("You're Offline")
This might not work if the localhost has been changed from 127.0.0.1
Try
import socket
ipaddress=socket.gethostbyname(socket.gethostname())
if ipaddress=="127.0.0.1":
print("You are not connected to the internet!")
else:
print("You are connected to the internet with the IP address of "+ ipaddress )
Unless edited, your computer's IP will be 127.0.0.1 when not connected to the internet.
This code basically gets the IP address and then checks whether it is the localhost IP address.
Hope that helps
A modern portable solution with requests:
import requests
def internet():
"""Detect an internet connection."""
connection = None
try:
r = requests.get("https://google.com")
r.raise_for_status()
print("Internet connection detected.")
connection = True
except:
print("Internet connection not detected.")
connection = False
finally:
return connection
Or, a version that raises an exception:
import requests
from requests.exceptions import ConnectionError
def internet():
"""Detect an internet connection."""
try:
r = requests.get("https://google.com")
r.raise_for_status()
print("Internet connection detected.")
except ConnectionError as e:
print("Internet connection not detected.")
raise e
The best way to do this is to check against the IP address that Python gives back when it can't find the website (92.242.140.2 in my case). Here is my code:
import socket

print("website connection checker")
while True:
    website = input("please input website: ")
    print("")
    ip = socket.gethostbyname(website)
    print(ip)
    if ip == "92.242.140.2":
        print("Website could be experiencing an issue/Doesn't exist")
    else:
        print("Website is operational!")
    print("")
My favorite one, whether running scripts on a cluster or not:
import subprocess
def online(timeout):
try:
return subprocess.run(
['wget', '-q', '--spider', 'google.com'],
timeout=timeout
).returncode == 0
except subprocess.TimeoutExpired:
return False
This runs wget quietly, not downloading anything, just checking that the given remote page exists.
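For example (this assumes wget is installed; the timeout is passed straight to subprocess.run and is in seconds):

print("connected" if online(timeout=10) else "no connection")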
Taking unutbu's answer as a starting point, and having been burned in the past by a "static" IP address changing, I've made a simple class that checks once using a DNS lookup (i.e., using the URL "https://www.google.com"), and then stores the IP address of the responding server for use on subsequent checks. That way, the IP address is always up to date (assuming the class is re-initialized at least once every few years or so). I also give credit to gawry for this answer, which showed me how to get the server's IP address (after any redirection, etc.). Please disregard the apparent hackiness of this solution, I'm going for a minimal working example here. :)
Here is what I have:
import socket
try:
from urllib2 import urlopen, URLError
from urlparse import urlparse
except ImportError: # Python 3
from urllib.parse import urlparse
from urllib.request import urlopen, URLError
class InternetChecker(object):
conn_url = 'https://www.google.com/'
def __init__(self):
pass
def test_internet(self):
try:
data = urlopen(self.conn_url, timeout=5)
except URLError:
return False
try:
host = data.fp._sock.fp._sock.getpeername()
except AttributeError: # Python 3
host = data.fp.raw._sock.getpeername()
# Ensure conn_url is an IPv4 address otherwise future queries will fail
self.conn_url = 'http://' + (host[0] if len(host) == 2 else
socket.gethostbyname(urlparse(data.geturl()).hostname))
return True
# Usage example
checker = InternetChecker()
checker.test_internet()
Taking Six's answer, I think we could simplify it somewhat, which matters because newcomers get lost in highly technical answers.
Here is what I finally use to wait for my (slow 3G) connection to be established once a day for my PV monitoring.
Works under Python 3.4.2 on Raspbian.
from urllib.request import urlopen
from time import sleep

urltotest = 'http://www.lsdx.eu'   # my own web page
nboftrials = 0
answer = 'NO'
while answer == 'NO' and nboftrials < 10:
    try:
        urlopen(urltotest)
        answer = 'YES'
    except:
        answer = 'NO'
        nboftrials += 1
        sleep(30)
Maximum running time: 5 minutes. If that is reached, I will try again in one hour's time, but that's another bit of script!
Taking Ivelin's answer and adding an extra check, because my router delivers its own IP address (192.168.0.1) and returns a HEAD response for google.com even when it has no internet connection.
import socket
try:
    import httplib  # Python 2
except ImportError:
    import http.client as httplib
def haveInternet():
try:
# first check if we get the correct IP-Address or just the router's IP-Address
info = socket.getaddrinfo("www.google.com", None)[0]
ipAddr = info[4][0]
if ipAddr == "192.168.0.1" :
return False
except:
return False
conn = httplib.HTTPConnection("www.google.com", timeout=5)
try:
conn.request("HEAD", "/")
conn.close()
return True
except:
conn.close()
return False
This works for me in Python 3.6:
import urllib
from urllib.request import urlopen
def is_internet():
"""
Query internet using python
:return:
"""
try:
urlopen('https://www.google.com', timeout=1)
return True
except urllib.error.URLError as Error:
print(Error)
return False
if is_internet():
print("Internet is active")
else:
print("Internet disconnected")
I added a few things to Joel's code.
import socket,time
mem1 = 0
while True:
try:
host = socket.gethostbyname("www.google.com") #Change to personal choice of site
s = socket.create_connection((host, 80), 2)
s.close()
mem2 = 1
if (mem2 == mem1):
pass #Add commands to be executed on every check
else:
mem1 = mem2
print ("Internet is working") #Will be executed on state change
except Exception as e:
mem2 = 0
if (mem2 == mem1):
pass
else:
mem1 = mem2
print ("Internet is down")
time.sleep(10) #timeInterval for checking
For my projects I use a script modified to ping the Google public DNS server 8.8.8.8, using a timeout of 1 second and core Python libraries with no external dependencies:
import struct
import socket
import select

def send_one_ping(to='8.8.8.8'):
    # Raw ICMP sockets generally require root privileges
    ping_socket = socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.getprotobyname('icmp'))
    checksum = 49410  # precomputed for this exact header and payload
    header = struct.pack('!BBHHH', 8, 0, checksum, 0x123, 1)
    data = b'BCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwx'
    packet = header + data
    ping_socket.sendto(packet, (to, 1))
    inputready, _, _ = select.select([ping_socket], [], [], 1.0)
    if inputready == []:
        raise Exception('No internet')  # or return False
    _, address = ping_socket.recvfrom(2048)
    print(address)  # or return True

send_one_ping()
The select timeout here is 1 second, but it can be any floating-point number of your choice if you want the check to fail more quickly.
Make sure your pip is up to date by running
pip install --upgrade pip
Install the requests package using
pip install requests
import requests
import webbrowser
url = "http://www.youtube.com"
timeout = 6
try:
request = requests.get(url, timeout=timeout)
print("Connected to the Internet")
print("browser is loading url")
webbrowser.open(url)
except (requests.ConnectionError, requests.Timeout) as exception:
print("poor or no internet connection.")
Import requests and try this simple Python code.
import requests

def check_internet():
url = 'http://www.google.com/'
timeout = 5
try:
_ = requests.get(url, timeout=timeout)
return True
except requests.ConnectionError:
return False
I just want to refer to Ivelin's solution, because I can't comment there.
In Python 2.7 with an old SSL certificate (in my case not possible to update, which is another story), there is a possibility of a certificate error. In that case, replacing '8.8.8.8' with 'dns.google' or '8888.google' can help.
Hope this helps someone too.
try:
    import httplib  # Python 2
except ImportError:
    import http.client as httplib
def have_internet():
conn = httplib.HTTPSConnection("8888.google", timeout=5)
try:
conn.request("HEAD", "/")
return True
except Exception:
return False
finally:
conn.close()
Is there any efficient way to check and report in a log file or on the console may be... when ever the VPN is disconnected?
import time
print time.asctime( time.localtime(time.time()) )
can print the time, but I do not know the code to repeatedly check whether the VPN is active or not. Pinging it in a while(1) loop would be a stupid way to check if the connection is active or not. Is there any way to achieve this?
This solution is system dependent. I know it works on Linux because I've done something similar, but I am not sure about Windows. I don't know if you want a solution not involving ping, but I think this is a good one.
import logging, os, time
PING_HOST='10.10.10.10' # some host on the other side of the VPN
while True:
retcode = os.system('ping -c 1 %s' % PING_HOST)
if retcode:
# perform action for lost connection
logging.warn('Lost visibility with %s' % PING_HOST)
time.sleep(10) # sleep 10 seconds
This works because ping returns a return code of 0 for success. All other return codes signify an error.
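A slightly quieter sketch of the same loop, using subprocess so that ping's output is discarded and only the log line appears (the -c/-W flags assume a Linux-style ping):

import logging
import subprocess
import time

PING_HOST = '10.10.10.10'  # some host on the other side of the VPN

while True:
    result = subprocess.run(['ping', '-c', '1', '-W', '2', PING_HOST],
                            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    if result.returncode != 0:
        logging.warning('Lost visibility with %s', PING_HOST)
    time.sleep(10)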
In case the IP of the VPN changes, you can check whether a tunnel has been established at all.
import psutil
import logging, os, time
import subprocess
import sys
procname = "yourprocess_name"
while True:
cmdout = subprocess.Popen(["ifconfig | grep tun"],stdout = subprocess.PIPE, shell=True).communicate()[0]
print "cmdout: "+str(cmdout)
time.sleep(2)
#-----
if "tun" in cmdout:
print "seems to be ok"
if not "tun" in cmdout:
# perform action for lost connection
print "killing "+str(procname)
for proc in psutil.process_iter():
# check whether the process name matches
print "Listing procname: "+str(proc.name())
if proc.name() == procname:
proc.kill()
sys.exit()
This method uses the name of the host under "Connection-specific DNS Suffix" associated with your IP (usually the corporation's VPN):
import os
import platform
def check_ping():
hostname = "xyz.com" #hostname will be..Name under: "Connection-specific DNS Suffix" when you type "ipconfig" in cmd..
response = os.system("ping " + ("-n 1 " if platform.system().lower()=="windows" else "-c 1 ") + hostname)
# and then check the response...
if response == 0:
pingstatus = "Network Active: Connected"
else:
pingstatus = "Network Error: Not Connected"
return pingstatus
response = check_ping()
print(response)
How would I check whether the remote host is up without having a port number? Is there any other way I could check, other than using a regular ping?
There is a possibility that the remote host might drop ping packets.
This worked fine for me:
HOST_UP = os.system("ping -c 1 " + SOMEHOST) == 0
A protocol-level PING is best, i.e., connecting to the server and interacting with it in a way that doesn't do real work. That's because it is the only real way to be sure that the service is up. An ICMP ECHO (a.k.a. ping) would only tell you that the other end's network interface is up, and even then might be blocked; FWIW, I have seen machines where all user processes were bricked but which could still be pinged. In these days of application servers, even getting a network connection might not be enough; what if the hosted app is down or otherwise non-functional? As I said, talking sweet-nothings to the actual service that you are interested in is the best, surest approach.
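A sketch of such a protocol-level check for a web host, using an HTTP HEAD request against the service itself rather than an ICMP echo (host, port and path are placeholders):

import http.client
import socket

def service_up(host, port=80, timeout=5):
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("HEAD", "/")
        return conn.getresponse().status < 500   # the application answered sensibly
    except (OSError, http.client.HTTPException):
        return False
    finally:
        conn.close()

print(service_up("www.google.com"))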
HOST_UP = os.system("ping -c 5 " + SOMEHOST.strip(";")) == 0
To reduce the chance of nasty script execution, add .strip(";") to the host string (better still, use subprocess with an argument list instead of os.system).
-c 5
increases the number of ping requests; ping exits with 0 (and HOST_UP is True) if it receives at least one reply.
PS: This works only on Linux; on Windows it always returns True.
The best you can do is:
Try and connect on a known port (eg port 80 or 443 for HTTP or HTTPS); or
Ping the site. See Ping a site in Python?
Many sites block ICMP (the protocol used to ping sites), so you must know beforehand whether the host in question has it enabled or not.
Connecting to a port gives you mixed information. It really depends on what you want to know. A port might be open while the site is effectively hung, so you may get a false positive. A more stringent approach might involve using an HTTP library to execute a web request against the site and see if you get back a response.
It really all depends on what you need to know.
Many firewalls are configured to drop ping packets without responding. In addition, some network adapters will respond to ICMP ping requests without input from the operating system network stack, which means the operating system might be down, but the host still responds to pings (usually you'll notice if you reboot the server, say, it'll start responding to pings some time before the OS actually comes up and other services start up).
The only way to be certain that a host is up is to actually try to connect to it via some well-known port (e.g. web server port 80).
Why do you need to know if the host is "up", maybe there's a better way to do it.
What about trying something that requires an RPC, like a 'tasklist' command, in conjunction with a ping?
I would use a port scanner. The original question states that you don't want to use a port, but then you need to specify which protocol (and yes, this needs a port) you want to monitor: HTTP, VNC, SSH, etc. If you want to monitor via ICMP, you can use subprocess and control the ping parameters: number of pings, timeout, size, etc.
import subprocess

def connectivity():
    try:
        res = subprocess.Popen(['ping -t2 -c 4 110.10.0.254 &> /dev/null; echo $?'],
                               shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        out, err = res.communicate()
        out = out.rstrip()
        err = err.rstrip()
        print 'general.connectivity() Out: ' + out
        print 'general.connectivity() Err: ' + err
        if out == "0":
            print 'general.connectivity() Successful'
            return True
        print 'general.connectivity() Failed'
        return False
    except Exception as e:
        print 'general.connectivity() Exception'
        return False
In case you want a port scanner
import socket
from functools import partial
from multiprocessing import Pool
from multiprocessing.pool import ThreadPool
from errno import ECONNREFUSED
NUM_CORES = 4
def portscan(target,port):
try:
# Create Socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
socketTimeout = 5
s.settimeout(socketTimeout)
s.connect((target,port))
print('port_scanner.is_port_opened() ' + str(port) + " is opened")
return port
except socket.error as err:
if err.errno == ECONNREFUSED:
return False
# Wrapper function that calls portscanner
def scan_ports(server=None,port=None,portStart=None,portEnd=None,**kwargs):
p = Pool(NUM_CORES)
ping_host = partial(portscan, server)
    if portStart and portEnd:
        return filter(bool, p.map(ping_host, range(portStart, portEnd)))
else:
return filter(bool, p.map(ping_host, range(port, port+1)))
# Check if port is opened
def is_port_opened(server=None,port=None, **kwargs):
print('port_scanner.is_port_opened() Checking port...')
try:
# Add More proccesses in case we look in a range
pool = ThreadPool(processes=1)
try:
ports = list(scan_ports(server=server,port=int(port)))
print("port_scanner.is_port_opened() Port scanner done.")
if len(ports)!=0:
print('port_scanner.is_port_opened() ' + str(len(ports)) + " port(s) available.")
return True
else:
print('port_scanner.is_port_opened() port not opened: (' + port +')')
return False
        except Exception as e:
            raise
    except Exception as e:
        print(e)
        raise