How would I check if a remote host is up when I don't have a port number? Is there any way to check this other than using a regular ping?
There is a possibility that the remote host might drop ping packets.
This worked fine for me:
HOST_UP = os.system("ping -c 1 " + SOMEHOST) == 0
A protocol-level PING is best, i.e., connecting to the server and interacting with it in a way that doesn't do real work. That's because it is the only real way to be sure that the service is up. An ICMP ECHO (a.k.a. ping) would only tell you that the other end's network interface is up, and even then might be blocked; FWIW, I have seen machines where all user processes were bricked but which could still be pinged. In these days of application servers, even getting a network connection might not be enough; what if the hosted app is down or otherwise non-functional? As I said, talking sweet-nothings to the actual service that you are interested in is the best, surest approach.
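For instance, a minimal sketch of such a protocol-level check against an SMTP service (the host name is a placeholder; swap in whatever service you actually care about):

import smtplib

def smtp_is_up(host, port=25, timeout=5):
    # Protocol-level check: connect to an SMTP service and send a NOOP.
    # A 250 reply means the service itself is answering, which is a
    # stronger signal than an ICMP echo from the host.
    try:
        server = smtplib.SMTP(host, port, timeout=timeout)
    except (OSError, smtplib.SMTPException):
        return False
    try:
        code, _ = server.noop()  # harmless command that does no real work
        return code == 250
    except smtplib.SMTPException:
        return False
    finally:
        server.close()

# Usage (hypothetical mail host):
# print(smtp_is_up("mail.example.com"))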
HOST_UP = os.system("ping -c 5 " + SOMEHOST.strip(";")) == 0
To reduce the risk of nasty script execution, add .strip(";") to the host string before interpolating it into the command.
The -c 5 flag increases the number of ping requests; the result is True only if all of them succeed.
PS: this works only on Linux; on Windows (where ping uses -n instead of -c) it always returns True.
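A cross-platform sketch of the same idea, using subprocess instead of os.system and switching the count flag based on the OS (the host name below is just an example):

import platform
import subprocess

def host_responds_to_ping(host, count=1, timeout=5):
    # Windows ping uses -n for the count, everything else uses -c.
    flag = "-n" if platform.system().lower() == "windows" else "-c"
    try:
        # Passing the command as a list avoids the shell-injection issue
        # of building a command string by concatenation.
        result = subprocess.run(
            ["ping", flag, str(count), host],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
            timeout=timeout,
        )
        return result.returncode == 0
    except subprocess.TimeoutExpired:
        return False

# print(host_responds_to_ping("example.com"))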
The best you can do is:
Try to connect on a known port (e.g. port 80 or 443 for HTTP or HTTPS); or
Ping the site. See Ping a site in Python?
Many sites block ICMP (the protocol used to ping sites), so you must know beforehand whether the host in question has it enabled or not.
Connecting to a port gives you mixed information. It really depends on what you want to know: a port might be open while the site is effectively hung, so you may get a false positive. A more stringent approach might involve using an HTTP library to execute a web request against the site and see whether you get a response back.
It really all depends on what you need to know.
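For reference, a minimal sketch of that more stringent HTTP-level check using only the standard library (the URL is a placeholder):

from urllib.request import urlopen
from urllib.error import URLError

def site_responds(url="http://example.com/", timeout=5):
    # Returns True only if the site answers the HTTP request with a
    # non-error status; a 4xx/5xx raises HTTPError, a subclass of URLError.
    try:
        with urlopen(url, timeout=timeout) as response:
            return 200 <= response.status < 400
    except URLError:
        return False

# print(site_responds("http://example.com/"))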
Many firewalls are configured to drop ping packets without responding. In addition, some network adapters will respond to ICMP ping requests without input from the operating system network stack, which means the operating system might be down, but the host still responds to pings (usually you'll notice if you reboot the server, say, it'll start responding to pings some time before the OS actually comes up and other services start up).
The only way to be certain that a host is up is to actually try to connect to it via some well-known port (e.g. web server port 80).
Why do you need to know if the host is "up"? Maybe there's a better way to do it.
What about trying something that requires an RPC, like a 'tasklist' command, in conjunction with a ping?
I would use a port scanner. The original question states that you don't want to use a port, but then you need to specify which protocol you want to monitor (HTTP, VNC, SSH, etc.), and every protocol implies a port. If you want to monitor via ICMP instead, you can use subprocess and control the ping parameters: number of pings, timeout, packet size, etc.
import subprocess

def connectivity():
    try:
        res = subprocess.Popen(['ping -t2 -c 4 110.10.0.254 &> /dev/null; echo $?'],
                               shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        out, err = res.communicate()
        out = out.rstrip()
        err = err.rstrip()
        print 'general.connectivity() Out: ' + out
        print 'general.connectivity() Err: ' + err
        if out == "0":
            print 'general.connectivity() Successful'
            return True
        print 'general.connectivity() Failed'
        return False
    except Exception, e:
        print 'general.connectivity() Exception'
        return False
In case you want a port scanner
import socket
from functools import partial
from multiprocessing import Pool
from multiprocessing.pool import ThreadPool
from errno import ECONNREFUSED

NUM_CORES = 4

def portscan(target, port):
    try:
        # Create socket
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        socketTimeout = 5
        s.settimeout(socketTimeout)
        s.connect((target, port))
        print('port_scanner.is_port_opened() ' + str(port) + " is opened")
        return port
    except socket.error as err:
        if err.errno == ECONNREFUSED:
            return False

# Wrapper function that calls portscan over a single port or a range
def scan_ports(server=None, port=None, portStart=None, portEnd=None, **kwargs):
    p = Pool(NUM_CORES)
    ping_host = partial(portscan, server)
    if portStart and portEnd:
        return filter(bool, p.map(ping_host, range(portStart, portEnd)))
    else:
        return filter(bool, p.map(ping_host, range(port, port + 1)))

# Check if a port is opened
def is_port_opened(server=None, port=None, **kwargs):
    print('port_scanner.is_port_opened() Checking port...')
    try:
        # Add more processes in case we look in a range
        pool = ThreadPool(processes=1)
        try:
            ports = list(scan_ports(server=server, port=int(port)))
            print("port_scanner.is_port_opened() Port scanner done.")
            if len(ports) != 0:
                print('port_scanner.is_port_opened() ' + str(len(ports)) + " port(s) available.")
                return True
            else:
                print('port_scanner.is_port_opened() port not opened: (' + str(port) + ')')
                return False
        except Exception, e:
            raise
    except Exception, e:
        print e
        raise
Related
I have made a program that tells me if I am connected to the internet or not. Now I want it to ping www.google.com and show me the ping time in ms. I don't want to use any 3rd party software or download anything.
Edit:
My code is:
import socket

REMOTE_SERVER = "www.google.com"  # the site I want to reach

def is_connected():
    try:
        # see if we can resolve the host name -- tells us if there is
        # a DNS listening
        host = socket.gethostbyname(REMOTE_SERVER)
        # connect to the host -- tells us if the host is actually
        # reachable
        s = socket.create_connection((host, 80), 2)
        return True
    except:
        pass
    return False
The above code just tells me if I am connected to the internet.
What I want is a simple way to show the ping of a website.
This is not a duplicate as it doesn't answer my question.
A ping is not the same thing as an HTTP connection! The former is a low-level ICMP packet used to test connectivity and measure round-trip time, mainly on a local network. It is generally not used on the broader internet because, for security reasons, it is often blocked by firewalls and external routers.
If you want to know the time necessary to establish the connection to a server, do what you would do in the real world: look at your watch, do the job, then look at your watch again to see the elapsed time. In Python that gives:
import time
...

def connect_time():
    try:
        # see if we can resolve the host name -- tells us if there is
        # a DNS listening
        host = socket.gethostbyname(REMOTE_SERVER)
        # connect to the host -- tells us if the host is actually
        # reachable
        before = time.clock()  # from Python 3.3 and above use before = time.perf_counter()
        s = socket.create_connection((host, 80), 2)
        after = time.clock()  # from Python 3.3 and above use after = time.perf_counter()
        return after - before
    except:
        return -1
I want to see if I can access an online API, but for that, I need to have Internet access.
How can I see if there's a connection available and active using Python?
Perhaps you could use something like this:
import urllib2

def internet_on():
    try:
        urllib2.urlopen('http://216.58.192.142', timeout=1)
        return True
    except urllib2.URLError as err:
        return False
Currently, 216.58.192.142 is one of the IP addresses for google.com. Change http://216.58.192.142 to whatever site can be expected to respond quickly.
This fixed IP will not map to google.com forever. So this code is
not robust -- it will need constant maintenance to keep it working.
The reason why the code above uses a fixed IP address instead of a fully qualified domain name (FQDN) is that a FQDN would require a DNS lookup. When the machine does not have a working internet connection, the DNS lookup itself may block the call to urlopen for more than a second. Thanks to @rzetterberg for pointing this out.
If the fixed IP address above is not working, you can find a current IP address for google.com (on unix) by running
% dig google.com +trace
...
google.com. 300 IN A 216.58.192.142
If we can connect to some Internet server, then we indeed have connectivity. However, for the fastest and most reliable approach, all solutions should comply with the following requirements, at the very least:
Avoid DNS resolution (we will need an IP that is well-known and guaranteed to be available for most of the time)
Avoid application layer connections (connecting to an HTTP/FTP/IMAP service)
Avoid calls to external utilities from Python or other language of choice (we need to come up with a language-agnostic solution that doesn't rely on third-party solutions)
To comply with these, one approach could be to check whether one of Google's public DNS servers is reachable. The IPv4 addresses for these servers are 8.8.8.8 and 8.8.4.4. We can try connecting to either of them.
A quick Nmap scan of the host 8.8.8.8 gave the result below:
$ sudo nmap 8.8.8.8
Starting Nmap 6.40 ( http://nmap.org ) at 2015-10-14 10:17 IST
Nmap scan report for google-public-dns-a.google.com (8.8.8.8)
Host is up (0.0048s latency).
Not shown: 999 filtered ports
PORT STATE SERVICE
53/tcp open domain
Nmap done: 1 IP address (1 host up) scanned in 23.81 seconds
As we can see, 53/tcp is open and non-filtered. If you are a non-root user, remember to use sudo or the -Pn argument for Nmap to send crafted probe packets and determine if a host is up.
Before we try with Python, let's test connectivity using an external tool, Netcat:
$ nc 8.8.8.8 53 -zv
Connection to 8.8.8.8 53 port [tcp/domain] succeeded!
Netcat confirms that we can reach 8.8.8.8 over 53/tcp. Now we can set up a socket connection to 8.8.8.8:53/tcp in Python to check connection:
import socket

def internet(host="8.8.8.8", port=53, timeout=3):
    """
    Host: 8.8.8.8 (google-public-dns-a.google.com)
    OpenPort: 53/tcp
    Service: domain (DNS/TCP)
    """
    try:
        socket.setdefaulttimeout(timeout)
        socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((host, port))
        return True
    except socket.error as ex:
        print(ex)
        return False

internet()
Another approach could be to send a manually crafted DNS probe to one of these servers and wait for a response. But, I assume, it might prove slower in comparison due to packet drops, DNS resolution failure, etc. Please comment if you think otherwise.
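For completeness, a rough sketch of that alternative: hand-crafting a minimal DNS A-record query for google.com and sending it to 8.8.8.8 over UDP (the query ID and timeout are arbitrary choices):

import socket
import struct

def dns_probe(server="8.8.8.8", port=53, timeout=3):
    # DNS header: ID, flags (recursion desired), QDCOUNT=1, other counts 0
    header = struct.pack("!HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # Question section: "google.com", QTYPE=A (1), QCLASS=IN (1)
    question = b"\x06google\x03com\x00" + struct.pack("!HH", 1, 1)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(header + question, (server, port))
        sock.recvfrom(512)  # any reply at all means we reached the server
        return True
    except socket.timeout:
        return False
    finally:
        sock.close()

# print(dns_probe())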
UPDATE #4: This listing of public nameservers is a good reference for IPs to test against.
UPDATE #3: Tested again after the exception handling change:
defos.py
True
00:00:00:00.410
iamaziz.py
True
00:00:00:00.240
ivelin.py
True
00:00:00:00.109
jaredb.py
True
00:00:00:00.520
kevinc.py
True
00:00:00:00.317
unutbu.py
True
00:00:00:00.436
7h3rAm.py
True
00:00:00:00.030
UPDATE #2: I did quick tests to identify the fastest and most generic implementation of all valid answers to this question. Here's the summary:
$ ls *.py | sort -n | xargs -I % sh -c 'echo %; ./timeit.sh %; echo'
defos.py
True
00:00:00:00.487
iamaziz.py
True
00:00:00:00.335
ivelin.py
True
00:00:00:00.105
jaredb.py
True
00:00:00:00.533
kevinc.py
True
00:00:00:00.295
unutbu.py
True
00:00:00:00.546
7h3rAm.py
True
00:00:00:00.032
And once more:
$ ls *.py | sort -n | xargs -I % sh -c 'echo %; ./timeit.sh %; echo'
defos.py
True
00:00:00:00.450
iamaziz.py
True
00:00:00:00.358
ivelin.py
True
00:00:00:00.099
jaredb.py
True
00:00:00:00.585
kevinc.py
True
00:00:00:00.492
unutbu.py
True
00:00:00:00.485
7h3rAm.py
True
00:00:00:00.035
True in the above output signifies that all these implementations from respective authors correctly identify connectivity to the Internet. Time is shown with milliseconds resolution.
UPDATE #1: Thanks to @theamk's comment, timeout is now an argument and initialized to 3s by default.
It will be faster to just make a HEAD request so no HTML will be fetched.
try:
    import httplib  # python < 3.0
except:
    import http.client as httplib

def have_internet() -> bool:
    conn = httplib.HTTPSConnection("8.8.8.8", timeout=5)
    try:
        conn.request("HEAD", "/")
        return True
    except Exception:
        return False
    finally:
        conn.close()
As an alternative to unutbu's / Kevin C's answers, I use the requests package like this:
import requests

def connected_to_internet(url='http://www.google.com/', timeout=5):
    try:
        _ = requests.head(url, timeout=timeout)
        return True
    except requests.ConnectionError:
        print("No internet connection available.")
        return False
Bonus: this can be extended to the following function, which checks whether a particular website is up:
def web_site_online(url='http://www.google.com/', timeout=5):
    try:
        req = requests.head(url, timeout=timeout)
        # HTTP errors are not raised by default, this statement does that
        req.raise_for_status()
        return True
    except requests.HTTPError as e:
        print("Checking internet connection failed, status code {0}.".format(
            e.response.status_code))
    except requests.ConnectionError:
        print("No internet connection available.")
    return False
Just to update what unutbu said, for new code, in Python 3.2:
import urllib.request

def check_connectivity(reference):
    try:
        urllib.request.urlopen(reference, timeout=1)
        return True
    except urllib.request.URLError:
        return False
And, just to note, the input here (reference) is the URL that you want to check. I suggest choosing something that connects fast where you live -- e.g. I live in South Korea, so I would probably set reference to http://www.naver.com.
You can just try to download data, and if the connection fails you will know that something with the connection isn't fine.
Basically, you can't check whether the computer is connected to the internet. There can be many reasons for failure, like wrong DNS configuration, firewalls, or NAT. So even if you make some tests, you can't guarantee that you will have a connection to your API until you try.
import urllib

def connected(host='http://google.com'):
    try:
        urllib.urlopen(host)
        return True
    except:
        return False

# test
print('connected' if connected() else 'no internet!')
For Python 3, use urllib.request.urlopen(host).
Try the operation you were attempting to do anyway. If it fails, Python will raise an exception to let you know.
Trying some trivial operation first to detect a connection introduces a race condition. What if the internet connection is valid when you test but goes down before you need to do the actual work?
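A sketch of that "just try it" approach, wrapping the actual API call rather than a separate connectivity probe (the URL is a placeholder for whatever API you actually use):

import urllib.request
import urllib.error

def fetch_api_data(url="https://api.example.com/data", timeout=5):
    # Attempt the real request and handle failure at the point of use.
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.read()
    except OSError as err:
        # URLError is a subclass of OSError, so this also covers DNS
        # failures, refused connections, and socket timeouts.
        print("Request failed:", err)
        return None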
Here's my version
import requests

try:
    if requests.get('https://google.com').ok:
        print("You're Online")
except:
    print("You're Offline")
This might not work if the localhost has been changed from 127.0.0.1
Try:
import socket

ipaddress = socket.gethostbyname(socket.gethostname())
if ipaddress == "127.0.0.1":
    print("You are not connected to the internet!")
else:
    print("You are connected to the internet with the IP address of " + ipaddress)
Unless edited, your computer's IP will be 127.0.0.1 when it is not connected to the internet.
This code basically gets the IP address and then checks whether it is the localhost IP address.
Hope that helps.
A modern portable solution with requests:
import requests

def internet():
    """Detect an internet connection."""
    connection = None
    try:
        r = requests.get("https://google.com")
        r.raise_for_status()
        print("Internet connection detected.")
        connection = True
    except:
        print("Internet connection not detected.")
        connection = False
    finally:
        return connection
Or, a version that raises an exception:
import requests
from requests.exceptions import ConnectionError

def internet():
    """Detect an internet connection."""
    try:
        r = requests.get("https://google.com")
        r.raise_for_status()
        print("Internet connection detected.")
    except ConnectionError as e:
        print("Internet connection not detected.")
        raise e
The best way to do this is to check against the IP address that Python returns when it can't find the website. In this case, this is my code:
import socket

print("website connection checker")
while True:
    website = input("please input website: ")
    print("")
    print(socket.gethostbyname(website))
    if socket.gethostbyname(website) == "92.242.140.2":
        print("Website could be experiencing an issue/Doesn't exist")
    else:
        print("Website is operational!")
    print("")
My favorite one, for when running scripts on a cluster (or not):
import subprocess

def online(timeout):
    try:
        return subprocess.run(
            ['wget', '-q', '--spider', 'google.com'],
            timeout=timeout
        ).returncode == 0
    except subprocess.TimeoutExpired:
        return False
This runs wget quietly, not downloading anything, just checking that the given remote file exists on the web.
Taking unutbu's answer as a starting point, and having been burned in the past by a "static" IP address changing, I've made a simple class that checks once using a DNS lookup (i.e., using the URL "https://www.google.com"), and then stores the IP address of the responding server for use on subsequent checks. That way, the IP address is always up to date (assuming the class is re-initialized at least once every few years or so). I also give credit to gawry for this answer, which showed me how to get the server's IP address (after any redirection, etc.). Please disregard the apparent hackiness of this solution, I'm going for a minimal working example here. :)
Here is what I have:
import socket
try:
    from urllib2 import urlopen, URLError
    from urlparse import urlparse
except ImportError:  # Python 3
    from urllib.parse import urlparse
    from urllib.request import urlopen, URLError

class InternetChecker(object):
    conn_url = 'https://www.google.com/'

    def __init__(self):
        pass

    def test_internet(self):
        try:
            data = urlopen(self.conn_url, timeout=5)
        except URLError:
            return False

        try:
            host = data.fp._sock.fp._sock.getpeername()
        except AttributeError:  # Python 3
            host = data.fp.raw._sock.getpeername()

        # Ensure conn_url is an IPv4 address otherwise future queries will fail
        self.conn_url = 'http://' + (host[0] if len(host) == 2 else
                                     socket.gethostbyname(urlparse(data.geturl()).hostname))
        return True

# Usage example
checker = InternetChecker()
checker.test_internet()
Taking Six's answer, I think we could simplify somewhat, which matters because newcomers get lost in highly technical answers.
Here is what I will finally use to wait for my (slow 3G) connection to be established once a day for my PV monitoring.
It works under Python 3 on Raspbian (Python 3.4.2):
from urllib.request import urlopen
from time import sleep

urltotest = 'http://www.lsdx.eu'  # my own web page
nboftrials = 0
answer = 'NO'
while answer == 'NO' and nboftrials < 10:
    try:
        urlopen(urltotest)
        answer = 'YES'
    except:
        answer = 'NO'
        nboftrials += 1
        sleep(30)
Maximum running time: 5 minutes. If that is reached, I will try again in an hour's time, but that's another bit of script!
Taking Ivelin's answer, I add an extra check, because my router delivers its own IP address 192.168.0.1 and returns a HEAD response when it has no internet connection while querying www.google.com.
import socket
try:
    import httplib  # Python 2
except ImportError:
    import http.client as httplib  # Python 3

def haveInternet():
    try:
        # first check if we get the correct IP-Address or just the router's IP-Address
        info = socket.getaddrinfo("www.google.com", None)[0]
        ipAddr = info[4][0]
        if ipAddr == "192.168.0.1":
            return False
    except:
        return False

    conn = httplib.HTTPConnection("www.google.com", timeout=5)
    try:
        conn.request("HEAD", "/")
        conn.close()
        return True
    except:
        conn.close()
        return False
This works for me in Python3.6
import urllib
from urllib.request import urlopen

def is_internet():
    """
    Query internet using python
    :return:
    """
    try:
        urlopen('https://www.google.com', timeout=1)
        return True
    except urllib.error.URLError as Error:
        print(Error)
        return False

if is_internet():
    print("Internet is active")
else:
    print("Internet disconnected")
I added a few things to Joel's code:
import socket, time

mem1 = 0
while True:
    try:
        host = socket.gethostbyname("www.google.com")  # Change to personal choice of site
        s = socket.create_connection((host, 80), 2)
        s.close()
        mem2 = 1
        if mem2 == mem1:
            pass  # Add commands to be executed on every check
        else:
            mem1 = mem2
            print("Internet is working")  # Will be executed on state change
    except Exception as e:
        mem2 = 0
        if mem2 == mem1:
            pass
        else:
            mem1 = mem2
            print("Internet is down")
    time.sleep(10)  # timeInterval for checking
For my projects I use a script modified to ping the Google public DNS server 8.8.8.8, with a timeout of 1 second and only core Python libraries, no external dependencies:
import struct
import socket
import select

def send_one_ping(to='8.8.8.8'):
    # Raw ICMP sockets typically require root privileges
    ping_socket = socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.getprotobyname('icmp'))
    checksum = 49410
    header = struct.pack('!BBHHH', 8, 0, checksum, 0x123, 1)
    data = b'BCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwx'
    packet = header + data
    ping_socket.sendto(packet, (to, 1))
    inputready, _, _ = select.select([ping_socket], [], [], 1.0)
    if inputready == []:
        raise Exception('No internet')  # or return False
    _, address = ping_socket.recvfrom(2048)
    print(address)  # or return True

send_one_ping()
The select timeout value is 1 second here, but it can be any floating point value of your choice if you want to fail more quickly than the 1 second in this example.
Make sure your pip is up to date by running
pip install --upgrade pip
Install the requests package using
pip install requests
import requests
import webbrowser

url = "http://www.youtube.com"
timeout = 6
try:
    request = requests.get(url, timeout=timeout)
    print("Connected to the Internet")
    print("browser is loading url")
    webbrowser.open(url)
except (requests.ConnectionError, requests.Timeout) as exception:
    print("poor or no internet connection.")
Import requests and try this simple Python code:
def check_internet():
    url = 'http://www.google.com/'
    timeout = 5
    try:
        _ = requests.get(url, timeout=timeout)
        return True
    except requests.ConnectionError:
        return False
I just want to add to Ivelin's solution, because I can't comment there.
In Python 2.7 with an old SSL certificate (in my case, not possible to update, which is another story), there is a possibility of a certificate error. In that case, replacing '8.8.8.8' with 'dns.google' or '8888.google' can help.
Hope this helps someone too.
try:
    import httplib  # python < 3.0
except:
    import http.client as httplib

def have_internet():
    conn = httplib.HTTPSConnection("8888.google", timeout=5)
    try:
        conn.request("HEAD", "/")
        return True
    except Exception:
        return False
    finally:
        conn.close()
I'm writing a Python program that uses Telnet to send the same few commands once every second, then reads the output, organizes it into a dictionary, and prints it to a JSON file (where it is later read by a front-end web GUI). The purpose of this is to provide live updates of crucial telnet command outputs.
The problem I am having is that if the connection is lost halfway through the program, it causes the program to crash. I have tried a number of ways to deal with this, such as using a Boolean that is set to True once the connection is made and False if there is a timeout error, but this has some limitations. If the connection is successfully made but later gets disconnected, the Boolean will still read True in spite of the connection being lost. I have found some ways to deal with this too (e.g., if a Telnet command returns no output within 5 seconds, the connection was lost, and the Boolean is updated to False).
However, it is a complex program, and it seems there are too many possible ways a disconnect can slip by the checks I have written and still cause the program to crash.
I am hoping to find a very simple way of checking that the Telnet connection is alive, better yet if it is a single line of code. The only way I currently know of to check whether it is connected is to try to connect again, which will fail if the network connection is lost. However, I do not want to have to open a new telnet connection every time I check. If it is already connected, that is a waste of crucial time, and there is no way to know it is not connected until after you try to connect.
I'm looking for something like:
tnStatus = [function or line of code that checks if Telnet is connected (w/o trying to open a connection), and returns boolean]
if (tnStatus == True):
    sendCommand('bla')
Any suggestions?
I'm running Python 2.6 (cannot update for backwards compatibility reasons)
EDIT:
This is the (abridged) code of how I am presently connecting to telnet and sending/reading commands.
import telnetlib

class cliManager():
    '''
    Class to manage a Command Line Interface connection via Telnet
    '''

    def __init__(self, host, port, timeout):
        self.host = host
        self.port = port
        self.timeout = timeout  # Timeout for connecting to telnet
        self.isConnected = False
        self.tn = None  # no connection yet

    # CONNECT to device via TELNET, catch connection errors.
    def connect(self):
        try:
            if self.tn:
                self.tn.close()
            print("Connecting...")
            self.tn = telnetlib.Telnet(self.host, self.port, self.timeout)
            print("Connection Established")
            self.isConnected = True
        except Exception:
            print("Connection Failed")
            self.isConnected = False

    .
    .
    .

    def sendCmd(self, cmd):
        # CHECK if connected, if not then reconnect
        output = {}
        if not self.reconnect():
            return output
        # Ensure cmd is valid, strip out \r\t\n, etc
        cmd = self.validateCmd(cmd)
        # Send command and newline
        self.tn.write(cmd + "\n")
        response = ''
        try:
            response = self.tn.read_until('\n*', 5)
            if len(response) == 0:
                print "No data returned!"
                self.isConnected = False
        except EOFError:
            print "Telnet Not Connected!"
            self.isConnected = False
        output = self.parseCmdStatus(response)
        return output
Elsewhere...
cli = cliManager("136.185.10.44", 6000, 2)
cli.connect()
giDict = cli.sendCmd('getInfo')
[then giDict and other command results go to other methods where they are formatted and interpreted for the front end user]
You can try the following code to check whether the telnet connection is still usable or not:
def is_connected(self):
    try:
        self.tn.read_very_eager()
        return True
    except EOFError:
        print("EOFerror: telnet connection is closed")
        return False
You can also refer to https://docs.python.org/3/library/telnetlib.html for Telnet.read_very_eager() usage, and:
https://lgfang.github.io/computer/2007/07/06/py-telnetlib#:~:text=The%20difference%20is%20that%20read_eager,read%20as%20much%20as%20possible.&text=The%20remaining%20read%20functions%20basically%20block%20until%20they%20received%20designated%20data.
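As an illustration only, here is a sketch of how such a check might be wired into the cliManager class from the question (method and attribute names follow that class; this is not a drop-in fix):

# Inside cliManager (Python 2.6 style, matching the question's code):
def sendCmd(self, cmd):
    output = {}
    # Cheap liveness check before doing any work on the connection
    if not self.is_connected():
        self.connect()            # try to re-establish once
        if not self.isConnected:
            return output         # still down, give up for this cycle
    cmd = self.validateCmd(cmd)
    self.tn.write(cmd + "\n")
    try:
        response = self.tn.read_until('\n*', 5)
    except EOFError:
        self.isConnected = False
        return output
    return self.parseCmdStatus(response)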
I am a newbie to Python and stuck at a point. I want to create a port scanner using only Python 3 built-in libraries (which means avoiding scapy etc.). I have the following code:
import socket

for i in range(1, 26):
    s = socket.socket()
    s.settimeout(0.5)
    ip = "74.207.244.221"  # scanme.nmap.org
    response = s.connect_ex((ip, i))
    if response:
        print("%d\tclose" % i)
    else:
        print("%d\topen" % i)
    s.close()
Now I want to add 2 functionalities to this, that is:
Distinguish between closed and filtered ports. In both cases I am receiving the same errno in return, so how can I check whether I have received back a RST packet or nothing at all? As far as I have tried, s.recv() isn't working for this.
I want to control the number of tries (attempts), i.e. I want to send only one or two SYN packets. I don't want this program to send more than 2 SYN packets for probes. How can this be achieved?
Distinguish between closed and filtered ports. In both cases I am
receiving the same errno in return, so how can I check if I have received
back a RST packet or nothing
You've probably only checked with servers that send back a RST. Here's what I tried:
First case, normal config:
>>> os.strerror(s.connect_ex((ip, 81)))
'Connection refused'
Second, with manual iptables:
iptables -A OUTPUT -p tcp --dport 81 -j DROP
>>> os.strerror(s.connect_ex((ip, 81)))
'Resource temporarily unavailable'
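Building on that observation, here is a sketch of how the scanner could classify ports by the errno that connect_ex returns. The exact "filtered" errno can vary by platform (with a socket timeout set, a dropped probe typically shows up as EAGAIN, i.e. "Resource temporarily unavailable"), so treat the mapping as an approximation:

import errno
import socket

def classify_port(ip, port, timeout=0.5):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    try:
        result = s.connect_ex((ip, port))
    finally:
        s.close()
    if result == 0:
        return "open"
    if result == errno.ECONNREFUSED:
        return "closed"      # the peer answered with a RST
    if result in (errno.EAGAIN, errno.ETIMEDOUT):
        return "filtered"    # no answer at all before the timeout
    return "unknown (errno %d)" % result

# for p in range(20, 26):
#     print(p, classify_port("74.207.244.221", p))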
I want to control the number of tries (attempts), i.e. I want to send
only one or two SYN packets.
I don't think there's a setsockopt TCP option exposed, but on Linux there's:
net.ipv4.tcp_syn_retries
However, since you limited the timeout for the socket, all operations that don't finish within 0.5 seconds will time out. So it's likely only 1 or 2 SYNs will leave the station.
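For reference, a sketch showing both knobs: rely on the short socket timeout, probe for a per-socket TCP_SYNCNT constant (present on some Linux builds of Python, hence the getattr guard rather than assuming it exists), and fall back to the system-wide sysctl:

import socket

def connect_with_limited_syns(ip, port, syn_count=2, timeout=0.5):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # TCP_SYNCNT caps SYN retransmissions per socket where available.
    syncnt = getattr(socket, "TCP_SYNCNT", None)
    if syncnt is not None:
        s.setsockopt(socket.IPPROTO_TCP, syncnt, syn_count)
    s.settimeout(timeout)
    try:
        return s.connect_ex((ip, port))
    finally:
        s.close()

# System-wide alternative (Linux, as root):
#   sysctl -w net.ipv4.tcp_syn_retries=1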
#!/usr/bin/python
import socket

host = "74.207.244.221"

def portscan(port):
    # A new socket per attempt; a short timeout keeps filtered ports from hanging the scan
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(0.5)
    try:
        s.connect((host, port))
        return True
    except socket.error:
        return False
    finally:
        s.close()

for x in range(1, 255):
    if portscan(x):
        print('Port', x, 'Is Open')