Not able to connect to a simple International Space Station API using Python - python

I executed this code to connect to a very common API:
import requests
response = requests.get("http://api.open-notify.org/iss-now.json")
print(response.status_code)
But it is showing this error:
runfile('C:/Users/sanchit.joshi/use case of unassigned tickets/Api try out.py', wdir='C:/Users/sanchit.joshi/use case of unassigned tickets')
Traceback (most recent call last):
  File "<ipython-input-17-39bcdc5917ae>", line 1, in <module>
    runfile('C:/Users/sanchit.joshi/use case of unassigned tickets/Api try out.py', wdir='C:/Users/sanchit.joshi/use case of unassigned tickets')
  File "C:\ProgramData\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 668, in runfile
    execfile(filename, namespace)
  File "C:\ProgramData\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 108, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)
  File "C:/Users/sanchit.joshi/use case of unassigned tickets/Api try out.py", line 8, in <module>
    response = requests.get("http://api.open-notify.org/iss-now.json")
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\api.py", line 72, in get
    return request('get', url, params=params, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\api.py", line 58, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\sessions.py", line 512, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\sessions.py", line 622, in send
    r = adapter.send(request, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\adapters.py", line 513, in send
    raise ConnectionError(e, request=request)
ConnectionError: HTTPConnectionPool(host='api.open-notify.org', port=80): Max retries exceeded with url: /iss-now.json (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000001E8E5BCBE80>: Failed to establish a new connection: [Errno 11002] getaddrinfo failed'))
I tried changing the max retry value, but it's not working. It is all the more frustrating because this is about the simplest possible code for connecting to an API. Any help is appreciated.

There is nothing wrong with your code; it should work as written.
You are having a proxy issue. If you are on Windows, you can add the URL to your proxy exceptions: open the settings menu in Internet Explorer, then Internet Options, then Connections, then LAN settings, then Advanced, and add the URL to your exceptions. It is typical in a corporate environment or at a school for the admins to put you behind a proxy.
Alternatively, you can use this Q/A to set the proxy in your request; a minimal sketch is below.
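Something like this, for example (the proxy address is a hypothetical placeholder; use whatever your network admins give you):
import requests

# Hypothetical proxy address -- replace with your actual corporate proxy.
proxies = {
    'http': 'http://proxy.example.com:8080',
    'https': 'http://proxy.example.com:8080',
}

response = requests.get("http://api.open-notify.org/iss-now.json", proxies=proxies)
print(response.status_code)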

Using the json and urllib.request libraries
import json
import urllib.request
file = urllib.request.urlopen("http://api.open-notify.org/iss-now.json")
data = json.loads(file.read())
print(data)
Resulting in
{'message': 'success', 'timestamp': 1541059187, 'iss_position': {'longitude': '13.6813', 'latitude': '47.8641'}}
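For example, the ISS coordinates can then be read out of the parsed dict (the keys are the ones visible in the response above):
# 'iss_position' holds the latitude/longitude strings returned by the API.
position = data['iss_position']
print(position['latitude'], position['longitude'])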

Related

Python 3 Requests errors

I'm working through Python Crash Course, 2nd Ed., and in the text there is some code for accessing APIs. My code is copied from the text and is as follows:
import requests
import json
from operator import itemgetter

# Fetch top stories and store in variable r
url = 'https://hacker-news.firebaseio.com/v0/topstories.json'
r = requests.get(url)
print(f"Status code: {r.status_code}")

# # Explore data structure
# response_dict = r.json()
# readable_file = 'hn_readable.json'
# with open(readable_file, 'w') as f:
#     json.dump(response_dict, f, indent=4)

submission_ids = r.json()
submission_dicts = []
for submission_id in submission_ids[:30]:
    # Make API call for each article
    url = f"https://hacker-news.firebasio.com/v0/item/{submission_id}.json"
    r = requests.get(url)
    print(f"id: {submission_id}\tstatus code: {r.status_code}")
    response_dict = r.json()

    # Store dictionary of each article
    submission_dict = {
        'title': response_dict['title'],
        'score': response_dict['score'],
        'comments': response_dict['descendants'],
        'link': response_dict['url'],
    }
    submission_dicts.append(submission_dict)

# Sort articles by score
submission_dicts = sorted(submission_dicts, key=itemgetter('score'), reverse=True)

# Display information about each article, ranked by score
for submission_dict in submission_dicts:
    print(f"Article title: {submission_dict['title']}")
    print(f"Article link: {submission_dict['link']}")
    print(f"Score: {submission_dict['score']}")
However, this is now returning the following error messages:
Status code: 200
Traceback (most recent call last):
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\urllib3\connectionpool.py", line 677, in urlopen
chunked=chunked,
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\urllib3\connectionpool.py", line 381, in _make_request
self._validate_conn(conn)
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\urllib3\connectionpool.py", line 976, in _validate_conn
conn.connect()
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\urllib3\connection.py", line 370, in connect
ssl_context=context,
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\urllib3\util\ssl_.py", line 377, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "C:\Users\snack\Python\lib\ssl.py", line 423, in wrap_socket
session=session
File "C:\Users\snack\Python\lib\ssl.py", line 870, in _create
self.do_handshake()
File "C:\Users\snack\Python\lib\ssl.py", line 1139, in do_handshake
self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1076)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\requests\adapters.py", line 449, in send
timeout=timeout
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\urllib3\connectionpool.py", line 725, in urlopen
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\urllib3\util\retry.py", line 439, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='hacker-news.firebasio.com', port=443): Max retries exceeded with url: /v0/item/23273247.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1076)')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\snack\Python\proj_2\hn_submissions.py", line 24, in <module>
r = requests.get(url)
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\requests\api.py", line 76, in get
return request('get', url, params=params, **kwargs)
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\requests\sessions.py", line 530, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\requests\sessions.py", line 643, in send
r = adapter.send(request, **kwargs)
File "C:\Users\snack\AppData\Roaming\Python\Python37\site-packages\requests\adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='hacker-news.firebasio.com', port=443): Max retries exceeded with url: /v0/item/23273247.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1076)')))
[Finished in 3.6s]
I have almost no experience with this, but from what I can tell some authentication is failing and not letting my program access the API, though I have no idea why. I've tried limiting the number of API calls by removing the loop, but it doesn't seem to help. I also tried adding the verify=False parameter to the requests.get lines, but that just kicked up different errors.
There is nothing wrong with the API call itself.
If you visit the site https://hacker-news.firebaseio.com/v0/topstories.json you can see the expected list in the browser (your first, working API call).
Since the first number in this list is 23277594, the script starts with the request https://hacker-news.firebasio.com/v0/item/23277594.json, but visiting this URL in the browser also results in warnings (your second, failing API call). Look closely at the hostname in that second URL.
Alright, it was typos (of course). The URL in my code was https...firebasio....json instead of https...firebaseio....json. One of the results is still not working, but I'm assuming that's due to the article not having comments (i.e. descendants), so some try/except should fix that.
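A minimal sketch of that guard, meant to replace the dictionary-building block inside the existing for loop (the field names are the ones already used above; nothing else changes):
    try:
        submission_dict = {
            'title': response_dict['title'],
            'score': response_dict['score'],
            'comments': response_dict['descendants'],
            'link': response_dict['url'],
        }
    except KeyError:
        # Skip items that lack one of the expected fields
        # (e.g. a story with no comments has no 'descendants' key).
        continue
    submission_dicts.append(submission_dict)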

Python WebSocket Channel - Connection reset by peer

I want to build my own library for server -> browser communication via WebSockets. I have tried to build something; it's here: https://github.com/duverse/wsc. But sometimes, when I try to make an API call to my own server, I get something like this:
Python requests
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/project/.venv/local/lib/python2.7/site-packages/wsc/client/__init__.py", line 109, in stat
'Access-Key': self._access_key
File "/project/.venv/local/lib/python2.7/site-packages/requests/api.py", line 72, in get
return request('get', url, params=params, **kwargs)
File "/project/.venv/local/lib/python2.7/site-packages/requests/api.py", line 58, in request
return session.request(method=method, url=url, **kwargs)
File "/project/.venv/local/lib/python2.7/site-packages/requests/sessions.py", line 502, in request
resp = self.send(prep, **send_kwargs)
File "/project/.venv/local/lib/python2.7/site-packages/requests/sessions.py", line 612, in send
r = adapter.send(request, **kwargs)
File "/project/.venv/local/lib/python2.7/site-packages/requests/adapters.py", line 490, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', error(104, 'Connection reset by peer'))
I think the problem is in this method: https://github.com/duverse/wsc/blob/master/wsc/server/request.py#L113, because all WebSocket connections work fine. It probably tries to receive a packet that turns out to be empty, but I am not sure.
P.S. I have used SSLSocket, and the packets from the WebSocket request are always split: the first read always gives me only one byte, and then all the rest.
P.P.S. The WSC server log shows just this:
WSC.log
----------------------------------------
Exception happened during processing of request from ('123.123.123.123', 45757)
----------------------------------------
There is no traceback. (The IP was replaced.)
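Not the library's actual code, just a general sketch of the usual fix for split reads: loop on recv() until the expected number of bytes has arrived, since an SSL socket can legitimately hand back one byte first and the rest afterwards:
import socket

def recv_exactly(sock, n):
    """Read exactly n bytes from a socket (plain or SSL).

    recv() may return fewer bytes than requested -- TLS sockets in
    particular often deliver one byte first and the rest later --
    so keep reading until everything has arrived or the peer closes.
    """
    chunks = []
    remaining = n
    while remaining > 0:
        chunk = sock.recv(remaining)
        if not chunk:
            raise socket.error("peer closed before %d bytes were read" % n)
        chunks.append(chunk)
        remaining -= len(chunk)
    return b"".join(chunks)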

Can't load website using requests in Python [duplicate]

This question already has an answer here:
Requests failing to connect to a TLS server
(1 answer)
Closed 5 years ago.
I'm using Python and I'm trying to scrape this website:
https://online.ratb.ro/info/browsers.aspx
But I'm getting this error:
Traceback (most recent call last):
  File "C:\Users\pinguluk\Desktop\Proiecte GIT\RATB Scraper\test2.py", line 3, in <module>
    test = requests.get('https://online.ratb.ro/info/browsers.aspx')
  File "C:\Python27\lib\site-packages\requests\api.py", line 72, in get
    return request('get', url, params=params, **kwargs)
  File "C:\Python27\lib\site-packages\requests\api.py", line 58, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 518, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 639, in send
    r = adapter.send(request, **kwargs)
  File "C:\Python27\lib\site-packages\requests\adapters.py", line 512, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: ("bad handshake: SysCallError(-1, 'Unexpected EOF')",)
Installed modules:
['appdirs==1.4.3', 'asn1crypto==0.22.0', 'attrs==16.3.0',
'automat==0.5.0', 'beautifulsoup4==4.5.3', 'cairocffi==0.8.0',
'certifi==2017.4.17', 'cffi==1.10.0', 'colorama==0.3.9',
'constantly==15.1.0', 'cryptography==1.8.1', 'cssselect==1.0.1',
'cycler==0.10.0', 'distributedlock==1.2', 'django-annoying==0.10.3',
'django-oauth-tokens==0.6.3', 'django-taggit==0.22.1',
'django==1.11.1', 'enum34==1.1.6', 'facepy==1.0.8',
'functools32==3.2.3.post2', 'futures==3.1.1', 'gevent==1.2.1',
'greenlet==0.4.12', 'grequests==0.3.0', 'html5lib==0.999999999',
'htmlparser==0.0.2', 'httplib2==0.10.3', 'idna==2.5',
'incremental==16.10.1', 'ipaddress==1.0.18', 'lazyme==0.0.10',
'lxml==3.7.3', 'matplotlib==2.0.2', 'mechanize==0.3.3',
'ndg-httpsclient==0.4.2', 'numpy==1.12.1', 'oauthlib==2.0.2',
'olefile==0.44', 'opencv-python==3.2.0.7', 'packaging==16.8',
'parsel==1.1.0', 'pillow==4.0.0', 'pip==9.0.1', 'py2exe==0.6.9',
'pyandoc==0.0.1', 'pyasn1-modules==0.0.8', 'pyasn1==0.2.3',
'pycairo-gtk==1.10.0', 'pycparser==2.17', 'pygtk==2.22.0',
'pyhook==1.5.1', 'pynput==1.3.2', 'pyopenssl==17.0.0',
'pyparsing==2.2.0', 'pypiwin32==219', 'pyquery==1.2.17',
'python-dateutil==2.6.0', 'python-memcached==1.58', 'pytz==2017.2',
'pywin32==221', 'queuelib==1.4.2', 'requests-futures==0.9.7',
'requests-oauthlib==0.8.0', 'requests-toolbelt==0.8.0',
'requests==2.14.2', 'restclient==0.11.0', 'robobrowser==0.5.3',
'selenium==3.4.1', 'service-identity==16.0.0', 'setuptools==35.0.2',
'simplejson==3.10.0', 'six==1.10.0', 'twitter==1.17.0',
'twitterfollowbot==2.0.2', 'urllib3==1.21.1', 'w3lib==1.17.0',
'webencodings==0.5.1', 'werkzeug==0.12.1', 'wheel==0.29.0',
'zope.interface==4.3.3']
Thanks.
I think you will have a hard time solving this problem, since the server you are trying to "scrape" is awfully configured (ssllabs.com gave it a grade of F) and it may be that Requests doesn't support any of its cipher suites because they are all insecure. One option might be to create a custom HTTPAdapter, so you could try that out; a sketch follows.
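A rough sketch of that custom-adapter idea is below. It assumes a reasonably modern Requests/urllib3/OpenSSL stack (the @SECLEVEL trick needs OpenSSL 1.1+); with the older pyOpenSSL-based setup in the question the details will differ, and whether the handshake succeeds still depends on what the server actually offers:
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.ssl_ import create_urllib3_context

class LegacyCipherAdapter(HTTPAdapter):
    """Adapter that lowers OpenSSL's security level so weak, legacy
    cipher suites are offered to badly configured servers."""
    def init_poolmanager(self, *args, **kwargs):
        # SECLEVEL=1 re-enables ciphers that modern OpenSSL rejects by
        # default; only do this for a host you have no choice about.
        ctx = create_urllib3_context(ciphers="DEFAULT:@SECLEVEL=1")
        kwargs["ssl_context"] = ctx
        return super(LegacyCipherAdapter, self).init_poolmanager(*args, **kwargs)

session = requests.Session()
session.mount("https://online.ratb.ro", LegacyCipherAdapter())
test = session.get("https://online.ratb.ro/info/browsers.aspx")
print(test.status_code)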
You can try using:
requests.get(url, verify=False)
if you don't need to check the authenticity of the SSL certificate.

Write to HDFS using Python

I am trying to write to HDFS from Python.
Right now, I am using https://hdfscli.readthedocs.io/en/latest/quickstart.html
but for large files I get back:
File "/home/edge7/venv-dev/local/lib/python2.7/site-packages/hdfs/client.py", line 400, in write
consumer(data)
File "/home/edge7/venv-dev/local/lib/python2.7/site-packages/hdfs/client.py", line 394, in consumer
auth=False,
File "/home/edge7/venv-dev/local/lib/python2.7/site-packages/hdfs/client.py", line 179, in _request
**kwargs
File "/home/edge7/venv-dev/local/lib/python2.7/site-packages/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/home/edge7/venv-dev/local/lib/python2.7/site-packages/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/home/edge7/venv-dev/local/lib/python2.7/site-packages/requests/adapters.py", line 415, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', gaierror(-2, 'Name or service not known'))
My code for writing is pretty simple:
client = InsecureClient('http://xxxxxxx.co:50070', user='hdfs')
client.write("/tmp/a",stringToWrite)
Can anyone suggest a decent package for writing to HDFS?
Cheers
From the stacktrace, it seems to be security related. Are you sure you should be using the InsecureClient and not the Kerberos one? Also, remember that this library is just a binding for HttpFS, so doing a manual test with Postman or curl would let you debug any issue cluster-side.
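If you prefer to run that manual test from Python rather than curl or Postman, a rough sketch against the WebHDFS/HttpFS REST API (which hdfscli wraps) could look like this; the host below is the placeholder from the question, and the two-step CREATE (redirect to a DataNode, then upload) is usually where name-resolution errors like the gaierror above become visible:
import requests

# Placeholder NameNode address from the question -- replace with your own.
namenode = "http://xxxxxxx.co:50070"

# Step 1: ask the NameNode where to write. WebHDFS answers with a
# 307 redirect whose Location header points at a DataNode.
resp = requests.put(
    namenode + "/webhdfs/v1/tmp/a",
    params={"op": "CREATE", "user.name": "hdfs", "overwrite": "true"},
    allow_redirects=False,
)
print(resp.status_code, resp.headers.get("Location"))

# Step 2: upload the data to that Location. If the DataNode hostname in
# the redirect cannot be resolved from this machine, this is the point
# where a "Name or service not known" error shows up.
resp = requests.put(resp.headers["Location"], data=b"some data to write")
print(resp.status_code)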

HTTPS proxies with Requests: [Errno 8] _ssl.c:504: EOF occurred in violation of protocol

I am using Requests 1.2.3 on Windows 7 x64 and am trying to connect to (any) site via HTTPS using an HTTPS proxy by passing the proxies argument to the request.
I don't experience this error when using urllib2's ProxyHandler, so I don't think it's on my proxy's side.
>>> opener = urllib2.build_opener(urllib2.ProxyHandler({'https': 'IP:PORT'}))
>>> resp = opener.open('https://www.google.com')
>>> resp.url
'https://www.google.co.uk/'
>>> resp = requests.get('https://www.google.com', proxies={'https': 'IP:PORT'})
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Python27\lib\site-packages\requests\api.py", line 55, in get
return request('get', url, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 335, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 438, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests\adapters.py", line 331, in send
raise SSLError(e)
requests.exceptions.SSLError: [Errno 8] _ssl.c:504: EOF occurred in violation of protocol
I should probably note that the same error still happens if I pass verify=False to the request.
Any suggestions? I've looked at related questions but there was nothing that worked for me.
I suspect your proxy is an HTTP proxy over which you can tunnel HTTPS (the common case).
The problem is that Requests uses HTTPS to talk to the proxy if the request itself is HTTPS.
Using an explicit scheme (http) for your proxy should fix things: proxies={'https': 'http://IP:PORT'}
Also have a look at https://github.com/kennethreitz/requests/issues/1182
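For completeness, a minimal sketch of the fixed call (IP:PORT is the placeholder from the question; substitute your real proxy address):
import requests

# Explicit http:// scheme: the proxy itself speaks plain HTTP,
# even though the target URL is HTTPS.
proxies = {'https': 'http://IP:PORT'}

resp = requests.get('https://www.google.com', proxies=proxies)
print(resp.url)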
