I am helping my client solve a problem: he tries to call my API but is unsuccessful.
Situation:
My API is accessible via Postman and the browser (Swagger); it works.
On the same computer, via a Python script, I cannot access the API.
error:
Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000001310A6F5C0>:
Failed to establish a new connection: [Errno 11001] getaddrinfo failed')
If I try a very simple script:
import requests
r = requests.get('http://google.com')
I get the same error.
I assume there is some problem with a firewall/proxy, but why does Postman work?
In which direction can I point my client to solve the problem?
I also tried ping, and it doesn't work either:
$ ping www.google.com
Ping request could not find host www.google.com. Please check the name and try again.
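Since ping fails too, a quick way to confirm that the failure is DNS resolution at the OS level (and not something specific to requests) is to call the resolver directly from the standard library — a minimal sketch:

```python
import socket

def can_resolve(hostname):
    """Return True if the OS resolver can turn hostname into an address."""
    try:
        socket.getaddrinfo(hostname, 80)
        return True
    except socket.gaierror:
        # Errno 11001 on Windows means getaddrinfo failed, i.e. DNS resolution
        return False

# On the affected machine this would print False, matching the ping failure
print(can_resolve("google.com"))
# localhost usually resolves even when external DNS is broken
print(can_resolve("localhost"))
```

If `google.com` fails here while `localhost` succeeds, the problem sits below Python: DNS configuration, VPN, or security software. Postman can still work in that situation because it may route traffic through a system or built-in proxy that does its own name resolution.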
I am calling a REST API on Informatica from Postman and Python (requests library) and find the behavior quite odd.
When I am on VPN, I can only make a successful call from Postman; if I switch the VPN off, both the Python and Postman calls work perfectly.
The Python script was generated automatically by Postman.
Error:
ConnectionError: HTTPSConnectionPool(host='use4-mdm.dm-us.informaticacloud.com', port=443): Max retries exceeded with url: /rdm-service/external/v2/export (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x000002393AB297C0>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))
Any ideas what the reason might be?
UPD
To make my question clearer:
This is a corporate VPN on a work laptop
My system does not have *_PROXY variables
There is no default proxy in the requests library:
import requests
session = requests.Session()
session.proxies
>>> {}
The http.client library gives the same result.
The Postman settings are in the screenshot below.
You may want to have a look at the "proxies" parameter of Python requests :)
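For example, pointing requests at the same proxy that Postman uses would explain the difference in behavior on VPN. A minimal sketch — the proxy address below is a placeholder; take the real one from Postman's proxy settings or your system proxy configuration:

```python
import requests

# Placeholder address: substitute your actual corporate proxy host and port
proxies = {
    "http": "http://proxy.example.com:8080",
    "https": "http://proxy.example.com:8080",
}

session = requests.Session()
session.proxies.update(proxies)

# The call would then be routed through the proxy, e.g.:
# response = session.post("https://use4-mdm.dm-us.informaticacloud.com/rdm-service/external/v2/export", ...)
```

Postman has its own proxy handling (system proxy detection is on by default), so it can reach the host over VPN while a plain requests call, which found no *_PROXY variables, connects directly and is refused.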
I was trying to make a "POST" request but am getting this WinError 10013 socket error in my Django API. I can make a "GET" request successfully, so why am I getting this error on the POST request?
OSError: [WinError 10013] An attempt was made to access a socket in a way forbidden by its access permissions "POST /api/v5/users/email/ HTTP/1.1" 500 177849
Another process might be running on the port. Try netstat -ban to list the occupied ports. Also try running the server in a terminal with administrator privileges.
urllib.request.urlretrieve is not working for HTTP websites; however, the same code is able to save pages from HTTPS websites.
import urllib.request
abc="http://www.google.com"
urllib.request.urlretrieve(abc,"C:\\Users\\kj\\downloads\\abc.html")
I'm getting the following error:
[WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond>
When I use "https://www.google.com", the code works. I am currently working on a website which is not secured (HTTP), and this code is not working for it.
I have read a few threads on this portal but could not find a solution.
I am using Python 3.6.
Any suggestions?
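One thing worth checking: some networks and corporate proxies treat plain HTTP differently from HTTPS, which would match HTTPS working while HTTP times out. If a proxy turns out to be involved, it can be passed to urllib.request explicitly — a sketch, where the proxy address is a placeholder for your actual proxy:

```python
import urllib.request

# Placeholder proxy address: replace with the real proxy if your network uses one
proxy_handler = urllib.request.ProxyHandler({
    "http": "http://proxy.example.com:8080",
})
opener = urllib.request.build_opener(proxy_handler)

# urlretrieve goes through urlopen, which respects the globally installed opener
urllib.request.install_opener(opener)
# urllib.request.urlretrieve("http://www.google.com", r"C:\Users\kj\downloads\abc.html")
```

Note that http://www.google.com normally redirects to HTTPS anyway; testing against the actual HTTP-only site you are working with gives a more meaningful result.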
I'm getting the following error when trying to fetch a URL with urllib2 in Google App Engine:
error: An error occured while connecting to the server: Unable to fetch URL: http://www.google.com Error: [Errno 10106] getaddrinfo failed
This is the code calling the urllib2 open read methods:
def get(self):
    self.write(urllib2.urlopen("http://www.google.com").read())
    self.render_index()
Nothing fancy, just a call to the library inside the main handler to output the fetched text.
My PC resolves DNS correctly. I can use the urllib2 library from the Python interpreter, fetching URLs successfully.
The deployed code running on the Google servers works as intended; it's something in my local environment, but I can't find what it is.
I also tried using urlfetch from GAE with similar results (the same getaddrinfo failure).
I switched to Google DNS some days before working with the urllib2 library but switching back to ISP provided DNS didn't work either.
EDIT: When calling the function with an IP address the URL is fetched:
self.write(urllib2.urlopen("http://173.194.42.34").read())
Thanks in advance!
I'm fairly certain that your DNS resolver fails to resolve the hostname. I assume that your OS or security software prohibits the dev server from creating outbound connections. Another possibility would be that you have an invalid entry in the hosts file on your operating system.
Also, there are many similar questions which could help you.
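The hosts-file possibility is easy to rule out by scanning it for entries that mention the failing hostname — a sketch, assuming the default hosts-file locations:

```python
import os

def suspicious_hosts_entries(hosts_path, needle="google"):
    """Return non-comment hosts-file lines containing the given hostname fragment."""
    entries = []
    with open(hosts_path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and needle in line:
                entries.append(line)
    return entries

# Windows default location; on Linux/macOS the hosts file lives at /etc/hosts
hosts_path = (r"C:\Windows\System32\drivers\etc\hosts"
              if os.name == "nt" else "/etc/hosts")
print(suspicious_hosts_entries(hosts_path))
```

Any line mapping www.google.com to an address here would override DNS, which would explain why the raw IP in the EDIT works while the hostname does not.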