I'm getting the following error when trying to fetch a URL with urllib2 on Google App Engine:
error: An error occured while connecting to the server: Unable to fetch URL: http://www.google.com Error: [Errno 10106] getaddrinfo failed
This is the code calling urllib2's urlopen and read methods:

import urllib2  # at module level

def get(self):
    # Write the fetched page body out to the response.
    self.write(urllib2.urlopen("http://www.google.com").read())
    self.render_index()
Nothing fancy, just a call to the library inside the main handler to output the fetched text.
My PC resolves DNS correctly, and I can use the urllib2 library from the Python interpreter, fetching URLs successfully.
The deployed code running on Google's servers works as intended; it's something in my local environment, but I can't find what it is.
I also tried using urlfetch from GAE, with similar results (the same getaddrinfo failure).
I switched to Google DNS some days before working with the urllib2 library, but switching back to the ISP-provided DNS didn't work either.
EDIT: When calling the function with an IP address, the URL is fetched:
self.write(urllib2.urlopen("http://173.194.42.34").read())
Thanks in advance!
I'm fairly certain that your DNS resolver is failing to resolve the hostname. I assume your OS or security software is prohibiting the devserver from creating outbound connections. Another possibility is an invalid entry in the hosts file on your operating system.
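To confirm that, you can test name resolution directly with the same call that is failing, from the same machine (a minimal check; the hostname is just the one from your example):

import socket

# If this raises socket.gaierror, the "getaddrinfo failed" error
# comes from the OS resolver itself, not from App Engine.
print socket.getaddrinfo("www.google.com", 80)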
Also, there are many similar questions which could help you.
Related
When I run a Python micro-service in a Docker or Kubernetes container, it works just fine, but with the Istio service mesh it does not.
I have added a ServiceEntry for each of my two outbound external HTTP APIs. I can access the URL content from inside the container (inside the service mesh) using curl, so I think the service entries are fine and working.
But when I try from the micro-service, which uses Python's xml.sax parser, it gives me upstream connect error or disconnect/reset before headers, even though the same application works fine without Istio.
I think it is something related to Istio, Envoy, or Python.
Update: I did inject the istio-proxy sidecar. I have also added a ServiceEntry for the external MySQL database, and MySQL connects fine from the micro-service.
I have found the reason this is not working. My Python service uses the xml.sax parser library to parse XML from the internet, and that parser in turn uses the legacy urllib package, which initiates HTTP/1.0 requests.
Envoy doesn't support HTTP/1.0, hence the failure. As a workaround I set global.proxy.includeIPRanges="10.x.0.1/16" for Istio using Helm, which bypasses the Envoy proxy entirely for all outgoing connections outside the given IP ranges.
But I would prefer not to globally bypass Istio.
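If you can change the service itself, a narrower workaround might be to fetch the document over HTTP/1.1 yourself and hand the response stream to the parser, rather than passing a URL to xml.sax (which goes through the legacy urllib and its HTTP/1.0 path). A sketch, assuming Python 2 and a placeholder feed URL:

import urllib2
import xml.sax

class Handler(xml.sax.ContentHandler):
    def startElement(self, name, attrs):
        pass  # handle elements as the existing service does

# urllib2 speaks HTTP/1.1, so Envoy accepts the request; the parser
# then reads from the response object instead of opening the URL itself.
response = urllib2.urlopen("http://example.com/feed.xml")  # placeholder URL
xml.sax.parse(response, Handler())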
I have installed the OpenStack CLI, and when I try to use any command, say
openstack server list
it throws the error below:
Failed to discover available identity versions when contacting
https://44.128.19.51:5000/v3. Attempting to parse version from URL.
SSL exception connecting to https://44.128.19.51:5000/v3/auth/tokens:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed
(_ssl.c:765)
I tried setting export OS_CACERT=/path/to/ca.crt, but it is not working.
You must provide a proper authorization URL. Sometimes the port in the URL is wrong; in my case, the authorization URL had port 1300 instead of 5000.
And have you sourced your RC file?
Without a proper authorization URL, a proper CACERT path, or proper authorization certificates, it will show this error.
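For example (the RC file name is a placeholder; the certificate path is the one from the question):

source admin-openrc.sh
export OS_CACERT=/path/to/ca.crt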
A long time has passed since the question, but if someone like me faces this problem: run the openstack command with the --insecure flag.
Here's the related documentation.
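For example, to skip server certificate verification:

openstack --insecure server list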
For the last few days I have been trying to install the Native Client SDK for Chrome on Windows and/or Ubuntu.
I'm behind a corporate network, and the only internet access is through an HTTP proxy with authentication involved.
When I run "naclsdk update" in Ubuntu, it shows
"urlopen error Tunnel connection failed: 407 Proxy Authentication Required"
Can anyone please help?
Try to download this file:
http://commondatastorage.googleapis.com/nativeclient-mirror/nacl/nacl_sdk/naclsdk_manifest2.json
It is the Native Client update summary, but in the URL I replaced https with http. If you view the JSON file, you will see the different pepper_xx versions available. Use the links to download the one you want, again replacing https with http.
The naclsdk update tool is very difficult to use for those of us behind a strict firewall. It would be nice if Google provided a direct link to the latest SDK.
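If the proxy uses basic authentication, you can also script the download through it with urllib2's ProxyHandler, which accepts credentials in the proxy URL (a sketch; the proxy address, port, and credentials are placeholders):

import urllib2

# Placeholder corporate proxy address and credentials.
proxy = urllib2.ProxyHandler({
    "http": "http://user:password@proxy.example.com:8080",
})
opener = urllib2.build_opener(proxy)

manifest_url = ("http://commondatastorage.googleapis.com/"
                "nativeclient-mirror/nacl/nacl_sdk/naclsdk_manifest2.json")
print opener.open(manifest_url).read()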
I got a solution, though not a direct one: I managed to use a program called Proxifier to redirect the HTTPS traffic through the HTTP proxy. Works great.
I am trying to get access to a local JIRA 4.4.5 installation using its JSON-RPC service. To that end I wrote a Python script using jsonrpclib that tries to connect to http://localhost:8080/jira/rpc/json-rpc/jirasoapservice-v2, as described at https://developer.atlassian.com/display/JIRADEV/JIRA+JSON-RPC+Overview. Trying to connect from my Python script, as well as opening this URL in a browser, gives me a 404 error.
import jsonrpclib
server = jsonrpclib.Server("http://localhost:8080/jira/rpc/json-rpc/jirasoapservice-v2")
reply = server.someMethod( param1, ... )
Calling someMethod fails with the following error:
xmlrpclib.ProtocolError: <ProtocolError for localhost:8080/jira/rpc/json-rpc/jirasoapservice-v2: 404 Not Found>
Has anyone successfully tried this the same way I did? Do I need to get access via HTTPS somehow instead of HTTP? How would I configure JIRA to do so?
Btw: Jira's json-rpc plugin is enabled.
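For what it's worth, once the endpoint responds, a call would presumably look like the following, assuming the JSON-RPC service mirrors the SOAP API's method signatures, where most methods take the token returned by login (the credentials and issue key are placeholders):

import jsonrpclib

server = jsonrpclib.Server(
    "http://localhost:8080/jira/rpc/json-rpc/jirasoapservice-v2")

# Log in first; subsequent calls take the returned auth token.
token = server.login("admin", "secret")   # placeholder credentials
issue = server.getIssue(token, "PROJ-1")  # placeholder issue key
print issue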