I currently have five services in Google App Engine, all FastAPI apps in the Python standard environment. When a service is called, it calls an authorization service and continues only if the permissions are valid. I'm using a firewall rule to block all incoming requests except those from my computer. With the firewall rule in place, one service can no longer call another: the call returns Access Forbidden. I then found something about Python requests on GAE saying that you have to use Google's URLFetch to make calls to other services. But when I use the monkeypatch() function from requests_toolbelt.adapters.appengine, I receive an error in App Engine:
File "/layers/google.python.pip/pip/lib/python3.10/site-packages/requests_toolbelt/adapters/appengine.py", line 121, in __init__
self.appengine_manager = gaecontrib.AppEngineManager(
File "/layers/google.python.pip/pip/lib/python3.10/site-packages/urllib3/contrib/appengine.py", line 107, in __init__
raise AppEnginePlatformError(
urllib3.contrib.appengine.AppEnginePlatformError: URLFetch is not available in this environment.
The main reason to restrict the APIs is so that nobody can read the docs exposed by the services.
As mentioned in this doc: "The Python 3 runtime doesn't need an intermediary service to handle outbound requests. If you want to migrate away from using URL Fetch APIs but still need similar functionality, you should migrate those requests to use a standard Python library such as the requests library."
As discussed in this thread, it seems you are facing a similar issue:
The presence of requests_toolbelt dependency in the project was the
problem: it somehow forced the requests library to use urllib3, which
requires URLFetch to be present, otherwise it raises an
AppEnginePlatformError. As suggested in the app engine docs
monkey-patching Requests with requests_toolbelt forces the former to
use URLFetch, which is no longer supported by GAE in a Python 3
runtime.
You may resolve this by removing requests_toolbelt from your requirements.txt file.
You can also have a look at this Stack Overflow thread.
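With requests_toolbelt removed, a plain requests call from one service to another works on the Python 3 runtime. A minimal sketch, assuming a hypothetical authorization service URL and a /check endpoint (both placeholders); the session is injectable so the helper can be exercised without a real network call:

```python
import requests

# Hypothetical URL of the authorization service.
AUTH_URL = "https://auth-dot-my-project.appspot.com"

def check_permissions(token, session=None):
    """Call the authorization service with plain requests (no URLFetch).

    `session` is injectable so the helper can be tested without a
    network; by default a regular requests.Session is used.
    """
    session = session or requests.Session()
    resp = session.get(
        AUTH_URL + "/check",  # hypothetical endpoint
        headers={"Authorization": "Bearer " + token},
        timeout=10,
    )
    return resp.status_code == 200
```

No monkey-patching or URLFetch adapter is needed; on the Python 3 runtime, outbound requests go through the standard socket layer.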
Related
An external Python module that I'm loading as part of my GAE instance can't issue HTTP(s) requests to itself (the same GAE instance) anymore. It always used to work, but recently it stopped working. The outgoing requests (to itself) are timing out. Since it's a third-party module, I can't modify its code, only the URL it uses.
I also tried to access 0.0.0.0:8080, the port and address that the WSGI app listens to. No success either.
I read about the old Python runtime behaviour to use urlfetch or authenticate via HTTP headers, but none of that applies anymore since for the new python37 runtime, the use of Python's requests module is recommended, which I would also like to stick with.
Any idea where to start looking? I seem to have tried everything.
Found the answer after much research in another thread: Gunicorn doesn't allow requesting URLs on the same server instance.
With a single worker, a request to self deadlocks: the only worker is busy handling the original request and can never pick up the new one. Multiple gunicorn workers are therefore needed to be able to connect to self, e.g. gunicorn -b :8888 -w 2 app:app.
Having multiple workers on the Google App Engine standard environment requires an instance class of F2 or higher. You can set the number of workers according to your instance class: https://cloud.google.com/appengine/docs/standard/python3/runtime#entrypoint_best_practices
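Concretely, both the worker count and the instance class live in app.yaml; a sketch with hypothetical values (runtime version and module path are assumptions):

```yaml
# app.yaml (hypothetical values)
runtime: python310
instance_class: F2          # F2 or higher is required for multiple workers
entrypoint: gunicorn -b :$PORT -w 2 main:app   # 2 workers so the app can call itself
```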
I have a Django application deployed on the Google App Engine standard environment. I am interested in server-side rendering of my JS frontend. Can I use Node.js alongside Django on the same GAE app, maybe as a microservice?
What you can do is deploy each of your apps as a separate service in App Engine, and they will work independently as microservices. To do so, make sure to set a service name in the app.yaml file of each app:
service: service-name
Afterwards, you can communicate between your services through an HTTP invocation, such as a user request or a RESTful API call. Code in one service can't directly call code in another service.
Refer to this link for additional information about communicating between your services.
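For instance, the Django service could invoke a hypothetical "renderer" service over HTTP; the service name, URL, and route below are assumptions, and the `get` callable is injectable so the helper can be tested without a network:

```python
import requests

# Hypothetical URL of a second App Engine service named "renderer".
RENDERER_URL = "https://renderer-dot-my-project.appspot.com"

def render_page(path, get=requests.get):
    """Fetch server-side-rendered HTML from the renderer service.

    `get` defaults to requests.get but is injectable for testing.
    """
    resp = get(RENDERER_URL + "/render", params={"path": path}, timeout=10)
    resp.raise_for_status()
    return resp.text
```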
I have come across articles that talk about integrating python and Node but I personally haven't done it or seen it done on GAE.
If I were to take a stab, I think you would be looking at something like
Have the Python app as a service (say it's available on python_service.myapp.appspot.com)
Have your Node.js as your default service available on myapp.appspot.com
Your Node.js app will have a route; when this route is invoked, it makes an HTTP request to the Python service, waits for the response, and then returns that response.
Our App, https://nocommandline.com, is an Electron App (a combo of Node.js & Vue.js). If you purchase a license and try to validate it, we make a call server side, and our server side is Python based. It's not exactly the same thing you're looking at (since our App is not web-based), but it gives you an idea of what I was trying to describe.
I want to create pods and manage replica sets and deployments using a REST API built with either PHP or Python. This needs to be controlled from a web app where the user clicks a button and a new pod with a specific volume is created. I'm not sure how to achieve this.
I came across the Kubernetes (K8s) API and the Python K8s client API, but I'm unable to achieve what is required. TIA
Kubernetes is controlled through an HTTP REST API, which is fully specified here. You could write a web app that directly issues the appropriate HTTP requests to the Kubernetes API server.
However, it's much more recommended to use one of the Kubernetes client libraries that exist for different programming languages. These libraries wrap all the HTTP requests in function calls and also take care of things like authentication.
You can find example code using the different client libraries in the GitHub repositories of most libraries (see here).
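As a sketch with the official Python client (the kubernetes package on PyPI); the pod name, image, and persistent volume claim name below are placeholders:

```python
def make_pod_manifest(name, image, pvc_name):
    """Build a minimal Pod manifest that mounts an existing PVC."""
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": name},
        "spec": {
            "containers": [{
                "name": name,
                "image": image,
                "volumeMounts": [{"name": "data", "mountPath": "/data"}],
            }],
            "volumes": [{
                "name": "data",
                "persistentVolumeClaim": {"claimName": pvc_name},
            }],
        },
    }

# On a machine with cluster credentials, the manifest can be submitted
# like this (not executed here):
# from kubernetes import client, config
# config.load_kube_config()  # or load_incluster_config() inside a cluster
# client.CoreV1Api().create_namespaced_pod(
#     namespace="default",
#     body=make_pod_manifest("demo", "nginx:1.25", "my-pvc"))
```

The client library handles authentication and serialization, so the web app's button handler only needs to build a manifest like this and submit it.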
I have an SDK for my web service that is distributed as a Python library via PyPI. My library uses requests for communicating with the backend using typical REST-like requests.
I would like my library to be compatible with applications that are hosted on Google App Engine (GAE). According to the GAE documentation on HTTP requests:
To use requests, you'll need to install both requests and
requests-toolbelt using the vendoring instructions.
Once installed, use the requests_toolbelt.adapters.appengine module to
configure requests to use URLFetch:
So I follow the example given there and have this in my library's main module:
import os

# Apply the URLFetch adapter only when running on production App Engine.
if os.getenv('SERVER_SOFTWARE', '').startswith('Google App Engine/'):
    import requests_toolbelt.adapters.appengine
    requests_toolbelt.adapters.appengine.monkeypatch()
This seems to do the trick when a client application using my library is actually running on an App Engine instance.
However, when the client application is run locally using the development web server (dev_appserver.py), os.getenv('SERVER_SOFTWARE') returns "Development/2.0" and so the monkeypatch is not executed. I subsequently get these errors when trying to issue requests:
ConnectionError: ('Connection aborted.', error(13, 'Permission denied'))
How can I detect that the host application for my library is running either in Google App Engine itself, or inside the development web server? Checking for "Development/2.0" doesn't seem like it would be discriminating enough.
Or, is there a better general pattern to follow when shipping a shared Python library that needs support for "typical" network requests?
Digging through the Google Cloud SDK, it seems that Google's own method for determining whether we are running in either the production or development (dev_appserver.py) App Engine environment is indeed to look for either of those values of SERVER_SOFTWARE. From apitools/base/py/util.py:
import os

def DetectGae():
  """Determine whether or not we're running on GAE.

  This is based on:
  https://developers.google.com/appengine/docs/python/#The_Environment

  Returns:
    True iff we're running on GAE.
  """
  server_software = os.environ.get('SERVER_SOFTWARE', '')
  return (server_software.startswith('Development/') or
          server_software.startswith('Google App Engine/'))
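Adapting that check, the library's conditional monkey-patch (needed only on the legacy Python 2 runtime, where URLFetch exists) could cover both environments; a sketch:

```python
import os

def running_on_gae():
    """True on production GAE and under dev_appserver.py."""
    server_software = os.environ.get('SERVER_SOFTWARE', '')
    return (server_software.startswith('Development/') or
            server_software.startswith('Google App Engine/'))

if running_on_gae():
    # Only applies on the legacy Python 2 runtime, where URLFetch exists.
    import requests_toolbelt.adapters.appengine
    requests_toolbelt.adapters.appengine.monkeypatch()
```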
I'm wondering if anybody has tried to integrate mosso CloudFiles with an application running on Google App Engine (mosso does not provide a testing sandbox, so I can't check for myself without registering). Looking at the code, it seems that this will not work due to httplib and urllib limitations in the App Engine environment, but maybe somebody has patched cloudfiles?
It appears to implement a simple RESTful API, so there's no reason you couldn't use it from App Engine. Previously, you'd have had to write your own library to do so, using App Engine's urlfetch API, but with the release of SDK 1.1.9, you can now use urllib and httplib instead.