Google Cloud Functions - Python - HTTP Trigger URL Without Function Name

I have a Google Cloud Function triggered via HTTP.
The trigger URL is in the format:
https://europe-west1-PROJECT-NAME.cloudfunctions.net/FUNCTION-NAME
This works fine; however, I need to be able to access it at:
https://europe-west1-PROJECT-NAME.cloudfunctions.net/
Is there a way of doing this? I have not been able to find a definitive answer for this via their documentation or Google Search.
Thanks

This is not possible. You must use the URL that was assigned to your function at the time of deployment. You can't rewrite the URL on the hostname/domain that's given to your project.
You can, however, use Firebase Hosting to proxy URLs to Cloud Functions in the same project. You will need to use the domain given to you by Firebase Hosting, or bring your own domain.
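Something along these lines in firebase.json would do the proxying (a minimal sketch; FUNCTION-NAME is your function, and Hosting rewrites to functions have region restrictions, so check the current docs for europe-west1):

{
  "hosting": {
    "rewrites": [
      {
        "source": "**",
        "function": "FUNCTION-NAME"
      }
    ]
  }
}

With that in place, requests to the Hosting domain are forwarded to the function, so the function name no longer appears in the public URL.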

Eliminating nuisance Instance starts

My GCP app has been abused by some users. To stop their usage I have attempted to eliminate features that can be abused, and have employed firewall rules to block certain users. But bad users continue to try to access my app via certain legacy URLs such as myapp.appspot.com/badroute. Of course, I still want users to use the default URL myapp.appspot.com.
I have altered my code in the following manner, but these requests still cause Instances to start, and I do not want Instances in such cases. What can I do differently to avoid the bad Instances starting, or is there anything I can do to force such Instances to stop quickly instead of after about 15 minutes?
import logging
import webapp2

class Dummy(webapp2.RequestHandler):
    def get(self):
        logging.info("Dummy: ")
        self.redirect("/")

# MainPage is the existing handler for the default route.
app = webapp2.WSGIApplication(
    [('/', MainPage),
     ('/badroute', Dummy)], debug=True)
(I may be referring to Instances when I should be referring to Requests.)
So what's the objective? Do you want users that visit /badroute to be redirected to some /goodroute, or do you want /badroute to not hit GAE and incur cost?
Putting a Google Cloud Load Balancer in front could help.
For the first case you could set up a redirect rule (although you can do this directly within App Engine too, like you did in your code example).
If you just want it to not hit App Engine, you could set up the load balancer to route /badroute to a file in a GCS bucket instead of your GAE service:
https://cloud.google.com/load-balancing/docs/https/ext-load-balancer-backend-buckets
However, you wouldn't be able to use your *.appspot.com base URL. You'd get a static IP, to which you should then map a custom domain.
DISCLAIMER: I'm not 100% sure if this would work.
Create a new service called dummy.
Create and deploy a dispatch.yaml (GAE Standard // GAE Flex)
Add the links you want to block to the dispatch.yaml and point them to the dummy service.
Set up the Identity Aware Proxy (IAP) and enable it for the dummy service.
???
Profit
The idea is that IAP will block the requests before they hit the dummy service. Since the requests never actually get forwarded to the dummy service, you will not have an instance start. The bots will get a nice 403 page from Google's own infrastructure instead.
EDIT: Be sure to create the dummy service with 0 instances, as the idea is to not have it cost money.
EDIT2:
So let me expand a bit on this answer.
You can have multiple GAE services running within one GCP project. Each service is its own app. You can have one service running a Python Flask app and another running a Java Spring Boot app. Each can be either GAE Standard or GAE Flex. See this doc.
Normally all traffic gets routed to the default service. Using dispatch.yaml you can make requests to certain endpoints go to a specific service.
If you create the dummy service as a GAE Standard app, and you don't actually need it to do anything, you can then route all the endpoints that get abused to this dummy service using dispatch.yaml. Using GAE Standard you can have the service use 0 instances (and incur 0 cost).
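As a rough sketch, the dispatch.yaml could look something like this (the route glob matches the /badroute example from the question; adjust it to whatever endpoints are being abused):

dispatch:
  - url: "*/badroute*"
    service: dummy

The dummy service itself just needs a minimal app.yaml declaring service: dummy with automatic scaling, so it scales to zero when nothing reaches it.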
Using IAP you can then make sure only your own Google account can access this service (which you won't actually do). In effect this means the abusers cannot really access the service: IAP blocks the request before it ever hits the service, since you've set it up so that only your Google account can access it.
Note that dispatch.yaml is separate from any service; it's one of the per-project configuration files for GAE and is not tied to a specific service.
As stated, the dummy app doesn't actually need to do anything, but you do need to deploy it once, as that is what actually creates the service.
Consider using Cloudflare to mitigate bot abuse, customize firewall rules regarding route access, rate-limit IPs, etc. This can be combined with Google Cloud Load Balancer if you'd like, as mentioned in https://stackoverflow.com/a/69165767/806876.
References
Cloudflare GCP integration: https://www.cloudflare.com/integrations/google-cloud/
There is some information about my app.yaml that I did not provide in my question:
handlers:
- url: /.*
  script: mainapp.app
By simply removing .* from the url pattern, no Instance is started. The user gets Error: Not Found instead, which satisfies my needs.
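For clarity, the handler ends up like this, so only the exact root path is matched and anything else gets the Not Found error without an Instance starting:

handlers:
- url: /
  script: mainapp.app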
Edo Akse's answer pushed me to this solution after reading here, so I am accepting his answer. I am still not clear how to implement his answer, though.

User authentication for Spotify in Python using Spotipy on AWS

I am currently building a web app that requires a Spotify user to log in with their credentials in order to access their playlists.
I'm using the Spotipy Python wrapper for Spotify's Web API and generating an access token using:
token = util.prompt_for_user_token(username,scope,client_id,client_secret,redirect_uri)
The code runs without any issues on my local machine, but when I deploy the web app on AWS it does not proceed to the redirect URI and allow the user to log in.
I have tried transferring the ".cache-username" file via SCP to my AWS instance and have gotten it to work in a limited fashion.
Is there a solution to this issue? I'm fairly new to AWS and hence don't have much to go on or any idea where to look. Any help would be greatly appreciated. Thanks in advance!!
The quick way
Run the script locally so the user can sign in once
In the local project folder, you will find a file .cache-{userid}
Copy this file to your project folder on AWS
It should work
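For example, copying it over could be as simple as this (host, user and paths are placeholders):

scp .cache-myusername ec2-user@my-aws-host:/path/to/project/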
The database way
There is currently an open feature request on GitHub that suggests storing tokens in a DB. Feel free to subscribe to the issue or to contribute: https://github.com/plamere/spotipy/issues/51
It's also possible to write a bit of code to persist new tokens into a DB and then read from it. That's what I'm doing as part of an AWS Lambda using DynamoDB; it's not very nice, but it works perfectly: https://github.com/resident-archive/resident-archive/blob/a869b73f1f64538343be1604d43693b6165cc58a/functions/to-spotify/main.py#L129..L157
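For illustration only, here is a rough sketch of that idea (not the code from the linked repo; the table name, key and cache path are placeholder assumptions):

# Rough sketch: persist Spotipy's token cache in DynamoDB.
# "spotify_tokens", "myuser" and the cache path are placeholders.
import boto3

CACHE_PATH = ".cache-myuser"  # file Spotipy writes after a successful login
table = boto3.resource("dynamodb").Table("spotify_tokens")

def save_token():
    # Store the locally cached token JSON in DynamoDB.
    with open(CACHE_PATH) as f:
        table.put_item(Item={"user": "myuser", "token_json": f.read()})

def restore_token():
    # Recreate the cache file so util.prompt_for_user_token finds a token.
    item = table.get_item(Key={"user": "myuser"}).get("Item")
    if item:
        with open(CACHE_PATH, "w") as f:
            f.write(item["token_json"])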
The API way
This is probably the best way, as it allows multiple users to sign in simultaneously. However, it is a bit more complex and requires you to host a server that's accessible by URL.
This example uses Flask, but one could adapt it to Django, for example: https://github.com/plamere/spotipy/blob/master/examples/app.py
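A hedged sketch of that flow with Flask and Spotipy (not the linked example verbatim; the client ID/secret, redirect URI and scope are placeholders):

# Minimal authorization-code flow with Flask + Spotipy; identifiers are placeholders.
from flask import Flask, redirect, request
import spotipy
from spotipy.oauth2 import SpotifyOAuth

app = Flask(__name__)
oauth = SpotifyOAuth(client_id="CLIENT_ID", client_secret="CLIENT_SECRET",
                     redirect_uri="https://your-host/callback",
                     scope="playlist-read-private")

@app.route("/login")
def login():
    # Send the user to Spotify's consent page.
    return redirect(oauth.get_authorize_url())

@app.route("/callback")
def callback():
    # Spotify redirects back here with ?code=...; exchange it for a token.
    token_info = oauth.get_access_token(request.args["code"])
    sp = spotipy.Spotify(auth=token_info["access_token"])
    return str(sp.current_user_playlists())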

How to authenticate to a REST endpoint on Google Cloud Firestore API

I am looking at writing a short Python/Node.js script that will call the exportDocuments API route of Google's Firestore.
This page shows how to use gcloud, but that just isn't an option since I am calling from inside an AWS Lambda function.
I am a GCP newbie, but not a Python/REST newbie. I couldn't find an SDK that exposes this endpoint (but maybe I am wrong here).
I poked around the terrible documentation for GCP, made a service account, and gave it the Cloud Datastore Import Export Admin role.
I also looked at Google's Application Default Credentials, which don't help me since I am in Lambda.
The one thing I didn't dive into is the http.proto that GCP uses because I am not familiar with it and it looks like a big rabbit hole.
So does anyone have sample Python or Node.js code for how to make a POST request to a GCP REST endpoint authenticated with a service account? Or is there an SDK that exposes v1beta1 of Firestore? I wasn't able to find one at their docs page.
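For reference, the kind of call I have in mind would look roughly like this (an untested sketch assuming the google-auth and requests packages; the project, bucket and key file are placeholders):

# Untested sketch: call Firestore's v1beta1 exportDocuments route with a
# service-account key. Project, bucket and key-file names are placeholders.
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/datastore"])

session = AuthorizedSession(credentials)
url = ("https://firestore.googleapis.com/v1beta1/"
       "projects/MY_PROJECT/databases/(default):exportDocuments")
resp = session.post(url, json={"outputUriPrefix": "gs://MY_BUCKET/exports"})
print(resp.status_code, resp.json())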

Redirect url is not returning the authorization code when trying to authenticate a power bi web app

I wrote code using Python 3.5.1 to get an authorization code in order to get the access token that is used to access certain Power BI resources.
The web app is registered with Azure AD and I got the client ID and client secret and set up the permissions.
I constructed the query string and passed it to the get function of the requests library.
authstring = "https://login.windows.net/common/oauth2/authorize?response_type=code&client_id=xxxxx-xxxx-xxxx-xxxx&resource=https://analysis.windows.net/powerbi/api&redirect_uri=https://login.live.com/oauth20_desktop.srf"
requests.get(authstring)
but it keeps redirecting to
https://login.microsoftonline.com/common/oauth2/authorize?response_type=code&client_id=xxxxx-xxxx-xxxx-xxxx&resource=https://analysis.windows.net/powerbi/api&redirect_uri=https://login.live.com/oauth20_desktop.srf
instead of the redirect url with the code
https://login.live.com/oauth20_desktop.srf?code=xxXXXXXaaaxaaaxxxxxx....
When I put the query string in my browser it redirects and returns the code.
I am not sure what I am missing. Does anyone know how to resolve this or know of a workaround?
Based on my understanding, you want to do the authentication for integrating Power BI resources with Python, but I don't know whether your Python application is a client app or a web app.
You can refer to the official document Authenticate to Power BI service to learn how to do this.
According to your description, I think you want to get the access token for requesting the Power BI resource, not access Power BI via single sign-on. So you need to do the Resource Management Authentication via the Python SDK, using the adal package.
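As a rough sketch of acquiring the token with adal (the authority, account and client ID are placeholders, and the username/password grant only works for accounts without MFA):

# Rough sketch: acquire a Power BI access token with the adal package.
# Authority, account and client ID are placeholders.
import adal

AUTHORITY = "https://login.microsoftonline.com/common"
RESOURCE = "https://analysis.windows.net/powerbi/api"
CLIENT_ID = "xxxxx-xxxx-xxxx-xxxx"

context = adal.AuthenticationContext(AUTHORITY)
token = context.acquire_token_with_username_password(
    RESOURCE, "user@example.com", "password", CLIENT_ID)
access_token = token["accessToken"]  # use in the Authorization header of Power BI calls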
Hope it helps.
If you have any concerns, please feel free to let me know.

GAE: Can't Use Google Server Side API's (Whitelisting Issue)

To use Google APIs, after activating them from the Google Developers Console, one needs to generate credentials. In my case, I have a backend that is supposed to consume the API server-side. For this purpose, there is an option to generate what the Google page calls a "Key for server applications". So far so good.
The problem is that in order to generate the key, one has to list the IP addresses of the servers that will be whitelisted. But GAE has no static IP address that I could use there.
There is an option to manually get the IPs by executing:
dig -t TXT _netblocks.google.com @ns1.google.com
However, there is no guarantee that the list is static (furthermore, it is known to change from time to time), and there is no programmatic way I could automate adding the IPs I get from dig to the Google Developers Console.
This leaves me with two choices:
Forget about GAE for this project. Ironically, GAE cannot be used as a backend for Google APIs (better to use Amazon or some other solution for that), or
Program something like a watchdog over the output of the dig command that would notify me if there's a change, and then I would manually update the whitelist (no way I am going to do this - too dangerous), or allow all IPs to use the Google API provided they present my API key. Not the most secure solution, but it works.
Is there any other workaround? Can it be that GAE does not support consuming Google APIs server-side?
You can use App Identity to access Google's APIs from App Engine. See: https://developers.google.com/appengine/docs/python/appidentity/. If you set up your app using the Cloud Console, it should have already added your app's identity with permission to your project, but you can always check that. From the "Permissions" tab in the Cloud Console for your project, make sure your service account is added under "Service Accounts" (in the form your_app_id@appspot.gserviceaccount.com).
Furthermore, if you use something like the JSON API libraries available for Python, you can use the bundled oauth2 library to do all of this for you, using AppAssertionCredentials to authorize the API you wish to use. See: https://developers.google.com/api-client-library/python/guide/google_app_engine#ServiceAccounts
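A rough Python sketch of that approach, using BigQuery as the example API (the scope is a placeholder, and oauth2client has since been deprecated):

# Rough sketch: authorize a Google API client from App Engine with the
# app's own identity via oauth2client's AppAssertionCredentials (deprecated).
import httplib2
from googleapiclient.discovery import build
from oauth2client.contrib.appengine import AppAssertionCredentials

credentials = AppAssertionCredentials(
    scope="https://www.googleapis.com/auth/bigquery")
http = credentials.authorize(httplib2.Http())

# Build a client for whichever API you enabled for the project.
bigquery = build("bigquery", "v2", http=http)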
Yes, you should use App Identity. Forget about getting an IP or giving up on GAE :-) Here is an example (in Java) of how to use BigQuery inside a GAE application:
static {
    // initializes Big Query
    JsonFactory jsonFactory = new JacksonFactory();
    HttpTransport httpTransport = new UrlFetchTransport();
    AppIdentityCredential credential =
            new AppIdentityCredential(Arrays.asList(Constants.BIGQUERY_SCOPE));
    bigquery = new Bigquery.Builder(httpTransport, jsonFactory, credential)
            .setApplicationName(Constants.APPLICATION_NAME)
            .setHttpRequestInitializer(credential)
            .setBigqueryRequestInitializer(new BigqueryRequestInitializer(Constants.API_KEY))
            .build();
}
