Can Python/Flask websites be hosted on Firebase? - python

I have a website that uses Python/Flask. I know that Firebase Hosting is only for static websites, but I need to be able to use Firebase Cloud Functions in my app, and my understanding is that this requires Firebase Hosting (please correct me if I am wrong). Since Node.js is server-side yet can still be used alongside Firebase Hosting, I was hopeful that there might be a way to use Python too. Otherwise, if there is a way to use Cloud Functions without Firebase Hosting, please tell me about that instead.

You don't need Firebase Hosting to use Cloud Functions; as you mentioned, Firebase Hosting is for static pages.
Cloud Functions are hosted by Firebase independently of Firebase Hosting, and they currently don't support Python.
For HTTP-triggered functions you simply make HTTP requests to the function's URL, from any backend or from the frontend itself.
Firebase Database/Storage and other triggered functions work the same way, except you don't call them explicitly; they run in response to the specific events in the database, Storage, etc. that you specify when defining the function.
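For example, a Flask route could call an HTTP-triggered function directly. This is only a minimal sketch; the function URL, route, and payload below are placeholders for whatever your deployed function actually expects:

import requests
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical URL of a deployed HTTP-triggered Cloud Function.
FUNCTION_URL = "https://us-central1-YOUR-PROJECT.cloudfunctions.net/yourFunction"

@app.route("/process")
def process():
    # Call the Cloud Function from the Flask backend and relay its response.
    resp = requests.post(FUNCTION_URL, json={"payload": "example"})
    resp.raise_for_status()
    return jsonify(resp.json())

So the Flask app itself can stay wherever it is currently hosted; only the functions live on Firebase.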

Related

Can I expose the API URL to the public in a Python library?

I have built a Django API which has a lot of proprietary algorithms for analyzing stock portfolios. The API is hosted on AWS and is also connected to the website where I show visualizations of the calculated metrics.
I want to build a library that provides these metrics in Python, similar to pandas. Users can install this package using pip install.
In my Python library code, I would expose only the API URL, and I would call the API with different parameters and endpoints. Paid users will get a unique access code by email, and only with it can they access the Python package's functions. I don't want to expose my Django API code to the public, as the analysis requires a lot of stock price data from the database and involves a lot of proprietary algorithms.
Does this solution make sense? Should I hide the API URL or just leave it exposed?
Hiding the API URL is security through obscurity and should be avoided.
To protect your API from being abused by public users, you can either develop your own protection mechanism, e.g. rolling out custom API key provisioning with rate limiting, IP address filtering, and so on.
Or you can use AWS API Gateway to proxy traffic to your back-end API. API Gateway alone might not seem like much, but the services built around it are really helpful and don't require you to write additional code:
API Gateway supports API keys with usage plans, which help you rate-limit your authenticated users.
You can enable AWS WAF to protect your API from malicious scripts and other attacks.
To make sure that your back-end servers only receive traffic from API Gateway, you can configure a client certificate. This way, your server is protected even if your back-end's API URL is publicly exposed.
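As a rough sketch of the client side of this model: the published library could attach the user's access code as an API key header on every request, and API Gateway (or your own middleware) would validate it. The base URL, header name, and endpoint below are illustrative placeholders, not a prescribed design:

import requests

# Hypothetical base URL of the hosted Django API (e.g. fronted by API Gateway).
BASE_URL = "https://api.example.com"

class MetricsClient:
    def __init__(self, access_code):
        # The access code emailed to paid users is sent as an API key.
        self.session = requests.Session()
        self.session.headers["x-api-key"] = access_code

    def portfolio_metrics(self, tickers):
        # Call a hypothetical endpoint; the server rejects missing or invalid keys.
        resp = self.session.get(f"{BASE_URL}/metrics",
                                params={"tickers": ",".join(tickers)})
        resp.raise_for_status()
        return resp.json()

With this, the URL itself is visible in the library code, but knowing it is useless without a valid key, which is the point of not relying on obscurity.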

Google Cloud Functions - Python - HTTP Trigger URL Without Function Name

I have a Google Cloud Function triggered via HTTP.
The trigger URL is in the format:
https://europe-west1-PROJECT-NAME.cloudfunctions.net/FUNCTION-NAME
This works fine; however, I need to be able to access it at:
https://europe-west1-PROJECT-NAME.cloudfunctions.net/
Is there a way of doing this? I have not been able to find a definitive answer for this via their documentation or Google Search.
Thanks
This is not possible. You must use the URL that was assigned to your function at the time of deployment. You can't rewrite the URL on the hostname/domain that's given to your project.
You can, however, use Firebase Hosting to proxy URLs to Cloud Functions in the same project. You will need to use the domain given to you by Firebase Hosting, or bring your own domain.
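If you do go the Firebase Hosting route, that proxying is configured with a rewrite rule in firebase.json. A minimal sketch, assuming a function deployed in the same project under the placeholder name FUNCTION-NAME:

{
  "hosting": {
    "public": "public",
    "rewrites": [
      { "source": "/", "function": "FUNCTION-NAME" }
    ]
  }
}

Requests to the root of your Hosting domain (or custom domain) would then be forwarded to the function, but the cloudfunctions.net URL itself cannot be shortened.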

How to create a Python API and use it with React Native?

I'm learning about basic back-end and server mechanics and how to connect them with the front end of an app. More specifically, I want to create a React Native app and connect it to a database using Python (simply because Python is easy to write and fast). From my research I've determined I'll need to make an API that communicates via HTTP with the server, then use that API from React Native. I'm still confused as to how the API works and how I can integrate it into my React Native front end, or any front end that's not Python-based, for that matter.
I think you should start with an online tutorial.
From my experience, Flask is a good choice for this case.
There is a basic Flask tutorial provided by tutorialspoint.com.
You have to create a Flask service that sits in front of your database, expose JSON endpoints, and then use fetch or axios to display that data in your React Native app. You also have to be more specific next time.
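A minimal sketch of what such a Flask endpoint could look like (the route, sample data, and port are placeholders); the React Native side would then fetch the JSON from this URL with fetch or axios:

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/users")
def users():
    # In a real app this data would come from your database.
    return jsonify([{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}])

if __name__ == "__main__":
    # The React Native app calls http://<host>:5000/api/users.
    app.run(host="0.0.0.0", port=5000)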

Serving dynamic webpages using aws?

I'm new to AWS in general, and would like to learn how to deploy a dynamic website with AWS. I'm coming from a self-hosted perspective (digitalocean + flask app), so I'm confused on what exactly the process would be for AWS.
With self-hosting solution, the process is something like:
User makes a request to my server (nginx)
nginx then directs the request to my flask app
flask app handles the specific route (ie, GET /users)
flask does db operations, then builds an html page using jinja2 with the results from the db operation
returns html to user and user's browser renders the page.
With AWS, I understand the following:
User makes a request to Amazon's API Gateway (ie, GET /users)
API Gateway can call a AWS Lambda function
AWS Lambda function does db functions or whatever, returns some data
API Gateway returns the result as JSON (assuming I set the content-type to JSON)
The confusing part is how do I generate the webpage for the user, not just return the JSON data? I see two options:
1) Somehow get AWS Lambda to use the Jinja2 module, and use it to build the HTML pages after querying the DB for data. API Gateway would just return the finished HTML text. The downside is this would no longer be a pure API, so I lose flexibility.
2) Deploy the Flask app onto AWS Elastic Beanstalk. Flask handles the application code (session handling, routes, HTML template generation) and makes calls to Amazon's API Gateway to get any data the page needs.
I think (2) is the 'correct' way to do things; I get the benefit of scaling the flask app with Beanstalk, and I get the flexibility of calling an API with the API Gateway.
Am I missing anything? Did I misunderstand something in (2) for serving webpages? Is there another way to host a dynamic website without using a web framework like Flask through AWS, that I don't know about?
The recommended way to host a site with Lambda and without EC2 is:
Host your static front-end files on S3 (HTML, CSS, JS).
Configure your S3 bucket as a static web server.
Configure your Lambdas for the dynamic processing and expose them to the outside with API Gateway (a minimal handler sketch follows this list).
Your JS calls the Lambdas through API Gateway, so don't forget to enable CORS (on the S3 bucket AND on API Gateway).
Configure Route 53 to point at your bucket (your Route 53 record must have the same name as your bucket) so you can use your own DNS name rather than the generic S3 web-server URL.
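As an illustration of the dynamic part, an API Gateway-backed Lambda handler (using the Lambda proxy integration) might look roughly like this; the handler name and sample payload are placeholders:

import json

def lambda_handler(event, context):
    # Stand-in for the real DB work; return JSON to the browser.
    users = [{"id": 1, "name": "Alice"}]
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            # Needed because the page is served from the S3 website domain.
            "Access-Control-Allow-Origin": "*",
        },
        "body": json.dumps(users),
    }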
You definitely have to weigh the pros and cons of serving the dynamic website via API GW and Lambda.
Pros:
Likely cheaper at low volume
Don't have to worry about scaling
Lambda functions are easier to manage than even Beanstalk.
Cons:
There will be some latency overhead
In some ways less flexible, although Python is well supported and you should be able to import the jinja2 module.
Both of your proposed solutions would work well; it really depends on how you weigh the pros and cons.
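If you went with option (1), rendering HTML inside Lambda is mostly a matter of bundling Jinja2 in your deployment package and returning text/html instead of JSON. A minimal sketch, with the template inlined only for brevity:

from jinja2 import Template

# In a real deployment the template would live in a file packaged with the function.
PAGE = Template(
    "<html><body><h1>Users</h1><ul>"
    "{% for u in users %}<li>{{ u.name }}</li>{% endfor %}"
    "</ul></body></html>"
)

def lambda_handler(event, context):
    users = [{"name": "Alice"}, {"name": "Bob"}]  # stand-in for a DB query
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "text/html"},
        "body": PAGE.render(users=users),
    }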

GAE: Can't Use Google Server Side API's (Whitelisting Issue)

To use Google APIs, after activating them in the Google Developers Console, one needs to generate credentials. In my case, I have a backend that is supposed to consume the APIs server-side. For this purpose, there is an option to generate what the Google page calls a "Key for server applications". So far so good.
The problem is that in order to generate the key, one has to list the IP addresses of the servers that will be whitelisted. But GAE has no static IP address that I could use there.
There is an option to manually get the IPs by executing:
dig -t TXT _netblocks.google.com @ns1.google.com
However, there is no guarantee that the list is static (furthermore, it is known to change from time to time), and there is no programmatic way to automate adding the IPs I get from dig to the Google Developers Console.
This leaves me with two choices:
Forget about GAE for this project; ironically, GAE could not then be used as a backend for Google APIs (better to use Amazon or some other solution for that), or
Program something like a watchdog over the output of the dig command that notifies me of changes so I can manually update the whitelist (no way I am going to do this - too dangerous), or allow all IPs to call the Google API as long as the request carries my API key. Not the most secure solution, but it works.
Is there any other workaround? Can it be that GAE does not support consuming Google APIs server-side?
You can use App Identity to access Google's APIs from App Engine. See: https://developers.google.com/appengine/docs/python/appidentity/. If you set up your app using the Cloud Console, it should have already added your app's identity with permission to your project, but you can always check. From the "Permissions" tab in the Cloud Console for your project, make sure your service account is listed under "Service Accounts" (in the form your_app_id@appspot.gserviceaccount.com).
Furthermore, if you use something like the JSON API libraries available for Python, you can use the bundled oauth2 library to do all of this for you, using AppAssertionCredentials to authorize the API you wish to use. See: https://developers.google.com/api-client-library/python/guide/google_app_engine#ServiceAccounts
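A rough sketch of that pattern in Python (assuming oauth2client and the Google API client library are vendored into the app; depending on the library version the import may be oauth2client.contrib.appengine instead, and the BigQuery scope here is just an example):

import httplib2
from oauth2client.appengine import AppAssertionCredentials
from apiclient.discovery import build

# The app's service account asserts its own identity; no key file or IP whitelist needed.
credentials = AppAssertionCredentials(scope="https://www.googleapis.com/auth/bigquery")
http = credentials.authorize(httplib2.Http())

# Build an authorized BigQuery client acting as the app's service account.
bigquery = build("bigquery", "v2", http=http)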
Yes, you should use App Identity. Forget about getting an IP or giving up on GAE :-) Here is an example of how to use BigQuery, for instance, from inside a GAE (Java) application:
static {
    // Initializes the BigQuery client using the app's own identity (App Identity API).
    JsonFactory jsonFactory = new JacksonFactory();
    HttpTransport httpTransport = new UrlFetchTransport();
    // AppIdentityCredential authenticates as the app's service account for the given scope.
    AppIdentityCredential credential = new AppIdentityCredential(Arrays.asList(Constants.BIGQUERY_SCOPE));
    bigquery = new Bigquery.Builder(httpTransport, jsonFactory, credential)
            .setApplicationName(Constants.APPLICATION_NAME)
            .setHttpRequestInitializer(credential)
            .setBigqueryRequestInitializer(new BigqueryRequestInitializer(Constants.API_KEY))
            .build();
}
