Firebase Cloud Functions running a Python script - needs dependencies

I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data from NodeJS to my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
- Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the Python script. This almost works. I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the Python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.

the logs in the Firebase console show that the python script is missing dependencies.
That's because the Node.js runtime targeted by the Firebase CLI doesn't have everything you need to run Python programs.
If you need to run a function that's primarily written in Python, you should not use the Firebase CLI and should instead use the Google Cloud tools to target the Python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it is the right way to run Python in Cloud Functions.
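For example, a minimal HTTP-triggered function on the Python runtime could look like the sketch below (run_algorithm and its numpy usage are placeholders for your own code; dependencies are declared in a requirements.txt next to main.py and installed automatically on deploy):

    # main.py -- deployed with something like:
    #   gcloud functions deploy run_algorithm --runtime python39 --trigger-http
    import json

    import numpy as np  # declared in requirements.txt, installed on deploy

    def run_algorithm(request):
        """HTTP entry point; `request` is a Flask request object."""
        data = request.get_json(silent=True) or {}
        values = np.asarray(data.get("values", []), dtype=float)
        body = json.dumps({"result": float(values.sum())})
        return body, 200, {"Content-Type": "application/json"}

Your React client would then call this endpoint over HTTPS instead of going through a Firebase callable function.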

Related

How do I create a standalone Jupyter Lab server using Pyinstaller or similar?

I would like to create a self-contained, .exe file that launches a JupyterLab server as an IDE on a physical server which doesn't have Python installed itself.
The idea is to deploy it as part of an ETL workflow tool, so that it can be used to view notebooks that will contain the ETL steps in a relatively easily digestible format (the notebooks will be used as pipelines via papermill and scrapbook - not really relevant here).
While I can use Pyinstaller to bundle JupyterLab as a package, there isn't a way to launch it on the Pythonless server (that I can see), and I can't figure out a way to do it using Python code alone.
Is it possible to package JupyterLab this way so that I can run the .exe on the server and then connect to 127.0.0.1:8888 on the server to view a notebook?
I have tried using the link below as a starting point, but I think I'm missing something, as no server seems to start using this code alone, and I'm not sure how I would execute this via a Tornado server etc.:
https://gist.github.com/bollwyvl/bd56b58ba0a078534272043327c52bd1
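For reference, the kind of launcher I'm attempting, distilled from that gist, looks something like this (the flags are my own guesses):

    # launch_lab.py -- attempt at launching JupyterLab programmatically
    # (assumes JupyterLab is importable; flags are my own guesses)
    from jupyterlab.labapp import LabApp

    if __name__ == "__main__":
        # Equivalent to: jupyter lab --ip=127.0.0.1 --port=8888 --no-browser
        LabApp.launch_instance(
            argv=["--ip=127.0.0.1", "--port=8888", "--no-browser"]
        )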
I would really appreciate any ideas, help, or somebody to tell me why this idea is impossible madness!
Thanks!
Phil.
P.S. I should add that Docker isn't an option here :( I've done this before using Docker and it's extremely easy.

How to limit python script so that it can't access local resources?

I am working on a project that allows users to upload a Python script to an API and run it on a schedule. Currently, I'm trying to figure out a way to limit the functionality of the script so that it cannot access local files, mess with the Flask server running the API, etc. Do you have any ideas on how I can achieve this? Is there any way to make it so that only specific libraries are available for importing?
Running other people's scripts on your server is a serious security issue. If you are trying to deploy a Python interpreter in your web application, you can try something like judge0 - GitHub. It is free if you deploy it yourself, and it will run scripts safely inside containers.
The simplest way is to ensure the user running the script is not root, but a user created specifically for this task (e.g. part of a group that can only read, not write or execute). This means at minimum you should ensure all files have the appropriate mode. Then you can run the script in a subprocess as that user, as sketched below.
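For instance, with Python 3.9+ on Linux you can drop privileges per call via subprocess (a sketch; the "sandbox" account is hypothetical, and the parent process needs permission to switch users):

    import subprocess

    # Run the uploaded script as an unprivileged user; assumes an account
    # named "sandbox" exists and the parent is allowed to switch to it.
    result = subprocess.run(
        ["python3", "/srv/jobs/user_script.py"],
        user="sandbox",            # drop privileges before exec (Python 3.9+)
        capture_output=True,
        text=True,
        timeout=30,                # kill runaway scripts
        cwd="/tmp/empty-workdir",  # nothing sensitive to read here
    )
    print(result.stdout)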
Alternatively, you could use a runtime that's not "local", like a VM or a compute service (AWS Lambda, etc.). The latter would be simplest, and there are plenty of vendors who offer compute services with a programmatic API.

How do I include dependencies for embedded console apps when using Run From Package

I'm deploying my Azure Function app using a CI/CD pipeline in Azure DevOps. The function invokes three console applications that are included in the package. One of the console applications is a standalone .exe; it works without issue. The other two have dependencies on a number of DLLs that are also included in the package. This setup works well on my local machine, and when deployed using WebDeploy.
When instead deploying using Run From Package to a freshly created Function App Service, the function app itself loads fine, as does the standalone .exe console app, but both console apps that have DLL dependencies fail to run, and both return exit code 0xC0000135 to my function app (indicating that a DLL failed to load).
Now, if I deploy once using WebDeploy and then deploy again using Run From Package, I get the latest build installed - and the console apps now work (!). I think this might be due to the .exe not being able to access the virtual file system when loading the DLLs; is this correct?
I could stick with WebDeploy, but I really want to use the package deploy since the cold start time is much faster during scale-out (I will need 100+ instances in production). I am also concerned that, deploying this way, the app actually needs to copy both the zip package and the site structure under wwwroot, causing additional overhead.
What is the best way to include dependencies such as DLLs in a package when using Run From Package with Azure Functions?
(The function app is v3, built using .NET Core 3.1)

How can I call a python script from an asp.net Azure app service?

I have an existing .NET Core / asp.net app service hosted on Azure. I need to call (on demand) a python script to return data based on custom user input.
It does not appear that I can use IronPython, since I need Python modules that are built for CPython, which unfortunately aren't supported by IronPython.
The two options I see are:
I might be able to install the right python version and libraries on the app service and call it from the .NET code. This seems like it might be deprecated: https://learn.microsoft.com/en-us/visualstudio/python/publishing-python-web-applications-to-azure-from-visual-studio?view=vs-2017
I can create a whole new and separate app service for just the Python script and call it as a REST API on demand from the .NET app service. This seems like overkill, and it introduces the problem of opening up a whole new service publicly, which I don't want to do. It also has the limitation that Flask's built-in server isn't meant for production, so hosting many calls at once is not really workable. https://learn.microsoft.com/en-us/visualstudio/python/publish-to-app-service-windows?view=vs-2017
What is the best way to call a python script on demand from .NET app service on Azure?
In my experience, there are two ways to call a Python script from C# without IronPython.
Directly use System.Diagnostics.Process in C# to run a command, as in the SO thread Run Command Prompt Commands, and get the result by parsing the content of the process standard output. To simplify this, you can use py2exe to wrap a Python script as a .exe file, which avoids installing Python modules and setting environment variables on Azure App Service. However, considering concurrency, this is not a good idea performance-wise.
The second option, as you said, is to deploy a Python script as a REST API in the same instance of Azure Web App. You can follow the blog Deploying multiple virtual directories to a single Azure Website to deploy a Flask app with your Python script as a child project via Visual Studio with PTVS, exposing an API URL like https://<your web app name>.azurewebsites.net/pyapi which can be called from your ASP.NET app via HttpClient. I tried this solution and it works.
Note: Due to the restriction of the Azure Web App sandbox for Local Address Requests, you have to use <your web app name>.azurewebsites.net as the hostname, not localhost or 127.0.0.1.
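The Flask side of that child project can be as small as the sketch below (my_script and run_algorithm are hypothetical stand-ins for your actual code):

    # app.py -- minimal Flask wrapper exposing the script at /pyapi
    from flask import Flask, jsonify, request

    from my_script import run_algorithm  # hypothetical module with your logic

    app = Flask(__name__)

    @app.route("/pyapi", methods=["POST"])
    def pyapi():
        payload = request.get_json(force=True)
        return jsonify(run_algorithm(payload))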

How to boot up a test pubsub emulator from python for automated testing

I'm working on a Flask API where one of the endpoints receives a message and publishes it to PubSub. Currently, in order to test that endpoint, I have to manually spin up a PubSub emulator from the command line and keep it running during the test. It works just fine, but it isn't ideal for automated testing.
I wonder if anyone knows a way to spin up a test PubSub emulator from Python? Or does anyone have a better solution for testing such an API?
As far as I know, there is no native Python Google Cloud PubSub emulator available.
You have a few options, all of which require launching an external program from Python:
Just invoke the gcloud command you mentioned, gcloud beta emulators pubsub start [options], directly from your Python application to start it as an external program (see the sketch after this list).
The PubSub emulator which comes as part of Cloud SDK is a JAR file bootstrapped by the bash script present in CLOUD_SDK_INSTALL_DIR/platform/pubsub-emulator/bin/cloud-pubsub-emulator. You could possibly run this bash script directly.
Here is a StackOverflow answer which covers multiple ways to launch an external program from Python.
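For instance, a session-scoped pytest fixture along these lines can start and stop the emulator around a test run (a sketch; it assumes the gcloud SDK is on PATH and that port 8085 is free):

    import os
    import subprocess
    import time

    import pytest

    @pytest.fixture(scope="session")
    def pubsub_emulator():
        # Start the emulator as an external process (option 1 above).
        proc = subprocess.Popen(
            ["gcloud", "beta", "emulators", "pubsub", "start",
             "--host-port=localhost:8085"],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        # Point the client libraries at the emulator.
        os.environ["PUBSUB_EMULATOR_HOST"] = "localhost:8085"
        time.sleep(5)  # crude readiness wait; poll the port in real code
        yield
        # terminate() stops gcloud; on some platforms the underlying JVM
        # may need the whole process group killed instead.
        proc.terminate()
        proc.wait()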
Also, it is not quite clear from your question how you're calling the PubSub APIs in Python.
For unit tests, you could consider setting up a wrapper over the code which actually invokes the Cloud PubSub APIs, and injecting a fake for this API wrapper. This way, you can test the rest of the code, which invokes just your fake API wrapper instead of the real one, without worrying about starting any external programs (sketched below).
For integration tests, the PubSub emulator will definitely be useful.
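A minimal sketch of that wrapper/fake pattern (all class and method names here are illustrative, not from any library):

    class PubSubPublisher:
        """Thin wrapper around the real Cloud PubSub publisher client."""

        def __init__(self, publisher, topic_path):
            self._publisher = publisher
            self._topic_path = topic_path

        def publish(self, data: bytes):
            return self._publisher.publish(self._topic_path, data=data)


    class FakePublisher:
        """Drop-in fake for unit tests; records messages instead of sending."""

        def __init__(self):
            self.messages = []

        def publish(self, data: bytes):
            self.messages.append(data)

    # The Flask endpoint depends on whichever publisher was injected, so
    # unit tests pass a FakePublisher and assert on fake.messages afterwards.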
This is how I usually do it:
1. I create a Python client class which publishes and subscribes with the topic, project, and subscription used in the emulator (see the sketch after these steps).
Note: You need to set PUBSUB_EMULATOR_HOST=localhost:8085 as an env var in your Python project.
2. I spin up a pubsub-emulator as a Docker container.
Note: You need to set some env vars, mount volumes, and expose port 8085.
Set the following env vars for the container:
PUBSUB_EMULATOR_HOST
PUBSUB_PROJECT_ID
PUBSUB_TOPIC_ID
PUBSUB_SUBSCRIPTION_ID
3. Write whatever integration tests you want. Use the publisher or subscriber from the client depending on your test requirements.
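A sketch of such a client class (it assumes google-cloud-pubsub is installed, PUBSUB_EMULATOR_HOST is set before the clients are created, and the topic and subscription already exist in the emulator):

    from google.cloud import pubsub_v1

    class EmulatorPubSubClient:
        def __init__(self, project_id, topic_id, subscription_id):
            # With PUBSUB_EMULATOR_HOST set, the clients talk to the
            # emulator and need no real credentials.
            self.publisher = pubsub_v1.PublisherClient()
            self.subscriber = pubsub_v1.SubscriberClient()
            self.topic_path = self.publisher.topic_path(project_id, topic_id)
            self.subscription_path = self.subscriber.subscription_path(
                project_id, subscription_id
            )

        def publish(self, data: bytes):
            # Blocks until the emulator assigns a message ID.
            return self.publisher.publish(self.topic_path, data=data).result()

        def pull(self, max_messages=10):
            # Note: messages are not acked in this sketch.
            response = self.subscriber.pull(
                subscription=self.subscription_path, max_messages=max_messages
            )
            return [m.message.data for m in response.received_messages]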
