Azure timer trigger function using Python

I am writing an Azure timer trigger using Python 3.x, and I already have one such function running. I thought I knew how to do it: create a function from the JS template, delete index.js, and create a run.py. But this time, when I run my Python function, I always get an error saying "No such file: index.js". I don't see any connection between the function and the index.js file.
Any thoughts?

You can add a Python function from the Azure portal directly. If you want to create a timer trigger function, you can then change the trigger type.
The following are my detailed steps to create a Python timer trigger function.
1. Create an Azure Function App.
2. Add a Python function.
3. Change the HTTP trigger to a timer trigger (see the function.json sketch after these steps):
a. Delete the HTTP trigger and the HTTP output.
b. Add the timer trigger.
4. Add the test code and test it from the Azure portal.
The default Python version is 2.7.8. If you want to use Python 3.x, you can follow this tutorial to update the Python version.
5. Update the Python version:
a. Install the extension for the Azure Function App.
b. Add a Handler Mappings entry so as to use Python 3.x via FastCGI.
6. Test it from the Azure portal.
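For reference, the timer trigger binding in function.json might look like the sketch below (the binding name myTimer and the every-five-minutes NCRONTAB schedule are placeholders):

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    }
  ],
  "disabled": false
}
```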

I followed the tutorial in the comment and reproduced your issue on my side, even though I refreshed the portal.
However, after waiting for some time, it worked. I suspect it is due to caching.
I suggest creating the Python Azure Function in Kudu directly: just create run.py and function.json in a new folder instead of changing the JS template.
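For the classic (v1, experimental) Python worker, a minimal run.py could be as simple as the sketch below; anything printed goes to the function's log stream:

```python
# run.py - minimal body for a timer-triggered function on the classic Python worker
import datetime

print("Python timer trigger fired at", datetime.datetime.utcnow().isoformat())
```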
Hope it helps you.

In my case, run.py was recognized and ran after I restarted Azure Functions from the portal:
Azure Functions > Overview > Restart


Azure function doesn't recognize the trigger on deployment, but it works locally

My Azure function works correctly locally, but when I deploy it to Azure, it returns this message:
Deployment successful. deployer = ms-azuretools-vscode deploymentPath = Functions App ZipDeploy. Extract zip. Remote build.
Syncing triggers...
Querying triggers...
No HTTP triggers found.
Uploading settings...
Added the following settings:
- AzureWebJobsFeatureFlags
- FUNCTIONS_WORKER_PROCESS_COUNT
11:43:54 AM: Ignored the following settings that were already the same:
- FUNCTIONS_WORKER_RUNTIME
11:43:54 AM: WARNING: This operation will not delete any settings in "vietnam-trademark-scraper-dev-test". You must manually delete settings if desired.
11:43:54 AM: Excluded the following settings:
- AzureWebJobsStorage
11:43:54 AM: Error: Error "appSettings.properties with value "1" must be of type string." occurred in serializing the payload - "StringDictionary".
Locally, I can see all the triggers working.
I use Python v2 model Azure Functions triggers, and I deploy with a dedicated plan.
I have tried searching for this problem on Google and have no idea how to fix it. Can someone explain this problem and suggest some solutions? Thanks
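(For context: in the v2 programming model, triggers are declared with decorators in code rather than in per-function function.json files, so the host must index the Python worker to discover them. A minimal sketch, with placeholder names:)

```python
# function_app.py - Python v2 programming model: triggers are declared in code
import azure.functions as func

app = func.FunctionApp()

@app.route(route="hello", auth_level=func.AuthLevel.ANONYMOUS)
def hello(req: func.HttpRequest) -> func.HttpResponse:
    # The host only discovers this trigger by indexing the worker, which is
    # what the AzureWebJobsFeatureFlags=EnableWorkerIndexing setting enables.
    return func.HttpResponse("Hello from the v2 model")
```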
Glad that it worked for you by adding the setting AzureWebJobsFeatureFlags:EnableWorkerIndexing, as shown in one of my workarounds.
I added this setting and redeployed, and it works. But I don't know why it works. Can you explain it?
It is because Microsoft explicitly mentions adding that application setting to run the Python v2 programming model in Azure.
The values added under AzureWebJobsFeatureFlags are not production-ready; they are experimental until they are fully released, as defined in the MS doc on Azure Functions app settings.
Also, in the v2 programming model the Python environment does not yet support more than one worker (FUNCTIONS_WORKER_PROCESS_COUNT greater than 1), so this setting needs to be added as a feature flag for indexing to work.
Refer to the GitHub article on the Azure Functions host's worker indexing changes for the Python environment for more information.
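Note also the serialization error in your deploy log: app setting values must be strings, so a numeric-looking value such as 1 should be quoted. In local.settings.json that would look something like this sketch (values are illustrative):

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
    "FUNCTIONS_WORKER_PROCESS_COUNT": "1"
  }
}
```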

How to run a Python Azure Function with a private Azure Artifacts dependency

I am trying to deploy a Python Azure Function into an Azure Function App. The function's __init__.py script imports an SDK which is stored as an Azure Artifacts Python package. I can build and publish the function to Azure successfully using a pipeline from the DevOps repo; however, the function fails at the import mySDK line when I run it.
I assume the issue is that, because the function is serverless, the SDK needs to be pip installed again when the function is called. How do I do this?
I have tried adding a PIP_EXTRA_INDEX_URL for the artifact feed in the Function App, with no success.
PIP_EXTRA_INDEX_URL worked for me.
What was the error you received when you tried it?
Basically, before you deploy your function, you should alter the application settings on your Function App and add the PIP_EXTRA_INDEX_URL key-value pair. Then add the Python package from your Azure Artifacts feed to the requirements.txt file in your Function App code.
There is a good guide here: EasyOps - How to connect to an Azure Artifacts feed from a Function App
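As a sketch, the setting usually points at the feed's PyPI-compatible endpoint, authenticated with a personal access token (the username, organization, project, feed name, and PAT below are placeholders):

```
PIP_EXTRA_INDEX_URL = https://<username>:<PAT>@pkgs.dev.azure.com/<organization>/<project>/_packaging/<feed>/pypi/simple/
```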

Firebase Cloud Functions running a python script - needs dependencies

I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data from NodeJS to my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
-Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the python script. This almost works. I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this has an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the Python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
the logs in the Firebase console show that the python script is missing dependencies.
That's because the Node.js runtime targeted by the Firebase CLI doesn't have everything you need to run Python programs.
If you need to run a function that's primarily written in Python, you should not use the Firebase CLI; instead, use the Google Cloud tools to target the Python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it will be the right way to run Python in Cloud Functions.
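On the Python runtime, dependencies go in a requirements.txt next to main.py and are installed at deploy time, so nothing has to be vendored by hand. A minimal sketch (the function name and the numpy dependency are illustrative):

```python
# main.py - minimal HTTP-triggered Cloud Function on the Python runtime.
# numpy comes from requirements.txt and is installed at deploy time.
import numpy as np

def run_algorithm(request):
    # request is a flask.Request; a returned dict is serialized as JSON
    result = int(np.arange(5).sum())
    return {"result": result}
```

It could then be deployed with something like gcloud functions deploy run_algorithm --runtime python39 --trigger-http, with a requirements.txt containing the single line numpy.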

Python Lambda function not published after saving (deploying) in AWS

After editing a Lambda function inline with Python 3.8 and then pressing "save", I get the message "successfully deployed package". Normally the updated function should be triggered after calling the API again, but the updated code seems either not to be deployed, or for whatever reason the new code is not called. I don't know what happens there; it says "successfully deployed package".
Does it take a while for AWS to really make the new function available to the public?
Thanks for any reply.
EDIT:
I found out that my browser had the old version in its cache, I think. But this is a problem: why do I have to clear the cache to test a new version of the API? It got published correctly, but my browser used the old version. How can I prevent that?
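One common fix (a sketch, not necessarily this exact setup) is to have the Lambda return a Cache-Control header so the browser refetches the API response instead of serving a cached copy:

```python
# lambda_function.py - return Cache-Control: no-store so browsers
# always refetch the response instead of reusing a cached one
import json

def lambda_handler(event, context):
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            "Cache-Control": "no-store",
        },
        "body": json.dumps({"message": "fresh response"}),
    }
```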

Run a Python script inside Azure Data Factory that calls APIs using MSI

We have a REST API hosted on Azure as a web application which produces JSON output when invoked.
Inside Data Factory we need to run a Databricks activity that contains Python code. Currently we store the credentials inside the script; the Python script calls the URL/web app with the credentials and we do the magic.
But we don't want to store the credentials, and we are thinking of using MSI. Is it possible for the Python script to retrieve credentials via MSI and call the API?
I thought of having a web activity before the Databricks activity with MSI and passing it as an input, but I am not sure if that is a good idea.
Does anyone know how to pass an MSI credential to Python to call a web app inside Azure?
I am referring to this link:
https://learn.microsoft.com/en-us/python/azure/python-sdk-azure-authenticate?view=azure-python
but I am not sure what I need to get the credentials: the resource ID? An application ID?
I would appreciate it if someone has a small script/example to share. =)
Thanks, guys.
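(For illustration, a minimal sketch of the token flow using the newer azure-identity package; the App ID URI and web app URL are placeholders you would replace with your own:)

```python
# Sketch: call a web app from Python using the managed identity (MSI)
# of the host, e.g. a VM or cluster with a managed identity enabled.
from azure.identity import ManagedIdentityCredential
import requests

# Acquire a token for the target API; the scope is the API's
# App ID URI (placeholder) with the ".default" suffix.
credential = ManagedIdentityCredential()
token = credential.get_token("api://<your-app-id-uri>/.default")

resp = requests.get(
    "https://<your-webapp>.azurewebsites.net/api/endpoint",
    headers={"Authorization": f"Bearer {token.token}"},
)
print(resp.status_code, resp.text)
```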
