I want to test my Azure Function using the portal's Run/Test feature, but it is throwing a '500 internal server error'.
I am able to debug the same code in my local environment, but when I trigger it from the Azure portal it fails without any useful error logs.
This Azure Function reads JSON data from an Event Hub and writes it to blob storage. I am using Python for the function development.
Here is the code:
__init__.py
from typing import List
import logging
import os
import azure.functions as func
from azure.storage.blob import BlobClient
import datetime
import json

storage_connection_string = os.getenv('storage_connection_string_FromKeyVault')
container_name = os.getenv('storage_container_name_FromKeyVault')
today = datetime.datetime.today()

def main(events: List[func.EventHubEvent]):
    for event in events:
        a = event.get_body().decode('utf-8')
        json.loads(a)  # parse the body to confirm it is valid JSON
        logging.info('Python EventHub trigger processed an event: %s', a)
        logging.info(f' SequenceNumber = {event.sequence_number}')
        logging.info(f' Offset = {event.offset}')
        blob_name = str(today.year) + "/" + str(today.month) + "/" + str(today.day) + "/" + str(event.sequence_number) + ".json"
        blob_client = BlobClient.from_connection_string(storage_connection_string, container_name, blob_name)
        blob_client.upload_blob(event.get_body().decode(), blob_type="AppendBlob")
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<Endpoint1>",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "storage_connection_string_FromKeyVault": "<connectionString>",
    "storage_container_name_FromKeyVault": "<container_name>",
    "EventHubReceiverPolicy_FromKeyVault": "<Endpoint2>"
  }
}
function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "type": "eventHubTrigger",
      "name": "events",
      "direction": "in",
      "eventHubName": "pwo-events",
      "connection": "EventHubReceiverPolicy_FromKeyVault",
      "cardinality": "many",
      "consumerGroup": "$Default",
      "dataType": "binary"
    }
  ]
}
Please note that this error occurs only when I click Run/Test on the portal; the same code runs fine after deployment.
The 500 error alone is not enough to solve this problem; you need to find the specific error of the Azure Function. You can use Application Insights to get the detailed error, but the function app must be configured with a corresponding Application Insights instance before you can view the logs on the portal.
So you need to configure an Application Insights instance for your function app. Your function app will then restart.
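With Application Insights in place, it also helps to surface the real exception yourself. A minimal sketch based on the question's code (the try/except is my addition, not part of the original):

import json
import logging
from typing import List

import azure.functions as func

def main(events: List[func.EventHubEvent]):
    for event in events:
        try:
            body = event.get_body().decode('utf-8')
            json.loads(body)  # validate the payload before writing it out
            # ... blob upload as in the question ...
        except Exception:
            # logging.exception records the full stack trace, which
            # Application Insights surfaces in the traces/exceptions tables
            logging.exception('Failed to process event %s', event.sequence_number)
            raise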
Of course, you can also go to Kudu to view the logs:
First, go to Advanced Tools, then click 'Go'.
After you get to Kudu, click Debug Console -> CMD -> LogFiles -> Application -> Functions -> yourtriggername. You will find the log files there.
If you are on Linux, after going to Kudu just click 'Log stream' (this is not supported on the Linux Consumption plan).
I had this problem and found that it was caused by dependencies. Removing non-existent libraries (or following Microsoft's dependency-management guidance) solved the issue.
Note that adding third-party dependencies through the Azure portal is currently not supported for Linux Consumption function apps.
If you need dependencies, you can refer to this Microsoft document to solve the problem.
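For example, a sketch based on the imports in the question: requirements.txt in the function project should list only packages that actually exist on PyPI,

azure-functions
azure-storage-blob

and publishing with a remote build then installs them on the Linux host:

func azure functionapp publish <APP_NAME> --build remote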
Related
I am working on an Azure Function written in Python that should be called on a blob trigger event. However, the trigger is not firing when I upload a zip file to the blob container that the Azure Function is supposed to monitor.
Following is the local.settings.json file -
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "blob (connection string) that was created when Azure function is created",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "My_STORAGE": "blob (connection string) that function should monitor"
  }
}
Following is the function.json file -
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "mycontainer/{name}",
      "connection": "My_STORAGE"
    }
  ]
}
Following is my code in __init__.py (test_func is a user-defined function that does some business logic) -
import logging
import azure.functions as func

def main(myblob: func.InputStream):
    test_func(myblob.name)
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
When I upload a zip file to the "mycontainer" container, the Azure Function does not fire.
The storage account holding "mycontainer" is of the StorageV2 (general purpose v2) account kind, and I am using Python 3.8.
The storage account has automatically created a container named $logs that has day-wise folders with log files mentioning the file I upload to "mycontainer"; however, there is no sign of a blob trigger event on the Azure Function.
"My_STORAGE" is added as an Application Setting in the Azure Function's Configuration, and I publish the local settings after deploying the Azure Function.
Any idea what is going wrong?
Thank you.
I've had a similar problem. It was solved by adding the connection string as a key-value pair to the Application Settings under Function App -> Configuration -> Settings. I assume this is necessary whenever the function app is deployed against a different storage account, which needs its own connection string.
The problem was caused by a mistake in the connection string: the AccountName should be the storage account name, not the name of the container.
So just change AccountName=mycontainer to AccountName=<storage account name>, then it works.
And by the way:
The connection string should be: DefaultEndpointsProtocol=https;AccountName=<storage account name>;AccountKey=xxxxxxxxxx==;EndpointSuffix=core.windows.net
The "path" in "function.json" should be: "path": "<container name>/{name}"
Copy the <storage_account>_STORAGE key from the local.settings.json file to a new application setting key-value pair:
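For example, with the Azure CLI (the app and resource group names are placeholders; the setting name matches the question's My_STORAGE):

az functionapp config appsettings set --name <function app name> --resource-group <resource group> --settings "My_STORAGE=<connection string>"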
I am trying to inject Environment Variables into an Azure "App Services" Flask resource.
Note: I am aware that some use files to set up environment variables. I may look into that in the future, but for now I'm trying to do this without managing files.
Per the manual, I added the environment variables as "app settings," on the portal page.
And I can see that they have been set correctly with the Azure CLI command:
az webapp config appsettings list --name <redacted> --resource-group <redacted>
which outputs:
{
  "name": "DB.DATABASE",
  "slotSetting": false,
  "value": "<redacted>"
},
{
  "name": "DB.DRIVER",
  "slotSetting": false,
  "value": "{SQL Server Native Client 11.0}"
},
...
My Python code references the variables, which works locally.
from os import environ
driver = environ.get('DB.DRIVER')
server = environ.get('DB.SERVER')
user_id = environ.get('DB.USER_ID')
password = environ.get('DB.PASSWORD')
database = environ.get('DB.DATABASE')
trusted_connection = environ.get('DB.TRUSTED_CONNECTION')
print(f'driver: {driver}')
print(f'server: {server}')
print(f'user_id: {user_id}')
and the output, in the Azure log stream is:
2020-10-05T17:08:01.172742838Z driver: None
2020-10-05T17:08:01.172767338Z server: None
2020-10-05T17:08:01.172772538Z user_id: None
What, please, am I missing from this procedure? It seemed so simple, but it just fails with no error message.
I was trying to use the debugging feature for Lambda (Python) in Visual Studio Code. I was following the instructions in the AWS docs, but I could not trigger the Python application in debug mode.
Please kindly see if you know the issue and whether I have set up anything incorrectly, thanks.
Reference:
https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-using-debugging.html
Observation
Start application
It seems the application was not started on the specified debug port?
Request call
The endpoint could not be reached and the Python application was never entered.
If accessed through port 3000, the application completes successfully.
Setup performed
Initialize the project and install ptvsd as instructed
Enable ptvsd in the Python code
Add the launch configuration
Project structure
Python source
This is basically just the official hello-world sample for Python:
import json
# import requests
import ptvsd

# Enable ptvsd on 0.0.0.0 address and on port 5890 that we'll connect later with our IDE
ptvsd.enable_attach(address=('localhost', 5890), redirect_output=True)
ptvsd.wait_for_attach()

def lambda_handler(event, context):
    """Sample pure Lambda function

    Parameters
    ----------
    event: dict, required
        API Gateway Lambda Proxy Input Format
        Event doc: https://docs.aws.amazon.com/apigateway/latest/developerguide/set-up-lambda-proxy-integrations.html#api-gateway-simple-proxy-for-lambda-input-format
    context: object, required
        Lambda Context runtime methods and attributes
        Context doc: https://docs.aws.amazon.com/lambda/latest/dg/python-context-object.html

    Returns
    ------
    API Gateway Lambda Proxy Output Format: dict
    Return doc: https://docs.aws.amazon.com/apigateway/latest/developerguide/set-up-lambda-proxy-integrations.html
    """

    # try:
    #     ip = requests.get("http://checkip.amazonaws.com/")
    # except requests.RequestException as e:
    #     # Send some context about this error to Lambda Logs
    #     print(e)
    #     raise e

    return {
        "statusCode": 200,
        "body": json.dumps({
            "message": "hello world",
            # "location": ip.text.replace("\n", "")
        }),
    }
Launch configuration
launch.json
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: Current File",
            "type": "python",
            "request": "launch",
            "program": "${file}",
            "console": "integratedTerminal"
        },
        {
            "name": "SAM CLI Python Hello World",
            "type": "python",
            "request": "attach",
            "port": 5890,
            "host": "localhost",
            "pathMappings": [
                {
                    "localRoot": "${workspaceFolder}/hello_world/build",
                    "remoteRoot": "/var/task"
                }
            ]
        }
    ]
}
It seems I was editing the Python file at "python-debugging/hello_world/build", following the guideline in the doc (there is a step in the doc which asks you to copy the Python file to "python-debugging/hello_world/build").
But when you run "sam local start-api", it actually runs the Python file at the location specified by the CloudFormation template (template.yaml), which is at "python-debugging/hello_world" (check the "CodeUri" property).
When I moved all the libraries into the same folder as the Python file, it worked.
So make sure which Python (or Lambda) script you are actually running, and ensure the libraries sit in the same folder as that script (if you are not using layers); see the sketch below.
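A minimal sketch of putting the dependencies next to the handler (paths assumed from the question's layout; adjust them to your template's CodeUri):

# install the sample's requirements plus ptvsd into the folder
# that "sam local start-api" actually executes
pip install -r hello_world/requirements.txt -t hello_world/
pip install ptvsd -t hello_world/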
Folder structure
Entering debug mode in Visual Studio Code
Step 1: Invoke and start up the local API gateway
Server
Step 2: Send a test request
Client
Step 3: Request received, Lambda triggered, waiting to activate debug mode in Visual Studio Code
Server
Step 4: Lambda function triggered, entering debug mode in Visual Studio Code
In the IDE, open the "Run" view, select the launch config for this file ("SAM CLI Python Hello World"), and start debugging.
Step 5: Step through the function, return response
Server
Client
I have created a simple blob trigger in Visual Studio, for which __init__.py is as below
import logging
import azure.functions as func

def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
and function.json is as below
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "mycontainer/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
local.settings.json looks as below
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https; AccountName=****;AccountKey=*****;EndpointSuffix=core.windows.net"
  }
}
This code works fine with Visual Studio on my local machine. But when published to the Azure portal, it cannot read the blob path from function.json and gives this error:
Invalid blob path specified : ''. Blob identifiers must be in the format 'container/blob'.
I have published the function using the following command to push the contents of local.settings.json:
func azure functionapp publish FUNCTIONNAME --build-native-deps --publish-local-settings -i
Can anyone please guide me on what I am missing after publishing?
Are you using the Run button in the Azure portal to test your function? The way this works for blob triggers is that in the 'Test' tab on the right-hand side, you can specify the name of the blob you want to manually send a trigger event for, forcing your function to run:
The idea is that you should edit the contents of the request body box and put in the path to a valid blob in your account. That way the trigger runs, finds the blob, and retrieves it. If you don't modify the request body box, it will look for a blob that doesn't exist and throw the 404 error.
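For example (a sketch assuming the question's container and an existing blob named test.zip), the request body would simply be the blob path:

mycontainer/test.zip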
Also, please take a look at the document below for configuring the container name:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#storage-blob-trigger
Also, please verify whether your settings have been published to the portal:
func azure functionapp publish "functionname" --publish-local-settings
Hope it helps.
I am using an Ubuntu 16.04.5 LTS local machine to create and publish a Python function app to Azure using the CLI and Azure Functions Core Tools (Ref). I have configured a blob trigger and my function.json file looks like this:
{
  "disabled": false,
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "<Blob Trigger Name>",
      "type": "blobTrigger",
      "direction": "in",
      "path": "<Blob Container Name>/{name}",
      "connection": "<Connection String having storage account and key>"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "path": "<Blob Container Name>",
      "connection": "<Connection String having storage account and key>",
      "direction": "out"
    }
  ]
}
My __init__.py function looks like this:
import logging
import azure.functions as func
from azure.storage.blob import BlockBlobService, PublicAccess

def main(<Blob Trigger Name>: func.InputStream, doc: func.Out[func.Document]):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {<Blob Trigger Name>.name}\n"
                 f"Blob Size: {<Blob Trigger Name>.length} bytes")
    logging.basicConfig(filename='example.log', level=logging.DEBUG)
    logging.debug('This message should go to the log file')
    logging.info('So should this')
    logging.warning('And this, too')
    # Write text to the file.
    file = open("QuickStart.txt", 'w')
    file.write("Hello, World!")
    file.close()
    # Create the BlockBlobService that is used to call the Blob service for the storage account
    block_blob_service = BlockBlobService(account_name='<Storage Account Name>', account_key='<Storage Account Key>')
    container_name = '<Blob Container Name>'
    # Set the permission so the blobs are public.
    block_blob_service.set_container_acl(container_name, public_access=PublicAccess.Container)
    # Upload the created file, use local_file_name for the blob name
    block_blob_service.create_blob_from_path(container_name, 'QuickStart.txt', '')
The function app is set to "Always On", but when I upload a blob to the storage account the function does not get triggered. Another reference link is this (Ref).
What's going wrong?
Thanks and regards,
Shashank
Have you checked that the values from local.settings.json (the connection strings for the storage accounts) are also present in the function app in Azure? They are not published from the local machine by default.
You can configure them manually in the portal or use the --publish-local-settings flag:
func azure functionapp publish "functionname" --publish-local-settings
I tried to reproduce this issue by creating a sample function app in Python using Visual Studio Code with the default template, and finally deployed it to Linux. It worked for me.
Here is the piece of code I have written in the Python file:
import logging
import azure.functions as func

def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
and here is the function.json file from my function app.
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "samples-workitems/{name}",
      "connection": ""
    }
  ]
}
I am using Azure Functions 2.0, Python 3.6, and Azure Functions Core Tools version 2.2.70.
This is the reference link I used:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-first-function-python
Please try this and see if it helps.
In the main def of your Python script, the second argument is doc: func.Out[func.Document], which is for Cosmos DB. Since the binding is of type blob, this should be a blob output stream instead.
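A minimal sketch of the corrected signature (parameter names are illustrative; outputblob matches the output binding name in the question's function.json, and func.Out[str] is one possible payload type):

import logging
import azure.functions as func

def main(myblob: func.InputStream, outputblob: func.Out[str]):
    logging.info(f"Python blob trigger function processed blob Name: {myblob.name}")
    # write a string payload through the blob output binding
    outputblob.set("Hello, World!")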