Is there a way to read a batch of messages in my Python Azure Function (using the serviceBusTrigger)?
The question is fairly simple. I have tried many things, but it seems that after reading the messages coming from Azure Service Bus, the Azure Functions framework calls the trigger function one message at a time, no matter how large the prefetch value is or which other parameters are set.
I would like to handle the entire fetched batch of messages, because when you have to handle millions of messages, processing one at a time versus many at a time makes a difference.
This is my configuration:
function.json
{
"scriptFile": "main.py",
"entryPoint": "function_handler",
"bindings": [{
"name": "msg",
"type": "serviceBusTrigger",
"direction": "in",
"topicName": "topic-name",
"subscriptionName": "topic-subscription-name"
}]
}
main.py
import azure.functions as func
import json
def function_handler(msg: func.ServiceBusMessage):
    print(str(msg))
host.json
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
},
"extensions": {
"serviceBus": {
"prefetchCount": 100,
"messageHandlerOptions": {
"autoComplete": true,
"maxConcurrentCalls": 6,
"maxAutoRenewDuration": "00:05:00"
}
}
}
}
I think you may need an async Service Bus trigger:
__init__.py
import logging
import asyncio
import azure.functions as func
async def hello():
    await asyncio.sleep(2)

async def main(msg: func.ServiceBusMessage):
    logging.info('Python ServiceBus queue trigger processed message: %s',
                 msg.get_body().decode('utf-8'))
    await asyncio.gather(hello(), hello())
    logging.info(msg.get_body().decode('utf-8') + " has been done.")
host.json
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
}
},
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
},
"extensions": {"serviceBus": {"messageHandlerOptions": {"maxConcurrentCalls": 5}}}
}
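The gather call above is where the benefit comes from: each invocation still receives a single message, but inside the handler you can fan out I/O-bound work concurrently. A minimal sketch of that fan-out pattern using only the standard library (the item names and the sleep standing in for real I/O are illustrative):

```python
import asyncio

async def process(item: str) -> str:
    # Simulate an I/O-bound step (e.g. an outbound HTTP call).
    await asyncio.sleep(0.01)
    return item.upper()

async def handle_batch(items: list[str]) -> list[str]:
    # Fan out one task per item and await all of them together;
    # gather preserves the input order in its result list.
    return await asyncio.gather(*(process(i) for i in items))

results = asyncio.run(handle_batch(["a", "b", "c"]))
print(results)  # → ['A', 'B', 'C']
```

Combined with maxConcurrentCalls in host.json, the worker handles several messages at once, and each invocation can itself fan out like this.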
I have a TimerTrigger function that runs every couple of minutes and adds multiple items to a queue, which a QueueTrigger function should process. But each time, the QueueTrigger function fires only once.
My TimerTrigger function:
import datetime
import logging
import azure.functions as func

def main(mytimer: func.TimerRequest, msg: func.Out[str]) -> None:
    utc_timestamp = datetime.datetime.utcnow().replace(
        tzinfo=datetime.timezone.utc).isoformat()
    if mytimer.past_due:
        logging.info('The timer is past due!')
    logging.info('Python timer trigger function ran at %s', utc_timestamp)
    msg.set("1")
    msg.set("2")
QueueTrigger function:
def main(msg: func.QueueMessage) -> None:
    logging.info('Python queue trigger function processed a queue item: %s',
                 msg.get_body().decode('utf-8'))
function.json of the TimerTrigger:
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "mytimer",
"type": "timerTrigger",
"direction": "in",
"schedule": "0 */2 * * * *"
},
{
"name": "msg",
"type": "queue",
"direction": "out",
"queueName": "js-queue-items",
"connection": "AzureWebJobsStorage"
}
]
}
function.json of the QueueTrigger:
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "msg",
"type": "queueTrigger",
"direction": "in",
"queueName": "js-queue-items",
"connection": "AzureWebJobsStorage"
}
]
}
local.settings.json:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "python"
}
}
How can I fix this?
There is no intrinsic connection between the timer trigger and the queue storage; the storage account treats the timer-triggered function like any other console application or program that connects to it and adds messages.
Thus, we have to add the messages the classical way, using a connection string:
from azure.storage.queue import QueueClient

connection_string = "<your-storage-connection-string>"
queue_name = "js-queue-items"
messages = ["1", "2"]

queue_client = QueueClient.from_connection_string(connection_string, queue_name)
for message in messages:
    queue_client.send_message(message)
Otherwise, the queue trigger function runs only once at startup and then stays idle, because no messages are being added for it to pick up.
Refer to the documentation on the timer trigger, the queue trigger, and how to add a message to a queue using Python.
I am pretty new to Azure and struggling with the Python function triggers from Event Grid.
I am using the ready-made templates Azure provides for Python and I am getting errors.
I will share the files.
__init__.py
import json
import logging
import azure.functions as func
def main(event: func.EventGridEvent):
    result = json.dumps(
        {
            "id": event.id,
            "data": event.get_json(),
            "topic": event.topic,
            "subject": event.subject,
            "event_type": event.event_type,
        }
    )
    logging.info("Python EventGrid trigger processed an event: %s", result)
function.json
{
"bindings": [
{
"type": "eventGridTrigger",
"name": "event",
"direction": "in"
}
],
"disabled": false,
"scriptFile": "__init__.py"
}
host.json
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
}
},
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[2.*, 3.0.0)"
}
}
And this is the payload that I sent to Event Grid:
{
"topic": "/subscriptions/{subscriptionID}/resourcegroups/{resourceGroupName}/providers/Microsoft.EventHub/namespaces/event",
"subject": "eventhubs/test",
"eventType": "captureFileCreated",
"eventTime": "2017-07-14T23:10:27.7689666Z",
"id": "{id}",
"data": {
"fileUrl": "https://test.blob.core.windows.net/debugging/testblob.txt",
"fileType": "AzureBlockBlob",
"partitionId": "1",
"sizeInBytes": 0,
"eventCount": 0,
"firstSequenceNumber": -1,
"lastSequenceNumber": -1,
"firstEnqueueTime": "0001-01-01T00:00:00",
"lastEnqueueTime": "0001-01-01T00:00:00"
},
"dataVersion": "",
"metadataVersion": "1"
}
And the error that I am getting is:
fail: Function.{functionName}[3]
Executed 'Functions.{functionName}' (Failed, Id={someID}, Duration=121ms)
Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: Functions.{functionName}
System.InvalidOperationException: Exception binding parameter 'event'
System.NotImplementedException: The method or operation is not implemented.
It is probably a very easy mistake somewhere above, but I couldn't find it.
Thanks in advance!
The above error is quite a common, generic error message. It can appear when your application hits some of the limitations of Event Grid; if you are reaching those limits, refer to this.
Note: Event Grid triggers aren't natively supported in an internal load balancer App Service Environment. The trigger uses an HTTP request that can't reach the function app without a gateway into the virtual network.
If you are using a Consumption plan, try the Dedicated plan to avoid this issue.
Refer here to reliable event processing to put this error in context.
Check the limitations of Event Grid and choose the appropriate plan for your requirement.
Reference:
azure function fails with function invocation exception
I have written an Azure Function that currently acts as a webhook consumer. The job of the function is to read webhook events and save them into Azure Storage.
I am using an HTTP trigger template to get the job done. I am able to receive the events from the webhook, but when I try to write to Azure Storage I get the error below.
I tried the option mentioned in this post, but no luck; I am still getting the same error.
System.InvalidOperationException: Storage account connection string 'AzureWebJobs<AzureStorageAccountName>' does not exist. Make sure that it is a defined App Setting.
Below is my function.json file
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "blob",
"name": "outputblob",
"path": "test/demo",
"direction": "out",
"connection": "<AzureStorageAccountName>"
}
]
}
__init__.py
import logging
import azure.functions as func
def main(req: func.HttpRequest, outputblob: func.Out[str]) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    name = 'some_name'
    if not name:
        try:
            req_body = 'req_body_test'  # req.get_json()
        except ValueError:
            pass
        else:
            name = 'name'  # req_body.get('name')
    print(str(req.get_json()))
    outputblob.set(str(req.get_json()))
Please make sure you have already added the connection string to local.settings.json locally, or to the configuration settings on Azure. Note that the binding's "connection" property must name an app setting that holds the connection string, not the storage account itself; the error above shows the runtime failing to find such a setting.
Please test the code and settings files below:
__init__.py
import logging
import azure.functions as func
def main(req: func.HttpRequest, outputblob: func.Out[func.InputStream]) -> func.HttpResponse:
    outputblob.set("this is a test.")
    return func.HttpResponse(
        "Test.",
        status_code=200
    )
function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
},
{
"name": "outputblob",
"type": "blob",
"path": "test/demo",
"connection": "MyStorageConnectionAppSetting",
"direction": "out"
}
]
}
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "",
"FUNCTIONS_WORKER_RUNTIME": "python",
"MyStorageConnectionAppSetting":"DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=xxxxxx;EndpointSuffix=core.windows.net"
}
}
As stated in the title: is it possible to have an Azure Durable Functions app that triggers using a TimerTrigger, and not only an HttpTrigger?
The quickstart at https://learn.microsoft.com/en-us/azure/azure-functions/durable/quickstart-python-vscode is a very good example of how to implement it with an HttpTrigger client, and I'd like a Python example of how to do it with a TimerTrigger client, if possible.
Any help is appreciated; thanks in advance.
Just focusing on the starter function is enough (note that the sample schedule "* * * * * *" below fires every second; adjust it for real use):
__init__.py
import logging
import azure.functions as func
import azure.durable_functions as df
async def main(mytimer: func.TimerRequest, starter: str) -> None:
    client = df.DurableOrchestrationClient(starter)
    instance_id = await client.start_new("YourOrchestratorName", None, None)
    logging.info(f"Started orchestration with ID = '{instance_id}'.")
function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "mytimer",
"type": "timerTrigger",
"direction": "in",
"schedule": "* * * * * *"
},
{
"name": "starter",
"type": "orchestrationClient",
"direction": "in"
}
]
}
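For completeness, the orchestrator that the client starts ("YourOrchestratorName" above) follows the standard generator pattern from the durable quickstart. A minimal sketch, with the activity name "SayHello" assumed for illustration (the import is guarded so the sketch degrades gracefully where the durable extension package is not installed):

```python
# Hypothetical orchestrator for the starter above; "SayHello" is an
# assumed activity name -- replace it with your own activity function.
try:
    import azure.durable_functions as df
    HAVE_DF = True
except ImportError:  # running outside an Azure Functions environment
    HAVE_DF = False

def orchestrator_function(context):
    # Each yield hands control back to the Durable runtime, which replays
    # the generator when the activity result arrives.
    result = yield context.call_activity("SayHello", "world")
    return result

# The entry point Azure Functions binds to (requires a function.json with
# an "orchestrationTrigger" binding named "context").
main = df.Orchestrator.create(orchestrator_function) if HAVE_DF else None
```

The orchestrator's own function.json needs only the orchestrationTrigger binding; the activity is registered the same way with an activityTrigger binding.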
Below is the code:
import logging
import json
import urllib.request
import urllib.parse
import azure.functions as func
def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
    response = urllib.request.urlopen("http://example.com:5000/processing")
    return {
        'statusCode': 200,
        'body': json.dumps(response.read().decode('utf-8'))
    }
Error:
Result: Failure Exception: RuntimeError: function 'abc' without a $return binding returned a non-None value Stack: File "/azure-functions-host/workers/python/3.7/LINUX/X64/azure_functions_worker/dispatcher.py", line 341, in _handle__invocation_request f'function {fi.name!r} without a $return binding '
The same code works in AWS Lambda. Please help me debug this in Azure Functions.
function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "myblob",
"type": "blobTrigger",
"direction": "in",
"path": "sourcemetadata/{name}",
"connection": "AzureWebJobsStorage"
}
]
}
In Azure Functions, if you use return in your function code, it means that you want to use an output binding, but you have not defined one in function.json. Please define it. For more details, please refer to here and here.
For example, I process a blob with a blob trigger and send a message to an Azure queue with a queue output binding:
function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "myblob",
"type": "blobTrigger",
"direction": "in",
"path": "test/{name}.csv",
"connection": "AzureWebJobsStorage"
},
{
"name": "$return",
"direction": "out",
"type": "queue",
"queueName": "outqueue",
"connection": "AzureWebJobsStorage"
}
]
}
Code
import logging
import azure.functions as func

async def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n")
    return "OK"