I am using an Ubuntu 16.04.5 LTS local machine to create and publish a Python Function App to Azure using the CLI and Azure Functions Core Tools. I have configured a Blob Trigger, and my function.json file looks like this:
{
"disabled": false,
"scriptFile": "__init__.py",
"bindings": [
{
"name": "<Blob Trigger Name>",
"type": "blobTrigger",
"direction": "in",
"path": "<Blob Container Name>/{name}",
"connection": "<Connection String having storage account and key>"
},
{
"name": "outputblob",
"type": "blob",
"path": "<Blob Container Name>",
"connection": "<Connection String having storage account and key>",
"direction": "out"
}
]
}
My __init__.py function looks like this:
import logging
import azure.functions as func
from azure.storage.blob import BlockBlobService, PublicAccess

def main(<Blob Trigger Name>: func.InputStream, doc: func.Out[func.Document]):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {<Blob Trigger Name>.name}\n"
                 f"Blob Size: {<Blob Trigger Name>.length} bytes")
    logging.basicConfig(filename='example.log', level=logging.DEBUG)
    logging.debug('This message should go to the log file')
    logging.info('So should this')
    logging.warning('And this, too')
    # Write text to the file.
    file = open("QuickStart.txt", 'w')
    file.write("Hello, World!")
    file.close()
    # Create the BlockBlobService that is used to call the Blob service for the storage account
    block_blob_service = BlockBlobService(account_name='<Storage Account Name>', account_key='<Storage Account Key>')
    container_name = '<Blob Container Name>'
    # Set the permission so the blobs are public.
    block_blob_service.set_container_acl(container_name, public_access=PublicAccess.Container)
    # Upload the created file, using the local file name for the blob name
    block_blob_service.create_blob_from_path(container_name, 'QuickStart.txt', 'QuickStart.txt')
The Function App has "Always On" enabled, but when I upload a blob to the storage account the function is not getting triggered.
What's going wrong?
Thanks and regards,
Shashank
Have you checked that the connection strings for the storage accounts in local.settings.json are also present in the Function App in Azure? They are not published from the local machine by default.
You can configure them manually in the portal or use the --publish-local-settings flag:
func azure functionapp publish "functionname" --publish-local-settings
I tried to reproduce this issue by creating a sample function app in Python using Visual Studio Code with the default template, and deployed it to Linux. It worked for me.
Here is the piece of code I wrote in the Python file:
import logging
import azure.functions as func
def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
and here is the function.json file from my function app.
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "myblob",
"type": "blobTrigger",
"direction": "in",
"path": "samples-workitems/{name}",
"connection": ""
}
]
}
I am using Azure Functions 2.0, Python 3.6, and Azure Functions Core Tools version 2.2.70.
This is the reference link I used:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-first-function-python
Please try this and see if it helps.
In the main def of your Python script, the second argument is doc: func.Out[func.Document], which is for Cosmos DB. Since the binding is of type blob, this should be a blob output instead.
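A minimal corrected sketch (parameter names assumed from the question's function.json; note the output binding's path there must also name a full container/blob pattern, not just a container) could look like:

```python
import logging
import azure.functions as func

def main(myblob: func.InputStream, outputblob: func.Out[bytes]):
    logging.info(f"Processing blob: {myblob.name}")
    # Copy the triggering blob's contents to the blob output binding.
    outputblob.set(myblob.read())
```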
I'm running an Azure function locally, from VSCode, that outputs a string to a blob. I'm using Azurite to emulate the output blob container.
My function looks like this:
import azure.functions as func
def main(mytimer: func.TimerRequest, outputblob: func.Out[str]):
    outputblob.set("hello")
My function.json:
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "mytimer",
"type": "timerTrigger",
"direction": "in",
"schedule": "0 * * * * *"
},
{
"name": "outputblob",
"type": "blob",
"dataType": "string",
"direction": "out",
"path": "testblob/hello"
}
]
}
In local.settings.json, I've set "AzureWebJobsStorage": "UseDevelopmentStorage=true".
The problem is, when I run the function and check in Azure Storage Explorer, the container (testblob) is created (along with two other containers, azure-webjobs-hosts and azure-webjobs-secrets), but it is empty, and Azure Storage Explorer displays an error message when I refresh:
The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object.Received undefined
The function runs and doesn't return any error message.
When I use a queue instead of a blob as output, it works and I can see the string in the emulated queue storage.
When I use the blob storage in my Azure subscription instead of the emulated blob, it works as well, a new blob is created with the string.
I've tried the following:
clean and restart Azurite several times
replace "UseDevelopmentStorage=true" by the connection string of the emulated storage
reinstall Azure Storage Explorer
I keep getting the same error message.
I'm using Azure Storage Explorer Version 1.25.0 on Windows 11.
Thanks for any help!
It looks like this is a known issue with the latest release (v1.25.0) of Azure Storage Explorer; see:
https://github.com/microsoft/AzureStorageExplorer/issues/6008
The simplest solution is to uninstall it and re-install an earlier version:
https://github.com/microsoft/AzureStorageExplorer/releases/tag/v1.24.3
I have written a program to convert a file containing one JSON object per line into a JSON array.
Refer to the link below to understand what I want to achieve:
How to get JSON Array in a blob storage using dataflow
I have created the below files for the trigger:
function.json:
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "jsonfiletrigger",
"type": "blobTrigger",
"direction": "in",
"path": "<containername>/in.json",
"connection": "<Storage account>"
},
{
"name": "blobin",
"type": "blob",
"direction": "in",
"path": "<containername>/in.json",
"connection": "<Storage account>"
},
{
"name": "blobout",
"type": "blob",
"direction": "out",
"path": "<containername>/out.json",
"connection": "<Storage account>"
}
],
"disabled": false
}
host.json:
{
"version": "2.0"
}
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<Storage account>;AccountKey=<Storage account access key>;EndpointSuffix=core.windows.net",
"FUNCTIONS_EXTENSION_VERSION": "~3",
"FUNCTIONS_WORKER_RUNTIME": "python",
"APPINSIGHTS_INSTRUMENTATIONKEY": "<appinsight instrumentation key>",
"APPLICATIONINSIGHTS_CONNECTION_STRING": "InstrumentationKey=<Instrumentation key>;IngestionEndpoint=https://westeurope-3.in.applicationinsights.azure.com/"
},
"ConnectionStrings": {}
}
__init__.py
import logging
import azure.functions as func
import sys
import json
import os
def main(blobin: func.InputStream, blobout: func.Out[bytes], context: func.Context):
    logging.info('env variables :: %s' % dict(os.environ))
    jsonarr = []
    try:
        with open(blobin, 'rt') as fin:
            for line in fin.readlines():
                jsonobj = json.loads(line.strip())
                jsonarr.append(jsonobj)
    except OSError as e:
        print(f'EXCEPTION: Unable to read input as file. {e}')
        sys.exit(254)
    except Exception as e:
        print(f'EXCEPTION: {e}')
        sys.exit(255)
    try:
        with open(blobout, 'wt') as fout:
            json.dump(jsonarr, indent=4, fp=fout)
    except OSError as e:
        print(f'EXCEPTION: Unable to write output. {e}')
        sys.exit(254)
    except Exception as e:
        print(f'EXCEPTION: {e}')
        sys.exit(255)
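Independent of the trigger issue, the line-by-line conversion logic itself can be sketched and exercised with plain strings (io.StringIO stands in here for the blob contents; names are illustrative):

```python
import io
import json

def jsonlines_to_array(text: str) -> str:
    # Parse each non-empty line as a JSON object and collect them into a list.
    objs = [json.loads(line) for line in io.StringIO(text) if line.strip()]
    # Serialize the list as a single JSON array, mirroring the intended output blob.
    return json.dumps(objs, indent=4)

result = jsonlines_to_array('{"a": 1}\n{"b": 2}\n')
```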
I ran the below command to publish:
func azure functionapp publish jsonlisttoarray --publish-local-settings
I can see the files in the Function App, but I am not sure why the function doesn't get triggered.
Please help resolve the issue.
The problem may be caused by the connection string of the storage account not having been uploaded to the Azure portal when you did the deployment.
The documentation shows that values under ConnectionStrings will not be published to Azure when you run the command func azure functionapp publish jsonlisttoarray --publish-local-settings.
I tested this on my side: the values under the ConnectionStrings field in local.settings.json were not published to the function's application settings in the portal during deployment, and this leads to the function not being triggered.
To solve this, go to your function app in the Azure portal, click "Configuration" --> under the "Application settings" tab --> click "New application setting", and add a variable with the same name and value as in the ConnectionStrings field of your local.settings.json.
Update:
It seems there is a mistake in the connection field of your function.json. First, add a variable to local.settings.json (under Values) whose value is the storage connection string.
Then set the connection field in function.json to the name of that variable, not the connection string itself.
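As a sketch (the setting name MyStorageConnection is hypothetical), local.settings.json would contain:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage connection string>",
    "MyStorageConnection": "<storage connection string>"
  }
}
```

and the trigger binding in function.json would then reference the setting by name:

```json
{
  "name": "jsonfiletrigger",
  "type": "blobTrigger",
  "direction": "in",
  "path": "<containername>/in.json",
  "connection": "MyStorageConnection"
}
```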
Then deploy the function to Azure again with the command func azure functionapp publish jsonlisttoarray --publish-local-settings.
Note: if you do not add --publish-local-settings to your publish command, the values in local.settings.json will not be uploaded to your function app during deployment.
I found that the problem was with the directory structure. I was keeping all the files in the same directory; function.json and __init__.py (and any other sources) need to live in a subdirectory per function.
A function app can have multiple functions; each function shares the same settings, requirements.txt, and host.json.
The directory structure looks like below:
$ ls -Ra
.:
. .. BlobTrigger extensions.csproj host.json local.settings.json proxies.json .python_packages requirements.txt
./BlobTrigger:
. .. function.json __init__.py
./.python_packages:
. ..
I am working on an Azure Function written in Python that should be called on a blob trigger event. However, the trigger is not firing when I upload a zip file to the blob container that the Azure Function is supposed to monitor.
Following is the local.settings.json file -
{
"IsEncrypted": false,
"Values":
{
"AzureWebJobsStorage": "blob (connection string) that was created when Azure function is created",
"FUNCTIONS_WORKER_RUNTIME": "python",
"My_STORAGE": "blob (connection string) that function should monitor"
}
}
Following is the function.json file -
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "myblob",
"type": "blobTrigger",
"direction": "in",
"path": "mycontainer/{name}",
"connection": "My_STORAGE"
}
]
}
Following is my code in __init__.py (test_func is a user-defined function containing some business logic) -
import logging
import azure.functions as func

def main(myblob: func.InputStream):
    test_func(myblob.name)
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
When I upload a zip file to the "mycontainer" container, the Azure Function is not firing.
The storage account holding "mycontainer" is of the StorageV2 (general purpose v2) account kind. I am using Python 3.8.
The storage account has automatically created a container named $logs with day-wise folders containing log files that record the files I upload to "mycontainer"; however, there is no sign of a blob trigger event on the Azure Function.
"My_STORAGE" is added under Application settings in the Azure Function's Configuration. I upload the local settings after deploying the Azure Function.
Any idea what is going wrong?
Thank you.
I've had a similar problem. It was solved by adding the connection string as a key-value pair to the Application settings in Function App -> Configuration -> Settings. I assume this has to be done when the Function App is deployed against a different storage account, which requires a different connection string.
The problem was caused by a mistake in the connection string: the AccountName in the connection string should be the storage account name, not the name of the container.
So just change AccountName=mycontainer to AccountName=<storage account name>, and it works.
And by the way:
The connection string should be: DefaultEndpointsProtocol=https;AccountName=<storage account name>;AccountKey=xxxxxxxxxx==;EndpointSuffix=core.windows.net
The "path" in "function.json" should be: "path": "<container name>/{name}"
Copy the <storage_account>_STORAGE key from the local.settings.json file to a new application setting key-value pair in the portal.
I have encountered a problem when setting blob metadata in Azure Storage. I developed a script for this in Spyder, so local Python, and it works great. Now, I want to be able to execute this same script as an Azure Function. However, when setting the metadata I get the following error: HttpResponseError: The specifed resource name contains invalid characters.
The only change from Spyder to Functions that I made is:
Spyder:
def main(container_name, blob_name, metadata):
    from azure.storage.blob import BlobServiceClient
    # Connection string to storage account
    storageconnectionstring = secretstoragestringnotforstackoverflow
    # initialize clients
    blobclient_from_connectionstring = BlobServiceClient.from_connection_string(storageconnectionstring)
    containerclient = blobclient_from_connectionstring.get_container_client(container_name)
    blob_client = containerclient.get_blob_client(blob_name)
    # set metadata of the blob
    blob_client.set_blob_metadata(metadata=metadata)
    return
Functions:
import json
import azure.functions as func
from azure.storage.blob import BlobServiceClient

def main(req: func.HttpRequest):
    container_name = req.params.get('container_name')
    blob_name = req.params.get('blob_name')
    metadata_raw = req.params.get('metadata')
    metadata_json = json.loads(metadata_raw)
    # Connection string to storage account
    storageconnectionstring = secretstoragestringnotforstackoverflow
    # initialize clients
    blobclient_from_connectionstring = BlobServiceClient.from_connection_string(storageconnectionstring)
    containerclient = blobclient_from_connectionstring.get_container_client(container_name)
    blob_client = containerclient.get_blob_client(blob_name)
    # set metadata of the blob
    blob_client.set_blob_metadata(metadata=metadata_json)
    return func.HttpResponse()
Arguments to the Function are passed in the header. The problem lies with metadata and not with container_name or blob_name, as I get no error when I comment out the metadata. Also, I have tried formatting the metadata in many variations, with single or double quotes and as JSON or as a string, but no luck so far. Can anyone help me solve this problem?
I was able to fix the problem. The script was fine; the problem was in the input parameters, which needed to be in a specific format: metadata as a dict with double quotes, and blob/container names as plain strings without any quotes.
As requested, the function.json:
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
}
]
}
[Picture of the parameter formatting in Azure Functions]
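To illustrate the accepted formats (the metadata values here are hypothetical): the metadata parameter must parse as JSON with double-quoted keys and values, while single-quoted pseudo-JSON fails:

```python
import json

# Valid: JSON with double quotes parses into a dict usable by set_blob_metadata.
metadata_raw = '{"department": "finance", "owner": "alice"}'
metadata = json.loads(metadata_raw)

# Invalid: single-quoted "JSON" raises a ValueError (json.JSONDecodeError).
try:
    json.loads("{'department': 'finance'}")
    parse_failed = False
except ValueError:
    parse_failed = True
```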
I have created a simple blob trigger in Visual Studio, for which __init__.py is as below:
import logging
import azure.functions as func
def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
and function.json is as below
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "myblob",
"type": "blobTrigger",
"direction": "in",
"path": "mycontainer/{name}",
"connection": "AzureWebJobsStorage"
}
]
}
local.settings.json looks as below
{
"IsEncrypted": false,
"Values": {
"FUNCTIONS_WORKER_RUNTIME": "python",
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https; AccountName=****;AccountKey=*****;EndpointSuffix=core.windows.net"
}
}
This code works fine with Visual Studio on my local machine. But when published to the Azure portal it cannot read the blob path from function.json and gives this error:
Invalid blob path specified : ''. Blob identifiers must be in the format 'container/blob'.
I have published the function using the below command to push the contents of local.settings.json:
func azure functionapp publish FUNCTIONNAME --build-native-deps --publish-local-settings -i
Can anyone please guide me on what I am missing after publishing?
Are you using the Run button in the Azure portal to test your function? The way this works for blob triggers is that in the 'Test' tab on the right-hand side, you can specify the name of the blob you want to manually send a trigger event for, forcing your function to run.
The idea is that you should edit the contents of the request body box and put in the path to a valid blob in your account. That way the trigger runs, finds the blob, and retrieves it. If you don't modify the request body box, it will look for a blob, fail to find it, and throw the 404 error.
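As a sketch (the container and blob names here are hypothetical), the request body would contain just the path to an existing blob:

```
mycontainer/myblob.zip
```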
Also, please take a look at the below document for configuring the container name:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#storage-blob-trigger
Also, please verify whether your settings have been published in the portal or not:
func azure functionapp publish "functionname" --publish-local-settings
Hope it helps.