Azure function deployed but never runs on blob input - Python

We are setting up an Azure Function to be triggered once a file lands in Azure Blob Storage.
This file will be used as the input of a Python script hosted on GitHub.
Here is the basic Azure Function script that was generated when the function was created using Visual Studio Code:
import logging
import azure.functions as func


def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
The aim is that the TOML input file uploaded to the blob container serves as a loader for the variables.
The script then runs and generates another file that should be saved to another blob.
Using a web app we are able to upload to the blob container; however, judging by the Monitor tab, the function is never triggered.
What we want, within the main() of the Azure Function, is to trigger a Python project on GitHub to run with the input file, so it becomes:
def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
    # python src/main.py fileInput.toml
Any idea why the enabled function is not running, and what should be added to its main()?

I have reproduced this in my environment, got the expected results, and followed the process below:
Firstly, create a function app and a storage account.
Then, in the Configuration section of the function app, check that the connection string in AzureWebJobsStorage is correct:
Copy the connection string from the storage account:
And paste it into the function app setting:
Then create a blob trigger function:
Now, in the function trigger code:
import logging
import azure.functions as func


def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
Now, in function.json, check the container name:
samples-workitems
Now create a container in the storage account linked to the function app:
Then upload the blob into the container:
After uploading the blob:
Output:
Check the Logs section of the function app:
Now use the query below to check whether the blob trigger fired:
traces
| where message contains "Python blob trigger function processed blob"
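The question also asks how to run the GitHub-hosted script from main(). That part is not covered above, but one possible shape, purely as a sketch and assuming the repository's code (src/main.py) is packaged and deployed together with the function app, is to write the triggering TOML blob to a temporary file and shell out to the script:
import logging
import subprocess
import tempfile
import azure.functions as func


def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")

    # Persist the triggering TOML blob so the script can consume it as a file path.
    with tempfile.NamedTemporaryFile(suffix=".toml", delete=False) as tmp:
        tmp.write(myblob.read())
        input_path = tmp.name

    # Equivalent of: python src/main.py fileInput.toml
    result = subprocess.run(["python", "src/main.py", input_path],
                            capture_output=True, text=True)
    logging.info("script stdout: %s", result.stdout)
    if result.returncode != 0:
        logging.error("script failed: %s", result.stderr)
Writing the generated output to the second blob container can then be done with a blob output binding or the azure-storage-blob SDK; cloning the repository at runtime is also possible, but it is usually slower and less reliable than bundling the code with the deployment.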

Related

Calling my Python function in VS Code Azure Function App tab (HTTP trigger)

I'm new to Azure Function Apps.
I have Python code that I want to run when the HTTP trigger is called.
I have a new project and I'm calling my code in __init__.py.
What is the correct way to call my code?
Here is __init__.py:
import logging
import azure.functions as func
import UploadToGCS


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')
    if name:
        UploadToGCS(UploadToGCS.upload_files)  # <--- I called it here
        return func.HttpResponse(f"Hello, {name}. This HTTP triggered function executed successfully.")
    else:
        return func.HttpResponse(
            "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.",
            status_code=200
        )
Currently I receive a "401 error" page.
Can you please suggest how it should be done?
Here is my Python code (I'm uploading files to a Google Cloud Storage bucket using the details in config_file = find("gcs_config.json", "C:/")):
from google.cloud import storage
import os
import glob
import json


# Finding path to config file that is called "gcs_config.json" in directory C:/
def find(name, path):
    for root, dirs, files in os.walk(path):
        if name in files:
            return os.path.join(root, name)


def upload_files(config_file):
    # Reading 3 parameters for upload from JSON file
    with open(config_file, "r") as file:
        contents = json.loads(file.read())
        print(contents)

    # Setting up login credentials
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = contents['login_credentials']
    # The ID of GCS bucket
    bucket_name = contents['bucket_name']
    # Setting path to files
    LOCAL_PATH = contents['folder_from']

    for source_file_name in glob.glob(LOCAL_PATH + '/**'):
        # For multiple files upload
        # Setting destination folder according to file name
        if os.path.isfile(source_file_name):
            partitioned_file_name = os.path.split(source_file_name)[-1].partition("-")
            file_type_name = partitioned_file_name[0]
            # Setting folder where files will be uploaded
            destination_blob_name = file_type_name + "/" + os.path.split(source_file_name)[-1]
            # Setting up required variables for GCS
            storage_client = storage.Client()
            bucket = storage_client.bucket(bucket_name)
            blob = bucket.blob(destination_blob_name)
            # Running upload and printing confirmation message
            blob.upload_from_filename(source_file_name)
            print("File from {} uploaded to {} in bucket {}.".format(
                source_file_name, destination_blob_name, bucket_name
            ))


config_file = find("gcs_config.json", "C:/")
upload_files(config_file)
Kind regards,
Anna
I'm replying to this as no one else did, and someone might stumble upon this thread looking for an answer.
To run your function locally from VS Code:
Initialize the function in your local environment by running this command in the terminal inside VS Code:
func init
This will create all the necessary files in your folder and a virtual environment (if you're using Anaconda, you need to configure VS Code's settings.json so it points to the Conda environment).
Finish your __init__.py file, then start the function with:
func start
The function will be served at localhost and give you a link.
If you want to deploy it to the cloud, install the Azure Functions extension; you'll have the option to log in and select your subscription. After this is done, you can deploy the function under any Function App that was created in Azure.
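On the original question of how to call the code: UploadToGCS(UploadToGCS.upload_files) calls the module object instead of the function. A minimal sketch of the relevant part of __init__.py, assuming gcs_config.json is reachable from the function's environment (the hard-coded C:/ search path from the local script would need to change once this runs in Azure):
import logging
import azure.functions as func
import UploadToGCS


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    # Locate the config file and call the function on the module, not the module itself.
    config_file = UploadToGCS.find("gcs_config.json", "C:/")
    UploadToGCS.upload_files(config_file)

    return func.HttpResponse("Upload finished.", status_code=200)
Note that the module-level find(...) / upload_files(...) calls at the bottom of UploadToGCS.py also run on import, so they would normally be moved under an if __name__ == "__main__": guard. The "401 error" page itself is typically an authorization response from the function endpoint (for example a missing function key) rather than something raised inside this code.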

How to download/save a file to blob storage via Azure Functions?

I'm new to Azure. I need an HTTP-triggered function to perform some simple actions on my blob storage. This will be part of a pipeline in Data Factory, but first I need to figure out how to edit blobs via functions. I'm stuck now because I have no idea which API/methods I could use. I'd appreciate your help. Below is part of my code.
import logging

import requests
import azure.functions as func


def main(req):
    logging.info('Python HTTP trigger function processed a request.')
    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')
    if name:
        requested_file = requests.get("web address")
        ### Should I connect to blob here?
        with open("file.txt", "wb") as file:
            file.write(requested_file.content)
        return func.HttpResponse(f"Hello, {name}. This HTTP triggered function executed successfully.")
    else:
        return func.HttpResponse(
            "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.",
            status_code=200
        )
You can use either the Python SDK or the bindings and triggers for Azure Functions.
With bindings, you would use an input binding to pull in your blob and an output binding to write out your blob.
Similarly with the SDK, you want to make sure that you are reading in and writing out. Make sure your keys are correct and your containers are too!
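As a rough illustration of the bindings route (the outputblob name, the sample URL, and the container path are assumptions that would live in function.json, not details from the question):
import logging

import requests
import azure.functions as func


def main(req: func.HttpRequest, outputblob: func.Out[bytes]) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    # Download the remote file and hand its bytes to the blob output binding;
    # the target container and blob path are configured in function.json.
    requested_file = requests.get("https://example.com/file.txt")
    outputblob.set(requested_file.content)

    return func.HttpResponse("File written to blob storage.", status_code=200)
The matching function.json would declare outputblob as a blob output binding with direction "out", the desired container/path, and the storage connection setting.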

Write file from GCP Cloud Function to Bucket

I am having a very difficult time simply writing a file from a Cloud Function to a Bucket.
I am using this Medium post: https://medium.com/@Tim_Ebbers/import-a-file-to-gcp-cloud-storage-using-cloud-functions-9cf81db353dc
This is the code of the Cloud Function:
#Create function that is triggered by http request
def importFile(request):
    #import libraries
    from google.cloud import storage
    from urllib import request

    #set storage client
    client2 = storage.Client()
    # get bucket
    bucket = client2.get_bucket('YOUR-TEST-BUCKET') #without gs://
    blob = bucket.blob('animals-1.json')

    #See if json exists
    if blob.exists() == False:
        #copy file to google storage
        try:
            ftpfile = request.urlopen('https://raw.githubusercontent.com/LearnWebCode/json-example/master/animals-1.json')
            #for non public ftp file: ftpfile = request.urlopen('ftp://account:password@ftp.domain.com/folder/file.json')
            blob.upload_from_file(ftpfile)
            print('copied animals-1.json to google storage')
        #print error if file doesn't exist
        except:
            print('animals-1.json does not exist')
    #print error if file already exists in google storage
    else:
        print('file already exists in google storage')
The function deploys successfully. When I go to "Test" it, I get the very unhelpful error:
Error: function terminated. Recommended action: inspect logs for termination reason. Additional troubleshooting documentation can be found at https://cloud.google.com/functions/docs/troubleshooting#logging Details:
500 Internal Server Error: The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.
and
Logs: Not Available
What am I doing wrong here?
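No answer was recorded for this one in the thread, but two things stand out and are worth checking: an HTTP-triggered Python Cloud Function must return a value Flask can turn into a response (falling through and returning None surfaces as exactly this kind of opaque 500), and from urllib import request shadows the function's request argument. A reworked sketch along those lines, keeping the placeholder bucket name from the question:
def importFile(request):
    """HTTP-triggered function that copies a JSON file into a GCS bucket."""
    from google.cloud import storage
    from urllib import request as urlrequest  # renamed to avoid shadowing the Flask request argument

    client = storage.Client()
    bucket = client.get_bucket('YOUR-TEST-BUCKET')  # without gs://
    blob = bucket.blob('animals-1.json')

    if blob.exists():
        return 'file already exists in google storage', 200

    try:
        source = urlrequest.urlopen(
            'https://raw.githubusercontent.com/LearnWebCode/json-example/master/animals-1.json')
        blob.upload_from_file(source)
        return 'copied animals-1.json to google storage', 200
    except Exception as exc:
        # Returning the error makes it visible in the HTTP response, not just the logs.
        return f'copy failed: {exc}', 500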

Uploading a CSV file to Azure Blob Storage using Python

I'm trying to upload a CSV file to a container. It constantly gives me an error that says: Retry policy did not allow for a retry: , HTTP status code=Unknown, Exception=HTTPSConnectionPool
Here is my code -
from azure.storage.blob import BlockBlobService
block_blob_service = BlockBlobService(account_name='myAccoutName', account_key='myAccountKey')
block_blob_service.get_blob_to_path(container_name='test1', blob_name='pho.csv', file_path = 'C:\\Users\\A9Q5NZZ\\pho.csv')
I am new to Python, so if you can answer in simple language, that would be really helpful.
Forget uploading a CSV file; it doesn't even let me view existing blobs in an existing container! It gives the same 'Retry Policy' error for the code below:
container_name = 'test1'
generator = block_blob_service.list_blobs(container_name)
for blob in generator:
    print("\t Blob name: " + blob.name)
I understand I've asked two questions, but I think the error is the same. Any help is appreciated. Again, since I am new to Python, an explanation/code with simpler terms would be great!
The method get_blob_to_path you're using is for downloading a blob to local storage. If you want to upload a local file to Azure Blob Storage, you should use the method block_blob_service.create_blob_from_path(container_name="", blob_name="", file_path="")
This sample code works on my side:
from azure.storage.blob import BlockBlobService
block_blob_service = BlockBlobService(account_name='xxx', account_key='xxxx')
block_blob_service.create_blob_from_path(container_name="mycontainier",blob_name="test2.csv",file_path="D:\\temp\\test2.csv")
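BlockBlobService comes from the legacy azure-storage-blob 2.x package. If the current azure-storage-blob (v12+) is installed instead, the equivalent upload looks roughly like this (the connection string is a placeholder taken from the storage account's Access keys blade):
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<your-connection-string>")
blob_client = service.get_blob_client(container="mycontainier", blob="test2.csv")

# Upload the local CSV, overwriting any existing blob with the same name.
with open("D:\\temp\\test2.csv", "rb") as data:
    blob_client.upload_blob(data, overwrite=True)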

Is it possible to download a file to Google Cloud Storage via API call (with Python) using Google App Engine (without Google Compute Engine)?

I wrote a Python program which connects to various platforms' APIs for file-downloading purposes. The program is currently running on my local machine (laptop) without a problem (all downloaded files are saved to my local drive, of course).
Here is my real question: without Google Compute Engine, is it possible to deploy the very same Python program using Google App Engine? If yes, how could I save my files (fetched via API calls) to Google Cloud Storage?
Thanks.
Is this a web app? If so, you can deploy it using Google App Engine standard or flexible.
To send files to Cloud Storage, try the example in the python-docs-samples repo (folder appengine/flexible/storage/):
# [START upload]
@app.route('/upload', methods=['POST'])
def upload():
    """Process the uploaded file and upload it to Google Cloud Storage."""
    uploaded_file = request.files.get('file')

    if not uploaded_file:
        return 'No file uploaded.', 400

    # Create a Cloud Storage client.
    gcs = storage.Client()

    # Get the bucket that the file will be uploaded to.
    bucket = gcs.get_bucket(CLOUD_STORAGE_BUCKET)

    # Create a new blob and upload the file's content.
    blob = bucket.blob(uploaded_file.filename)

    blob.upload_from_string(
        uploaded_file.read(),
        content_type=uploaded_file.content_type
    )

    # The public URL can be used to directly access the uploaded file via HTTP.
    return blob.public_url
# [END upload]
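One caveat on the last line of that sample: blob.public_url only resolves if the object (or bucket) is publicly readable. For a private bucket, a signed URL is the usual alternative, roughly:
import datetime

# Assumes `blob` is the google.cloud.storage blob object from the snippet above.
url = blob.generate_signed_url(expiration=datetime.timedelta(hours=1), version="v4")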
