Creating Azure Data Factory stored procedure activity parameters - Python

I am trying to create a stored procedure activity with the Azure Data Factory Python SDK. Although the activity can receive the value and the type of the stored procedure parameters, I am not able to pass the name of the parameter using the StoredProcedureParameter class. Any suggestions?
PS: I have already read the documentation and tried to pass it as a dict. That did not work.
[Screenshot: the result in the Azure Data Factory portal]
[Screenshot: creating the parameters via the Python SDK]
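In case it helps, here is a minimal sketch of the pattern that usually works with the azure-mgmt-datafactory models: the parameter name is supplied as the dictionary key of stored_procedure_parameters, while StoredProcedureParameter itself only carries the value and the type. The activity, procedure and linked-service names below are placeholders, not taken from the question.

from azure.mgmt.datafactory.models import (
    LinkedServiceReference,
    SqlServerStoredProcedureActivity,
    StoredProcedureParameter,
)

# Hypothetical activity; the dict keys are the stored procedure parameter names.
activity = SqlServerStoredProcedureActivity(
    name="RunMyProc",
    # Depending on the SDK version, LinkedServiceReference may not need the type argument.
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="MySqlLinkedService"
    ),
    stored_procedure_name="dbo.MyProc",
    stored_procedure_parameters={
        "CustomerId": StoredProcedureParameter(value="42", type="Int"),
        "Region": StoredProcedureParameter(value="EMEA", type="String"),
    },
)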

Related

How to get parameters from API Gateway in Cloud Functions

I have an API Gateway that calls a Cloud Function.
I want to create a config file with 3 routes:
..myurl/user
..myurl/client
..myurl/order
My problem is that I would like to use the same Cloud Function, written in Python, to deal with each scenario, since it is just a function that gets the data and writes it to BigQuery. In the end, I only need to know whether the data is from user, client or order so I can switch the insert target correctly.
Today I am using 3 different Cloud Functions, one for each path in the API spec.
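One way this is often handled is to switch on the request path inside a single HTTP function. A sketch only: it assumes API Gateway forwards the original path to the function (for example with APPEND_PATH_TO_ADDRESS path translation), and the table names and the load helper are made up.

from flask import Request

# Hypothetical mapping from route to BigQuery target table.
TARGET_TABLES = {
    "user": "my_dataset.users",
    "client": "my_dataset.clients",
    "order": "my_dataset.orders",
}

def main(request: Request):
    # request.path is e.g. "/user", "/client" or "/order"
    entity = request.path.strip("/").split("/")[0]
    table = TARGET_TABLES.get(entity)
    if table is None:
        return ("Unknown route", 404)

    payload = request.get_json(silent=True) or {}
    # load_into_bigquery(table, payload)  # hypothetical helper that does the insert
    return (f"Loaded into {table}", 200)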

How do I call a Cosmos DB stored procedure from an Azure Function in Python?

I have an HTTP triggered Azure Function written in Python. I also have a CosmosDB container with a stored procedure which takes no parameters. How do I call this stored procedure from within the Azure Function's Python code?
After reviewing the Cosmos docs, I know that one can create an HTTP trigger for the Cosmos DB and send a request to the appropriate URL, but I've been unable to find out whether this is necessary and, if it is, whether there are any Azure Python modules that create the necessary URLs and boilerplate.
I've been unable to find in the Azure Functions docs if there are bindings for stored procedures.
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # What do I type here?
    return func.HttpResponse(...)
Please refer to the official Python documentation on how to call a stored procedure: LINK.
Also, there are no output bindings that trigger Cosmos DB stored procedures.
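For reference, a hedged sketch of what that might look like with the azure-cosmos data-plane SDK; the account endpoint, key, database/container/procedure names and the partition key value are all placeholders.

import azure.functions as func
from azure.cosmos import CosmosClient

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Placeholders: account endpoint, key, database, container and sproc names.
    client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
    container = client.get_database_client("<database>").get_container_client("<container>")

    # Stored procedures execute within a single partition, so a partition key
    # is required even when the procedure itself takes no parameters.
    result = container.scripts.execute_stored_procedure(
        sproc="<sproc-name>",
        partition_key="<partition-key-value>",
    )
    return func.HttpResponse(str(result))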

Execute a Python script from Azure Data Factory

Can someone help me with executing a Python function from Azure Data Factory?
I have stored the Python function in blob storage and I'm trying to trigger it.
However, I'm not able to do it. Please assist.
Second, can I parameterize the Python function call from ADF?
You could look at the Azure Function Activity in ADF, which allows you to run Azure Functions in a Data Factory pipeline.
You could then port your Python function into a Python Azure Function.
Also, if you want to pass parameters into the Python function, you can set them in the body property (see the sketch below).
The Azure Function Activity supports routing. For example, if your app uses the following routing - https://functionAPP.azurewebsites.net/api/functionName/{value}?code=<secret> - then the functionName is functionName/{value}, which you can parameterize to provide the desired functionName at runtime.
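As an illustration of the body-based parameter passing, here is a minimal sketch of the Python Azure Function side; the parameter names input_path and run_date are made up for illustration.

import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # The JSON set in the ADF Azure Function Activity's "body" property
    # arrives as the request body.
    body = req.get_json()
    input_path = body.get("input_path")
    run_date = body.get("run_date")

    # ... run the logic that previously lived in the blob-hosted script ...

    return func.HttpResponse(
        json.dumps({"status": "done", "input_path": input_path, "run_date": run_date}),
        mimetype="application/json",
    )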

Is it possible to use x-goog-if-generation-match in google.cloud.storage python client?

I'm using the google.cloud.storage Python library to store files in Google Cloud Storage from my App Engine flexible application. I would like to use the x-goog-if-generation-match header to prevent overwriting newer versions, but I can't find a way to pass it to the library.
Is it possible to utilise blob generation checking with google.cloud.storage Python library?
As of today, the Google Cloud Python client supports generation and metageneration preconditions in the API, including Blob.upload_from_string:
https://googleapis.dev/python/storage/latest/generation_metageneration.html#conditional-parameters
Using if_generation_match
Passing the if_generation_match parameter to a method which retrieves a blob resource (e.g., Blob.reload) or modifies the blob (e.g., Blob.update) makes the operation conditional on whether the blob’s current generation matches the given value.
As a special case, passing 0 as the value for if_generation_match makes the operation succeed only if there are no live versions of the blob.
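For concreteness, a short sketch of how the precondition is usually passed; the bucket and object names are placeholders.

from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("my-object.txt")

blob.reload()  # fetch current metadata, including the generation
blob.upload_from_string(
    "new contents",
    if_generation_match=blob.generation,  # fails with 412 if a newer version exists
)

# Passing if_generation_match=0 instead makes the upload succeed only if the
# object does not exist yet.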
AFAIK the x-goog-if-generation-match header is only available in the XML API.
The google.cloud.storage library doesn't allow generic, direct access to the request headers. In some cases access is supported via dedicated properties, but I see none equivalent to x-goog-if-generation-match in google.cloud.storage.blob.Blob.
I do see properties for retrieving the object/blob's generation and metageneration, though (but those aren't equivalent to x-goog-if-generation-match):
generation
    Retrieve the generation for the object.
    See https://cloud.google.com/storage/docs/json_api/v1/objects
    Return type: int or NoneType
    Returns: The generation of the blob, or None if the blob's resource has not been loaded from the server.
and
metageneration
    Retrieve the metageneration for the object.
    See https://cloud.google.com/storage/docs/json_api/v1/objects
    Return type: int or NoneType
    Returns: The metageneration of the blob, or None if the blob's resource has not been loaded from the server.

Python vs. Node.js Event Payloads in Firebase Cloud Functions

I am in the process of writing a Cloud Function for Firebase via the Python option. I am interested in Firebase Realtime Database triggers; in other words, I want to listen to events that happen in my Realtime Database.
The Python environment provides the following signature for handling Realtime Database triggers:
def handleEvent(data, context):
    # Triggered by a change to a Firebase RTDB reference.
    # Args:
    #   data (dict): The event payload.
    #   context (google.cloud.functions.Context): Metadata for the event.
This is looking good. The data parameter contains two dictionaries: 'data' with the state before the change and 'delta' with the changed bits.
The confusion kicks in when comparing this signature with the Node.js environment. Here is a similar signature from the Node.js world:
exports.handleEvent = functions.database.ref('/path/{objectId}/').onWrite((change, context) => { ... });
In this signature, the change parameter is pretty powerful and it seems to be of type firebase.database.DataSnapshot. It has nice helper methods such as hasChild() or numChildren() that provide information about the changed object.
The question is: does the Python environment have a similar DataSnapshot object? With Python, do I have to query the database to get, for example, the number of children? It really isn't clear what the Python environment can and can't do.
Related API/Reference/Documentation:
Firebase Realtime DB Triggers: https://cloud.google.com/functions/docs/calling/realtime-database
DataSnapshot Reference: https://firebase.google.com/docs/reference/js/firebase.database.DataSnapshot
The Python runtime currently doesn't have a similar object structure. The firebase-functions SDK is actually doing a lot of work for you in creating objects that are easy to consume. Nothing similar is happening in the Python environment. You are essentially getting a pretty raw view of the payload of data contained in the event that triggered your function.
If you write Realtime Database triggers for Node without using the Firebase SDK, it will be a similar situation: you'll get a really basic object with properties similar to the Python dictionary.
This is the reason why using firebase-functions along with the Firebase SDK is the preferred environment for writing triggers from Firebase products. The developer experience is superior: it does a bunch of convenient work for you. The downside is that you pay the cost of loading and initializing the Firebase Admin SDK on cold start.
Note that it might be possible for you to parse the event and create your own convenience objects using the Firebase Admin SDK for Python.
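If you do go the raw-payload route, here is a hedged sketch of what the hand-rolled equivalents might look like; the payload structure follows the Realtime Database trigger docs, and the child-counting is plain dict work rather than a DataSnapshot helper.

def handleEvent(data, context):
    before = data.get("data") or {}   # state before the write
    delta = data.get("delta") or {}   # fields that changed

    # Hand-rolled equivalents of numChildren() / hasChild() on the changed data:
    num_children = len(delta) if isinstance(delta, dict) else 0
    has_name_child = isinstance(delta, dict) and "name" in delta

    print(f"children changed: {num_children}, has 'name' child: {has_name_child}")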
