Azure Webhooks - Python

I have an on-site SQL Server which runs and posts relevant records to a data warehouse accessible via an API endpoint. I need to create a webhook to detect changes whenever rows are added to or deleted from the warehouse table. Preferably, the webhook should trigger a message to Azure Queue Storage via an HTTP trigger.
How can I go about this in Azure? I can't get my hands on any straightforward documentation or tutorial.
If it can't be done in Azure, are there any other third-party platforms with which I can create a webhook to detect changes to the table, given the endpoint's URL?
I have been able to create a webhook in ArcGIS which is currently running successfully on the same logic. However, I am now required to change this so that it is triggered by activity on the data warehouse API. Any help will be appreciated.
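For reference, a minimal sketch of the kind of HTTP-triggered function described above, written against the classic Azure Functions Python programming model; it assumes function.json declares an HTTP trigger plus a Queue Storage output binding named "msg" (both names are illustrative):

# Sketch: HTTP-triggered Azure Function that forwards the request body
# to Azure Queue Storage. Assumes function.json declares an HTTP trigger
# and a queue output binding named "msg" (illustrative setup).
import azure.functions as func

def main(req: func.HttpRequest, msg: func.Out[str]) -> func.HttpResponse:
    body = req.get_body().decode("utf-8")  # change payload posted by the warehouse
    msg.set(body)                          # enqueue for downstream processing
    return func.HttpResponse("queued", status_code=200)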

Related

Python Tableau Server Client to refresh Cache memory in Tableau Server

My org had a dashboard some time ago that was very complex and took some time to load; however, once we loaded it once, performance was fine for the rest of the day (due to the cache being refreshed). Therefore, we had someone on our team open the dashboard early every morning so the cache for the day was refreshed.
I am now learning about the Tableau Server Client (TSC) module in Python, which uses the Tableau API to manipulate Tableau Server, and I am wondering if I could accomplish the same thing using TSC. Is that possible? If so, what would be the syntax? I looked into the documentation and I didn't see anything about it. I am currently able to trigger an extract refresh using TSC, but I would like to see if I can just refresh the cache of a dashboard. Or perhaps there is another function of TSC that does something similar. Any thoughts? Thanks
You can do what is called cache warming for Tableau Server 9.0+ using subscriptions.
Here are the steps found in the Tableau KB:
Starting with Tableau Server 9.0, the Cache Server can be "warmed" with data using the Subscription function. Triggering a subscription email that includes a thumbnail after executing an extract refresh will cause the queries to run for the viz and load into the external query cache.
If a user wants a fast-loading view for an 8 AM meeting:
Tableau Administrator schedules an extract refresh at 2 AM.
Tableau Administrator schedules a subscription email at 5 AM.
User loads workbook quickly from stored cache at 8 AM.
Note: if "Refresh More Often" is selected in the Data Connections tab of Configure Tableau Server, the cache will be cleared every time the view is loaded. Additionally, regardless of cache settings, if a user hits the "Refresh Data" button on the toolbar, the Tableau Server will fetch new data.
If you want to use Python, then I would just use requests.get() to hit your URL on a schedule.
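A minimal sketch of that idea, assuming the view URL is known and authentication (e.g. a trusted ticket or session cookie) is handled separately; in practice a cron job is a better scheduler than a sleep loop:

import time
import requests

VIEW_URL = "https://tableau.example.com/views/MyDashboard/Overview"  # hypothetical URL

while True:
    # Requesting the view makes Tableau Server run its queries, which
    # loads the results into the external query cache.
    resp = requests.get(VIEW_URL, timeout=120)
    print("cache warmed, status:", resp.status_code)
    time.sleep(24 * 60 * 60)  # once a day, early in the morning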

How to trigger a Python script with a Hasura event

I'm currently building a self-hosted Vue.js web app (with account logins).
The web app needs to be a GUI for a Python web scraper, where the user has control over the scraper.
So, for example, the user fills in an endpoint, starts the scraper, views the results, triggers a new, more in-depth scrape, etc.
I have the Python scripts for scraping.
And I have decided to go with Vue.js + AWS Cognito + Hasura for the frontend.
I have difficulty understanding how to trigger the Python scripts, dump the results into the database, and show them in the frontend.
I do like the 3factor approach.
The data from my scrapers can be many DB entries, so I don't want to enter them into the database via mutations.
Do I have to make Flask endpoints to let Hasura trigger these webhooks?
I'm not familiar with serverless.
How do I make my Python scraper scripts serverless?
Or can I just use SQLAlchemy to dump the scraper results into the database?
But how do I notify my frontend user that the data is ready?
There are a lot of questions in this one, and the answer(s) will be somewhat opinionated, so it might not be the greatest fit for Stack Overflow.
That being said, after reading through your post I'd recommend that, for your first attempt at this, you use SQLAlchemy to store the results of your scraper jobs directly in the database (a sketch follows below). It sounds like you have the most familiarity with this approach.
With Hasura, you can simply have a subscription to the results of the job that you query in your front end, so the UI will automatically update on the Vue side as soon as the results become available.
You'll have to decide how you want to kick off the scraping jobs; you have a few different options:
Expose an API endpoint from your Python app and let the UI trigger it
Use Hasura Actions
Build a GQL server in Python and attach it to Hasura using Remote Schemas
Allow your app to put a record into the database using a GraphQL mutation that includes information about the scrape job, and then let Hasura trigger a webhook endpoint in your Python app using Hasura Event Triggers
Hasura doesn't care how the data gets into the database; it provides a ton of functionality and value even if you're using a different database access layer in another part of your stack.
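As a starting point for the SQLAlchemy route recommended above, here is a minimal sketch; the table, columns, and connection string are all hypothetical, and it is the Hasura subscription on the table that pushes new rows to the Vue front end:

# Sketch: bulk-insert scrape results into the Postgres database that
# Hasura tracks. Table shape and DSN are hypothetical.
from sqlalchemy import Column, Integer, Text, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class ScrapeResult(Base):
    __tablename__ = "scrape_results"
    id = Column(Integer, primary_key=True)
    job_id = Column(Integer, nullable=False)
    payload = Column(Text, nullable=False)

engine = create_engine("postgresql://user:pass@localhost/app")  # hypothetical DSN
Base.metadata.create_all(engine)

def store_results(job_id, rows):
    # One transaction for the whole batch; a Hasura subscription on
    # scrape_results will surface the new rows to the UI automatically.
    with Session(engine) as session:
        session.add_all(ScrapeResult(job_id=job_id, payload=r) for r in rows)
        session.commit()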

How to connect the Google Sheets API on AWS EC2

I am trying to use the Google Sheets API to load data into EC2 using Python.
So, I tried this quickstart.
But I am stuck configuring the OAuth client to get the credentials.json file. I can't understand which drop-down type I should select.
Hope I was clear. Thanks in advance for your time.
Depending on the type of application you want to create, you will have to choose one of the options provided in the dropdown.
Since you want to use a Python script, you can use credentials of the Desktop type and connect to AWS EC2 from the same application. However, you can always create new ones in the corresponding GCP project to match the application you are working on.
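For reference, a minimal sketch of the quickstart flow with a Desktop-type OAuth client; credentials.json is the file downloaded from the GCP console, the spreadsheet ID and range are placeholders, and on a headless EC2 instance it is easiest to run the consent step once on a local machine and copy the resulting token.json to the instance:

# Sketch of the Sheets quickstart with a Desktop-type OAuth client.
# Requires google-auth, google-auth-oauthlib and google-api-python-client.
import os.path

from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets.readonly"]
SPREADSHEET_ID = "your-spreadsheet-id"  # placeholder

creds = None
if os.path.exists("token.json"):
    creds = Credentials.from_authorized_user_file("token.json", SCOPES)
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        # Opens a browser for consent; run this part locally and copy
        # token.json to EC2 if the instance has no browser.
        flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
        creds = flow.run_local_server(port=0)
    with open("token.json", "w") as token:
        token.write(creds.to_json())

service = build("sheets", "v4", credentials=creds)
result = service.spreadsheets().values().get(
    spreadsheetId=SPREADSHEET_ID, range="Sheet1!A1:C10"
).execute()
print(result.get("values", []))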
Reference
Sheets API Authorize Requests.

Need an HTTPS endpoint for Microsoft Teams and Python integration

I have some Python scripts which perform jobs based on user inputs, and I want to host this on Microsoft Teams. For an outgoing webhook, Teams asks for an HTTPS link; where and how do I get one? I am pretty new to this, so do not take anything for granted.
Basically, this "outgoing webhook" means that Teams has the ability to call a web service of some sort, hosted on a publicly accessible HTTPS address. In the end, it functions very similarly to a bot, so it's possible to just create a full-blown bot. Here's guidance on creating a Microsoft bot (for Teams or otherwise) using Python.
However, there's a simpler option: basically just hosting a web function somewhere (e.g. an Azure Function or, I guess, an AWS Lambda). See this article. As mentioned in this link:
Outgoing webhooks post data from Teams to any chosen service capable of accepting a JSON payload. Once an outgoing webhook is added to a team, it acts like a bot, listening in channels for messages using #mention, sending notifications to external web services, and responding with rich messages that can include cards and images.
An Azure Function automatically gets a full, unique, https address, so it's fine to use.
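As a rough illustration, a Python Azure Function that answers an outgoing webhook might look like the sketch below; it assumes the classic azure-functions programming model and skips the HMAC signature verification that Teams performs on outgoing webhooks:

# Sketch: Azure Function endpoint for a Teams outgoing webhook.
# Skips HMAC validation; replies with the JSON message format Teams expects.
import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    payload = req.get_json()
    incoming_text = payload.get("text", "")  # the @mention message from Teams
    reply = {"type": "message", "text": "You said: " + incoming_text}
    return func.HttpResponse(json.dumps(reply), mimetype="application/json")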
As another example, this blog post describes how to create a Flow ("Power Automate") that the webhook calls into. This example also ends up using an Azure Function to "glue together" Teams + Flow, but it explains the concepts a bit. You could ignore Flow and just use an Azure Function.
Whether to build an actual bot depends on what else you might want to be able to do. For instance, do you want to have a more complete conversation with the user? Do you want the user to be able to interact with your code outside of a channel (e.g. a 1:1 conversation)? These are the kinds of things that will indicate whether you might need a proper bot.
You need to use the Bot Framework to create a bot that will handle that:
https://github.com/microsoft/botframework-sdk
https://github.com/microsoft/BotBuilder-Samples

Amazon MWS - how do I go about using the AnyOfferChanged notification with Python

I want to develop an app that sends me an email when pricing offers for specific listings change, using the AnyOfferChanged MWS notification. However, I can't find any good documentation on how to go about receiving the notifications. Is it a must to have AWS SQS, or can I use Django? How do I go about subscribing to a notification?
I already have a developer account and I'm using the Python mws library.
You need to subscribe to the AnyOfferChangedNotification through the Subscriptions API and yes, it must use SQS. I found it easiest to use the scratchpad to create the subscription, since it's usually a one-time event.
Once your price change notifications start flowing into your queue, write an app that reads the queue and you can respond to your messages, including sending an email if that's what you want to do.
See if these code samples for SQS help you: https://docs.aws.amazon.com/code-samples/latest/catalog/code-catalog-python-example_code-sqs.html
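A minimal sketch of reading that queue with boto3; the queue URL and region are placeholders, and parsing the notification XML plus sending the email are left as stubs:

# Sketch: long-poll the SQS queue that receives AnyOfferChanged
# notifications and react to each message.
import boto3

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/mws-notifications"  # placeholder
sqs = boto3.client("sqs", region_name="us-east-1")

while True:
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling
    )
    for message in resp.get("Messages", []):
        body = message["Body"]  # XML payload describing the offer change
        # ... parse the XML and send the email here ...
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])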
