How to connect Azure Web App with Databricks - python

Is it possible for a Flask web app deployed to the Azure Web App service to connect to Databricks? The app should have some dropdown menus and send a command to Databricks, and Databricks should then query the data from the data lake.

As far as I know, there are two ways to connect to Azure Databricks from Python.
Refer to the official document Connect to Azure Databricks from Excel, Python, or R: you can download and install the Simba Spark ODBC Driver and pyodbc, then follow the section Connect from Python to retrieve data from Azure Databricks. However, I don't think you can follow the Simba official document About the Simba Spark ODBC Driver for Windows to install and configure the driver on an Azure Web App for Windows, but you can try to follow About the Simba Spark ODBC Driver for Unix/Linux to do so on an Azure Web App for Linux.
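Once the driver is installed, the pyodbc side is a DSN-less connection string. A minimal sketch, assuming the Simba driver is registered under that name, with placeholder values you would copy from your cluster's JDBC/ODBC settings:

import pyodbc

# All of the Host/HTTPPath/token values below are placeholders -- copy
# the real ones from your cluster's JDBC/ODBC connection details.
conn = pyodbc.connect(
    "Driver=Simba Spark ODBC Driver;"
    "Host=adb-<workspace-id>.azuredatabricks.net;"
    "Port=443;"
    "HTTPPath=sql/protocolv1/o/<workspace-id>/<cluster-id>;"
    "SSL=1;"
    "ThriftTransport=2;"
    "AuthMech=3;"  # username/password auth; use the literal user 'token'
    "UID=token;"
    "PWD=<personal-access-token>",
    autocommit=True,
)
cursor = conn.cursor()
cursor.execute("SELECT * FROM my_table LIMIT 10")  # hypothetical table
for row in cursor.fetchall():
    print(row)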
Alternatively, there is a Python package named databricks-connect; as its description says, you can use it to connect custom applications like Flask to Azure Databricks clusters and run Spark code.
For more details on its usage, you can refer to the official document Databricks Connect and the blog Databricks-Connect - FINALLY!.
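To give a rough idea of the databricks-connect route: after running databricks-connect configure with your workspace URL, token, and cluster ID, the Flask app builds an ordinary SparkSession and the work runs on the remote cluster. A minimal sketch (the table name is made up):

from pyspark.sql import SparkSession

# With databricks-connect configured, this session is backed by the
# remote Azure Databricks cluster rather than a local Spark install.
spark = SparkSession.builder.getOrCreate()

# Hypothetical query against a table registered in the workspace,
# e.g. one sitting on top of your data lake.
df = spark.sql("SELECT * FROM sales LIMIT 10")
df.show()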
Hope it helps.

Related

Download or export Azure Databricks notebooks to my local machine in Python using the REST API

I need an automated way to download Azure Databricks notebooks to my local machine using Python. Please let me know if there is a way to do this.
Yes, there is an API endpoint to export a notebook.
Refer to the documentation: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/workspace#--export
Here's how to make API requests with Python: Making a request to a RESTful API using python
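Putting the two together, a minimal sketch with requests might look like this (the workspace URL, token, and notebook path are placeholders you'd replace with your own):

import base64
import requests

HOST = "https://<databricks-instance>"  # your adb-....azuredatabricks.net URL
TOKEN = "<personal-access-token>"
NOTEBOOK_PATH = "/Users/me@example.com/my-notebook"  # hypothetical path

response = requests.get(
    f"{HOST}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"path": NOTEBOOK_PATH, "format": "SOURCE"},
)
response.raise_for_status()

# The exported notebook comes back base64-encoded in the JSON body.
with open("my-notebook.py", "wb") as f:
    f.write(base64.b64decode(response.json()["content"]))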

Access GCP Cloud SQL from AI notebook?

I'm looking for relevant Python code to be used in a GCP AI Platform notebook which will be able to query GCP Cloud SQL (specifically PostgreSQL). Unfortunately, I haven't found any relevant resources/tutorials from GCP officially or even from unaffiliated sources.
I work on the Developer Relations team for Cloud SQL at Google. The quickest and simplest way to connect to a Cloud SQL instance from an AI Platform Notebook is to use our newly released Cloud SQL Python Connector. To install, run pip install cloud-sql-python-connector[pg8000]. This will also install the pg8000 driver which is used to connect to Postgres.
Then, you can use the following code to get a connection object which you can use to run queries:
from google.cloud.sql.connector import Connector

# Create a Connector and use it to obtain a DB-API connection; the
# first argument is the instance connection name.
connector = Connector()
conn = connector.connect(
    "project_name:region-name:instance-name",
    "pg8000",
    user="postgres",
    password="password",
    db="db_name",
)
An alternative, which is language-independent, is to download and run the Cloud SQL Auth Proxy in a separate terminal from your notebook.
For either of these solutions, make sure to enable the Cloud SQL Admin API first.
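For the proxy route, once it is listening locally you connect with any ordinary Postgres driver. A minimal sketch, assuming the proxy was started with -instances="project:region:instance"=tcp:5432 and placeholder credentials:

import pg8000.dbapi

# Assumes the Cloud SQL Auth Proxy is already running in another
# terminal and forwarding 127.0.0.1:5432 to your instance.
conn = pg8000.dbapi.connect(
    host="127.0.0.1",
    port=5432,
    user="postgres",
    password="password",
    database="db_name",
)
cur = conn.cursor()
cur.execute("SELECT version()")
print(cur.fetchone())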

Is there any way to connect a GAE application with a custom Google Cloud Database?

I'm trying to connect my webapp2 application to an 'in-cloud' database.
To run it locally I'm using the following flags:
--datastore_path=/<path>/<to>/<project>/.db/datastore
--blobstore_path=/<path>/<to>/<project>/.db/blobstore
The problem is that I don't want a local path to my datastore/blobstore.
Is there any way to connect to an 'in-cloud' database by passing a different path? I can't find any solution like that.
You can use the Remote API in order to get remote access to Google Cloud Datastore from webapp2:
The Remote API library allows any Python client to access services available to App Engine applications. For example, if your App Engine application uses Datastore or Google Cloud Storage, a Python client could access those storage resources using the Remote API.
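As a rough sketch of the client side, assuming the remote_api builtin is enabled in the deployed app's app.yaml and using a placeholder app ID:

from google.appengine.ext import ndb
from google.appengine.ext.remote_api import remote_api_stub

# '<your-app-id>' is a placeholder; this also requires the builtin
# '- remote_api: on' in the deployed app's app.yaml.
remote_api_stub.ConfigureRemoteApiForOAuth(
    "<your-app-id>.appspot.com", "/_ah/remote_api")

class Greeting(ndb.Model):  # hypothetical model
    content = ndb.StringProperty()

# Datastore calls now hit the live, in-cloud datastore instead of a
# local --datastore_path file.
print(Greeting.query().fetch(10))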

How to deploy a PostgreSQL Python app which is hosted in Google Cloud?

I have a problem deploying a Python app which writes additional info to an SQL database. The PostgreSQL database is hosted in Google Cloud, and to connect to it I'm using a credentials file and invoking the proxy through cmd in the path where the JSON is stored. The command, in PowerShell:
>>CD "path where stored"
>>cloud_sql_proxy -credential_file=./credentials.json -instances="[PROJECT-ID]:[REGION]:[INSTANCE-NAME]=tcp:5435"
The question: how do I deploy it to Google App Engine, and how can I call the function with the JSON credentials without typing it into cmd?
I see you are using the Cloud SQL Proxy to connect to your PostgreSQL instance.
In the specific case of App Engine the public facing documentation offers clear instructions and code on how to achieve this.
You basically need to grant App Engine's service account the correct permissions (the recommended role is Cloud SQL Client) and adapt the code in your Python application to connect to the instance.
The linked documentation shows how to use SQLAlchemy to achieve this connection for both public and private IP.
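For instance, the documented public-IP pattern on App Engine standard goes through a Unix socket under /cloudsql. A minimal sketch with SQLAlchemy 1.4+ and pg8000, where the environment variable names are placeholders:

import os
import sqlalchemy

# Placeholder environment variables; on App Engine the /cloudsql socket
# directory is provided automatically once the service account has the
# Cloud SQL Client role.
engine = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL.create(
        drivername="postgresql+pg8000",
        username=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
        database=os.environ["DB_NAME"],
        query={
            "unix_sock": "/cloudsql/{}/.s.PGSQL.5432".format(
                os.environ["INSTANCE_CONNECTION_NAME"]
            )
        },
    )
)

with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT NOW()")).fetchone())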

How can I make a database that can be accessed by multiple people running the same program on different computers?

I am looking to create a Python application that accesses a database. However, people must be able to access this database from different computers and always receive an up-to-date version. I understand that this would have to be some sort of cloud-based database, but I cannot seem to find an API with Python bindings or a module that allows me to do this. Does anybody know of an API or module that I could use to do this?
You can try Google Cloud Datastore and the App Engine NDB Datastore API, which fulfil your requirements:
https://developers.google.com/datastore/ https://developers.google.com/appengine/docs/python/ndb/
And for an API you can use the Remote API.
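As a tiny sketch of what the NDB side looks like inside an App Engine app (the model and field names are made up):

from google.appengine.ext import ndb

# A hypothetical model; every user of the deployed app reads and
# writes the same cloud-hosted datastore, so the data is always current.
class Note(ndb.Model):
    text = ndb.StringProperty()
    created = ndb.DateTimeProperty(auto_now_add=True)

Note(text="visible from any computer").put()
latest = Note.query().order(-Note.created).fetch(5)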
If you are going to use AWS, you can use Amazon RDS for the database and Elastic Beanstalk for deploying your Python application to the cloud.
This link provides information on how to implement the database part on AWS: Adding an Amazon RDS DB Instance to Your Python Application Environment
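Once an RDS instance is attached to the Beanstalk environment, its connection details are exposed to the app as environment variables; a minimal sketch assuming a PostgreSQL engine and psycopg2:

import os
import psycopg2

# Elastic Beanstalk injects these RDS_* variables when an RDS instance
# is attached to the environment (assuming a PostgreSQL engine here).
conn = psycopg2.connect(
    host=os.environ["RDS_HOSTNAME"],
    port=os.environ["RDS_PORT"],
    dbname=os.environ["RDS_DB_NAME"],
    user=os.environ["RDS_USERNAME"],
    password=os.environ["RDS_PASSWORD"],
)
cur = conn.cursor()
cur.execute("SELECT 1")
print(cur.fetchone())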
If you want to use Microsoft Azure, then you can refer to the following links:
Azure SQL Database libraries for Python
Use Python to query an Azure SQL database
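For example, the query tutorial above boils down to a pyodbc connection along these lines (server, database, and credentials are placeholders):

import pyodbc

# Placeholder connection details for an Azure SQL database; requires
# the Microsoft ODBC Driver for SQL Server to be installed.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-db>;"
    "Uid=<user>;"
    "Pwd=<password>;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)
cursor = conn.cursor()
cursor.execute("SELECT TOP 5 name FROM sys.tables")
for row in cursor:
    print(row.name)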
