Is there a way to programmatically DROP a SQL Database in Azure? - python

I am working on a process to automatically remove and add databases to Azure. When the database isn't in use, it can be removed from Azure and placed in cheaper S3 storage as a .bacpac.
I am using Microsoft's SqlPackage.exe, wrapped in a PowerShell script, to export and import these databases to and from Azure. I invoke it via a Python script so I can also use boto3.
The issue I have is with the down direction at step 3. The sequence would be:
1. Download the Azure SQL DB to a .bacpac (can be achieved with SqlPackage.exe)
2. Upload this .bacpac to cheaper S3 storage (using the boto3 Python SDK)
3. Delete the Azure SQL database (it appears the Azure Blob Python SDK can't help me, and SqlPackage.exe does not have a delete function)
Is step 3 impossible to automate with a script? Could a workaround be to use SqlPackage.exe to import a small dummy .bacpac with the same name, overwriting the old, larger DB?
Thanks.

To remove an Azure SQL database using PowerShell, use the Remove-AzSqlDatabase cmdlet.
To remove an Azure SQL database using the Azure CLI, use az sql db delete.
If you want to delete the database from Python, use the Azure SDK for Python (the azure-mgmt-sql management package).
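For the Python route, a minimal sketch might look like the following; the subscription, resource group, server, and database names are placeholders you would substitute, and it assumes the azure-identity and azure-mgmt-sql packages are installed:

from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

# Placeholder values -- substitute your own.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-resource-group"
SERVER_NAME = "my-sql-server"
DATABASE_NAME = "my-database"

# DefaultAzureCredential picks up environment variables, a managed identity, or an `az login` session.
client = SqlManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# begin_delete returns a poller; .result() blocks until the database is actually gone.
client.databases.begin_delete(RESOURCE_GROUP, SERVER_NAME, DATABASE_NAME).result()

This pairs naturally with the boto3 upload in step 2, so the whole down path can live in one Python script.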

Related

How to get list of the Postgres databases on Google Cloud using Django/Python

I intend to get the list of databases that are available in a PostgreSQL instance on Google Cloud.
With the command-line gcloud tool, the following provides the expected result for my project:
gcloud sql databases list --instance=mysqlinstance
How can I get the same result through Python/Django while using Google Cloud?
This is a good trick for pretty much anything you do via gcloud and want to do within a script.
With any gcloud call, you can add the --log-http flag and it will print a lot of useful detail, including a uri field, which is the REST API call gcloud uses to fetch the information.
So in your case, you can run:
gcloud --log-http sql databases list --instance=mysqlinstance and the uri will come back as:
https://sqladmin.googleapis.com/sql/v1beta4/projects/<project-name>/instances/mysqlinstance/databases?alt=json
You can now use that REST call within your Python/Django script to fetch the same data. You will need to handle credentials, of course, because your script won't necessarily be authorized the same way gcloud is. If you always run the script from your own environment, it will work fine, since it will pick up your user credentials just as gcloud does; but if the script needs to run elsewhere, you will have to manage credentials for it explicitly.
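As a sketch, the same call can be made through the google-api-python-client Discovery client; the project and instance names are placeholders, and it assumes Application Default Credentials are available (e.g. via gcloud auth application-default login or a service account) and that the Cloud SQL Admin API is enabled:

from googleapiclient import discovery

# Builds a client for the Cloud SQL Admin API (the same API gcloud calls under the hood).
service = discovery.build("sqladmin", "v1beta4")

# Placeholder project and instance -- substitute your own.
response = service.databases().list(project="my-project", instance="mysqlinstance").execute()

for db in response.get("items", []):
    print(db["name"])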

How to deploy a PostgreSQL Python app which is hosted in Google Cloud?

I have a problem deploying a Python app that writes additional info to a SQL database. PostgreSQL is hosted in Google Cloud, and to connect to it I run the Cloud SQL Proxy with a credentials file from the command line, in the directory where the JSON is stored. The command,
in PowerShell:
>>CD "path where stored"
>>cloud_sql_proxy -credential_file=./credentials.json -instances="[PROJECT-ID]:[REGION]:[INSTANCE-NAME]=tcp:5435"
The question: how do I deploy this to Google App Engine, and how can I make the connection with the credentials JSON without typing the command into cmd by hand?
I see you are using the Cloud SQL Proxy to connect to your PostgreSQL instance.
In the specific case of App Engine, the public-facing documentation offers clear instructions and code on how to achieve this.
You basically need to grant the App Engine service account the correct permissions (the recommended role is Cloud SQL Client) and adapt the code in your Python application to connect to the instance.
The linked page shows how to use SQLAlchemy to make this connection over both public and private IP.
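As a rough sketch of that connection (the user, password, database, and instance connection name are placeholders, and it assumes the pg8000 driver and SQLAlchemy 1.4+), App Engine Standard connects through a Unix socket rather than a locally running proxy:

import sqlalchemy

# Placeholder values -- substitute your own.
DB_USER = "my-user"
DB_PASS = "my-password"
DB_NAME = "my-database"
INSTANCE_CONNECTION_NAME = "my-project:us-central1:my-instance"  # [PROJECT-ID]:[REGION]:[INSTANCE-NAME]

# On App Engine Standard, Cloud SQL is exposed as a Unix socket under /cloudsql/.
# URL.create requires SQLAlchemy 1.4 or later.
engine = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL.create(
        drivername="postgresql+pg8000",
        username=DB_USER,
        password=DB_PASS,
        database=DB_NAME,
        query={"unix_sock": f"/cloudsql/{INSTANCE_CONNECTION_NAME}/.s.PGSQL.5432"},
    )
)

with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())

No proxy process or credentials file is needed on App Engine itself; the service account's Cloud SQL Client role covers authorization.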

Run Python script on Azure and save to SQL database

We have just signed up with Azure and are wondering how to schedule and run Python scripts that extract data from various sources like APIs, web-scraping scripts, etc. What is the best tool on Azure that can run and schedule those scripts, as well as save their output to a target destination?
The output of the scripts will be saved to data lakes and/or an Azure SQL database.
Thank you.
There are several services in Azure that can do this task.
I suggest you make use of Azure WebJobs (it supports Python and supports running on a schedule).
The rough guidelines are as below:
1. Develop your Python scripts locally and make sure they work locally (e.g. extract data from the other sources, save to the Azure database); a minimal sketch follows this list.
2. In the Azure portal, create a scheduled WebJob. During creation, you need to upload the .py file (zip all the files into a .zip file); for "Type", select "Triggered"; in the "Triggers" dropdown, select "Scheduled"; then specify when the .py file should run using a CRON expression.
3. It's done.
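As a minimal sketch of step 1 (the API URL, connection string, and table name are placeholders; it assumes the requests and pyodbc packages plus a SQL Server ODBC driver are available on the WebJob host):

import requests
import pyodbc

# Placeholder source API and Azure SQL connection string -- substitute your own.
API_URL = "https://example.com/api/records"
CONN_STR = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:my-server.database.windows.net,1433;"
    "Database=my-database;Uid=my-user;Pwd=my-password;Encrypt=yes;"
)

# Extract: pull JSON records from the source API.
records = requests.get(API_URL, timeout=30).json()

# Load: insert the records into an Azure SQL table.
with pyodbc.connect(CONN_STR) as conn:
    cursor = conn.cursor()
    cursor.executemany(
        "INSERT INTO dbo.MyTable (id, value) VALUES (?, ?)",
        [(r["id"], r["value"]) for r in records],
    )
    conn.commit()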
You can also consider other Azure services, like an Azure Function with a timer trigger, but the WebJob is much easier.
Hope it helps; please let me know if you still have issues with this.

Using Cloud SQL (Postgres) in Cloud Function

I'm trying to connect to Cloud SQL running PostgreSQL 11 from a Cloud Function Python script. Ideally, I want to write to the DB with a simple pandas.DataFrame.to_sql(), which requires a sqlalchemy.create_engine() connection.
This is my current attempt (I've tried both with/without /.s.PGSQL.5432 suffix):
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME/.s.PGSQL.5432')
engine.connect()
I also tried the code sample provided by Google, which uses psycopg2 directly, but that didn't work either:
Error: function crashed. Details:
could not connect to server: Connection refused
Is the server running locally and accepting
connections on Unix domain socket "/cloudsql/vida-production:us-central1:vida-ops/.s.PGSQL.5432"?
Edit:
Thanks @kurtisvg for pointing this out. Since the Cloud Function is trying to connect to a Cloud SQL instance in a different project, I have to add the Cloud Function's service account to the IAM of the Cloud SQL project.
After setting IAM, the Python code I use to connect sqlalchemy engine to Cloud SQL is this:
# Don't add the "/.s.PGSQL.5432" suffix, because the library adds it back automatically...
engine = sqlalchemy.create_engine('postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME?host=/cloudsql/INSTANCE_CONNECTION_NAME')
engine.connect()
Make sure to check the following things:
Have the Cloud SQL Admin API enabled.
Make sure the Function's default service account has the Cloud SQL Client IAM role.
If you are connecting between two projects, make sure to enable the Admin API and grant the role (in both projects) to the service account that runs the function you are connecting from.
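To close the loop on the pandas.DataFrame.to_sql() goal from the question, a sketch along these lines should work once the engine above connects (the table name and frame are illustrative):

import pandas as pd
import sqlalchemy

# Same engine as in the edit above; '@' separates the credentials from the database name.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://USERNAME:PASSWORD@/DB_NAME"
    "?host=/cloudsql/INSTANCE_CONNECTION_NAME"
)

# Illustrative frame -- in a real function this would come from the trigger payload.
df = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})

# Appends rows to the table, creating it if it doesn't exist.
df.to_sql("my_table", engine, if_exists="append", index=False)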

How can I make a database that can be accessed by multiple people running the same program on different computers?

I am looking to create a Python application that accesses a database. However, people must be able to access this database from different computers and always receive an up-to-date version. I understand that this would have to be some sort of cloud-based database, but I cannot seem to find an API with Python bindings or a module that allows me to do this. Does anybody know of an API or module that I could use?
You can try Google Cloud Datastore and the App Engine Datastore, which fulfil your requirements:
https://developers.google.com/datastore/ https://developers.google.com/appengine/docs/python/ndb/
And for an API you can use the Remote API.
If you are going to use AWS, you can use Amazon RDS for the database and Elastic Beanstalk for deploying your Python application in the cloud.
This link explains how to implement the database part on AWS: Adding an Amazon RDS DB Instance to Your Python Application Environment.
If you want to use Microsoft Azure, you can refer to the following links:
Azure SQL Database libraries for Python
Use Python to query an Azure SQL database
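For the Azure SQL route specifically, a minimal query sketch might look like this (the server, database, and credentials are placeholders; it assumes the pyodbc package and a SQL Server ODBC driver are installed on each client machine):

import pyodbc

# Placeholder connection details -- substitute your own.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:my-server.database.windows.net,1433;"
    "Database=my-database;Uid=my-user;Pwd=my-password;Encrypt=yes;"
)

# Every computer running the program queries the same server-side state.
cursor = conn.cursor()
for row in cursor.execute("SELECT TOP 5 * FROM dbo.MyTable"):
    print(row)

conn.close()

Because the database lives on the server, every client always reads the current data; concurrency is handled by the database engine rather than by your application.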
