I am currently using the Monstache connector to sync MongoDB with Elasticsearch. Is there any other recommended connector for this kind of sync?
Architecture: Angular 7, Elasticsearch, MongoDB, Python, PyCharm.
Is there any other solution apart from this one? https://rwynn.github.io/monstache-site/
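For reference, a dedicated connector is not the only option: MongoDB change streams (replica set required) can be tailed directly from Python and the changes indexed into Elasticsearch. A minimal sketch, assuming the pymongo and elasticsearch (8.x) packages and placeholder database, collection, and index names:

from elasticsearch import Elasticsearch
from pymongo import MongoClient

mongo = MongoClient("mongodb://localhost:27017")
es = Elasticsearch("http://localhost:9200")
collection = mongo["mydb"]["mycollection"]

# Tail the collection's change stream and mirror each insert/update
# into an Elasticsearch index of the same name.
with collection.watch(full_document="updateLookup") as stream:
    for change in stream:
        doc = change.get("fullDocument")
        if doc is None:
            continue  # e.g. deletes carry no fullDocument
        doc_id = str(doc.pop("_id"))
        es.index(index="mycollection", id=doc_id, document=doc)

This trades Monstache's resume tokens, delete handling, and bulk indexing for simplicity, so it is a starting point rather than a drop-in replacement.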
I am working on a process to automatically remove and add databases to Azure. When the database isn't in use, it can be removed from Azure and placed in cheaper S3 storage as a .bacpac.
I am using Microsoft's SqlPackage.exe from a PowerShell script to export and import these databases from and to Azure. I invoke it via a Python script so that I can also use boto3.
The issue I have is with the down direction at step 3. The sequence would be:
1. Download the Azure SQL DB to a .bacpac (can be achieved with SqlPackage.exe).
2. Upload this .bacpac to cheaper S3 storage (using the boto3 Python SDK).
3. Delete the Azure SQL database (it appears the Azure Blob Python SDK can't help me, and SqlPackage.exe does not have a delete function).
Is step 3 impossible to automate with a script? Could a workaround be to use SqlPackage.exe to import a small dummy .bacpac under the same name, overwriting the old, larger DB?
Thanks.
To remove an Azure SQL database using PowerShell, you will need to use the Remove-AzSqlDatabase cmdlet.
To remove an Azure SQL database using the Azure CLI, you will need to use az sql db delete.
If you want to write code in Python to delete the database, you will need to use the Azure SDK for Python.
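With the SDK route, step 3 becomes one call. A minimal sketch using the azure-identity and azure-mgmt-sql packages (track 2 SDK; older versions expose databases.delete instead of begin_delete), with placeholder subscription, resource group, server, and database names:

from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

# Placeholders: substitute your own identifiers.
credential = DefaultAzureCredential()
client = SqlManagementClient(credential, "<subscription-id>")

# Deletion is a long-running operation; begin_delete returns a poller.
poller = client.databases.begin_delete(
    resource_group_name="<resource-group>",
    server_name="<server-name>",
    database_name="<database-name>",
)
poller.wait()  # block until the delete completes

Since the rest of the pipeline is already driven from Python via boto3, this keeps the whole down direction in one script.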
I am looking for relevant Python code to be used in a GCP AI Platform notebook to query GCP Cloud SQL (specifically PostgreSQL). Unfortunately, I haven't found any relevant resources/tutorials, either from official GCP sources or from unaffiliated ones.
I work on the Developer Relations team for Cloud SQL at Google. The quickest and simplest way to connect to a Cloud SQL instance from an AI Platform Notebook is to use our newly released Cloud SQL Python Connector. To install, run pip install cloud-sql-python-connector[pg8000]. This will also install the pg8000 driver which is used to connect to Postgres.
Then, you can use the following code to get a connection object which you can use to run queries:
from google.cloud.sql.connector import connector

# The first argument is the instance connection name, in the form
# project:region:instance. (Newer releases of the package expose a
# Connector class instead of this module-level connect function.)
conn = connector.connect(
    "project_name:region-name:instance-name",
    "pg8000",
    user="postgres",
    password="password",
    db="db_name"
)
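The returned object is a standard DB-API connection from the pg8000 driver, so queries run through a cursor in the usual way, for example:

cursor = conn.cursor()
cursor.execute("SELECT NOW()")
print(cursor.fetchone())
cursor.close()
conn.close()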
An alternative, which is language-independent, is to download and run the Cloud SQL Auth Proxy in a separate terminal from your notebook.
For either of these solutions, make sure to enable the Cloud SQL Admin API first.
I'm pretty new to the AWS platform. I want to fetch data from an RDS MySQL database into my Python application using an API. I couldn't find any relevant tutorial to achieve this task. It would be awesome if you could guide me or point me to a proper tutorial.
I couldn't find any relevant tutorial to achieve this task.
This is because such an operation is not supported. To connect to your RDS database and get its data, you have to use regular MySQL tools, such as the mysql CLI, phpMyAdmin, or MySQL Workbench.
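From Python, the equivalent of those tools is an ordinary MySQL driver pointed at the RDS endpoint. A minimal sketch with PyMySQL (endpoint, credentials, and table name are placeholders; the instance's security group must allow inbound access from your client):

import pymysql

# Placeholder endpoint: copy the real one from the RDS console.
conn = pymysql.connect(
    host="mydb.abc123.us-east-1.rds.amazonaws.com",
    user="admin",
    password="secret",
    database="mydb",
)
with conn.cursor() as cursor:
    cursor.execute("SELECT * FROM my_table LIMIT 10")
    for row in cursor.fetchall():
        print(row)
conn.close()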
You can only use an AWS API for MySQL if you are using Aurora Serverless and its Data API.
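If you do go the Aurora Serverless route with the Data API enabled, queries are sent over HTTPS through boto3's rds-data client, with no database driver or open connection needed. A sketch with placeholder cluster and Secrets Manager ARNs:

import boto3

client = boto3.client("rds-data")
response = client.execute_statement(
    resourceArn="arn:aws:rds:us-east-1:123456789012:cluster:my-cluster",
    secretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:my-secret",
    database="mydb",
    sql="SELECT id, name FROM my_table",
)
for record in response["records"]:
    print(record)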
I am trying to execute a Python script from RDS SQL Server (version 15), but I didn't find any documentation around this in AWS. Will it be possible to do this?
Unfortunately, that is not possible as of now. RDS for SQL Server is just a relational database service, and it does not allow you to execute any program on the RDS instance, except for T-SQL programmability stored within your SQL Server database (triggers, stored procedures, etc.).
I am looking to create a Python application that accesses a database. However, people must be able to access this database from different computers and always receive an up-to-date version. I understand that this would have to be some sort of cloud-based database, but I cannot seem to find an API with Python bindings or a module that allows me to do this. Does anybody know of an API or module that I could use to do this?
You can try Google Cloud Datastore and the App Engine Datastore, which fulfil your requirements:
https://developers.google.com/datastore/ https://developers.google.com/appengine/docs/python/ndb/
And for an API you can use the Remote API.
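As a more current alternative to the Remote API, the google-cloud-datastore client library reaches the same Datastore from any machine with credentials, which matches the multiple-computers requirement. A minimal sketch (kind and property names are placeholders):

from google.cloud import datastore

client = datastore.Client()

# Write an entity from one computer.
key = client.key("Task", "sample-task")
entity = datastore.Entity(key=key)
entity["description"] = "Buy milk"
client.put(entity)

# Read the up-to-date entity back from any other computer
# authenticated against the same project.
print(client.get(key))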
If you are going to use AWS, you can use Amazon RDS for the database and Elastic Beanstalk for deploying your Python application to the cloud.
This link provides information on how to implement the database part on AWS: Adding an Amazon RDS DB Instance to Your Python Application Environment.
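When an RDS instance is attached to an Elastic Beanstalk environment, Beanstalk injects the connection settings as environment variables, so the application carries no hard-coded credentials. A sketch assuming a MySQL instance and the PyMySQL driver:

import os
import pymysql

# These RDS_* variables are set by Elastic Beanstalk when an RDS
# DB instance is attached to the environment.
conn = pymysql.connect(
    host=os.environ["RDS_HOSTNAME"],
    port=int(os.environ["RDS_PORT"]),
    user=os.environ["RDS_USERNAME"],
    password=os.environ["RDS_PASSWORD"],
    database=os.environ["RDS_DB_NAME"],
)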
If you want to use Microsoft Azure, then you can refer to the following links:
Azure SQL Database libraries for Python
Use Python to query an Azure SQL database
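The second link comes down to connecting with pyodbc and the Microsoft ODBC driver; a sketch with placeholder server, database, and credentials:

import pyodbc

# Placeholders: server, database, and login from the Azure portal.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=mydb;"
    "UID=myadmin;"
    "PWD=mypassword"
)
cursor = conn.cursor()
cursor.execute("SELECT TOP 5 name FROM sys.tables")
for row in cursor.fetchall():
    print(row.name)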