I have built an app that uses a MySQL database with Python. I would love to share some functionality with different applications, and that calls for an online database. Kindly give me some insight into how I can move a Python MySQL database online and how to make calls to it, in order to facilitate sharing of data between different applications.
I don't exactly know what you mean by a "Python database", but there are some options here that you might want to consider.
First, you can use Heroku to host your app and Heroku Postgres to host your database. Or you can use an AWS EC2 machine to host your app and its database (in case it's custom code that you can't call from a browser using Heroku). With both of these options you can access your database and the app; with the second one you can also install other services, such as SSH.
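For the Heroku Postgres route, here is a minimal sketch of what the Python side looks like, assuming the psycopg2 package is installed and Heroku's standard DATABASE_URL config var is set (both are assumptions, not details from your app):

```python
import os
import psycopg2

# Heroku Postgres exposes its connection string in the DATABASE_URL
# environment variable, e.g. postgres://user:pass@host:5432/dbname.
conn = psycopg2.connect(os.environ["DATABASE_URL"], sslmode="require")
cur = conn.cursor()
cur.execute("SELECT version();")
print(cur.fetchone())
cur.close()
conn.close()
```

Any application holding those credentials can connect the same way, which is what gives you the data sharing between applications that you're after.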
I haven't been able to find any documentation on whether it's possible to access SQLite3 (using Python) when the SQLite database is hosted externally:
I have my SQLite3 database hosted on my VPS (alongside some other stuff that doesn't really matter), rather than having it as a local file with my Python program.
Is it possible for me to connect to the SQLite database hosted on my VPS, or will the SQLite DB have to be hosted locally for me to be able to do this?
The reason I want it to be accessible from my VPS is that I want to be able to run the program on multiple computers and have them all access the same database. If this isn't possible, are there any other options that would allow me to do this?
If you want a database server with external, possibly remote, applications interacting over a client-server protocol, switch to PostgreSQL, MariaDB, etc. SQLite is an embedded, file-based engine with no network server of its own, so it isn't designed for this.
see: How to connect to SQLite3 database server?
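To make the idea concrete, here is a hedged sketch of what the multi-computer setup looks like once a server-based engine (MariaDB here) runs on the VPS, assuming the mysql-connector-python package is installed and the server accepts remote connections; host, credentials and database name are placeholders:

```python
import mysql.connector

# Every computer runs this same code and talks to the one server on the
# VPS, so they all see the same, current data.
conn = mysql.connector.connect(
    host="your-vps-hostname",  # placeholder: your VPS address
    user="appuser",            # placeholder credentials
    password="secret",
    database="appdb",          # placeholder database name
)
cur = conn.cursor()
cur.execute("SELECT NOW()")
print(cur.fetchone())
conn.close()
```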
I have tried to follow Google's documentation on how to set up local development using a database (https://cloud.google.com/appengine/docs/standard/python/tools/using-local-server#Python_Using_the_Datastore). However, I do not have the experience level to follow along, and I am not even sure that was the right guide. The application is a Django project that uses Python 2.7. To run the local host, I usually type dev_appserver.py --host 127.0.0.1.
My questions are:
How do I download the Datastore database from Google Cloud? I do not want to download the entire database, just enough data to populate localhost so I can run tests.
Once the database is downloaded, what do I need to do to connect it to localhost? Do I have to change a parameter somewhere?
Do I need to download the Datastore at all? Can I just make a duplicate on the cloud and then connect to that Datastore?
When I run localhost, should it not already be connected to the Datastore, since the site works when it is running on the cloud? Where can I find the connection URI?
Thanks for the help
The development server is meant to simulate the whole App Engine environment; if you examine the output of the dev_appserver.py command you'll see something like Starting Cloud Datastore emulator at: http://localhost:PORT. Your code will interact with that bundled Datastore automatically, pushing and retrieving data according to the code you wrote. Your data will be saved to a file in local storage and will persist across different runs of the development server unless it's explicitly deleted.
This option doesn't provide facilities to import data from your existing Cloud Datastore instance, although it's a ready-to-go solution if your testing procedures can afford populating the local database with mock data through a custom script that creates it programmatically. If you decide on this approach, just write the data-creation script and execute it before running the tests.
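A hedged sketch of such a script for the Python 2.7 runtime, assuming your models use ndb; the Article model and its properties are hypothetical placeholders, not taken from your project. Run it against the development server (for example from a temporary admin handler or the remote_api shell) so the writes land in the emulated Datastore:

```python
from google.appengine.ext import ndb

class Article(ndb.Model):
    # Hypothetical model, for illustration only.
    title = ndb.StringProperty()
    body = ndb.TextProperty()

def populate(n=10):
    # Insert n fake entities so local tests have data to query.
    for i in range(n):
        Article(title='Test article %d' % i, body='placeholder body').put()
```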
Now, there is another option to simulate a local Datastore using the Cloud SDK, which comes with handy features for your purposes. You can find the available information for it under the Running the Datastore Emulator documentation page. This emulator has support for importing entities downloaded from your production Cloud Datastore, as well as for exporting them into files.
Back to your questions:
Export the data from the Cloud instance into a GCS bucket following this, then download the data from the bucket to your filesystem following this, and finally import the data into the emulator with the command shown here.
To use the emulator you need to first run gcloud beta emulators datastore start in a Cloud Shell and then in a separate tab run dev_appserver.py --support_datastore_emulator=true --datastore_emulator_port=8081 app.yaml.
The development server uses one of the two aforementioned emulators; in both cases it is not connected to your Cloud Datastore. You might create another project intended for development purposes with a copy of your database and deploy your application there, so you don't use the emulator at all.
Requests to Datastore are made through the endpoint https://datastore.googleapis.com/v1/projects/project-id, although this is not related to how the emulators manage the connections in your local server.
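For reference, a hedged sketch of querying that endpoint directly through the REST API's runQuery method; my-project and the kind Task are placeholders, and ACCESS_TOKEN must be a valid OAuth2 token (for example from gcloud auth print-access-token):

```python
import requests

PROJECT_ID = "my-project"  # placeholder project id
ACCESS_TOKEN = "ya29...."  # placeholder OAuth2 token

# Fetch all entities of the (placeholder) kind "Task".
resp = requests.post(
    "https://datastore.googleapis.com/v1/projects/%s:runQuery" % PROJECT_ID,
    headers={"Authorization": "Bearer %s" % ACCESS_TOKEN},
    json={"query": {"kind": [{"name": "Task"}]}},
)
print(resp.json())
```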
Hope this helps.
I am looking to create a Python application that accesses a database. However, people must be able to access this database from different computers and always receive an up-to-date version. I understand that this would have to be some sort of cloud-based database, but I cannot seem to find an API with Python bindings or a module that allows me to do this. Does anybody know of an API or module that I could use to do this?
You can try Google Cloud Datastore and the App Engine Datastore, which fulfil your requirements:
https://developers.google.com/datastore/ https://developers.google.com/appengine/docs/python/ndb/
And for the API you can use the Remote API.
If you are going to use AWS, you can use Amazon RDS for the database and Elastic Beanstalk for deploying your Python application to the cloud.
This link explains how to implement the database part on AWS: Adding an Amazon RDS DB Instance to Your Python Application Environment
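The core of that guide, for a Django project, is that Elastic Beanstalk exposes the attached RDS instance's connection details as RDS_* environment variables, which you read in settings.py. The MySQL backend below is an assumption; pick the one matching your RDS engine:

```python
import os

# settings.py: use the RDS instance that Elastic Beanstalk attached to
# the environment; the RDS_* variables only exist when running on AWS.
if "RDS_HOSTNAME" in os.environ:
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.mysql",
            "NAME": os.environ["RDS_DB_NAME"],
            "USER": os.environ["RDS_USERNAME"],
            "PASSWORD": os.environ["RDS_PASSWORD"],
            "HOST": os.environ["RDS_HOSTNAME"],
            "PORT": os.environ["RDS_PORT"],
        }
    }
```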
If you want to use Microsoft Azure, then you can refer to the following links:
Azure SQL Database libraries for Python
Use Python to query an Azure SQL database
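The second link boils down to a pyodbc connection. A minimal sketch, assuming the pyodbc package and a SQL Server ODBC driver such as "ODBC Driver 17 for SQL Server" are installed; server, database and credentials are placeholders:

```python
import pyodbc

# Connect to the Azure SQL database over ODBC and run a trivial query.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-server.database.windows.net;"  # placeholder server
    "DATABASE=your-db;UID=your-user;PWD=your-password"
)
cur = conn.cursor()
cur.execute("SELECT @@VERSION")
print(cur.fetchone()[0])
conn.close()
```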
Yesterday, I installed an Apache web server and phpMyAdmin on my Raspberry Pi. How can I connect my Raspberry Pi to the databases in phpMyAdmin with Python? Can I use MySQL? Thanks, I hope you understand my question, and sorry for my bad English.
Your question is quite unclear, but from my understanding, here is what you should try doing. (Note: I am assuming you want to connect your Pi to a database to collect and store data in an IoT-based application.)
Get a server. Any basic server would do. I recommend DigitalOcean or AWS Lightsail; they have usable servers for just $5 per month. I recommend Ubuntu 16.04 for ease of use.
SSH into the server from your terminal using the IP address you got when you created the server.
Install Apache, MySQL, Python, PHPMyAdmin on the server.
Write your web application in any language/framework you want.
Deploy it and write a separate program to make HTTP calls to that web server.
MySQL is the database server. Python is the language used to execute any instructions. phpMyAdmin is the interface to view MySQL databases and tables. Apache is the web server that serves the application you have written to deal with requests.
I strongly recommend understanding the basics of the client-server model of computing over HTTP.
Alternatively, you could also use a database-as-a-service from any popular cloud service provider (e.g., AWS RDS) to make calls directly into the DB.
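To directly answer the original question: yes, you can use MySQL from Python. A minimal sketch, assuming the pymysql package is installed; host, credentials, database and table are placeholders (use "localhost" if MySQL runs on the Pi itself, and note that phpMyAdmin is only a viewer, so Python talks to MySQL, not to phpMyAdmin):

```python
import pymysql

conn = pymysql.connect(
    host="localhost",       # or the remote server's address
    user="pi",              # placeholder credentials
    password="secret",
    database="sensordata",  # placeholder database created via phpMyAdmin
)
with conn.cursor() as cur:
    cur.execute("SELECT * FROM readings LIMIT 5")  # placeholder table
    for row in cur.fetchall():
        print(row)
conn.close()
```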
I spent the whole of yesterday migrating my Django application to OpenShift (I selected the free solution and my application is using one small gear). My application is now up and running and there are no issues visiting the site using a browser.
However, I have a .NET (C#) application which accompanies the web application; it will be run by many different users and it needs to access the database, but I cannot find a way to do this in OpenShift.
All the IP addresses seem to be local, and I cannot find a way to access the MySQL database remotely. Below are the environment variables from OpenShift:
env | grep MYSQL
OPENSHIFT_MYSQL_DIR=/var/lib/openshift/.../mysql/
OPENSHIFT_MYSQL_DB_PORT=3306
OPENSHIFT_MYSQL_DB_HOST=127.13.169.130
OPENSHIFT_MYSQL_DB_PASSWORD=...
OPENSHIFT_MYSQL_IDENT=redhat:mysql:5.5:0.2.9
OPENSHIFT_MYSQL_DB_USERNAME=...
OPENSHIFT_MYSQL_DB_SOCKET=/var/lib/openshift/.../mysql//socket/mysql.sock
OPENSHIFT_MYSQL_DB_URL=mysql://..-...#127.13.169.130:3306/
OPENSHIFT_MYSQL_DB_LOG_DIR=/var/lib/openshift/.../mysql//log/
OPENSHIFT_MYSQL_LD_LIBRARY_PATH_ELEMENT=/opt/rh/mysql55/root/usr/lib64
As explained in the title, I am not looking for the port-forwarding solution. (I need to make it work not only for me but for all the users.)
What am I missing?
Why can't databases be accessed externally?
What should I do?
Are there any other free PaaS providers out there which offer what I am looking for?
Do I need to get a medium or big gear in order for this to work?
Thanks
If you don't want to use port forwarding, then I would suggest you write an API that your .NET application can use to access the database. Otherwise, you would want to look into an externally hosted database (DBaaS) solution.
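A hedged sketch of that API approach, reading the OPENSHIFT_MYSQL_* variables shown above; Flask and the items table are illustrative choices (a view in your existing Django app would work the same way), and the .NET clients would then call this HTTP endpoint instead of opening a MySQL connection:

```python
import os

import MySQLdb
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/items")
def list_items():
    # The database is only reachable from inside the gear, so this
    # endpoint acts as the public gateway for external clients.
    conn = MySQLdb.connect(
        host=os.environ["OPENSHIFT_MYSQL_DB_HOST"],
        port=int(os.environ["OPENSHIFT_MYSQL_DB_PORT"]),
        user=os.environ["OPENSHIFT_MYSQL_DB_USERNAME"],
        passwd=os.environ["OPENSHIFT_MYSQL_DB_PASSWORD"],
        db="appdb",  # placeholder database name
    )
    cur = conn.cursor()
    cur.execute("SELECT id, name FROM items")  # placeholder table
    rows = [{"id": r[0], "name": r[1]} for r in cur.fetchall()]
    conn.close()
    return jsonify(items=rows)
```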
OK, after lots of googling I now know that with the free solution provided by OpenShift it is not possible to solve this issue.
You must upgrade to a paid plan in order to get another port that allows accessing the SQL database directly.