I have a relatively extensive sqlite database that I'd like to import into my Google App Engine python app.
I've created my models using the appengine API which are close, but not quite identical to the existing schema. I've written an import script to load the data from sqlite and create/save new appengine objects, but the appengine environment blocks me from accessing the sqlite library. This script is only to be run on my local app engine instance, and from there I hope to push the data to google.
Am I approaching this problem the wrong way, or is there a way to import the sqlite library while running in the local instance's environment?
I would make suitable CSV files from the SQLite data in a separate script, then use bulk loading to push the data from the CSV files up to App Engine.
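For illustration, a minimal sketch of the export step (the database file, table, and column names here are hypothetical placeholders for your real schema):

```python
# Dump a SQLite table to a CSV file suitable for the bulkloader.
# File, table, and column names are placeholders for your real schema.
import csv
import sqlite3

conn = sqlite3.connect('mydata.db')
cursor = conn.execute('SELECT id, name, created FROM my_table')

with open('my_table.csv', 'wb') as f:  # binary mode for the csv module on Python 2
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cursor.description])  # header row
    writer.writerows(cursor)  # each row tuple becomes one CSV line

conn.close()
```

Repeat per table (or kind), then feed the resulting files to the bulkloader.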
If you need to access your datastore outside of the App Engine environment (for example, if you need to use libraries not present in App Engine, or do other things with the datastore that App Engine does not support), then the best option is the Remote API.
There is an excellent tutorial on that here:
http://code.google.com/appengine/articles/remote_api.html
Essentially you import the remote_api module, authenticate with Google to access your datastore, then run your data access commands (query, update, delete, etc.) as you normally would in App Engine.
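A minimal sketch of such a session, following the pattern in the tutorial above (the app id and the Greeting model are placeholders; newer SDKs expose this as ConfigureRemoteApi):

```python
# Connect to the production datastore from a local script via remote_api.
# The app id and the Greeting model below are placeholders.
import getpass

from google.appengine.ext import ndb
from google.appengine.ext.remote_api import remote_api_stub

def auth_func():
    return (raw_input('Email: '), getpass.getpass('Password: '))

remote_api_stub.ConfigureRemoteApi(
    None, '/_ah/remote_api', auth_func, 'your-app-id.appspot.com')

class Greeting(ndb.Model):
    content = ndb.StringProperty()

# From here on, datastore calls behave as they would inside App Engine.
Greeting(content='hello from my workstation').put()
print Greeting.query().count()
```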
According to Google, you're doing it backwards. The app should be pulling data from you where you have more flexibility in converting to the new model anyway.
I have not had any trouble importing pysqlite2, reading data, then transforming it and writing it to App Engine using the remote_api.
What error are you seeing?
I have tried to follow Google's documentation on how to set up local development using a database (https://cloud.google.com/appengine/docs/standard/python/tools/using-local-server#Python_Using_the_Datastore). However, I do not have the experience level to follow along, and I am not even sure if that was the right guide. The application is a Django project that uses python 2.7. To run the local host, I usually type dev_appserver.py --host 127.0.0.1.
My questions are:
How do I download the Datastore database from Google Cloud? I do not want to download the entire database, just enough data to populate localhost so I can run tests.
Once the database is downloaded, what do I need to do to connect it to localhost? Do I have to change a parameter somewhere?
Do I even need to download the datastore? Can I just make a duplicate in the cloud and then connect to that duplicate?
When I run localhost, should it not already be connected to the datastore, since the site works when it is running on the cloud? Where can I find the connection URI?
Thanks for the help
The development server is meant to simulate the whole App Engine environment; if you examine the output of the dev_appserver.py command you'll see something like Starting Cloud Datastore emulator at: http://localhost:PORT. Your code will interact with that bundled Datastore automatically, pushing and retrieving data according to the code you wrote. Your data will be saved to a file in local storage and will persist across different runs of the development server unless it's explicitly deleted.
This option doesn't provide facilities to import data from your existing Cloud Datastore instance, although it's a ready-to-go solution if your testing procedures can afford populating the local database with mock data through a custom script that does so programmatically. If you decide on this approach, just write the data creation script and execute it before running the tests.
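A minimal sketch of such a script, assuming an ndb-based app (the model and values are made up for the example):

```python
# Populate the local datastore with mock data for testing.
# Run it inside the dev server environment, e.g. via the local
# interactive console or a temporary handler; the model is hypothetical.
from google.appengine.ext import ndb

class Customer(ndb.Model):
    name = ndb.StringProperty()
    active = ndb.BooleanProperty(default=True)

def seed_mock_data(count=20):
    # put_multi batches all the writes into a single datastore call
    ndb.put_multi(
        [Customer(name='Test customer %d' % i) for i in range(count)])

seed_mock_data()
```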
Now, there is another option to simulate a local Datastore using the Cloud SDK, which comes with handy features for your purposes. You can find the available information for it under the Running the Datastore Emulator documentation page. This emulator supports importing entities downloaded from your production Cloud Datastore, as well as exporting them into files.
Back to your questions:
Export data from the Cloud instance into a GCS bucket following this, then download the data from the bucket to your filesystem following this, and finally import the data into the emulator with the command shown here (a sketch of this last step appears after these answers).
To use the emulator you need to first run gcloud beta emulators datastore start in one shell, and then in a separate tab run dev_appserver.py --support_datastore_emulator=true --datastore_emulator_port=8081 app.yaml.
The development server uses one of the two aforementioned emulators; in both cases it is not connected to your Cloud Datastore. You might create another project, aimed at development purposes, with a copy of your database, and deploy your application there so you don't use the emulator at all.
Requests to the datastore are made through the endpoint https://datastore.googleapis.com/v1/projects/project-id, although this is not related to how the emulators manage connections in your local server.
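As an illustration of the import step from the first answer, here is a hypothetical sketch that POSTs to the emulator's REST endpoint (the project id, port, and export path are all placeholders; check the emulator docs for the exact endpoint your SDK version exposes):

```python
# Ask a locally running Datastore emulator to import a Cloud Datastore
# export. Project id, port, and file path are placeholders.
import json
import urllib2

body = json.dumps({
    # Path to the overall_export_metadata file from your Datastore export
    'input_url': '/home/me/exports/overall_export_metadata',
})
request = urllib2.Request(
    'http://localhost:8081/v1/projects/my-project-id:import',
    data=body,
    headers={'Content-Type': 'application/json'},
)
print urllib2.urlopen(request).read()
```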
Hope this helps.
I am looking to create a python application that accesses a database. However, people must be able to access this database from different computers and always receive an up-to-date version. I understand that this would have to be some sort of cloud-based database, but I cannot seem to find an API with python bindings or a module that allows me to do this. Does anybody know of an API or module that I could use to do this?
You can try Google Cloud Datastore and the App Engine datastore (ndb), which fulfill your requirements:
https://developers.google.com/datastore/
https://developers.google.com/appengine/docs/python/ndb/
And for an API you can use the Remote API.
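As a rough sketch of what the datastore side can look like with ndb (the names here are illustrative, not from your app):

```python
# Every instance of a deployed App Engine app reads and writes the same
# Cloud Datastore, so all users see up-to-date data.
from google.appengine.ext import ndb

class Note(ndb.Model):
    text = ndb.StringProperty()
    created = ndb.DateTimeProperty(auto_now_add=True)

# One client writes (inside a request handler)...
Note(text='visible to every client').put()

# ...and any other client can immediately query the same shared store.
recent_notes = Note.query().order(-Note.created).fetch(10)
```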
If you are going to use AWS, you can use Amazon RDS for the database and Elastic Beanstalk for deploying your python application on the cloud.
This link provides information on how to implement the database part on AWS: Adding an Amazon RDS DB Instance to Your Python Application Environment.
If you want to use Microsoft Azure then you can refer to the following links
Azure SQL Database libraries for Python
Use Python to query an Azure SQL database
I am writing a small web application using Flask and I have to use DynamoDB as backend for some hard requirements.
I went through the tutorial on the Flask website without establishing a SQLite connection. All data was pulled directly from DynamoDB and it seemed to work.
Since I am new to web development in general and Flask framework, do you see any problems with this approach?
No. SQLite is just one option for backend storage. SQLite is mentioned in the tutorial only for its simplicity in getting something working fast on a typical local developer's environment (no database or service to install/configure, etc.).
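As a rough sketch of the DynamoDB-backed approach, assuming boto3 and a hypothetical items table with a string primary key:

```python
# Flask reading straight from DynamoDB, no SQLite involved.
# Assumes AWS credentials/region are configured in the environment;
# the table name and key schema are placeholders.
import boto3
from flask import Flask, jsonify

app = Flask(__name__)
table = boto3.resource('dynamodb').Table('items')

@app.route('/items/<item_id>')
def get_item(item_id):
    # Fetch a single item by its primary key; 'Item' is absent on a miss
    result = table.get_item(Key={'id': item_id})
    return jsonify(result.get('Item', {}))

if __name__ == '__main__':
    app.run()
```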
I just deployed a site on GAE which requires me to stage some data for dropdown fields (i.e. us states, status, etc.).
In development, I have created an entity for each type of data (US State entity for example) and was able to preload the data using the interactive console by creating the entity and then calling the put() method.
Now that the application is deployed I don’t know of a way to preload this data. How would you recommend doing this in a deployed instance?
I am using SDK version 1.7.0, python 2.7, High Replication Datastore (HRD), and memcache when data is retrieved.
Thanks in advance for your help!
If you want to do it programmatically, you may use the interactive console in production. Check out How do I activate the Interactive Console on App Engine?
You may also create a temporary request handler that'll do the job, deploy it (e.g. as a different version of the app to make it easy to delete) and launch the respective URL in your browser.
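A minimal sketch of such a throwaway handler, assuming webapp2 and ndb (the entity and values are placeholders):

```python
# Temporary handler that seeds lookup data; deploy it as a separate app
# version, open /admin/seed_states once, then delete the version.
import webapp2
from google.appengine.ext import ndb

class USState(ndb.Model):  # hypothetical lookup entity
    code = ndb.StringProperty()
    name = ndb.StringProperty()

class SeedStates(webapp2.RequestHandler):
    def get(self):
        # put_multi writes all the lookup entities in one batched call
        ndb.put_multi([
            USState(code='AL', name='Alabama'),
            USState(code='AK', name='Alaska'),
            # ... remaining states ...
        ])
        self.response.write('states seeded')

app = webapp2.WSGIApplication([('/admin/seed_states', SeedStates)])
```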
You can use the bulkloader to upload your entities to your deployed version. See the doc Uploading and Downloading Data for details and examples.
I have a python application that I've been running with the devserver, and everything seems to work fine except that I am having problems initializing my datastore. Basically I need to set up datastore values from a bunch of files that are on my local drive, but I don't want to upload them to Google. I set up a simple python script inside my app directory that does all of the data creation, but now I'm having a lot of problems deploying my app. How do I get a dump of the data that dev_appserver is using and upload it to my application?
Thanks for any insights.
Download the data using appcfg.py download_data (after enabling the remote_api builtin in your app.yaml), then re-'upload' it with appcfg.py upload_data pointed at your local dev_appserver.
http://blog.mfabrik.com/2011/03/14/mirroring-app-engine-production-data-to-development-server-using-appcfg-py/