I am writing a small web application using Flask, and I have to use DynamoDB as the backend because of some hard requirements.
I went through the tutorial on the Flask website without establishing an SQLite connection; all data was pulled directly from DynamoDB, and it seemed to work.
Since I am new to web development in general and to the Flask framework, do you see any problems with this approach?
No. SQLite is just one option for backend storage. SQLite is mentioned in the tutorial only for its simplicity in getting something working fast on a typical local developer's environment (no database server or service to install/configure, etc.).
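For illustration, a minimal sketch of that pattern with boto3 follows; the users table and its username key are assumptions made up for the example, not anything from the tutorial:

    import boto3
    from flask import Flask, jsonify

    app = Flask(__name__)

    # boto3 reads credentials from the environment or ~/.aws/credentials.
    dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
    users = dynamodb.Table("users")  # hypothetical table keyed on "username"

    @app.route("/user/<username>")
    def get_user(username):
        # Data comes straight from DynamoDB -- no SQLite connection anywhere.
        response = users.get_item(Key={"username": username})
        item = response.get("Item")
        if item is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(item)

    if __name__ == "__main__":
        app.run(debug=True)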
I have built an app that uses a MySQL database with Python. I would love to share some functionality with different applications, and that calls for an online database. Could you kindly give me some insight into how I can move a Python MySQL database online, and how to make calls to it in order to facilitate sharing data between different applications?
I don't know exactly what you are calling a Python database, but there are some options here that you might want to consider.
First, you can use Heroku to host your app and Heroku Postgres to host your database. Or you can use an AWS EC2 machine to host your app and its database (in case it's custom code that you can't call from a browser using Heroku). With both of these options you can access your database and the app; with the second one you can also install other services, such as SSH.
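Either way, a rough sketch of reaching a hosted Postgres from any of your Python applications with psycopg2 (Heroku exposes the connection string as a DATABASE_URL environment variable; the table name here is made up):

    import os
    import psycopg2

    # Heroku sets DATABASE_URL for the attached Postgres;
    # psycopg2 accepts such a connection URI directly.
    conn = psycopg2.connect(os.environ["DATABASE_URL"])

    with conn.cursor() as cur:
        # "shared_items" is a hypothetical table used by several apps.
        cur.execute("SELECT id, payload FROM shared_items")
        for row in cur.fetchall():
            print(row)

    conn.close()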
I have a Flask app with the simple functionality of a blog website; the code is given in the GitHub repo below:
https://github.com/vivanks/flaskhost
Its data, such as user login details and blog posts, is stored in the MySQL server on my system.
On my local machine it works perfectly fine.
Now I want it to be deployed over the internet.
What I have tried so far:
Heroku, but the problem with Heroku is that it needs PostgreSQL, and all my data is stored in MySQL, so I can't convert it.
Hosting the Flask part on Heroku and the database on 000webhost.com, but 000webhost doesn't allow connecting to the database from outside of 000webhost.
Hosting on http://pythonanywhere.com/, but that failed as well; it doesn't support import MySQLdb and instead supports SQLAlchemy (see the sketch after this list).
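For what it's worth, here is a minimal sketch of talking to MySQL through SQLAlchemy with the PyMySQL driver, which sidesteps MySQLdb entirely; the credentials and table name are placeholders:

    from sqlalchemy import create_engine, text

    # PyMySQL is a pure-Python MySQL driver, so MySQLdb is not needed.
    # Replace user, password, host, and dbname with your own values.
    engine = create_engine("mysql+pymysql://user:password@host/dbname")

    with engine.connect() as conn:
        # "users" is a placeholder table from the blog app.
        for row in conn.execute(text("SELECT * FROM users")):
            print(row)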
I want some stable way to host the data stored in MySQL without having to change my code.
It would be great if you could provide a step-by-step guide.
P.S. I don't have a problem paying a small amount.
I am looking to create a Python application that accesses a database. However, people must be able to access this database from different computers and always receive an up-to-date version. I understand that this would have to be some sort of cloud-based database, but I cannot seem to find an API with Python bindings or a module that allows me to do this. Does anybody know of an API or module that I could use?
You can try Google Cloud Datastore and the App Engine datastore, which fulfil your requirements:
https://developers.google.com/datastore/
https://developers.google.com/appengine/docs/python/ndb/
And for an API you can use the Remote API.
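For illustration, data access with the App Engine ndb library looks roughly like this (the Note model is a made-up example):

    from google.appengine.ext import ndb

    # A hypothetical model; every instance is stored in the cloud datastore,
    # so every client of the app sees the same up-to-date data.
    class Note(ndb.Model):
        content = ndb.StringProperty()
        created = ndb.DateTimeProperty(auto_now_add=True)

    def add_note(content):
        note = Note(content=content)
        note.put()  # persist to the datastore
        return note.key

    def recent_notes(limit=10):
        return Note.query().order(-Note.created).fetch(limit)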
If you are going to use AWS, you can use Amazon RDS for the database and Elastic Beanstalk for deploying your Python application to the cloud.
This link provides information on how to implement the database part on AWS: Adding an Amazon RDS DB Instance to Your Python Application Environment
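As a sketch of that setup: when an RDS instance is attached to an Elastic Beanstalk environment, Beanstalk injects its connection details as RDS_* environment variables, and you can connect with a driver such as PyMySQL:

    import os
    import pymysql

    # Elastic Beanstalk sets these variables for an attached RDS instance.
    conn = pymysql.connect(
        host=os.environ["RDS_HOSTNAME"],
        port=int(os.environ["RDS_PORT"]),
        user=os.environ["RDS_USERNAME"],
        password=os.environ["RDS_PASSWORD"],
        db=os.environ["RDS_DB_NAME"],
    )

    with conn.cursor() as cur:
        cur.execute("SELECT VERSION()")
        print(cur.fetchone())

    conn.close()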
If you want to use Microsoft Azure, then you can refer to the following links:
Azure SQL Database libraries for Python
Use Python to query an Azure SQL database
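A minimal sketch of what the second link covers, querying an Azure SQL database with pyodbc (server, database, and credentials are placeholders):

    import pyodbc

    # Placeholder connection details for an Azure SQL database.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=yourserver.database.windows.net;"
        "DATABASE=yourdb;UID=youruser;PWD=yourpassword"
    )

    cursor = conn.cursor()
    cursor.execute("SELECT TOP 5 name FROM sys.tables")
    for row in cursor.fetchall():
        print(row.name)

    conn.close()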
I've done a lot of research on Google App Engine. I ended up with webapp2. Now we have realized that AWS has many advantages over GAE for our project.
I feel a little bit overwhelmed because there is so much information about AWS.
I ended up with the NoSQL DynamoDB solution.
Now, I really love lightweight frameworks like webapp2: it's really simple and easy to use, and it doesn't hide POST and GET from you.
Does it make sense to use webapp2 with AWS? Maybe there are other frameworks that would fit our project better.
(We want to go in a filehosting direction)
I mean, there are literally 50 frameworks for Python, and I didn't find a chart with their pros and cons.
The last frameworks I investigated were web2py, Pylons (now Pyramid?) and Tornado.
To be honest I am really confused.
So if I want a lightweight framework for AWS, would you recommend that I stay with webapp2?
Resources:
Amazon AWS web framework for Python
Edit #1
Now it's a decision between Tornado and webapp2, considering that I want to use DynamoDB.
Tornado is itself a fast non-blocking web server. It's easy to write simple apps in a few minutes, but AFAIK it has nothing like templates, views, etcetera. If you want to serve files, it would be very quick to connect a Tornado server with boto (https://github.com/boto/boto) and use DynamoDB or S3.
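For instance, a minimal sketch of that combination, serving a file out of S3 through a Tornado handler (the bucket name is a placeholder, and boto picks up AWS credentials from the environment):

    import boto
    import tornado.ioloop
    import tornado.web

    class FileHandler(tornado.web.RequestHandler):
        def get(self, filename):
            # boto reads AWS credentials from the environment.
            conn = boto.connect_s3()
            bucket = conn.get_bucket("my-filehosting-bucket")  # placeholder
            key = bucket.get_key(filename)
            if key is None:
                raise tornado.web.HTTPError(404)
            self.write(key.get_contents_as_string())

    application = tornado.web.Application([
        (r"/files/(.*)", FileHandler),
    ])

    if __name__ == "__main__":
        application.listen(8888)
        tornado.ioloop.IOLoop.instance().start()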
On the other hand, Pylons (now Pyramid) and web2py are full web application frameworks with no web server. In fact, they come with a lightweight development server (at least Pylons does), but if you want to put up a site on AWS you should use something like nginx (http://nginx.org/), Apache, or some other WSGI server (http://wsgi.readthedocs.org/).
If you want to go for a lightweight framework and use Python, I'd go for web2py: it's easy to configure and to build apps with. (A curious note: it used to be a single .py file.) You can try it online: http://www.web2py.com/demo_admin/default/site
Regarding webapp2, I've never used it, but I've heard it's similar to the App Engine web framework. So if you're comfortable with it, stay there.
Either way, boto is the Python interface for AWS, so if you choose Python, you'll have to check it out. It is actively maintained.
I have a relatively extensive SQLite database that I'd like to import into my Google App Engine Python app.
I've created my models using the App Engine API; they are close, but not quite identical, to the existing schema. I've written an import script to load the data from SQLite and create/save new App Engine objects, but the App Engine environment blocks me from accessing the sqlite library. This script is only to be run on my local App Engine instance, and from there I hope to push the data to Google.
Am I approaching this problem the wrong way, or is there a way to import the sqlite library while running in the local instance's environment?
I would make suitable CSV files from the SQLite data in a separate script, then use bulk loading to push the data from the CSV files up to App Engine.
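A minimal sketch of such an export script, using only the standard library (the database file, table, and column names are placeholders for your actual schema):

    import csv
    import sqlite3

    conn = sqlite3.connect("mydata.db")  # placeholder database file
    cur = conn.cursor()

    # "entries" is a placeholder for one of your existing tables.
    cur.execute("SELECT id, title, body FROM entries")

    with open("entries.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "title", "body"])  # header for the bulk loader
        writer.writerows(cur)

    conn.close()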
If you need to access your datastore outside of the App Engine environment (for instance, if you need to use libraries not present in App Engine, or do other things App Engine does not support with the datastore), then the best option is the Remote API.
There is an excellent tutorial on that here:
http://code.google.com/appengine/articles/remote_api.html
Essentially, you import the remote_api module, authenticate with Google to access your datastore, then run your data access commands (query, update, delete, etc.) as you normally would in App Engine.
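The setup from that tutorial is roughly as follows; treat it as a sketch, since the exact call varies with SDK version, and the app ID is a placeholder:

    import getpass
    from google.appengine.ext.remote_api import remote_api_stub

    def auth_func():
        # Credentials of the Google account that owns the app
        # (Python 2, as used by the App Engine SDK of that era).
        return raw_input("Email: "), getpass.getpass("Password: ")

    # Point datastore calls at the live app instead of the local instance.
    remote_api_stub.ConfigureRemoteApi(
        None, "/_ah/remote_api", auth_func, "your-app-id.appspot.com")

    # From here on, normal datastore operations (query, put, delete)
    # read from and write to the remote datastore.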
According to Google, you're doing it backwards. The app should be pulling data from you, where you have more flexibility in converting to the new model anyway.
I have not had any trouble importing pysqlite2, reading data, then transforming it and writing it to App Engine using the remote_api.
What error are you seeing?