I am writing a micro-service that will have to share a database owned by a different micro-service.
I understand that from a micro-services architecture perspective this is not a good design. Hence, I decided to separate out the database access into another micro-service whose only task is to manage access to the DB.
I need pointers on how to write such an app in Python that exposes an API for reading from and writing to a database.
I realize this is not a design-perspective answer.
Have you had a chance to take a look at sandman, a Python library that can generate a REST API over an existing database?
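If sandman doesn't fit, here is a minimal hand-rolled sketch of such a data-access service using Flask and SQLAlchemy Core (assuming SQLAlchemy 1.4+; the connection string, table, and endpoints are hypothetical):

```python
from flask import Flask, jsonify, request
import sqlalchemy as sa

app = Flask(__name__)
engine = sa.create_engine('postgresql://user:password@dbhost/shared_db')
users = sa.Table('users', sa.MetaData(), autoload_with=engine)  # reflect the existing table

@app.route('/users', methods=['GET'])
def list_users():
    # Read endpoint: return every row as JSON.
    with engine.connect() as conn:
        rows = conn.execute(sa.select(users)).mappings().all()
    return jsonify([dict(row) for row in rows])

@app.route('/users', methods=['POST'])
def create_user():
    # Write endpoint: insert the posted JSON document as a new row.
    with engine.begin() as conn:
        conn.execute(sa.insert(users).values(**request.get_json()))
    return '', 201

if __name__ == '__main__':
    app.run()
```

Every other service then talks to this HTTP API instead of opening its own connection to the shared database.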
I'm running Django (1.5) on App Engine and I need some kind of key-value cache. I know App Engine's memcache API and also Django's cache framework, and I wonder which one I should use.
On one hand I would like my code to be as portable as possible for migrating it to another cloud platform. But on the other hand I would like to fully utilize the services offered by App Engine.
Is writing a custom Django cache backend that uses the App Engine memcache the best solution?
Tzach, I think you're already answering your own question.
Putting your app on GAE and not using the services provided by Google doesn't look like a wise decision to me, especially when those features are key for performance and at the same time free or very cheap.
On the other hand, the default in-process cache in Django is not guaranteed to give its best results under GAE, as GAE instances are not normal servers where you'd run your Django instance - e.g. instances can be shut down at any time.
These special GAE characteristics are exactly what the Django-for-GAE versions are tuned for.
For that reason, and taking into account that using the GAE memcache is also straightforward, I'd recommend picking the option that is easiest to add to your application.
And if you move to another platform in the future, there will be more things to change than just the key-value cache.
My two cents: focus first on getting the job done, second on optimizing performance on GAE, and only afterwards start thinking about further improvements.
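For reference, a minimal sketch of using the GAE memcache API directly as a read-through cache (the key format and the datastore loader are hypothetical):

```python
from google.appengine.api import memcache

def get_user_profile(user_id):
    # Try the cache first, fall back to the datastore on a miss.
    key = 'profile:%s' % user_id
    profile = memcache.get(key)
    if profile is None:
        profile = load_profile_from_datastore(user_id)  # hypothetical loader
        memcache.set(key, profile, time=3600)  # cache for one hour
    return profile
```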
I have been developing a fairly simple desktop application to be used by a group of 100-150 people within my department, mainly for reporting. Unfortunately, I have to build it within some pretty strict confines, similar to the specs called out in this post. The application will just be a self-contained executable with no need to install.
The problem I'm running into is figuring out how to handle the database need. There will probably only be about 1GB of data for the app, but it needs to be available to everyone.
I would embed the database with the application (SQLite), but the data needs to be refreshed every week from a centralized process, so I figure it would be easier to maintain one database rather than pushing updates down to the apps. Plus, users will need to write to the database as well, and those updates need to be seen by everyone.
I'm not allowed to set up a server for the database, so that rules out any good options for a true database. I'm restricted to File Shares or SharePoint.
It seems like I'm down to MS Access or SQLite. I'd prefer to stick with SQLite because I'm a fan of Python and SQLAlchemy, but based on what I've read, SQLite is not a good solution (and may not even be possible) for multiple users accessing it over the network.
Is there another option I haven't discovered for this setup or am I stuck working with MS Access? Perhaps I'll need to break down and work with SharePoint lists and apps?
I've been researching this for quite a while now, and I've run out of ideas. Any help is appreciated.
FYI, as I'm sure you can tell, I'm not a professional developer. I have enough experience in web / Python / VB development that I can get by, so I was asked to do this as a side project.
SQLite can operate across a network and be shared among different processes. It is not a good solution when the application is write-heavy (because it locks the entire database file for the duration of a write), but if the application is mostly reporting it may be a perfectly reasonable solution.
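One practical mitigation, if you go this route, is to give every connection a generous busy timeout so writers wait for the file lock instead of failing immediately. A sketch (the UNC path and table name are hypothetical):

```python
import sqlite3

# timeout=30 makes SQLite retry for up to 30 seconds when another
# process holds the write lock, instead of raising "database is locked".
conn = sqlite3.connect(r'\\fileserver\share\reports.db', timeout=30)
try:
    rows = conn.execute('SELECT * FROM weekly_report').fetchall()
finally:
    conn.close()
```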
As my options are limited, I decided to go with a built-in database for each app using SQLite. The DB will only need to be updated every week or two, so I figured a 30-second update pulling from flat files will be OK. Then each user will have all the data locally to browse as needed.
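A minimal sketch of that refresh, assuming the central process publishes a CSV flat file (the path, table, and columns are hypothetical):

```python
import csv
import sqlite3

# Rebuild the local SQLite copy from the centrally published flat file.
conn = sqlite3.connect('local_copy.db')
conn.execute('DROP TABLE IF EXISTS report')
conn.execute('CREATE TABLE report (id INTEGER, name TEXT, value REAL)')
with open(r'\\fileserver\share\report.csv', newline='') as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    conn.executemany('INSERT INTO report VALUES (?, ?, ?)', reader)
conn.commit()
conn.close()
```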
Is there any software library that provides an interface for storing and querying data like the Google App Engine Datastore, but uses a local file or service instead of running on App Engine?
The specific features I am looking for are:
Stores data as Entities with Named Properties
Query support
Atomic transactions
Python language bindings
Runs on my local machine
Either stores to a single file, or connects to a local database service
Free and open source
Thanks
You can also check out MongoDB. It is an open-source, document-oriented database system.
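A minimal pymongo sketch of the entities-with-named-properties idea (the database, collection, and field names are hypothetical):

```python
from pymongo import MongoClient

client = MongoClient('localhost', 27017)  # assumes a local mongod service is running
db = client.mydb
# Documents act as entities with named properties.
db.entities.insert_one({'kind': 'Person', 'name': 'Alice', 'age': 30})
# Query support: find adults of kind Person.
for doc in db.entities.find({'kind': 'Person', 'age': {'$gt': 21}}):
    print(doc['name'])
```

One caveat against your requirements: historically MongoDB only guarantees atomicity at the single-document level, so check whether that satisfies your "atomic transactions" item.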
You may also want to check out AppScale (http://www.appscale.com). It lets you run your App Engine apps without modification outside of Google (on your laptop, on your local cluster / behind your firewall, or in Amazon EC2). AppScale satisfies each of the requirements you list here: it automatically installs/configures/manages the datastore service (and all other APIs/services) for your apps to use, so you don't have to.
Have a look at ZODB - not exactly alike, but similar: http://www.zodb.org/
From the docs:
Some of the features that ZODB brings to you:
Transparent persistence for Python objects
Full ACID-compatible transaction support (including savepoints)
History/undo ability
Efficient support for binary large objects (BLOBs)
Pluggable storages
Scalable architecture
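A minimal sketch of what that looks like in practice, storing a Python object in a single local file (the Note class is made up for illustration):

```python
import persistent
import transaction
import ZODB, ZODB.FileStorage

class Note(persistent.Persistent):
    """Hypothetical entity: its attributes become persisted properties."""
    def __init__(self, title, body):
        self.title = title
        self.body = body

# Open (or create) a single-file storage and get the root mapping.
storage = ZODB.FileStorage.FileStorage('data.fs')
db = ZODB.DB(storage)
conn = db.open()
root = conn.root()

root['note-1'] = Note('hello', 'stored transparently')
transaction.commit()  # ACID commit
db.close()
```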
I am starting a web project that should be very flexible and modular, and it will definitely grow a lot in the future. As we plan to provide an API to other developers, I started thinking that maybe it is a good idea to implement all the methods as an API and provide the functionality to use them both remotely and internally.
For instance, say we want to extract all registered users. So we design a method in the API, like get_all_users, which may be available via REST or internal invocation. The problem is I cannot figure out how to distinguish internal usage from remote usage, as I should also take into consideration the speed of internal invocation, code reuse, and checking of user permissions (secret keys for API access, etc.). What are the best practices? How should such an API be organized?
To build the API: Tastypie or Piston. These let you write 'resources' (which are basically API versions of the views) - instead of returning HttpResponses, you just return objects which Piston/Tastypie convert into JSON or XML or YAML or whatever your favorite data language is.
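For example, a minimal Tastypie resource might look like this (the module path and field list are hypothetical):

```python
# api/resources.py
from django.contrib.auth.models import User
from tastypie.authentication import ApiKeyAuthentication
from tastypie.resources import ModelResource

class UserResource(ModelResource):
    class Meta:
        queryset = User.objects.all()
        resource_name = 'user'
        fields = ['username', 'date_joined']  # expose only what callers need
        authentication = ApiKeyAuthentication()  # per-user secret keys for API access
```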
Securing the access: consider decorating any resources that contain confidential information with the @login_required or @staff_member_required decorator, as appropriate.
You might even want to go a step further and write a decorator to check that the staff user is using HTTPS (whereas you might allow a normal user to use any connection type).
If you have not considered it yet, I recommend using Tastypie to set up an API with Django.
In order to distinguish between your internal and remote usage, maybe you can simply design a different URL scheme.
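For example, a hypothetical urls.py mounting the same resource under two prefixes, so remote traffic can require API keys while internal traffic is restricted at the network level:

```python
# urls.py
from django.conf.urls import include, url
from api.resources import UserResource  # hypothetical module, as in the sketch above

user_resource = UserResource()

urlpatterns = [
    url(r'^api/v1/', include(user_resource.urls)),       # remote callers, API-key protected
    url(r'^internal/v1/', include(user_resource.urls)),  # internal callers, e.g. firewalled
]
```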
I'm developing a web application and considering Django, Google App Engine, and several other options. I wondered what kind of "penalty" I will incur if I develop a complete Django application assuming it runs on a dedicated server, and then later want to migrate it to Google App Engine.
I have a basic understanding of Google's datastore, so please assume I will choose a column-based database for my "stand-alone" Django application rather than a relational database, so that the schema can remain mostly the same and will not be a major factor.
Also, please assume my application does not maintain a huge amount of data, so that migration of tens of gigabytes is not required. I'm mainly interested in the effects on the code and software architecture.
Thanks
Most (all?) of Django is available on GAE, so your main task is to avoid basing your design on anything from Django or the Python standard library that is not available on GAE.
You've identified the glaring difference, which is the database, so I'll assume you're on top of that. Another difference is the tie-in to Google Accounts: if you want, you can do a fair amount of access control through the app.yaml file rather than in code. You don't have to use any of that, though, so if you don't envisage switching to Google Accounts when you switch to GAE, no problem.
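For illustration, a minimal app.yaml sketch of that kind of access control (the handler paths and script name are hypothetical):

```yaml
handlers:
- url: /admin/.*
  script: main.app
  login: admin      # only application administrators may access
- url: /.*
  script: main.app
  login: required   # any signed-in Google account
```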
I think the differences in the standard libraries can mostly be deduced from the fact that GAE has no I/O and no C-accelerated libraries unless explicitly stated, and my experience so far is that things I've expected to be there have been there. I don't know Django and haven't used it on GAE (apart from templates), so I can't comment on that.
Personally I probably wouldn't target LAMP (where P = Django) with the intention of migrating to GAE later. I'd develop for both together, and try to ensure if possible that the differences are kept to the very top (configuration) and the very bottom (data model). The GAE version doesn't necessarily have to be perfect, as long as you know how to make it perfect should you need it.
It's not guaranteed that this is faster than writing and then porting, but my guess is it normally will be. The easiest way to spot any differences is to run the code, rather than relying on not missing anything in the GAE docs, so you'll likely save some mistakes that need to be unpicked. The Python SDK is a fairly good approximation to the real App Engine, so all or most of your tests can be run locally most of the time.
Of course if you eventually decide not to port then you've done unnecessary work, so you have to think about the probability of that happening, and whether you'd consider the GAE development to be a waste of your time if it's not needed.
Basically, you will change the data model base class and some APIs, if you use them (PIL, urllib2, etc.).
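To illustrate the base-class change, a hypothetical side-by-side sketch (the Article model is made up):

```python
# Stand-alone Django version
from django.db import models

class Article(models.Model):
    title = models.CharField(max_length=200)
    created = models.DateTimeField(auto_now_add=True)
```

```python
# Rough App Engine equivalent using the old google.appengine.ext.db API
from google.appengine.ext import db

class Article(db.Model):
    title = db.StringProperty()
    created = db.DateTimeProperty(auto_now_add=True)
```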
If your goal is App Engine, I would use the App Engine helper: http://code.google.com/appengine/articles/appengine_helper_for_django.html. It lets you run your app on your own server with a file-based DB and then push it to App Engine with no changes.
It sounds like you have awareness of the major limitation in building/migrating your app -- that AppEngine doesn't support Django's ORM.
Keep in mind that this doesn't just affect the code you write yourself -- it also limits your ability to use a lot of existing Django code. That includes other applications (such as the built-in admin and auth apps) and ORM-based features such as generic views.
There are a few things that you can't do on App Engine that you can do on your own server, such as accepting file uploads and writing them to disk. On App Engine you pretty much have to take the upload and store it in the datastore, which can cause a few problems.
Other than that, it should be fine on the presentation side. There are a number of other little things that are better on your own dedicated server, but I think eventually a lot of those will come to App Engine.