I made a Google App Engine application as a class project at my university. Now I need to optimize it so I can use it commercially.
At the moment, the code is very slow. It has only a few Models, with many properties in each.
Before rewriting the Models code, I need to know whether my application will be faster if I increase the number of Models, i.e. increase decoupling. And up to what point should I consider splitting Models into other Models?
Another question: if decoupling does positively affect performance, is the time to fetch a Model from the datastore directly proportional to its number of properties?
Getting/Putting big objects
Since the datastore API can only put or get a whole object (as opposed to individual fields), each time you fetch an object from the datastore it fetches all of its fields, regardless of whether you are going to use them all at that point. The effect is even stronger when you write the object back to the datastore: it has to write every field even when you have altered only a single one (and writes cost more than reads).
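The whole-entity behaviour can be sketched with a small in-memory stand-in (the `FakeDatastore` class and its byte counter are invented for illustration; the real API is `db.get`/`db.put`):

```python
import json

# Hypothetical in-memory stand-in for the datastore: like the real one,
# it can only store and load WHOLE entities, never individual properties.
class FakeDatastore:
    def __init__(self):
        self._rows = {}          # key -> serialized entity
        self.bytes_written = 0   # crude stand-in for write cost

    def put(self, key, entity):
        blob = json.dumps(entity)        # the whole entity is serialized...
        self.bytes_written += len(blob)  # ...so the whole entity is paid for
        self._rows[key] = blob

    def get(self, key):
        return json.loads(self._rows[key])  # every field comes back, used or not

ds = FakeDatastore()
post = {"title": "Hello", "date": "2010-01-01", "body": "x" * 10000}
ds.put("post1", post)

# Changing only the (tiny) date still rewrites the ~10 KB body:
post["date"] = "2010-01-02"
ds.put("post1", post)
```

Both `put` calls pay for the full entity, which is why your pattern of reads and writes should drive how you split your models.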
Before I say anything further: the proliferation of models should follow your pattern of reads and writes.
To be practical and give an example, say I am modeling the posts on a blog, where each would have a title, a date and a body (which we will rightfully assume to be much bigger than the other fields). In straightforward object modeling, I'd have those three fields in a single class BlogPost, but I am penalized for the whole object each time I just want to read the titles (say, to simply display a list of posts and construct links to them), or just to update the dates. So what one can do is create a class Body that holds the body, and keep a reference (a key) to it inside BlogPost.

Lightweight makes for efficient cross-side serialization
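A minimal plain-Python sketch of that split (the `put`/`get` helpers and the string keys are made up for illustration; real App Engine code would give BlogPost a `db.ReferenceProperty` to Body, or an `ndb.KeyProperty` in the newer API):

```python
store = {}  # key -> entity dict; the "datastore" only gets/puts whole entities

def put(key, entity):
    store[key] = dict(entity)

def get(key):
    return dict(store[key])

# The split: BlogPost keeps only the cheap fields plus a key to the body.
put("body:1", {"text": "a very long article body ..."})
put("post:1", {"title": "Hello", "date": "2010-01-01", "body_key": "body:1"})

# Listing posts now touches only the small entities:
titles = [get(k)["title"] for k in store if k.startswith("post:")]

# The big body is fetched by its key only when actually rendering the post:
body = get(get("post:1")["body_key"])["text"]
```

Updating a date, likewise, rewrites only the small post entity and never the body.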
This also makes the BlogPost class more lightweight if I decide to serialize it and send it over the wire, e.g. to send it to the web browser (to be used in JavaScript), with GWT for example.

Answer from Nick Johnson: