Optimising Zend Framework applications (1) – cache db objects, PHP code profiling and optimisation

I’m optimising some Zend Framework applications these days, and I’ve been reading, researching, optimising and measuring results.

Some links to read first:
Optimising a Zend Framework application – by Rob Allen – PHPUK February 2011
Profile your PHP application and make it fly – by Lorenzo Alberton – PHPNW 9 Oct 2010

The optimisation process is iterative: measure performance, fix the worst problem and start over. Continue until performance is satisfactory.
That does not mean we can write the application first and think about optimisation only later. Performance needs attention from the beginning, in order to choose an architecture that will support the expected traffic.

In this post I’ll explain some common techniques I’ve used to optimise a high-traffic application. Optimisation is done at the end, but we should know what can be done, so that the application is designed in a way that is easy to optimise if needed.

1. Cache database calls with Zend_Cache

The database is often a significant bottleneck, especially when it is shared with other applications and we cannot expect it to respond immediately. A slow database server makes the script hang and increases the average page load time. Note that in some scenarios (e.g. a local server, simple queries on tables with small indexes, and the db server caching in memory) the db can actually be faster than a disk access to fetch the cached object. Measure when possible.

For a recent application, I’ve written a wrapper that automatically caches all the database model methods that fetch data, using a default lifetime. Depending on the architecture of the site, the cache should, if possible, never expire and instead be invalidated when the data is manually flagged as invalid (from the admin area). If that is not possible, set a reasonable lifetime.
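The exact implementation depends on your models; the following is only a minimal sketch of the idea (class and method names are illustrative, not the actual code), using __call() to intercept model methods and cache their results with the frontend’s default lifetime:

<?php
// Sketch: wrap a model and transparently cache the result of every method call.
class My_CachedModelWrapper
{
    private $_model;
    private $_cache;

    public function __construct($model, Zend_Cache_Core $cache)
    {
        $this->_model = $model;
        $this->_cache = $cache;
    }

    public function __call($method, $args)
    {
        // Zend_Cache ids may only contain [a-zA-Z0-9_], hence the md5().
        $id = md5(get_class($this->_model) . $method . serialize($args));

        $data = $this->_cache->load($id);
        if ($data === false) {
            // Cache miss: call the real model and store the result.
            $data = call_user_func_array(array($this->_model, $method), $args);
            $this->_cache->save($data, $id);
        }
        return $data;
    }
}

// Example usage (cache settings, Model_Users and fetchActive() are made up):
$cache = Zend_Cache::factory(
    'Core', 'File',
    array('lifetime' => 300, 'automatic_serialization' => true),
    array('cache_dir' => '/tmp/cache/')
);
$users = new My_CachedModelWrapper(new Model_Users(), $cache);
$rows  = $users->fetchActive();   // cached for 5 minutes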

The normal cache logic is: if the cached entry is valid, load it from the disk/memory cache; if not, load fresh data (from the db). When loading fresh data is not possible (e.g. the db is not answering), I suggest a change: if fresh data is not available, use the old cache instead of throwing an exception or interrupting the application (and log the error). If you use Zend_Cache, look at the second parameter of Zend_Cache_Core::load().
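For example, assuming $cache is a configured Zend_Cache_Core instance and $model->fetchFreshData() stands in for the real db call, the fallback could look like this:

<?php
// Sketch of the "serve stale data when the db is down" logic.
$data = $cache->load($id);
if ($data === false) {
    try {
        // Cache miss: try to load fresh data from the database.
        $data = $model->fetchFreshData();   // placeholder for the real db call
        $cache->save($data, $id);
    } catch (Exception $e) {
        // Db not answering: log the error and serve the expired cache entry.
        // Passing true as the second parameter skips the validity (lifetime)
        // test, so load() returns the record even if it has expired.
        error_log($e->getMessage());
        $data = $cache->load($id, true);
    }
}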

2. Profile and optimise queries

Even though the data is retrieved from the cache most of the time, the queries might still need to be optimised: EXPLAIN the queries and check that indexes are used, select which index to use, and change field types if needed (an interesting book about it). Remember that MySQL (and, in general, any db server) is very fast with simple queries (no joins) and small indexes. Consider denormalising the db structure to avoid frequent joins; you can use MySQL triggers to automatically update the denormalised columns when the parent tables change.
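To find out which queries deserve an EXPLAIN, Zend_Db_Profiler can list every executed statement with its execution time. A minimal sketch, assuming $db is your Zend_Db adapter:

<?php
// Enable the profiler (in development), run the code under test,
// then dump the executed queries with their timings.
$db->getProfiler()->setEnabled(true);

// ... run the page / the code under test ...

$profiler = $db->getProfiler();
printf("%d queries in %.4f seconds\n",
    $profiler->getTotalNumQueries(),
    $profiler->getTotalElapsedSecs());

foreach ((array) $profiler->getQueryProfiles() as $query) {
    // Run the slow ones through EXPLAIN in your MySQL client
    // to check which indexes are (not) being used.
    printf("%.4fs  %s\n", $query->getElapsedSecs(), $query->getQuery());
}

Remember to disable the profiler in production, since it adds overhead to every query.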

3. Profile and optimise PHP code

To profile the PHP scripts, use the Xdebug profiler + a Firefox extension (to enable it only when needed, via cookie) + KCacheGrind/WinCacheGrind to read the output. Another tool is xhprof (web based), by Facebook, which shows the functions most frequently called and their cost.
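As a rough sketch of how xhprof is typically used (the extension must be installed; the xhprof_lib include path and the "myapp" namespace below are assumptions to adapt to your setup):

<?php
// Start collecting wall time, CPU and memory for every function call.
xhprof_enable(XHPROF_FLAGS_CPU + XHPROF_FLAGS_MEMORY);

// ... bootstrap and run the application (e.g. in index.php) ...

$xhprofData = xhprof_disable();

// Save the run so it can be browsed in the xhprof web UI
// (the paths depend on where the xhprof package is installed).
include_once '/usr/share/xhprof/xhprof_lib/utils/xhprof_lib.php';
include_once '/usr/share/xhprof/xhprof_lib/utils/xhprof_runs.php';

$runs  = new XHProfRuns_Default();
$runId = $runs->save_run($xhprofData, 'myapp');
echo "http://xhprof.example.com/index.php?run={$runId}&source=myapp\n";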