I have a web page that displays dynamic data which changes every 2 seconds. The data is selected from various data sources, including Oracle. Currently, the page reloads every 10 seconds and runs a PHP script which retrieves the data and renders the page. I have other pages that give different views of the same data, which means the same query is run again for each of them. With 4 such pages and 10 concurrent users each, the data retrieval suddenly happens 40 times every 10 seconds, which is obviously not ideal. I have some ideas on how to improve this situation, but I thought I would ask for ideas from other experts who might have come across a similar situation. I’m not bound to PHP, and my server runs on Linux.
Regards
Marius
I had a similar problem in a recent project and opted to cache all the retrieved database data using a Memcached daemon. So instead of querying the database 40 times, you query it once and then simply open a TCP/IP connection to the memcached daemon to fetch the cached result on every subsequent request. Note that PHP has a memcached extension which is quite easy to understand and use.
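To make the idea concrete, here is a minimal read-through-cache sketch. The `$cache` object only needs `get()`/`set()`, so it can be the real `Memcached` class or any stand-in; the key name, TTL and `run_oracle_query` producer are made up for illustration, not from my actual project.

```php
<?php
// Read-through cache: return the cached value if present, otherwise run
// the expensive query once and store the result for everyone else.
function fetch_with_cache($cache, $key, $ttl, callable $producer) {
    $data = $cache->get($key);
    if ($data !== false) {
        return $data;              // cache hit: no database round trip
    }
    $data = $producer();           // cache miss: run the expensive query once
    $cache->set($key, $data, $ttl);
    return $data;
}

// With the real extension you would wire it up roughly like this
// (daemon host/port and key name are assumptions):
//   $cache = new Memcached();
//   $cache->addServer('127.0.0.1', 11211);
//   $rows = fetch_with_cache($cache, 'report:latest', 10, 'run_oracle_query');
```

With a 10-second TTL, all 40 requests in your example hit Oracle at most once per expiry window; everyone else reads from memory.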
If the page you display to the user must update its data every 2 seconds or so, it is best to reload just the data via AJAX instead of re-fetching the HTML, CSS and JavaScript on every refresh. Taha above mentioned using a Comet server to simulate ‘pushing’ the data from the server to the client. You can easily roll your own. From my understanding, the two main techniques that will work with all major browsers are:

- long polling: the client keeps an XMLHttpRequest open until the server has new data, responds, and the client immediately reconnects;
- streaming via a hidden iframe: the server keeps the response open and trickles out script blocks as the data changes.
An excellent tutorial for using the above two techniques is available here. The author also provides their code for download.
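The server side of long polling can be sketched in plain PHP. This is a simplified sketch, not the tutorial's code: the client asks "anything newer than version N?", and the server holds the request open until a version counter in the cache changes or a timeout expires. The version key, timeout and poll interval below are all assumptions for illustration.

```php
<?php
// Long-polling wait loop: block until the cached version differs from the
// one the client already has, or until the timeout is reached.
function wait_for_change($cache, $versionKey, $clientVersion, $timeoutSec, $pollUsec = 200000) {
    $deadline = microtime(true) + $timeoutSec;
    do {
        $current = $cache->get($versionKey);
        if ($current !== false && $current != $clientVersion) {
            return $current;       // new data available: respond immediately
        }
        usleep($pollUsec);         // back off briefly before re-checking
    } while (microtime(true) < $deadline);
    return $clientVersion;         // timed out: client simply re-issues the request
}

// An endpoint would then echo json_encode(['version' => $v, 'data' => $rows])
// and the browser's XMLHttpRequest callback fires only when something changed.
```

Combined with the memcached cache, the writer that refreshes the data every 2 seconds bumps the version key, and every waiting client sees the change on its next poll of the cache rather than of Oracle.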
Although I used Apache + PHP to implement my Comet solution, it might be better to switch to a non-blocking framework such as Tornado (Python): Apache dedicates a worker to each open connection, so many long-lived Comet requests can quickly exhaust its worker pool.