I have a database that I'm dealing with which is updated remotely every few hours (not on any fixed schedule), and I have no control over its administration. I have web clients (coded in PHP and JavaScript) connecting to it to view the information it contains. These clients might check the database very often (impatient users) even when nothing has changed, and each check involves quite a lengthy query with lots of lookups, cross-referencing, etc.
So, in order to cut down on database queries and keep things snappy, what would be the best way to limit how often the clients actually run the full query against the database?
To make things crystal clear: I have no way of altering the database, I can only query it. But I have full control over the source of the web client.
Do you have control over the web app? If users are logged in and have sessions associated with them, you could store the time of each user's last real query in their session and, if a new request arrives within some threshold interval, serve a cached result instead of hitting the database again.
PHP Sessions
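A minimal sketch of that idea in PHP, assuming a hypothetical `run_expensive_query()` that does the real database work and an illustrative 300-second threshold (tune it to how often your database actually changes):

```php
<?php
// Sketch: throttle an expensive query per user session.
// run_expensive_query() and the 300-second interval are assumptions,
// not part of the original question.

const MIN_QUERY_INTERVAL = 300; // seconds between real database queries

/**
 * Return the cached result if the last real query was recent enough;
 * otherwise run the query, cache it in $cache (e.g. $_SESSION), and
 * return the fresh result.
 */
function get_data_throttled(array &$cache, callable $runQuery, int $now,
                            int $interval = MIN_QUERY_INTERVAL)
{
    if (isset($cache['last_query_at'], $cache['result'])
        && ($now - $cache['last_query_at']) < $interval) {
        return $cache['result'];    // fresh enough: skip the database
    }
    $cache['result'] = $runQuery(); // the lengthy query runs here
    $cache['last_query_at'] = $now;
    return $cache['result'];
}

// In a real page, the session array itself is the cache:
// session_start();
// $data = get_data_throttled($_SESSION, 'run_expensive_query', time());
```

Because the cache lives in the session, each user gets their own throttle; if the data is identical for all users, a shared cache (a file, APCu, or memcached) would save even more queries.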