I just rolled a small web application into production on a remote host. After it ran a while, I noticed that MS SQL Server started to consume most of the RAM on the server, starving out IIS and my application.
I have tried changing the 'max server memory' setting, but eventually the memory usage creeps above it. Using Activity Monitor I have determined that I am not leaving connections open or anything obvious, so I am guessing it's in cache, materialized views, and the like.
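For reference, the setting is adjusted with `sp_configure`. A minimal sketch (the 512 MB value is only an illustration, not a recommendation); note that on older versions of SQL Server, `max server memory (MB)` caps only the buffer pool, so other allocations can still grow past it:

```sql
-- Expose advanced options so 'max server memory (MB)' is visible
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Check the current value
EXEC sp_configure 'max server memory (MB)';

-- Cap SQL Server at 512 MB (example value; tune for your host)
EXEC sp_configure 'max server memory (MB)', 512;
RECONFIGURE;
```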
I am beginning to believe that this setting doesn't mean what I think it means. I do note that if I simply restart the server, memory consumption starts over from scratch without any adverse impact on the application: pages load without incident and all retrievals work.
There has got to be a better way to control SQL Server, or to force it to release some memory back to the system. Is there?
From the MS knowledge base:
Are you using any extended stored procedures, .NET code, etc. that could be leaking memory outside of SQL Server's control?
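One way to check where the memory is actually going is to query the memory clerk DMV; a sketch (the `pages_kb` column exists on SQL Server 2012 and later; older versions split it into `single_pages_kb` and `multi_pages_kb`):

```sql
-- Top internal memory consumers, aggregated by clerk type
SELECT TOP 10
    type,
    SUM(pages_kb) / 1024 AS memory_mb  -- pages_kb: SQL 2012+
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY memory_mb DESC;
```

If most of the memory shows up under buffer-pool or plan-cache clerks, it is normal caching that `max server memory` should govern; large totals elsewhere point at allocations outside that cap.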
There is some more discussion of memory/address space usage here.