I am looking for best practices for implementing a TraceListener that writes logs to SQL Server from an ASP.NET application.
What should be taken into account when implementing such a class in order to avoid degrading performance?
Will stored procedures be faster than plain ADO.NET INSERT statements?
I really like the idea of writing the logs to a temporary in-memory buffer and flushing it to the database at some later point from a background thread, but which data structure is most suitable for such a scenario? Queue&lt;T&gt; seems like a good candidate, but I cannot add elements to it without some synchronization mechanism.
I found an article on the internet that shows an example of a custom TraceListener that writes to SQL Server, but before putting it into production code I would like to get some more feedback.
Stored procedures won't be any faster than parameterized SQL. I prefer a stored procedure over hardcoding SQL in my application, but if you are going to generate the INSERT statements anyway, then that is even better.
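For example, a parameterized insert through ADO.NET might look like the sketch below. The `LogEntries` table, its columns, and the `LogWriter` helper are assumptions for illustration, not part of the question:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

// Hypothetical helper; the LogEntries table and its columns are assumptions.
public static class LogWriter
{
    public static SqlCommand BuildInsertCommand(string source, string message)
    {
        var command = new SqlCommand(
            "INSERT INTO LogEntries (Source, Message, LoggedAt) " +
            "VALUES (@source, @message, @loggedAt)");

        // Parameters travel separately from the SQL text, so SQL Server
        // caches and reuses the execution plan just as it would for a
        // stored procedure, and injection is not a concern.
        command.Parameters.Add("@source", SqlDbType.NVarChar, 64).Value = source;
        command.Parameters.Add("@message", SqlDbType.NVarChar, -1).Value = message;
        command.Parameters.Add("@loggedAt", SqlDbType.DateTime2).Value = DateTime.UtcNow;
        return command;
    }

    public static void WriteEntry(string connectionString, string source, string message)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = BuildInsertCommand(source, message))
        {
            command.Connection = connection;
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```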
Using a buffer is a good idea if you are OK with the possibility of losing data. If you want to decouple the client from the insert and make it durable, you could use MSMQ. You could then write a Windows service that processes the queue, completely decoupled from the application. It could also aggregate logs from multiple servers if you have a server farm.
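If you go the in-memory buffer route, ConcurrentQueue&lt;T&gt; (available since .NET 4.0) gives you lock-free enqueues from request threads; before that, a Queue&lt;T&gt; guarded by a lock serves the same purpose. A rough sketch, where the `BufferedSqlTraceListener` name, the five-second flush interval, and the `FlushToDatabase` stub are all assumptions you would adapt:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

// Sketch of a buffered listener: request threads enqueue, a timer thread
// drains the queue and writes batches. FlushToDatabase is a stub you would
// replace with a batched insert (e.g. SqlBulkCopy).
public class BufferedSqlTraceListener : TraceListener
{
    // ConcurrentQueue<T> allows enqueues from many threads without locking.
    private readonly ConcurrentQueue<string> _buffer = new ConcurrentQueue<string>();
    private readonly Timer _flushTimer;

    public BufferedSqlTraceListener()
    {
        // Flush every five seconds on a thread-pool (background) thread.
        _flushTimer = new Timer(_ => Flush(), null,
            TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(5));
    }

    public override void Write(string message) => _buffer.Enqueue(message);
    public override void WriteLine(string message) => _buffer.Enqueue(message);

    public override void Flush()
    {
        // Drain whatever is queued right now and write it as one batch.
        var batch = new List<string>();
        while (_buffer.TryDequeue(out var message))
            batch.Add(message);
        if (batch.Count > 0)
            FlushToDatabase(batch);
    }

    // Stub: replace with a batched INSERT or SqlBulkCopy into your log table.
    protected virtual void FlushToDatabase(List<string> batch)
    {
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            _flushTimer.Dispose();
            Flush(); // final flush so buffered entries are not lost on shutdown
        }
        base.Dispose(disposing);
    }
}
```

Note that anything still in the buffer is lost if the process dies between flushes, which is exactly the durability trade-off where MSMQ becomes attractive.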