I like ORM tools, but I have often thought that for large updates (thousands of rows), loading, updating and saving each object is inefficient when something like
UPDATE [table] set [column] = [value] WHERE [predicate]
would give much better performance.
However, assuming you wanted to go down this route for performance reasons, how would you then make sure that any objects cached in memory were updated correctly?
Say you’re using LINQ to SQL, and you’ve been working on a DataContext, how do you make sure that your high-performance UPDATE is reflected in the DataContext’s object graph?
This might be a ‘you don’t’ or ‘use triggers on the DB to call .NET code that drops the cache’ answer, and so on, but I’m interested to hear common solutions to this sort of problem.
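To make the question concrete, here is roughly what I mean — a hedged sketch, assuming LINQ to SQL, in which the entity name `Product`, the context name `NorthwindDataContext` and the discount logic are all invented for illustration:

```csharp
using System;
using System.Data.Linq;
using System.Linq;

// Sketch: a set-based UPDATE issued through the DataContext, followed by
// an explicit Refresh of the objects it has already materialized.
// 'Product' and 'NorthwindDataContext' are hypothetical names.
using (var db = new NorthwindDataContext())
{
    // Objects already loaded (and therefore tracked by the DataContext).
    var cached = db.Products.Where(p => p.CategoryId == 2).ToList();

    // Set-based update: one round trip, no per-row load/update/save cycle.
    db.ExecuteCommand(
        "UPDATE Products SET UnitPrice = UnitPrice * {0} WHERE CategoryId = {1}",
        0.9m, 2);

    // Overwrite the now-stale in-memory copies with the database values.
    db.Refresh(RefreshMode.OverwriteCurrentValues, cached);
}
```

Refresh only fixes objects this particular DataContext has already loaded, of course — anything cached by other contexts or other clients stays stale, which is really the heart of my question.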
You’re right: in this instance, using an ORM to load, change and then persist records is not efficient. My process goes something like this:
1) Early in implementation, use the ORM (in my case NHibernate) exclusively.
2) As development matures, identify performance issues, which will include large updates.
3) Refactor those out to a raw SQL or stored-procedure approach.
4) Use the Refresh(object) command to update cached objects.
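Steps 3 and 4 might look something like the following in NHibernate — a sketch only, where the `Order` entity, its properties and the archiving rule are invented, and the bulk statement uses HQL’s ExecuteUpdate:

```csharp
using System;
using NHibernate;

// Sketch: NHibernate bulk update (step 3) followed by Refresh (step 4).
// 'Order' is a hypothetical mapped entity; 'ordersAlreadyInMemory' stands
// for whatever objects this session has previously loaded.
using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    // Step 3: set-based HQL update, bypassing the per-object load/save cycle.
    int affected = session.CreateQuery(
            "update Order o set o.Status = :status where o.PlacedOn < :cutoff")
        .SetString("status", "Archived")
        .SetDateTime("cutoff", DateTime.UtcNow.AddYears(-1))
        .ExecuteUpdate();

    // Step 4: re-read any cached objects we still hold, since the HQL DML
    // statement went straight to the database and skipped the session cache.
    foreach (var order in ordersAlreadyInMemory)
        session.Refresh(order);

    tx.Commit();
}
```

The explicit Refresh loop is needed precisely because DML-style HQL operates on the database directly and does not touch in-memory state.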
My big problem has been informing other clients that the update has occurred. In most instances we have accepted that some clients will be stale (which is the case with standard ORM usage anyway) and then check a timestamp on update/insert.
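The timestamp check can be as simple as the following sketch, assuming each row carries a `LastModified` column; `GetLastModifiedFromDb` is a hypothetical helper that runs something like `SELECT LastModified FROM Orders WHERE Id = @id`:

```csharp
using System;

// Sketch: detect staleness by comparing a cached entity's timestamp
// against the row's current LastModified value in the database.
// 'Order' and 'GetLastModifiedFromDb' are hypothetical names.
bool IsStale(Order cached)
{
    DateTime current = GetLastModifiedFromDb(cached.Id);

    // A newer timestamp in the database means another client (or a bulk
    // UPDATE) has changed the row since we loaded it.
    return current > cached.LastModified;
}
```

A client that finds a cached object stale can then Refresh it (or simply reload) before acting on it.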