I have servlets that cache user information rather than retrieving it from the user store on every request (via a shared Ehcache). The issue is that if a client is multi-threaded and makes more than one simultaneous request before it has been authenticated, I get this in my log:
Retrieving User [Bob]
Retrieving User [Bob]
Retrieving User [Bob]
Returned [Bob] ...caching
Returned [Bob] ...caching
Returned [Bob] ...caching
What I want is for the first request to call the user service while the other two requests block; when the first request returns and caches the object, the other two go through:
Retrieving User [Bob]
blocking...
blocking...
Returned [Bob] ...caching
[Bob] found in cache
[Bob] found in cache
I’ve thought about locking on the String “Bob” (since, due to interning, it’s always the same object, right?). Would that work? And if so, how do I keep track of the keys that actually exist in the cache and build a locking mechanism around them that returns the valid object once it’s retrieved? Thanks.
You can’t guarantee that there will be no deadlock unless you have exclusive control of your locks. Interned Strings are globally visible, so they are a poor candidate.

Instead, use a map between keys and their corresponding locks. You might use synchronized access to a conventional map, or use a ConcurrentMap. I’m not sure which is cheaper, but I’d lean toward ConcurrentMap because it expresses your intent succinctly.

(Using a ReadWriteLock is optional; with it, you could do something fancy like allowing concurrent reads of a cached value by multiple threads, yet still have another thread get an exclusive lock when the value needs to be updated.)

It might be expensive to create a lot of locks that end up being discarded because a lock already exists. Or you might be on an old runtime without java.util.concurrent. In cases like those, you could synchronize on the map:
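A minimal sketch of that synchronize-on-the-map fallback. The names KeyLocks, getUser, and retrieveUser are illustrative, not from the question, and the slow user store and the cache are simulated here; the point is that the shared map is held only long enough to look up or create the per-key lock, while the slow load runs under the per-key lock alone:

```java
import java.util.HashMap;
import java.util.Map;

// One lock object per key, managed under a plain synchronized map.
class KeyLocks {
    private final Map<String, Object> locks = new HashMap<String, Object>();

    Object getLock(String key) {
        synchronized (locks) {
            Object lock = locks.get(key);
            if (lock == null) {
                lock = new Object();
                locks.put(key, lock);
            }
            return lock;
        }
    }
}

class CacheDemo {
    private static final Map<String, String> cache = new HashMap<String, String>();
    private static final KeyLocks keyLocks = new KeyLocks();
    static int loads;  // counts trips to the (simulated) user store

    // Stand-in for the real user-store lookup; deliberately slow.
    private static String retrieveUser(String name) {
        loads++;
        try { Thread.sleep(50); } catch (InterruptedException e) { }
        return name;
    }

    static String getUser(String name) {
        // All threads asking for the same name contend on the same lock,
        // so only the first one actually hits the user store.
        synchronized (keyLocks.getLock(name)) {
            String user;
            synchronized (cache) { user = cache.get(name); }
            if (user == null) {
                user = retrieveUser(name);
                synchronized (cache) { cache.put(name, user); }
            }
            return user;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread[] threads = new Thread[3];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(() -> getUser("Bob"));
            threads[i].start();
        }
        for (Thread t : threads) t.join();
        System.out.println("loads=" + loads);  // prints loads=1
    }
}
```

With three concurrent requests for "Bob", the first thread takes the per-key lock and loads; the other two block on that lock and then find the cached value, matching the desired log above.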
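For comparison, a sketch of the ConcurrentMap variant described above (ConcurrentKeyLocks is an illustrative name, not from the answer). computeIfAbsent requires Java 8; the putIfAbsent alternative in the comments works on Java 5+, and its discarded speculative Object is exactly the "locks that end up being discarded" cost mentioned earlier:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// One lock object per key, created atomically without ever
// synchronizing on the whole map.
class ConcurrentKeyLocks {
    private final ConcurrentMap<String, Object> locks =
            new ConcurrentHashMap<String, Object>();

    Object getLock(String key) {
        // Pre-Java-8 equivalent:
        //   Object lock = new Object();
        //   Object prior = locks.putIfAbsent(key, lock);
        //   return prior != null ? prior : lock;
        return locks.computeIfAbsent(key, k -> new Object());
    }
}
```

Callers then synchronize on the returned object around the check-then-load, exactly as in the plain-map version; every thread asking for the same key is guaranteed to receive the same lock object.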