
My Beef With ConcurrentMap

July 26, 2013

Many times I’ve needed to use a container such as a hash map to cache objects that are expensive to compute.

And often the cache must be thread-safe so that multiple threads may access it.

And frequently I’ve had a code reviewer ask “why didn’t you just use a concurrent map?” when they saw the synchronization scaffolding surrounding access to the map.

And always my answer is the same: the API of ConcurrentMap is flawed.

A common operation when caching is the lookup-create sequence: if the item is already cached, use it; otherwise create it and save it for future use. When creating the object is expensive, you usually want this whole sequence to be atomic, to prevent multiple threads from creating the same object simultaneously. Using a map and locks, this is usually handled with something like:

  V lookupOrCreate(K key) {
     synchronized (myMap) {
        V val = myMap.get(key);
        if (val == null) {
           val = computeExpensivelyForKey(key);
           myMap.put(key, val);
        }
        return val;
     }
  }

The equivalent atomic lookup-create sequence with a ConcurrentMap uses its putIfAbsent function:

  V lookupOrCreate(K key) {
     V prev = myMap.putIfAbsent(key, computeExpensivelyForKey(key));
     return prev != null ? prev : myMap.get(key);
  }

However, this doesn’t help much, because the caller must have already computed the expensive value before the lookup even happens (and if computing it were that cheap, caching might not be necessary anyway). Because of this, I’ve never found much use for the ConcurrentMap container, at least in a caching context, and almost always opt for a non-concurrent map with my own locking.
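To make this concrete, here is a minimal, self-contained sketch (the class and method names are my own, not from the original post) demonstrating that the argument to putIfAbsent is evaluated on every call, so the expensive computation runs even on a cache hit:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicInteger;

public class EagerCacheDemo {
    // Counts how many times the "expensive" computation actually runs.
    static final AtomicInteger computeCount = new AtomicInteger();
    static final ConcurrentMap<String, String> cache = new ConcurrentHashMap<>();

    // Stand-in for an expensive computation.
    static String computeExpensivelyForKey(String key) {
        computeCount.incrementAndGet();
        return key.toUpperCase();
    }

    static String lookupOrCreate(String key) {
        // The argument is evaluated BEFORE putIfAbsent is called,
        // so the expensive work happens whether or not the key is cached.
        String prev = cache.putIfAbsent(key, computeExpensivelyForKey(key));
        return prev != null ? prev : cache.get(key);
    }

    public static void main(String[] args) {
        lookupOrCreate("a");
        lookupOrCreate("a");
        lookupOrCreate("a");
        // Three lookups of a single key, yet three expensive computations.
        System.out.println("computations: " + computeCount.get());
    }
}
```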

The API would be much improved if it included an additional putIfAbsent overload taking a factory that is invoked only when needed:

  interface ValueFactory<K, V> {
     V create(K key);
  }

  V putIfAbsent(K key, ValueFactory<K, V> factory);

Now you could call computeExpensivelyForKey inside the factory, and the concurrent map would invoke it only when the key is absent.
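As a sketch of how such a factory-taking putIfAbsent could be implemented over a plain map with coarse locking (FactoryMap is a hypothetical name; the ValueFactory interface here carries both type parameters so the key type is available to create):

```java
import java.util.HashMap;
import java.util.Map;

public class FactoryMap<K, V> {
    // Hypothetical factory interface, as proposed above.
    public interface ValueFactory<K, V> {
        V create(K key);
    }

    private final Map<K, V> map = new HashMap<>();

    // Atomic lookup-or-create: the factory is invoked only when
    // the key has no cached value yet.
    public synchronized V putIfAbsent(K key, ValueFactory<K, V> factory) {
        V val = map.get(key);
        if (val == null) {
            val = factory.create(key);
            map.put(key, val);
        }
        return val;
    }
}
```

Note that this holds the map's lock for the duration of the factory call, so lookups of unrelated keys block each other; a production-grade version would use per-key or striped locking to let independent computations proceed concurrently.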

I should point out that ConcurrentMap has a performance benefit because it allows concurrent access. So if you expect many threads to access the map simultaneously, it may be the better choice despite its lacking interface. If you want both concurrent access and an atomic “lookup or create,” consider the Cache from the Guava library. I recently discovered this library and was happy to see that it offers this interface. It also has advanced features such as expiration times, an LRU eviction policy, etc., which are usually desired for caches anyway.

One Comment
  1. Wiktor Wandachowicz permalink
    June 15, 2014 2:12 AM

    Some discussion about this can be found at:

Yes, it’s possible to overcome ConcurrentMap’s shortcomings by extending it and providing your own implementation. For example, a ConcurrentHashMapWithInit was proposed in the aforementioned discussion.

But of course, extending the original ConcurrentMap with a lambda-style method that is called only when necessary is quite a proper idea, since we have Java 8 already.
