Re: multithreaded cache?

Discussion in 'Java' started by Eric Sosman, May 15, 2012.

  1. Eric Sosman

    Eric Sosman Guest

    On 5/15/2012 5:14 AM, bugbear wrote:
    > I'm using various Apache Commons Maps as a
    > multithread cache, protected using
    > ReentrantReadWriteLock, so that getting() uses a read lock,
    > and putting() uses a write lock.
    >
    > But I've got an issue: in the
    > case of a cache miss (protected by a read lock),
    > the required value is acquired using the "underlying function"
    > that the cache is over; this value is then put() into
    > the cache (protected by a write lock).
    >
    > This is all perfectly thread safe, and gives
    > correct results.
    >
    > However, if the underlying function is slow
    > and/or resource hungry (consider caching
    > a ray-traced image!) many threads can
    > end up calling the real function (the second
    > and subsequent threads get a miss while the
    > first thread is still inside the underlying function).
    >
    > "obviously..." what I want is for only
    > the thread that FIRST has a cache miss
    > calls the underlying function, whilst other
    > threads (for the same key) wait.
    >
    > This seems "obvious" enough that I'm guessing
    > there's a standard solution.
    >
    > Googling led me to several "heavy" libraries.
    >
    > This appears to be more of a locking/caching issue
    > than a Java issue, although my implementation
    > is in Java.
    >
    > Can anyone (please) point me at a
    > canonical solution/implementation?
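
    For concreteness, the scheme described above amounts to roughly the
    following (a sketch with made-up names; compute() stands in for the
    slow underlying function):

    Value get(Key key) {
        lock.readLock().lock();
        try {
            Value v = cache.get(key);
            if (v != null) {
                return v;
            }
        } finally {
            lock.readLock().unlock();
        }

        // Miss: every thread that reaches this point does the slow work,
        // even if another thread is already computing the same key.
        Value v = compute(key);
        lock.writeLock().lock();
        try {
            cache.put(key, v);
        } finally {
            lock.writeLock().unlock();
        }
        return v;
    }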


    Don't know whether it's "canonical," but one fairly obvious
    approach is to insert the missed key into the map immediately, with
    a value that means "Coming Soon To A Cache Near You." The
    original thread can then release the map lock and fetch the true
    value from its source.

    Other threads that look for the same value will "hit" the
    cache, but will recognize the "Coming Soon" as an indication to
    wait until the first thread comes up with the value.

    There must be many ways to arrange this, but one that requires
    very little awareness on the part of the querying threads is to
    make the hashed "value" a value-holder object. Then a querying
    thread could do something like

    // mapLock is the ReentrantReadWriteLock guarding the map.
    ValueHolder holder;
    mapLock.readLock().lock();
    try {
        holder = map.get(key);
    } finally {
        mapLock.readLock().unlock();
    }

    if (holder == null) {
        mapLock.writeLock().lock();
        try {
            holder = map.get(key);   // inserted by another thread?
            if (holder == null) {
                holder = new ValueHolder(key);
                map.put(key, holder);
            }
        } finally {
            mapLock.writeLock().unlock();
        }
    }

    Value value = holder.getValue();

    The ValueHolder class might look like

    class ValueHolder {
        private final Key key;
        private Value cachedValue;
        private SourceException fetchError;

        ValueHolder(Key key) {
            this.key = key;
        }

        // First caller computes the value; later callers block on the
        // monitor until it (or the error it produced) is available.
        synchronized Value getValue() throws SourceException {
            if (cachedValue == null) {
                if (fetchError != null) {
                    // notify new thread of old error
                    throw fetchError;
                }
                try {
                    cachedValue = fetchValueFromSource(key);
                } catch (SourceException ex) {
                    fetchError = ex;
                    throw fetchError;
                }
            }
            return cachedValue;
        }
    }

    (That's not the only way to deal with errors in fetching, but it
    seems perfectly reasonable to me even if it's a little odd to throw
    the same exception object multiple times.)
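
    If re-throwing the same exception object bothers you, one variation
    is to wrap the stored failure in a fresh exception for each caller,
    so every thread gets its own stack trace. Just a sketch; it assumes
    SourceException has a wrapping constructor, which it may not:

    if (fetchError != null) {
        // Each caller gets its own exception, chained to the original.
        throw new SourceException("cached failure for " + key, fetchError);
    }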

    The cache-querying threads lock the map only briefly, long
    enough to determine whether a ValueHolder is present. ValueHolder
    can delay its callers for a long time (if not, why cache at all?),
    but will only delay callers that are interested in that particular
    key; callers looking for other keys will get different ValueHolder
    instances and won't be interfered with. And only one thread will
    actually attempt fetchValueFromSource() on any given key.
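
    For what it's worth, the same one-fetch-per-key idea can also be
    written with a ConcurrentHashMap of FutureTask objects instead of a
    hand-rolled holder. A minimal sketch (the class and its names are
    mine, not part of the code above; Key, Value and
    fetchValueFromSource() are the same placeholders as before):

    import java.util.concurrent.*;

    class FutureCache {
        private final ConcurrentMap<Key, FutureTask<Value>> map =
                new ConcurrentHashMap<Key, FutureTask<Value>>();

        Value get(final Key key) throws Exception {
            FutureTask<Value> task = map.get(key);
            if (task == null) {
                FutureTask<Value> created = new FutureTask<Value>(
                        new Callable<Value>() {
                            public Value call() throws Exception {
                                return fetchValueFromSource(key);
                            }
                        });
                // putIfAbsent guarantees only one task per key survives.
                task = map.putIfAbsent(key, created);
                if (task == null) {
                    task = created;
                    task.run();       // this thread performs the fetch
                }
            }
            return task.get();        // other threads block here per key
        }

        Value fetchValueFromSource(Key key) throws Exception {
            throw new UnsupportedOperationException("supply the real fetch");
        }
    }

    One wrinkle: if the fetch fails you probably want to
    map.remove(key, task) so that a later call can retry, whereas the
    holder above deliberately remembers the failure.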

    --
    Eric Sosman
    Eric Sosman, May 15, 2012
    #1
