We have a system where the results of expensive database queries + business logic are cached to reduce latency for subsequent requests. Multiple hosts handle requests and use a shared cache. In the 'cache-aside' pattern, the request flow is:
- Determine whether the item is currently held in the cache
- If the item is not currently in the cache, read the item from the data store
- Store a copy of the item in the cache
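The cache-aside flow above can be sketched as follows. This is a minimal in-process illustration, not our actual code: the `cache` dict stands in for the shared cache, and `load_from_store` is a hypothetical placeholder for the expensive query plus business logic.

```python
# Hypothetical in-process stand-ins for the shared cache and data store.
cache = {}

def load_from_store(key):
    # Placeholder for the expensive database query + business logic.
    return f"value-for-{key}"

def get_cache_aside(key):
    # Step 1: determine whether the item is currently held in the cache.
    value = cache.get(key)
    if value is None:
        # Step 2: on a miss, read the item from the data store.
        value = load_from_store(key)
        # Step 3: store a copy of the item in the cache.
        cache[key] = value
    return value
```

Note that nothing here coordinates concurrent misses: between the miss in step 1 and the store in step 3, every other request for the same key also falls through to `load_from_store`.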
Because our queries and business logic are expensive, several thousand requests for the same item can arrive between steps 1 and 3, each triggering step 2 and overloading the data store. We use a pattern we dub the 'pending' or 'sentinel' pattern:
- Determine whether the item is currently held in the cache and atomically store a 'result pending' object (sentinel) if it is missing
- If the item is not currently in the cache, read the item from the data store; if the item in the cache is the sentinel, back off and retry until a real result arrives or a timeout expires (in which case, restart the whole operation)
- Store a copy of the item in the cache, replacing the sentinel value and effectively 'unblocking' requests trying to read the value
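The three steps above can be sketched like this. Again a hedged, single-process illustration: a `threading.Lock` emulates the cache's atomic check-and-add (real shared caches expose this as, e.g., memcached `add` or Redis `SET ... NX`), and `load_from_store` is a hypothetical placeholder for the expensive work.

```python
import threading
import time

cache = {}
_lock = threading.Lock()      # emulates the cache's atomic add operation
SENTINEL = object()           # the 'result pending' marker

def load_from_store(key):
    # Placeholder for the expensive database query + business logic.
    return f"value-for-{key}"

def check_or_add_sentinel(key):
    # Step 1: atomically either return the cached item (which may be the
    # sentinel) or store the sentinel if the key is missing.
    with _lock:
        if key in cache:
            return True, cache[key]
        cache[key] = SENTINEL
        return False, None

def get_with_sentinel(key, timeout=5.0, poll=0.05):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        hit, value = check_or_add_sentinel(key)
        if not hit:
            # We won the race: step 2, read from the data store, then
            # step 3, publish the result, replacing the sentinel and
            # unblocking any requests polling for the value.
            value = load_from_store(key)
            cache[key] = value
            return value
        if value is not SENTINEL:
            return value          # another request already published it
        time.sleep(poll)          # result pending: back off and retry
    # Timeout: in the real system we would restart the whole operation.
    raise TimeoutError(f"no result for {key!r} within {timeout}s")
```

Only the request that wins the atomic add ever touches the data store; everyone else polls the sentinel until the winner publishes the value or the timeout fires.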
I can't find any examples or discussion of this in the literature. Is there a name or any reference for this pattern?