- New cached data is added to the "New" cache tree
After a given age or size limit is reached, the "New" tree is essentially frozen
- "New" becomes "Old"
- Another empty "New" is created
After the longest-cached item has timed out, the "Old" tree is deleted.
- All the memory is freed.
- The freezing of the current "New" is triggered
On each cache write:
- Add the key to "New" if it doesn't exist
- Update or add the data and mark the time
- A matching key in "Old" is masked, since lookups check "New" first
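The roll-over and write path above can be sketched as follows. This is a minimal illustration, not the original implementation: `RollOverCache`, `max_age`, and `max_items` are hypothetical names, and plain dicts stand in for the cache trees.

```python
import time

class RollOverCache:
    """Two-generation cache: writes land in "new"; when "new" grows
    too old or too large, it becomes "old", and the previous "old"
    generation is dropped in one step, freeing its memory."""

    def __init__(self, max_age=60.0, max_items=1024):
        self.new = {}                    # current, writable generation
        self.old = {}                    # frozen generation, read-only
        self.created = time.monotonic()  # birth time of "new"
        self.max_age = max_age
        self.max_items = max_items

    def _maybe_rollover(self):
        # Triggered by ordinary cache calls -- no background thread,
        # and no per-item scan for expired entries.
        aged = time.monotonic() - self.created >= self.max_age
        full = len(self.new) >= self.max_items
        if aged or full:
            self.old = self.new          # old tree dropped wholesale; "new" becomes "old"
            self.new = {}
            self.created = time.monotonic()

    def put(self, key, value):
        self._maybe_rollover()
        # Add or update the key and mark the time; any stale copy in
        # "old" is masked because lookups check "new" first.
        self.new[key] = (value, time.monotonic())

    def get(self, key):
        self._maybe_rollover()
        for tree in (self.new, self.old):  # "new" masks "old"
            if key in tree:
                return tree[key][0]
        return None                        # cache miss
```

Note that an item surviving in "old" is still served until the next roll-over discards that generation, which is what bounds its lifetime without ever scanning for expired entries.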
Advantages of Roll-over design
No separate thread needed
- All cache work triggered by user interaction
- No need to scan for old data
Robust over long run times
- Old data automatically purged
- Tolerates low memory states
- No disk access
- Zero baseline CPU utilization (demand driven)
Can limit resource utilization
- Not all data must be cached
- Can fix total cached items
Allows distributed computing
- A cache can sit on each side of a network link