
Least Recently Used (LRU) Eviction Strategy #57

Open
Turnerj opened this issue May 16, 2020 · 5 comments
Labels
help wanted Extra attention is needed
Milestone

Comments

Turnerj commented May 16, 2020

Issue created from #53 (comment)

Currently, the only eviction strategy is absolute expiry time. If there were a way to define a "size" for the cache, it would be worth exploring a "Least Recently Used" system to ensure that the hottest items stay in the cache.

Thoughts:

  • Implement an extension (e.g. AutoEvict/FixedCapacity) that manages the more advanced eviction strategy
  • The new extension tracks the number of items in the cache, cache keys, expiry dates, and last access dates
    • Could use a LinkedList to track the most recently used items, or a timestamp directly
    • Probably easier with a ConcurrentDictionary using cache keys as the key and a custom type as the value
  • Once a criterion has been hit (e.g. X items in the cache), use what the extension knows to evict locally
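The tracking structure sketched above (cache keys mapped to access metadata, with a recency ordering and a capacity trigger) can be illustrated language-agnostically. The project itself is .NET, but a minimal Python sketch of the idea, using an ordered dictionary as the recency list, might look like the following; the names `LruTracker`, `touch`, and `capacity` are illustrative and not part of the codebase:

```python
from collections import OrderedDict


class LruTracker:
    """Tracks cache keys in recency order and reports which key to
    evict once a fixed capacity is exceeded (a sketch, not the
    project's actual extension API)."""

    def __init__(self, capacity):
        self.capacity = capacity
        # OrderedDict preserves insertion order; moving a key to the
        # end on every access keeps the least recently used key at
        # the front.
        self._entries = OrderedDict()

    def touch(self, key):
        """Record an access (or insertion) of `key` and return the
        evicted key, if the capacity criterion was hit."""
        if key in self._entries:
            self._entries.move_to_end(key)
            return None
        self._entries[key] = None
        if len(self._entries) > self.capacity:
            evicted, _ = self._entries.popitem(last=False)
            return evicted
        return None
```

For example, with `capacity=2`, touching `"a"`, `"b"`, then `"a"` again and finally `"c"` would evict `"b"`, since `"a"` was refreshed more recently. A real extension would additionally need to consult expiry dates and propagate the eviction to the cache layers, as described above.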

Challenges:

  • Keeping the extension up to date with cache data
    • Can use the various extension hooks, but it is still a bit fiddly
  • Only evicting locally
    • Can do what I already do for RedisRemoteEvictionExtension (pass the cache layers in directly), but it is pretty cumbersome

Notes:
Handling a distributed LRU would be excessively complex and likely not useful: every access to a cache item (including local hits) would have to broadcast that access back through the cache system and whatever distributed backends are in use.

@Turnerj Turnerj added this to the Future milestone May 16, 2020
@Turnerj Turnerj added the help wanted Extra attention is needed label Jun 4, 2020
Turnerj commented Sep 11, 2020

Might be worth investigating or comparing against (when the time comes) this .NET caching solution, which uses an LRU strategy: https://github.com/bitfaster/BitFaster.Caching

@prat0088

That would be fantastic. LRU and bounded size are two requirements for my usage scenario.

ghost commented Jan 25, 2021

This is something our team is looking for. It would be excellent to have this implemented.

Turnerj commented Jan 25, 2021

Thanks for your feedback - it won't make the next release (coming in the next few days) but it is definitely something I'm looking into!

Turnerj commented Oct 1, 2021

(For my own reference)

An interesting LFU cache implementation: https://github.com/papers-we-love/papers-we-love/blob/master/caching/a-constant-algorithm-for-implementing-the-lfu-cache-eviction-scheme.pdf

Not sure how applicable it is here, given it's an LFU rather than an LRU, and it describes a storage mechanism rather than an eviction strategy. There still might be some useful pieces of information to extract from it.
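For reference, the constant-time LFU scheme in that paper groups keys by access frequency so that both lookups and evictions avoid any sorting. A compact Python sketch of that idea (simplified; the class and method names are illustrative, and ties within a frequency are broken by recency):

```python
from collections import defaultdict, OrderedDict


class LfuCache:
    """O(1) LFU eviction sketch: keys are grouped into buckets by
    access count, and the least frequently (then least recently)
    used key is evicted once capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._values = {}                         # key -> value
        self._counts = {}                         # key -> access count
        self._buckets = defaultdict(OrderedDict)  # count -> keys, LRU order
        self._min_count = 0

    def _bump(self, key):
        # Move `key` from its current frequency bucket to the next one.
        count = self._counts[key]
        del self._buckets[count][key]
        if not self._buckets[count] and self._min_count == count:
            self._min_count = count + 1
        self._counts[key] = count + 1
        self._buckets[count + 1][key] = None

    def get(self, key):
        if key not in self._values:
            return None
        self._bump(key)
        return self._values[key]

    def put(self, key, value):
        if key in self._values:
            self._values[key] = value
            self._bump(key)
            return
        if len(self._values) >= self.capacity:
            # Evict the least recently used key in the lowest bucket.
            evicted, _ = self._buckets[self._min_count].popitem(last=False)
            del self._values[evicted]
            del self._counts[evicted]
        self._values[key] = value
        self._counts[key] = 1
        self._buckets[1][key] = None
        self._min_count = 1
```

As the comment above notes, this is a full storage mechanism rather than a standalone eviction strategy, but the frequency-bucket bookkeeping is the part that might transfer to an eviction extension.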
