
Ability to dump all data in cache #145

Open
ghost opened this issue Jan 28, 2021 · 7 comments

Comments

ghost commented Jan 28, 2021

Is there a way to dump all data in the cache? Or maybe just get a list of keys, so I can iterate over the cache and access each value?

ghost (Author) commented Jan 28, 2021

For Redis, this might need an instance of IServer to have access to the KEYS command.
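
For illustration only (this isn't an existing CacheTower API), a minimal sketch of reading keys through StackExchange.Redis once an IServer is available; Keys() pages with SCAN on Redis 2.8+, so it avoids the blocking behaviour of a raw KEYS call:

```csharp
using System;
using StackExchange.Redis;

class RedisKeyDump
{
    static void Main()
    {
        var connection = ConnectionMultiplexer.Connect("localhost:6379");

        // An IServer is bound to a single endpoint; Keys() uses SCAN under the hood
        // on Redis >= 2.8, so it pages through keys rather than halting the server.
        var server = connection.GetServer(connection.GetEndPoints()[0]);

        foreach (var key in server.Keys(database: 0, pattern: "*"))
        {
            Console.WriteLine(key);
        }
    }
}
```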

Turnerj (Member) commented Jan 28, 2021

Hey @lfcosio - currently there isn't a method to iterate over or dump out the data from the cache in any browse-able way. Do you mind expanding on how/why you'd need this?

It does sound like an interesting and reasonable feature; I just want an understanding of the how/why so I can work out how best to approach it in a future update.

prat0088 commented

In my case, there are times when I want to inspect the state of caches while debugging issues with our service. Specifically, I like to dump the keys, find what I'm looking for, and then dump a few values. Being able to get a list of keys would be enough for me; I could then fetch the data and dump that.

Turnerj (Member) commented Mar 16, 2021

Hey @prat0088 - so if I had a method like IAsyncEnumerable<string> GetKeysAsync(), would that work for you?

Individually it wouldn't be hard for the cache layers - in-memory has all its keys and file-based has a manifest. MongoDB can query all keys in a collection. Redis can query all keys in a database (cache keys aren't prefixed, so a whole DB of keys would be returned).

The harder part is deciding which layer we query, as some are local and some are distributed. If you want all cache keys, we would typically query the last layer, as it is likely to be distributed and have everything. If we query the first layer, which is typically local, you will only know of the keys cached locally.

I will assume then that you want to know what is cached in which layer, right?
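
To make the proposal above concrete, here is a rough sketch of what a key-enumerating extension could look like. The interface name and shape are hypothetical and only for illustration; CacheTower doesn't expose this today:

```csharp
using System.Collections.Generic;

// Hypothetical interface for illustration only; not part of CacheTower's current API.
public interface IKeyEnumerableCacheLayer
{
    // Streams every cache key known to this particular layer.
    IAsyncEnumerable<string> GetKeysAsync();
}

public static class CacheKeyDumpExample
{
    // Example consumer: dump every key from whichever layer the caller chose
    // (e.g. the last/distributed layer for the complete picture, or the
    // first/local layer if you only care about what is cached locally).
    public static async IAsyncEnumerable<string> DumpKeysAsync(IKeyEnumerableCacheLayer layer)
    {
        await foreach (var key in layer.GetKeysAsync())
        {
            yield return key;
        }
    }
}
```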

rapha22 commented May 7, 2021

One use case for this is multi-tenant applications that share the same cache stores. In the application I work on, we frequently need to flush all entries related to a tenant -- but that tenant only. We have a custom cache engine with memory and Redis layers, and one of its operations is to delete all keys with a specific prefix (the desired tenant's). Listing the keys would let us avoid storing them separately or doing this kind of workaround when using CacheTower (I'm not using it today, but I'm evaluating caching solutions to replace my homegrown one, and got really interested).

(Listing keys with a specific prefix would be nice too, like Redis' KEYS prefix* command, though I don't know whether that would be feasible for all storage mechanisms.)

Turnerj (Member) commented May 8, 2021

Thanks for the information @rapha22! With something like this, where would you be expecting the keys to come from? So if you had in-memory and Redis, is it just returning the keys from Redis?

rapha22 commented May 10, 2021

Hi, @Turnerj! I think Redis would be the better option, since it would have all the keys (in contrast to memory, which would only have one server's keys). That would guarantee that no keys we're interested in are missed.

In our case, we don't choose a layer to list the keys from, but instead tell each layer to clear itself by listing and removing keys with a specific prefix. In the Redis layer, we use the KEYS prefix* command, which halts the server while getting the keys, but since the memory cache handles most requests (the Redis layer is cleared before the memory layer) and clearing the cache is not something that happens all the time, the performance impact is not that noticeable.
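
As an illustration of the same prefix-flush idea (just a sketch, not CacheTower code): StackExchange.Redis exposes prefix matching through IServer.Keys(), which pages with SCAN on Redis 2.8+ instead of issuing a single blocking KEYS prefix* call:

```csharp
using System.Threading.Tasks;
using StackExchange.Redis;

static class TenantCacheFlush
{
    // Deletes every key starting with the given tenant prefix.
    // Keys() is SCAN-based on Redis >= 2.8, so the enumeration does not halt
    // the server the way a single KEYS prefix* command does.
    public static async Task FlushTenantAsync(ConnectionMultiplexer connection, string tenantPrefix)
    {
        var server = connection.GetServer(connection.GetEndPoints()[0]);
        var database = connection.GetDatabase();

        foreach (var key in server.Keys(database: database.Database, pattern: tenantPrefix + "*"))
        {
            await database.KeyDeleteAsync(key);
        }
    }
}
```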
