lua-resty-lrucache - Lua-land LRU cache based on the LuaJIT FFI.
- Name
- Status
- Synopsis
- Description
- Methods
- Prerequisites
- Installation
- Community
- Bugs and Patches
- Author
- Copyright and License
- See Also
This library is considered production ready.
```lua
-- file myapp.lua: example "myapp" module

local _M = {}

-- alternatively: local lrucache = require "resty.lrucache.pureffi"
local lrucache = require "resty.lrucache"

-- we need to initialize the cache on the Lua module level so that
-- it can be shared by all the requests served by each nginx worker process:
local c, err = lrucache.new(200)  -- allow up to 200 items in the cache
if not c then
    error("failed to create the cache: " .. (err or "unknown"))
end

function _M.go()
    c:set("dog", 32)
    c:set("cat", 56)
    ngx.say("dog: ", c:get("dog"))
    ngx.say("cat: ", c:get("cat"))

    c:set("dog", { age = 10 }, 0.1)  -- expire in 0.1 sec
    c:delete("dog")

    c:flush_all()  -- flush all the cached data
end

return _M
```
```nginx
# nginx.conf

http {
    # only if not using an official OpenResty release
    lua_package_path "/path/to/lua-resty-lrucache/lib/?.lua;;";

    server {
        listen 8080;

        location = /t {
            content_by_lua_block {
                require("myapp").go()
            }
        }
    }
}
```
This library implements a simple LRU cache for OpenResty and the ngx_lua module. The cache also supports per-item expiration times.
The LRU cache resides completely in the Lua VM and is subject to Lua GC. As
such, do not expect it to get shared across the OS process boundary. The upside
is that you can cache arbitrarily complex Lua values (such as deeply nested Lua
tables) without the overhead of serialization, as required by ngx_lua's shared
dictionary API.
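Because cached values never leave the Lua VM, a nested table goes in and comes back by reference, with no encode/decode step. A minimal sketch (the key name and table contents here are made up for illustration):

```lua
local lrucache = require "resty.lrucache"

local c = assert(lrucache.new(100))

-- store a deeply nested Lua table directly; no JSON or other
-- serialization is needed, unlike with the shared dictionary API
c:set("user:42", {
    name = "alice",
    prefs = { theme = "dark", tags = { "a", "b" } },
})

-- the same table comes back by reference, nested fields intact
local user = c:get("user:42")
ngx.say(user.prefs.tags[1])
```

Note that because values are shared by reference, mutating a returned table also mutates the cached copy.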
The downside is that your cache is always limited to the current OS process
(i.e. the current Nginx worker process). It does not really make much sense to
use this library in the context of init_by_lua because the cache will not get
shared by any of the worker processes (unless you just want to "warm up" the
cache with predefined items which will get inherited by the workers via
fork()).
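If that fork()-based warm-up is actually what you want, it can be sketched as below; the module-level field name my_cache and the pre-populated items are hypothetical:

```nginx
# nginx.conf (sketch)
http {
    init_by_lua_block {
        -- runs once in the master process, before the workers are forked
        local lrucache = require "resty.lrucache"
        package.loaded.my_cache = assert(lrucache.new(1000))

        -- pre-populate; each worker inherits its own private copy via fork()
        package.loaded.my_cache:set("config_version", 1)
    }
}
```

After the fork, each worker holds an independent copy: writes made by one worker are never visible to the others.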
This library offers two different implementations in the form of two classes:
resty.lrucache and resty.lrucache.pureffi. Both implement the same API.
The only difference is that the latter is a pure FFI implementation that also
implements an FFI-based hash table for the cache lookup, while the former uses
native Lua tables.
If the cache hit rate is relatively high, you should use the resty.lrucache
class, which is faster than resty.lrucache.pureffi.
However, if the cache hit rate is relatively low and there can be a lot of
variations of keys inserted into and removed from the cache, then you should
use the resty.lrucache.pureffi class instead.
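Since both classes expose the same API, switching between them is just a matter of the require line. A minimal sketch (the cache size and TTL values here are arbitrary):

```lua
-- the only line that differs from the resty.lrucache version:
local lrucache = require "resty.lrucache.pureffi"

local c, err = lrucache.new(1000)
if not c then
    error("failed to create the cache: " .. (err or "unknown"))
end

-- same methods as resty.lrucache
c:set("key", "value", 60)  -- optional TTL in seconds
local v = c:get("key")
```

This makes it cheap to benchmark both implementations against your real key distribution and pick whichever performs better.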