Module cachestack
Implements a system for deterministic recycling of objects, via an underlying cache.
When an object is created, or fetched from the cache, it may be registered with this system. At a given point in execution, the system will then claim the object and add it to the cache. The system is primarily intended for the more performance-intensive parts of a program and must be activated explicitly, being idle otherwise. When not running, registration is a no-op, though the cache remains available for fetching.
An instance of this system is created via the NewCacheStack function. (Typical programs
will only need one instance, but this is not required, e.g. various subsystems could each
have their own.) NewCacheStack returns two functions: NewType, which we will return
to shortly, as well as a WithLayer routine.
We may invoke the latter to use the cache stack. The function takes a body along with a variable argument list. In pseudo-code:
WithLayer(function(a, b) -- a: 5, b: "dog"
  local object = GetObject() -- fetch from the cache or create a new one

  Register(object) -- we want to reclaim this object later
  InitializeAndUse(object, a, b)
end, 5, "dog") -- when the call ends, object is sent to the cache
The first time around, the cache would be empty, so a new object would need to be created.
Since we register this object, it will be claimed by the cache once we have executed the
WithLayer body.
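To make the register-and-claim cycle concrete, here is a minimal self-contained sketch, assuming a single object type and no nesting. The Fetch, Register, and WithLayer functions below are stand-ins for illustration, not the module's actual implementation:

```lua
-- Minimal sketch of the register / claim cycle (not the real module).
local cache, layer = {}, nil

local function Fetch ()
  return table.remove(cache) -- reuse a cached object, if any
end

local function Register (object)
  if layer then layer[#layer + 1] = object end -- no-op when idle
end

local function WithLayer (body, ...)
  local prev = layer

  layer = {} -- push a fresh list

  body(...)

  for _, object in ipairs(layer) do
    cache[#cache + 1] = object -- claim registered objects
  end

  layer = prev -- pop
end

WithLayer(function(a, b)
  local object = Fetch() or {} -- cache is empty on the first pass

  Register(object)

  object.a, object.b = a, b
end, 5, "dog")

print(#cache) --> 1: the object was claimed when the layer ended
```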
Most code will not have a Register() call out in the open like this. A vector library,
for instance, might hard-wire it, as well as GetObject(), into various factory functions
and operators, e.g. ZeroVector(), __add, etc. For this reason, we can wind up with
objects that we want to keep, but that the cache would like to claim. We can unregister
such objects by returning them from the WithLayer body:
local result, number = WithLayer(function()
  local a, b = vector.Random(), vector.Random()
  local c = a * 5 + b * 6 + vector.Random() -- creates / fetches lots of intermediates

  return c, 8 -- return final result of computation; non-registered objects fine too
end)
Since we returned it, c will not be claimed when WithLayer() completes. We can safely
use it—as result—without later code hijacking it and giving us perplexing
errors. The garbage collector will also now treat it like any other object.
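The sparing of returned values can be sketched in the same self-contained style. Again, this is a simplification for illustration, not the module's actual code:

```lua
-- Sketch: returned objects are removed from the layer's list.
local unpack = table.unpack or unpack -- Lua 5.2+ vs. 5.1
local cache, layer = {}, nil

local function Register (object)
  if layer then layer[object] = true end
end

local function WithLayer (body, ...)
  local prev = layer

  layer = {}

  local results = { body(...) }

  for _, value in ipairs(results) do
    layer[value] = nil -- returned objects are spared
  end

  for object in pairs(layer) do
    cache[#cache + 1] = object
  end

  layer = prev

  return unpack(results)
end

local result, number = WithLayer(function()
  local temp, keep = {}, {}

  Register(temp)
  Register(keep)

  return keep, 8
end)

print(#cache, number) --> 1  8: temp was cached, keep was spared
```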
As hinted at by the "layer" and "stack" terminology, we may nest these calls, allowing for the usual benefits of composition. In these more general cases, returning an object will prevent its immediate caching; however, the outer layer will try again:
local function InnerBody (rhs)
  return IdentityMatrix() * rhs -- intermediate gets cached, result temporarily spared
end

local v = WithLayer(function()
  local d, e = vector.Random(), vector.Random() -- inner layer will not try to claim these...
  local a = WithLayer(InnerBody, e) -- ...even if we use them there
  local b = WithLayer(InnerBody, 7)

  return a:DotProduct(b)
end) -- outer layer ended: a and b sent to cache, along with d and e
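The nesting behavior amounts to a stack of layers, where the layer below adopts any objects a popped layer returned. A hedged self-contained sketch (the real module's internals may differ):

```lua
-- Sketch of nested layers: the outer layer adopts spared objects.
local unpack = table.unpack or unpack -- Lua 5.2+ vs. 5.1
local cache, stack = {}, {}

local function Register (object)
  local layer = stack[#stack]

  if layer then layer[object] = true end
end

local function WithLayer (body, ...)
  stack[#stack + 1] = {}

  local results = { body(...) }
  local layer = table.remove(stack)
  local below = stack[#stack]

  for _, value in ipairs(results) do
    if layer[value] then
      layer[value] = nil

      if below then below[value] = true end -- the outer layer tries again
    end
  end

  for object in pairs(layer) do
    cache[#cache + 1] = object
  end

  return unpack(results)
end

local v = WithLayer(function()
  local a = WithLayer(function()
    local inner = {}

    Register(inner)

    return inner -- temporarily spared...
  end)

  return 42 -- ...but not returned here, so the outer layer claims it
end)

print(#cache, v) --> 1  42
```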
As the last example demonstrates—mixing matrices and vectors—it can be useful
to allow multiple object types. In fact, the cache itself is logically divided by type,
and this is where the NewType routine mentioned a while back comes into play. Given
our previous example, we might have begun our program so:
local MatrixType = NewType()
local VectorType = NewType()
These types are themselves functions. They accept various commands, most operating on an instance of the type in question. (Since it aims for generality, the system does not make many assumptions about its input; validation is the user's responsibility.) A partially implemented vector type might look like:
local VectorType = NewType()

-- Vector methods.
local Methods = {}

Methods.__index = Methods

local function GetVector ()
  local v = VectorType("fetch") -- try to get a vector from the cache

  if not v then
    v = setmetatable({}, Methods) -- failing that, make a new one
  end

  VectorType("register", v) -- register it, fetched or new (no-op unless caching is active)

  return v
end

local function SetVector (x, y)
  local v = GetVector()

  v.x, v.y = x, y

  return v
end

function Methods.__add (v1, v2)
  return SetVector(v1.x + v2.x, v1.y + v2.y)
end

function Methods:DotProduct (other)
  return self.x * other.x + self.y * other.y
end

function Methods.__mul (a, b)
  if type(a) == "number" then
    return SetVector(a * b.x, a * b.y)
  elseif type(b) == "number" then
    return SetVector(b * a.x, b * a.y)
  else
    return SetVector(a.x * b.x, a.y * b.y) -- arbitrarily do memberwise
  end
end

-- ETC.

-- Exported interface: vector factories et al.
local M = {}

function M.Random ()
  return SetVector(math.random(), math.random())
end

M.Vector = SetVector

-- ETC.

return M
This ought to give some idea of the "hard-wiring" mentioned earlier. All the cache-related
machinery—here comprising the "fetch" and "register" commands—is done
indirectly through the GetVector() and SetVector() utility functions, which the user
never needs to see.
A few additional commands allow objects to anchor other items, e.g. a matrix view might need to hang on to its parent, where the actual memory is found. This is included in the type since the items themselves might be registered; they must not be cached while in use.
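One way such anchoring could be realized is with Lua's weak tables plus a reference count. The Ref and GetRef names below are hypothetical stand-ins for the type's "ref" and "get_ref" commands; this is a sketch of the idea, not the module's implementation:

```lua
-- Sketch: a weakly-keyed table ties each item's lifetime to its owner,
-- while a reference count would keep the item out of the cache.
local refs = setmetatable({}, { __mode = "k" }) -- weak keys: owners
local ref_counts = {}

local function Ref (object, category, item)
  local entries = refs[object] or {}
  local previous = entries[category]

  if previous then
    ref_counts[previous] = ref_counts[previous] - 1 -- evict the old item
  end

  entries[category] = item
  refs[object] = entries
  ref_counts[item] = (ref_counts[item] or 0) + 1
end

local function GetRef (object, category)
  local entries = refs[object]

  return entries and entries[category]
end

local view, parent = {}, {}

Ref(view, "parent", parent) -- a matrix view hangs on to its parent

print(GetRef(view, "parent") == parent, ref_counts[parent]) --> true  1
```

Because the keys of refs are weak, an owner that becomes garbage releases its anchored items automatically.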
Functions

- NewCacheStack ()

  Instantiate a CacheStack. See the summary for a deeper treatment.
Returns:

- function (NewType)

  Called as local tfunc = NewType(on_cache), where tfunc is a function used to interface with the type, via the following commands and arguments:

  - "fetch": Called as object = tfunc("fetch"). If an object is available in this type's cache, removes and returns it; otherwise, returns nil.
  - "get_ref": Called as local item = tfunc("get_ref", object, category). If an earlier "ref" command added an item under this category, returns it; otherwise, returns nil.
  - "ref": Called as tfunc("ref", object, category, item), where category is some non-nil name. Creates a weakly-keyed reference to item under the given category, incrementing its reference count. This ties item's lifetime to object, while also keeping it from being sent to the cache. Any item previously referenced under this category is evicted and has its reference count decremented.
  - "register": Called as tfunc("register", object). If a layer is active, the object will be added to its list for later caching; otherwise, this is a no-op.
  - "unref": Called as tfunc("unref", object, category). If an earlier "ref" command added an item under this category, removes it and decrements its reference count.
  - "unregister": Called as tfunc("unregister", object). If the object was registered in any layer's list, or is referenced and being held, it is removed. This has no effect on already-cached objects.

  If provided, the on_cache behavior will be called as pcall(on_cache, item) whenever an item gets sent to this type's cache. If Lua closes and the cache is not empty, any __gc metamethods belonging to the lingering items are invoked. The "on cache" behavior should be written with this in mind, to avoid introducing "double free"-type errors.
- function (WithLayer)

  Called as return WithLayer(body, ...). This pushes a layer onto the cache stack and makes a protected call to body with the remaining arguments. When body concludes or errors out, the layer is popped. If there was another layer underneath, it picks up where it left off.

  If we try to register an item while a layer is running, it will be added to a list kept by that layer. Any items in this list when the layer is popped will be added to the cache.

  If body throws an error, the whole list is sent to the cache. The error is then propagated.

  Otherwise, whatever body returns is returned by WithLayer itself. If any of these return values are items registered in this layer's list, they will be removed, although the next layer down, if present, will adopt them. (See the module summary for examples.)

  If an object is still referenced, it will be moved to a holding area, rather than the cache, until its reference count drops to 0.
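The error path, in which the layer's whole list is cached and the error re-raised, can be sketched like so (a simplification of the real machinery):

```lua
-- Sketch: body runs under pcall; registered items are cached either way.
local cache, layer = {}, nil

local function Register (object)
  if layer then layer[#layer + 1] = object end
end

local function WithLayer (body, ...)
  local prev = layer

  layer = {}

  local ok, err = pcall(body, ...)

  for _, object in ipairs(layer) do
    cache[#cache + 1] = object -- claimed even when body errored
  end

  layer = prev

  if not ok then error(err, 0) end -- propagate the error
end

local ok, message = pcall(WithLayer, function()
  Register({})

  error("boom", 0)
end)

print(ok, message, #cache) --> false  boom  1
```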
-
function
Called as