Essay - Published: 2024.11.20 | Tags: build, cache, create, dotnet, fsharp
In this post we'll build a simple lookaside cache with F#.
A lookaside cache is one of the most common types of caches. It's relatively simple to implement and reason about while potentially offering substantial performance improvements, so it's often a first choice until a better approach is settled on.
A lookaside cache works like this:
1. Check the cache for the requested key.
2. On a hit (the key is present and unexpired), return the cached value.
3. On a miss, fetch the value from the source of truth, store it in the cache, and return it.
Simple logic but quite effective at reducing load on downstream systems.
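Stripped of generics, classes, and async, the whole flow fits in a few lines. Here's a minimal single-threaded sketch, where lookup and getFromSource are hypothetical names and getFromSource stands in for whatever your downstream call is:

```fsharp
open System
open System.Collections.Generic

// Minimal single-threaded sketch of the lookaside flow.
let cache = Dictionary<string, string * DateTimeOffset>()
let ttl = TimeSpan.FromSeconds 30.0

// Stand-in for the real downstream call (database, API, etc.).
let getFromSource (key: string) = sprintf "value-for-%s" key

let lookup (key: string) =
    match cache.TryGetValue key with
    | true, (value, expiration) when expiration > DateTimeOffset.UtcNow ->
        // Cache hit: skip the downstream call entirely.
        value
    | _ ->
        // Cache miss (or expired entry): hit the source of truth,
        // store the result with a fresh expiration, and return it.
        let value = getFromSource key
        cache.[key] <- (value, DateTimeOffset.UtcNow + ttl)
        value
```

The rest of the post builds this same shape out into a reusable, thread-safe, async version.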
We start by modeling the data that will live in our cache. For us this is a generic value wrapped in a CachedItem record that also carries an expiration time, so we can tell whether an entry is still valid.
type CachedItem<'T> =
    {
        Value: 'T
        Expiration: DateTimeOffset
    }
With our cached data modeled, we can now build our caching logic to operate on that model.
Here we create a cache that:
- stores values of a generic type 'v, keyed by strings
- exposes a GetOrAddAsync function that returns the cached value on a fresh hit, and otherwise calls a caller-supplied refresh function and caches its result for the given TTL
type SimpleKeyCache<'v>() =
    let _cache = ConcurrentDictionary<string, CachedItem<'v>>()

    member this.GetOrAddAsync
        (key: string)
        (refreshFn: string -> Task<'v>)
        (ttl: TimeSpan)
        : Task<'v>
        =
        task {
            let now = DateTimeOffset.UtcNow

            match _cache.TryGetValue(key) with
            | true, item when item.Expiration > now ->
                return item.Value
            | _ ->
                let! newResult = refreshFn key

                let newCachedItem =
                    {
                        Value = newResult
                        Expiration = now + ttl
                    }

                _cache.AddOrUpdate(key, newCachedItem, fun key oldValue -> newCachedItem) |> ignore
                return newResult
        }
Here's a demo to prove this works:
The full source code for this example can be found below. HAMINIONs members get access to the full source code files (github) for this example, along with dozens of others, so you can clone it to your own machine and run it.
open System
open System.Collections.Concurrent
open System.Threading.Tasks

printfn "Hello from F#"

(* Cache *)
type CachedItem<'T> =
    {
        Value: 'T
        Expiration: DateTimeOffset
    }

type SimpleKeyCache<'v>() =
    let _cache = ConcurrentDictionary<string, CachedItem<'v>>()

    member this.GetOrAddAsync
        (key: string)
        (refreshFn: string -> Task<'v>)
        (ttl: TimeSpan)
        : Task<'v>
        =
        task {
            let now = DateTimeOffset.UtcNow

            match _cache.TryGetValue(key) with
            | true, item when item.Expiration > now ->
                return item.Value
            | _ ->
                let! newResult = refreshFn key

                let newCachedItem =
                    {
                        Value = newResult
                        Expiration = now + ttl
                    }

                _cache.AddOrUpdate(key, newCachedItem, fun key oldValue -> newCachedItem) |> ignore
                return newResult
        }

(* Examples *)
let mutable refreshCount = 0

let refreshFunction
    (s: string)
    : Task<int>
    =
    task {
        refreshCount <- refreshCount + 1
        printfn "Fetching for key: %A refreshCount: %A" s refreshCount
        return refreshCount
    }

let cache = SimpleKeyCache<int>()
let cacheTtl = TimeSpan.FromSeconds(2)
let cacheKey = "iamakey"

let getOrAdd = fun () ->
    cache.GetOrAddAsync
        cacheKey
        refreshFunction
        cacheTtl
    |> fun t -> t.Result

printfn "Fetch key - expect refresh"
printfn "* value: %A" (getOrAdd())

printfn "Fetch key - expect cache hit "
printfn "* value: %A" (getOrAdd())

printfn "Waiting til expiry"
Task.Delay(cacheTtl).Wait()

printfn "Fetch key - expect refresh"
printfn "* value: %A" (getOrAdd())
This outputs:
Hello from F#
Fetch key - expect refresh
Fetching for key: "iamakey" refreshCount: 1
* value: 1
Fetch key - expect cache hit
* value: 1
Waiting til expiry
Fetch key - expect refresh
Fetching for key: "iamakey" refreshCount: 2
* value: 2
This is a very simple cache implementation and is missing many features, performance optimizations, and concurrency safeguards that more robust implementations have (for example, concurrent misses on the same key will each invoke the refresh function, a classic cache stampede), so use it at your own discretion. If you're looking for an off-the-shelf cache, you might consider starting with .NET's standard in-memory cache - it's basically a wrapper on ConcurrentDictionary with a much more robust implementation.
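If the stampede behavior matters for your workload, one common fix is to store a Lazy-wrapped task per key so racing callers share a single refresh. Here's a minimal sketch of that idea (CoalescingKeyCache is a hypothetical name, not part of the code above, and the expiry handling is deliberately simplified):

```fsharp
open System
open System.Collections.Concurrent
open System.Threading.Tasks

type CachedItem<'T> =
    {
        Value: 'T
        Expiration: DateTimeOffset
    }

// Sketch: each entry is a Lazy<Task<_>> so that concurrent misses on the
// same key share one refreshFn call instead of each firing their own.
type CoalescingKeyCache<'v>() =
    let _cache = ConcurrentDictionary<string, Lazy<Task<CachedItem<'v>>>>()

    member this.GetOrAddAsync
        (key: string)
        (refreshFn: string -> Task<'v>)
        (ttl: TimeSpan)
        : Task<'v>
        =
        task {
            let mkEntry () =
                lazy (task {
                    let! value = refreshFn key
                    return { Value = value; Expiration = DateTimeOffset.UtcNow + ttl }
                })

            // GetOrAdd publishes one Lazy per key; Lazy's default thread
            // safety guarantees the inner task is started only once.
            let! item = _cache.GetOrAdd(key, fun _ -> mkEntry ()).Value

            if item.Expiration > DateTimeOffset.UtcNow then
                return item.Value
            else
                // Expired: drop the stale entry and rebuild it.
                // (Still a sketch - a racing caller could remove a fresh entry.)
                _cache.TryRemove(key) |> ignore
                let! fresh = _cache.GetOrAdd(key, fun _ -> mkEntry ()).Value
                return fresh.Value
        }
```

The trade-off is an extra allocation per entry and slightly hairier expiry logic, in exchange for at most one in-flight refresh per key.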
That said, sometimes you just want a simple implementation of something without taking external dependencies, and in those cases this cache should serve you well. In fact, this is the cache currently being used for One Million Checkboxes and it seems to be working fine.
If you liked this post you might also like:
The best way to support my work is to like / comment / share for the algorithm and subscribe for future updates.