Build a Simple F# Lookaside Cache

Date: 2024-11-20 | Tags: create, build, fsharp, cache, dotnet

In this post we'll build a simple lookaside cache with F#.

What is a Lookaside Cache?

A lookaside cache is one of the most common types of caches. It's relatively simple to implement and reason about while potentially offering substantial performance improvements, so it's often a first choice until a better paradigm is decided on.

A lookaside cache works like this:

  • On fetch - see if the item is in the cache
  • If the item is in the cache and not expired -> return it
  • Else get a new item, save it in the cache -> return it

Simple logic but quite effective at reducing load on downstream systems.
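
If it helps to see that flow as code before we build the real thing, here's a minimal sketch of the pattern as a standalone F# function. The tryGet, store, and fetch parameters are placeholders assumed just for this illustration - the actual implementation below uses a ConcurrentDictionary instead.

open System

// Placeholder signatures assumed for this sketch only:
//   tryGet : string -> (DateTimeOffset * 'T) option   - look the key up in the cache
//   store  : string -> DateTimeOffset -> 'T -> unit   - save a value with its expiration
//   fetch  : string -> 'T                             - get a fresh value from the source
let lookaside tryGet store fetch (ttl: TimeSpan) (key: string) =
    let now = DateTimeOffset.UtcNow
    match tryGet key with
    | Some (expiration, value) when expiration > now ->
        value                        // in cache and not expired -> return it
    | _ ->
        let fresh = fetch key        // else get a new item
        store key (now + ttl) fresh  // save it in the cache
        fresh                        // -> return it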

Building a lookaside cache with F#

We start by modeling the data that will live in our cache. For us this is a generic value wrapped in a CachedItem record that carries an expiration time, so we know whether the item is still valid.

type CachedItem<'T> =
    {
        Value: 'T 
        Expiration: DateTimeOffset
    }
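
As a quick illustration of how this record gets used (the numbers here are made up), a cached item counts as valid only while its Expiration is still in the future:

// Assumes open System, as in the full listing below
let item =
    {
        Value = 42
        Expiration = DateTimeOffset.UtcNow + TimeSpan.FromMinutes(5.0)
    }

// This is the check the cache performs on every lookup
let stillValid = item.Expiration > DateTimeOffset.UtcNow  // true for the next 5 minutes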

With our cached data modeled, we can now build our caching logic to operate on that model.

Here we create a cache that:

  • Is generic over the cached value type 'v
  • Uses a ConcurrentDictionary under the hood to handle concurrent write access (though we don't lean on its atomic update guarantees much - more on that after the implementation)
  • Exposes a GetOrAddAsync function that
    • Takes the cache key of the item (for lookup and save)
    • Takes a refresh function for fetching a new item
    • Takes a time to live (ttl) for setting the item's expiration
    • Is wrapped in a task so we don't hog threads on IO

type SimpleKeyCache<'v>() = 

    let _cache = ConcurrentDictionary<string, CachedItem<'v>>() 

    member this.GetOrAddAsync 
        (key: string) 
        (refreshFn: string -> Task<'v>) 
        (ttl: TimeSpan) 
        : Task<'v> 
        = 
        task {
            let now = DateTimeOffset.UtcNow

            match _cache.TryGetValue(key) with
            | true, item when item.Expiration > now ->
                // Cache hit and not expired -> return it
                return item.Value
            | _ -> 
                // Miss or expired -> fetch a fresh value and cache it
                let! newResult = refreshFn key 
                let newCachedItem = 
                    {
                        Value = newResult 
                        Expiration = now + ttl
                    }
                _cache.AddOrUpdate(key, newCachedItem, (fun _ _ -> newCachedItem)) |> ignore
                return newResult
        }
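
One consequence of this design worth calling out: when a key is cold or expired, two callers can race, both miss, and both invoke refreshFn, with the last writer winning. That's usually acceptable for a lookaside cache, but if it ever matters, one common approach is to cache a Lazy-wrapped task so concurrent callers share a single in-flight refresh. Here's a rough sketch of that variant (an assumption for illustration, not something the demo below relies on):

// Assumes the same opens and the CachedItem type from the listing above
type SingleFlightCache<'v>() =

    let _cache = ConcurrentDictionary<string, Lazy<Task<CachedItem<'v>>>>()

    member this.GetOrAddAsync
        (key: string)
        (refreshFn: string -> Task<'v>)
        (ttl: TimeSpan)
        : Task<'v>
        =
        task {
            // Each Lazy wraps one refresh; Lazy guarantees the task is started at most once
            let makeEntry () =
                lazy (
                    task {
                        let! value = refreshFn key
                        return { Value = value; Expiration = DateTimeOffset.UtcNow + ttl }
                    })

            // GetOrAdd hands every concurrent caller the same Lazy for a given key
            let entry = _cache.GetOrAdd(key, fun _ -> makeEntry ())
            let! item = entry.Value

            if item.Expiration > DateTimeOffset.UtcNow then
                return item.Value
            else
                // Expired: swap in a fresh entry and await it
                // (racing replacers can still each refresh once - a tradeoff kept for simplicity)
                let fresh = makeEntry ()
                _cache.[key] <- fresh
                let! refreshed = fresh.Value
                return refreshed.Value
        }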

Using the F# lookaside cache

Here's a demo to prove this works:

  • On the initial fetch -> refreshes
  • Fetch again -> doesn't refresh (cache hit)
  • Wait until after the ttl and fetch again -> refreshes

The full source code for the example can be found below. HAMINIONs members get access to the full source files (GitHub) for this example, along with dozens of others, so you can clone it on your own machine and run it.

open System
open System.Collections.Concurrent
open System.Threading.Tasks

printfn "Hello from F#"

(* Cache *)

type CachedItem<'T> =
    {
        Value: 'T 
        Expiration: DateTimeOffset
    }

type SimpleKeyCache<'v>() = 

    let _cache = ConcurrentDictionary<string, CachedItem<'v>>() 

    member this.GetOrAddAsync 
        (key: string) 
        (refreshFn: string -> Task<'v>) 
        (ttl: TimeSpan) 
        : Task<'v> 
        = 
        task {
            let now = DateTimeOffset.UtcNow

            match _cache.TryGetValue(key) with
            | true, item when item.Expiration > now ->
                // Cache hit and not expired -> return it
                return item.Value
            | _ -> 
                // Miss or expired -> fetch a fresh value and cache it
                let! newResult = refreshFn key 
                let newCachedItem = 
                    {
                        Value = newResult 
                        Expiration = now + ttl
                    }
                _cache.AddOrUpdate(key, newCachedItem, (fun _ _ -> newCachedItem)) |> ignore
                return newResult
        }

(* Examples *)

let mutable refreshCount = 0
let refreshFunction
    (s: string)
    : Task<int>
    =
    task {
        refreshCount <- refreshCount + 1
        printfn "Fetching for key: %A refreshCount: %A" s refreshCount

        return refreshCount
    } 

let cache = SimpleKeyCache<int>()

let cacheTtl = TimeSpan.FromSeconds(2.0)
let cacheKey = "iamakey"

let getOrAdd = fun() -> 
    cache.GetOrAddAsync 
        cacheKey
        refreshFunction
        cacheTtl
    |> fun t -> t.Result // block for the result - fine in a console demo

printfn "Fetch key - expect refresh"
printfn "* value: %A" (getOrAdd())

printfn "Fetch key - expect cache hit "
printfn "* value: %A" (getOrAdd())

printfn "Waiting til expiry"
Task.Delay(cacheTtl).Wait() 

printfn "Fetch key - expect refresh"
printfn "* value: %A" (getOrAdd())

This outputs:

Hello from F#
Fetch key - expect refresh
Fetching for key: "iamakey" refreshCount: 1
* value: 1
Fetch key - expect cache hit 
* value: 1
Waiting til expiry
Fetch key - expect refresh
Fetching for key: "iamakey" refreshCount: 2
* value: 2

Next

This is a very simple cache implementation and it's missing many features, performance optimizations, and concurrency safeguards that more robust implementations have, so use it at your own discretion. If you're looking for an off-the-shelf cache, you might consider starting with dotnet's standard in-memory cache (MemoryCache from Microsoft.Extensions.Caching.Memory) - under the hood it's also built around a ConcurrentDictionary, just with a much more robust implementation.
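
For comparison, here's roughly what the same lookup looks like with that built-in cache. This assumes the Microsoft.Extensions.Caching.Memory package is referenced; treat it as a sketch rather than a drop-in replacement for the code above:

open System
open System.Threading.Tasks
open Microsoft.Extensions.Caching.Memory

let memoryCache = new MemoryCache(MemoryCacheOptions())

// GetOrCreateAsync plays the same role as our GetOrAddAsync: return the cached
// value if present, otherwise run the factory, cache its result, and return it
let getValueAsync (key: string) : Task<int> =
    memoryCache.GetOrCreateAsync(key, fun entry ->
        entry.SetAbsoluteExpiration(TimeSpan.FromSeconds(2.0)) |> ignore
        task { return 42 })  // stand-in for a real downstream fetch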

That said, sometimes you just want a simple implementation of something without taking on external dependencies, and in those cases this cache should work fine. In fact, this is the cache currently being used for One Million Checkboxes and so far it seems to be working fine.

Want more like this?

The best / easiest way to support my work is by subscribing for future updates and sharing with your network.