Open sourcing the home CMS
I’ve been bragging about my website software for years! For… whew, it’s been 5 years!
I didn't want to make a CMS! I did it out of spite!
I’ve been teasing folks about the cool things I did from the beginning — here’s all the articles and series I’ve written that mention it:
- 2020: A new website
- 2021: Don’t shell out!
- 2022: Updating fasterthanli.me
- 2024: Face cams: the missing guide
- 2024: State of the fasterthanlime
- 2025: Highlighted code in slides
- 2025: Impromptu disaster recovery
- 2025: More DevOps than I bargained for
I’ve also made two shorts about it:
"Listen here, you"
And… I’m pretty proud of it!
But as soon as you consider open-sourcing something, well, then you suddenly have a whole different set of standards.
My rationale for keeping it closed this whole time was: I don’t want to be maintaining this specific thing for everyone. It’s my thing.
A thing which has had different names throughout the years, mirroring my mental health:
- futile: as in, it’s futile to make your own CMS, you fool, why even bother
- told: as in “text object linker”, suggested by a friend, but it didn’t feel cozy enough
- snug: as in “cozy but not as overloaded”, but I eventually decided it was too suffocating
Finally, I’m open-sourcing it as home. Home is obviously not a very searchable name, and maybe that’s a feature.
The documentation is available at https://home.bearcove.eu
I'm not sure about this yellow but I'm trying colors, okay?
…and the docs site is powered by itself, obviously. It’s pretty barebones; unlike https://fasterthanli.me, it doesn’t have search or anything. The backend is the same, it just needs some front-end components.
Open-sourced components
I couldn’t open-source just home; there are some, mh, dependencies.
bearcove/home-tree-sitter
The home-tree-sitter repo is a collection of tree-sitter grammars I’ve collected over the years.
```term
home-tree-sitter on main via 🦀 v1.86.0
❯ du -sh .
214M    .
```
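To give a sense of what these grammars are for, here's a minimal sketch of consuming one of them with the tree-sitter and tree-sitter-rust crates. This is not home's highlighting code, just the general shape of grammar usage:

```rust
// Minimal sketch (not home's code): parse a Rust snippet with a tree-sitter
// grammar and print the resulting syntax tree as an s-expression.
use tree_sitter::Parser;

fn main() {
    let mut parser = Parser::new();
    // Recent grammar crates expose a LANGUAGE constant; older releases expose
    // a language() function instead, so adjust for whichever version you have.
    parser
        .set_language(&tree_sitter_rust::LANGUAGE.into())
        .expect("grammar/runtime version mismatch");

    let source = r#"fn main() { println!("hi"); }"#;
    let tree = parser.parse(source, None).expect("parse failed");
    println!("{}", tree.root_node().to_sexp());
}
```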
I wrote maybe only a couple; the rest are just gathered from The Online, and it’s just now occurring to me that whoops, even just running tree-sitter generate might be a violation of some license, so… mea culpa, now it’s all out in the open.
bearcove/home-drawio
The home-drawio repo packages some drawio code as a CLI tool that converts .drawio files to .svg, doing some minification and font shenanigans that work well for home.
```term
home-tree-sitter on main via 🦀 v1.86.0
❯ home-drawio
Usage:
  convert <file> [options]   Convert file to SVG
  serve                      Start HTTP server
Options:
  --output, -o <file>        Output file
  --minify, -m               Whether to minify the output or not
  --help                     Show this help message
```
It uses bun under the hood to generate a static executable — that means I don’t need to bring node.js (or, cthulhu forbid, a whole browser) with me in production.
It’s available from…
bearcove/homebrew-tap
A homebrew tap for everything bearcove. For now it only has home-drawio, and home itself. It makes it easy for people to contribute to projects whose documentation is powered by home.
```term
fasterthanli.me on main [!?]
❯ brew tap bearcove/tap
==> Tapping bearcove/tap
Cloning into '/opt/homebrew/Library/Taps/bearcove/homebrew-tap'...
remote: Enumerating objects: 47, done.
remote: Counting objects: 100% (47/47), done.
remote: Compressing objects: 100% (25/25), done.
remote: Total 47 (delta 14), reused 45 (delta 12), pack-reused 0 (from 0)
Receiving objects: 100% (47/47), 6.61 KiB | 6.61 MiB/s, done.
Resolving deltas: 100% (14/14), done.
Tapped 2 formulae (16 files, 16.6KB).
```
I realize there are plenty of excellent options for documentation websites out there nowadays. I just like my thing. I’m not interested in discussing “why not X”.
bearcove/timelord
timelord might be the most interesting / immediately-usable repository in the bunch: it restores timestamps between CI builds, resulting in a lot more “fresh” in cargo’s verbose output, and a lot fewer rebuilds.
Fresh, fresh, fresh!
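The trick is small enough to sketch. This is not timelord's actual code, and the filetime crate is just my assumption of a convenient way to set mtimes; it only illustrates the idea of restoring timestamps for files whose contents haven't changed:

```rust
// Sketch of the idea: remember (content hash, mtime) for each file after a
// build, and on the next CI run restore the old mtime whenever the bytes are
// identical, so cargo's freshness check isn't fooled by a fresh checkout.
use filetime::{set_file_mtime, FileTime};
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};
use std::path::Path;
use std::{fs, io};

/// path -> (content hash, unix mtime in seconds)
type Manifest = HashMap<String, (u64, i64)>;

fn hash_file(path: &Path) -> io::Result<u64> {
    let mut hasher = DefaultHasher::new();
    fs::read(path)?.hash(&mut hasher);
    Ok(hasher.finish())
}

fn restore_timestamps(manifest: &Manifest) -> io::Result<()> {
    for (path, (old_hash, old_mtime)) in manifest {
        let path = Path::new(path);
        if path.exists() && hash_file(path)? == *old_hash {
            // Same contents as last build: put the recorded timestamp back.
            set_file_mtime(path, FileTime::from_unix_time(*old_mtime, 0))?;
        }
    }
    Ok(())
}
```

Persist the manifest in your CI cache next to the target directory, and cargo stops rebuilding crates whose sources only got new checkout timestamps.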
It is used as a crate from…
bearcove/beardist
beardist, my own little wrapper around cargo, does some cool things like:
- Prints a whole lot of info about the build system, toolchain, etc.
- Bundles dependencies (dylibs) and rewrites RUNPATH/RPATH as needed on Linux/macOS (see the sketch below)
- Packages things up as .tar.xz and uploads them to GitHub Releases
- Updates a homebrew tap based on GitHub releases
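For the RUNPATH/RPATH part, here's a hedged sketch of one way to do it, shelling out to patchelf on Linux and install_name_tool on macOS. That tooling choice is my assumption; beardist may well rewrite things differently:

```rust
// Sketch only: point a bundled binary's rpath at a ../lib directory that ships
// alongside it, so its dylibs are found wherever the archive gets unpacked.
// Assumes patchelf (Linux) or install_name_tool (macOS) is on PATH.
use std::process::Command;

fn set_rpath(binary: &str) -> std::io::Result<()> {
    let status = if cfg!(target_os = "macos") {
        Command::new("install_name_tool")
            .args(["-add_rpath", "@executable_path/../lib", binary])
            .status()?
    } else {
        Command::new("patchelf")
            .args(["--set-rpath", "$ORIGIN/../lib", binary])
            .status()?
    };
    assert!(status.success(), "rpath rewrite failed for {binary}");
    Ok(())
}
```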
It also has some non-CI tools like “bump the tag for this repo and push it”, or “wait for a new release to be up on releases, then amend some kubernetes manifests”.
Those nice runners are donated by Depot to the cause.
bearcove/home
Finally, we have the pièce de résistance — home itself.
It’s a weird codebase by all measures. For a while, you couldn’t cargo run it.
Through the help of some miraculous build scripts, you now can — they just copy dylibs around:
```rust
#![allow(clippy::disallowed_methods)]
use camino::{Utf8Path, Utf8PathBuf};
use std::env;
use std::fs;
use std::process::Command;
const DYLIB_EXTENSION: &str = if cfg!(target_os = "macos") {
"dylib"
} else if cfg!(target_os = "linux") {
"so"
} else if cfg!(target_os = "windows") {
"dll"
} else {
panic!("Unsupported operating system")
};
#[derive(Clone, Debug)]
struct BuildInfo {
/// Path to the cargo executable, obtained from the CARGO environment variable.
cargo: Utf8PathBuf,
/// Path to the rustc executable, obtained from the RUSTC environment variable.
rustc: Utf8PathBuf,
/// The build profile (e.g., "debug" or "release"), obtained from the PROFILE environment variable.
/// Determines which build configuration to use.
profile: Profile,
/// The base directory of the project, typically two levels up from the current directory.
/// Used as a reference point for locating other project-related paths.
/// e.g. `/Users/amos/bearcove/home`
#[allow(dead_code)]
workspace_dir: Utf8PathBuf,
/// Where the workspace artifacts are written
/// e.g. `/Users/amos/bearcove/home/target`
/// e.g. `/tmp/beardist-build-cache/foo/bar/baz/target`
workspace_target_dir: Utf8PathBuf,
/// Path to the rubicon-exports crate directory, located in crates-outside-workspace/rubicon-exports.
/// Contains the source code for rubicon-exports that needs to be built.
/// e.g. `/Users/amos/bearcove/home/crates/rubicon-exports`
rubicon_exports_dir: Utf8PathBuf,
/// OUT_DIR environment variable
out_dir: Utf8PathBuf,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Profile {
Debug,
Release,
}
impl Profile {
fn artifact_dir(self, target_dir: &Utf8Path) -> Utf8PathBuf {
target_dir.join(match self {
Self::Debug => "debug",
Self::Release => "release",
})
}
}
impl BuildInfo {
fn new() -> Self {
let rustc = Utf8PathBuf::from(env::var("RUSTC").unwrap());
let cargo = Utf8PathBuf::from(env::var("CARGO").unwrap());
let profile = match env::var("PROFILE").unwrap().as_str() {
"debug" => Profile::Debug,
"release" => Profile::Release,
_ => panic!("Unsupported profile"),
};
let out_dir = Utf8PathBuf::from(env::var("OUT_DIR").unwrap());
let workspace_dir = Utf8PathBuf::from_path_buf(std::env::current_dir().unwrap())
.unwrap()
.parent()
.unwrap()
.parent()
.unwrap()
.to_path_buf();
let workspace_target_dir = if let Ok(target) = env::var("CARGO_TARGET_DIR") {
Utf8PathBuf::from(target)
} else {
workspace_dir.join("target")
};
let rubicon_exports_dir = workspace_dir
.join("crates-outside-workspace")
.join("rubicon-exports");
BuildInfo {
cargo,
profile,
workspace_dir,
out_dir,
rustc,
workspace_target_dir,
rubicon_exports_dir,
}
}
fn workspace_artifact_dir(&self) -> Utf8PathBuf {
self.profile.artifact_dir(&self.workspace_target_dir)
}
fn rubicon_exports_artifact_dir(&self) -> Utf8PathBuf {
self.profile.artifact_dir(&self.out_dir)
}
/// Builds librubicon-exports and places it in the workspace's artifact directory, so that
/// we may link against it when building the main project.
fn build_rubicon_exports(&self) {
let mut cmd = Command::new(&self.cargo);
cmd.arg("build")
.arg("--manifest-path")
.arg(self.rubicon_exports_dir.join("Cargo.toml"))
.env("CARGO_TARGET_DIR", &self.out_dir);
println!(
"cargo:rerun-if-changed={}",
self.rubicon_exports_dir.join("Cargo.toml")
);
println!(
"cargo:rerun-if-changed={}",
self.rubicon_exports_dir.join("src").join("lib.rs")
);
if self.profile == Profile::Release {
cmd.arg("--release");
}
eprintln!("building rubicon-exports: {cmd:?}");
let status = cmd.status().expect("Failed to execute cargo build");
if !status.success() {
panic!("cargo build failed with status: \x1b[31m{}\x1b[0m", status);
}
let dylib_name = format!("librubicon_exports.{}", DYLIB_EXTENSION);
let artifact_dir = self.rubicon_exports_artifact_dir();
println!("cargo:rustc-link-search=native={artifact_dir}");
let dylib_path = artifact_dir.join(&dylib_name);
eprintln!("expecting dylib at: \x1b[32m{:?}\x1b[0m", dylib_path);
if !dylib_path.exists() {
panic!(
"rubicon-exports dylib not found at expected path: \x1b[31m{:?}\x1b[0m",
dylib_path
);
}
// Copy the dylib to the workspace's target directory
let workspace_dylib_path = self.workspace_artifact_dir().join(&dylib_name);
copy_file(&dylib_path, &workspace_dylib_path);
}
/// Copies `libstd-HASH.{dylib,so,etc.}` into the workspace's artifact directory ($TARGET/$PROFILE).
/// This will allow running home without `cargo run` — `.cargo/config.toml` sets the RPATH on macOS & Linux
/// to look for libs there (including libstd).
fn copy_libstd(&self) {
let rustc_libdir = Command::new(&self.rustc)
.arg("--print")
.arg("target-libdir")
.output()
.expect("Failed to execute rustc")
.stdout;
let libdir = Utf8PathBuf::from(
std::str::from_utf8(&rustc_libdir)
.unwrap()
.trim()
.to_string(),
);
let suffix = format!(".{}", DYLIB_EXTENSION);
let libstd_path = libdir
.read_dir_utf8()
.unwrap()
.find_map(|entry| {
let entry = entry.unwrap();
let name = entry.file_name();
println!("examining {name}");
if name.starts_with("libstd-") && name.ends_with(&suffix) {
return Some(entry.into_path());
}
None
})
.unwrap();
let libstd_name = libstd_path.file_name().unwrap().to_string();
let workspace_libstd_path = self.workspace_artifact_dir().join(&libstd_name);
copy_file(&libstd_path, &workspace_libstd_path);
}
}
fn main() {
let build_info = BuildInfo::new();
let build_info = Box::leak(Box::new(build_info));
println!("{:?}", build_info);
let rubicon_exports_handle = std::thread::spawn(|| {
build_info.build_rubicon_exports();
});
let copy_libstd_handle = std::thread::spawn(|| {
build_info.copy_libstd();
});
rubicon_exports_handle
.join()
.expect("Rubicon exports thread panicked");
copy_libstd_handle
.join()
.expect("Libstd symlink thread panicked");
}
fn copy_file(source: &Utf8Path, destination: &Utf8Path) {
if let Some(parent) = destination.parent() {
fs::create_dir_all(parent).unwrap_or_else(|e| {
eprintln!("Failed to create parent directory {}: {e}", parent);
});
}
if let Ok(_meta) = destination.symlink_metadata() {
eprintln!("Destination file exists (or is a symlink) removing it: {destination}");
fs::remove_file(destination).unwrap_or_else(|e| {
eprintln!("Failed to remove existing file: {e}");
});
}
fs::copy(source, destination).unwrap_or_else(|e| {
panic!(
"Failed to copy file from {:?} to {:?}: {}",
source, destination, e
);
});
}
```
home is very nice to use, at least for me, cause I made it:
```term
fasterthanli.me on main [!+]
❯ home serve --open
Loading empty config (got roots .)
Warning: Using dummy MOM_API_KEY. Set MOM_API_KEY if you want to be able to deploy.
✂️
Booting up in development
Mom is listening on 127.0.0.1:1118
Starting up cub, who expects a mom at: http://127.0.0.1:1118
INFO mod_cub::impls: Waiting for mom's good morning message to initialize tenants and start serving content...
Setting up tenant fasterthanli.me
Inserting tenant fasterthanli.me, base dir is /Users/amos/bearcove/fasterthanli.me
Setting global state with 1 tenants
Trying to load all sponsors from db...
fasterthanli.me No sponsors found in DB
Grabbing sponsors inflight for tenant fasterthanli.me; gs has 1 tenants
INFO mod_mom::impls::endpoints: got /events request
INFO mod_mom::impls::endpoints: GET /events -> 101 (took 32.584µs)
INFO mod_mom::impls::endpoints: connection upgraded to websocket!
INFO mod_mom::impls::endpoints: in good morning, for tenant fasterthanli.me, sending -1 sponsors (-1 means None)
INFO mod_mom::impls::endpoints: Starting WebSocket message loop
INFO mod_momclient: 🧸 mom connection established! uri=ws://127.0.0.1:1118/events elapsed=29.078417ms
No revision in good morning, let's make one
INFO mod_revision::impls::make: Processed 2355 events in 112.641417ms (3869 add actions)
INFO mod_revision::impls::make: Revision config: /// That config is part of the revision paks — it's stored in `home.config.json` and
/// contains no secrets at all
RevisionConfig {
/// where to push this site in prod?
id: fasterthanli.me,
/// note: domains are configured on mom's side so folks can't
/// randomly override, say, `fasterthanli.me`, with whatever they want.
/// Patreon campaign IDs to allow access
patreon_campaign_ids: Vec<String> [
3012080,
693778,
],
/// admin github user IDs
admin_github_ids: Vec<String> [
7998310,
],
/// admin patreon user IDs
admin_patreon_ids: Vec<String> [
47556,
],
/// SVG font face collection
svg_fonts: Vec<SvgFontSpec> [
SvgFontSpec {
/// how the font is referred to in CSS, e.g. `IosevkaFtl`
family: IosevkaFtl,
/// where to find the font on disk (relative to the base directory, ie. where `content` is)
path: (),
/// weight: 400 is normal, 700 is bold, etc.
weight: FontWeight {
0: 400,
},
/// style: normal, etc.
style: FontStyle::Normal,
},
SvgFontSpec {
/// how the font is referred to in CSS, e.g. `IosevkaFtl`
family: IosevkaFtl,
/// where to find the font on disk (relative to the base directory, ie. where `content` is)
path: (),
/// weight: 400 is normal, 700 is bold, etc.
weight: FontWeight {
0: 700,
},
/// style: normal, etc.
style: FontStyle::Normal,
},
],
}
INFO mod_revision::impls::load: Built 255/255 pages in 881.000334ms
INFO mod_revision::impls::watch: [fasterthanli.me] Watching /Users/amos/bearcove/fasterthanli.me/home.json
INFO mod_revision::impls::watch: [fasterthanli.me] Watching /Users/amos/bearcove/fasterthanli.me/content
INFO mod_revision::impls::watch: [fasterthanli.me] Watching /Users/amos/bearcove/fasterthanli.me/templates
INFO mod_cub::impls: 🦊 Visit the site at http://fasterthanli.me.snug.blog:1111
INFO mod_cub::impls: GET / -> 200 (took 10.812291ms)
^C WARN mod_mom::impls::endpoints: Received SIGINT
WARN mod_mom::impls::endpoints: Exiting immediately
WARN mod_cub::impls::graceful_shutdown: Received SIGINT
WARN mod_cub::impls::graceful_shutdown: Exiting immediately
```
And not just because of the colored terminal output, but because of dragging assets in, making sure we serve nice JPEG-XL, AVIF, and WebP assets, being able to embed videos pretty easily, the fast deploys, and the integration with Vite as a bundler so that you can add Svelte 5 components like the search on my website.
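Picking between those image formats usually comes down to Accept-header negotiation. The following is a toy sketch of that kind of logic, an assumption on my part rather than home's actual code:

```rust
// Toy sketch of Accept-header negotiation for image variants (assumption, not
// home's actual code): prefer JPEG-XL, then AVIF, then WebP, then plain JPEG.
fn pick_image_format(accept: &str) -> &'static str {
    if accept.contains("image/jxl") {
        "jxl"
    } else if accept.contains("image/avif") {
        "avif"
    } else if accept.contains("image/webp") {
        "webp"
    } else {
        "jpg"
    }
}

fn main() {
    // A Chrome-style Accept header advertises AVIF and WebP but not JPEG-XL.
    let accept = "image/avif,image/webp,image/apng,image/*,*/*;q=0.8";
    assert_eq!(pick_image_format(accept), "avif");
}
```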
That deploy was actually pretty slow. I want to run the build, the lint, and the deploy in parallel. There's no reason to serialize all this.
I could see people argue that it’s nothing special, but it’s a lot of non-special things put together just the way I like them and really at the end of the day that’s all I’m asking for.
Some of the weird things about it include dylo, which I’m probably going to get rid of soon — along with rubicon, because my new project facet largely obviates the need for those “impl crate / consumer crate” pairs.
Basically, all the types that are used in the APIs will live in types crates, shared across different parts of the project, and the implementations will be dylib crates. And that’ll be that.
There aren’t going to be feature flags to enable serde derives or other derives conditionally. It will always be a facet derive, which allows us to do things like CLI argument parsing, serialization and deserialization, treating objects as values in templating engines, etc.
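As a rough illustration of the direction (field names borrowed from the RevisionConfig dump in the log above; the exact facet APIs may differ from this sketch):

```rust
// One reflection-style derive instead of a pile of conditional derives.
// Sketch only: exact facet APIs may differ.
use facet::Facet;

#[derive(Facet)]
struct RevisionConfig {
    id: String,
    patreon_campaign_ids: Vec<String>,
    admin_github_ids: Vec<String>,
}
```

The same shape is then available for serialization, CLI parsing, template values, and so on, without any feature-flag combinatorics.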
You know what they say: over promise and over deliver. Right?
Of course, this is still a work in progress, but I’m excited to have home open source while I work through this transition, and I’m going to make sure I measure compile times while I do it.
Another way in which home is weird compared to something like hugo or zola is that it’s not actually a static website generator. It’s more like it has a server component. In fact, it has two server components: it has the mom, which takes care of asset derivations, dealing with deployments, etc., and it has the cubs, which do caching and template rendering at the end.
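To make that split concrete, here's a loose sketch of the relationship, based only on the startup log above. These are not home's actual types:

```rust
#![allow(dead_code)]
// Loose sketch of the mom/cub split (not home's actual protocol types): a cub
// connects to its mom over a websocket and waits for a "good morning" message
// before it starts serving tenants.
enum MomToCub {
    /// Sent once on connect: which tenants this cub should serve (the real
    /// message also carries things like sponsors and an optional revision).
    GoodMorning { tenants: Vec<Tenant> },
    /// Pushed later, e.g. when a new revision has been deployed.
    NewRevision { tenant: String, revision_id: String },
}

struct Tenant {
    /// e.g. "fasterthanli.me"
    name: String,
    /// where content/ and templates/ live on disk
    base_dir: String,
}
```

That's the dance visible in the log: the cub waits for mom's good morning, learns about the fasterthanli.me tenant, then starts serving on port 1111.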
Closing words
I don’t know why this felt like such a big step. You open-source something and nothing really happens. It’s not like facet — home is not meant to take over the world.
But it is nice to be able to use public infrastructure for it. And I suppose some folks, despite my advice, will want to run it for themselves and improve it.
happy hundredth repository I guess???
I’m excited to add more features to it. I think some more tooling is gonna come out of it to build Docker images efficiently, for example, without a Docker daemon present.
Right now I use regclient for it and it takes way longer than it should. I have an issue open for that.
There are so many things to do, y’all. I don’t have time to do everything I want to, so I’m gonna ship this article and move on to the next thing. I hope some of this can be of use to you. Thanks for following me along for the ride. I’m lucky to be able to do that work for you.