A recent concept I shared on Farcaster garnered some interest, so I wanted to dive deeper into the details. This approach leverages several tools and services to create a decentralized, git-based "CMS" that enables editing content from virtually anywhere.
The core of the system is an Obsidian template with structured YAML frontmatter. File names use a Zettelkasten-inspired millisecond timestamp tag, and the frontmatter records a `created` timestamp along with the initial post title:
```yaml
---
title: "1671418753342"
created: "1671418753342"
longform: false
published: false
---
```
Using the timestamp as the title allows most content to exist as a "journal" entry. More descriptive titles can be set when needed.
Obsidian is configured to automatically sync content to a private GitHub repo shortly after editing, providing continuous backup and versioning. A `published` flag in the frontmatter determines whether an entry should be displayed publicly.
Upon pushing to the repo, a GitHub Action uploads any assets from the designated folder to a Cloudflare R2 bucket (similar to Amazon S3, but with a generous free tier):
```yaml
name: Cloudflare
on:
  push:
    branches:
      - main
  workflow_dispatch: null
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: R2 Directory Upload
        uses: willfurstenau/r2-dir-upload@main
        with:
          accountid: "${{ secrets.CF_ACCOUNT_ID }}"
          accesskeyid: "${{ secrets.CF_ACCESS_KEY }}"
          secretaccesskey: "${{ secrets.CF_SECRET_KEY }}"
          bucket: iam-bucket
          source: "${{ github.workspace }}/Assets"
          destination: /
```
With content in the repo and assets on Cloudflare, we turn to Next.js to build the frontend. Two key queries come into play: `getObsidianEntries` and `getObsidianEntry`.

`getObsidianEntries` fetches all entries from the designated repo and directory:
```ts
export default async function getObsidianEntries() {
  // Personal access token used to authenticate against the private repo
  const token = process.env.NEXT_PUBLIC_GITHUB;
  const {
    data: {
      repository: {
        object: { entries },
      },
    },
  } = await fetch(`https://api.github.com/graphql`, {
    method: `POST`,
    headers: {
      "Content-Type": `application/json`,
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({
      query: `
        query fetchEntries($owner: String!, $name: String!) {
          repository(owner: $owner, name: $name) {
            object(expression: "HEAD:Content/") {
              ... on Tree {
                entries {
                  name
                  object {
                    ... on Blob {
                      text
                    }
                  }
                }
              }
            }
          }
        }
      `,
      variables: {
        owner: `GITHUB_USERNAME`,
        name: `REPO_NAME`,
        first: 100,
      },
    }),
    // Next 13 fetch-level revalidation: refresh the cached response every 30 seconds
    next: {
      revalidate: 1 * 30,
    },
  }).then((res) => res.json());
  return entries;
}
```
A GitHub access token is required to query private repos via GraphQL (see Authenticating with GraphQL). Drilling into the repository object, filtered to the `Content` directory on the `main` branch, returns the desired entries. Next 13's `revalidate` flag keeps the content fresh.
Once entries are fetched, the content can be manipulated as needed. I parse the YAML frontmatter with `gray-matter` and render Markdown using `react-remark`, part of the expansive Unified/Remark ecosystem.
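As a rough sketch (the helper name and entry shape here are illustrative, not the exact production code), the raw GraphQL entries can be parsed and filtered on the `published` flag like so:

```ts
import matter from "gray-matter";

// Illustrative helper: turn raw GraphQL tree entries ({ name, object: { text } })
// into post objects, using the file name (the Zettelkasten timestamp) as the slug
// and dropping anything not marked `published` in its frontmatter.
export function parseEntries(
  entries: { name: string; object: { text: string } }[]
) {
  return entries
    .map(({ name, object }) => {
      const { data, content } = matter(object.text); // frontmatter + Markdown body
      return {
        slug: name.replace(/\.md$/, ""),
        ...data,
        content,
      };
    })
    .filter((entry: any) => entry.published);
}
```

The resulting `content` string is what gets handed to `react-remark` for rendering.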
The second query, `getObsidianEntry`, is used for individual post pages:
```ts
export default async function getObsidianEntry(slug: any) {
  const paths = await getObsidianEntries();
  const _paths = await Promise.all(paths);
  // Match the requested page slug against the entry's Zettelkasten tag
  const entry = _paths.find((entry: any) => entry.slug === slug);
  return entry;
}
```
Ideally this would query GraphQL directly for the specific entry needed, but I haven't yet gotten that filter to work. As a stopgap, the appropriate entry is looked up using the page slug (the Zettelkasten tag) as the key.
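For reference, here is an untested sketch of what that direct lookup might look like, assuming each entry lives at `Content/<slug>.md` (the function name is hypothetical; GitHub's GraphQL `object(expression:)` field can target a single blob):

```ts
export default async function getObsidianEntryDirect(slug: string) {
  const token = process.env.NEXT_PUBLIC_GITHUB;
  const { data } = await fetch(`https://api.github.com/graphql`, {
    method: `POST`,
    headers: {
      "Content-Type": `application/json`,
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({
      query: `
        query fetchEntry($owner: String!, $name: String!, $expression: String!) {
          repository(owner: $owner, name: $name) {
            object(expression: $expression) {
              ... on Blob {
                text
              }
            }
          }
        }
      `,
      variables: {
        owner: `GITHUB_USERNAME`,
        name: `REPO_NAME`,
        // Point the query at a single blob rather than the whole Content/ tree
        expression: `HEAD:Content/${slug}.md`,
      },
    }),
  }).then((res) => res.json());
  // The raw Markdown (frontmatter included) for the requested entry
  return data?.repository?.object?.text;
}
```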
And there you have it: a decentralized approach to using Obsidian as a lightweight CMS. I hope these notes prove useful for anyone interested in setting up a similar system.