My new favorite free CMS can run through your build minutes, so let's get around that
I’m always looking to get the most out of my free tier developer services. Netlify has a generous free tier, so I often host my projects there. But complexity expands when the primary user of a website is a non-technical person.
While developing a website for a friend, I found my new favorite solution for content management. It’s called Decap CMS.
Decap allows you to use your own git repo as your CMS. This is something you’re probably already used to if you’ve programmed your own blog or portfolio. It’s easy for a programmer but daunting and risky for a non-technical person.
Decap also lets you put an editing UI at the /admin page of your website. You define the content types, and your user updates them in a familiar UI without ever seeing JSON or Markdown. They can even upload images.
If you’ve set up Decap with Netlify, you’ve probably noticed the default workflow: your client saves a change in /admin, Decap commits it to the repo, Netlify sees the push and runs a full build, and the edit goes live once the deploy finishes.
For a small site that’s fine, but it has a few real annoyances: every content edit spends build minutes, your client waits several minutes for a deploy just to see their change, and a busy editing session can eat a real chunk of the free tier.
There’s a cleaner approach, and it really only takes 2 additional steps: tell Netlify to skip builds for content-only commits, and have the site fetch content at runtime from GitHub instead of reading it from the filesystem at build time.
Decap will still commit to GitHub, but instead of the site reading those files in the build, it reads them via GitHub’s raw content URL. No rebuild needed.
Now, it’s 2026. So at this point you can probably just ask Claude to set this up for you. But if you want more details this post walks through how to set it up.
This guide assumes you already have Decap set up and are just looking to save your build minutes.
For GitHub raw content to be accessible without a token, the repository must be public.
If your repo is currently private, you can make it public on GitHub under Settings → General → Danger Zone → Change repository visibility.
Is this safe?
For a typical contractor or small business site, yes. The content (services, testimonials, portfolio photos) is information you want the public to see. The only things that should never be in a public repo are secrets—API keys, tokens, passwords. Those belong in environment variables, never committed to the repo.
By default, Netlify rebuilds on every push to main. Since your content updates will just be data file commits, you won’t want them to trigger a rebuild.
In Netlify: Site configuration → Build & deploy → Continuous deployment → Branch deploys — you can configure ignored build paths. But I prefer a config file.
In your netlify.toml at the project root, add this:
[build]
ignore = "git diff --quiet $CACHED_COMMIT_REF $COMMIT_REF -- . ':!content' ':!public/uploads'"
The :!content and :!public/uploads pathspecs exclude those folders from the diff. Netlify skips the build when the ignore command exits 0, and git diff --quiet exits 0 when nothing outside the excluded folders changed. Code changes still rebuild; content-only commits do not.
All content fetching can live in one file: services/api.ts. It replaces what used to be filesystem reads (fs.readFileSync) with fetch calls to GitHub.
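For contrast, here’s roughly the build-time pattern it replaces; a minimal sketch, assuming the JSON lives in a content/ folder at the project root (readLocalJson is a made-up name, not from the project):

import fs from 'node:fs';
import path from 'node:path';

// Build-time style: read the JSON committed in the repo straight off the filesystem.
function readLocalJson<T>(file: string): T {
  const raw = fs.readFileSync(path.join(process.cwd(), 'content', file), 'utf8');
  return JSON.parse(raw) as T;
}

The runtime version trades that filesystem read for a fetch against GitHub’s raw content URL.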
Here are some snippets:
const CONTENT_BASE =
  'https://raw.githubusercontent.com/{owner}/{repo}/main/content';

async function fetchJson<T>(file: string): Promise<T> {
  const res = await fetch(`${CONTENT_BASE}/${file}`, { cache: 'no-store' });
  if (!res.ok) throw new Error(`fetch ${file} failed: ${res.status}`);
  return res.json() as Promise<T>;
}

const UPLOADS_BASE =
  'https://raw.githubusercontent.com/{owner}/{repo}/main/public';

function resolveUploadUrl(path: string | null | undefined): string | null {
  if (!path) return null;
  if (path.startsWith('/uploads/')) return `${UPLOADS_BASE}${path}`;
  return path;
}
If you have high traffic and want to reduce GitHub fetches, swap { cache: 'no-store' } for { next: { revalidate: 60 } } and Next.js will cache for 60 seconds between refreshes.
Two base URLs, one helper for images, one generic JSON fetcher. Everything else builds on those.
cache: 'no-store' tells Next.js never to cache these responses — always go to GitHub for the latest version. The content is small JSON, so there’s no meaningful performance cost to fetching fresh on every request.
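If you do opt for the revalidate approach mentioned above, the swap inside fetchJson is a one-liner; a sketch, with 60 seconds as an arbitrary example window:

const res = await fetch(`${CONTENT_BASE}/${file}`, {
  // Next.js serves the cached response and re-fetches at most once per 60 seconds
  next: { revalidate: 60 },
});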
Each collection has its own export. The pattern is identical across all of them: call fetchJson, handle errors by falling back to the locally stored data (from whatever the last build was) so the site never shows a blank page.
One important note: whenever we encounter a photo URL, we have to rewrite it to the remote URL with the resolveUploadUrl helper.
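The SiteConfig and PortfolioItem types below come from the project’s own type definitions; as a rough sketch of their shape (these field names are illustrative, not the real schema):

interface SiteConfig {
  businessName: string;
  phone: string;
}

interface PortfolioItem {
  title: string;
  photos: string[]; // Decap writes these as "/uploads/..." paths
}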
import { LOCAL_SITE_CONFIG, LOCAL_PORTFOLIO } from '../content';

export async function fetchSiteConfig(): Promise<SiteConfig> {
  try {
    return await fetchJson<SiteConfig>('site-config.json');
  } catch {
    return LOCAL_SITE_CONFIG;
  }
}

export async function fetchPortfolio(): Promise<PortfolioItem[]> {
  try {
    const { portfolio } = await fetchJson<{ portfolio: PortfolioItem[] }>('portfolio.json');
    return portfolio.map((item) => ({
      ...item,
      photos: item.photos.map((p) => resolveUploadUrl(p) ?? p),
    }));
  } catch {
    return LOCAL_PORTFOLIO;
  }
}
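Using these in a page is just an await in a server component. A minimal sketch, assuming the illustrative SiteConfig shape above and a path alias for the import:

// app/page.tsx (hypothetical page) rendering content fetched from GitHub at request time
import { fetchSiteConfig, fetchPortfolio } from '@/services/api';

export default async function HomePage() {
  const config = await fetchSiteConfig();
  const portfolio = await fetchPortfolio();

  return (
    <main>
      <h1>{config.businessName}</h1>
      <p>{portfolio.length} projects and counting</p>
    </main>
  );
}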
To test it out, open /admin and edit a piece of content. The change should show up on the live site within the revalidate window, or instantly if you used no-store, and the commit should appear in Netlify’s deploy list as a Cancelled deploy rather than a full build.
There are a few tradeoffs to be aware of. When you’re developing, you’ll have to remember that you have a new dev on your team: changes your client makes go directly into the git history, so you’ll have to pull down their changes before making your own.
Slightly slower first load (cold cache). When the Next.js cache expires and re-fetches from GitHub, there’s a small latency hit vs. reading a local file. For a static site with a 60-second revalidation window, visitors almost never hit this.
Public repo means public history. Anyone can see every commit, including past content. If you ever committed sensitive data, scrub it before going public.
GitHub rate limits. Unauthenticated raw content requests are generous but not unlimited. For a low-traffic site this is a non-issue. Add a GITHUB_TOKEN env var if you want headroom.
GitHub doesn’t publish a specific number for raw.githubusercontent.com. In May 2025 they announced updated rate limits for unauthenticated requests, including raw content downloads, but didn’t disclose the threshold. Community reports show HTTP 429s appearing under sustained load.
For a small business site this is unlikely to be an issue. However, if you want to be safe, you can add a GITHUB_TOKEN environment variable and pass it as an Authorization header. Authenticated requests are documented at 5,000 per hour per token:
async function fetchJson<T>(file: string): Promise<T> {
  const res = await fetch(`${CONTENT_BASE}/${file}`, {
    cache: 'no-store',
    headers: process.env.GITHUB_TOKEN
      ? { Authorization: `Bearer ${process.env.GITHUB_TOKEN}` }
      : {},
  });
  if (!res.ok) throw new Error(`fetch ${file} failed: ${res.status}`);
  return res.json() as Promise<T>;
}
To generate a token: GitHub → Settings → Developer settings → Personal access tokens → Fine-grained tokens. Scope it to read-only access on the repo’s contents. Add it as a GITHUB_TOKEN environment variable in Netlify.