I want to host my blog on a custom domain that I own (fahmifj.space) instead of using the default GitHub Pages domain (fahmifj.github.io).
However, migrating a site to another domain can negatively impact SEO. I'm not really concerned about that, but I think it's a great opportunity to learn something.
So, instead of moving everything, I keep GH Pages as the primary site and let the custom domain act as a mirror. Both domains serve the same content without redirecting one to the other.
This post documents the setup and explains why keeping GitHub Pages as the primary works better for me.
Cloudflare Workers as Reverse Proxy
Cloudflare provides a feature called Workers. It allows you to write and run serverless code, similar in concept to Google Cloud Functions.
Using this feature, I write a Worker that acts as a reverse proxy. Its job is to fetch my GitHub Pages content and serve it back on the custom domain.
┌──────────────────────────┐
│    fahmifj.github.io     │
│    (upstream hosting)    │
└────────────┬─────────────┘
             │ GitHub Pages origin
             ▼
      Cloudflare Worker
      (proxy + rewrite)
             │
             ▼
       fahmifj.space
      (custom domain)

Create Worker (v1)
This first version of the Worker functions purely as a content fetcher.
export default {
  async fetch(request) {
    const url = new URL(request.url);

    // Force everything to fetch from GitHub Pages
    url.hostname = "fahmifj.github.io";

    const response = await fetch(url.toString(), {
      headers: request.headers,
      redirect: "follow"
    });

    // Copy GitHub's response and serve it
    return new Response(response.body, response);
  }
};

The result is straightforward: it serves the same content as GitHub Pages.
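As an aside, the second argument to new Response() here is the upstream Response itself, which copies its status and headers onto the new response. A more explicit but equivalent way to spell out that last line, if the shorthand looks too magic:

// Equivalent to `new Response(response.body, response)` above:
// copy status, status text, and headers explicitly.
return new Response(response.body, {
  status: response.status,
  statusText: response.statusText,
  headers: response.headers,
});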

Attach Custom Domain to Worker
Next, I attach my custom domain to the Worker (Workers & Pages → My Worker → Settings → Domains & Routes).
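The same attachment can also be declared in the project config if you deploy with Wrangler instead of clicking through the dashboard. A minimal wrangler.toml sketch, assuming a recent Wrangler version and placeholder names for everything except the domain:

# wrangler.toml — project names are placeholders, domain is real
name = "blog-mirror"
main = "src/worker.js"
compatibility_date = "2024-01-01"

routes = [
  { pattern = "fahmifj.space", custom_domain = true }
]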

At this point, fahmifj.space is already serving the same content as fahmifj.github.io.
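A quick way to confirm that is a throwaway script (not part of the Worker; assumes Node 18+ run as an ESM .mjs file) that fetches both domains and compares the bodies:

// check-mirror.mjs — with the v1 Worker, both should be byte-identical
const [mirror, origin] = await Promise.all([
  fetch("https://fahmifj.space/").then((r) => r.text()),
  fetch("https://fahmifj.github.io/").then((r) => r.text()),
]);

console.log(mirror === origin ? "identical" : "different");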
Worker with URL Rewriter (v2)
Hugo builds my site with absolute URLs based on my GH Pages domain (fahmifj.github.io). Because of this, clicking most links on the custom domain sends users back to GH Pages.
To fix that, I need to adjust the Worker to rewrite all the GitHub Pages URLs and replace them with the custom domain. I also add conditions to:
- Block search engine indexing bots.
- Return 404 for the sitemap.
- Remove the canonical link before the content is served on the custom domain.
export default {
  async fetch(request) {
    const upstream = "fahmifj.github.io"; // GitHub Pages domain
    const host = "fahmifj.space"; // Custom domain
    const url = new URL(request.url);

    // Prevent indexing by common search engine bots
    const ua = (request.headers.get("user-agent") || "").toLowerCase();
    const bots = ["googlebot", "bingbot", "slurp", "duckduckbot", "baiduspider", "yandex"];
    if (bots.some((bot) => ua.includes(bot))) {
      return new Response("Not for indexing", { status: 403 });
    }

    // Do not serve sitemap
    if (url.pathname === "/sitemap.xml") {
      return new Response("", { status: 404 });
    }

    url.hostname = upstream;

    // Fetch upstream content
    const res = await fetch(url.toString(), {
      headers: request.headers,
      redirect: "follow",
    });

    // Only HTML needs rewriting, leave CSS and JS as is
    const contentType = res.headers.get("content-type") || "";
    if (!contentType.includes("text/html")) {
      return res;
    }

    // Read HTML response
    let html = await res.text();

    // Rewrite all URLs from upstream → custom domain
    html = html.replaceAll(`https://${upstream}`, `https://${host}`);
    html = html.replaceAll(`http://${upstream}`, `https://${host}`);
    html = html.replaceAll(upstream, host);

    // Remove canonical <link> tag (so GitHub Pages stays canonical)
    html = html.replace(/<link[^>]+rel=["']?canonical["']?[^>]*>/gi, "");

    // Add no-index header (no crawling on my custom domain)
    const newHeaders = new Headers(res.headers);
    newHeaders.set("X-Robots-Tag", "noindex, nofollow");

    // Return modified HTML
    return new Response(html, {
      status: res.status,
      headers: newHeaders
    });
  }
};
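For a site this small, plain string replacement is fine, but it reads the whole body into memory and can rewrite text it shouldn't. Cloudflare also ships HTMLRewriter, a streaming HTML parser, which would be the more surgical option. A sketch of the same two rewrites with it — just an alternative, not what the Worker above uses:

// Streaming alternative to the replaceAll approach.
// upstream and host are the same constants as in the Worker;
// call this only for text/html responses, as before.
function rewriteHtml(res, upstream, host) {
  return new HTMLRewriter()
    // Drop the canonical <link> so GitHub Pages stays canonical
    .on('link[rel="canonical"]', {
      element(el) {
        el.remove();
      },
    })
    // Point absolute links at the custom domain
    .on("a[href]", {
      element(el) {
        const href = el.getAttribute("href") || "";
        el.setAttribute("href", href.replaceAll(upstream, host));
      },
    })
    .transform(res);
}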
Conclusion
With this setup, the custom domain acts as a mirror and coexists with GH Pages.
You might think that it is odd to keep using GitHub Pages even though I already have a custom domain. The reason is simple: I want my blog to stay online as long as possible by relying on a free service provided by GitHub itself.
If I used my custom domain as the primary site, I would be responsible for maintaining that domain (or even a server). The domain could expire in a year or two, I might forget to renew it, or I might decide to switch to another TLD. Any of these could negatively impact SEO*.
*or... maybe I’m just a lazy sysadmin.
But with GitHub Pages as the primary site, I don’t have to worry about maintaining a domain. GitHub handles everything, and the canonical URL stays consistent. This way, I still have the freedom to change my custom domain in the future without worrying about SEO.
Okay that’s it. See you in the next post!