⚡ react-prerender-worker

10 million apps built on Lovable. Millions more on Bolt, v0, Cursor, Replit. Every single one is invisible to Google.

AI platforms are building 100,000+ new React apps every day. Beautiful apps. Functional apps. Apps that Google, ChatGPT, and every crawler that drives traffic see as an empty <div>. This is the defining infrastructure crisis of AI-built software.

The "industry solution"? Migrate to Next.js. Rewrite your entire app. Pay Vercel $20–$100+/month. Get locked into their ecosystem forever.

Your AI built the app in 5 minutes. Next.js wants you to spend weeks rewriting it. That's insane.

This repo fixes it. 30 minutes. $0. Zero code changes.

License: MIT Cloudflare Workers Supabase React Vite

Works with: React · Vite · Create React App · Remix SPA · Gatsby · Vue · Svelte · Angular · Astro · any static SPA

Built for: Lovable · Bolt.new · v0 · Cursor · Replit · any AI app builder


🚨 The AI App SEO Crisis – By The Numbers

Every React SPA – whether built by you, an AI, or a team of engineers – ships this to crawlers:

<html>
  <body>
    <div id="root"></div>
    <script src="/assets/index.js"></script>
  </body>
</html>

That's what Google sees. That's what ChatGPT sees. That's what every bot that drives traffic sees. Nothing.

The scale of the problem:

  • 70% of modern websites are invisible to AI crawlers – Spruik research, Dec 2025. GPTBot, ClaudeBot, PerplexityBot don't render JavaScript. Period.
  • Vercel's own research confirms it – the company selling you Next.js published data proving AI crawlers can't read JavaScript-rendered pages. They created the problem, then sold you the solution.
  • 10M+ projects on Lovable alone, plus millions more on Bolt.new, v0, Cursor, and Replit – all outputting invisible SPAs
  • 100,000+ new React apps every day across AI platforms, every one born invisible

What this costs your business:

  • 🚫 Zero organic search traffic – Google can't index what it can't read
  • 🚫 Broken social sharing – LinkedIn, Twitter, Facebook show blank previews
  • 🚫 Invisible to AI – ChatGPT, Perplexity, Claude never recommend your product
  • 🚫 Lost revenue – every day your pages aren't indexed is money left on the table

🤖 Built for AI App Builders

This isn't a tool for Next.js developers. It's infrastructure for the millions of people building apps with AI who have no idea their sites are invisible.

| Platform | Apps Built | Framework Output | SEO Out of the Box? | Fix with This Repo? |
|---|---|---|---|---|
| 🟣 Lovable | 10M+ projects | React + Vite | ❌ No | ✅ 30 min |
| ⚡ Bolt.new | Millions | React + Vite | ❌ No | ✅ 30 min |
| ▲ v0 (Vercel) | Millions | React | ❌ No | ✅ 30 min |
| 🔵 Cursor | Millions | Any SPA | ❌ No | ✅ 30 min |
| 🟢 Replit | Millions | Any SPA | ❌ No | ✅ 30 min |

Your AI built the app. This repo makes Google see it. 30 minutes. $0.

These platforms output React SPAs. Their users – entrepreneurs, creators, small businesses – don't know what SSR is. They don't know what a framework migration is. They just know Google can't find their site. This is the fix.


💀 Next.js – The $100/Month Trap

Let's talk about what the "industry standard" actually costs you.

The Comparison

| | ⚡ react-prerender-worker | Next.js on Vercel |
|---|---|---|
| 💰 Monthly cost | $0 (free tiers) | $20–$100+ (bandwidth limits on free) |
| ⚡ Bot response time | ~50ms (edge cache) | ~200–500ms (traditional SSR) |
| 🔒 Vendor lock-in | None – keep your stack | Deep (community revolt) |
| 📄 Pages supported | 350,000+ tested in production | Depends on pricing tier |
| 🌍 Global performance | Edge (300+ cities) | Regional servers |
| 🔄 Migration effort | Zero – drop-in addition | Full app rewrite |
| 👤 User experience | Pure SPA (instant navigation) | SSR + hydration overhead |
| 🤖 AI crawler support | 100+ bot patterns | Basic |
| 🔄 Cache automation | Built-in (pg_cron, auto-refresh) | Manual ISR configuration |
| ⏱️ Setup time | ~30 minutes | Days to weeks of migration |
| 🏗️ Host anywhere | Cloudflare, Vercel, Netlify, anywhere | Vercel-optimized only |

The Vendor Lock-In Receipts

This isn't speculation. The developer community is screaming about it:

  • 📢 Reddit: "Is Next.js a Vendor Lock-In Architecture?" – hundreds of upvotes, developers sharing migration horror stories
  • 📰 Netlify's public callout – Netlify engineers have publicly documented Next.js features that only work properly on Vercel
  • 🏃 Companies leaving – a growing list of teams migrating away from Next.js due to Vercel dependency, spiraling costs, and deployment complexity
  • 💸 The cost trap – Vercel's free tier has hard bandwidth limits. Hit them with a viral post and you're paying $150+ overnight

The Absurdity Angle

Think about what Next.js migration actually means for an AI platform user:

  1. Your AI built a working app in 5 minutes ✅
  2. Next.js says: "Now learn React Server Components" ❌
  3. Next.js says: "Rewrite every route as a server action" ❌
  4. Next.js says: "Configure ISR, understand caching strategies" ❌
  5. Next.js says: "Deploy on Vercel and pay us monthly" ❌
  6. Next.js says: "Oh, and your app won't work the same on other hosts" ❌

Or: Add this Worker. 30 minutes. Done. Your app doesn't change. Your hosting doesn't change. Google sees everything.

"But Isn't This Cloaking?" β€” No.

Google's own documentation describes dynamic rendering and pre-rendering as legitimate techniques, not cloaking:

"Dynamic rendering is not cloaking" – Google Search Central

Google's guidance accepts pre-rendering for JavaScript-heavy sites. The content served to bots is identical to what users see after the SPA loads. This is the Google-documented approach.


🧠 How It Actually Works

A tiny Cloudflare Worker (free tier) sits in front of your domain. On every request, it makes one decision in under 1ms:

                    yoursite.com
                        │
                        ▼
              ┌─────────────────────┐
              │  Cloudflare Worker  │
              │   (Bot Detector)    │
              └─────────────────────┘
                   │           │
            Bot? ──┘           └── Human?
              │                     │
              ▼                     ▼
     ┌────────────────┐    ┌────────────────┐
     │   Supabase DB  │    │ Cloudflare CDN │
     │ (Rich HTML     │    │  (Your React   │
     │  with Schema)  │    │   SPA, fast)   │
     └────────────────┘    └────────────────┘

  • Bots → get pre-built, SEO-rich HTML with Schema.org, Open Graph, meta tags
  • Humans → get your blazing-fast React SPA from the global CDN
  • Cache → auto-refreshes every 6 hours via pg_cron – zero maintenance
  • You → pay nothing, change nothing in your codebase

No server. No build step. No framework swap. No manual cache management.
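
The Worker itself is only a few dozen lines; the worker file shipped in this repo is the authoritative version. The sketch below shows the shape of the decision logic, assuming the environment variables from Step 4 (SUPABASE_URL, SUPABASE_ANON_KEY, PAGES_ORIGIN) and a prerender Edge Function like the one in Step 2:

// Illustrative sketch of the Worker's routing decision – not the repo's exact file
const BOT_UA = /bot|crawl|spider|slurp|bingpreview|duckduck|baiduspider|yandex|facebookexternalhit|twitterbot|linkedinbot|whatsapp|telegrambot|discordbot|gptbot|claudebot|perplexitybot|google-extended|applebot|ahrefsbot|semrushbot/i;

interface Env {
  SUPABASE_URL: string;
  SUPABASE_ANON_KEY: string;
  PAGES_ORIGIN: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    const userAgent = request.headers.get("User-Agent") || "";

    if (BOT_UA.test(userAgent)) {
      // Bot: ask the Supabase Edge Function for cached HTML
      const prerender = await fetch(
        `${env.SUPABASE_URL}/functions/v1/prerender?path=${encodeURIComponent(url.pathname)}`,
        { headers: { Authorization: `Bearer ${env.SUPABASE_ANON_KEY}` } }
      );
      if (prerender.ok) {
        // Keep Content-Type and X-Cache from the function, add the X-Prerendered marker
        const headers = new Headers(prerender.headers);
        headers.set("X-Prerendered", "true");
        return new Response(prerender.body, { headers });
      }
      // Cache miss: fall through so bots still get the SPA shell instead of an error
    }

    // Human (or cache miss): proxy the request to wherever the SPA is hosted
    const originUrl = new URL(url.pathname + url.search, env.PAGES_ORIGIN);
    return fetch(new Request(originUrl.toString(), request));
  },
};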


🤖 100+ Bot Patterns – Out of the Box

The Worker recognizes every crawler that matters:

| Category | Bots Detected |
|---|---|
| 🔍 Search Engines | Googlebot, Bingbot, Yandex, Baidu, DuckDuckGo, Ecosia, Mojeek |
| 🤖 AI Crawlers | GPTBot, ClaudeBot, PerplexityBot, Google-Extended, Gemini, Meta AI, Mistral, Cohere |
| 📱 Social Media | Facebook, Twitter/X, LinkedIn, WhatsApp, Discord, Telegram, Reddit, Pinterest |
| 📊 SEO Tools | Ahrefs, SEMrush, Screaming Frog, Moz, SERPstat |
| 🍎 Platform Bots | Applebot, AmazonBot, PetalBot |
| 📰 Feed Readers | Feedly, Flipboard, NewsBlur, Inoreader |
| 🏛️ Archives | Wayback Machine, Archive.org |

Every single one sees your full content, Open Graph tags, Schema.org markup, and meta descriptions – not an empty <div>.
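
Detection is plain user-agent matching, so there is nothing framework-specific about it. A trimmed-down sketch of the idea (the actual Worker ships a much longer list; the patterns below are only a sample of the categories above):

// Illustrative subset of the user-agent patterns, grouped like the table above
const BOT_PATTERNS: RegExp[] = [
  /googlebot|bingbot|yandexbot|baiduspider|duckduckbot/i,     // search engines
  /gptbot|claudebot|perplexitybot|google-extended/i,          // AI crawlers
  /facebookexternalhit|twitterbot|linkedinbot|discordbot/i,   // social previews
  /ahrefsbot|semrushbot|screaming frog|rogerbot/i,            // SEO tools
];

function isBot(userAgent: string): boolean {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent));
}

// isBot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)") → true
// isBot("Mozilla/5.0 (Windows NT 10.0; Win64; x64) ... Chrome/120.0 Safari/537.36") → false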


πŸ—οΈ Works With Any Framework AND Any Host

This is not a framework. It's a layer that sits in front of any SPA, hosted anywhere:

Frameworks

| Framework | Compatible? | Notes |
|---|---|---|
| ⚛️ React + Vite | ✅ | Primary target, battle-tested with 9,000+ pages |
| ⚛️ Create React App | ✅ | Drop-in, no ejection needed |
| ⚛️ Remix (SPA mode) | ✅ | Works with client-side Remix |
| 🟢 Vue.js | ✅ | Any Vue SPA that builds to static files |
| 🔶 Svelte/SvelteKit | ✅ | Static adapter works perfectly |
| 🅰️ Angular | ✅ | Standard Angular CLI builds |
| 🚀 Astro | ✅ | Client-rendered pages |
| 📦 Any static SPA | ✅ | If it builds to HTML/JS/CSS, it works |

Hosting Providers

The Worker just needs an origin URL. Your app can live anywhere:

| Host | PAGES_ORIGIN value | Notes |
|---|---|---|
| ☁️ Cloudflare Pages | https://your-project.pages.dev | Easiest – same ecosystem |
| ▲ Vercel | https://your-project.vercel.app | Works perfectly |
| 🔷 Netlify | https://your-project.netlify.app | Works perfectly |
| 🚀 Lovable | https://your-id.lovable.app | Built with AI, made crawlable by this Worker |
| 🐙 GitHub Pages | https://username.github.io/repo | Free static hosting |
| 🔥 Firebase Hosting | https://your-project.web.app | Google's CDN |
| 🌊 Surge.sh | https://your-project.surge.sh | Simple static hosting |
| 🖥️ Any server | https://your-origin-url.com | VPS, Docker, anything with a URL |

The Worker doesn't care where your files live. It only needs the URL to proxy human traffic to. Set PAGES_ORIGIN to whatever your hosting provider gives you, and it just works.

Zero code changes to your app. The Worker only routes traffic – it never touches your build or hosting.
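
In practice, the human path is a short pass-through. A minimal sketch, assuming PAGES_ORIGIN is set as described in Step 4:

// Human traffic: swap the hostname for the origin and forward the request unchanged
function proxyToOrigin(request: Request, pagesOrigin: string): Promise<Response> {
  const url = new URL(request.url);
  const originUrl = new URL(url.pathname + url.search, pagesOrigin);
  return fetch(new Request(originUrl.toString(), request));
}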


🚀 Setup in 30 Minutes

What You Need (All Free)

  • Your SPA, hosted anywhere (Vercel, Netlify, Cloudflare Pages, Lovable, GitHub Pages, etc.)
  • A Supabase account (free tier)
  • A Cloudflare account (free tier – only for the Worker + DNS)
  • Your domain's DNS managed by Cloudflare

Step 1 – Create the Cache Table

Run in Supabase SQL Editor:

CREATE TABLE IF NOT EXISTS prerendered_pages (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  path TEXT UNIQUE NOT NULL,
  html TEXT NOT NULL,
  title TEXT,
  hit_count INTEGER DEFAULT 0,
  created_at TIMESTAMPTZ DEFAULT now(),
  updated_at TIMESTAMPTZ DEFAULT now(),
  expires_at TIMESTAMPTZ
);

CREATE INDEX IF NOT EXISTS idx_prerendered_pages_path ON prerendered_pages(path);

ALTER TABLE prerendered_pages ENABLE ROW LEVEL SECURITY;
CREATE POLICY "Public read access" ON prerendered_pages FOR SELECT USING (true);

Step 2 – Deploy the Prerender Function

Create supabase/functions/prerender/index.ts:

import { createClient } from "https://esm.sh/@supabase/supabase-js@2";

const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!
);

Deno.serve(async (req) => {
  if (req.method === "OPTIONS")
    return new Response(null, { headers: { "Access-Control-Allow-Origin": "*" } });

  const url = new URL(req.url);
  let path = url.searchParams.get("path") || "/";
  if (path !== "/" && path.endsWith("/")) path = path.slice(0, -1);

  const { data } = await supabase
    .from("prerendered_pages")
    .select("html")
    .eq("path", path)
    .maybeSingle();

  if (data?.html) {
    return new Response(data.html, {
      headers: { "Content-Type": "text/html; charset=utf-8", "X-Cache": "hit" },
    });
  }

  return new Response("Not found", { status: 404, headers: { "X-Cache": "miss" } });
});
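
Deploy it with the Supabase CLI (supabase functions deploy prerender). Once a row exists in prerendered_pages (see "Populating the Cache" below), you can sanity-check the function directly. A quick check, with YOUR-PROJECT and YOUR_ANON_KEY standing in for your own values (run with Deno, or Node 18+ in an ES module):

// Quick sanity check that the function returns cached HTML
const res = await fetch(
  "https://YOUR-PROJECT.supabase.co/functions/v1/prerender?path=/about",
  { headers: { Authorization: "Bearer YOUR_ANON_KEY" } }
);
console.log(res.status, res.headers.get("X-Cache")); // 200 "hit" when the page is cached
console.log((await res.text()).slice(0, 80));        // start of the stored HTML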

Step 3 – Deploy the Worker

npm install -g wrangler
wrangler login
wrangler deploy

Step 4 – Set Environment Variables

Cloudflare Dashboard → Worker → Settings → Variables:

| Variable | Value |
|---|---|
| SUPABASE_URL | https://YOUR-PROJECT.supabase.co |
| SUPABASE_ANON_KEY | Your Supabase anon key |
| PAGES_ORIGIN | Your app's origin URL (must include https://) |

PAGES_ORIGIN examples by host:

  • Cloudflare Pages: https://my-app.pages.dev
  • Vercel: https://my-app.vercel.app
  • Netlify: https://my-app.netlify.app
  • Lovable: https://my-id.lovable.app
  • GitHub Pages: https://user.github.io/repo
  • Any server: https://your-origin.com

Step 5 – Configure Worker Routes

Cloudflare → Websites → Your Domain → Workers Routes:

| Route | Worker |
|---|---|
| yourdomain.com/* | your-worker-name |
| www.yourdomain.com/* | your-worker-name |

⚠️ Use Worker Routes, not Pages Custom Domains.

Step 6 – Automate Cache Refresh (Optional but Recommended)

Set up a pg_cron job to auto-refresh your cache every 6 hours (enable the pg_cron and pg_net extensions under Database → Extensions in Supabase first):

SELECT cron.schedule(
  'refresh-prerender-cache',
  '0 */6 * * *',
  $$
  SELECT net.http_post(
    url := 'https://YOUR-PROJECT.supabase.co/functions/v1/generate-prerender-cache',
    headers := '{"Authorization": "Bearer YOUR_ANON_KEY"}'::jsonb
  );
  $$
);

This means zero manual maintenance. New pages are cached automatically.


πŸ“ Populating the Cache

Generate HTML for each page and store it in prerendered_pages:

await supabase.from("prerendered_pages").upsert(
  {
    path: "/about",
    title: "About Us",
    html: `<!DOCTYPE html>
<html lang="en">
<head>
  <title>About Us | YourApp</title>
  <meta name="description" content="Learn about our mission">
  <meta property="og:title" content="About Us">
  <link rel="canonical" href="https://yourdomain.com/about">
</head>
<body><h1>About Us</h1><p>Your content here...</p></body>
</html>`
  },
  { onConflict: "path" }
);
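
The pg_cron job in Step 6 calls an Edge Function named generate-prerender-cache that performs this upsert for every route. A minimal sketch of such a generator, assuming a hard-coded route list (in practice you would derive routes, titles, and HTML from your own content or sitemap):

// supabase/functions/generate-prerender-cache/index.ts – illustrative generator
import { createClient } from "https://esm.sh/@supabase/supabase-js@2";

const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!
);

// Placeholder route list – replace with routes pulled from your own tables or sitemap
const routes = [
  { path: "/", title: "Home | YourApp", description: "What YourApp does" },
  { path: "/about", title: "About Us | YourApp", description: "Learn about our mission" },
];

Deno.serve(async () => {
  for (const route of routes) {
    const html = `<!DOCTYPE html>
<html lang="en">
<head>
  <title>${route.title}</title>
  <meta name="description" content="${route.description}">
  <link rel="canonical" href="https://yourdomain.com${route.path === "/" ? "" : route.path}">
</head>
<body><h1>${route.title}</h1></body>
</html>`;

    // Upsert keyed on path, matching the prerendered_pages schema from Step 1
    await supabase
      .from("prerendered_pages")
      .upsert({ path: route.path, title: route.title, html }, { onConflict: "path" });
  }

  return new Response(JSON.stringify({ refreshed: routes.length }), {
    headers: { "Content-Type": "application/json" },
  });
});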

✅ Verify It Works

# Human request – gets your SPA
curl -I https://yourdomain.com/
# → No X-Prerendered header

# Bot request – gets cached HTML
curl -I -H "User-Agent: Googlebot/2.1" https://yourdomain.com/
# → X-Prerendered: true, X-Cache: hit

⚠️ Common Mistakes

| Mistake | Fix |
|---|---|
| PAGES_ORIGIN missing https:// | Always include the protocol |
| Bot Fight Mode enabled | Turn it OFF (Cloudflare → Security → Bots) |
| Domain added as Pages Custom Domain | Use Worker Routes instead |
| Empty cache table | Run your cache generator first |
| Worrying about "cloaking" | Google approves dynamic rendering |

πŸ›‘οΈ Common Concerns β€” Addressed

"Isn't serving different HTML to bots considered cloaking?"

No. Google explicitly documents this as dynamic rendering and approves it for JavaScript-heavy sites. The key requirement: the content must be equivalent. Your pre-rendered HTML contains the same content users see after the SPA loads – just without waiting for JavaScript execution.

"Next.js Edge Runtime is also fast"

True – but you still need to rewrite your entire app to use it. This solution gives you edge-speed responses without changing a single line of your existing code. Also, Next.js Edge Runtime still requires Vercel or complex self-hosting. This runs on Cloudflare's free tier.

"What about the loading spinner for humans?"

Users see a brief loading state (~1 second) while the SPA boots. After that, every navigation is instant – no server round-trips, no hydration jank. Modern SPAs with code splitting load fast. And the people who matter most for your growth – Google, AI crawlers, social media bots – see full content immediately.

"Can this handle large sites?"

Battle-tested with 350,000+ pages in production, auto-refreshed every 6 hours. The Supabase free tier handles this comfortably.


🤝 Built for AI-Assisted Development

This repo is designed to be implemented by AI coding assistants like Lovable, Cursor, Bolt, or ChatGPT. Hand the setup guide to your AI, point it at your React app, and you'll have full SEO in under an hour.

The architecture is intentionally simple – two services (Cloudflare + Supabase), one Worker file, one Edge Function, one database table. Any AI agent can understand and implement it.


License

MIT – use it, fork it, ship it.


Serving 350,000+ SEO-optimized pages for $0/month in production.

Keywords: react seo, react spa seo, react prerender, react server side rendering alternative, react cloudflare workers, react supabase, vite seo, cra seo, react google indexing, react open graph, react social sharing, spa prerendering, react bot detection, nextjs alternative, free react ssr, react crawlers, react ai crawlers, react schema markup, react meta tags, static site generation react, lovable seo, bolt.new seo, v0 seo, ai app builder seo, react spa google invisible, nextjs alternative free, vite seo fix, react prerender free, cloudflare worker seo, ai website seo, ai generated website google indexing, nextjs vendor lock-in, vercel alternative, cursor seo, replit seo, ai built website seo, dynamic rendering react, react prerender cloudflare