Your website looks great in a browser. The copy is sharp, the product is clearly explained, and the value proposition is front and center. But when an AI crawler visits that same URL, it may receive something entirely different: a mostly empty HTML document with no meaningful content at all.
This is the JavaScript rendering problem, and it affects a significant share of modern websites. Understanding why it happens — and how to fix it — is one of the most impactful technical improvements you can make for AI visibility.
How AI crawlers fetch pages
An AI crawler is, at its core, an HTTP client. It sends a GET request to a URL, receives a response, and reads the content. That's it. There is no browser engine, no JavaScript runtime, no DOM construction. The crawler sees exactly what the server returns in that initial HTTP response — nothing more.
This is fundamentally different from Googlebot, which has a separate rendering queue that executes JavaScript and processes the page as a browser would (with some delay). AI crawlers — GPTBot, PerplexityBot, ClaudeBot, Google-Extended — do not have this capability. When they fetch a page, they get raw HTML and stop there.
What AI crawlers actually receive from JavaScript-heavy sites
React and other single-page apps (client-side rendering)
A React app built with Create React App, Vite, or a similar client-side bundler sends an almost empty HTML file as its initial response. The content you see in the browser is injected by JavaScript after the page loads. A typical initial HTML might look like:
<!DOCTYPE html>
<html>
<head>
<title>My App</title>
</head>
<body>
<div id="root"></div>
<script src="/static/js/main.chunk.js"></script>
</body>
</html>
To an AI crawler, this page has no content. Your headline, product description, pricing copy, and FAQs are all invisible.
Next.js with client-side rendering
Next.js supports multiple rendering modes. If your pages use useEffect to fetch content, or if you've built a fully client-side route, the behavior is the same as a plain React app — the initial HTML is a shell. However, Next.js pages using getServerSideProps, getStaticProps, or the App Router with Server Components do send full content in the initial response. The rendering mode per page determines what AI crawlers see.
Framer
Framer generates visually sophisticated sites, but its output is heavily JavaScript-driven. Text, animations, and component content are often rendered client-side. Pages built in Framer typically deliver a minimal initial HTML with the actual content injected by the Framer runtime. AI crawlers visiting a Framer site may receive only navigation shells and footer boilerplate.
Webflow
Webflow is better than Framer in this regard — it does generate more server-side HTML — but it still uses client-side JavaScript for certain interactions, dynamic content, and CMS collections. CMS-driven pages in Webflow may have their content populated client-side, making it invisible to crawlers that do not execute JavaScript.
How to diagnose the problem
The fastest way to check what an AI crawler sees is to fetch your page without a browser. Run this in your terminal:
curl -A "GPTBot" https://yoursite.com/your-page
Read the output carefully. Is your headline there? Your product description? Your key copy? If the output is a script tag and a <div id="root">, you have a rendering problem.
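If you want to automate the curl check across many pages, a small script can apply the same heuristic: strip the scripts from the raw HTML and see how much visible text is left. This is a rough sketch, not a definitive test — the 200-character threshold is an assumption you should tune for your site:

```javascript
// Heuristic: does this raw HTML look like a client-side-rendered shell?
// Strips script/style tags and all markup, then measures the remaining text.
function looksLikeEmptyShell(html, minTextLength = 200) {
  const withoutScripts = html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "");
  const text = withoutScripts
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
  return text.length < minTextLength;
}

// Usage against the React shell shown earlier:
const shell = `<!DOCTYPE html><html><head><title>My App</title></head>
<body><div id="root"></div><script src="/static/js/main.chunk.js"></script></body></html>`;
console.log(looksLikeEmptyShell(shell)); // true — almost no visible text survives
```

Pair it with a plain HTTP fetch (no headless browser) so the input matches what a crawler actually receives.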
You can also test with other AI crawler user agents:
curl -A "PerplexityBot" https://yoursite.com/your-page
curl -A "ClaudeBot" https://yoursite.com/your-page
curl -A "Google-Extended" https://yoursite.com/your-page
Solutions
Server-side rendering (SSR)
SSR generates the full HTML for each page request on the server, before sending it to the client. The initial HTTP response contains all the content. In Next.js, this means using getServerSideProps or Server Components in the App Router. In other frameworks, it means rendering on the server rather than the client.
SSR is the most reliable solution and has additional benefits: faster time-to-first-paint for users, better Core Web Vitals, and full compatibility with all crawlers.
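In the Next.js Pages Router, the migration is usually from a useEffect fetch to getServerSideProps. A minimal sketch — fetchPlans is a hypothetical stand-in for whatever CMS or database call you actually use, and in a real Next.js file both functions would be exported from a page module such as pages/pricing.js:

```javascript
// Hypothetical data source — stands in for your CMS or database call.
async function fetchPlans() {
  return [
    { name: "Starter", price: 0 },
    { name: "Pro", price: 29 },
  ];
}

// In a real Next.js Pages Router file you would `export` this function.
// Next.js then runs it on the server for every request, and the returned
// props are rendered into the initial HTML — so an AI crawler receives
// the real content instead of an empty shell.
async function getServerSideProps() {
  const plans = await fetchPlans();
  return { props: { plans } };
}
```

With the App Router, the equivalent move is fetching data directly inside a Server Component, which renders on the server by default.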
Static site generation (SSG)
If your content doesn't need to be dynamic per-request, static generation is even better. Next.js's getStaticProps pre-renders pages at build time, and they are served as flat HTML files. There is no JavaScript dependency for the initial content load. AI crawlers receive fully rendered, content-rich HTML. This is the gold standard for AI visibility.
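The getStaticProps version looks almost identical but runs at build time rather than per request; the optional revalidate field (Incremental Static Regeneration) refreshes the static HTML in the background. Again a sketch, with loadFaqs as a hypothetical data source and the function exported in a real Next.js page file:

```javascript
// Hypothetical build-time data source.
async function loadFaqs() {
  return [{ q: "Is there a free tier?", a: "Yes." }];
}

// Runs once at build time; the page is then served as flat, fully
// rendered HTML with no JavaScript dependency for the initial content.
// revalidate asks Next.js to re-generate the page in the background at
// most once per hour, keeping content fresh without per-request SSR.
async function getStaticProps() {
  const faqs = await loadFaqs();
  return { props: { faqs }, revalidate: 3600 };
}
```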
Adaptive rendering
For sites where changing the rendering mode is impractical — Framer sites, heavily customized Webflow setups, or legacy React apps — adaptive rendering offers a middle path. The server detects the user agent of the incoming request. If it is an AI crawler, it serves pre-rendered HTML with the full content. If it is a browser, it serves the normal JavaScript-heavy experience.
This is not cloaking. The content served to crawlers is identical to what human visitors see — it is just rendered differently. It is the same principle as serving a print stylesheet to a printer or an AMP page to a mobile crawler.
Adaptive rendering can be implemented at the CDN level (Cloudflare Workers, Vercel Edge Functions) or at the origin server level. It requires a pre-rendering service to generate the static snapshots.
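At the edge, the core of adaptive rendering is a user-agent check plus a rewrite to your pre-rendered snapshots. A Cloudflare-Worker-style sketch — PRERENDER_ORIGIN is a hypothetical URL for whatever pre-rendering service hosts your snapshots, and the crawler list is illustrative, not exhaustive:

```javascript
// User agents that should receive pre-rendered HTML (illustrative list).
const AI_CRAWLERS = [/GPTBot/i, /PerplexityBot/i, /ClaudeBot/i, /Google-Extended/i];

function isAICrawler(userAgent) {
  return AI_CRAWLERS.some((pattern) => pattern.test(userAgent || ""));
}

// Hypothetical pre-rendering service serving static snapshots of your pages.
const PRERENDER_ORIGIN = "https://prerender.example.com";

async function handleRequest(request) {
  const ua = request.headers.get("user-agent");
  if (isAICrawler(ua)) {
    // Same content, rendered ahead of time: proxy to the static snapshot.
    const url = new URL(request.url);
    return fetch(PRERENDER_ORIGIN + url.pathname + url.search);
  }
  // Browsers get the normal JavaScript-driven experience.
  return fetch(request);
}
```

Keep the crawler list in one place and review it periodically; new AI user agents appear regularly.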
Quick wins while you plan the larger fix
- Add a noscript tag with key content. While not a complete solution, a noscript block with your headline and main copy gives crawlers something to work with while you implement SSR or SSG.
- Ensure your JSON-LD schema is server-rendered. Even if your body content is client-rendered, your schema markup in the head should be in the initial HTML. This gives AI systems entity data even if they can't read your prose.
- Prioritize your most important pages. Focus SSR or SSG migration effort on the homepage, product pages, and pricing first. These are the pages AI systems are most likely to retrieve and cite.
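For the JSON-LD quick win, the markup is just a script tag in the head — the only requirement is that your framework emits it in the initial HTML rather than injecting it client-side. A sketch with placeholder values:

```html
<!-- Server-rendered in the <head>; all values below are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "My App",
  "description": "One-sentence description of what the product does.",
  "offers": { "@type": "Offer", "price": "29", "priceCurrency": "USD" }
}
</script>
```

Verify it survives with the same curl check as above: the script tag should appear verbatim in the raw response.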
The rendering problem is fixable. Most modern frameworks support server rendering natively — it is often just a matter of enabling it. The return is immediate: pages that were invisible to AI systems become readable, citable, and visible overnight.