Architecture AI can decompose.
We audited a luxury resort last month that had invested heavily in their website. Custom photography, immersive video, beautifully written copy about every amenity. When we disabled JavaScript in the browser and reloaded their homepage, almost everything disappeared. The hero section went blank. The room descriptions vanished. The restaurant menus, the spa offerings, the event spaces, all gone. What remained was a header, a footer, and a handful of navigation links.
That stripped-down version is exactly what AI crawlers see when they visit. And it's why this resort was invisible across every AI platform despite having one of the best websites in their market.
What AI crawlers actually see
GPTBot, ClaudeBot, and PerplexityBot do not execute JavaScript. They fetch the raw HTML your server sends on the initial request and read only what's in that response. They don't scroll, click, or wait for API calls. An analysis of over 500 million GPTBot requests tracked by Daydream found zero evidence of JavaScript execution, even when the crawler downloaded JS files.
Google has spent decades building sophisticated rendering capabilities into its crawler. It runs a headless browser, executes scripts, and captures the fully rendered page. Every other major AI crawler operates more like a basic scraper from 2005. If your content loads after the initial HTML, it doesn't exist to ChatGPT, Claude, or Perplexity.
The brands most affected are often the ones that invested the most. JavaScript frameworks like React, Vue, and Angular create beautiful interactive sites for humans. For AI crawlers, those same sites can appear functionally empty.
What AI needs to find on your page
The fix isn't about dumbing down your website. It's about making sure the information AI needs is present in the initial HTML, regardless of what else happens after JavaScript runs.
Lead with the answer.
AI pulls individual chunks of text, not entire pages. The first sentence of each section should state the key fact directly. If someone asks "Does this hotel have a spa?", the answer should be findable in a clear sentence in your HTML, not buried in a JavaScript-loaded tab.
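A minimal sketch of what that looks like in markup, using an invented spa section (the names and hours are placeholders):

    <section id="spa">
      <h2>Spa and Wellness</h2>
      <!-- First sentence answers the likely question directly, in the server-sent HTML -->
      <p>Yes, the hotel has a full-service spa, open daily from 9am to 8pm.</p>
      <p>Treatments include massages, facials, and access to the heated indoor pool.</p>
    </section>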
Use real heading hierarchy.
Use an H1 for the page title and H2s and H3s for sections, structured to match the questions your customers ask. SE Ranking's study of 129,000 domains found that pages with 120 to 180 words between headings earned 70% more ChatGPT citations than pages with much shorter or much longer sections.
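As a rough sketch, a resort page organized around customer questions might be outlined like this (the headings are invented for illustration):

    <h1>The Grandview Resort, Lake Tahoe</h1>
    <h2>Rooms and Suites</h2>
    <!-- roughly 120-180 words of copy -->
    <h2>Dining</h2>
    <h3>Is breakfast included?</h3>
    <!-- a direct answer first, then supporting detail -->
    <h2>Spa and Wellness</h2>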
Put data in real HTML.
Tables should use proper table tags, not screenshots. Lists should be actual HTML lists. Prices, hours, locations, and key facts should be in text, not embedded in images or loaded via API calls. Write descriptive alt text for every image, because AI crawlers read alt attributes but cannot interpret visual content.
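A small sketch of the same principle, with invented rates and hours as placeholders:

    <h2>Rates and Check-in</h2>
    <table>
      <tr><th>Season</th><th>Nightly rate</th></tr>
      <tr><td>May to September</td><td>$450</td></tr>
      <tr><td>October to April</td><td>$320</td></tr>
    </table>
    <ul>
      <li>Check-in: 3pm</li>
      <li>Check-out: 11am</li>
    </ul>
    <img src="/images/spa-pool.jpg" alt="Heated indoor pool in the resort spa, ringed by lounge chairs">

Every fact above is readable as plain text in the source, including the alt attribute.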
Add schema markup in the HTML source.
FAQPage, LocalBusiness, Hotel, Museum, Organization, and HowTo schemas help AI understand what your page is about and how entities relate. But as Search Engine Journal reported, structured data injected via Google Tag Manager or client-side JavaScript is invisible to AI crawlers. It must be in the server-rendered HTML.
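A minimal sketch of Hotel schema placed directly in the server-rendered HTML; every value below is a placeholder:

    <!-- Must appear in the HTML the server sends, not injected by a tag manager -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Hotel",
      "name": "The Grandview Resort",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Lakeshore Drive",
        "addressLocality": "Lake Tahoe",
        "addressRegion": "CA"
      },
      "telephone": "+1-555-0100",
      "amenityFeature": {
        "@type": "LocationFeatureSpecification",
        "name": "Spa",
        "value": true
      }
    }
    </script>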
How to test in five minutes
Right-click any page on your site and select "View Page Source." Search for your most important information: your property name, your key differentiator, your location, your hours. If that information isn't in the source, AI can't see it.
You can also disable JavaScript in your browser's developer tools and reload the page. Whatever remains visible is the version AI crawlers receive.
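If what remains in the source is essentially the shell below, a typical client-rendered page simplified for illustration, AI crawlers receive a blank page:

    <body>
      <div id="root"></div>
      <!-- All visible content is built by this script after load, which AI crawlers never run -->
      <script src="/bundle.js"></script>
    </body>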
We do this for every client as the first step of a technical audit, and it consistently reveals gaps that no amount of content strategy can fix until they're addressed.
The good news: these are engineering problems with known solutions. Server-side rendering, static site generation, or prerendering services can deliver fully formed HTML to crawlers while preserving the interactive experience for humans. The fix is usually weeks, not months, and the visibility impact is often immediate.
Your content may already have the answers. Make sure AI can actually read them.