# static-html-for-ai
> A reference site demonstrating how to optimize static HTML pages for AI
> crawlers (GPTBot, ClaudeBot, PerplexityBot, OAI-SearchBot, and others).
> The site is itself a worked example: every recommendation it documents is
> applied in the page's own structure, headers, and metadata.
Author: Michael McGrory, Solutions Engineer (Partnerships) at Cloudflare.
This site is hosted on Cloudflare Workers and is built as a single static
HTML page with a markdown mirror, an `llms.txt` index, and a
`robots.txt` that uses Content Signals directives.
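For context, Content Signals are expressed as a `Content-Signal` line inside a `robots.txt` user-agent group. The sketch below follows the syntax from Cloudflare's Content Signals policy; the signal values shown are illustrative, not necessarily this site's actual policy:

```txt
# robots.txt with Content Signals (illustrative values)
User-Agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /
```

Each signal (`search`, `ai-input`, `ai-train`) is set to `yes` or `no` to express a preference for how fetched content may be used; crawlers that honor the policy read these alongside the usual `Allow`/`Disallow` rules.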
## Docs
- [How to Optimize Static HTML for AI Crawlers (markdown)](/index.html.md): The full reference page in plain markdown — recommended for LLM consumption.
- [How to Optimize Static HTML for AI Crawlers (HTML)](/index.html): The same content rendered as static HTML with JSON-LD schema and visible structure.
## Site files
- [robots.txt](/robots.txt): Crawler directives, including Content Signals for `search`, `ai-input`, and `ai-train`.
- [sitemap.xml](/sitemap.xml): Sitemap with `lastmod` for each URL.
- [llms-full.txt](/llms-full.txt): Concatenated full content of the site's primary pages, suitable for direct ingestion into an LLM context window.
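For reference, a sitemap entry carrying `lastmod` follows the sitemaps.org protocol; the URL and date below are illustrative placeholders, not this site's actual values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/index.html</loc>
    <!-- W3C date format; tells crawlers when the page last changed -->
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>
```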
## Optional
- [Cloudflare AI Crawl Control](https://developers.cloudflare.com/ai-crawl-control/): Cloudflare's product for monitoring and controlling AI crawler access. Referenced throughout the main page.
- [llms.txt specification](https://llmstxt.org/): The proposal this file follows.