# Cookly Robots.txt - Optimized for Search Engines & AI Platforms

# Default rules for all crawlers
User-agent: *
Disallow: /api/
Disallow: /accounts/
Disallow: /de/
Disallow: /fr/
Disallow: /zh/
Disallow: /es/
Disallow: /cooking-school/
Disallow: /class/
Disallow: /location/
Disallow: /googleab5ea5d7efec7ca1.html
Allow: /llms.txt

# AI crawlers - explicitly allow broad access, restricting only API and account pages
User-agent: GPTBot
Disallow: /api/
Disallow: /accounts/
Allow: /

User-agent: ChatGPT-User
Disallow: /api/
Disallow: /accounts/
Allow: /

User-agent: ClaudeBot
Disallow: /api/
Disallow: /accounts/
Allow: /

User-agent: Claude-Web
Disallow: /api/
Disallow: /accounts/
Allow: /

User-agent: anthropic-ai
Disallow: /api/
Disallow: /accounts/
Allow: /

User-agent: PerplexityBot
Disallow: /api/
Disallow: /accounts/
Allow: /

User-agent: Google-Extended
Disallow: /api/
Disallow: /accounts/
Allow: /

User-agent: Applebot-Extended
Disallow: /api/
Disallow: /accounts/
Allow: /

User-agent: cohere-ai
Disallow: /api/
Disallow: /accounts/
Allow: /

# Block aggressive scrapers
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: DotBot
Disallow: /

# Sitemaps
Sitemap: https://www.cookly.me/sitemap.xml