AI Crawl Control now includes a Robots.txt tab that provides insights into how AI crawlers interact with your robots.txt files.
What’s new
The Robots.txt tab allows you to:
- Monitor the health status of `robots.txt` files across all your hostnames, including HTTP status codes, and identify hostnames that need a `robots.txt` file.
- Track the total number of requests to each `robots.txt` file, with breakdowns of successful versus unsuccessful requests.
- Check whether your `robots.txt` files contain Content Signals directives for AI training, search, and AI input.
- Identify crawlers that request paths explicitly disallowed by your `robots.txt` directives, including the crawler name, operator, violated path, specific directive, and violation count.
- Filter `robots.txt` request data by crawler, operator, category, and custom time ranges.
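As a rough illustration of the kind of file the tab inspects, here is a minimal `robots.txt` sketch combining a standard Disallow rule with Content Signals directives. The exact directive syntax and signal names (`ai-train`, `search`, `ai-input`) follow Cloudflare's Content Signals convention as I understand it; treat the specifics as an assumption and consult the linked documentation for the authoritative format:

```txt
# Hypothetical robots.txt for example.com (illustrative only)
# Content Signals: allow search and AI input, opt out of AI training
Content-Signal: search=yes, ai-input=yes, ai-train=no

User-agent: *
Disallow: /private/
```

A crawler requesting anything under `/private/` would then show up in the tab's violation list, along with the directive it ignored.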
Take action
When you identify non-compliant crawlers, you can:
- Block the crawler in the Crawlers tab
- Create custom WAF rules for path-specific security
- Use Redirect Rules to guide crawlers to appropriate areas of your site
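Before taking action, it can help to confirm locally that a crawler's requests really do violate your directives. The compliance check the dashboard performs can be sketched with Python's standard `urllib.robotparser` module (the bot name and paths below are hypothetical):

```python
from urllib import robotparser

# Hypothetical robots.txt rules for illustration
rules = """\
User-agent: ExampleBot
Disallow: /private/
"""

# Parse the rules and test specific request paths against them
rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("ExampleBot", "/private/data.html"))  # False: explicitly disallowed
print(rp.can_fetch("ExampleBot", "/public/page.html"))   # True: not covered by any Disallow rule
```

A request that `can_fetch` rejects but which still appears in your logs is the kind of violation the Robots.txt tab surfaces, and a candidate for a block or WAF rule.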
To get started, go to AI Crawl Control > Robots.txt in the Cloudflare dashboard. Learn more in the Track robots.txt documentation.
Source: Cloudflare


