AI Crawl Control now includes a Robots.txt tab that provides insights into how AI crawlers interact with your robots.txt files.
What’s new
The Robots.txt tab allows you to:
- Monitor the health status of robots.txt files across all your hostnames, including HTTP status codes, and identify hostnames that need a robots.txt file.
- Track the total number of requests to each robots.txt file, with breakdowns of successful versus unsuccessful requests.
- Check whether your robots.txt files contain Content Signals directives for AI training, search, and AI input.
- Identify crawlers that request paths explicitly disallowed by your robots.txt directives, including the crawler name, operator, violated path, specific directive, and violation count.
- Filter robots.txt request data by crawler, operator, category, and custom time ranges.
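For context, a robots.txt file carrying Content Signals alongside ordinary crawl directives might look roughly like the following. This is a sketch: the user-agent group, paths, and signal values are illustrative, and the exact `Content-Signal` syntax should be checked against the Content Signals specification.

```
# Content Signals express preferences for how fetched content may be used.
# search = appearing in search results; ai-input = retrieval/grounding;
# ai-train = training or fine-tuning AI models.
User-Agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /
Disallow: /private/
```

Crawlers that respect robots.txt but request paths under `/private/` anyway would surface in the violations view described above.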
Take action
When you identify non-compliant crawlers, you can:
- Block the crawler in the Crawlers tab
- Create custom WAF rules for path-specific security
- Use Redirect Rules to guide crawlers to appropriate areas of your site
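As a sketch of the second option, a custom WAF rule expression that blocks a non-compliant crawler from a disallowed path could look like this (written in the Cloudflare Rules language; the bot name and path here are placeholders, not values from the product):

```
(http.user_agent contains "ExampleBot" and starts_with(http.request.uri.path, "/private/"))
```

Paired with a Block action, this enforces at the edge what the robots.txt `Disallow` directive only requests voluntarily.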
To get started, go to AI Crawl Control > Robots.txt in the Cloudflare dashboard. Learn more in the Track robots.txt documentation.
Source: Cloudflare