A report from 404 Media reveals how AI companies like Anthropic bypass websites’ robots.txt files by deploying new web crawlers under different names. This makes the crawlers harder to block, because site owners must constantly update their robots.txt files to cover each new bot:
Anthropic’s current and active crawler is called “CLAUDEBOT.” Neither Reuters nor Condé Nast, for example, blocks CLAUDEBOT. This means that these websites—and hundreds of others who have copy pasted old blocker lists—are not actually blocking Anthropic.
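To see why a stale blocklist fails, here is a minimal sketch using Python’s standard-library `urllib.robotparser`. It assumes a robots.txt that disallows an older Anthropic agent name while the renamed crawler slips through to the wildcard rule; the exact agent strings are illustrative, taken loosely from the report:

```python
import urllib.robotparser

# A robots.txt that blocks an older crawler name but not the renamed one.
# "anthropic-ai" is an older agent name; "ClaudeBot" is the current one
# (treat the exact strings as illustrative, per the 404 Media report).
ROBOTS_TXT = """\
User-agent: anthropic-ai
Disallow: /

User-agent: *
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The old name is blocked, but the renamed crawler falls through to the
# wildcard entry and may fetch everything.
print(parser.can_fetch("anthropic-ai", "https://example.com/article"))  # False
print(parser.can_fetch("ClaudeBot", "https://example.com/article"))     # True
```

The fix on the site side is equally mechanical: add a `User-agent: ClaudeBot` / `Disallow: /` entry, and keep doing so for every newly named crawler, which is exactly the maintenance burden the report describes.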









