
New Cloudflare Tools Let Sites Detect and Block AI Bots for Free

According to Dark Visitors founder Gavin King, most of the major AI agents still abide by robots.txt. “That’s been pretty consistent,” he says. But not all website owners have the time or knowledge to constantly update their robots.txt files. And even when they do, some bots will skirt the file’s directives: “They try to disguise the traffic.”
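The directives in question are simple to write but tedious to maintain. As a rough sketch, a robots.txt that opts out of several well-known AI crawlers (the user-agent tokens below are ones their operators have published; any such list goes stale as new bots appear) might look like this:

```
# Block common AI training crawlers by their published user-agent tokens
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# All other crawlers remain allowed
User-agent: *
Allow: /
```

The catch, as King notes, is that compliance is voluntary: a bot that disguises its traffic simply ignores the file.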

Prince says Cloudflare’s bot-blocking won’t be a command that this kind of bad actor can ignore. “Robots.txt is like putting up a ‘no trespassing’ sign,” he says. “This is like having a physical wall patrolled by armed guards.” Just as it flags other kinds of suspicious web behavior, such as price-scraping bots used for illegal price monitoring, the company has built processes to spot even the most carefully concealed AI crawlers.

Cloudflare is also announcing a forthcoming marketplace for customers to negotiate scraping terms of use with AI companies, whether it involves payment for using content or bartering for credits to use AI services in exchange for scraping. “We don’t really care what the transaction is, but we do think that there needs to be some way of delivering value back to original content creators,” Prince says. “The compensation doesn’t have to be dollars. The compensation can be credit or recognition. It can be lots of different things.”

There’s no set date to launch that marketplace, but even if it rolls out this year, it will join an increasingly crowded field of projects intended to facilitate licensing agreements and other permissions arrangements between AI companies, publishers, platforms, and other websites.

What do the AI companies make of this? “We’ve talked to most of them, and their reactions have ranged from ‘this makes sense and we’re open’ to ‘go to hell,’” says Prince. (He wouldn’t name names, though.)

The project came together on a fairly quick turnaround. Prince cites a conversation with Atlantic CEO (and former WIRED editor in chief) Nick Thompson as its inspiration; Thompson had described how many different publishers had encountered surreptitious web scrapers. “I love that he’s doing it,” Thompson says. If even big-name media organizations struggled to deal with the influx of scrapers, Prince reasoned, independent bloggers and website owners would have even more difficulty.

Cloudflare has been a leading web security firm for years, and it provides a large portion of the infrastructure holding up the web. It has historically remained as neutral as possible about the content of the websites it services; on the rare occasions it has made exceptions to that rule, Prince has emphasized that he doesn’t want Cloudflare to be the arbiter of what’s allowed online.

Here, he sees Cloudflare as uniquely positioned to take a stand. “The path we’re on isn’t sustainable,” Prince says. “Hopefully we can be a part of making sure that humans get paid for their work.”
