Publishers urged to embrace future where bot readers provide majority of revenue
AI agents and bots will become the “primary” revenue source for the publisher websites they visit, the co-founders of AI monetisation company Tollbit believe.
“It’s not happening overnight,” Olivia Joslin said. “It’s not going to make their year this year in terms of the revenue that’s flowing through the pipes. But this is certainly going to be the primary mode of revenue. It’s going to have to come from AI visitors.”
Tollbit provides technology which enables free-to-access websites to block AI access, track scraping and set rules and payment terms for bot traffic. It is currently said to be installed on more than 7,000 publisher websites.
According to data gathered from the various websites Tollbit works with, by the end of 2025 there was a ratio of one AI bot visit to a publisher website for every 31 human visits. This was up from one bot per 50 human visits in Q2 and one per 200 at the start of 2025.
Press Gazette has reported on the growing view among many publishers that they should block AI training bots from visiting their websites without a licensing deal.
But studies have shown that bot-blocking is largely ineffective if AI companies are determined to scrape publisher content.
SEO consultant Barry Adams said: “It’s become absolutely necessary for publishers to block ALL bots by default – ideally at the CDN layer [the server network which distributes internet content around the world] – and only allow access to specified bots that deliver a positive value exchange (like Googlebot).”
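The default-deny posture Adams describes can be signalled in robots.txt, though, as the article goes on to note, robots.txt is advisory only and determined scrapers ignore it; real enforcement has to happen at the CDN or server layer. A minimal illustrative file (Googlebot is a real crawler user agent, but the exact allowlist is an assumption for the sketch):

```text
# Illustrative robots.txt: deny all crawlers by default,
# then allowlist specific bots that deliver a positive value exchange.
# Note: robots.txt is advisory only; non-compliant scrapers ignore it.
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /
```

In practice this file only sets policy; publishers pair it with CDN-layer rules that actually refuse requests from non-allowlisted user agents.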
Publisher attitudes to AI have shifted
Tollbit co-founders Toshit Panigrahi and Joslin told Press Gazette that publisher attitudes shifted last year, reaching a tipping point in the autumn. They said most now agree that the future is in monetising retrieval-augmented generation, or RAG, which is the process through which generative AI models retrieve and reference new information from the web in real time.
Panigrahi said: “I do think that the conversation has shifted from training to go to RAG. I think they understand the unit economics are going to be micro transactions at huge volumes,” essentially meaning a paywall for bots. Last month Tollbit’s bot monetisation technology was added for publishers that use ArcXP, the publishing platform developed by the Washington Post.
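The "paywall for bots" idea – micro-transactions at huge volume – can be sketched in a few lines. Everything below is a hypothetical illustration: the agent names are real crawler user agents, but the handler, the rate and the flow are invented for the sketch, not Tollbit's actual API.

```python
# Hypothetical sketch of a per-request paywall for bots. The bot list,
# price and return shape are illustrative assumptions, not a real API.

KNOWN_AI_AGENTS = {"GPTBot", "ClaudeBot", "PerplexityBot"}
PRICE_PER_FETCH_USD = 0.005  # illustrative micro-transaction rate

def handle_request(user_agent: str, paid: bool) -> tuple[int, float]:
    """Return (HTTP status, charge) for an incoming fetch.

    Unrecognised agents (human traffic) pass through free; known AI
    agents are charged per fetch, or refused with 402 if unlicensed.
    """
    if user_agent not in KNOWN_AI_AGENTS:
        return 200, 0.0                   # human traffic: serve normally
    if paid:
        return 200, PRICE_PER_FETCH_USD   # licensed bot: serve and meter
    return 402, 0.0                       # unlicensed bot: Payment Required

print(handle_request("Mozilla/5.0", False))  # → (200, 0.0)
print(handle_request("GPTBot", True))        # → (200, 0.005)
print(handle_request("GPTBot", False))       # → (402, 0.0)
```

The economics Panigrahi describes fall out of the middle branch: tiny per-fetch charges that only become meaningful revenue at very large request volumes.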
Panigrahi said: “There is a growing audience on the internet. It doesn’t even show up in your logs. You don’t know when it accesses your site in most cases. And yet, it’s going to be the primary reader of your site in the future.
“For your human audiences, you have paywalls, and you have subscriptions, and you have ads and you have affiliate links. You have many ways to monetise your human business. But if this is the biggest visitor in the future, this autonomous visitor – and the words every year are going to change, you can call it a bot, an agent, an AI app, whatever it is, it’s an autonomous visitor, there’s not a human on the other side – how are you going to have a value exchange?
“You don’t have any way to monitor them. You have no way to grant them access. You have no way to have a value exchange. That’s where we made them understand that this is actually a new audience on the internet. Yes, it’s a copyright problem, yes, it’s perhaps a licensing problem, but it’s an audience problem. That’s what we have to solve for first. And when you look at that, the way you approach the problem starts changing.”
Bot-blocking cybersecurity measures ‘losing game’
Panigrahi described bot-blocking cybersecurity as “in the long run, a losing game. It creates a wrong incentive structure, because when you try to block the bots, the only thing you incentivise them to do is get better at bypassing your defences and it ends up in an endless cat-and-mouse game.”
The latest Tollbit State of the Bots report found that, of almost 40 companies offering website scraping services, many “explicitly advertise cybersecurity detection evasion techniques and many do not default to abiding by robots.txt”.
Panigrahi said publishers find it “consistently stunning” to discover how quickly their content can be accessed and scraped in full, whether or not it is protected by cybersecurity layers or a paywall.
Tactics include “proxy networks, residential IP addresses, headless browsers, spoofing of referrals to try to trick the cybersecurity tool into letting them through”.
Publishers are currently “coming to terms” with this idea after many “did invest heavily” in cybersecurity in 2025. They have found it “disheartening” to find out how easily their content can still be scraped, Panigrahi said.
He added: “What we’ve been saying to the industry for a while is that we’re not saying rip out your cybersecurity. Making it take longer to scrape that article actually buys you some time. The fact that it takes four seconds to 64 seconds and is unreliable and you have to retry a few times is actually good.
“It means that it allows someone like us to create that Spotify-like model. What we can do is we can go to the AI company, to the Fortune 500 companies, and say, hey, Tollbit’s a way that you can get reliable access in, you know, 100 milliseconds. It’ll always be reliable. It’s first-party access. You’ll never get blocked. Yes, it’ll cost you a little bit of money, but it comes with a licence, and that’s a cherry on top. You don’t have to go scraping this content.”
Joslin argued that micro-transactions are better for publishers than one-off deals with OpenAI and other AI companies.
“And on the other side too, it’s not just going to be the big OpenAIs and Googles of the world that are going to need to access your content. There’s a long tail of specialised agent search applications and things being built on top of these core models.
“And if you’re a publisher, it’s not worth doing a deal with a small developer that’s trying to do the right thing. Maybe they’ve raised $2m, maybe they have a team of five people. They don’t have bandwidth to do deals with all of the publishers, and it’s not worth the publishers’ time to do the direct negotiations.”
What types of content do AI bots want to read?
A few trends have emerged in terms of the type of content that is most attractive to AI visitors.
Panigrahi cited “things that are live and updating” such as sports results, foreign currency exchange rates, crypto market rates, weather and deals/coupons.
But the four factors that weigh into whether content is in demand from AI or can command a premium, according to Panigrahi, are: uniqueness, whether it is paywalled, strength of brand and freshness.
He said: “The more unique the content, the more irreplaceable the content, the more you can command for it, which makes sense. Local news actually commands a premium in that regard. Niche B2B publishers command a premium because they are very irreplaceable content – an industry report PDF, for example, falls into that same bucket.”
He continued: “How easy is it to get to the article? If it’s paywalled, it commands a premium…
“All things being equal, if it’s a marquee publisher, it usually commands a premium because people will trust that source more.”
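The four factors Panigrahi lists – uniqueness, paywall status, brand strength and freshness – suggest a simple pricing heuristic. The sketch below is purely illustrative: the weights, scale and function name are invented for this example, not a real Tollbit formula.

```python
# Hypothetical scoring of the four premium factors described above.
# Weights and the 0-1 scale are invented assumptions for illustration.

def content_premium(unique: float, paywalled: bool,
                    brand: float, freshness: float) -> float:
    """Return a premium score in roughly [0, 1].

    unique, brand and freshness are each rated 0..1; the paywall
    flag adds a flat premium, reflecting harder-to-reach content.
    """
    score = 0.4 * unique + 0.2 * brand + 0.2 * freshness
    if paywalled:
        score += 0.2
    return round(score, 3)

# A unique, paywalled niche B2B article from a mid-strength brand:
print(content_premium(unique=0.9, paywalled=True,
                      brand=0.5, freshness=0.8))  # → 0.82
```

Uniqueness carries the largest weight here to mirror Panigrahi's point that irreplaceable content – local news, niche B2B reporting, industry report PDFs – commands the biggest premium.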
Some SEO tactics won’t work for LLMs
Panigrahi acknowledged this is “not that different” from traditional SEO tactics but said some established tactics would be counterproductive with LLMs.
He cited as an example one publisher writing 40 articles about Donald Trump every day.
“It actually might hurt you in this new ecosystem, because when a bot needs to read an article, does it need all ten articles about Trump from ten different publishers? Maybe not. It maybe could do with five, maybe could do with four. So the more replaceable that content becomes, if it’s commoditised, the more it hurts publishers. So this actually incentivises more unique long tail content, instead of just optimising for safe keywords.”
Panigrahi also noted that the AI monetisation challenge “goes beyond publishers now”, highlighting e-commerce companies who realised the significance to their businesses of AI agents that will be able to order food or book restaurant tables for people. This will ultimately mean fewer humans looking at online adverts.
“It is actually the same problem that publishers face,” he continued. “There’s a new visitor. You can’t really block them. They’re disintermediating you from your human visitors, and the more you try to block them, the better they get at just scraping the content…
“By the end of 2026, if we do our job right, I think everyone – from publishers to regulators to some of the other verticals – will understand that this is an entire AI internet economy problem. This is not just publishers complaining about technology.”