Building on an anti-spam cybersecurity tactic known as tarpitting, he created Nepenthes, malicious software named after a carnivorous plant that will “eat just about anything that finds its way inside.”
Aaron clearly warns users that Nepenthes is aggressive malware. It’s not to be deployed by site owners uncomfortable with trapping AI crawlers and sending them down an “infinite maze” of static files with no exit links, where they “get stuck” and “thrash around” for months, he tells users. Once trapped, the crawlers can be fed gibberish data, aka Markov babble, which is designed to poison AI models. That’s likely an appealing bonus feature for any site owners who, like Aaron, are fed up with paying for AI scraping and just want to watch AI burn.
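The mechanics are simple enough to sketch. What follows is a rough illustration of the general tarpit idea, not Nepenthes's actual code: it assumes a plain Python standard-library HTTP server and a toy word list standing in for real Markov babble. Every URL under the trap returns a slow-loading page of gibberish whose only links point deeper into the same maze.

```python
# Rough sketch of the tarpit idea (not Nepenthes itself): every URL under the trap
# returns a slow, gibberish-filled page whose only links lead deeper into the maze,
# so a crawler that ignores robots.txt never finds an exit.
import hashlib
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy vocabulary; real Markov babble would be generated from an actual text corpus.
WORDS = "lorem ipsum dolor sit amet consectetur adipiscing elit sed do eiusmod".split()

def babble(seed: str, n_words: int = 200) -> str:
    """Deterministic gibberish, so the same URL always serves the same page."""
    rng = random.Random(hashlib.sha256(seed.encode()).hexdigest())
    return " ".join(rng.choice(WORDS) for _ in range(n_words))

class Tarpit(BaseHTTPRequestHandler):
    def do_GET(self):
        # Links are derived from the request path, so the maze is endless but stable.
        rng = random.Random(hashlib.sha256(self.path.encode()).hexdigest())
        links = " ".join(
            f'<a href="{self.path.rstrip("/")}/{rng.randrange(10**8):08d}">more</a>'
            for _ in range(10)
        )
        body = f"<html><body><p>{babble(self.path)}</p>{links}</body></html>"
        time.sleep(2)  # drip-feed the response to waste the crawler's time
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("", 8080), Tarpit).serve_forever()
```

Seeding the gibberish and the links from the URL keeps every page stable across visits, and the deliberate delay is what turns a junk page into a genuine tarpit rather than just more load.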
Notice how it’s “AI haters” and not “people trying to protect their IP,” as it would be if it were, say, China stealing the IP instead of AI companies.
They’re framing it as “AI haters” instead of what it actually is, which is people who do not like that robots have been programmed to completely ignore the robots.txt files on a website.
No AI system in the world would get stuck in this if it simply obeyed the robots.txt files.
The disingenuous phrasing is like saying “pro-life” instead of what it is: “anti-choice.”
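To the point above about robots.txt: staying out of a trap like this takes a single standard-library check. Here’s a minimal sketch, assuming Python’s built-in urllib.robotparser and a hypothetical site that disallows its /nepenthes/ path:

```python
# Minimal sketch of the check a well-behaved crawler performs before fetching.
# The site URL, user agent, and /nepenthes/ path are hypothetical examples.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the site's robots.txt

url = "https://example.com/nepenthes/some-trapped-page"
if rp.can_fetch("ExampleBot/1.0", url):
    print("allowed, fetching:", url)
else:
    print("disallowed by robots.txt, skipping:", url)  # the maze is never entered
```

Crawlers that skip that one check are exactly the ones the maze is built to catch.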
I hope it’s effective.
It’s not. If it were, every search engine out there would go belly-up at the first nested link.
Google/Bing just consume their own crawling traffic. You don’t want to NOT show up in search queries right?
You don’t want to NOT show up in search queries right?
At this point?
I am fully OK with NOT being in search engines for any of my sites. Organic traffic has always been much more valuable than inorganic traffic.
So instead of the AI wasting your resources and money by ignoring your robots.txt, you’re going to waste your own resources and money by inviting its crawlers to increase the load on your server, and make it permanent and nonstop. Brilliant. Hey, even better, you should host your site on something that charges based on usage; that’ll really show the AI makers who’s boss. 🤣
It’s already permanent and nonstop. They’re known to ignore robots.txt and to drop or change their user-agent string once it gets detected.
And the goal is not only to prevent resource abuse, but to break a predatory model.
But feel free to keep gracefully doing nothing while others take action; it’s bound to help eventually.
Hey, you don’t need to convince me; you’ve clearly already committed to bravely sacrificing your own time and money in this valiant fight. Go get ‘em, tiger! I look forward to the articles about AI being stopped coming out any day now.