Building on an anti-spam cybersecurity tactic known as tarpitting, he created Nepenthes, malicious software named after a carnivorous plant that will "eat just about anything that finds its way inside."

Aaron clearly warns users that Nepenthes is aggressive malware. It's not to be deployed by site owners uncomfortable with trapping AI crawlers and sending them down an "infinite maze" of static files with no exit links, where they "get stuck" and "thrash around" for months, he tells users. Once trapped, the crawlers can be fed gibberish data, aka Markov babble, which is designed to poison AI models. That's likely an appealing bonus feature for any site owners who, like Aaron, are fed up with paying for AI scraping and just want to watch AI burn.
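The description above is enough to sketch the core idea, so here is a minimal, illustrative Python stand-in (not Nepenthes itself): every request under a made-up /maze/ path returns deterministic gibberish plus links that lead only deeper into the maze, so a crawler that follows links never finds an exit. The word list, path scheme, and link count are placeholder assumptions; a real tarpit would serve far more convincing Markov text.

```python
# Minimal tarpit sketch (illustrative, NOT Nepenthes): every /maze/ URL
# returns gibberish plus links to more /maze/ URLs, with no exit links.
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder vocabulary; a real tarpit would use Markov-chain text instead.
WORDS = ["lorem", "ipsum", "dolor", "sit", "amet", "nepenthes", "pitfall"]

def babble(n_words=80):
    """Cheap stand-in for Markov babble: a random word salad."""
    return " ".join(random.choice(WORDS) for _ in range(n_words))

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Seed the RNG from the path so the same URL always returns the
        # same page; re-fetching a URL won't reveal that it's generated.
        random.seed(self.path)
        links = "".join(
            f'<a href="/maze/{random.getrandbits(32):08x}">more</a> '
            for _ in range(10)
        )
        body = f"<html><body><p>{babble()}</p>{links}</body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TarpitHandler).serve_forever()
```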

[–] digdilem@lemmy.ml 26 points 16 hours ago

It's not that we "hate them" - it's that they can entirely overwhelm a low-volume site and effectively DDoS it.

I ran a few very low-traffic websites for local interests on a rural residential line. It wasn't fast, but it was cheap, and as these sites made no money it was good enough. Before AI, they'd get the odd badly behaved scraper that ignored robots.txt, and specifically the rate limits it set.

But since? I've had to spend a lot of time trying to filter them out upstream. Like, hours and hours. Claudebot was the first - coming from hundreds of AWS IPs and dozens of countries, thousands of times an hour, repeatedly trying to download the same URLs - some that didn't exist. Since then it's happened a lot. Some of these tools are just ridiculously stupid, far more so than a dumb script that cycles through a list. But because it's AI and they're desperate to satisfy the "need for it", they're quite happy to spend millions on AWS costs for negligible gain and screw things up for everyone else.
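("Filtering upstream" here means at the firewall, reverse proxy, or CDN, before requests ever hit the site. As a rough application-level sketch of the same idea, the Python WSGI middleware below rejects requests by user agent. The bot list is illustrative and incomplete - ClaudeBot and GPTBot are real crawler names, but plenty of scrapers send fake or generic agents, which is exactly why it ends up taking hours of IP-level work instead.)

```python
# Sketch of user-agent filtering at the application layer (illustrative;
# upstream filtering at a proxy/CDN is more robust against misbehaving bots).
BLOCKED_AGENTS = ("claudebot", "gptbot", "ccbot")  # illustrative, incomplete

def block_ai_crawlers(app):
    """Wrap a WSGI app and return 403 to requests from listed crawlers."""
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        if any(bot in ua for bot in BLOCKED_AGENTS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Crawling not permitted.\n"]
        return app(environ, start_response)
    return middleware

# Usage (illustrative): wrap your existing WSGI app before serving it,
# e.g. application = block_ai_crawlers(application)
```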

Eventually I gave up and redesigned the sites to be static, and they're now on Cloudflare Pages. Arguably better, but a chunk of my life I'd rather not have lost.