this post was submitted on 29 Jan 2025
595 points (96.7% liked)

Building on an anti-spam cybersecurity tactic known as tarpitting, he created Nepenthes, malicious software named after a carnivorous plant that will "eat just about anything that finds its way inside."

Aaron clearly warns users that Nepenthes is aggressive malware. It's not to be deployed by site owners uncomfortable with trapping AI crawlers and sending them down an "infinite maze" of static files with no exit links, where they "get stuck" and "thrash around" for months, he tells users. Once trapped, the crawlers can be fed gibberish data, aka Markov babble, which is designed to poison AI models. That's likely an appealing bonus feature for any site owners who, like Aaron, are fed up with paying for AI scraping and just want to watch AI burn.
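For a sense of how such a tarpit works, here is a minimal sketch in Python (not Nepenthes' actual code; the word list, slug length, and server setup are invented for illustration): every request, whatever the path, returns a freshly generated page of gibberish plus ten links to more generated pages.

```python
# Minimal tarpit sketch, NOT Nepenthes itself: each request gets freshly
# generated filler text plus ten links to more generated pages. All names,
# the word list, and the slug length are invented for this example.
import random
import string
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = ["lorem", "ipsum", "dolor", "sit", "amet", "nepenthes", "pitcher"]

def babble(n_words=200):
    # Stand-in for real Markov babble: random words strung together.
    return " ".join(random.choice(WORDS) for _ in range(n_words))

def random_link():
    # A long random path makes accidental collisions vanishingly unlikely.
    slug = "".join(random.choices(string.ascii_lowercase, k=24))
    return f'<a href="/{slug}">{slug}</a>'

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = "<html><body><p>{}</p>{}</body></html>".format(
            babble(), "<br>".join(random_link() for _ in range(10))
        ).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)  # a real tarpit would also drip this out slowly

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TarpitHandler).serve_forever()
```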

[–] Crassus@feddit.nl 8 points 1 day ago (1 children)

It can detect cycles. From a quick look at the demo of this tool, it (slowly) generates some garbage text and then places 10 random links, each leading to a newly generated page. So although generating the same link twice is bound to happen eventually, the chance that all 10 links on a given page have already been generated before is small.
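A quick back-of-the-envelope check of that last point (the 50% per-link collision rate below is an assumed figure for illustration, not something measured from the demo):

```python
# Sanity-checking the argument: even if each individual generated link had,
# say, a 50% chance of colliding with one seen before (an assumed number),
# the chance that ALL ten links on a page are repeats -- i.e. that the page
# is a dead end -- would still be tiny.
p_single_repeat = 0.5
p_page_is_dead_end = p_single_repeat ** 10
print(p_page_is_dead_end)  # 0.0009765625, roughly one page in a thousand
```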

[–] LovableSidekick@lemmy.world -5 points 1 day ago* (last edited 1 day ago) (2 children)

I would simply add each link to a visited list and never revisit any of them. And that's just simple web crawler logic, not even AI. Web crawlers that avoid problems like that are beginner/intermediate computer science homework.
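A minimal sketch of that visited-list logic, with the actual fetching stubbed out by a hypothetical fetch_links() helper:

```python
# Toy sketch of the "never revisit" crawler described above; fetch_links() is
# a hypothetical stand-in for real HTTP fetching and HTML link extraction.
from collections import deque

def fetch_links(url):
    # Placeholder: a real crawler would download the page and extract hrefs.
    return []

def crawl(seed_url, max_pages=10_000):
    visited = set()
    queue = deque([seed_url])
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue          # the de-duplication step the comment describes
        visited.add(url)
        for link in fetch_links(url):
            if link not in visited:
                queue.append(link)
    return visited
```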

[–] dev_null@lemmy.ml 10 points 21 hours ago (1 children)

There are no loops or repeated links to avoid. Every link leads to a brand new, freshly generated page with another set of brand new, never-before-seen links. You can go deeper and deeper forever without hitting any loops.

[–] LovableSidekick@lemmy.world 1 points 4 hours ago

You can limit the number of visits per domain. The honeypot doesn't register infinitely many new domains.
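A sketch of that kind of per-domain cap (the 1,000-page budget is an arbitrary example, not anything the honeypot or any particular crawler actually uses):

```python
# Sketch of a per-domain visit budget: once a host has used up its allowance,
# further URLs on it are skipped. The 1,000-page cap is an arbitrary example.
from collections import Counter
from urllib.parse import urlparse

MAX_PAGES_PER_DOMAIN = 1_000
pages_per_domain = Counter()

def should_visit(url):
    domain = urlparse(url).netloc
    if pages_per_domain[domain] >= MAX_PAGES_PER_DOMAIN:
        return False  # tarpit pages all share one domain, so this trips fast
    pages_per_domain[domain] += 1
    return True
```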

[–] vrighter@discuss.tchncs.de 2 points 19 hours ago (1 children)

Sure, if you have enough memory to store a list of all the GUIDs.

[–] LovableSidekick@lemmy.world 1 points 4 hours ago

It doesn't have to memorize all possible GUIDs, it just has to limit visits per base URL.