this post was submitted on 21 Aug 2025
206 points (89.3% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.


Some thoughts on how useful Anubis really is. Combined with comments I read elsewhere about scrapers starting to solve the challenges, I'm afraid Anubis will be outdated soon and we'll need something else.

[–] non_burglar@lemmy.world 25 points 1 day ago (2 children)

I had to get my glasses to re-read this comment.

You know why Anubis is in place on so many sites, right? You are literally blaming the victims for the absolute bullshit AI is foisting on us all.

[–] ryannathans@aussie.zone 2 points 1 day ago (1 children)

Yes, I manage Cloudflare for a massive site that at times gets hit with millions of unique bot visits per hour.

[–] non_burglar@lemmy.world 5 points 1 day ago (1 children)

So you know that this is the lesser of two evils, then? Seems like you're viewing it from the client's perspective only.

No one wants to burden clients with Anubis, and Anubis shouldn't have to exist. We are all (server operators and users) stuck with this solution for now because there is nothing else at the moment that keeps these scrapers at bay.

Even the author of Anubis doesn't like the way it works. We all know it's just more wasted computation, for no reason except that big tech doesn't care about anyone.

[–] ryannathans@aussie.zone 3 points 1 day ago

My point, and the author's point, is that it's not the computation that's keeping the bots away right now. It's the obscurity of the challenge itself getting in the way.

[–] billwashere@lemmy.world -5 points 1 day ago (1 children)

I don't think so. I think he's criticizing the "solution" as a stopgap at best and painful for end users at worst. Yes, the AI crawlers have caused the issue, but I'm not sure this is a great final solution.

As the article discussed, this is essentially "an expensive" math problem meant to deter AI crawlers, but in the end it ain't really that expensive. It's more like putting two door handles on a door and hoping the bots are too lazy to turn both of them, while also severely slowing down all one-handed people. I'm not sure it will ever be feasible to have one bot determine whether the other end is also a bot without human interaction.
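For concreteness, here's a minimal sketch of the hashcash-style proof-of-work idea being described: the client brute-forces a nonce until a hash meets a difficulty target, and the server only needs one hash to verify. This is not Anubis's actual code; the challenge string, difficulty, and SHA-256 leading-zero criterion are illustrative assumptions.

```go
// Minimal hashcash-style proof-of-work sketch (illustrative, not Anubis's code).
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strconv"
	"strings"
)

// solve brute-forces a nonce until sha256(challenge + nonce) starts with
// `difficulty` zero hex characters. The client burns CPU here; the server
// re-hashes once to verify the submitted nonce.
func solve(challenge string, difficulty int) (nonce int, hash string) {
	prefix := strings.Repeat("0", difficulty)
	for {
		sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
		hash = hex.EncodeToString(sum[:])
		if strings.HasPrefix(hash, prefix) {
			return nonce, hash
		}
		nonce++
	}
}

func main() {
	// Each extra hex digit of difficulty multiplies the expected work by 16,
	// for legitimate browsers and scrapers alike.
	nonce, hash := solve("example-challenge-token", 4)
	fmt.Printf("nonce=%d hash=%s\n", nonce, hash)
}
```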

[–] ryannathans@aussie.zone 1 points 1 day ago* (last edited 1 day ago)

It works because it's a bit of obscurity, not because it's expensive. Once it's a big enough problem for the scrapers, they will adapt, and then the only options are to make it more obscure/different or to crank up the difficulty, which will slow down genuine users much more.
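To put rough numbers on that trade-off: with a leading-zero-bits target, expected attempts grow as 2^bits, so every added bit doubles the work for bots and genuine users alike. The hash rate below is an assumed figure for illustration only, not a measured one.

```go
// Rough illustration of why cranking up proof-of-work difficulty hurts real
// users: expected attempts double with every added bit of difficulty.
package main

import "fmt"

func main() {
	const hashesPerSecond = 2_000_000 // assumed in-browser hash rate, illustrative only
	for bits := 16; bits <= 24; bits += 4 {
		expected := float64(uint64(1) << bits) // expected attempts ~ 2^bits
		fmt.Printf("%2d bits: ~%.0f attempts, ~%.2f s\n",
			bits, expected, expected/hashesPerSecond)
	}
}
```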