
Some thoughts on how useful Anubis really is. Combined with comments I've read elsewhere about scrapers starting to solve the challenges, I'm afraid Anubis will be outdated soon and we'll need something else.

[–] daniskarma@lemmy.dbzer0.com 1 points 9 hours ago* (last edited 9 hours ago) (1 children)

I'm against it for several reasons. It runs unauthorized heavy-duty code on your end: it's not JS needed to make the site functional, it's heavy computation run unprompted. If they added a simple "click to run challenge" button, it would at least be more polite and less "malware-like".

On some old devices the challenge lasts over 30 seconds; I can type a captcha in less time than that.

It puts sites that people (like the article author) tend to browse directly from a terminal behind the requirement of using a full browser.

It's a delusion. As the article author shows, solving the PoW challenge is not that much of an added cost. Spam reduction would be the same with any other novel method; crawlers are simply not prepared for it yet. Any prepared crawler would have no issues whatsoever. People are seeing results because of obscurity, not because it really works as advertised. In fact, I believe some sites are already getting crawled aggressively despite Anubis, as some crawlers are catching up with the new Anubis trend. A sketch of what a prepared crawler has to do is shown below.
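
To illustrate how little work a prepared crawler faces, here is a minimal sketch assuming a SHA-256 "leading zero bits" style challenge similar in shape to Anubis's PoW. The challenge string, difficulty, and response format are illustrative assumptions, not Anubis's actual protocol; the point is only that a naive, single-threaded native loop handles a low difficulty quickly, without any browser at all.

```python
import hashlib
import time


def solve_pow(challenge: str, difficulty_bits: int) -> int:
    """Brute-force a nonce so that SHA-256(challenge + nonce) starts with
    `difficulty_bits` zero bits (illustrative Anubis-like challenge shape)."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        # Interpret the digest as a big integer and test its top bits.
        if int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0:
            return nonce
        nonce += 1


if __name__ == "__main__":
    start = time.time()
    nonce = solve_pow("example-challenge-string", 16)  # ~65k hashes expected
    print(f"nonce={nonce}, solved in {time.time() - start:.3f}s")
```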

Bear in mind that the challenge needs to be light enough that a legitimate user can enter the website within a few seconds while solving it in a browser engine (very inefficient). A crawler interested in your site could easily put together a solver that mines the PoW with CUDA on a GPU, which would be hundreds if not thousands of times more efficient. So the balance of difficulty (still browsable for users but costly to crawl) is not feasible; see the back-of-the-envelope numbers below.
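
To make the asymmetry concrete, here is a back-of-the-envelope calculation. The hash rates are assumptions picked for illustration (a JS engine on an old device vs. a single modern GPU), not measurements of any specific deployment:

```python
# Expected work for a "d leading zero bits" challenge is about 2**d hashes.
difficulty_bits = 16
expected_hashes = 2 ** difficulty_bits          # ~65k hashes on average

browser_rate = 50_000            # hashes/s in a JS engine on old hardware (assumption)
gpu_rate = 5_000_000_000         # hashes/s on one modern GPU (assumption)

print(f"browser: ~{expected_hashes / browser_rate:.2f} s per page")      # ~1.3 s
print(f"gpu:     ~{expected_hashes / gpu_rate * 1e6:.1f} µs per page")   # ~13 µs
```

Any difficulty low enough to keep the browser case tolerable leaves the GPU case essentially free.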

It's not universally applicable. Imagine if the whole internet were behind PoW challenges. It would be like constant Bitcoin mining, a total waste of resources.

The company behind Anubis seems shadier to me each day. They feed on anti-AI paranoia, they didn't even answer the article author's valid criticisms when he emailed them, and they use clearly PR-oriented language aimed at convincing and pleasing certain demographics in order to place their product. They are full of slogans but lack substance. I just don't trust them.

[–] Dremor@lemmy.world 2 points 8 hours ago* (last edited 8 hours ago)

Fair point. I do agree with the "click to execute challenge" approach.

As for terminal browsers, that has more to do with them not respecting web standards than with Anubis not working on them.

As for old hardware, I do agree that a timed delay could be a good idea, if it weren't so easy to circumvent. In that case bots would just wait in the background and resume once the timer elapses, which would vastly decrease Anubis's effectiveness, since waiting costs them very little. There isn't really much that can be done here.

As for the CUDA solution, that depends on the hash algorithm used. Some of them (like the one used by Monero) are designed to be vastly less efficient on a GPU than on a CPU. Moreover, GPU servers are far more expensive to run than CPU ones, so the result would be the same: crawling would be more expensive.
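
As a rough sketch of that idea, the PoW core could be swapped for a memory-hard function; here I use Python's stdlib scrypt purely as an illustration. This is not how Anubis is implemented, and the parameters are assumptions: with n=2**14 and r=8, each attempt needs roughly 16 MiB of RAM, which blunts the GPU advantage compared with plain SHA-256, at the cost of also making the check slower in the user's browser.

```python
import hashlib


def memory_hard_check(challenge: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Check one nonce against a memory-hard challenge (illustrative sketch).

    scrypt with n=2**14, r=8 requires ~16 MiB per attempt, so cheap massively
    parallel GPU mining becomes much less attractive than with SHA-256.
    """
    digest = hashlib.scrypt(
        nonce.to_bytes(8, "big"),
        salt=challenge,
        n=2 ** 14, r=8, p=1, dklen=32,
    )
    return int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0
```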

In any case, the best solution by far would be to make respecting robots.txt a legal requirement, but for now legislators prefer to look the other way.