this post was submitted on 03 Mar 2024
Fediverse
A community to talk about the Fediverse and all its related services using ActivityPub (Mastodon, Lemmy, KBin, etc.).
you are viewing a single comment's thread
Somebody put up a site saying:
There is an extreme amount of hostility from a certain segment of the (mostly Mastodon-using) Fediverse community toward anything that does anything with Fediverse content "without consent". Trouble is, there's no machine-readable mechanism for determining what people have consented to in most cases, and certainly no standard for it.
If your computer sends my computer an image and some text via ActivityPub, without any further communication, may I...
Some of those things are what Mastodon does normally, but they could be understood as copyright violations because the protocol doesn't transmit any licensing information. Others, like search indexing, are almost certainly legal; the protocol is silent about them, but a few people will get very angry at anyone who visibly handles them differently from Mastodon. Meanwhile, how many people are quietly running servers with search indexes who aren't even aware of Mastodon's new opt-in/opt-out search features?
Pixelfed has started attaching licenses to content, but I think we might need more sophisticated, machine-readable licenses.
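As a sketch of what a machine-readable consent check could look like on the receiving side: the `license` field and the permission set below are illustrative assumptions, not an existing ActivityPub standard, and this is not how Pixelfed actually encodes its licenses.

```python
# Hypothetical sketch: the "license" field and the set of URIs below are
# illustrative assumptions, not part of the ActivityPub spec.

# Licenses this (imaginary) server treats as permitting search indexing.
INDEXABLE_LICENSES = {
    "https://creativecommons.org/licenses/by/4.0/",
    "https://creativecommons.org/publicdomain/zero/1.0/",
}

def may_index(activity_object: dict) -> bool:
    """Index only objects that explicitly opt in via a recognized license."""
    return activity_object.get("license") in INDEXABLE_LICENSES

note = {
    "type": "Note",
    "content": "hello fediverse",
    "license": "https://creativecommons.org/licenses/by/4.0/",
}
```

The point of the sketch is the default: an object with no license field gets the most restrictive treatment, which is the opposite of what happens today, where absence of metadata is effectively treated as "do whatever Mastodon does".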
If I'm reading this comment right, it's relying on a mistaken understanding of robots.txt. It is not an instruction to the server hosting it not to serve certain robots. It's actually a request to any robot crawling the site to limit its own behavior. Compliance is 100% voluntary on the part of the robot.
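To illustrate where compliance actually lives: Python's standard-library `urllib.robotparser` is what a well-behaved crawler uses to check robots.txt before fetching. Nothing in this exchange involves the server enforcing anything; a crawler that skips the check gets the same responses.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt asking all robots to stay out of /private/.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A polite crawler consults the rules before each request...
print(parser.can_fetch("*", "https://example.com/private/photo.jpg"))  # False
print(parser.can_fetch("*", "https://example.com/public/post"))        # True
# ...but nothing stops an impolite crawler from fetching /private/ anyway.
```

In other words, the file expresses the site owner's wishes; honoring them is entirely up to the robot.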
The ability to deny certain requests from servers that self-report running a version of their software with known vulnerabilities would be useful.
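A minimal sketch of that idea, assuming the peer self-reports its software name and version (for example via its NodeInfo document); the software name, version thresholds, and helper functions here are all hypothetical:

```python
# Hypothetical deny-list keyed by self-reported software name: anything
# below the entry's minimum patched version is refused. "examplesoft"
# and its threshold are made up for illustration.
MIN_SAFE_VERSION = {
    "examplesoft": (1, 4, 2),
}

def parse_version(text: str) -> tuple:
    """Parse a dotted version string like '1.4.1' into a comparable tuple.
    Assumes plain dotted integers; real version strings can be messier."""
    return tuple(int(part) for part in text.split("."))

def should_deny(software: str, version: str) -> bool:
    """Deny requests from peers self-reporting a known-vulnerable version."""
    minimum = MIN_SAFE_VERSION.get(software.lower())
    if minimum is None:
        return False  # no rule for this software: allow
    return parse_version(version) < minimum
```

The obvious caveat is that the version is self-reported, so a malicious server can simply lie; this only filters out honest-but-unpatched peers.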