Repeat after me: anything I write on the internet should be treated as public information. If I want to keep any conversation private, I will not post it in a public website.
Fediverse
A community to talk about the Fediverse and all its related services using ActivityPub (Mastodon, Lemmy, KBin, etc.).
If you want help with moderating your own community, head over to !moderators@lemmy.world!
Rules
- Posts must be on topic.
- Be respectful of others.
- Cite the sources used for graphs and other statistics.
- Follow the general Lemmy.world rules.
Learn more at these websites: Join The Fediverse Wiki, Fediverse.info, Wikipedia Page, The Federation Info (Stats), FediDB (Stats), Sub Rehab (Reddit Migration), Search Lemmy
I agree with you, however there are issues with not just privacy but also authenticity. I should be able to post as me, even in public, and have a way to prove it. Nobody else should be posting information as me, if that makes sense.
For that, we should start bringing our own private keys to the server, instead of trusting the server to control everything (see the sketch below).
And if we start doing that, pretty soon we will end up asking ourselves why we need the server in the first place, and we will evolve toward something like what nostr is doing.
I'm all for it.
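To make the bring-your-own-keys idea a bit more concrete, here is a minimal sketch assuming an Ed25519 keypair held on the user's device and Python's cryptography package; it is illustrative only and does not reflect how any existing fediverse or nostr software actually works.

```python
# Illustrative only: the signing key lives on the user's device, so a server
# can relay the post but cannot forge it. All names here are made up.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()   # stays client-side
public_key = private_key.public_key()        # published alongside the profile

post = b"Hello fediverse, this is really me."
signature = private_key.sign(post)           # attached to the post when relayed

# Anyone (reader, relay, archive) can verify authorship without trusting the host.
try:
    public_key.verify(signature, post)
    print("verified: written by the key holder")
except InvalidSignature:
    print("rejected: not signed by this identity")
```

The server then becomes a dumb relay for signed events, which is roughly the model nostr follows (nostr uses Schnorr signatures over secp256k1 rather than Ed25519).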
Sure, but that's already solved on the fediverse by using HTTP Signatures and isn't related to Authorized Fetch.
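For context, the HTTP Signatures mentioned here are the draft-cavage style signatures ActivityPub servers already attach to server-to-server requests: the sender signs a few request headers with its actor's key, and the receiver fetches the public key from the keyId URL to verify them. A simplified sketch, with invented hostnames and key IDs, and without the Digest header a real POST would also cover:

```python
# Rough sketch of building a draft-cavage style Signature header.
# The keyId URL and hostnames below are invented for illustration.
import base64
from datetime import datetime, timezone

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
# The signing string covers the request target plus selected headers.
signing_string = (
    "(request-target): get /users/alice/statuses/1\n"
    "host: remote.example\n"
    f"date: {date}"
)
signature = private_key.sign(
    signing_string.encode(), padding.PKCS1v15(), hashes.SHA256()
)

# The receiving server resolves keyId, rebuilds the same string, and verifies.
signature_header = (
    'keyId="https://my.instance/actor#main-key",'
    'algorithm="rsa-sha256",'
    'headers="(request-target) host date",'
    f'signature="{base64.b64encode(signature).decode()}"'
)
print(signature_header)
```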
To add a bit of important nuance to this idea (particularly given how this argument comes up with regard to Threads): it does not apply to legal rights over your content. That is to say, you should of course treat any information you put out there as out of your control with regard to access, but if somebody tries to claim legal rights over your content, they are probably breaking the law.
Right. Publicly available does not mean in public domain. But the issue here is not of copyright, but merely of gated access.
Seriously. Bobthenazi could just go to an aligned server and make an account Bobthenotzi and boom -- perfectly able to follow whoever he wants.
One more reason to argue that we should drop the idea of "aligned" servers and that we are moving to a future where it is better to charge (small) amounts from everyone instead of depending on (large) donations from a few.
Ideally, a distributed fediverse wouldn't need much in terms of donations because it's a bunch of small instances instead of a few huge ones.
Not the point. The point is that instances that are open to everyone will be open to bad actors as well.
If the mere act of signing up to an instance requires a small payment, you are automatically preventing the absolute majority of spammers, "spray and pray" scammers and channer trolls.
And so, once again, people discover the unsolvable dilemma of DRM.
You can't both publish your data where it can be seen by computers that are not under your control and somehow keep control of that data. Anything that purports to do so is either a temporary bandaid soon to be bypassed or nothing but placebo to begin with.
What I think a lot of conversations about privacy and security on the Fediverse miss is that the Fediverse is radically public.
A protocol that sends everything you share to a long list of servers that haven't been pre-screened and could be anything from a professionally-managed instance of vanilla Mastodon to an ad hoc, informally-specified, bug-ridden, slow implementation of half of ActivityPub running on a jailbroken smart light bulb can only ever be radically public. It's possible to block most interactions with someone you don't want to talk to, but not to reliably prevent them from seeing content you share to anything more than a short list of vetted followers.
There probably isn't any reasonable way to change that while keeping the open federation model, though it's possible to build closed networks on top of ActivityPub for those who want the formats it supports for a curated group. This isn't a problem to be solved in my view, but an inherent reality: the Fediverse is for things you want to make public.
Exactly! The only way that we can make sure that the Internet is not controlled by anyone is to make it available for everyone. If we are fighting for an open internet, we need to understand that this type of thing will be part of the package.
We, by which I mean some loose group of people who want decentralized tools to thrive, should also be building things for secure, private communication, and we are. Matrix, for example, offers strongly end-to-end encrypted federated chat rooms and private messages. It also has a kind of rough UX and, IIRC, resource-intensive server software. We should work toward improving that.
I'm not advocating against privacy at all. I want people to understand as clearly as possible that Mastodon, Lemmy, and anything that works like them isn't private and can't be private while part of an open federated network, so they can decide whether that's a good fit for how they're using it. The block evasion described in the link amounts to just running a server on a domain that isn't blocked, and I imagine any other mitigations bolted onto Mastodon that don't break open federation will be little better.
> tools to thrive should also be building things for secure, private communication.
Sure, but this should not be seen on the same class of software of "social media" or even "the web".
> Matrix [...] has a kind of rough UX and, IIRC, resource-intensive server software. We should work toward improving that.
Except that I get the vibes from the Matrix community that the shit UX is part of the attraction because it does a wonderful job of gatekeeping.
I don't hold out much hope for Matrix working out ever, but perhaps someday someone will use it as inspiration for making something that doesn't suck.
I'm kind of tired of social networks offering even the pretense of privacy. Just loudly proclaim that everything is public but clients can filter out shit you don't wanna see.
That doesn't work for vulnerable minorities. Manually filtering each shitty person after you step in their shit gets old. Coupled with the fact that not shutting down shitty people just means more shitty people are likely to turn up.
It's not sustainable.
I think in this context it's meant on a technical level: as far as the fediverse is concerned, there's not a whole lot instances can do. Anyone can just spin up an instance and bypass blocks unless it works on an allowlist basis (see the sketch below), which is kind of incompatible with the fediverse if we really want to achieve a reasonable amount of decentralization.
I agree that we shouldn't pretend it's safe for minorities: it's not. If you're a minority joining Mastodon or Lemmy or Mbin, you need to be aware that blocking people and instances has limitations. You can't make your profile entirely private like one would do on Twitter or any of Meta's products. It's all public.
You can hide the bad people from the users, but you can't really hide the users from the bad people. You can't even stop people from replying to you on another instance. You can refuse to accept the message on the user's instance, but the other instance can still add comments that don't federate out, which is kind of worse because it can lead to side discussions you have no way of seeing or participating in to defend yourself, and they can be saying a lot of awful things.
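To make the allowlist point concrete, here is a toy comparison (all domains invented): a blocklist lets any freshly registered domain through by default, which is exactly the block evasion being described, while an allowlist closes that hole at the cost of open federation.

```python
# Toy comparison of blocklist vs allowlist federation checks; domains are made up.
BLOCKLIST = {"badactor.example"}
ALLOWLIST = {"friendly.example", "trusted.example"}

def blocklist_allows(domain: str) -> bool:
    # Open federation: anything not explicitly blocked gets through.
    return domain not in BLOCKLIST

def allowlist_allows(domain: str) -> bool:
    # Closed federation: only pre-approved servers get through.
    return domain in ALLOWLIST

# A harasser who spins up a brand-new instance sails past the blocklist...
print(blocklist_allows("badactor-two.example"))   # True
# ...but not past the allowlist, which is what breaks open federation.
print(allowlist_allows("badactor-two.example"))   # False
```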
It's the unfortunate reality. Social networks simply cannot offer privacy. If they were upfront about it, then people could make rational decisions about what they share.
But instead they (including Mastodon) pretend like they can offer privacy, when they in fact cannot, resulting in people sharing things that they would not otherwise share.
It's not as black and white as you make it. The options aren't "perfect security" and "no security".
The option that most people who experience regular harassment want is "enough security to minimise the shit we have to deal with to a level that is manageable, even if it's imperfect".
While you're theoretically right, we've seen in practice that nobody really offers even the imperfect privacy you describe, and on decentralized systems it only becomes harder to solve.
A Facebook-style centralized network, where you explicitly grant access to every single person who can see your content, is as close as we can get. But nobody is trying to make that kind of social network anymore, because there isn't much demand for it.
If you want a soapbox (Twitter/mastodon/bluesky, Reddit/Lemmy/kbin, Instagram/pixelfed, YouTube/tiktok/peertube), then privacy is going to be a dream, especially if decentralized.
Vulnerable folk are looking for community, not a soapbox. The goal is to connect with other folk whilst being as free as possible from harassment.
It's absolutely possible to achieve that without perfect privacy controls.
Privacy and being free of (in-context) harassment aren't the same thing. Your posts can all be public but your client can filter out any harassment, for example (see the sketch below).
If the goal is privacy so that people who aren't in the community don't know that you're in the community, and don't know what the community is even talking about, I'm skeptical that it's practical. Especially for a decentralized network, I think that the sacrifices needed to make this happen would make the social network unappealing to users. For example, you'd need to make it invite only and restrict who can invite, or turn off any kind of discovery so that you can't find people who aren't already in your circle. At that point you might as well just use a group chat.
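The client-side filtering mentioned above can be as simple as this toy sketch (all accounts, instances, and rules are invented): the posts stay public on the wire, and the client just declines to render whatever matches the user's own mute rules.

```python
# Toy client-side filter: everything arrives, but muted instances and keywords
# are dropped before rendering. All data below is invented.
MUTED_INSTANCES = {"troll.example"}
MUTED_KEYWORDS = {"slur1", "slur2"}

def visible(post: dict) -> bool:
    if post["author"].split("@")[-1] in MUTED_INSTANCES:
        return False
    text = post["content"].lower()
    return not any(word in text for word in MUTED_KEYWORDS)

timeline = [
    {"author": "friend@friendly.example", "content": "lunch pics"},
    {"author": "jerk@troll.example", "content": "slur1 slur1 slur1"},
]
print([p for p in timeline if visible(p)])   # only the first post survives
```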
> Privacy and being free of (in-context) harassment aren't the same thing.
They're related. Often, the ability to limit your audience is about making it non-trivial for harassers to access your content rather than impossible.
> If the goal is privacy so that people who aren't in the community don't know that you're in the community
That's not the goal. The goal is to make a community that lets vulnerable folk communicate whilst keeping the harassment to a manageable level and making the sensitive content non-trivial for random trolls and harassers to access.
It's not about stopping dedicated individuals, because they can't be stopped in this sort of environment for all the reasons you point out. It's about minimising harassment from the random drive-by bigots.
Hmmm I think I understand the intent. I'll have to think on it some more.
My gut tells me that protecting people from drive-by bigotry is antithetical to content/community discovery. And what is a social network without the ability to find new communities to join or new content to see?
Perhaps something like Reddit, where they can raise the bar for commenting/posting until you've built up karma within the community? That's not a privacy thing, though.
What would this look like to you, and how does it relate to privacy? I've got my own biases that affect how I'm looking at the problem, so I'd be interested in getting another perspective.
You're thinking about this in an all-or-nothing way. A community in which everyone and everything they post is open to everyone isn't safe.
A community in which no one can find members or content unless they're already connected to that community stagnates and dies.
A community where some content and some people are public and where some content and some people are locked down is what we need, and though it's imperfect, things like authorised fetch bring us closer to that, and that's the niche that future security improvements on the Fediverse need to address.
No one is looking for perfect, at least not in this space.
reasons why i love blahaj.zone 🥹
It’s not sustainable to keep offering poorly designed solutions. People need to understand some basic things about the system they're using. The fediverse isn't a private space and fediverse developers shouldn't be advertising pseudo-private features as private or secure.
I have no idea what any of that meant.
I still could use an ELI5 about what this authorized fetch feature was supposed to do. Was it supposed to basically disengage the Mastodon network from Threads? To stop Threads crap from showing up on Mastodon? Or to stop Mastodon discussions from showing up in Threads? Or something different?
Authorised Fetch existed long before Instagram Threads. When it is turned on, an instance will require any other server to sign their request to fetch any post. This prevents "leaking" of posts via ActivityPub to blocked instances.
This setting is turned off by default because some software is incompatible with it (like /kbin, Pixelfed before June 2023, and maybe Lemmy too), because it increases server load, and because it may cause some replies to go missing (at least on the microblogging side).
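A hand-wavy sketch of what the serving side of Authorized Fetch amounts to, with invented names, a stand-in verify_signature helper, and status codes chosen to mirror typical HTTP semantics; real implementations also cache keys, check dates, and so on.

```python
# Sketch of an instance with Authorized Fetch enabled handling a GET for a post.
# verify_signature is a stand-in that returns the signer's domain, or None.
BLOCKED_DOMAINS = {"blocked.example"}

def handle_fetch(request_headers: dict, verify_signature) -> int:
    sig = request_headers.get("Signature")
    if sig is None:
        return 401   # unsigned request: refuse outright
    signer_domain = verify_signature(sig)
    if signer_domain is None:
        return 401   # signature present but invalid
    if signer_domain in BLOCKED_DOMAINS:
        return 403   # valid signature, but from a blocked instance
    return 200       # serve the post

# An unsigned fetch, and a signed fetch from a blocked instance:
print(handle_fetch({}, lambda s: "blocked.example"))                      # 401
print(handle_fetch({"Signature": "..."}, lambda s: "blocked.example"))    # 403
```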
> When it is turned on, an instance will require any other server to sign their request to fetch any post. This prevents “leaking” of posts via ActivityPub to blocked instances.
Oh, I see. Yeah, that sounds pretty hopeless. Does it use the fetching site's domain-validated TLS certificate? Is the idea to permit fetching unless the fetching domain is on a blacklist? If yes, someone didn't have their thinking cap on. The whole concept is dumb, though; there is no way to prevent posts from leaking. The saying is that once three people know a secret, it is no longer a secret.
Stop asking for pseudo-privacy features. The Fediverse is public by nature. Any "measures" to control access to the public posts on it are just lying to users.
Server owners should be able to control who can access their servers - but that is NOT - and should NOT be - treated as a privacy feature.