0xtero

joined 1 year ago
[–] 0xtero@kbin.social 2 points 11 months ago (1 children)

Yeah. That's what I said

[–] 0xtero@kbin.social 2 points 11 months ago (3 children)

In this case, the "lemmy devs" and the operators of lemmy.ml are the same people, and it's hosted within the EU.
But that's still a far cry from getting any kind of GDPR violation report going, much less getting it through the process to actual fines.
People like to bring up GDPR violations as some kind of super-moderator tool, but it isn't that easy and it definitely isn't automated.

[–] 0xtero@kbin.social 9 points 11 months ago

Effect of ActivityPub, not Lemmy. All federating systems function similarly, because it's a feature of the protocol.
If instances want, they can ignore delete requests and your content stays in their cache forever (remember the Pleroma nazis from a couple of years ago?) - now, that is an instance problem that might be a GDPR issue, but good luck reporting it to anyone who cares. At best you can block and defederate, but that doesn't mean your posts are removed.
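To make the "delete request" concrete: when you delete a post, your server federates out an ActivityStreams `Delete` activity and remote servers are *expected* to drop their cached copy - but nothing in the protocol forces them to. A minimal sketch of what that activity looks like (the actor and post URLs are illustrative, not real):

```python
# Sketch of the ActivityStreams "Delete" activity a server sends out when
# a user deletes a post. Honoring it is up to each receiving instance.
import json

def build_delete(actor: str, object_id: str) -> dict:
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Delete",
        "actor": actor,
        # The deleted post is represented by a Tombstone object
        "object": {"type": "Tombstone", "id": object_id},
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    }

activity = build_delete(
    "https://example.social/users/alice",   # illustrative actor URL
    "https://example.social/notes/123",     # illustrative post URL
)
print(json.dumps(activity, indent=2))
```

A compliant server replaces the post with the Tombstone; a non-compliant one can simply drop the activity and keep serving its cached copy.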

The fediverse has no privacy, it's "public Internet". Probably a good idea to treat it as such.

[–] 0xtero@kbin.social 3 points 11 months ago

It's also a matter of scale. FB has 3 billion users and it's all centralized. They are able to police that. Their Trust and Safety team is large (which has its own problems, because they outsource that - but that's another story). The fedi is somewhere around 11M (according to fedidb.org).
The federated model doesn't really "remove" anything, it just segregates the network into "moderated, good instances" and "others".

I don't think most fedi admins are actually following the law by reporting CSAM to the police (because that kind of thing requires a lot of resources); they just remove it from their servers and defederate. Bottom line is that the protocols and tools built to combat CSAM don't work too well in the context of federated networks - we need new tools and new reporting protocols.

Reading the Stanford Internet Observatory report on fedi CSAM gives a pretty good picture of the current situation, and it's fairly fresh:
https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media

[–] 0xtero@kbin.social 0 points 11 months ago (13 children)

I find it interesting that Meta Platforms, Inc., a company known for harvesting user data, is blocking some servers from fetching its public posts. They decided to implement a feature Mastodon calls Authorized fetch.

This was always going to happen. They will block aggressively, because they can't have their precious advertising money mixed with CSAM, nazis and other illegal content. And the fedi is full of that.
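For context, "authorized fetch" means the server stops answering anonymous fetches of posts: every fetch must carry an HTTP Signature, so the server can see who is asking and apply its domain blocks even to "public" content. A rough sketch of the server-side idea (the function and the parsing here are illustrative, not Mastodon's actual code, and real servers also verify the signature cryptographically):

```python
# Sketch of the check behind "authorized fetch": unsigned requests are
# rejected, and signed requests are checked against the block list via
# the domain in the signature's keyId. Illustrative only.

BLOCKED_DOMAINS = {"blocked.example"}  # hypothetical block list

def allow_fetch(headers: dict) -> bool:
    sig = headers.get("Signature")
    if sig is None:
        return False  # anonymous fetch: rejected outright
    # A real server would resolve keyId and verify the signature; here we
    # only extract the keyId's domain to illustrate the block check.
    params = dict(part.split("=", 1) for part in sig.split(","))
    key_id = params["keyId"].strip('"')
    domain = key_id.split("/")[2]
    return domain not in BLOCKED_DOMAINS
```

So a Meta instance running this kind of policy can refuse to hand its posts to any server it has blocked, even though the posts are nominally public.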

[–] 0xtero@kbin.social 11 points 11 months ago (1 children)

I've been using Debian since 1.3. Haven't really ever needed anything else.
I did "experiment" a bit when the decision to go with systemd was taken, but in the end, most distros went with it and it really isn't that big deal for me.

So it's just Debian. I need a computer that works.

[–] 0xtero@kbin.social 6 points 11 months ago

Pleroma in that case I guess

[–] 0xtero@kbin.social 1 points 11 months ago

The Gnome devs say you don't need a mascot.

[–] 0xtero@kbin.social 1 points 1 year ago

Chat Control is a huge privacy problem.
But a threat to free software? Nah.

But the coming Cyber Resilience Act might be
https://www.eff.org/deeplinks/2023/05/eus-proposed-cyber-resilience-act-raises-concerns-open-source-and-cybersecurity