thenexusofprivacy

joined 11 months ago
[–] thenexusofprivacy@lemmy.world 1 points 8 months ago

Yep. I very much agree with all of you. Here's how I phrased it in Embrace, Extend, and Exploit: Meta's plan for ActivityPub, Mastodon and the fediverse

Of course, if and when Meta sees the fediverse as a significant threat, they'll ruthlessly stamp it out.

But right now, they've got a huge potential longer-term opportunity to coopt the fediverse as a basis for decentralized surveillance capitalism. It might not work out, of course, but even if it doesn't keeping a neutered fediverse around might still be useful to Meta as long as it's not a threat to their dominance (just as Google subsidizes the Firefox browser).

[–] thenexusofprivacy@lemmy.world 2 points 9 months ago

Agreed, and a very good point. "Visible to people on allow-list servers" is very much along the lines of local-only posts ("visible to people only on this server"). I think of it as "scoped" visibility, although leashed or moored might well be a better term.
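To make the distinction concrete, here is a minimal sketch of "scoped" visibility next to local-only and public posts, assuming a per-post scope field and a server-level allow-list. The names are hypothetical, not any real ActivityPub implementation:

```python
# Hypothetical delivery check: "scoped" posts go only to allow-list servers,
# "local-only" posts never leave the originating server.

ALLOW_LIST = {"example.social", "friends.example"}

def can_deliver(post_scope: str, recipient_server: str) -> bool:
    """Decide whether a post may be delivered to a remote server."""
    if post_scope == "public":
        return True
    if post_scope == "local-only":
        return False  # never leaves the originating server
    if post_scope == "scoped":
        return recipient_server in ALLOW_LIST  # allow-list servers only
    return False  # unknown scopes fail closed
```

Whatever the term ends up being, the key design point is that an unknown scope fails closed rather than defaulting to public.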

[–] thenexusofprivacy@lemmy.world 2 points 9 months ago (2 children)

Exactly. There's a core disagreement about whether making a public post means consenting to its use for all purposes (hence the multiple battles about consent-based search), but relatively few people doubt that bad actors will use it without consent.

[–] thenexusofprivacy@lemmy.world 5 points 9 months ago (2 children)

A very interesting idea! Actually it seems to me there are two interesting ideas here:

  • endorsements. Something like this (whether it's from feeler servers or other sources) is clearly needed to make consent-based federation scale. IndieWeb's Vouch protocol and the "letters of introduction" Erin Shephard discusses in "A better moderation system is possible for the social web" are similar approaches. You could also imagine building endorsement logic on top of an instance catalog like Fediseer or The Bad Space, or infrastructure like FIRES.

  • restricting visibility of a boost to servers the original post is federated with. This is something that's long overdue in the fediverse! Akkoma's bubble is a somewhat-similar concept; Bonfire's boundaries might well support this.
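The endorsement idea in the first bullet can be sketched in a few lines. This is a hedged illustration in the spirit of Vouch-style "letters of introduction", not any existing implementation; every name here is hypothetical:

```python
# Endorsement-based federation sketch: a new server is federated with only
# once enough trusted parties vouch for it.

from dataclasses import dataclass, field

@dataclass
class FederationPolicy:
    trusted_endorsers: set = field(default_factory=set)
    endorsements: dict = field(default_factory=dict)  # server -> endorsers

    def add_endorsement(self, server: str, endorser: str) -> None:
        self.endorsements.setdefault(server, set()).add(endorser)

    def should_federate(self, server: str, required: int = 1) -> bool:
        """Federate only if enough trusted endorsers vouch for the server."""
        vouches = self.endorsements.get(server, set()) & self.trusted_endorsers
        return len(vouches) >= required

policy = FederationPolicy(trusted_endorsers={"catalog.example"})
policy.add_endorsement("newserver.example", "catalog.example")
policy.add_endorsement("spam.example", "unknown.example")
```

Raising `required` trades openness for safety: one vouch makes consent-based federation nearly frictionless, several vouches make it much harder to game.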

 

cross-posted from: https://lemmy.sdf.org/post/12134548

Patrick Eddington has a good summary:

"Unlike the House Judiciary Committee bill passed by that body in December by a 35-2 bipartisan margin, the new bill 1) does not mandate a warrant before FBI personnel can sift through the FISA Section 702 database for information on U.S. Persons and 2) still allows federal law enforcement agencies to buy data on U.S. Persons from data brokers--no warrant required.

The bill also allows for FBI agents to go through the Section 702 database for information "relevant to an existing, open, predicated full national security investigation.""

There were reports that intelligence agencies will have a secret briefing for Congress this afternoon, although Eddington now says it might not happen. In any case, a vote is expected Thursday.

If you're in the US, now's a critical time to contact your legislators. This issue crosses party lines, so even if your representatives usually don't listen to you, they'll be paying attention to the number of calls they get on this one! Eddington has instructions on how to do it via Congress' site, or Demand Progress has a handy web page.

 

What if Meta's hidden objective behind the Threads-to-Mastodon initiative is a play on app.net? And, what if threads.net is a measured step towards what could be the greatest pivot in all of tech?

[–] thenexusofprivacy@lemmy.world 2 points 10 months ago

Yep, totally agree!

[–] thenexusofprivacy@lemmy.world 2 points 10 months ago

Or, using Gab provides a sense of what's possible.

And child porn is a great example -- and CSAM more generally. Today's fediverse would have less CSAM if the CSAM instances weren't on it. Why hasn't that happened? The reason that many instances give for not blocking the instances that are well-known sources of CSAM is that CSAM isn't the only thing on that instance. And it's true: these instances have lots of people talking about all kinds of things, and only a relatively small number of people spreading CSAM. So not blocking them is completely in alignment with the Big Fedi views Evan articulates: everybody (even CSAM-spreaders) should have an account, and it's more important to have the good (non-CSAM) people on the fediverse than to keep the bad (CSAM-spreading) people off.

A different view is that whoa, even a relatively-small number of people spreading CSAM is way too many, and today's fediverse would be better if they weren't on it, and if the instances that allow CSAM are providing a haven for them then those instances shouldn't be on the fediverse. It seems to me that view would result in less CSAM on the fediverse, which I see as a good thing.

[–] thenexusofprivacy@lemmy.world 3 points 10 months ago* (last edited 10 months ago) (1 children)

Me: "fedi would be better with fewer Nazis and fascists"

sj_zero: "these pieces are deeply authoritarian"

[–] thenexusofprivacy@lemmy.world 1 points 10 months ago* (last edited 10 months ago) (2 children)

I agree that small doesn't equal safer; in other articles I've quoted Mekka as saying that for many Black Twitter users there's more racism and Nazis on the fediverse than Twitter. And I agree that better tools will be good. The question is whether, with current tools, growth with the principles of Big Fedi leads to more or less safety. Evan assumes that safety can be maintained: "There may be some bad people too, but we'll manage them." Given that the tools aren't sufficient to manage the bad people today, that seems like an unrealistic assumption to me.

And yes, there are ways to keep these people off the fediverse (although they're not perfect). Gab isn't on the fediverse today because everybody defederated it. OANN isn't on the fediverse today because everybody threatened to defederate the instance that (briefly) hosted them, and as a result the instance decided to enforce their terms of service. There's a difference between Evan's position that he wants them to have accounts on the fediverse, and the alternate view that we don't want them to have accounts on the fediverse (although may not always be able to prevent it).

[–] thenexusofprivacy@lemmy.world 3 points 10 months ago (2 children)

It's a good comment, thanks for sharing it here! On the bolded part, yes, it's possible to do polls on Mastodon ... it could be very interesting to do a series around these questions. But of course a lot depends on who's doing the poll. Evan for example has blocked a lot of people -- which is fine, there is nothing the matter with blocking people, but it skews the poll results. And a lot depends on how the poll questions are phrased. Still, it's a good idea and I'll think about whether there's a sensible way to do it.
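For anybody who wants to try such a poll, here is a sketch of the request body for Mastodon's status-creation API (POST /api/v1/statuses). Stock Mastodon allows 2 to 4 options per poll; the question text below is just an example, and actually sending it would need a real instance URL and access token:

```python
# Build the JSON body for a Mastodon status containing a poll.
# (Only the payload is built here; posting it requires an authenticated
# POST to https://<instance>/api/v1/statuses.)

def build_poll_status(question: str, options: list[str],
                      expires_in: int = 86400) -> dict:
    """Return the JSON body for a status containing a poll."""
    if not 2 <= len(options) <= 4:
        raise ValueError("stock Mastodon polls take 2 to 4 options")
    return {
        "status": question,
        "poll": {"options": options, "expires_in": expires_in},
    }

payload = build_poll_status(
    "Should your instance federate with Threads?",
    ["Yes", "No", "Undecided"],
)
```

Of course, the API only handles the mechanics: who runs the poll, who they've blocked, and how the question is phrased will still skew the results.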

I agree that some of what Evan characterised as Small Fedi isn't about small for small's sake, it's more about the view you describe -- what L. Rhodes calls "networked communities". Of course, the consequences of this result in slower growth than the Big Fedi view, so a smaller network in the short-to-medium term, so from his perspective I can see why he chose this framing.

And from the comment:

Can the Big Fedi people connect with everyone they want to, while the Small Fedi folk keep their comfortable distance and protect their safe spaces?

Yes, I think a schism's likely to happen -- "Meta's fediverse", instances that federate with Threads, will be more attractive to Big Fedi people, and the "free fediverses" that don't federate with Threads (or other surveillance capitalism companies) will be more attractive to people who don't buy into the bigger-is-better view.

[–] thenexusofprivacy@lemmy.world -3 points 11 months ago

Indeed, there are lots of people like that already on the fediverse, and blocking entire instances is a blunt but powerful tool that well-moderated fediverse instances currently rely on for protection. Today, people on instances where admins and moderators don't block instances that have multiple badly-behaving people have to deal with a lot more harassment and hate speech than people on instances who do. So we'll certainly see a situation where some instances block Threads and others don't. The open question, though, is how many instances will decide to also block instances that federate with Threads -- just as many instances decided to block instances that federated with Gab.

[–] thenexusofprivacy@lemmy.world -2 points 11 months ago (4 children)

It's not that he wants the fediverse to be unsafe. It's more that the Big Fedi beliefs he describes for the fediverse -- everybody having an account there (which by definition includes Nazis, anti-trans hate groups, etc.), relying on the same kinds of automated moderation tools that haven't led to safety on other platforms -- lead to a fediverse that's unsafe for many.

And sure there are some people who say Fedi is fine as it is. But that's not the norm for people who disagree with the "Big Fedi" view he sketches. It's like if somebody said "People who want to federate with Threads are all transphobic." There are indeed some transphobic people who want to federate with Threads -- We Distribute just reported on one -- but claiming that's the typical view of people who want to federate with Threads would be a mischaracterization.

 

A response to Evan Prodromou's "Big Fedi, Small Fedi"

[–] thenexusofprivacy@lemmy.world 1 points 11 months ago

The OP talks about how Meta can get a lot of what they want -- including the regulatory aspects -- just by saying they'll integrate with the fediverse, and it's quite possible that's all they'll ever do. But there's a big potential upside for them if they decided to invest in it ... not so much today's fediverse (I agree about the inflated self-importance of a lot of the commentary -- no, they're not so desperate for content that they're trying to steal it from the fediverse) but the potential of decentralized surveillance capitalism. So, we shall see.

 

What's Meta up to?

  1. Embrace ActivityPub, Mastodon, and the fediverse

  2. Extend ActivityPub, Mastodon, and the fediverse with a very usable app that provides additional functionality (initially the ability to follow everybody you're following on Instagram, and to communicate with all Threads users) that isn't available to the rest of the fediverse – as well as, over time, providing additional services and introducing incompatibilities and non-standard improvements to the protocol

  3. Exploit ActivityPub, Mastodon, and the fediverse by utilizing them for profit – and also using them selfishly for Meta's own ends

Since the fediverse is so much smaller than Threads, the most obvious ways of exploiting it – such as stealing market share by getting people currently in the fediverse to move to Threads – aren't going to work. But exploitation is one of Meta's core competences, and once you start to look at it with that lens, it's easy to see some of the ways even their initial announcement and tiny first steps are exploiting the fediverse: making Threads feel like a more compelling platform, and reshaping regulation. Longer term, it's a great opportunity for Meta to explore – and maybe invest in – shifting their business model to decentralized surveillance capitalism.

 

As you've probably heard, Threads (a fairly new social network from Facebook's parent company Meta) is testing integration with the fediverse. Depending on how you look at it, it's a great opportunity, a huge threat, or both!

Back in May and June, when Threads first announced their plans, there were quite a few polls on Mastodon about people's reactions, most showing opinions split roughly equally. How do people feel today?
