this post was submitted on 27 Apr 2024
130 points (92.8% liked)

Technology

The link above is for the petition

Here is the letter:

Our Letter To WhatsApp:

WhatsApp needs to implement these product changes during polling days and in the month before and the month after elections:

  • Add friction to forwarding messages: Reduce the ease with which messages can be forwarded on the platform by adding one additional step which nudges users to pause and reflect before they forward content.
  • Add disinformation warning labels to viral content: Automatically add clear “Highly forwarded: please verify” warning labels to viral messages, in addition to the “forwarded many times” label currently in use.
  • Reduce WhatsApp’s broadcast capabilities: Disable the Communities feature and also limit the size of broadcast lists to 50 people and cap their usage to twice a day.

Without decisive action from WhatsApp, disinformation attacks will likely scale up in 2024, aimed at manipulating and undermining elections affecting half of the world’s population. WhatsApp must act to change its product to protect election integrity.

all 15 comments
[–] tedu@azorius.net 22 points 7 months ago

In general, I don't like rules about who's allowed to talk about elections, because they can just as easily be turned against the people, but these seem fairly balanced. They're not controlling the content of the messages.

[–] Gradually_Adjusting@lemmy.world 16 points 7 months ago (2 children)

"Please verify" is not enough of a red flag to overcome confirmation bias. People have to be reminded to seek disconfirming evidence. Something like: "Highly forwarded link is likely propaganda; consider the writer's motivations and other views on the subject."

[–] otter@lemmy.ca 18 points 7 months ago (1 children)

A downside to a statement like this would be the 'crying wolf' effect. If that message pops up on information people know to be true, where it's being shared because it is important or relevant, then they are less likely to care.

A neutral message would help prevent that.

[–] Gradually_Adjusting@lemmy.world 5 points 7 months ago

Tough one to get right, isn't it? I take your point, but I fear the power of confirmation bias might be too great.

[–] iturnedintoanewt@lemm.ee 2 points 7 months ago (1 children)

WhatsApp link previews are rendered on Meta's servers. They could display any propaganda or fake-news warnings they wanted on those previews.

[–] Gradually_Adjusting@lemmy.world 6 points 7 months ago

The fact that we're having to ask Meta nicely not to screw up our elections, after everything they've done, is pretty dire. It's a nice gesture from Mozilla, anyway.

[–] moon@lemmy.ml 7 points 7 months ago

WhatsApp is a huge vector for misinformation across the world. This is exactly the kind of specific demand people should be making of them to force some level of responsible behaviour.

[–] mannycalavera@feddit.uk 6 points 7 months ago

Meta: Yeah, nah....

[–] xep@fedia.io 5 points 7 months ago* (last edited 7 months ago)

Call me a pessimist, but it's highly unlikely WhatsApp will take any action that doesn't affect their balance sheet positively in the short term.

[–] gedaliyah@lemmy.world 4 points 7 months ago

It seems like messaging services are particularly prone to misinformation campaigns, since it is much more difficult to audit what is happening on the platform. How is a service like Messenger or WhatsApp (both Meta) going to monitor the content of messages in a way that is safe for users? How would researchers identify and track misinformation?

I know that the most outlandish content I see as a highly connected individual tends to come from these platforms. I do my best to educate when I see it, but I doubt it has much of a lasting impact.

It's depressing and a little frightening to know how easily and cheaply our electorate is manipulated, and to see it happening in real time.

[–] schnurrito@discuss.tchncs.de 0 points 7 months ago (2 children)

I thought Mozilla was a FOSS organization whose goal was to defend an open Internet with free communication?

Here they are putting out a blog post that says "WhatsApp should use the power it has over its users to implement antifeatures that their users might not want and could remove if it were FOSS".

What the hell kind of world are we living in again?

[–] otter@lemmy.ca 11 points 7 months ago* (last edited 7 months ago)

https://www.mozilla.org/en-CA/mission/

Our mission is to ensure the Internet is a global public resource, open and accessible to all. An Internet that truly puts people first, where individuals can shape their own experience and are empowered, safe and independent.

I would say that is a better mission than just promoting "free communication". There's more nuance to this situation than that.

[–] Jackthelad@lemmy.world 0 points 7 months ago (1 children)

Are you new to the hypocrisy of FOSS nerds?

[–] schnurrito@discuss.tchncs.de 1 points 7 months ago

I suspect that I am; I am not omnipresent and not aware of everything happening everywhere.

Am I right that the logic is approximately like this: FOSS is a left-wing anti-business cause, misinformation tends to help right-wing parties win elections, therefore it is compatible with FOSS values and principles to want to use the power that proprietary software developers have in order to censor ("stop the spread of") misinformation?