this post was submitted on 27 Jul 2025
387 points (99.2% liked)

Technology


cross-posted from: https://lemmy.zip/post/44874398

[–] BlackLaZoR@fedia.io 32 points 2 days ago (5 children)

The difference is that Google scans your private correspondence and can report you to authorities for any reason, legit or not.

[–] Alphane_Moon@lemmy.world 6 points 2 days ago (4 children)

That's a fair argument, although I personally wouldn't put too much emphasis on "can report you to authorities for any reason". That's true of any third party; your local mini-mart can report you to the authorities for any reason, legit or not.

I am referring more to the Lumo LLM initiative. It's a standard LLM pitch with some privacy marketing copy added on.

While I haven't tried Lumo, I do have experience with smaller cloud LLMs (e.g. Mistral, since I try not to use American services), and they tend to be subpar for my work use cases.

I don't see how Lumo will compete with ChatGPT or Gemini (haven't tried Grok for obvious reasons).

[–] BlackLaZoR@fedia.io 10 points 2 days ago (2 children)

Although I personally wouldn't put too much emphasis on "can report you to authorities for any reason"

They literally sent the police after some poor dude based on his correspondence with a doctor.

https://www.eff.org/deeplinks/2022/08/googles-scans-private-photos-led-false-accusations-child-abuse

[–] artyom@piefed.social 2 points 2 days ago

Google does not have the authority to "send the police". They reported content that looked like CSAM and the police did what police do and assumed the guy was a criminal.

The problem is not that they reported it, the problem is that they had it in the first place.
