this post was submitted on 06 May 2026
229 points (98.7% liked)
Technology
But...
Isn't that a good thing?
I mean, running an LLM locally is much more private than running it somewhere in the cloud at a provider that gets your raw data, isn't it?
All your data stays on your device, and it becomes much, much harder for Google to argue why it should be uploaded to their data centers.
You don't seriously believe that, do you? They just use your device's memory and CPU, and thus your electricity, to shovel through your data and then send anything valuable back to their servers.
For clarity's sake, that’s not what’s happening here. (Don’t misunderstand this comment as defending Google; I could write a book about how much they suck.)
The downloaded model is an LLM called Gemini Nano, and it’s used for things like “help me write”, checking whether an incoming message is a scam, summaries, etc.
Don’t worry about the model itself being spyware. It’s not; but for argument's sake, if we were to assume that it was: they already know a lot about you through their usual apps and services, and they get far more info out of you through those. This LLM would hardly move the needle.
The actual issue is that they download it for everyone, even if their devices don’t meet the minimum requirements. And without consent. And to enable it, you need to go through several menus, as the default behaviour is to use the cloud (this could change eventually; my understanding is that this update is just laying the foundation).
But it’s Google that we’re talking about. Last year they were ordered to pay a fine for spying on users despite those users having their tracking settings off. And it wasn’t the first time, iirc. This kind of behaviour is par for the course with them.
It’s already been pointed out in multiple threads that the terms of service specify that even if it uses the on board model it still sends your queries to Google.
Yeah, that's what I said. Right now it defaults to the cloud.
It's not a good thing if you don't want a freaking LLM to begin with. A hidden 4GB download for a feature I can't give a single fuck about is ridiculous.
I'm reminded of when they pinky swore that they weren't dissecting your data in incognito tabs.
They lied, and nothing ever really happened to them for it. The proof is that they still have the audacity to pull anti-consumer shit like this without thinking twice.
Also, if I were someone who wanted to run an LLM locally, there are many options other than whatever crap Google is putting out. You can't trust them with even a morsel of your data.
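As a sketch of what those other options look like (assuming the open-source `ollama` tool is installed; the model name is just an example of an open-weight model it hosts):

```shell
# Fetch an open-weight model once, then run it entirely on-device.
# Prompts and responses never leave the machine.
ollama pull llama3.2
ollama run llama3.2 "Summarize this message for me."
```

No account, no telemetry back to an ad company, and you can delete the model whenever you want.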
If the reporting is accurate, your data is still sent to Google's servers for processing. This doesn't appear to improve privacy; it's more like an extension of the user-surveillance business model that Google has pursued over the past decade.
If they are doing this without user knowledge, I wouldn’t trust that everything the LLM ingests stays local either, until proven otherwise. Also, not everyone wants to have a local LLM running on their browser eating up 4GB of space.
If someone chooses to do that, then yes, it's a better option, but 4GB of LLM shouldn't just be shipped with a browser.
Say the following out loud:
I am the product.
If I choose to install and use an LLM on my device, sure. That doesn't mean Google should take it upon themselves to ship one baked into the browser, with no way to opt out or remove it without it being re-downloaded.
Assuming Google will respect privacy is certainly a take.
The model could interact with everything on the PC, with no connection or server overhead and no user consent, and then send back compressed reports. And who knows, maybe they're even training the model in a distributed way on users' interactions with the PC.
Sure, but privacy isn't the only issue. It still consumes a ton of energy, all for basically nothing. So you are paying that electric bill, as well as the wear and tear on your GPU.