this post was submitted on 21 Mar 2026
513 points (98.1% liked)
you are viewing a single comment's thread
Absolute pricks. "Don't do evil" they said.
AI has biases, and news headlines are written with a bias too. That's a recipe for fake news.
Don't, do evil!
~~Don't~~ do evil
Hasn't that been Google's guiding principle for quite some time already?
"Don't be evil" hasn't been an official guiding principle for over a decade, no.
On reddit I would respond with r/woosh
Ah, I was just waking up and didn't see the strikethrough
Both here and on reddit you can just say "whoosh", although it wouldn't really have made sense here
I'd trust an LLM to summarize an article and give it an honest title over a piece of shit journalist that wrote it.
In the short-lived news app Artifact, that was one of my favorite features. It was done on demand, and if a large share of early viewers asked for a rewritten title, the rewrite would become the default for future viewers.
In the Artifact implementation, the LLM was specifically prompted by the app to summarize the article with an honest, non-clickbaity title. In Google's case, they claim they are prompting the LLM to title the link to better tempt the searcher to click on it based on what they were searching for. Kind of the opposite. Yes, LLMs could do what you say, but that doesn't seem to be how Google is setting it up.
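The crowd-triggered behavior described above can be sketched roughly like this. This is a hypothetical illustration of the idea, not Artifact's actual implementation; the threshold, minimum view count, and field names are all assumptions:

```python
# Hypothetical sketch of crowd-triggered title rewriting, loosely modeled
# on the Artifact feature described above. Numbers and names are made up.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Article:
    original_title: str
    rewritten_title: Optional[str] = None  # filled in by an LLM on demand
    views: int = 0                         # early views of the article
    rewrite_requests: int = 0              # viewers who asked for a rewrite


def display_title(article: Article,
                  threshold: float = 0.3,
                  min_views: int = 20) -> str:
    """Show the LLM rewrite by default once enough early viewers asked for it."""
    if (article.rewritten_title is not None
            and article.views >= min_views
            and article.rewrite_requests / article.views >= threshold):
        return article.rewritten_title
    return article.original_title
```

Under these assumptions, an article where 40% of early viewers requested the rewrite would show the honest title to everyone afterward, while one with only a few requests would keep its original headline.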