This post was submitted on 25 May 2024
819 points (97.7% liked)

Technology


Google rolled out AI overviews across the United States this month, exposing its flagship product to the hallucinations of large language models.

top 50 comments
[–] TachyonTele@lemm.ee 268 points 6 months ago* (last edited 6 months ago) (6 children)

The head of Google *Search right now is the same guy who was head of Yahoo Search when it was dying. To put all of this in perspective.

[–] habanhero@lemmy.ca 81 points 6 months ago (1 children)

The head of Google Search

FTFY

[–] TachyonTele@lemm.ee 104 points 6 months ago (16 children)

Thanks.
Being a CEO must be amazing. You can fail and even bring an entire company down, and keep on getting the same job somewhere else.

[–] snooggums@midwest.social 28 points 6 months ago

He has experience and obviously that means he learned a lesson after failing at a job that requires being a belligerent asshole to get.

[–] adam_y@lemmy.world 167 points 6 months ago (10 children)

Can we swap out the word "hallucinations" for the word "bullshit"?

I think all AI/LLM stuff should be prefaced as "someone down the pub said..."

So, "someone down the pub said you can eat rocks" or, "someone down the pub said you should put glue on your pizza".

Hallucinations are cool, shit like this is worthless.

[–] Eheran@lemmy.world 69 points 6 months ago (36 children)

No, hallucination is a really good term. It can be super confident and seemingly correct but still completely made up.

[–] kbin_space_program@kbin.run 36 points 6 months ago (1 children)

It is, but it isn't applicable to at least the glue-pizza situation, as the probable source comment has been found on Reddit.

A better use of the term might be how when you try to get Bing's image creator to make "Battletech" art, you just mostly get really obvious Warhammer 40k Space Marines and occasionally Iron Maiden album art.

[–] kbin_space_program@kbin.run 27 points 6 months ago (5 children)

Google Search isn't a hallucination now though.

It instead proves that LLMs just reproduce from the model they are supplied with. For example, the "glue on pizza" comment is from a Reddit user called FuckSmith from roughly 11 years ago.

[–] DarkThoughts@fedia.io 17 points 6 months ago (5 children)

It instead proves that LLMs just reproduce from the model they are supplied with.

What do you mean by that? This isn't some secret but literally how LLMs work. lol What people mean by hallucinating is when LLMs "create" facts that aren't facts, be it this genius recipe of glue pizza or any other wild combination of the model's source material. The whole cooking thing is a great analogy, actually, because the information the model is fed is the ingredients, and it just spits out various recipes based on those ingredients, without any guarantee that the result is actually edible.

[–] Reverendender@sh.itjust.works 116 points 6 months ago (2 children)

Testing in Prod. Stay classy, Google.

“The vast majority of AI Overviews provide high quality information, with links to dig deeper on the web,” said a Google spokesperson in an emailed statement to Gizmodo, noting many of the examples the company has seen have been from uncommon queries.

This is entirely fair. There is no way that anyone at Google could have anticipated that humans would search for strange things on the internet.

[–] hydroptic@sopuli.xyz 31 points 6 months ago (1 children)

The vast majority of AI Overviews provide high quality information

According to some fuckwitted Google rep, and I wouldn't trust them any further than I could throw them.

[–] uriel238@lemmy.blahaj.zone 12 points 6 months ago (3 children)

Although any sociologist or veteran of the internet will tell you humans will engage in any exploit that yields a funny result. The Diet Coke + Mentos rule.

And that means we'll actively search for hilarious Google AI responses.

Google is so f double-plus filthy rich, it is obligated to run its projects by experts or be relentlessly mocked. So it should have known this was the outcome.

Unless this is 5D chess and Google is willfully using itself as a cautionary tale to discourage future web services from arbitrarily inserting AI into their features.

[–] sudo42@lemmy.world 78 points 6 months ago (1 children)

"Hey, we just promised you answers. We never promised you correct answers." -- Google Marketing, probably.

[–] MargotRobbie@lemmy.world 66 points 6 months ago (2 children)

On the other hand, all these AI errors by Google have made for some great memes recently.

[–] pkmkdz@sh.itjust.works 35 points 6 months ago

LLM aka Large Language Memes

[–] Disaster@sh.itjust.works 58 points 6 months ago

I mean... yeah, lay off a whole bunch of people and start treating your employees like replaceable commodities... then go ahead and arrogantly deploy technology you don't understand and :surprisepikachu: everything breaks.

But management gets to do things without personal consequences, as they'll just lay off more workers to cover their absolute incompetence, and things will continue to get worse.

Perhaps we should be replacing C-suite dipshits with AIs instead.

[–] Nualkris@lemm.ee 57 points 6 months ago (8 children)

Why does search need to be AI? I've had no problems finding any information I wanted under the former process.

[–] leadore@kbin.social 44 points 6 months ago (9 children)

I think the idea is that you won't even leave the Google page at all, they want to keep you on their site and serve you their ads instead of sending you to someone else's site and giving someone else that sweet sweet ad revenue.

[–] Bogasse@lemmy.ml 23 points 6 months ago

I think it's been a long time since digital companies tried to solve actual problems.

[–] best_username_ever@sh.itjust.works 18 points 6 months ago (1 children)

You obviously haven’t used the ~~web3 nocode blockchain NFT~~ AI enough to have an informed opinion.

[–] gcheliotis@lemmy.world 10 points 6 months ago

It’s become more efficient to get basic info on virtually any topic by just asking an LLM like ChatGPT and that could be a serious threat to Google Search. People might form the habit of asking AIs for everything and then go to Google Search only when they want to dig deeper / find relevant articles etc. So I assume they added their own AI right into Search in an effort to continue being the first (and perhaps only) place one goes to for information.

[–] cordlesslamp@lemmy.today 52 points 6 months ago* (last edited 6 months ago) (7 children)

Today I caught myself unconsciously going to DuckDuckGo to do a search (even though the browser start page is already Google).

Thinking back, I've been using DuckDuckGo more than Google, often because I can't find what I'm looking for on Google, just ads and autogenerated fake webpages.

History will prove it once again: there's no such thing as "too big to fail".

[–] tb_@lemmy.world 12 points 6 months ago

There's no better time to switch your default search engine!

[–] todd_bonzalez@lemm.ee 36 points 6 months ago

The AI overview has told me so many lies. You thought Facebook made people stupid? Buckle in!

[–] technocrit@lemmy.dbzer0.com 29 points 6 months ago* (last edited 6 months ago) (4 children)

TBH I hate the term "hallucination" in this context. It's just more BS anthropomorphizing. More marketing for "AI" (also BS). Can't we just call it like garbage or GIGO or something more accurate? This is nothing new. I know that scientific accuracy is anathema to AI marketing but just saying...

[–] utopiah@lemmy.world 15 points 6 months ago (2 children)

scientific accuracy is anathema to AI marketing

Even though I agree, in this context "hallucination" is actually the scientific term. It might be poorly chosen, but in LLM circles, if you use the term hallucination, the vast majority of people will understand precisely what you mean: not an error in programming or a bad dataset, but rather that the language model worked well, generating sentences that are syntactically correct and roughly thematically coherent, yet factually incorrect.

So I obviously don't want to support marketing BS, in AI or elsewhere, but here sadly it matches the scientific naming.

PS: FWIW I believe I made a similar critique a few months, or maybe even years, ago. IMHO what's more important is arguably questioning the value of LLMs themselves, but that might not be as evident to the many people who are benefiting from the current buzz.

[–] biofaust@lemmy.world 28 points 6 months ago (1 children)

This is what I love about Mike Judge's work. It always turns out to be the best metaphor/reference/prophecy of the boring dystopia. Since 1999.

[–] Bahnd@lemmy.world 34 points 6 months ago (1 children)

Idiocracy is the most unrealistic sci-fi/apocalyptic film ever made. President Camacho finds the most qualified person to help with a crisis, asks them for advice, and then doesn't take credit for it. No one put in a position of power would ever do that.

[–] buddascrayon@lemmy.world 26 points 6 months ago

Like we’ve seen before with AI chatbots, this technology seems to confuse satire with journalism

Unfortunately so does my aunt. And she's allowed to vote.

[–] SeattleRain@lemmy.world 24 points 6 months ago (6 children)

Does anyone have a realistic idea of how this happened? I get that Google has been falling off for a while, but they're still a multi-billion-dollar company.

[–] Vivendi@lemmy.zip 26 points 6 months ago (3 children)

AI doesn't exist. It's a huge model that aggregates existing shit with some filler content to glue it all together. It is not sentient, it's not creative, it's literally a stochastic parrot.

So, when the original content is garbage, the output is also garbage. Shit in, shit out when you train from fucking Reddit.

[–] trashgirlfriend@lemmy.world 23 points 6 months ago

Worked for a company that had Google as a client:

Google sucks, everyone who works there is an idiot who sniffs their ass all day, nothing works, nothing gets fixed, it's all just held together with duct tape.

[–] BaskinRobbins@sh.itjust.works 22 points 6 months ago (1 children)

I can't imagine a ton of the people working there give a shit anymore when it seems like thousands of people are being laid off weekly while the company takes in billions in profit.

[–] sugar_in_your_tea@sh.itjust.works 20 points 6 months ago

Easy: worse results with more ads means more searches and thus more ad impressions, therefore profit.

That'll only work for so long, but that seems to be what they're doing.

[–] dustyData@lemmy.world 13 points 6 months ago

Always remember that having more money doesn't mean someone (or some entity) is more capable or intelligent. It just means they have way more latitude to fuck up, higher potential to hurt more people, and less chance of facing negative consequences when they do.

[–] trollbearpig@lemmy.world 13 points 6 months ago* (last edited 6 months ago) (4 children)

I'm probably late, but in this case it's the combination of 2 things.

  1. The usual capitalistic incentives ruined yet another company. There was a recent article about how Google pushed out the people who built and maintained search in favor of MBA growth-focused assholes. Like they put the guy who ran Yahoo Search while it was crumbling in charge of Google Search, to get him to increase the number of searches they serve, and ads obviously. People keep suggesting to use DDG, or Kagi, or some other commercial product. And for now we must, because Google is basically useless right now. But just give the other companies time to fall into the same trap hahaha.
  2. LLMs are not smart, not even close. They are just a parlor trick that has non-technical people fooled. There is a lot of evidence for this, but to me the most obvious is that they don't have anything resembling human short-term memory. The way they make them look like they are having a conversation is by providing the entire conversation up to that point, including their own previous responses lol, as input/context so the bot autocompletes the conversation. It literally can't remember a single word of what you said on its own (see the sketch at the end of this comment). But sureee, they are just like humans lol.

So what we have here is obvious: we have a company trying to grow like cancer by any means necessary. And now they have a technology that allows them to create enough smoke and mirrors to fool non-technical people. Sadly, as part of this they are also destroying the last places on the internet not fully controlled by corporations. Let's hope Lemmy survives, but it's just a matter of time before they flood this place too.
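
To illustrate the statelessness point in item 2 above, here is a minimal sketch of how "chat" gets layered on top of a stateless text generator (the `generate` function below is a hypothetical placeholder, not any real library's API): on every turn the caller re-sends the entire transcript as one prompt.

```python
# Minimal sketch of how "chat" is layered on top of a stateless text generator.
# `generate` is a hypothetical placeholder, not a real API; the point is that the
# model keeps no memory of its own -- the caller re-sends the entire transcript
# on every turn.

def generate(prompt: str) -> str:
    """Pretend model call: autocompletes the next assistant reply for a prompt."""
    return f"<model continuation of {len(prompt)} chars of context>"

def chat_turn(history: list[dict], user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    # Flatten the ENTIRE conversation so far into a single prompt string.
    prompt = "\n".join(f"{m['role']}: {m['content']}" for m in history) + "\nassistant:"
    reply = generate(prompt)
    history.append({"role": "assistant", "content": reply})
    return reply

history: list[dict] = []
chat_turn(history, "My name is Alice.")
chat_turn(history, "What's my name?")  # Only "remembered" because it's still in the prompt text.
```

Drop the earlier messages from `history` and the model has no idea what was said; the "memory" lives entirely in the prompt the caller reconstructs each turn.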

[–] menemen@lemmy.world 15 points 6 months ago* (last edited 6 months ago) (1 children)

Ah, so it only affects searching from the US for now. I was already wondering why I couldn't reproduce the stuff I saw here and why I didn't really see a change in my Google results.

[–] billiam0202@lemmy.world 26 points 6 months ago

You probably live in one of those "socialist" countries that has, like, consumer protection laws and stuff.

[–] moistclump@lemmy.world 13 points 6 months ago (1 children)

Tinfoil hat time. Do you think Google intended this to work well? Or are we talking a lot more about Google and LLMs than we would have otherwise?
