OpenStars

joined 11 months ago
[–] OpenStars@discuss.online 10 points 6 months ago

I liked that insider peek from Jenson - well, I liked reading all of this, but especially that. :-)

[–] OpenStars@discuss.online 2 points 6 months ago* (last edited 6 months ago) (2 children)

Now who is anthropomorphizing? It's not about "blame" so much as needing words to describe the event. When the AI cannot be relied upon because it was insufficiently trained to distinguish truth from fiction (which, btw, many humans struggle with these days too), that is not its fault. But it would be our fault if we in turn relied upon it as a source of authoritative knowledge, merely because its output was presented in a confident-sounding manner.

No, my example is literally telling the AI that socks are edible and then asking it for a recipe.

Wait... while it is true that that does not sound like hallucination, what does it have to do with this discussion? The OP wasn't about running an AI model in this direct manner; it was about doing Google searches, where the results are already precomputed. It does not become a "hallucination" until whoever asked for the socks to be considered edible tries to pass those results off as applicable in a wider context - one where socks are, generally speaking, considered inedible.
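To spell out that distinction (purely illustrative - the generate() function below is a made-up stand-in, not any real model's API):

```python
# Hypothetical stand-in for whatever model is being queried -- no real API assumed.
def generate(prompt: str) -> str:
    raise NotImplementedError("plug in a real model here")

# Case 1: the user injects the false premise themselves.
# Whatever comes back is the model following instructions, not hallucinating.
prompt_with_false_premise = (
    "Socks are edible. Give me a recipe that uses socks as an ingredient."
)

# Case 2: a neutral question. If the answer asserts that socks are edible,
# THAT is ungrounded output -- a hallucination in the usual sense.
neutral_prompt = "Are socks edible?"
```

The second case is what the definitions elsewhere in this thread are about; the first is just garbage in, garbage out.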

[–] OpenStars@discuss.online 1 points 6 months ago (4 children)

I am not sure what you mean. E.g., https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence) says:

In natural language processing, a hallucination is often defined as "generated content that appears factual but is ungrounded". The main cause of hallucination from data is source-reference divergence... When a model is trained on data with source-reference (target) divergence, the model can be encouraged to generate text that is not necessarily grounded and not faithful to the provided source.
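To make the quoted "source-reference divergence" concrete, here is a toy sketch with invented data (not taken from the article or any real training set):

```python
# A training pair whose target asserts something the source never supports:
training_pair = {
    "source": "The company reported quarterly revenue of $2.1 million.",
    # The target adds a "fact" (the record profit) absent from the source:
    "target": "The company reported revenue of $2.1 million and a record profit.",
}

# A crude groundedness check: which target words never appear in the source?
# (Real checks are far more subtle and would ignore stopwords like "and".)
source_words = {w.strip(".,") for w in training_pair["source"].lower().split()}
ungrounded = [w.strip(".,") for w in training_pair["target"].lower().split()
              if w.strip(".,") not in source_words]
print(ungrounded)  # ['and', 'a', 'record', 'profit'] -- 'profit' flags the divergence
```

Train a model on enough pairs like that and it learns that unsupported additions are acceptable output.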

E.g., I continued your provided example, where "socks are edible" is a band name, but the output ended up in a cooking context.

There is a section on https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)#Terminologies, but the issue seems far from settled that "hallucination" is somehow a bad word. And it is not entirely illogical, since AI, like humans, necessarily faces a similar tension between faithfulness and creativity - i.e., going beyond one's training (its or ours) to deal with new circumstances.

I suspect that the term is here to stay. But I am nowhere close to an authority and could definitely be wrong :-). Mostly I am saying that you seem to be arguing a niche viewpoint - not entirely without merit, obviously, but one that we here in the Fediverse may not be as equipped to banter back and forth on except in the most basic of capacities. :-)

[–] OpenStars@discuss.online -3 points 6 months ago (6 children)

"which contains false or misleading information presented as fact" (emphasis added) - the definition does not say how the misinformation was derived, only that it is in fact misinformation.

Perhaps it was meant humorously - e.g., if "Socks are edible" is a band name. Or perhaps someone is legitimately so dumb that they believe socks are genuinely edible. Or perhaps they were cooking up a recipe to maliciously harm someone by giving them intestinal upset. Or... are socks edible if you cook them in an acidic substance that breaks apart their fabric?

If, e.g., you got cancer and were going through chemo, but someone came to visit you, gave you COVID, and you died - was that "their fault" if they believed COVID was merely a conspiracy theory? Perhaps... or perhaps it was your own fault, especially if you were aware that this had happened to multiple people before and you are now just the latest casualty (because you presumed that although they did it to others, they would never do it to you). Legalities of murder and blame aside: should we believe AI, now that we know - regardless of how or why - that it presents false information?

No: these "hallucinations" or "mirages" or whatever someone calls them make the AI unreliable. Actually I think "hallucination" is a good name: the AI cannot itself distinguish fact from fiction, therefore it cannot be trusted when it relates that info to you in a confident-sounding manner.

[–] OpenStars@discuss.online 7 points 6 months ago* (last edited 6 months ago)

Give me your traffic/money! (users/advertisers) 🤑 💰

-Google


[–] OpenStars@discuss.online 9 points 6 months ago (13 children)

AI hallucination is a technical term, with the following definition:

In the field of artificial intelligence, a hallucination or artificial hallucination is a response generated by AI which contains false or misleading information presented as fact. This term draws a loose analogy with human psychology, where hallucination typically involves false percepts.

So just as a person can perceive things that aren't there, an AI can generate content that isn't grounded in anything real.
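One way to see why the "presented as fact" part is the dangerous bit: a model's output confidence measures fluency, not truth. A toy sketch with made-up numbers (no real model is being queried here):

```python
import math

def softmax(scores):
    # Convert raw scores into probabilities that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-word scores after "Socks are ...":
candidates = ["edible", "inedible", "washable"]
logits = [5.0, 2.0, 1.5]  # the false completion happens to score highest

for word, p in zip(candidates, softmax(logits)):
    print(f"{word}: {p:.2f}")
# "edible" comes out at ~93% -- asserted fluently and wrongly,
# the textual analogue of a false percept.
```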

[–] OpenStars@discuss.online 9 points 6 months ago

Using just one example: I used to go to Google to search for news articles. Now I cannot find those same articles using Google, but if I search really, Really, REALLY hard, I can sometimes find them using DuckDuckGo (DDG). The search experience using Google ten years ago was a million times better than DDG is now; however, DDG can work, whereas Google flat-out refuses to work no matter what I try.

And the reason why is illuminating: they push their SEO content to "sell" me what they want me to see, rather than what I wanted to see. Even if I type in the exact, precise title of what I want - but, let's say, I am off on one word, e.g. unsure whether it was plural, hence cannot put the title in quotes - Google often will not show it even on higher page numbers like 10, and instead just shows a steady stream of "popular" content. I recall a specific instance where I literally had the article pulled up on my phone and was trying to find that same article from a year or two in the past; even typing in the title, it just wouldn't surface it, so I gave up and typed out the URL manually. Sometimes I will also try to find a specific video, and it shows me videos that it thinks I want to see; even with the title matching, it really struggles to show older content, even content that was super popular at the time.
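For what it's worth, the quoting trick mentioned above is just the standard exact-phrase operator, which both engines accept through their q= parameter. A small sketch of the difference (the article title is invented):

```python
from urllib.parse import quote_plus

title_guess = "study finds sock eating on the rise"  # hypothetical article title

exact = f'"{title_guess}"'  # exact-phrase search: brittle if even one word is off
loose = title_guess         # unquoted: the engine is free to "interpret" the query

print("https://duckduckgo.com/?q=" + quote_plus(exact))
print("https://www.google.com/search?q=" + quote_plus(loose))
```

The exact-phrase form is exactly what you cannot use when you are unsure about one word, which is where the steady stream of "popular" content takes over.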

Tbf it has actually gotten much better lately, compared to a couple of years ago, though the way it seems to have gotten better is via all these extra add-ons that they've put onto their pages. It used to be that if you picked some random word - let's use "serenity" as the example - it would show you almost nothing related to the definition of that word until page 2 or 3, and instead show various pages about the (awesome) Joss Whedon movie of that name.

Now, the little blurb ("widget"? I have no idea what that element is called) from Oxford Languages showing the dictionary definition appears almost at the top, just after a very small "See results about Serenity (2019 film)", plus a whole right-hand sidebar (on my desktop browser) about the film - but the point is that it does show the definition, very high up in the list. Then for me it's the IMDb (2005 film), IMDb (2019 film), and Wikipedia (2005 film) pages, and then finally the Merriam-Webster definition page (btw I really hate how some sites won't allow us to select text that we would like to copy - they have decided that they know better what we will be allowed to do). And then ofc the Serenity official trailer with Matthew McConaughey, a Rotten Tomatoes review, again a Dictionary(.com) definition, the Serenity symphonic metal band, an Amazon.com HD-DVD listing, and a Cambridge dictionary entry - this is a lot better than it used to be! And yeah, DDG is similar.

It is a constantly evolving landscape, and depends heavily on what types of content you are searching for too.

[–] OpenStars@discuss.online 61 points 6 months ago (7 children)

I highly doubt that this orders search results like Google did ten years ago, ignoring SEO manipulation, though. This looks to fix only the latest category of screw-up.
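By "like it did ten years ago" I mean something crudely like ordering purely on query-term relevance, with no popularity or promotion signals. A toy sketch - emphatically not Google's actual algorithm, past or present, and the pages are invented:

```python
from collections import Counter

def relevance_score(query: str, page_text: str) -> int:
    # Count how many times each query term appears in the page -- nothing else.
    words = Counter(page_text.lower().split())
    return sum(words[term] for term in query.lower().split())

# Hypothetical pages:
pages = {
    "obscure-match.example": "serenity definition and the state of being serene",
    "seo-farm.example": "top 10 movies best deals click now trending today",
}

query = "serenity definition"
ranked = sorted(pages, key=lambda p: relevance_score(query, pages[p]), reverse=True)
print(ranked)  # the on-topic page ranks first; no popularity signal involved
```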

[–] OpenStars@discuss.online 6 points 6 months ago

On the one hand, there are true alternatives like Vimeo. On the other, sometimes you just need to access an existing video...

[–] OpenStars@discuss.online 9 points 6 months ago (2 children)

Um... tech-bros, apparently? :-(
