this post was submitted on 23 Feb 2024
97 points (91.5% liked)
Technology
It's a pretty extreme example, but I can definitely see how an AI would fumble need-to-know historical context.
Calling it AI at all was a mistake; it has no capacity for thought. It just spits out what it was asked for, like the world's dumbest genie.
Yeah, it's called generative AI because all it does is generate random crap based on ingested material and questions. It's a fancy autocomplete, and people are relying on it.
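A toy sketch of that "fancy autocomplete" idea: pick the next word by counting which words followed the previous one in ingested text. This is a deliberately dumbed-down illustration (real LLMs use neural networks over tokens, not word bigrams), with a made-up corpus:

```python
# Toy "fancy autocomplete": next-word prediction from bigram counts.
# Corpus and logic are illustrative only, not how real LLMs work.
import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which in the ingested text.
model = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    model[prev][nxt] += 1

def generate(seed: str, length: int = 5) -> str:
    """Repeatedly sample a likely next word, weighted by observed counts."""
    words = [seed]
    for _ in range(length):
        followers = model[words[-1]]
        if not followers:  # dead end: the word never appeared mid-corpus
            break
        words.append(
            random.choices(list(followers), weights=list(followers.values()))[0]
        )
    return " ".join(words)

print(generate("the"))  # e.g. "the cat sat on the mat"
```

Everything it "says" is just a statistical echo of what it ingested, which is the point the comment is making.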
The term "AI" has a much broader meaning and use than the sci-fi "thinking machine" that people are interpreting it as. The term has been in use by scientists for many decades already, and these generative image programs and LLMs definitely fit within it.
You are likely thinking of AGI, or artificial general intelligence. We don't have those yet, but these things aren't intended to be AGI so that's to be expected.
Thanks for saying this. I'm pretty tired of people claiming we've changed the definition of AI to include this stuff, when I remember talking about AI in terms of things like random forests and Q-learning 20 years ago.
Hell, AI for years has meant "crappy script that can make a little guy run around and shoot his gun at you sometimes."
It’s probably not the base model. It’s probably the hidden prompt augmentation rules that everyone has been scrambling to add to these models.
Gemini, like ChatGPT, is likely appending simple prompts with more detail behind the scenes so results come out more varied. For example, since the models regurgitate the content people promote on the internet, if you searched for "attractive person" last year, you'd probably get a white person 95% of the time. When something is over-represented in mainstream media, gen AI will reflect that back.
Now, before something goes into the gen AI black box, it is secretly given racial and photo style modifiers so users get more diverse people and image styles unless the user manually specifies what they want to see.
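Conceptually, that hidden augmentation step might look something like the sketch below. The modifier lists, function name, and injection logic are all invented for illustration; the actual rules these services use are not public:

```python
# Hypothetical sketch of hidden prompt augmentation, as described above.
# Modifier lists and logic are invented for illustration, NOT Gemini's
# or ChatGPT's actual (undisclosed) rules.
import random

STYLE_MODIFIERS = ["photorealistic", "oil painting", "watercolor"]
ETHNICITY_MODIFIERS = ["South Asian", "Black", "East Asian", "white"]

def augment_prompt(user_prompt: str) -> str:
    """Silently append diversity/style modifiers unless the user
    already specified what they want to see."""
    prompt = user_prompt
    # Only inject an ethnicity modifier if the user didn't name one.
    if not any(m.lower() in prompt.lower() for m in ETHNICITY_MODIFIERS):
        prompt += f", {random.choice(ETHNICITY_MODIFIERS)}"
    # Likewise for image style.
    if not any(m.lower() in prompt.lower() for m in STYLE_MODIFIERS):
        prompt += f", {random.choice(STYLE_MODIFIERS)}"
    return prompt

print(augment_prompt("attractive person"))
# e.g. "attractive person, East Asian, watercolor"
```

The backfire case in the next comment follows directly from this design: the injection step has no idea whether the subject of the prompt already implies a specific ethnicity.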
Generally this works well, but it can backfire hilariously if you ask it to draw a group of people who are famous for having a certain ethnicity.