this post was submitted on 01 May 2024
Technology
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related content.
- Be excellent to each other!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below; to ask whether your bot can be added, please contact us.
- Check for duplicates before posting; duplicates may be removed.
Approved Bots
founded 1 year ago
Great, so the headline of the article directly feeds into the very issue the scientists are warning about: public perception of AI morality.
Just another example of journalism ignoring the science and content of its own articles and going for clickbait headlines instead.
I've yet to be convinced that all these AIs are anything more than very good chatbots. They can line up words (or pixels) in a realistic way, but I see no reasoning behind them.
A lot of people, and not just commoners, see "AI" and think "sci-fi robot!"
What you described is exactly what an LLM is. I’m piloting one for work, and sometimes it is useful, while other times it makes up random shit.
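That "lining up words" description can be made concrete. Below is a minimal toy sketch (not any real LLM, and vastly simpler than one) of the underlying idea: a model that only knows which word tends to follow which, chaining likely continuations together with no reasoning involved. The bigram table and function name are invented for the demo:

```python
import random

# Invented toy bigram statistics: for each word, a list of
# (next_word, probability) pairs. A real LLM learns something far
# richer, but the generation loop has the same shape: predict the
# next token, append it, repeat.
BIGRAMS = {
    "the": [("cat", 0.5), ("dog", 0.5)],
    "cat": [("sat", 1.0)],
    "dog": [("ran", 1.0)],
    "sat": [("down", 1.0)],
    "ran": [("away", 1.0)],
}

def generate(start: str, max_words: int = 5, seed: int = 0) -> str:
    """Sample likely next words until no continuation exists."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(max_words):
        choices = BIGRAMS.get(words[-1])
        if not choices:
            break  # no known continuation: stop generating
        nexts, weights = zip(*choices)
        words.append(rng.choices(nexts, weights=weights, k=1)[0])
    return " ".join(words)

print(generate("the"))  # e.g. "the dog ran away"
```

The output is locally plausible because each step follows the statistics, which is exactly why it can also confidently produce made-up facts: nothing in the loop checks truth, only likelihood.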
They aren't even "very good." I tried using one to generate a short story that included a few specific words. It used about half of the requested words; when I pointed this out, it said "this is embarrassing" and tried again. I eventually gave up retrying. It never got all of the words in, and when asked which words it had omitted, it got that wrong too. The quality feels like it has gone downhill since they were first introduced.
"AI" is a marketing term; the sci-fi association is a deliberate choice by companies trying to sell "the future."