this post was submitted on 02 Mar 2024
334 points (95.9% liked)
Technology
Because of the way you phrase it.
You only tell ChatGPT your side of the story, and ChatGPT is just a word predictor. If you offer it two options, and you describe one of them with words that are on average 20.69% more positive than the other, ChatGPT just fills in the blanks, sees that that option sounds better, and will probably recommend it.
ChatGPT has no intelligence or reasoning; it's just a word predictor. It doesn't use logic, and it won't analyze the impact of each alternative. It just takes some input and is asked to predict what the next word will be.
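To make that concrete, here's a toy sketch in Python. This is nothing like how ChatGPT actually works internally; it's just a made-up "predictor" (invented word lists and options) that only reacts to how positively each option is worded in the prompt, which is the kind of bias being described:

```python
import re

# Made-up word lists for the example.
POSITIVE_WORDS = {"great", "reliable", "fast", "affordable", "loved"}
NEGATIVE_WORDS = {"slow", "buggy", "expensive", "hated", "unreliable"}


def sentiment_score(description: str) -> int:
    """Count positive words minus negative words in a description."""
    words = re.findall(r"[a-z]+", description.lower())
    return sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)


def naive_recommendation(options: dict[str, str]) -> str:
    """'Recommend' whichever option was described most positively in the prompt."""
    return max(options, key=lambda name: sentiment_score(options[name]))


prompt = {
    "Option A": "a great, fast and reliable tool that people loved",
    "Option B": "a slow, somewhat buggy tool that is also expensive",
}
print(naive_recommendation(prompt))  # prints "Option A"
```

The "recommendation" here never considers what the options actually are, only how they were phrased, which is the point being made about leading questions.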
This is why it drives me insane that people call it AI.
We're just word predictors too.
Yeah, I noticed this when I started having ChatGPT write more sentences in essays I was working on. When you make ChatGPT write the next sentence in a paragraph, 9 times out of 10 it just rewrites what you already wrote in a different way.