this post was submitted on 30 Jul 2024
959 points (97.9% liked)
Technology
This is one of the weirdest of several weird things about the people who are marketing AI right now
I went to ChatGPT just now, and one of the suggested prompts it offers is “Message to comfort a friend”.
If I were in some sort of distress and someone sent me a comforting message, and I later found out they'd had ChatGPT write it for them, I think I would abandon the friendship as a pointless endeavor.
What world do these people live in where they're like, “I wish AI would write meaningful messages to my friends for me, so I didn't have to”?
The thing they're trying to market is that a lot of people genuinely don't know what to say at certain times. Instead of replacing an emotional activity, it's meant to be used when you literally can't do it but need to.
Obviously that's not the way it should go, but it is an actual problem they're trying to address. I had a friend feel really down in high school because his parents didn't attend an award ceremony, and I couldn't help because I just didn't know what to say. AI could've hypothetically given me a rough draft or some inspiration. Obviously I wouldn't have just texted what the AI said, but it could've gotten me past the part I was stuck on.
In my experience, AI is shit at that anyway. 9 times out of 10, when I ask it anything even remotely deep, it just restates the problem, like “I'm sorry to hear your parents couldn't make it.” AI can't really solve the problem Google wants it to, and I'm honestly glad it can't.
They're trying to market emotion because emotion sells.
It's also exactly what AI should be kept away from.
But AI also lies and hallucinates, so you can't market it for writing work documents. That could get people fired.
Really though, I wonder if the marketing was already outsourced to the LLM?
Sadly, after working in advertising for over 10 years, I know how dumb the art directors can be about messaging like this. It's why I got out.