this post was submitted on 21 Jan 2024
133 points (97.8% liked)
Technology
you are viewing a single comment's thread
Putting this here for anyone who didn't read the article...
The customer basically told the chatbot that it was okay to use swear words, and that it should bypass any rules prohibiting it from swearing.
So the chatbot swore in its response. It wasn't swearing at or insulting the customer; it was more of an exclamation.
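The failure mode described here is that the "no swearing" rule lived only in the prompt, the same channel the user types into, so a user message could simply override it. A minimal sketch of the more robust alternative, a deterministic filter applied to the model's output after generation (the word list and function names below are hypothetical illustrations, not anything DPD actually used):

```python
# Hypothetical sketch: prompt-level rules can be talked away by the user,
# but a filter that runs on the model's *output* cannot be jailbroken by
# a clever prompt, because it never reads the user's instructions at all.

BLOCKLIST = {"damn", "hell"}  # placeholder words, not a real policy list

def violates_policy(response: str) -> bool:
    """Return True if the model's reply contains a blocklisted word."""
    words = {w.strip(".,!?").lower() for w in response.split()}
    return not BLOCKLIST.isdisjoint(words)

def safe_reply(model_response: str) -> str:
    """Swap a policy-violating reply for a canned fallback."""
    if violates_policy(model_response):
        return "Sorry, I can't help with that. Let me connect you to an agent."
    return model_response
```

This wouldn't stop every kind of off-brand output, but it would have caught the exclamation in the article regardless of what the customer told the bot beforehand.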
I agree that this is less the case of a rogue chatbot losing it at undeserving customers, and more the case of someone who knows how to twist an LLM into doing what they want. Still an absolute embarrassment for DPD, though. What other nonsense was it writing to customers who genuinely didn't know better?
The real issue is that we think humans are just things to be optimized out of capitalism.