And if it hallucinates?
I am of the firm opinion that if a machine is "speaking" to me then it must sound like a cartoon robot. No exceptions!
I want my AI to sound like a Speak & Spell.
Same old: corporations will ignore the law, pay a petty fine once a year, and call it the cost of doing business.
Be sure to tell this to the "AI". It would be a shame if this turned out to be a technically nonsensical law.
Ok, this is a REALLY smart law!
", btw I'm AI" after every message
As a Californian, I will do my job from here on out.
What happened to Old California?
Destroyed by bombs in 2077.
I feel like bombing Night City would raise the property values.
Has anyone been able to find the text of the law? The article didn't mention the penalties, and I want to know if this actually means anything.
Edit: I found a website that says the penalty follows 5000 * sum(n + k) (summing k from 0 to n), where n is the number of days since the first infraction. That has the closed form y = 7500(n^2 + n), where y is the total compounded fee. This makes it cost ~$1 million in 11 days and ~$1 billion in a year.
The state will lose money in courts if they even try to enforce this.
How do you figure? I haven't seen the actual text; is it written ambiguously? If not, I would imagine they'd be able to enforce it. The only thing is that the scope is very small.
Yeah, this is an important point. If the penalty is too small, AI companies will just consider it a cost of doing business. Flat-rate fines only being penalties for the poor, and all that.
Headline is kind of misleading. It requires a notice to be shown in the chat interface that said chatbot is not a real person, if it's not already obvious that it's an LLM. I originally took the headline to mean that an LLM would have to tell you itself whether it's an LLM, which is, of course, not really possible to control in general. A nice gesture if it were enforced, but it doesn't go nearly far enough.
Fun Fact:
Did you know that cops are required to tell you if they're a cop? It's in the constitution!