It wasn't just written by AI, it was badly written and quoted sources that don't exist
Not The Onion
“it wasn’t just drawn by a two-year-old, it also lacked skill and technique”
Communications minister Solly Malatsi withdrew the draft policy after finding that at least 6 of its 67 academic citations were AI-generated hallucinations referencing journal articles that don't exist.
Setting aside whether the text's origin, human or computer or whatnot, was appropriate, I also kind of feel like the legislators and legal people involved should be...you know, doing a careful reading of the actual legislation before it goes to the public for comment.
One thing I have wondered, based on the steady flow of cases (not just in South Africa) in which lawyers have been shown to be using hallucinated text, meaning they were just feeding stuff into some LLM and using the output, is how much legal text prior to LLMs showing up was basically copy-pasted from other random documents authored by someone else, with lawyers not really doing the expected work of ensuring the text was appropriate. Or dumping work that is supposed to be done by a lawyer on some paralegal and signing off on it, or something like that. I mean, LLMs are probably an appealing way to half-ass legal text, but they also probably aren't the first way to cut corners.
"The draft rules also outlined plans for tax breaks, grants, and subsidies to encourage private-sector collaboration in building AI infrastructure in the country."
This is proof that you do not need a Skynet-like AI to be intelligent, conscious, or even to have genuine self-interest in order for it to act in basically exactly the same way as if it did. The paper clip machine is here.
Can I be a paperclip? It sounds less stressful.