And, just like enshittification, the term is being thrown about with such wild abandon that it barely means anything anymore. Most of the time it seems to me that "Embrace, Extend, Extinguish!" translates to "thing I like got popular and now may be used by thing I don't like."
Cory's original usage of the word gave it a useful and specific meaning. But popular usage has evolved extremely rapidly, to the point where the word simply means "I don't like this thing." Which takes away its usefulness, because it's no longer describing a specific reason for not liking it.
It'd be like if every kind of ailment started being referred to as an "infection." Concussions, sprains, hypothermia, etc., all being passed off as "he got infected." We already have generic terms for that, like "he got hurt," and now when someone does get literally infected we've lost the word that would specify that.
Languages evolve, sure. But that doesn't mean it's always in a good direction. In this specific case evolution is enshittifying the language and that's worth a little (admittedly futile) push-back.
Well, semi-serious. Doing this to "save characters" is obviously silly; we've spent way more characters discussing how to save characters than could possibly have been saved (and it's not a valuable "savings" regardless). But I was still paying attention to practicality, because IMO the best silly things are the ones you can take seriously.
Indeed. Often the hardest part of an invention is the discovery that a thing is actually possible. Even if nobody knows how it was done, they can now justify throwing resources into figuring it out, and they know what results to keep an eye out for.
That's one step too far, though. There'd be no way to distinguish them without that number.
It's still driving the state of the art forward, which will result in models that will be used by the public.
You could also drop the "." in this case, saving another three.
It's so annoying how suddenly everyone's convinced that "AI" is some highly specific thing that hasn't been accomplished yet. Artificial intelligence is an extremely broad subject within computer science, and things that fit the description have been around for decades. The journal Artificial Intelligence was first published in 1970, 54 years ago.
We've got something that's passing the frickin' Turing test now, and all of a sudden the term "artificial intelligence" is too good for that? Bah.
I've heard talk of mbin being a fork with more active development. Is it getting "ahead" of kbin, or is kbin taking changes back from it? No disrespect intended to Earnest, but a single developer probably can't keep up with all this on their own.
I'd take a step farther back and say the argument hinges on whether "consciousness" is even really a thing, or if we're "faking" it to each other and to ourselves as well. We still don't have a particularly good way of measuring human consciousness, let alone determining whether AIs have it too.
I knew you'd say that.
Not necessarily, but in this particular case it seems bad to me. We're losing a specialized term for something that IMO warrants having one.