30% of people will believe literally anything. 16% means barely half of even the deranged crowd is interested.
I just want good voice-to-text that runs on my own phone, offline.
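For what it's worth, that's already doable with open models. Below is a minimal sketch of fully offline transcription using the open-source `whisper` package (my choice of tool, not one named in the thread); the audio file name is a placeholder, and on phones these models usually run through ports like whisper.cpp rather than Python.

```python
# Minimal sketch: offline speech-to-text with the open-source whisper
# package (pip install openai-whisper; requires ffmpeg for audio decoding).
# "memo.wav" is a placeholder for your own recording.
import whisper

# "tiny" is the smallest model; it downloads once, then runs fully offline.
model = whisper.load_model("tiny")
result = model.transcribe("memo.wav")
print(result["text"])
```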
AI in Movies: "The only logical solution is the complete control/eradication of humanity."
AI in Real Life: "Dave, I see you only have beer, soda, and cheese in your fridge. I am concerned for your health. I can write you a reminder to purchase better food choices." Dave: "THESE AI ARE EVIL, I WILL NEVER SUBMIT TO YOUR POWER!"
People don't want the hardware if the software sucks.
Why would I need a GPU if the only games that exist to play on it are the equivalent of WildTangent malware games?
If AI matures into something that people actually like, you'll get a different answer here.
Most people have pretty decent AI hardware already in the form of a GPU.
Sure, dedicated hardware might be more efficient for mobile devices, but that work is already done better in the cloud.
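If you want to see whether the GPU you already own counts as decent AI hardware, a quick check like this is enough. The sketch assumes PyTorch with CUDA, which is only one of several frameworks you could use.

```python
# Minimal sketch: check whether an existing consumer GPU is usable for
# local AI workloads, using PyTorch (assumed; any framework would do).
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    vram_gib = props.total_memory / 1024**3
    print(f"GPU available: {props.name} ({vram_gib:.1f} GiB VRAM)")
else:
    print("No CUDA GPU detected; models would fall back to the CPU.")
```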
The Google Coral TPU has been around for years and it's cheap. It works well for object detection.
There are a lot of use cases in manufacturing where you can do automated inspection of parts as they go by on a conveyor, or have a robot arm pick and place parts/boxes/pallets, etc.
Those types of systems have been around for decades, but they can always be improved; a rough sketch of the detection step follows below.
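As an illustration of the detection step those inspection systems run, here is a minimal sketch using the `pycoral` library on a Coral Edge TPU. The model and image file names are placeholders, and it assumes an Edge TPU-compiled SSD model such as the ones Google publishes.

```python
# Minimal sketch: object detection on a Google Coral Edge TPU with the
# pycoral library. File names are placeholders; the .tflite model must be
# compiled for the Edge TPU (e.g. Google's published SSD MobileNet builds).
from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

interpreter = make_interpreter("ssd_mobilenet_v2_edgetpu.tflite")
interpreter.allocate_tensors()

# Resize the frame to the input size the model expects.
image = Image.open("part_on_conveyor.jpg").resize(common.input_size(interpreter))
common.set_input(interpreter, image)
interpreter.invoke()

# Keep only detections above a confidence threshold.
for obj in detect.get_objects(interpreter, score_threshold=0.5):
    print(obj.id, obj.score, obj.bbox)
```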
Personally, I would choose a processor with AI capabilities over one without, but I would not pay more for it.
"enhanced"
It just doesn't really do anything useful from a layman's point of view, besides being a TurboCyberQuantum buzzword.
I've apparently got AI hardware in my tablet, but as far as I'm aware, I've rarely, if ever, actually used it, nor had much of a use for it. Off the top of my head, I can't think of much that would make use of that kind of hardware, aside from some relatively technical software that is almost as happy running on a generic CPU. Opting for AI capabilities would mean paying extra for something I'm not likely to ever make use of.
And the actual stuff that might make use of AI is abstracted away so thoroughly as to be invisible. Maybe the autocorrect feature on my tablet keyboard is in fact powered by the AI hardware, but from the user perspective, nothing has really changed from the old pre-AI keyboard, other than some additions that could just as easily be a matter of newer hardware and software updates rather than any specific AI magic.
I guarantee most of this AI bullshit is nothing but a backdoor to harvest more user info, anyway.
Most people won't pay for it because a lot of AI stuff is done cloud-side. Even stuff that could be done locally is often done in the cloud anyway. If that weren't possible, more people would probably want the hardware. It makes more sense for corporations to invest in the hardware.
People already aren't paying for them; Nvidia's main source of income right now is industry use, not consumer parts.