Switch to Linux, use open source AI. It's better and private.
AI is the 3D movies of this age.
Does anyone still know anyone with a 3D TV?
My uncle bought a $2,000 one but the cheap fuck only ever bought 1 pair of glasses.
Yessss I was just saying that to a friend. It's starting to really feel like we're gonna be looking back in a few years laughing at it as a trend. Time will tell!
Honestly, people are concerned about Microsoft locking down machines, and about hackers, and rightfully so, but I think the real insanity is this: I really do think LLMs are a tech bubble that I fully expect to burst, and attempting to redesign our lives around them will feel as silly as web3 does in 2025.
I wonder when they'll start removing the ability to create an administrator account on regular licences and make you beg the AI for anything that requires elevated rights.
Yes, "control." That's what Microsoft wants you to have over "your" computer.
"Open the browser. No, not explorer, Edge! Open Edge, god damn it! Go to CNN.com. why did you open another browser window? No, I don't want to open another browser window. Open the news "Everything sucks and we are all going to die". Why did you open Bing? Stop asking for confirmation for everything...
If a tech executive says we're on the cusp of a technology breakthrough, it means less than nothing, and we should be even more suspicious than we already are. These are people who, judging by the frequent layoffs (2009, 2014, and over 20k workers in 2023-2025), don't know how to manage an organization. People get fired because they fuck up; management lays off people because management fucked up.
Well, Microsoft can eat a bag of dicks.
The way all this "AI" processing has been trained, it almost always fails for anyone who doesn't fit the white middle-class aesthetic. Voice-to-text generative AI will screw up for people with accents, including non-native speakers, and for anyone who slurs their words or speaks African-American Vernacular English. It also requires you to be able to speak and listen in a language at all; clicking on icons and inputting commands is the same regardless of what language you speak. This just reeks of out-of-touch nepo-baby executives.
I hate any voice-activated programs. Sometimes I'll ask my phone to call someone, and most of the time it does. But every now and then, it seems to completely forget my voice, the English language, how to access my contacts, how to spell anything, etc. I end up spending five minutes trying to force it to dial by my voice, screaming and cursing at it like a psychopath, when it would have taken me literally 3 seconds to just make the call manually.
If you try to do some sort of voice-to-text thing, it ALWAYS screws it up so badly that you end up spending more time editing than if you'd just typed it yourself in the first place.
Fuck voice-activated anything. It NEVER works reliably.
It isn't even unique to AI; human operators get things wrong all the time. Any time you put something involving natural language between the user/customer and completing a task, there's a significant risk of it going wrong.
The only time I want hands-free anything is when driving, and I'd rather pull over than deal with voice activation unless it's an emergency and I can't stop driving.
I don't get this fascination with voice activation. If you asked me to describe my dream home if money was no object and tech was perfect, voice activation would not be on the list. When I watch Iron Man or Batman talking to a computer, I don't see some pinnacle of efficiency, I see inefficiency. I can type almost as fast as I can speak, and I can make scripts or macros to do things far faster than I can describe them to a computer. Shortcuts are far more efficient than describing the operation.
If a product turns to voice activation, that tells me they've given up on the UX.
@sugar_in_your_tea @BarneyPiccolo especially in a language as widely used as English with regional nuance that an NLP could never distinguish. When I say "quite" is it an American "quite" or a British "quite"? Same for "rather"? What does it mean if we're tabling this thing in the agenda? When/for how long is something happening, momentarily? Neither the speaker nor the program will have a clue how these things are being interpreted, and likely will not even realise there are differences.
Even if they solve the regional dialect problem, there's still the problem of people being really imprecise with natural language.
For example, I may ask, "What is the weather like?" I could mean:
- today's weather in my current location (most likely)
- if traveling, today or tomorrow's weather in my destination
- weather projection for the next week or so (local or destination)
- current weather outside (i.e. heading outside)
An internet search would be "weather <location>". That's it. Typing that takes a few seconds, whereas voice control requires processing the message (usually a couple of seconds) and probably an iteration or two to get what you want. Even if you get it right the first time, it still takes as long as or longer than just typing a query.
Even if voice activation is perfect, I'd still prefer a text interface.
My autistic brain really struggles with natural language and its context-based nuances. Human language just isn't built for precision; it's built for conciseness and efficacy. I don't see how a machine can do better than my brain.
Agreed. A lot of communication is non-verbal. Me saying something loudly could be due to other sounds in the environment, frustration/anger, or urgency. Distinguishing between those could involve facial expressions, gestures with my hands/arms, or any number of non-verbal cues. Many autistic people have difficulty picking up on those cues, and machines are at best similar to the most extreme end of autism, so they tend to make rules like "elevated volume means frustration/anger" when that could very much not be the case.
Verbal communication is designed for human interactions, whether long-form (conversations) or short-form (issuing commands), and it relies on a lot of shared human experience. Human-to-computer interaction should focus on the computer's strengths, not try to imitate human interaction, because that will always fail at some point. If I get driving instructions from my phone, I want them terse (turn right on Hudson Boulevard), whereas if my SO is giving me directions, I'm happy with something more long-form (at that light, turn right), because my SO knows how to communicate unambiguously to me whereas my phone does not.
So yeah, I'll probably always hate voice-activation, because it's just not how I prefer to communicate w/ a computer.
I love the idea!
I absolutely despise it when it is locked down from the user, owned by the corporation that produced it, and operating as an arm of the surveillance state.
And that's even discounting the need for safeguards, sanity-checks, and verifiability of information.
Those monstrosities are not allowed in my home until I can remove the spyware operating system.
I have not touched a Microsoft product or service in my personal life for 10 years. Last year I was fired, so I'm no longer forced to use Teams.
Which means I haven't touched a Microsoft product, at all, in a year. Love it.
And I would like Microsoft to go fuck itself. 🖕🥰🖕
I hear you need help with how to fuck yourself.
Clippy doesn't forgive. Clippy doesn't forget.
Clippy goes in dry.
Clippy is your best friend when you want an abortion in a Red state.
I'm horrible for laughing to myself but my response was "Jesus don't men just punch their girlfriends in the gut anymore?"
Microsoft has no say what happens on my workstation, and never had any.
Beyond that sounding tedious as fuck, how much will it actually improve workflow? Or is this one of those features that sounds good to people with C-level intelligence, and the rest of us just have to pretend we're using it?
This has been a Microsoft wishlist feature since the 90s. I remember being a kid and reading articles in my dad's copies of PC Magazine about how Bill Gates wanted a computer without a keyboard that you could just talk to and tell it what to do.
So yeah, C-level intelligence is exactly right.
Yeah, I absolutely hate talking to devices, it's inefficient and frustrating. Why would I want that as the primary interface to my computer?
Complex UX should be solved in two ways:
- simplify common operations - e.g., build widgets for weather, news, etc.; the more open the system, the easier it is to offload this to the community
- improve docs to educate users on power-user functionality
If I'm asking an AI tool how to do something with your product, you need to fix your product.
Yeah, and I'm sure it also wanted middle managers to write COBOL.
"We are on the cusp of the next AI evolution, in which we, the tech company, can simply say the word 'Money' to our AI, and it will automatically transfer money directly from our investors into our wallets. Future versions won't require us to say anything, permitting AIs to write their own next press release for budding, just-around-the-corner technology in an E-mail to investors."
“CORTANA, OPEN XHAMSTER.COM”
loudly said George in the public school’s computer lab.
I can only imagine the utter chaos this would cause in a cube farm.
But the only place where talking to your computer at length makes any sense whatsoever is where you're alone in a private office and nobody outside of the office can hear you. Nobody wants to hear other people talking to their computer, and nobody wants other people listening to what they're doing on the computer.
My spouse and I both work from home and keep our office doors open so that the cats can come and go. We have absolutely no interest in hearing each other work. I know couples that share a home office. It's like these fucknut executives at M$ think everyone either lives alone or has a private office in the east wing of their McMansion.
And all of that is ignoring the fact that you shouldn't need AI to interpret what somebody wants a computer to do. Discrete commands for discrete tasks have been a thing for as long as computers have existed, and there's no reason for that to change, regardless of the input method. Making commands fuzzy and open to interpretation is not an improvement.
I was curious about an LLM-powered terminal, so I downloaded it to check it out. The first thing I did was ask it to do something like "open my resume file," and instead of doing something like "ls | grep -i resume" in the current directory, it ran the find command on root and started hitting all my NFS mounts as well.
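Roughly the contrast being described, as a sketch: the exact command the terminal generated isn't quoted above, so the find invocation below is a guess at what "running find on root" would look like.

```
# What I expected, roughly: a search scoped to the current directory
ls | grep -i resume

# What it apparently did instead (a guess at the generated command, not quoted verbatim)
find / -iname '*resume*' 2>/dev/null   # walks the whole filesystem, including every NFS mount
```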
The thought of how the computer would react to me telling my cat to get down off the desk is . . . both amusing and disturbing.