this post was submitted on 16 Mar 2025
456 points (97.9% liked)
Technology
A while ago I set up a Siri Shortcut that opens ChatGPT in voice mode. Now I can just say “hey Siri, ask the demon” and within a moment I’m talking to ChatGPT, with no further commands and zero buttons pressed throughout. It answers in voice mode.
This is pretty useful for things like doing unit conversions while my hands are sticky from cooking, or just doing simple information lookups while my hands are busy. I use ChatGPT responsibly, never trusting it for anything beyond one-dimensional information retrieval and summarization. It works great for maybe 50-60% of the things I used to Google. Internet search is, once again, just for finding websites, like it should be.
What’s my point? We don’t need the Apple Intelligence version of Siri to ship. There’s already something better. And it runs on my iPhone 14, which isn’t even compatible with Apple Flatulence.
I’ve never used Apple Intelligence, and Siri alone has done a fine job for unit conversions, info lookups, and most anything else. No need to fire up ChatGPT for those basic uses.
Mm. Siri doesn’t do information lookups in the sense that I mean. It resorts to “here’s something I found on the web” very very quickly, and that’s not very helpful.
Fair enough. It’s not going to summarize results or anything. I’ll give ChatGPT that one. But this is also where ChatGPT can easily pull wrong information into the summary. If the “here’s something I found on the web” isn’t accurate, the AI summary of that topic is likely to get it wrong too. In which case you’re better off manually searching and filtering results yourself.
Hallucination exists but is massively exaggerated in popular discourse about AI. The worst examples of all time get paraded and amplified, while meanwhile people use these tools successfully every day. I’ve spot-checked results and never really been led astray. I do prefer the tools that link to their sources, though.