Was on Feedly; have now moved to News Explorer for iOS. Self-hosted (runs on device), syncs between iOS devices using iCloud.
Mbourgon
Wait, which group?
Waveguide: here's a good article on it: https://www.theverge.com/2024/5/9/24153092/stanford-ai-holographic-ar-glasses-3d-imaging-research
Several others, though a couple seem to be about a proof of concept.
- https://www.adventhealth.com/business/adventhealth-central-florida-media-resources/news/surgical-team-adventhealth-performs-worlds-first-its-kind-procedure-using-apple-vision-pro-mixed
- https://time.com/7093536/surgeons-apple-vision-pro/
- https://pubmed.ncbi.nlm.nih.gov/39140319/
- https://appleinsider.com/articles/24/05/09/more-doctors-are-embracing-apple-vision-pro-for-precision-keyhole-surgeries/amp/
Some reasons:
- Apple needs new products - even something like this gets headlines and reminds people about the cool product, so maybe they buy a different Apple product instead. Even if it doesn't make money, it keeps Apple looking "new and innovative" and helps recruitment.
- Gets it out there for developers to try out and come up with use cases and killer apps.
- People (prosumers) come up with uses that Apple and devs may not have thought of.
- Allows people from the previous point to bring them to work - after all, that's how Apple got big in the first place: people bringing in their Apple ][ and VisiCalc because their IT wasn't responsive enough or they hated working on mainframes. It wouldn't surprise me if one of those doctors brought it in themselves, thinking it might be useful.
- Allows Apple to justify the R&D money for the GUI, UX, hand gestures, etc., that they're going to need later. Gotta keep shareholders happy.
Here’s a good article about this specific waveguide: https://www.theverge.com/2024/5/9/24153092/stanford-ai-holographic-ar-glasses-3d-imaging-research
TL;DR: they need special materials to make small, thin glasses possible for XR goggles. This looks like it could be huge.
Yeah, I've seen where doctors are using it for surgery, and I see all sorts of parallels to the portable computing movement of the '90s, which was about having tablets instead of a ton of manuals, plus some of the AR/MR stuff where it shows them where everything goes while they're looking at the part in question.
I went and did the Apple demo. I was there for something else at the time, and they had an opening, so I jumped on it. I highly recommend doing the demo; it's honestly really freaking impressive. I'm not positive what the killer app is for it yet, or if this is just a step in long-term AR/MR, but what they've done is really impressive. Yes, it's expensive as hell, and my suspicion is that long term the displays will be replaced with a waveguide (Stanford's looks pretty good at this point), so it won't need the external-facing display, but they've got the head and hand tracking in a good spot, as well as the gestures needed for it.
Maybe the killer app will be the overlay itself, where it uses a camera/location/audio to see what's going on and present more context. Looking at a menu? Okay, I've had this and this and liked them, but their X I'm not a fan of. I need Y from the grocery store - where is it on the shelves? More than anything, I think they saw what Google Glass could become capable of, figured the phone as it is now (screen, etc.) was going to become obsolete at some point, and were terrified of losing that race.
Ooh, a new Knives Out! Thanks!
I’d say he has a perfect track record - 0.00%
Dark humor is like food. Not everybody gets it.
Daaaaaamn. You win.