this post was submitted on 12 Jun 2024
Technology
Apple lays out some details here: https://security.apple.com/blog/private-cloud-compute/
They control the cloud hardware. Information used for cloud requests is deleted as soon as the request is done. Everything is end-to-end encrypted. Server builds are publicly available to inspect. And all of this is only used when on-device processing can't handle a request.
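(If it helps to picture that fallback behaviour, here's a minimal sketch of the routing idea. Every name in it — the types, the complexity score, `route` — is made up for illustration; this is not Apple's actual API, just the "on-device first, Private Cloud Compute only when necessary" logic described above.)

```swift
enum RequestDestination {
    case onDevice
    case privateCloudCompute
}

struct AssistantRequest {
    let prompt: String
    let estimatedComplexity: Int   // hypothetical complexity score
}

// Hypothetical router: prefer on-device processing, and only fall back to
// Private Cloud Compute when the local model can't handle the request.
func route(_ request: AssistantRequest, onDeviceLimit: Int) -> RequestDestination {
    if request.estimatedComplexity <= onDeviceLimit {
        return .onDevice
    }
    // Cloud path: per the blog post, the request is encrypted in transit to
    // attested PCC nodes and the data is discarded once the response returns.
    return .privateCloudCompute
}
```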
If somebody wanted to actually create a private AI system, this is probably how they’d do it.
You can disagree with this, or claim that somehow they are actually accessing and selling people's data, but Apple is going out of its way to show (and cryptographically prove) that it isn't. It would also be incredibly fraudulent and illegal for them to make these claims and not follow through.
To add to that: apart from Apple's cloud processing, data can be sent to OpenAI if a prompt is deemed too complex, but even then you're asked each time whether you want it to go to OpenAI's servers, and apparently OpenAI isn't allowed to store any of that data, though I don't know how much I'd trust that part.
They also claim that whenever data is sent off the device, only the data directly relevant to the prompt is included.
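(As a rough sketch of that consent step — again with made-up names like `sendToOpenAIIfApproved` and `ThirdPartyPayload`, not the real integration — the OpenAI hand-off would sit behind an explicit per-request prompt, and only the prompt-relevant data would go into the payload:)

```swift
struct ThirdPartyPayload {
    // Only the data directly relevant to the prompt, per the claim above.
    let prompt: String
}

// Hypothetical consent gate: ask the user on every single request before
// anything is handed off to OpenAI; send nothing if they decline.
func sendToOpenAIIfApproved(prompt: String,
                            askUser: (String) -> Bool,
                            send: (ThirdPartyPayload) -> String) -> String? {
    guard askUser("Send this request to ChatGPT?") else {
        return nil   // user declined, so nothing is sent to OpenAI
    }
    return send(ThirdPartyPayload(prompt: prompt))
}
```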