this post was submitted on 13 Jun 2024
[–] NotAnotherLemmyUser@lemmy.world 9 points 5 months ago (6 children)

What? No. I would rather use my own local LLM where the data never leaves my device. And if I had to submit anything to ChatGPT, I would want it anonymized as much as possible.

Is Apple doing the right thing? Hard to say; any answer here will just be an opinion. There are pros and cons to this decision, and it's up to the end user to decide whether the benefits of using ChatGPT are worth the cost of their data. I can see some useful use cases for this tech, and I don't blame Apple for wanting to strike while the iron is hot.

There's not much you can really do to strip identifying data out of prompts/requests made to ChatGPT; any anonymization of that part of the data is on OpenAI to handle.
Apple can obfuscate which user is asking for what, as well as specific location data, but if I'm using the LLM and I tell it to write up a report while including my full name in my prompt/request... all of that goes directly into OpenAI's servers and logs, which they can eventually use to refine/retrain their model.
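To illustrate that last point, here's a minimal Python sketch of the kind of client-side redaction a middleman could run before forwarding a prompt (purely hypothetical; not anything Apple or OpenAI has described). Pattern-based scrubbing only catches the obvious identifiers:

```python
import re

# Hypothetical redaction patterns -- not Apple's or OpenAI's actual pipeline.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(prompt: str) -> str:
    """Replace easily recognized identifiers with placeholder tokens."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

prompt = "Write up a report and sign it as Jane Q. Public, jane@example.com"
print(redact(prompt))
# -> "Write up a report and sign it as Jane Q. Public, [EMAIL]"
# The email is caught, but the name sails straight through: free-text names,
# addresses, project details, etc. still reach the external servers and logs.
```

Anything the user types as ordinary prose can't be reliably distinguished from identifying data, which is why the anonymization really has to happen (or not) on OpenAI's side.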

[–] prettybunnys@sh.itjust.works -5 points 5 months ago* (last edited 5 months ago) (5 children)

Do you have proof they’re sending it to OpenAI?

I believe I heard it's done on-device or on iCloud servers and then deleted.

I mean, that's the claim, at least.

https://security.apple.com/blog/private-cloud-compute

[–] NotAnotherLemmyUser@lemmy.world 2 points 5 months ago (3 children)

I'd say the burden of proof is on Apple to show that it's being done on-device or that all processing is done on iCloud servers.

You're saying that OpenAI is just going to hand over their full ChatGPT model for Apple to set up on their own servers for free?

But from the article itself:

the partnership could burn extra money for OpenAI, because it pays Microsoft to host ChatGPT's capabilities on its Azure cloud

I could understand it if they created a small version of their LLM to run locally, but I would expect Apple to pay a price even for that.

I think you may be confusing this ChatGPT integration with Apple's own LLM that they're working on... Again, from the linked article:

Still, Apple's choice of ChatGPT as Apple's first external AI integration has led to widespread misunderstanding, especially since Apple buried the lede about its own in-house LLM technology that powers its new "Apple Intelligence" platform.

[–] XiozTzu@lemmy.world -1 points 5 months ago (1 children)
[–] NotAnotherLemmyUser@lemmy.world 1 points 5 months ago (1 children)

Thanks! It's a good read, and I like the idea of the Private Cloud Compute (PCC) system, but it doesn't mention anywhere that ChatGPT will be running in that PCC system (if that's what you were trying to imply).

And while OpenAI could implement something similar to PCC, I haven't seen them announce that anywhere either.

[–] XiozTzu@lemmy.world 1 points 5 months ago

I don't trust OpenAI, but I do trust that Apple is doing what it can.
