this post was submitted on 10 Jun 2024
88 points (85.5% liked)

Technology

59605 readers
3366 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] Stovetop@lemmy.world 62 points 5 months ago (2 children)

Finally, Apple is ready to use all that training data they say they don't collect.

[–] kibiz0r@midwest.social 9 points 5 months ago (1 children)

Article says it’s likely an OpenAI partnership.

[–] PrivateNoob@sopuli.xyz 19 points 5 months ago (1 children)
[–] AliasAKA@lemmy.world 2 points 5 months ago (2 children)

Depends. If they get access to the code OpenAI is using, they could absolutely try to leapfrog them. They could also just be looking at ways to get near ChatGPT4 performance locally, on an iPhone. They’d need a lot of tricks, but succeeding there would be a pretty big win for Apple.

[–] technocrit@lemmy.dbzer0.com 3 points 5 months ago (1 children)

People are really racing to destroy the planet so their phone can make a crappy summary of what's on wikipedia.

[–] AliasAKA@lemmy.world 4 points 5 months ago (1 children)

Not even a summary of what’s on Wikipedia, usually a summary of the top 5 SEO crap webpages for any given query.

[–] Lucidlethargy@sh.itjust.works 1 points 5 months ago

Well yeah, but to be fair we now know exactly how much glue to put into our zesty pizza sauce.

[–] abhibeckert@lemmy.world 1 points 5 months ago* (last edited 5 months ago) (1 children)

> near ChatGPT4 performance locally, on an iPhone

Last I checked, iPhones don't have terabytes of RAM. Nothing that runs on a small battery powered device is ever going to be in the ballpark of ChatGPT. At least not in the foreseeable future.

[–] AliasAKA@lemmy.world 1 points 5 months ago

They don’t, but with quantization and distillation, plus clever use of fast SSD storage (they published a paper on exactly this topic last year), you can get a really decent model running on device. People are already doing this with models like OpenHermes and Mistral (granted, those are 7B models, but I could easily see Apple doubling the RAM and optimizing models with the techniques from that paper to get 40B models running entirely locally). If the on-device model is good, a 40B model could handle the vast majority of Siri queries without ever reaching out to a server.

For what it’s worth, according to their WWDC keynote, they’re basically trying to do this.
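The RAM argument above comes down to simple arithmetic. Here's a rough back-of-envelope sketch (my own illustration, not Apple's published method) of why a quantized 7B model already fits comfortably on a phone, and why a 40B model would need aggressive quantization plus tricks like flash offloading or a RAM bump:

```python
def model_footprint_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight-storage size in GB.

    Ignores KV cache and activations, so real usage is somewhat higher.
    """
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight / 1e9

# Compare full-precision (16-bit) vs. 4-bit quantized footprints.
for params in (7, 40):
    for bits in (16, 4):
        gb = model_footprint_gb(params, bits)
        print(f"{params}B model @ {bits}-bit: ~{gb:.1f} GB")
# 7B @ 4-bit lands around 3.5 GB (phone-feasible today);
# 40B @ 4-bit is around 20 GB, hence the need for SSD offloading or more RAM.
```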

[–] Brunbrun6766@lemmy.world 7 points 5 months ago (2 children)

But every Apple user has assured me that iPhones are so much more secure, and that Apple isn't like mean ol' Google and toooootally doesn't collect all the same data from you.

[–] Alphane_Moon@lemmy.world 11 points 5 months ago

They will also assure you that Apple totally doesn't collaborate with the CCP or allow them full access to Chinese users' data.

Apple users like to assure people of many things. :)

[–] harsh3466@lemmy.ml 5 points 5 months ago

Apple user here. I can assure you, Apple sucks just as bad as MS and Google.