this post was submitted on 10 Jun 2024
88 points (85.5% liked)

Technology

top 22 comments
[–] Stovetop@lemmy.world 62 points 5 months ago (2 children)

Finally Apple is ready to use all that training data they say they don't collect.

[–] kibiz0r@midwest.social 9 points 5 months ago (1 children)

Article says it’s likely an OpenAI partnership.

[–] PrivateNoob@sopuli.xyz 19 points 5 months ago (1 children)
[–] AliasAKA@lemmy.world 2 points 5 months ago (2 children)

Depends. If they get access to the code OpenAI is using, they could absolutely try to leapfrog them. They could also just be looking at ways to get near ChatGPT4 performance locally, on an iPhone. They’d need a lot of tricks, but succeeding there would be a pretty big win for Apple.

[–] technocrit@lemmy.dbzer0.com 3 points 5 months ago (1 children)

People are really racing to destroy the planet so their phone can make a crappy summary of what's on Wikipedia.

[–] AliasAKA@lemmy.world 4 points 5 months ago (1 children)

Not even a summary of what’s on Wikipedia, usually a summary of the top 5 SEO crap webpages for any given query.

[–] Lucidlethargy@sh.itjust.works 1 points 5 months ago

Well yeah, but to be fair we now know exactly how much glue to put into our zesty pizza sauce.

[–] abhibeckert@lemmy.world 1 points 5 months ago* (last edited 5 months ago) (1 children)

near ChatGPT4 performance locally, on an iPhone

Last I checked, iPhones don't have terabytes of RAM. Nothing that runs on a small battery powered device is ever going to be in the ballpark of ChatGPT. At least not in the foreseeable future.

[–] AliasAKA@lemmy.world 1 points 5 months ago

They don’t, but with quantization and distillation, as well as fancy use of fast SSD storage (they published a paper on this exact topic last year), you can get a really decent model to work on device. People are already doing this with things like OpenHermes and Mistral (granted, 7B models, but I could easily see Apple doubling RAM and optimizing models with the research paper I mentioned above, and getting 40B models running entirely locally). If the start of the network is good, a 40B model could take care of the vast majority of user Siri queries without ever reaching out to the server.
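Rough numbers to put that in perspective (a back-of-the-envelope sketch, not from the original comment; the parameter counts and byte math below are illustrative assumptions):

```python
# Illustrative only: approximate size of model weights at different quantization
# widths, ignoring KV cache, activations, and runtime overhead.

def weight_footprint_gb(params_billions: float, bits_per_weight: int) -> float:
    """Return the rough storage needed for the weights alone, in gigabytes."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for params in (7, 40):
    for bits in (16, 4):
        print(f"{params}B params @ {bits}-bit: ~{weight_footprint_gb(params, bits):.1f} GB")

# ~3.5 GB for a 4-bit 7B model is plausible within current iPhone RAM; a 4-bit
# 40B model still needs ~20 GB of weights, which is why streaming weights from
# flash storage (as in Apple's "LLM in a flash" paper) comes up here.
```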

For what it’s worth, according to their WWDC note, they’re basically trying to do this.

[–] Brunbrun6766@lemmy.world 7 points 5 months ago (2 children)

But every apple user has assured me that iPhones are so much more secure and that apple isn't like mean ol Google and toooootally doesn't collect all the same data from you.

[–] Alphane_Moon@lemmy.world 11 points 5 months ago

They will also assure you that Apple totally doesn't collaborate with the CCP or allow them full access to all Chinese users' data.

Apple users like to assure people of many things. :)

[–] harsh3466@lemmy.ml 5 points 5 months ago

Apple user here. I can assure you, Apple sucks just as bad as MS and Google.

[–] Nachorella@lemmy.sdf.org 42 points 5 months ago (2 children)

Wowee, yet another LLM. I bet this one's going to be super different somehow.

[–] CatZoomies@lemmy.world 19 points 5 months ago

Apple: “It’s the best LLM we’ve ever shipped. We think you’re gonna love it.”

[–] Lucidlethargy@sh.itjust.works 4 points 5 months ago

It's just ChatGPT. Nothing is new here. Apple hasn't innovated anything truly new in over a decade. The closest you can get is their processors, and none of those are anything like the innovations they brought us 15-20 years ago.

[–] mannycalavera@feddit.uk 20 points 5 months ago

Can't wait for the Apple "in a world first, we present to you..." press event for AI.

[–] MonkderDritte@feddit.de 12 points 5 months ago

StartUp: innovates

Big corpo: throws money

Somehow it always ends in centralization.

[–] cybersandwich@lemmy.world 2 points 5 months ago (3 children)

Buncha wet blankets on Lemmy. JFC.

I know there is a ton of hype around AI, but at least there is actually something there (unlike crypto).

This is the most exciting thing to happen with computing in a while and if you read Lemmy you'd think everything is bleak and hopeless.

There is so much opportunity to change the way we interact with computers and innovate.

[–] FenrirIII@lemmy.world 3 points 5 months ago

It's exciting, but in the hands of morons (executives).

[–] Clbull@lemmy.world 0 points 5 months ago

I'm actually worried about the next few years.

All it takes is for a deep learning algorithm to learn and perform menial tasks better than humans and that is it. Suddenly whole industries of workers could be made redundant, which would spike unemployment rates. We are not ready for that.

And before you say Universal Basic Income will save us, UBI is little more than a leftist pipe dream that would bankrupt any nation that tries to pursue it as an actual policy.

[–] technocrit@lemmy.dbzer0.com -2 points 5 months ago* (last edited 5 months ago) (1 children)

AI peeps so insecure that they gotta attack crypto (vaguely). Despite its many problems, crypto is a viable alternative to the legacy system of fiat currency that's literally creating war and destroying the planet. It's the only truly international currency. AI solves no problem nearly as important.

[–] cybersandwich@lemmy.world 1 points 5 months ago

Lol?

You think the current currency system is the cause of war?