this post was submitted on 23 May 2024
703 points (98.8% liked)

[–] chrash0@lemmy.world 15 points 6 months ago (1 children)

there are language models that are quite feasible to run locally for easier tasks like this (rough sketch below). “local” rules out both ChatGPT and Copilot, since those models are enormous. AI generally means machine-learned neural networks these days, even if a pile of if-else statements used to pass for it in the past.

not sure how they’re going to handle low-resource machines, but as far as AI integrations go this one is rather tame
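For a concrete sense of what “local for easier tasks” can mean, here is a minimal sketch using the Hugging Face transformers library with a small distilled summarization model. The model choice and the task are illustrative assumptions, not what Firefox actually ships.

```python
# A minimal sketch of running a small language model locally for a
# lightweight task (summarization). The model name and task here are
# illustrative assumptions, not Firefox's actual implementation.
from transformers import pipeline

# A distilled summarization model a few hundred MB in size; runs on a plain CPU.
summarizer = pipeline(
    "summarization",
    model="sshleifer/distilbart-cnn-12-6",  # assumed small model for the demo
    device=-1,  # -1 = CPU only, no GPU or dedicated NPU required
)

article = (
    "Mozilla is experimenting with optional, locally run AI features in Firefox, "
    "aimed at small tasks rather than general-purpose chat."
)
summary = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

The point of the sketch is scale: a distilled model like this is on the order of hundreds of MB to about a gigabyte of weights, versus the tens or hundreds of GB behind a ChatGPT-class model.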

[–] UnderpantsWeevil@lemmy.world -1 points 6 months ago (2 children)

AI generally means machine learned neural networks these days

Right, but a neural network traditionally rules out using a single local machine. Hell, we have entire chip architectures that revolve around neural net optimization. I can't imagine needing that kind of configuration for my internet browser.

not sure how they’re going to handle low-resource machines

One of the perks of Firefox is its relative thinness. Chrome was a shameless resource hog even in its best days, and IE wasn't any better. Do I really want Firefox chewing hundreds of MB of memory so it can... what? Simulate a 600-processor cluster doing weird finger art?

[–] chrash0@lemmy.world 10 points 6 months ago (1 children)

i mean, i’ve worked in neural networks for embedded systems, and it’s definitely possible. i share your skepticism about the overhead, but i’ll eat my shoes if it isn’t opt-in
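As a rough illustration of why embedded-scale inference is plausible, here is a back-of-the-envelope memory estimate. The parameter count and bit widths are assumed numbers for the sake of the arithmetic, not figures from Mozilla.

```python
# Back-of-the-envelope estimate of weight memory for a small on-device model.
# Parameter count and bit widths below are assumptions for illustration only.
def weight_memory_mb(n_params: float, bits_per_weight: int) -> float:
    """Approximate memory for the weights alone, in MB (ignores activations and KV cache)."""
    return n_params * bits_per_weight / 8 / 1e6

# A ~250M-parameter model quantized to 4 bits per weight:
print(f"4-bit: {weight_memory_mb(250e6, 4):.0f} MB")   # ~125 MB
# The same model stored as full 32-bit floats:
print(f"fp32:  {weight_memory_mb(250e6, 32):.0f} MB")  # ~1000 MB
```

Quantization is the usual trick on embedded hardware: cutting weights from 32 bits to 4 shrinks the footprint roughly 8x, which is what makes "hundreds of MB on a plain CPU" a realistic budget rather than a cluster-sized one.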

[–] UnderpantsWeevil@lemmy.world -1 points 6 months ago

I don't doubt it's possible. I'm just not sure how it would be useful.

[–] iopq@lemmy.world 7 points 6 months ago

I use my local machine for neural networks just fine