this post was submitted on 19 Jun 2025
243 points (90.4% liked)

Technology

[–] 3abas@lemm.ee -1 points 2 days ago (3 children)

You can run a model locally on your phone, and it will answer most prompts without breaking a sweat. It actually uses far less energy than googling and loading the content from a website that's hosted 24/7, just waiting for you to access it.

Training a model is expensive, using it isn't.
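For a rough sense of how that comparison could work out, here's a back-of-envelope sketch. The figures are assumptions for illustration only: the oft-cited ~0.3 Wh per Google search (a 2009 Google figure) and a phone SoC drawing ~5 W for ~20 seconds of on-device generation. Real numbers vary a lot with model size, hardware, and prompt length.

```python
# Back-of-envelope energy comparison: on-device LLM inference vs. a web search.
# All figures are rough assumptions, not measurements:
#   - ~0.3 Wh per Google search (widely cited 2009 figure from Google)
#   - a phone SoC drawing ~5 W while generating for ~20 seconds

SEARCH_WH = 0.3          # assumed energy per web search, in watt-hours
PHONE_WATTS = 5.0        # assumed SoC power draw during inference, in watts
GENERATION_SECONDS = 20  # assumed time to answer a typical prompt

def inference_wh(watts: float, seconds: float) -> float:
    """Energy in watt-hours: power (W) * time (s) / 3600 (s per hour)."""
    return watts * seconds / 3600

local_wh = inference_wh(PHONE_WATTS, GENERATION_SECONDS)
print(f"local inference: ~{local_wh:.3f} Wh")  # ~0.028 Wh
print(f"web search:      ~{SEARCH_WH:.3f} Wh")
print(f"search uses ~{SEARCH_WH / local_wh:.0f}x more energy under these assumptions")
```

Under these (debatable) assumptions a search comes out around an order of magnitude more energy than one local answer; with a bigger model or longer generation the comparison can easily flip.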

[–] squaresinger@lemmy.world 3 points 2 days ago

Nice claim you have there. Do you have anything to back that up?

If it's so easy, it shouldn't be hard for you to link a model like that.

[–] Witziger_Waschbaer@feddit.org 4 points 2 days ago

Can you link me to what model you are talking about? I experimented with running some models on my server, but had a rather tough time without a GPU.

[–] bystander@lemmy.ca 4 points 2 days ago

I would like to learn about this a bit more; I keep hearing it in conversations here and there. Do you have links to studies/data on this?