this post was submitted on 05 Sep 2025
So, at some point, we do have to move on policy, but frankly, I have a really hard time predicting what skillset will be particularly relevant to AI in ten years. I have a hard time knowing what the state of AI itself will be in ten years.
Like, sure, in 2025, it's useful to learn the quirks and characteristics of LLMs or diffusion models to do things with them. I could sit down and tell people some of the things that I've run into. But that knowledge also becomes obsolete very quickly. A lot of the issues and useful knowledge for working with, say, Stable Diffusion 1.5 are essentially irrelevant for Flux. For LLMs, I strongly suspect that there are going to be dramatic changes around reasoning and retaining context. If you put education time into training people on that, you run the risk that they don't learn stuff that stays relevant over the longer haul.
There have been major changes in how all of this works over the past few years, and I think that it is very likely that there will be continuing major changes.
I agree. I looked into some AI stuff and it was really complex.
I've seen this story before: the complex stuff kind of goes away, and then it was a waste of time learning it.