this post was submitted on 28 Oct 2024
1536 points (98.8% liked)
Technology
you are viewing a single comment's thread
Honestly, he's wrong though.
I know tons of full-stack developers who use AI to GREATLY speed up their workflow. I've used AI image generators to get an idea to the concept stage before paying an artist to do the finished work, including the revisions I wanted that I couldn't get the AI to produce properly.
And first and foremost, they're great at surfacing information that's out there and has been discussed, but is buried with no SEO behind it. They're terrible at deducing things themselves, because they can't 'think', and at coming up with solutions nobody else already has -- but as long as people are aware of those limitations, they're a pretty good tool to have.
It's a reactionary opinion when people jump to the 'but they're stealing art!' -- isn't your brain also stealing art when it's inspired by others' art? Artists don't just POOF into having the capability to be artists. They learn slowly over time, using others' work as inspiration or as training to improve. That's all diffusion models do -- just a lot faster.
Speaking as someone who worked on AI, and is a fervent (local) AI enthusiast... it's 90% marketing and hype, at least.
These things are tools: they spit out tons of garbage, they basically can't be used for anything where a confidently wrong output would matter, and the way they're trained is still morally dubious at best. And the corporate API business model of "stifle innovation so we can hold our monopoly, then squeeze users" is hellish.
As you pointed out, generative AI is a fantastic tool, but it is a TOOL that needs some massive changes and improvements, wrapped up in hype that gives it a bad name... I drank some of the Kool-Aid too when LLaMA 1 came out, but you have to look at the market and see how much FUD and nonsense is flying around.
As another (local) AI enthusiast I think the point where AI goes from "great" to "just hype" is when it's expected to generate the correct response, image, etc on the first try.
For example, telling an AI to generate a dozen images from a prompt then picking a good one or re-working the prompt a few times to get what you want. That works fantastically well 90% of the time (assuming you're generating something it has been trained on).
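That batch-and-pick workflow is easy to sketch in code. This is a minimal illustration, not any real library's API: `generate_image` and `score` are hypothetical stand-ins for an actual diffusion pipeline call and a quality metric (or a human eyeballing the results).

```python
import random

def generate_image(prompt: str, seed: int) -> str:
    # Hypothetical stand-in for a diffusion model call; a real pipeline
    # would return an image, not a string tagged with its seed.
    return f"image({prompt!r}, seed={seed})"

def score(image: str) -> float:
    # Hypothetical quality metric -- in practice an aesthetic scorer,
    # or just a person picking their favorite out of the batch.
    return random.random()

def best_of_batch(prompt: str, n: int = 12) -> str:
    # Generate a dozen candidates from the same prompt, keep the one
    # that scores highest: the "generate, then pick a good one" loop.
    candidates = [generate_image(prompt, seed) for seed in range(n)]
    return max(candidates, key=score)
```

If no candidate is good enough, you rework the prompt and run the loop again -- the point is that nobody sane treats the first sample as the final answer.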
Expecting AI to respond with the correct answer when given a query > 50% of the time or expecting it not to get it dangerously wrong? Hype. 100% hype.
It'll be a number of years before AI is trustworthy enough not to hallucinate bullshit or generate the exact image you want on the first try.
It's great at brainstorming, fiction writing, being an unreliable but very fast intern-like assistant, and so on... but none of that is very profitable.
Hence you get OpenAI and the like trying to sell it as an omniscient chatbot and (most profitably) an employee replacement.