this post was submitted on 24 Aug 2025
390 points (93.7% liked)
Technology
“AI”
Sharpening, denoising, and upscaling barely count as machine learning. They don't require neural networks.
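To illustrate: basic denoising is just local averaging, no learning involved. A minimal sketch (plain NumPy; the function name and test image are mine):

```python
import numpy as np

def denoise_box(img, size=3):
    """Box-blur denoiser: each pixel becomes the mean of its
    neighbourhood. Pure averaging -- no neural network anywhere."""
    p = size // 2
    padded = np.pad(img, p, mode="edge")  # replicate edges
    out = np.empty_like(img, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = padded[y:y + size, x:x + size].mean()
    return out

noisy = np.array([[10.0, 10.0, 10.0],
                  [10.0, 90.0, 10.0],   # single noisy spike
                  [10.0, 10.0, 10.0]])
clean = denoise_box(noisy)  # the spike gets averaged down
```

Median filtering works the same way with `.mean()` swapped for a median, and handles salt-and-pepper noise better. Either way it's arithmetic, not AI.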
Barely count or not they absolutely ruin every piece of media I've seen them used in. They make people look like wax figures and turn text into gibberish.
Sharpening is a simple convolution, doesn't even count as ML.
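Right, it's a fixed 3x3 kernel slid across the image. A sketch of what that looks like (naive loop version with a common Laplacian-style sharpening kernel; names are mine):

```python
import numpy as np

# Classic 3x3 sharpening kernel: boost the centre pixel,
# subtract its four neighbours -- this amplifies edges.
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def convolve2d(img, kernel):
    """Naive 2D convolution with zero padding. Just multiply-and-sum;
    there is no training and no model."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.empty_like(img, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * kernel)
    return out

img = np.array([[10, 10, 10],
                [10, 50, 10],
                [10, 10, 10]], dtype=float)
sharp = convolve2d(img, SHARPEN)  # the bright centre gets boosted
```

The irony is that a convolutional neural network is built from exactly this operation, except the kernel weights are learned instead of hand-picked.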
I really hate that everything gets the AI label nowadays
The “ai bad” brainrot has everyone thinking that any algorithm is AI and all AI is ChatGPT.
My simple rule is that if it uses a neural network model of some kind, then it can be accurately called AI.
Just today someone told me that Vocaloid was also AI music. They're either too dumb to do some basic fact-checking, or true believers trying to hype up AI by any means necessary.
Thisthisthis
Sharpening and denoising don't. But upscalers worth anything do require neural nets.
Anything that uses a neural network is the definition of AI.
Not true
Company I used to work for had excellent upscalers running on FPGAs that they developed 20+ years ago.
The algorithms have been around for years; the AI label just adds a bit of marketing sprinkle to something that has been a solved problem for years.
Well, the algorithms that make up many neural networks have existed for over 60 years. It's only recently that hardware has been able to make it happen.
Not true and I did say "any upscaler that's worth anything". Upscaling tech has existed at least since digital video was a thing. Pixel interpolation is the simplest and computationally easiest method. But it tends to give a slight hazy appearance.
It's actually far from a solved problem. There's a constant trade-off between processing power and quality, and quality can still be improved by a lot.
Right. Even back in the eighties UK broadcasters were "upscaling" American NTSC 480i60 shows to 576i50. The results varied. High-ticket shows like Friends and Frasier looked great, albeit a bit soft and oversaturated, while live news feeds looked terrible. If you've never seen it, The Day Today has a perfect example of what a lot of US programmes looked like converted to PAL.
Ya, I knew there were analogue "upscalers", but I'm not familiar enough with them to confidently call them an upscaler vs a signal converter.
Depends on what you're trying to upscale.
But you can use AI for that