this post was submitted on 03 Apr 2026
63 points (97.0% liked)

Technology

all 29 comments
[–] Photonic@lemmy.world 5 points 3 hours ago
[–] darkangelazuarl@lemmy.world 14 points 6 hours ago (1 children)

So the machine learning version of AI can be very useful for this, because it can identify cancer or other issues earlier in some contexts by looking for patterns that we don't see. But yeah, this is just another tool that should be used alongside professionals in the field.

[–] saimen@feddit.org 2 points 2 hours ago* (last edited 2 hours ago)

Exactly. This "AI replacing humans" rhetoric is just marketing from the tech bros, because otherwise they would only be selling very expensive tools which give professionals a minor edge but mostly aren't worth the (high) cost (at the moment).

[–] db2@lemmy.world 27 points 7 hours ago (1 children)

I can't wait for this bubble to blow up in all their dumb faces.

[–] EvilBit@lemmy.world 34 points 7 hours ago (5 children)

For what it’s worth, “AI” in this context is probably not the content-stealing Generative AI that everyone is trying to cram everywhere it doesn’t belong. This is a much more legitimate application of a similar technology.

I’m not mad about the idea of AI in radiology because it’s a really good fit. A human radiologist can’t compare a hundred similar slices and cross-correlate possible anomalies, whereas AI can. This improves detection and outcomes and is exactly where medical technology is supposed to help.

That said, I don’t think we’ll replace radiologists across the board for a long time. This will be a very useful tool and will probably reduce the number of radiologists required and modify their roles significantly, but it’ll be more like how a single worker with editing software can do work that would have required a small team in the pre-digital days of film.

[–] saimen@feddit.org 3 points 2 hours ago

The number of radiological examinations is steadily increasing, so there won't be fewer radiologists needed; rather, AI is needed to cope with the increasing workload.

AI has much better sensitivity than humans (finding something out of the norm), and humans have much better specificity (basically saying what a certain finding is). So I could imagine AI screening every examination and a radiologist just going through the findings to verify them. For specific tasks this has already been done for years (e.g. pulmonary nodules).
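That screen-then-verify split can be sketched in plain Python. All confusion counts below are made-up illustrative numbers, not from any real study:

```python
# Hypothetical screen-then-verify pipeline: a high-sensitivity AI model flags
# candidate findings, and a high-specificity human radiologist verifies them.

def sensitivity(tp, fn):
    """Fraction of real findings that were flagged (true positive rate)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of normal cases correctly passed over (true negative rate)."""
    return tn / (tn + fp)

# Made-up confusion counts for the AI screening stage on 1000 exams:
ai_tp, ai_fn, ai_fp, ai_tn = 98, 2, 150, 750

print(f"AI screen sensitivity: {sensitivity(ai_tp, ai_fn):.2f}")  # high: misses little
print(f"AI screen specificity: {specificity(ai_tn, ai_fp):.2f}")  # lower: many false flags

# The radiologist only reviews the flagged exams (tp + fp), not all 1000:
flagged = ai_tp + ai_fp
print(f"Exams needing human review: {flagged} of 1000")
```

The point of the numbers: the AI misses almost nothing, at the cost of false flags, and the human workload shrinks from 1000 exams to the flagged subset.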

[–] Grandwolf319@sh.itjust.works 2 points 2 hours ago (1 children)

That assumes it’s done additively.

I think a lot of these AI automation promises come down to:

Are you adding a tool, thereby increasing both the overall quality of service and the cost?

Or are you trying to reduce cost, even if it means reducing service quality?

The first one doesn’t take any job away and makes everything just a bit better but more expensive.

The second one is a race to the bottom strategy that just comes down to capitalism doing its thing.

[–] EvilBit@lemmy.world 1 points 2 hours ago

Too many billionaires are salivating over the latter.

[–] phutatorius@lemmy.zip 13 points 6 hours ago (1 children)

Yeah, it sounds more like ML. That's a good thing; for one thing, it's reproducible.

LLMs are intrinsically unfit for use in any situation where human life or health is at stake.

[–] EvilBit@lemmy.world 4 points 5 hours ago

Exactly. People keep shoehorning Large Language Models into non-linguistic domains, and that’s dangerous. Human language, with respect to the training sets used, is inherently subjective and imperfect. Healthcare is very fault-intolerant.

[–] db2@lemmy.world 11 points 6 hours ago* (last edited 6 hours ago) (2 children)

The replacing part is the problem. Using a local system to help is fine, but it still requires humans who know what they're doing and what they're looking at.

[–] iopq@lemmy.world 1 points 5 hours ago (1 children)

Sometimes. For example, human + AI systems used to be better than either one in isolation, but chess AI improved so much that the human partner actually isn't helping anymore.

[–] saimen@feddit.org 4 points 2 hours ago

But chess is an isolated "system" with clear rules. Reality, and especially medicine, is so much more complicated.

[–] EvilBit@lemmy.world 1 points 5 hours ago* (last edited 5 hours ago)

It doesn’t replace any individual directly. It improves one person’s capability to the extent that there may be fewer needed to do a job. And that’s not a bad thing in my opinion, especially because it can improve the quality of that person’s work at the same time.

Edit to elaborate: I am opposed to replacing humans with AI in general. AI is a tool. But if that tool can empower someone to do more and better work, then I’m not opposed. Using stolen intellectual property to replace creatives with an inherently non-creative slop machine is greedy and evil. Using machine learning trained on medical data sets to let a radiologist more comprehensively and deeply review a frankly overwhelming amount of data to better save lives? I’m cool with that. But I also think that, in line with my stance that AI is a tool, there will likely be a well-trained human operating these tools for a long time before radiologists cease to exist.

[–] frongt@lemmy.zip 1 points 5 hours ago

If it's done properly, sure.

Last time this was in the news, they found that AI had insanely good accuracy at identifying cancer! Until they realized it was because they had included the hospital info in the training data, so it was identifying "cancer" by seeing the patient was at a cancer treatment facility.
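The leakage failure mode is easy to reproduce in miniature. This is a hypothetical toy dataset (the `site` field and labels are invented), not the actual study:

```python
# Hypothetical illustration of label leakage: if the training data encodes
# *where* a scan was taken, a model can "detect cancer" by detecting the
# cancer center, not the tumour.

records = [
    {"site": "oncology_center", "pixels": "...", "label": "cancer"},
    {"site": "oncology_center", "pixels": "...", "label": "cancer"},
    {"site": "general_hospital", "pixels": "...", "label": "healthy"},
    {"site": "general_hospital", "pixels": "...", "label": "healthy"},
]

def leaky_classifier(record):
    # Ignores the image entirely -- the site field alone predicts the label.
    return "cancer" if record["site"] == "oncology_center" else "healthy"

accuracy = sum(leaky_classifier(r) == r["label"] for r in records) / len(records)
print(f"'Accuracy' on leaky data: {accuracy:.0%}")  # 100%, having never looked at a pixel
```

A model like this scores perfectly in evaluation and is useless in deployment, which is why leakage audits matter before trusting headline accuracy numbers.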

[–] Sanctus@anarchist.nexus 8 points 7 hours ago

Don't listen to shit coming out of America. We're probably 85% of the reason why the world sucks.

[–] surfrock66@lemmy.world 6 points 7 hours ago (1 children)

Devil's advocate...this is one of the few good opportunities for ML. If you train a model on a specific dataset with expert validation, this has the opportunity to save lives.

First, radiology isn't one thing; different radiologists with different expertise looking at the same imaging can see different things. Second, there are not enough radiologists; my wife is an ER doctor who only does overnights, and her hospital network has a central radiology center that reviews all films from all the hospitals. It's always backlogged, and waiting on results impacts outcomes in a real way. Third, there are simply human limits to what we can visually perceive; take a look at this study: https://pmc.ncbi.nlm.nih.gov/articles/PMC3964612/

Radiological ML models could change healthcare. Imagine a world where, as part of your annual preventative care, you get a full-body CT. The ML model can compare your CT with references in your demographics AND your prior years, and find changes/issues before they're crises. That's simply not an amount of data analysis that could be done by an army of radiologists, and it has the opportunity to spot things like tumors or organ swelling way earlier. I get that the late-stage-capitalism reality is "they'll use the data to farm money out of you," but from an actual technological standpoint, this could have real life-saving and life-improving implications for a lot of people, and it removes a huge bottleneck in healthcare.
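The year-over-year comparison idea can be sketched very simply. The measurements, structure names, and the 20% threshold below are all hypothetical placeholders:

```python
# Sketch of longitudinal comparison: flag any measured structure whose size
# changed more than a (made-up) relative threshold since the prior scan.
# A real system would compare segmented volumes, not hand-picked diameters.

prior = {"lung_nodule_mm": 4.0, "kidney_mm": 110.0, "liver_mm": 150.0}
current = {"lung_nodule_mm": 6.5, "kidney_mm": 111.0, "liver_mm": 149.0}

GROWTH_THRESHOLD = 0.20  # flag >20% relative change (arbitrary for the sketch)

def flag_changes(prior, current, threshold=GROWTH_THRESHOLD):
    flags = []
    for key, old in prior.items():
        change = (current[key] - old) / old
        if abs(change) > threshold:
            flags.append((key, round(change, 2)))
    return flags

print(flag_changes(prior, current))  # only the fast-growing nodule is flagged
```

The value over a one-off read is exactly what the comment describes: the baseline is *your own* prior scan, so slow drifts within "normal" ranges still stand out.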

[–] EvergreenGuru@lemmy.world 7 points 6 hours ago (2 children)

Even if AI does the job of reading medical imaging extremely well, I’d still want a radiologist to double-check the scans.

[–] saimen@feddit.org 1 points 2 hours ago* (last edited 2 hours ago)

And it clearly would be necessary, because even these highly sophisticated models would only look for what they are trained for and will produce a lot of arbitrary/non-relevant findings.

Also, someone has to take responsibility. The moment software firms are willing to take full responsibility without disclaimers, I will start to believe they might be able to replace some people.

[–] surfrock66@lemmy.world 1 points 6 hours ago (1 children)

I see it as: bulk imaging goes through the ML, which flags things and orders additional, more detailed targeted scans, and those elevate to radiologists.

[–] SpaceNoodle@lemmy.world 1 points 6 hours ago

Instead, they're just gonna throw away the radiologists and make everything computer.

[–] sturmblast@lemmy.world 1 points 4 hours ago
[–] thoralf@discuss.familie-will.at 2 points 5 hours ago

Well, the management says so. I doubt that any trained physician would say the same.

[–] PityPityBangBang@lemmy.world -2 points 7 hours ago (3 children)

I remember a radiologist posting their income on Reddit.

Something like $800,000/year.

It would be sad to see that get taken from people who work that job.

[–] Photonic@lemmy.world 2 points 3 hours ago

That’s just in the USA. This will affect doctors worldwide.

That being said, the workload on radiologists has been increasing year over year, so we need something to help reduce it. In the UK it can sometimes take 3 months to get a report on a scan.

[–] ChaosMonkey@lemmy.dbzer0.com 2 points 6 hours ago

At least you would assume they have some savings and own some property. Losing your job when you live paycheck to paycheck is far worse.

[–] iopq@lemmy.world 0 points 5 hours ago (1 children)

They came for the poor, and being a radiologist I didn't speak out. Then they came for the millionaires, and there was nobody left to speak for me. /s

[–] a4ng3l@lemmy.world 1 points 4 hours ago

The thing with AI is that it has yet to come for the poors…