this post was submitted on 03 Apr 2026
72 points (97.4% liked)

Technology

[–] db2@lemmy.world 30 points 9 hours ago (1 children)

I can't wait for this bubble to blow up in all their dumb faces.

[–] EvilBit@lemmy.world 35 points 8 hours ago (5 children)

For what it’s worth, “AI” in this context is probably not the content-stealing Generative AI that everyone is trying to cram everywhere it doesn’t belong. This is a much more legitimate application of a similar technology.

I’m not mad about the idea of AI in radiology because it’s a really good fit. A human radiologist can’t compare a hundred similar slices and cross-correlate possible anomalies, whereas AI can. This improves detection and outcomes and is exactly where medical technology is supposed to help.

That said, I don’t think we’ll replace radiologists across the board for a long time. This will be a very useful tool and will probably reduce the number of radiologists required and modify their roles significantly, but it’ll be more like how a single worker with editing software can do work that would have required a small team in the pre-digital days of film.

[–] saimen@feddit.org 4 points 4 hours ago

The number of radiological examinations is steadily increasing, so there won't be fewer radiologists needed; rather, AI is needed to cope with the growing workload.

AI has much better sensitivity than humans (catching anything out of the norm), and humans have much better specificity (basically determining what a given finding actually is). So I could imagine AI screening every examination and a radiologist just going through the flagged findings to verify them. For specific tasks this has already been done for years (e.g. pulmonary nodules).
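To make that division of labor concrete, here's a toy sketch (all numbers invented, not from any real CAD system): a high-sensitivity AI screen over-flags, and a high-specificity human review of only the flagged cases dismisses the false alarms.

```python
# 1 = abnormal finding, 0 = normal

def sensitivity(pred, truth):
    # true positives / all actual positives
    tp = sum(p and t for p, t in zip(pred, truth))
    return tp / sum(truth)

def specificity(pred, truth):
    # true negatives / all actual negatives
    tn = sum(not p and not t for p, t in zip(pred, truth))
    return tn / (len(truth) - sum(truth))

# Hypothetical ground truth for 10 examinations
truth = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]

# AI screen: catches every true finding, but also over-flags two normals
ai = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]

# Human reviews only the AI-flagged cases and confirms the real ones
human_confirms = {0, 1, 2}
combined = [1 if i in human_confirms and ai[i] else 0 for i in range(len(truth))]

print(sensitivity(ai, truth), specificity(ai, truth))              # 1.0 ~0.714
print(sensitivity(combined, truth), specificity(combined, truth))  # 1.0 1.0
```

The pipeline keeps the AI's perfect sensitivity while the human review restores specificity, which is the whole point of putting the reviewer second.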

[–] Grandwolf319@sh.itjust.works 3 points 4 hours ago (1 children)

That assumes it’s done additively.

I think a lot of these AI automation promises come down to:

Are you adding a tool, thereby increasing the overall quality of service along with the cost?

Or are you trying to reduce cost, even if it means reducing service quality?

The first one doesn’t take any job away and makes everything just a bit better but more expensive.

The second one is a race to the bottom strategy that just comes down to capitalism doing its thing.

[–] EvilBit@lemmy.world 1 points 3 hours ago

Too many billionaires are salivating over the latter.

[–] phutatorius@lemmy.zip 15 points 8 hours ago (1 children)

Yeah, it sounds more like ML. That's a good thing; for one thing, it's reproducible.

LLMs are intrinsically unfit for use in any situation where human life or health is at stake.

[–] EvilBit@lemmy.world 5 points 6 hours ago

Exactly. People keep shoehorning Large Language Models into non-linguistic domains, and that’s dangerous. Human language, and thus any training set built from it, is inherently subjective and imperfect. Healthcare is extremely fault-intolerant.

[–] db2@lemmy.world 13 points 8 hours ago* (last edited 8 hours ago) (2 children)

The replacing part is the problem. Using a local system to help is fine, but it still requires humans who know what they're doing and what they're looking at.

[–] EvilBit@lemmy.world 1 points 6 hours ago* (last edited 6 hours ago)

It doesn’t replace any individual directly. It improves one person’s capability to the extent that there may be fewer needed to do a job. And that’s not a bad thing in my opinion, especially because it can improve the quality of that person’s work at the same time.

Edit to elaborate: I am opposed to replacing humans with AI in general. AI is a tool. But if that tool can empower someone to do more and better work, then I’m not opposed. Using stolen intellectual property to replace creatives with an inherently non-creative slop machine is greedy and evil. Using machine learning trained on medical data sets to let a radiologist more comprehensively and deeply review a frankly overwhelming amount of data to better save lives? I’m cool with that. But I also think that, in line with my stance that AI is a tool, there will likely be a well-trained human operating these tools for a long time before radiologists cease to exist.

[–] iopq@lemmy.world 0 points 6 hours ago (1 children)

Sometimes. For example, human + AI teams in chess used to be better than either one in isolation, but chess AI improved so much that the human partner actually isn't helping anymore.

[–] saimen@feddit.org 6 points 4 hours ago

But chess is an isolated "system" with clear rules. Reality, and especially medicine, is so much more complicated.

[–] frongt@lemmy.zip 1 points 6 hours ago

If it's done properly, sure.

Last time this was in the news, they found that AI had insanely good accuracy at identifying cancer! Until they realized it was because they had included the hospital info in the training data, so it was identifying "cancer" by noticing the scan came from a cancer treatment facility.
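That failure mode (label leakage) is easy to reproduce in miniature. In this hypothetical sketch, the facility name perfectly correlates with the label, so a "model" that only learned the leaked feature scores perfectly on its own data while having learned nothing about cancer:

```python
# (hospital, has_cancer) - the hospital field perfectly leaks the label
records = [
    ("oncology_center", 1), ("oncology_center", 1), ("oncology_center", 1),
    ("general_hospital", 0), ("general_hospital", 0), ("general_hospital", 0),
]

# A "model" that latched onto the leaked feature instead of the image
def leaky_model(hospital):
    return 1 if hospital == "oncology_center" else 0

train_acc = sum(leaky_model(h) == y for h, y in records) / len(records)
print(train_acc)  # 1.0 - looks perfect on the training distribution

# Deployment: a cancer patient scanned at a general hospital is missed,
# because no actual medical signal was ever learned
print(leaky_model("general_hospital"))  # 0
```

Held-out validation from *different* facilities, with metadata stripped, is what catches this; in-distribution accuracy alone can't.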