this post was submitted on 07 Mar 2026
446 points (98.9% liked)

Technology

[–] TheObviousSolution@lemmy.ca 4 points 37 minutes ago

Just have them add a disclaimer, or hold the hosts liable for what their chatbots say. Stop adding bureaucracy that's just asking to be selectively prosecuted and abused.

[–] ieGod@lemmy.zip 2 points 49 minutes ago

I don't see how you police/enforce this. The technology is out of the bag; people will find ways to access it. Do we need age/location verification for this now too? What if I'm running a local agent? I don't agree with this.

[–] supersquirrel@sopuli.xyz 67 points 7 hours ago (2 children)

I think a better solution is to ban techbros from giving serious economic or cultural advice and take computers away from business majors.

[–] HeyThisIsntTheYMCA@lemmy.world 21 points 5 hours ago (1 children)

Please don't take them entirely away. Maybe just internet access? 30ish years ago I had to do accounting by hand, in those green ledgers. It took approximately twelve times longer to do it by hand than to do it with a computer. And it made me shrimp over like 5 times worse. I needed an architect's table with an angled top in order to work properly, but I could neither get one supplied by the employer nor afford to give one to the employer.

Not all technology is bad

[–] isVeryLoud@lemmy.ca 9 points 3 hours ago (1 children)

Oddly specific gripe, I'll allow it.

[–] HeyThisIsntTheYMCA@lemmy.world 7 points 3 hours ago

thank you i have others in jars in the back

[–] jaybone@lemmy.zip 3 points 3 hours ago (1 children)

I don’t get how some of these tech company CEOs who came up as engineers can be pushing this bullshit. I get that once the company got big they started hiring business bros, but some big companies still have CEOs who were once engineers. You’d think they would know better.

[–] NannerBanner@literature.cafe 1 points 1 hour ago

What kind of engineer? Because while the physical world, with all of its mechanical and civil and aerospace engineers, has its shit figured out with professional standards and very clearly defined responsibilities and duties, the world of social engineers, tire engineers, procurement engineers, supply chain engineers, sandwich engineers, project engineers, lead engineers, and yes, software engineers, is definitely a little too loose with any definition for me to care that these CEOs were once "engineers."

[–] artyom@piefed.social 77 points 7 hours ago* (last edited 7 hours ago)

Hell yeah, let's hold them accountable for disinformation. They'll be gone completely in a matter of months.

Want to get rid of that responsibility? Direct the user to the source. Oh wait, that's just a search engine.

[–] HootinNHollerin@lemmy.dbzer0.com 53 points 7 hours ago

Would be nice if regular legal and health advice was in any way affordable though

[–] mrmaplebar@fedia.io 23 points 7 hours ago (1 children)

This reads as a way to protect white collar industries from the effects of AI without addressing the root problem--that AI does not actually think, and that it is little more than a meat grinder full of scraped data.

[–] SeeMarkFly@lemmy.ml 7 points 6 hours ago (2 children)

In other words, Artificial Stupidity. Why is it CALLED intelligent?

[–] atopi@piefed.blahaj.zone 1 points 3 hours ago (1 children)

it had that name for a really long time

a couple decades ago, a program that could learn was really impressive

[–] SeeMarkFly@lemmy.ml 1 points 1 hour ago* (last edited 1 hour ago)

I remember when LISP was available for my Atari 800.

Yes, I had the FULL 64K of memory installed.

[–] tinkermeister@lemmy.world 13 points 6 hours ago* (last edited 6 hours ago) (2 children)

I may have become too cynical but, as is often the case when you dig deeper, this sounds like the result of lobbyists trying to protect licensing rather than people.

We can be dumb, but we’ve been doing web searches for legal and medical advice for ages because it is too damned expensive and time consuming to go to professionals for every little thing. Not to mention, doctors have so little time for you that it is hard to get them to listen to the whole story to make connections between symptoms.

The LLMs already tell you that they aren’t licensed professionals and, for many, provide citations for their sources (miles better than your typical health website).

As a personal anecdote, my son was having stomach pain but was planning to tough it out. He checked with ChatGPT and it recommended he go to the ER. He did, and if he hadn’t, he would likely be dead now. He spent 3 days in the hospital having a bowel obstruction cleared through a tube in his nose.

There is value in people having that kind of information at their fingertips.

Regulation is absolutely needed, but I would rather they focus on protecting us from AI being used for military purposes, mass surveillance, etc. rather than protecting citizens from ourselves.

[–] tempest@lemmy.ca 13 points 5 hours ago (1 children)

Are you in the US? My takeaway here is that American healthcare is bad, but we're treating the symptom, not the disease.

[–] tinkermeister@lemmy.world 1 points 3 hours ago

Yeah, I’m in the US and I agree. Though it is going to take some serious change to treat the problem. In the meantime, this is at least a stopgap solution for people who don’t have a lot of options.

[–] HeyThisIsntTheYMCA@lemmy.world 3 points 5 hours ago* (last edited 5 hours ago) (1 children)

Wait, he thought he could sit that pain out at home? Your son is tough as nails. Give him a hug for me and everyone else who's had that four-day NG tube delight.

[–] tinkermeister@lemmy.world 2 points 3 hours ago (1 children)

Yeah, he is pretty tough. I wish I could hug him, he is about a 10 hour drive from me. That tube was nightmarish from what he’s told me.

[–] HeyThisIsntTheYMCA@lemmy.world 1 points 1 hour ago (1 children)

if i were his parent, i would be giving him gentle reminders to drink more water. after teasing him for eating way too much corn or broccoli or whatever bastard fiber caused his obstruction (assuming he's in a mental place he can handle the teasing)

[–] tinkermeister@lemmy.world 1 points 1 hour ago

He’s in his 20s so he is only slightly more likely to take my advice than he was as a teenager 😆

[–] webkitten@piefed.social 4 points 5 hours ago (2 children)

This bill gave us the "best" interaction:

https://bsky.app/profile/badmedicaltakes.bsky.social/post/3mghyg5eufk2m

A Bluesky skeet from @badmedicaltakes.bsky.social:

"Twitter user eoghan:

How dare poor people get free medical advice

<quote tweet from Twitter user Polymarket: BREAKING: New York bill would ban AI from answering questions related to medicine, law, dentistry, nursing, psychology, social work, engineering, & more.>

Twitter user YBrogard79094:
JUST MAKE HEALTHCARE ACCESSIBLE

Twitter user eoghan:

AI is literally free healthcare. Being a communist must be exhausting"

[–] deliriousdreams@fedia.io 2 points 2 hours ago

Some horses you can't even lead to water. Let alone make them drink.

[–] Hiro8811@lemmy.world 3 points 3 hours ago

You can google your symptoms and there probably are some reliable sites, but a hallucinating chatbot is a bad idea. Not to mention some people suggested treating covid with chlorine, vinegar, etc.

[–] DarrinBrunner@lemmy.world 8 points 6 hours ago* (last edited 6 hours ago)

Sounds like a start. More is needed though.

The bill targets AI chatbots that impersonate licensed professionals — such as doctors and lawyers — and bars them from providing “substantive response, information, or advice” that would violate professional licensing laws or constitute the unauthorized practice of law.

It also mandates that chatbot owners provide “clear, conspicuous, and explicit” notice to users that they are interacting with an AI system, with the notice displayed in the same language as the chatbot and in a readable font size. However, the bill clarifies that this notice for users, which indicates that they are interacting with a non-human system, does not absolve the chatbot owners of liability.

[–] phx@lemmy.world 6 points 6 hours ago (1 children)

AI in the legal field could be useful for assisting an actual legal professional in checking precedent against on-the-books laws, so long as it cites sources and they verify them.

In the medical field, it could be useful for spotting anomalies between multiple images such as X-rays or cross-referencing medical documents WHEN USED BY A PROFESSIONAL.

But the thing is, it should be a tool - carefully used - to enhance the existing profession, not replace actual professionals.

[–] HeyThisIsntTheYMCA@lemmy.world 1 points 3 hours ago* (last edited 3 hours ago) (1 children)

But the thing is, it should be a tool - carefully used - to enhance the existing profession, not replace actual professionals.

except in practice, the "professionals" just take the LLM's word as unassailable and disengage their brains. funny that, the gap between theory and reality

[–] phx@lemmy.world 2 points 2 hours ago (1 children)

Yup, but those are the cases that make the news. There's always gonna be some stupid/lazy ones

tell me you haven't worked with anyone in the medical industry without telling me you haven't worked with anyone in the medical industry

source: 20 years as a medical accountant
