this post was submitted on 01 Jan 2024
102 points (90.5% liked)

Technology


Coroner calls on Google and Amazon to act after British woman’s suicide::Chloe Macdermott researched suicide methods on a forum and bought lethal substance online from US

all 23 comments
[–] jet@hackertalks.com 73 points 10 months ago (2 children)

This is the classic: we want to control information so that only painful and tragic exit methods are known to the general public.

I.e. paternalistic gatekeeping: "suicide is a sin."

[–] otter@lemmy.ca 6 points 10 months ago (1 child)

That assumes that the person was going to die regardless, while in reality lots of people can and do get the support that they need. This is different from medically assisted dying.

Hiding information doesn't help, but encouraging support and controlling the market for the poisons can help.

"Suicide is a sin" isn't the only reason we support those dealing with the issues, even if that might be the motivation of people in some places.

[–] jet@hackertalks.com 30 points 10 months ago (1 child)

Helping people, yes

Giving people options, yes

Showing people a better way, yes

Removing options, gatekeeping

Restricting information, gatekeeping

[–] otter@lemmy.ca 14 points 10 months ago (2 children)

> Removing options, gatekeeping

There's some nuance here too

Say, barriers on bridges and high areas that the public can access. It's removing an option, yes, but it might be enough friction to stop the person until they can receive the help they need.

[–] jet@hackertalks.com 21 points 10 months ago

I concede the benefit of barriers to prevent accidents, or to discourage people from jumping from this point right here. Delaying the impulse. We don't deny people the knowledge of gravity, and we don't legislate the removal of high places. If someone really wants to jump they have options, hiking to a cliff etc.

Let's say there is a magic pill that is painless, has no side effects, etc. Let's say we made this available for people's pets in pain, but not for humans in pain. In this fictional universe the gatekeeping of "enough pain" to justify a dignified and self-selected exit is a net evil. As long as a human has agency they should have a choice without officials gatekeeping their knowledge. (I.e. we shouldn't nanny adults.)

[–] unrelatedkeg@lemmy.sdf.org 9 points 10 months ago

I don't think the main point of the barriers is preventing suicide specifically, but safety in general. Preventing suicide is more of a bonus.

[–] Wrench@lemmy.world 69 points 10 months ago (1 child)

Yes. Let's burn all the books too in case someone does something bad with that knowledge. And ban everything but soft pillows from being sold in any store.

What a stupid fucking coroner.

[–] jet@hackertalks.com 23 points 10 months ago (3 children)

Soft pillows could be used for smothering

[–] LolaCat@lemmy.ca 6 points 10 months ago

We should ban everything besides whoopie cushions then; you could probably still smother someone with one but it’d be a lot funnier

[–] SuckMyWang@lemmy.world 2 points 10 months ago

Ban the pillows too. We'll have to use hessian sacks full of straw instead

[–] PlutoniumAcid@lemmy.world 2 points 10 months ago (1 child)

Oh no, not the fluffy cushion!

[–] HerbalGamer@sh.itjust.works 2 points 10 months ago

> smothering

Yeah, a wooden board wouldn't work for that now, would it?

[–] andrew_bidlaw@sh.itjust.works 50 points 10 months ago

She died as a martyr so we can have even more of that universal mass surveillance.

[–] cashews_best_nut@lemmy.world 18 points 10 months ago (2 children)

It was probably sodium nitrite. There's been an increasing number of people using it to exit, as it's readily available because it's used in meat curing. In the UK you can only buy bulk packets in quite low concentrations.

[–] ikidd@lemmy.world 11 points 10 months ago (1 child)

The UK would try to ban nitrogen because it can be used to commit suicide. No more Guinness at the pub, I guess.

[–] Rai@lemmy.dbzer0.com 4 points 10 months ago

No more balloons for your party either!

[–] cinabongo@lemmy.world 9 points 10 months ago (1 child)

Early last year I was almost able to buy it in high concentration from apc pure, but I didn't have a good lie lined up when they asked me why I was purchasing it. Seems like it's no longer available, but I bet it can be got with some effort.

[–] ElPussyKangaroo@lemmy.world 14 points 10 months ago

I hope you're doing okay now.

[–] N00dle@lemmy.world 17 points 10 months ago

Feels like such a reach. I have no love for Amazon, but what are they supposed to do here? Ask you at checkout, "Are you trying to commit suicide with these items?" Type in most suicide-related keywords and Google already pops up suicide prevention info at the top of the screen. People who want to do something will find a way.

[–] SlopppyEngineer@lemmy.world 10 points 10 months ago

I'm guessing that guy has never heard of the Streisand effect or the expression "if it bleeds, it leads".

[–] autotldr@lemmings.world 6 points 10 months ago

This is the best summary I could come up with:


Google and Amazon must act after a British woman made a suicide pact with two people she met online and bought the poison that killed her on the internet, a coroner has said.

Chloe Macdermott, 43, died on 23 May 2021 after buying a lethal substance from the US on Amazon.

She had been struggling with her mental health for several years before she began researching ways to end her life on an online forum, an inquest at inner west London coroner’s court was told this month.

The coroner Paul Rogers recorded a conclusion of suicide and issued a prevention of future deaths report to Google and Amazon, saying he believes they have the power to prevent another similar tragedy.

“Posts are made by users containing details of methods of suicide without any effective administration to remove such harmful content.”

The availability of the poison online and the ability of Britons to have it delivered from the US “without effective border and/or custom controls” was also a matter of concern, Rogers said.


The original article contains 411 words, the summary contains 172 words. Saved 58%. I'm a bot and I'm open source!

[–] trackcharlie@lemmynsfw.com 3 points 10 months ago

Grow the fuck up. What idiot coroner thinks that an individual's personal decision to end their life is literally anyone else's fucking responsibility?

Should have their license pulled for being this much of a fucking idiot.