Devial

joined 3 weeks ago
[–] Devial@discuss.online 1 points 1 day ago* (last edited 1 day ago)

> To address your two points, where did people get the idea that the word porn implies artistic merit or consent?

I didn't say merit (or consent, though I assume that one's a typo); I said artistic intent, which every creative work has by definition.

There is nothing ethically wrong with porn in a vacuum, so a term that categorises CSAM as a subset of something that isn't inherently ethically wrong is, in my opinion, a bad term. CSAM clearly and strictly delineates it from consensual porn.

> CP can stand for a lot of things but it's common parlance now. CSAM just causes confusion.

Ah yes. The acronym with MORE common definitions somehow causes less confusion. That makes perfect sense. Of course. That explains why so many people in this thread were confused by it. Oh no wait. They weren't.

Also, really? Now you're stooping to the old "why so mad bro?". You're the one having a meltdown; I'm just wasting time at work by sharing an opinion.

You're the one who got upset enough about me using a common abbreviation, that no one in the thread was remotely confused by, to kick off this entire shit. You decided you needed to pedantically comment on this. I'm simply defending myself from your pedantic grammar nazi shit.

[–] Devial@discuss.online 2 points 1 day ago* (last edited 1 day ago) (2 children)

I'm not comparing you to Ben Shapiro; I'm comparing your grammar-nazi pedantry to a single specific instance of his grammar-nazi pedantry.

I also gave several explicit reasons why using CP over CSAM is idiotic, not just "my friends say so".

So that's 2 for 2 on wildly and dishonestly misrepresenting my points.

But hey, if you want to be like that sure.

You're right, everyone else is wrong, you do you and keep using CP instead of CSAM, and keep getting irrationally upset and angry at people who think CSAM is a better term. Happy now?

[–] Devial@discuss.online 1 points 1 day ago* (last edited 1 day ago) (4 children)

Big "Ben Shapiro ranting about renewable energies because of the first law of thermodynamics" energy right here.

And your example proves literally the opposite point. Lolita could be argued to be child porn, as it's pornographic material depicting (fictional/animated) children. It is objectively NOT CSAM, because it does not contain CSA: you can't sexually abuse a fictional animated character.

CP is also a common acronym that can mean many other things.

Porn also implies it's a work of artistic intent, which is just wrong for CSAM.

> The majority of people can be wrong.

No, they can't, not with regard to linguistics. Linguistics is a descriptive science, not a prescriptive one. Words and language, by definition and by the convention of every serious linguist in the world, mean what the majority of people think they mean. That's how language works.

[–] Devial@discuss.online 5 points 1 day ago* (last edited 1 day ago)

Also, the data set wasn't hosted, created, or explicitly used by Google in any way.

It was a common data set used in various academic papers on training nudity detectors.

Did you seriously just read the headline, guess what happened, and are now arguing, based on that guess, that I, who actually read the article, am wrong about its content? Because that's sure what it feels like reading your comments...

[–] Devial@discuss.online 2 points 1 day ago* (last edited 1 day ago)

So you didn't read my comment then, did you?

He got banned because Google's automated monitoring system, entirely correctly, detected that the content he unzipped contained CSAM. It wasn't even a manual decision to ban him.

His ban had literally nothing whatsoever to do with the fact that the CSAM was part of an AI training data set.

[–] Devial@discuss.online 1 points 1 day ago (6 children)

Material can be anything. It can be images, videos, theoretically even audio recordings.

"Images" is a relevant and sensible distinction. And judging by the downvotes you're collecting, the majority of people disagree with you.

[–] Devial@discuss.online 6 points 1 day ago (8 children)

Which of the letters in CSAM stands for "images", then?

[–] Devial@discuss.online 17 points 1 day ago* (last edited 1 day ago) (3 children)

They didn't get mad; they didn't even know THAT he reported it, and they have no reason or incentive to sweep it under the rug, because they have no connection to the data set. Did you even read my comment?

I hate Alphabet as much as the next person, but this feels like you're just trying to find any excuse to hate on them, even if it's basically a made up reason.

[–] Devial@discuss.online 129 points 2 days ago* (last edited 2 days ago) (30 children)

The article headline is wildly misleading, bordering on being just a straight up lie.

Google didn't ban the developer for reporting the material; they didn't even know he had reported it, because he did so anonymously, and to a child protection org, not to Google.

Google's automatic tools, correctly, flagged the CSAM when he unzipped the data and subsequently nuked his account.

Google's only failure here was not unbanning him on his first or second appeal. And whilst that is absolutely a big failure on Google's part, I find it very understandable that the appeals team, generally speaking, won't accept "I didn't know the folder I uploaded contained CSAM" as a valid ban appeal reason.

It's also kind of insane how this article somehow makes a bigger deal out of this developer being temporarily banned by Google than it does out of the fact that hundreds of CSAM images were freely available online and openly shareable by anyone, and to anyone, for god knows how long.

[–] Devial@discuss.online 9 points 2 days ago* (last edited 2 days ago)

They reacted to the presence of CSAM. It had nothing whatsoever to do with the material being part of an AI training dataset, contrary to what the comment I originally replied to states.

[–] Devial@discuss.online 10 points 2 days ago* (last edited 2 days ago) (2 children)

They didn't react to anything. The automated system (correctly) flagged and banned the account for CSAM, and, as usual, the manual ban appeal sucked ass and didn't do what it's supposed to do. (Also, whilst this is obviously a very unusual case, and the ban should have been overturned on appeal right away, it does make sense that the appeals team, broadly speaking, rejects "I didn't know this contained CSAM" as a legitimate appeal reason.) This is barely newsworthy. The real headline should be about how hundreds of CSAM images were freely available and shareable from this data set.

[–] Devial@discuss.online 25 points 2 days ago* (last edited 2 days ago) (4 children)

Did you even read the article? The dude reported it anonymously, to a child protection org, not Google, and his account was nuked as soon as he unzipped the data, because the content was automatically flagged.

Google didn't even know he reported this, and Google has nothing whatsoever to do with this dataset. They didn't create it, and they don't own or host it.
