givesomefucks

joined 2 years ago
[–] givesomefucks@lemmy.world 16 points 1 week ago

Nothing instills confidence in investors more than a nervous employee with a death grip on the door ~~handle~~ button.

[–] givesomefucks@lemmy.world 48 points 1 week ago (2 children)

> Not so. The Tesla safety operators appear to always have their thumb on the “Open Door” button of their door, which is sort of weird to see. It’s like every safety operator is getting ready to say the Irish goodbye to their passengers at any given moment.
>
> Turns out, Full Self Driving disengages if you open the door. So that is probably being used as a janky, factory-standard kill switch for the autonomous driving system that causes the vehicle to come to a controlled stop.

There is absolutely nothing stopping the human employee from sitting behind the wheel except optics...

Like, if there's three passengers, does the employee keep shotgun?

Does a passenger get the driver's seat? Or do they all have to sit in the backseat so people can see it's driving itself?

Is the plan for high-speed situations really just opening a fucking door at speed?

The level of stupidity is, as always, fascinating...

[–] givesomefucks@lemmy.world 13 points 1 week ago

> This study explores the relationship between artificial intelligence (AI) and workers’ well-being and health using longitudinal survey data from Germany (2000–2020).

It's crazy the AI bros will say that this means AI is safe...

But then ask them how much AI has improved since 2020, and without a shred of self-awareness they'll tell you only an idiot would compare 2020 AI to 2025 AI...

Like, it's not the study's fault. It's the fault of anyone trying to apply it to modern AI.

[–] givesomefucks@lemmy.world 23 points 1 week ago (8 children)

Hadn't thought of this before.

The AI summary stops people from going to the website, which means the website the AI used isn't getting any page views.

On a long enough timeline, it would kill webpages, and then the AI would have no new info to steal.

[–] givesomefucks@lemmy.world 29 points 2 weeks ago (1 children)

I mean, there's someone here who has (not even exaggerating) 15+ accounts that they just rotate thru.

It's a hassle to block them all because I still see new ones, but I'll take that over "proving myself" as a unique person with something like this.

[–] givesomefucks@lemmy.world 17 points 2 weeks ago (2 children)

Yep.

The wealthy are pushing it because a lot of the dumbest people will readily offload any critical thinking they can.

Which makes them worse at critical thinking, and even less willing to do it themselves. And it just keeps going like that, making dumb people dumber all the time.

100% intentional, but the AI users simply can't understand what's happening.

[–] givesomefucks@lemmy.world 44 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

> Like, why did you manufacture this intractable problem by mandating clothing and shaming nudity in the first place?

You think humans invented clothing because of shame?

That's completely backwards: we invented clothing for protection, and not seeing everything all the time is what led to shame when someone could see us.

Like, I'm pretty sure hermit crabs feel something similar to shame when they don't have a shell; they need something to drive them not only to protect themselves, but also to ensure they can reproduce and raise their young. That's why humans instinctually get weird about exposed genitals and boobs: those are the most important parts of a human from an evolutionary perspective.

I get what you're trying to say, it's just you're going about it completely backwards

[–] givesomefucks@lemmy.world 14 points 3 weeks ago (1 children)

It's not for the right reason...

Beta was better than VHS, but Beta didn't have porn, so everyone bought VHS.

Meta is so big they could get sued for letting users openly make porn like this, but they also don't want everyone using the tools that can make porn, because that's the biggest threat to their advantage in the market.

If Meta thought they could get away with it, they'd dive dick first into porn.

[–] givesomefucks@lemmy.world -3 points 3 weeks ago (1 children)

> if you feed it the rules of chess and the dimensions of the board it should be able to “play in its head”.

You'd save a lot of time typing if you spent a little more time reading...

[–] givesomefucks@lemmy.world -3 points 3 weeks ago (5 children)

Yeah, but it's chess...

The LLM doesn't have to imagine a board; if you feed it the rules of chess and the dimensions of the board, it should be able to "play in its head".

For a human to have that kind of working memory would take a genius-level intellect and years of practice at the game.

But human working memory is shit compared to virtually every other animal's. This and processing speed are supposed to be AI's main draws.
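
For a rough sense of how little state there actually is to track, here's a minimal sketch (assuming the third-party python-chess library, which nothing above mentions) showing an entire position serialized as one short line of text:

```python
# Minimal sketch: the full state of a chess game fits in one short FEN string.
# Assumes the third-party python-chess library (pip install chess).
import chess

board = chess.Board()

# Play a few opening moves in standard algebraic notation.
for move in ["e4", "e5", "Nf3", "Nc6", "Bb5"]:
    board.push_san(move)

# FEN encodes piece placement, side to move, castling rights,
# en passant square, and move counters in a single line.
print(board.fen())
# r1bqkbnr/pppp1ppp/2n5/1B2p3/4P3/5N2/PPPP1PPP/RNBQK2R b KQkq - 3 3

# ASCII rendering of the same position, for human eyes.
print(board)
```

The whole position is well under a hundred characters of text, so keeping it straight is bookkeeping, not memory.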

[–] givesomefucks@lemmy.world 3 points 3 weeks ago (7 children)

> Although the chatbot had been given a "baseline board" to learn the game and identify pieces, it kept mixing up rooks and bishops, misread moves, and "repeatedly lost track" of where its pieces were. To make matters worse, as Caruso explained, ChatGPT also blamed Atari's icons for being "too abstract to recognize" — but when he switched the game over to standard notation, it didn't perform any better.
>
> For an hour-and-a-half, ChatGPT "made enough blunders to get laughed out of a 3rd grade chess club" while insisting over and over again that it would win "if we just started over," Caruso noted. (And yes, it's kind of creepy that the chatbot apparently referred to itself and the human it was interfacing with as "we.")

It's fucking insane it couldn't keep track of a board...

And it's concerning how confident it is that it will work, because the idiots asking it stuff will believe it. It'll keep failing and keep saying that next time will work, because it's built to maximize engagement.


> “While the District Attorney’s Office agrees that it is highly inappropriate to teach while intoxicated, it is, unfortunately, not illegal.”
