zbyte64

joined 1 year ago
[–] zbyte64@awful.systems 15 points 1 year ago (2 children)

Buy a Supreme Court justice with the first $100M, then the rest of the court with the other billion.

[–] zbyte64@awful.systems 7 points 1 year ago

The genie comes from a time when offering any monetary loan was considered usury; it doesn't care about your governments made of mortals.

[–] zbyte64@awful.systems 0 points 1 year ago (1 children)

Huh, I guess you can do just about anything if you're willing to pay the political cost.

[–] zbyte64@awful.systems 27 points 1 year ago (4 children)

Those suggesting other candidates should probably address the three issues AOC brought up:

  • Endorsements don't transfer
  • Campaign contributions don't transfer
  • Possible state-level shenanigans by the GOP to keep a replacement candidate off the ballot on some technicality

None of these issues exist if Biden simply steps down.

[–] zbyte64@awful.systems 3 points 1 year ago* (last edited 1 year ago)

Monkey's paw finger curls in

Don't worry, Nancy wants Newsom to be the candidate.

[–] zbyte64@awful.systems 7 points 1 year ago

You did not just fall out of a coconut tree. You exist in a context of all that came before you.

[–] zbyte64@awful.systems 10 points 1 year ago

Elon: The EV market has soured on me, I'm willing to trade the tax credits for something...

Don: I got just the thing: you can sell SpaceX satellites to the Russians

Elon: Perfect!

sucking sounds resume

[–] zbyte64@awful.systems 36 points 1 year ago (2 children)

I love it when terrible people do terrible things to their own wealth. What I don't love is how after a certain number of billions, you have an infinite wealth cheat code to fuck around with.

[–] zbyte64@awful.systems 2 points 1 year ago* (last edited 1 year ago) (1 children)

Why should I need to prove a negative? The burden is on the ones claiming an LLM is sentient. LLMs are token predictors; do I need to present evidence of that?

[–] zbyte64@awful.systems 1 point 1 year ago (3 children)

Humans predict things by assigning meaning to events and things, because in nature we're constantly trying to guess what other creatures are planning. An LLM does not hypothesize about your plans when you communicate with it; it's just trying to predict the next set of tokens with the greatest reward value. Even if you were to use literal human neurons to build your LLM, you would still have a stochastic parrot.

[–] zbyte64@awful.systems 20 points 1 year ago

I mean if you ignore all the papers that point out how dubious the gen AI benchmarks are, then it is very impressive.

[–] zbyte64@awful.systems 3 points 1 year ago

WDYM? The big corporations get a free pass to ignore their climate pledges and do the exact opposite.
