drosophila

joined 5 months ago
[–] drosophila@lemmy.blahaj.zone 31 points 2 days ago

Well, Fromsoft had a good run.

Maybe we'll get one more game out of them.

I asked them to support JPEGXL by default.

Redfall being a prime example. We kept hearing how Microsoft was happy to leave those studios to it, to give them the time and resources they needed, and they still released dog shit.

Yeah, the studio that developed Prey (a dumbass name that ZeniMax forced them to use) went on to develop Redfall after Microsoft bought them.

Clearly they were a bunch of idiots before the acquisition who had no idea what they were doing, and the only problem afterward was that Microsoft didn't boss them around enough.

[–] drosophila@lemmy.blahaj.zone 5 points 2 weeks ago* (last edited 2 weeks ago)

Solar panels aren't worth it for a normal EV, but supposedly the Aptera is so small, lightweight, and aerodynamic (with that teardrop shape) that they actually add a significant amount of range.
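
(Back-of-envelope with assumed numbers, not official specs: something like 700 W of panels over roughly 5 decent sun-hours is about 3.5 kWh a day, and at around 100 Wh per mile that works out to a few dozen miles of free range per day.)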

[–] drosophila@lemmy.blahaj.zone 9 points 3 weeks ago* (last edited 3 weeks ago)

It's intentionally stupid, which is why it's not a permanent change.

They just want people to talk about it, send pictures of it to their friends, etc., making it an avenue for reminding people that Goldfish crackers exist.

[–] drosophila@lemmy.blahaj.zone 7 points 1 month ago (2 children)

They've done that periodically for years.

I don't dual boot anymore but when I did I kept each installation on a separate hard drive for that reason.

[–] drosophila@lemmy.blahaj.zone 20 points 1 month ago (2 children)

https://xkcd.com/963/

Fortunately I haven't had to open it in a very long time.

[–] drosophila@lemmy.blahaj.zone 4 points 2 months ago (1 children)

I think you may have misread their comment.

[–] drosophila@lemmy.blahaj.zone 5 points 2 months ago* (last edited 2 months ago) (2 children)

Some ARM CPUs that are advertised as microcontrollers have 32-bit address spaces and roughly the same power as an i486.

[–] drosophila@lemmy.blahaj.zone 5 points 2 months ago* (last edited 2 months ago)

> This model isn’t “learning” anything in any way that is even remotely like how humans learn. You are deliberately simplifying the complexity of the human brain to make that comparison.

I do think the complexity of artificial neural networks is overstated. A real neuron is a lot more complex than an artificial one, and real neurons are not simply feed-forward like ANNs (which have to be feed-forward because they're trained using back-propagation). Instead they have their own spontaneous activity, which kinda implies that real neural networks don't learn via stochastic gradient descent with back-propagation. But to say that there's nothing at all comparable between the way humans learn and the way ANNs learn is wrong, IMO.
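
(To make the "feed-forward plus back-propagation" part concrete, here's a minimal toy sketch in numpy; the XOR task, layer sizes, and learning rate are arbitrary choices for illustration, not anything specific to real brains or to any particular model.)

```python
# Minimal feed-forward network trained with back-propagation + gradient descent.
# Illustrative sketch only: two inputs -> one hidden layer -> one output, learning XOR.
# Everything here (sizes, learning rate, iteration count) is an arbitrary choice.
import numpy as np

rng = np.random.default_rng(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    # Forward pass: activity flows strictly input -> hidden -> output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: the error gradient flows the opposite way along the same
    # connections, which is only well-defined because the forward pass has no
    # loops or spontaneous activity.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates (full batch here for simplicity).
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # should end up close to [[0], [1], [1], [0]]
```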

If you read books such as V.S. Ramachandran and Sandra Blakeslee's Phantoms in the Brain or Oliver Sacks' The Man Who Mistook His Wife for a Hat, you will see lots of descriptions of patients with anosognosia brought on by brain injury. These are people who, for example, are unable to see but also incapable of recognizing this inability. If you ask them to describe what they see in front of them, they will make something up on the spot (in a process called confabulation) and not realize they've done it. They'll tell you what they've made up while believing that they're telling the truth. (Vision is just one example; anosognosia can manifest in many different cognitive domains.)

It is V.S. Ramachandran's belief that there are two processes at work in the brain: a confabulator (a "yes-man," so to speak) and an anomaly detector (a "critic"). The yes-man's job is to offer up explanations for sensory input that fit within the existing mental model of the world, whereas the critic's job is to advocate for changing the world-model to fit the sensory input. In patients with anosognosia something has gone wrong in the connection between the critic and the yes-man in a particular cognitive domain, and as a result the yes-man is the only one doing any work. Even in a healthy brain you can see the effects of the interplay between these two processes, such as in the placebo effect and in hallucinations brought on by sensory deprivation.

I think ANNs in general and LLMs in particular are similar to the yes-man process, but lack a critic to go along with it.

What implications does that have for copyright law? I don't know. Real neurons in a petri dish have already been trained to play games like DOOM and control the yoke of a simulated airplane. If they were instead trained to somehow draw pictures, what would the legal implications of that be?

There's a belief that laws and political systems are derived from some sort of deep philosophical insight, but I think most of the time they're really just whatever works in practice. So, what I'm trying to say is that we can just agree that what OpenAI does is bad and should be illegal without having to come up with a moral imperative that forces us to ban it.

[–] drosophila@lemmy.blahaj.zone 11 points 2 months ago

Make the page 15x more bloated with JavaScript popups and it'll be "modern".
