barsoap

joined 1 year ago
[–] barsoap@lemm.ee 4 points 1 month ago* (last edited 1 month ago) (9 children)

As I stated it’s MORE complex today, not less, as the downvoters of my posts seem to refuse to acknowledge.

The reason you're getting downvoted is that you're using "64-bit CPU" to mean something different from what it is universally acknowledged to mean. It means pointer width.

Yes, other numbers are important. Yes, other numbers can be listed in places. No, it's not what people mean when they say "X-bit CPU".

claiming that new CPU architectures haven’t increased in bit width for 30 years is simply naive and false, because they have in many more significant ways than the base instruction set.

RV128 exists. It refers to pointer width. Crays existed; by your account they were gazillion-bit machines because they had quite chunky vector lengths. Your Ryzen does not have a larger "databus" than a Cray-1, which had 4096-bit (you read that right) vector registers. They were never called 4096-bit machines; the Cray-1 has a 64-bit architecture because that's the pointer width.

Yes, the terminology differs when it comes to 8- vs. 16-bit microcontrollers. But just because the data bus is that important there (and 8-bit pointers don't make any practical sense) doesn't mean that anyone is calling a Cray a 4096-bit architecture. You might call them 4096-bit vector machines, and you're free to call anything with AVX2 a 256-bit SIMD machine (though you might actually be looking at 2x 128-bit ALUs), but neither label changes that they're 64-bit architectures. Why? Because language is meant for communication and you don't get to have your own private definition of terms: unless otherwise specified, the number stated is the number of bits in a pointer.

[–] barsoap@lemm.ee 4 points 1 month ago (11 children)

The Intel 80386DX did NOT have any 80 bit instructions at all, the built in math co-processor came with i486.

You're right, I misremembered.

And in that regard, the Databus is a very significant part, that directly influence the speed and number of clocks of almost everything the CPU does.

For those old processors, yes, that's why the 6502 was 8-bit. For modern processors, though? You don't even see it listed on spec sheets. Instead, for the external stuff, you see the number of memory controllers and PCIe lanes, while everything internal gets mushed up into IPC. "It's wide enough not to stall the pipeline, what more do you want" kind of attitude.

Go look at anything post-2000: 64-bit means that pointers take up 64 bits, 32-bit means that pointers take up 32 bits. 8-bit and 16-bit are completely relegated to microcontrollers, I think keeping the data bus terminology, and soonish they're going to be gone because everything at that scale will be RISC-V, where "RV32I" means... pointers. So do "RV64I" and "RV128I". RV16E was proposed as an April Fools' joke and it's not completely out of the question that it'll happen. In any case there won't be an RV8, because CPUs with an 8-bit address bus are pointlessly small, and "the number refers to pointer width" is the terminology RISC-V uses. An RV16 CPU might have a 16-bit data bus, it might have an 8-bit data bus, heck, it might have a 256-bit data bus because it's actually a DSP and has vector instructions. Sounds like a rare beast but not entirely nonsensical.

[–] barsoap@lemm.ee 3 points 1 month ago (1 children)

There are no non-reference designs of Radeon PROs, I think. Instincts, even less so. If the ranges bleed into each other they might actually sell reference designs down into the gamer mid-range, but I admit that I'm hand-waving. But if, as a very enthusiastic enthusiast, you're buying something above the intended high-end gaming point and well into the pro region, it's probably going to be a reference design.

And as a side note: finally they're selling CPUs boxed but without a fan.

[–] barsoap@lemm.ee -5 points 1 month ago (4 children)

I wonder, what is easier: Convincing data centre operators to not worry about the power draw and airflow impact of those LEDs on the fans, or convincing gamers that LEDs don't make things faster?

Maybe a bold strategy is in order: Buy cooling assemblies exclusively from Noctua, and exclusively in beige/brown.

[–] barsoap@lemm.ee 11 points 1 month ago (3 children)

Kodak isn't dead, they're just not dominating the imaging industry any more. They even multiplied: there's now Kodak Alaris in addition to the original Kodak.

Between them they still dominate analogue film, which still has its uses, and it could even be said that if they hadn't tried to get into digital they might've averted bankruptcy.

There are also horse breeders around which survived the invention of the automobile, and probably also a couple that didn't because their investments in car manufacturing didn't pan out. Sometimes it's best to stick to what you know while accepting that the market will shrink. Last year they raised prices for ordinary photography film because they can't keep up with demand; their left-over factories are running 24/7.

[–] barsoap@lemm.ee 38 points 1 month ago (24 children)

And modern X86 chips are in fact NOT 64 bit anymore, but hybrids that handle tasks with 256 bits routinely, and some even with 512 bits, with instruction extensions that have become standard on both Intel and AMD

On a note of technical correctness: That's not what the bitwidth of a CPU is about.

By your account a 386DX would be an 80-bit CPU because it could handle 80-bit floats natively, and the MOS6502 (of C64 fame) a 16-bit processor because it could add two 16-bit integers. Or maybe 32 bits because it could multiply two 16-bit numbers into a 32-bit result?

In reality the MOS6502 is considered an 8-bit CPU, and the 386 a 32-bit one. The "why" gets more complicated, though: the 6502 had a 16-bit address bus and an 8-bit data bus, the 386DX a 32-bit address and data bus, and the 386SX a 32-bit address bus and a 16-bit external data bus.
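How an 8-bit CPU like the 6502 handles 16-bit addition, sketched in C rather than 6502 assembly (the helper is mine, modelled on the chained add-with-carry idiom, not anyone's actual code): the ALU only ever adds 8 bits at a time, low byte first, threading the carry into the high-byte add.

```c
#include <stdint.h>

/* Add two 16-bit values using only 8-bit operations plus a carry flag,
 * the way a 6502 chains ADC: CLC; ADC low bytes; ADC high bytes. */
static uint16_t add16_on_8bit_alu(uint16_t a, uint16_t b) {
    uint8_t lo    = (uint8_t)(a & 0xFF) + (uint8_t)(b & 0xFF);
    uint8_t carry = lo < (uint8_t)(a & 0xFF);           /* carry out of bit 7 */
    uint8_t hi    = (uint8_t)(a >> 8) + (uint8_t)(b >> 8) + carry;
    return (uint16_t)((uint16_t)hi << 8 | lo);
}
```

So "can add two 16-bit integers" never made anyone call the 6502 a 16-bit CPU; it just takes two trips through the 8-bit ALU.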

Or, differently put: somewhere around the time of the fall of the 8-bit home computer, the common understanding of "x-bit CPU" switched from data bus width to address bus width.

...as, not to make this too easy, understood by the instruction set, not the CPU itself: modern 64-bit processors use pointers which are 64 bits wide, but their address buses usually are narrower. x86_64 only requires 48 bits to be actually usable; the left-over bits are required to be either all ones or all zeroes (enforced by hardware to keep people from bit-hacking and causing forwards-compatibility issues; IIRC the 1/0 distinguishes between user and kernel memory mappings, it's been a while since I read the architecture manual). Addressable physical memory might be even lower, again IIRC. 2^48 B is 256 TiB; no desktop system can fit that much, and I doubt the processors in there could address it.
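The "all ones or all zeroes" rule is what x86_64 calls canonical addresses; with 48 usable bits, bits 63..47 must all equal bit 47, i.e. the pointer is sign-extended. A sketch of the check (the helper name is mine; on real hardware a non-canonical access simply faults):

```c
#include <stdbool.h>
#include <stdint.h>

/* x86_64, 48-bit virtual addresses: bits 63..47 (17 bits) must be
 * all zeroes (user half) or all ones (kernel half) to be canonical. */
static bool is_canonical(uint64_t addr) {
    uint64_t top = addr >> 47;
    return top == 0 || top == 0x1FFFF;
}
```

For example `0x00007FFFFFFFFFFF` (top of the user half) and `0xFFFF800000000000` (bottom of the kernel half) pass, while anything in the gap between them is non-canonical.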

[–] barsoap@lemm.ee 32 points 1 month ago (6 children)

I expect them to merge enthusiast into the pro segment: it doesn't make sense for them to have large RDNA cards because there are too few customers, just as it doesn't make sense for them to make small CDNA cards, but in the future there's only going to be UDNA, and the high end of gaming and the low end of professional will overlap.

I very much doubt they're going to do compute-only cards as then you're losing sales to people wanting a (maybe overly beefy) CAD or Blender or whatever workstation, just to save on some DP connectors. Segmenting the market only makes sense when you're a (quasi-) monopolist and want to abuse that situation, that is, if you're nvidia.

[–] barsoap@lemm.ee 4 points 1 month ago* (last edited 1 month ago) (1 children)

I've had phones newer than the first RAZR that didn't have bluetooth, but it's been a while, and bluetooth is incredibly cheap -- in fact it probably comes for free with the GSM module: it just needs the right software, the hardware is already capable of doing it. Separate bluetooth modules (ESP32) cost, what, 1.30? An antenna, maybe ten cents.

[–] barsoap@lemm.ee 3 points 1 month ago* (last edited 1 month ago)

Probably. Proxima Fusion is using simulation-driven engineering to pave their way through the design space; no matter how you approach it, that's gotta involve dimension reduction in some way, and that's ML. They speak of AI, but well, it's a press piece.

LLMs or diffusion models? Nah, don't think so. This is actual engineers throwing statistics at a particular problem to identify what prototypes they should build, not techbros throwing shit at the wall.

[–] barsoap@lemm.ee 1 points 1 month ago

That's been the history of tokamaks, because they're dealing with an inherently unstable situation. It's like balancing a ball on another ball, saying "yep, ok, I've figured that out", scaling it up, and discovering that between those two balls there were actually five others that, now that the system is bigger, have quite a relevant impact.

Contrast that with stellarators, which are more like balancing a ball in a bowl. They were long considered impossible because the magnetic field geometry is just too complex, but the Max Planck Institute proved that they work as the theory says, and they're currently working on commercialisation.

[–] barsoap@lemm.ee 6 points 1 month ago* (last edited 1 month ago)

Maybe Tom Scott should make a video about the Asse salt mine. It's where the "yellow barrel == nuclear waste" meme comes from; look, here's a picture.

This stuff is the driving factor behind nuclear energy being a political no-go in Germany: We just don't trust anyone, including ourselves, to do it properly. Sufficiently failure-proof humans have yet to be invented. Then, aside from that: Fission is expensive AF, and that's before considering that they don't have to pay for their own insurance because no insurance company would take on the contract.

Fusion OTOH has progressed to a point where it's actually around the corner: when the Max Planck Institute is spinning out a company to commercialise it, you know it's the real deal. And they did.

[–] barsoap@lemm.ee 12 points 1 month ago

Allowing limited liability companies to exist without requiring them to be covered by liability insurance is institutionalised market failure.
