on their conscience
🤣
Thanks for the laugh, I needed that. 🙏
In that case, your second and subsequent points should have no text, since the source material has no text for them. And the last point arguably can't have text at all either way.
More seriously, the source material has both text and images, and it was your choice to represent only half of that. You could have easily written:
meme explanation
Note: Descriptive information is in italics.
text | image |
---|---|
understanding a meme with text | *Small brain* |
understanding a meme without text | *Normal brain* |
understanding a meme without text | *Nor image* |
understanding a meme without meme | ∅ |
Or:
This meme takes the classic "expanding brain" meme and removes increasingly more content with each panel, implicitly prompting the reader to interpolate more information at each step, to practically illustrate the concept of the meme itself. The last panel has nothing at all.
Technically, "without text nor image". Your list is implying otherwise.
Alright, well, I haven't got any more time to spend on hacking the prompts to get it to disclose intel. So that's the best I got.
So fwiw, it has ChatGPT-3
Umbrella corp. That has to be satire. Right? Right??
If they are shocked by this, wait until they understand what Meta does with their PII and behavioural information... I'm sure they will understand any day now. Any day...
Oh, no, we totally can. We just need to buy local. What? It is "too expensive!!"? Yeah, well, tell me about being part of the problem...
Have fun with the initramfs.
> The ARM architecture does apparently (I'm no expert) have some inherent power-efficiency advantages over x86
Well, the R in ARM stands for RISC, and x86 (so, by extension, x86_64) is a CISC architecture, so they are not even in the same "family" of designs.
Originally, CISC architectures were more popular because they meant fewer instructions to write, read, store, etc., which was beneficial when hardware was limited and developers wrote assembly directly.
Over time, the need for assembly programming faded, and in the 90s the CISC vs RISC debate resurfaced. Most developers by then wrote code in C and C++, so the underlying architecture was losing relevance. It is also worth noting that because each operation is expressed as a larger number of simpler instructions, RISC machine code is more granular, and as a result it can inherently be optimised further. It also means the processor design is simpler than for CISC architectures, which in turn leaves more room for innovation.
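To make the granularity point concrete, here is a tiny sketch of my own (the function name is made up, and the assembly in the comments is only roughly what gcc/clang emit at -O2; exact output varies): the same C one-liner lowered for a CISC target versus a RISC target.

```c
/* Illustrative sketch only: one read-modify-write in C, and roughly how a
 * CISC (x86-64) vs a RISC (AArch64) compiler lowers it. */
void bump(int *p)
{
    *p += 1;
    /* x86-64 (CISC): a single instruction reads memory, adds and writes back:
     *     addl $1, (%rdi)
     *     ret
     *
     * AArch64 (RISC): load, add and store are separate, more granular steps:
     *     ldr w8, [x0]
     *     add w8, w8, #1
     *     str w8, [x0]
     *     ret
     */
}
```

The CISC version packs the whole read-modify-write into one instruction, while the RISC version spells out each step, which is exactly the finer-grained material that compilers (and the rare assembly programmer) get to reorder and optimise.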
> So, all else being equal, you'd expect Qualcomm to have an advantage in laptops with this chip, but all else isn't equal because the software isn't there yet, and no one in the PC market is quite in a position to kickstart the software development like Apple is with Macs.
Now, a key consideration here is that the x86 architecture has dominated the personal computer market for close to half a century at this point, meaning that a lot of the hardware and software accommodates it specifically (wrt functionality, optimisation, etc).
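As a hand-wavy illustration of what that accommodation looks like in practice (my own sketch, not from anyone in this thread; the function name and the particular intrinsics are just examples), a lot of performance-sensitive code ships an x86-specific fast path that does nothing for ARM until someone writes the NEON equivalent:

```c
/* Sketch: the same vector add with an x86 SSE fast path, an ARM NEON fast
 * path, and a portable scalar fallback. Historically, many projects only
 * ever wrote the x86 branch. */
#include <stddef.h>

#if defined(__SSE__)
  #include <immintrin.h>
#elif defined(__ARM_NEON)
  #include <arm_neon.h>
#endif

void add_arrays(float *dst, const float *a, const float *b, size_t n)
{
    size_t i = 0;
#if defined(__SSE__)
    for (; i + 4 <= n; i += 4) {                  /* x86: 4 floats per SSE op */
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(dst + i, _mm_add_ps(va, vb));
    }
#elif defined(__ARM_NEON)
    for (; i + 4 <= n; i += 4) {                  /* ARM: same idea with NEON */
        float32x4_t va = vld1q_f32(a + i);
        float32x4_t vb = vld1q_f32(b + i);
        vst1q_f32(dst + i, vaddq_f32(va, vb));
    }
#endif
    for (; i < n; i++)                            /* portable scalar fallback */
        dst[i] = a[i] + b[i];
}
```

Until the `__ARM_NEON` branch (or something like it) exists, the ARM build silently drops to the slow scalar loop, which is a big part of why "the software isn't there yet".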
Therefore, RISC architectures find themselves at a disadvantage: the choice of Operating Systems is limited, firmware and drivers are missing, etc. Additionally, switching to RISC means breaking legacy support, or going through emulation/binary translation (like Apple's M series does with Rosetta 2).
However, in our modern ecosystem, the potential gain from switching to a RISC architecture is considerable: storage is cheaper than ever, RAM is cheap and fast, and hardly anyone writes assembly anymore. Plus, those who do might enjoy the higher degree of control the additional granularity affords them, without having to do everything by hand, given the assistance modern IDEs offer. Switching will gradually become a necessity for every vendor.
For now, however, the most popular desktop Operating System worldwide performs poorly on ARM, and has no support for other RISC architectures (such as RISC-V) that I know of.
The challenge here is in breaking a decades-long dominance that originated from a monopoly: if you have paid attention to what Apple has been doing, they used large parts of FreeBSD (alongside the Mach kernel) to build a new Operating System core, Darwin, that initially ran on PowerPC processors, and then built the rest of their Operating System (including the Aqua interface) on top of it. This afforded them the possibility of switching to Intel CPUs in 2005–2006, and then to their own ARM-based M series CPUs in 2020.
The quality of their software (in large part derived from the quality of free software and from staggering design work) has allowed them to grow from a virtually negligible share of computer users to second place behind Windows.
Now, other Operating Systems (such as Linux) have the same portability characteristics as FreeBSD, and could feasibly lead to a similarly viable commercial OS offering with support for several hardware architectures.
"All" that is needed is a consistent operating system, based on whichever kernel fits, to supplement macOS in the alternative offering to Windows.
Most software would be available, and a lot of firmware would too, since mobile phones nearly exclusively use ARM and most of them run a Linux kernel.
Once we have one (or better, a few) Linux- or BSD-based operating system(s) with commercial support, consistent design, and acceptable UX for "normies", such CPUs will become a very valid offering.
If Diablo 3 and Duke Nukem Forever have taught me anything, it is that the more you wait, the better the game is.
Also, if you want HL3, here is the closest we ever got.
There's a wonderfully complex system of deferred responsibilities making sure that the people who actually caused this can have all the plausible deniability in the world, see themselves as having nothing to do with it, and enjoy a very relaxed life with riches we can only imagine.