this post was submitted on 13 Apr 2024
491 points (96.4% liked)
Technology
And it’s not RAM, it’s UM for an SoC. The way memory is used changed with the introduction of Apple Silicon.
"Unified" only means there's not a discrete block of memory for the CPU and a discrete block for the GPU to use. But it's still RAM; specifically, LPDDR4X (for M1), LPDDR5 (for M2), or LPDDR5X (for M3).
Besides, low-end PCs with integrated graphics have been using unified memory for decades; no one ever said "They don't have RAM, they have UM!"
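The practical difference a single pool makes can be shown with a toy sketch. This is a hypothetical accounting model I made up for illustration, not Apple's implementation: a discrete setup has separate RAM and VRAM pools, while a unified pool serves both processors from the same budget.

```python
# Toy model (illustrative only, not Apple's implementation): contrast a
# discrete RAM+VRAM layout with a single unified pool shared by CPU and GPU.

class DiscreteMemory:
    def __init__(self, ram_gb, vram_gb):
        self.ram_free = ram_gb    # CPU-only pool
        self.vram_free = vram_gb  # GPU-only pool

    def alloc_cpu(self, gb):
        if gb > self.ram_free:
            raise MemoryError("out of RAM")
        self.ram_free -= gb

    def alloc_gpu(self, gb):
        if gb > self.vram_free:
            raise MemoryError("out of VRAM")
        self.vram_free -= gb


class UnifiedMemory:
    def __init__(self, total_gb):
        self.free = total_gb  # one pool, shared by CPU and GPU

    def alloc(self, gb):  # same call regardless of which core uses the data
        if gb > self.free:
            raise MemoryError("out of unified memory")
        self.free -= gb


pc = DiscreteMemory(ram_gb=8, vram_gb=8)
pc.alloc_cpu(6)   # 6 GB of assets staged in system RAM...
pc.alloc_gpu(6)   # ...and a second 6 GB copy uploaded to VRAM

mac = UnifiedMemory(total_gb=8)
mac.alloc(6)      # one shared allocation: no duplicate copy, but
                  # CPU and GPU now compete for the remaining 2 GB
print(pc.ram_free, pc.vram_free, mac.free)  # → 2 2 2
```

The sketch cuts both ways: unified memory avoids the duplicate copy, but an 8 GB unified pool is also the *only* memory the GPU can use.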
Yes, that’s true, but it’s still an indicator of an uninformed reporter.
Apple Silicon chips pass data from one dedicated core directly to another without needing to pass through memory, hence the smaller processor cache. There are between 18 and 58 cores in the M3 (model dependent). The architecture works very differently from the conventional CPU/GPU/RAM model.
I can run FCP and Logic Pro and have memory to spare with 16GB of UM. The only thing that pushes me into swap is Chrome. lol
It's a pointless distinction.
And in this case, it makes 8gig look even worse.
Maybe you’re not familiar with the apps I’m referring to. Final Cut Pro and Logic Pro are professional video and audio workstations.
If I tried to master an export from Adobe Premiere Pro in Pro Tools on a PC, I’d need 32GB of RAM to prevent stutter. I only use ~12GB of 16GB doing the same on Apple Silicon.
8GB of UM is not for someone running two pro apps at once. It’s for grandma to use for online banking and check her email and Facebook.
My dude, you're literally in here arguing that because Apple has a blob for both CPU memory and GPU memory that somehow makes that blob "not RAM." Apple's design might give fantastic performance, but that's irrelevant to the fact that the memory on the chip is RAM of known and established standards.
Read my other replies to this comment. There’s no GPU. It’s an SoC.
The BCM2835 is an SoC too. So is the RK3328. And the Mali-450 is a GPU.
https://www.apple.com/newsroom/2023/10/apple-unveils-m3-m3-pro-and-m3-max-the-most-advanced-chips-for-a-personal-computer/
Each power-intensive process is given its own dedicated core. The OS is designed specifically to send dedicated processes to the associated core. For example, your CPU isn’t bogged down decrypting data while loading an application.
You can’t compare it to anything else out at this time. Just learn about it, or don’t. Guessing is just a waste of time.
https://docs.kernel.org/scheduler/sched-capacity.html
Basic priority-based scheduling.
Sent to one of two processors on a PC, or 18-52 dedicated cores in an M chip.
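The linked kernel doc describes capacity-aware scheduling, where the scheduler places a task on a core whose capacity fits the task's demand. Here is a minimal sketch in that spirit; the core names and capacity numbers below are made up for illustration, not real M-series or Linux figures.

```python
# Toy capacity-aware placement, loosely modeled on the linked kernel doc.
# Core names and capacities are illustrative, not real hardware specs.

CORES = {
    "efficiency-0": 430,    # smaller cores: lower capacity, lower power
    "efficiency-1": 430,
    "performance-0": 1024,  # bigger cores: full capacity
    "performance-1": 1024,
}

def pick_core(task_utilization):
    """Place a task on the smallest core whose capacity fits its demand."""
    fitting = [(cap, name) for name, cap in CORES.items()
               if cap >= task_utilization]
    if fitting:
        return min(fitting)[1]  # smallest sufficient core saves power
    # Nothing fits: fall back to the biggest core available.
    return max((cap, name) for name, cap in CORES.items())[1]

print(pick_core(200))   # light task lands on an efficiency core
print(pick_core(800))   # heavy task needs a performance core
```

This is the same heterogeneous-core idea either way; whether the scheduler is Linux's or macOS's, the placement logic is priority- and capacity-driven, not something unique to one vendor.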
Great topic switch. Also, what century do you live in?
The topic is whether 8GB of UM on an Apple Silicon Mac is acceptable for a base model.
I’ve explained how the UM is used strictly as a storage liaison due to the processor having a multitude of dedicated cores, with the ability to pass data directly without utilizing UM.
I don’t know what you want from me, but maybe you should just do your own homework instead of being combative with people who understand something better than you.
I really doubt they run apps with cache turned into scratchpad memory.
Like has been done on laptops with on-board video cards since, well, forever?
It’s different. The GPU is broken into several parts and integrated into the SoC along with the CPU’s dedicated processes. Data is passed within the SoC without entering UM. It’s exclusively used as a storage liaison.
You should check out the Apple Silicon M series. Specs don’t translate to performance the way they do in conventional PC architecture. I guarantee you’ll see PC manufacturers going to 2nm SoC configurations soon enough. The performance is undeniable.
Soooo Integrated Graphics?
Negative.
https://www.apple.com/newsroom/2023/10/apple-unveils-m3-m3-pro-and-m3-max-the-most-advanced-chips-for-a-personal-computer/
So it's not on same chip with CPU?
A CPU performs integer math.
A GPU performs floating-point math.
Those are only two of the 18-52 cores (model dependent) of Apple M chips. The OS is designed around this for maximum efficiency. Most Macs don’t even have a fan anymore.
There. Is. No. Comparison. In. PC.
A GPU performs integer math.
A CPU performs floating point math.
All four statements are true.
That’s correct. My mistake.
Dude, it’s just GDDR#, the same stuff consoles use. PCs have had this ability for over a decade, mate. Apple is just good at marketing.
What's next? When VRAM overflows it gets dumped into regular RAM? Oh wait, PCs can do that too...
With independent CPU and GPU, sure. There’s no SoC that performs anywhere near Apple Silicon.
According to benchmarks, the 8700G vs. the M3 is on average 22% slower single-core and 31% faster multi-core; its FP32 throughput is 41% higher than the M3's, and its AI performance is 54% slower. The 8700G also uses 54% more energy.
What about those stats says AMD can't compete? The 8700G is an APU, just as the M3 is.
I’m talking about practical-use performance. I understand your world; you don’t understand mine. I’ve been taking apart and upgrading PCs since the 286. I understand benchmarks. What you don’t understand is how macOS uses the SoC in a way where benchmarks ≠ real-world performance. I’ve used pro apps on powerful PCs and powerful Macs, and I’m speaking from experience. We can agree to disagree.
I grew up with a Tandy 1000 and was always getting yelled at for taking it apart, along with just about every PC we owned after that too.
Benchmarks are indicative of real-world performance for the most part. If they were useless we wouldn't use them, kinda like UserBenchmark.
The one benefit apple does have is owning its own ecosystem where they can modify the silicon/OS/Software to work with each other better.
That does not mean the M3 is the best there is and can't be touched; that's just misleading.
The 8700G is gonna stomp the M3 using Maxon's software suite, just as the M3 will stomp the 8700G using Apple's software suite.
Then on top of that, the process node used to manufacture the silicon is different (3nm vs 4nm); that alone allows for a 20% (give or take) performance difference, just like every process node change in the past decade or so.
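That node point can be checked with rough arithmetic. Taking the quoted 31% multi-core lead and the claimed ~20% node uplift at face value (both are the figures from this thread, not measured data), a crude "same node" comparison looks like this:

```python
# Back-of-the-envelope only: benchmark deltas are the ones quoted above,
# and the ~20% node uplift is the rough figure claimed above, not a
# measured scaling factor.

m3_multicore = 1.00     # normalize the M3's multi-core score to 1.0
amd_multicore = 1.31    # 8700G quoted as 31% faster multi-core
node_uplift = 1.20      # claimed advantage of 3nm over 4nm

# If the 8700G were built on the same 3nm node as the M3, scaling its
# score by the node uplift gives a crude same-node estimate:
amd_same_node = amd_multicore * node_uplift
print(round(amd_same_node, 2))  # → 1.57, ~57% faster in this crude model
```

The model is obviously simplistic (node shrinks don't scale every workload uniformly), but it shows why comparing chips across process nodes flatters the one on the newer node.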
I'll take the loss on the experience part, as the only Apple product I own is an Apple TV 4K, but there are many nuances you've obviously glossed over.
Is the M3 a good piece of silicon? Yes.
Is it the best at EVERYTHING? Of course not.
Should Apple give up because they're not the best? Fuck no.
Man, you’re kinda off the point. This is about how much UM is appropriate for a base model. I’m simply saying the architecture of an SoC utilizes UM as a storage liaison exclusively, since CPU and GPU are cores of the same chip. It simply does not mean the same thing as 8GB of RAM in standard architecture. As a pro app user, 16GB is enough. 8GB is plenty for grandma to check her Facebook and online banking.
Am I really missing the point? UM is not a new concept. Specifically, look at the PS5/Xbox Series X.
https://www.pcgamer.com/this-amd-mini-pc-kit-is-likely-made-out-of-b0rked-ps5-chips/
Notice the soldered RAM and lack of video card? Kinda like what the M series does.
And when all is said and done, 8GB is not nearly enough, and Apple should be chastised for it, just like Nvidia when they first decided to make five different variations of the 1060, ensuring four of those variations would become e-waste in a few short years, and again with the 3050 6GB vs the 3050 8GB.
They both have independent CPU and GPU. UM is not used to pass data from CPU to GPU on an SoC system; it's exclusively a storage liaison. Therefore it's used far less than in non-SoC applications.
The CPU and GPU are one chip. Learn about Apple Silicon SoC rather than trying to find a comparison. You won’t find one anywhere yet.