Mac|Life

Unified memory

There’s a cost, but unified memory makes Apple silicon Macs more efficient

BETWEEN 1997 AND 2002, Apple used the advertising slogan “think different”. Fast forward to today and that long-running campaign is a distant memory, yet the sentiment behind it is very much alive. You can see the difference when looking at how Apple has set modern Macs apart from “normal” computers. Rather than having a separate central processing unit (CPU), graphics processing unit (GPU) and Random Access Memory (RAM), Apple silicon Macs make use of a System-on-a-Chip (SoC). And that move has made them very powerful indeed.

One of the big decisions rolled out in the creation of the cutting-edge proprietary M1 chip in 2020 was the introduction of unified memory. As well as integrating components such as the CPU, GPU, I/O controllers and Apple’s Neural Engine (the specialized cores used for the acceleration of AI and machine learning), the M1 chip is directly connected to memory via a layer of silicon. This speeds up the transfer of data and makes processing more efficient. But there’s a reason for the word “unified”.

To explain, RAM is temporary storage for files, giving software somewhere to place and access data so that it’s more readily available for processing. RAM is volatile in the sense that it requires power: turn a computer off and the RAM’s contents are lost. Yet by making use of RAM, computers run more quickly, because RAM is far faster than a storage drive. But the way computers traditionally allocate memory differs from the approach taken by Apple silicon (which is now on its third iteration).

NORMAL V UNIFIED

Normal computers have a separate CPU and GPU, with memory also located separately. The two processors will then dip into different pools of memory. The CPU will access RAM or DRAM (Dynamic RAM, which requires regular refreshing but is fast and reliable despite being more power-hungry) while the GPU will access VRAM (Video RAM), which the CPU can also use.

Since these two processors need to work together, however, there’s a lot of shuttling backwards and forwards over buses. A bus is a connection that allows data to move between a processor and memory. Imagine the CPU working on data held in RAM and sending some of it to VRAM for the GPU to pick up, process and send back to the CPU to be processed again and stored in RAM, and you can see how things get a little messy. Having the different processors work on the same data and pass it from one pool to another only serves to slow things down, as the sketch below illustrates.
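To make that shuttling concrete, here’s a minimal sketch in Swift using Apple’s Metal framework, with purely illustrative buffer names and sizes. It stages data in a CPU-visible buffer and then explicitly copies it into GPU-private memory, the kind of extra hop a split-memory design forces on software.

```swift
import Metal

// A sketch of the traditional, copy-heavy path: data starts in CPU-visible
// memory and must be copied into GPU-private memory before the GPU can use it.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("No Metal device available")
}

var input: [Float] = Array(repeating: 1.0, count: 1_024)
let length = input.count * MemoryLayout<Float>.stride

// A CPU-visible staging buffer and a GPU-only destination buffer.
let staging = device.makeBuffer(bytes: &input, length: length,
                                options: .storageModeShared)!
let gpuOnly = device.makeBuffer(length: length,
                                options: .storageModePrivate)!

// The explicit copy: the kind of back-and-forth unified memory removes.
let commands = queue.makeCommandBuffer()!
let blit = commands.makeBlitCommandEncoder()!
blit.copy(from: staging, sourceOffset: 0,
          to: gpuOnly, destinationOffset: 0, size: length)
blit.endEncoding()
commands.commit()
commands.waitUntilCompleted()
```

Every copy like this costs time, bandwidth and power, and real apps make such transfers constantly for textures, geometry and video frames.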

Unified memory simplifies matters by handling CPU and GPU memory duties centrally. In other words, it brings high-bandwidth, low-latency memory into a single pool that the GPU, CPU and Neural Engine can all access. Since the processors are working from the same data, there’s no need to constantly copy information from one pool of memory to another, so less data gets moved around and computing gets faster. There’s far less reliance on buses and on interrupts, the latter being when a CPU is asked to divert its full attention to a particular task at the expense of others.
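For contrast, here’s an equally minimal sketch of the unified approach, again in Swift and Metal with illustrative names: one buffer allocated in shared storage is a single allocation that the CPU and GPU both address directly, so the staging copy disappears.

```swift
import Metal

// A sketch of the unified approach: one allocation that the CPU and GPU
// both address directly, with no copying between separate memory pools.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

var samples: [Float] = [0.5, 1.5, 2.5, 3.5]
let length = samples.count * MemoryLayout<Float>.stride

// A single buffer in shared storage, visible to CPU and GPU alike.
let shared = device.makeBuffer(bytes: &samples, length: length,
                               options: .storageModeShared)!

// The CPU reads and writes the very bytes a GPU shader would see, in place,
// rather than posting copies back and forth between RAM and VRAM.
let pointer = shared.contents().bindMemory(to: Float.self,
                                           capacity: samples.count)
pointer[0] = 42.0
```

On Apple silicon, Metal’s shared storage mode maps onto the single unified pool described above, which is why the copy step simply isn’t needed.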

MORE DYNAMIC

Since the unified DRAM sits right next to the SoC, data doesn’t have far to travel, which also makes for speedier computing with fewer bottlenecks. The SoC also draws less power and generates less heat, meaning longer-lasting performance.

There’s also the little matter of Dynamic Caching and what that brings to the party. Introduced with the M3 range of chips, Dynamic Caching significantly improves the GPU’s performance by allocating only the precise amount of local memory a particular task needs, in real time. It makes more efficient use of the available resources and ensures memory isn’t wasted.

In fact, M-series chips are said to be so memory-efficient that Apple reckons 8GB of unified memory is equivalent to 16GB of regular RAM.

Is that true? Well, Apple’s motivation in suggesting that “8GB on an M3 MacBook Pro is probably analogous to 16GB on other systems”, as the company’s vice president of worldwide product marketing Bob Borchers put it, comes in response to criticism that its Macs contain too little RAM. Apple also points to its use of memory compression when claiming users can’t directly compare the memory in a Mac to that in a PC.

In general, these assertions have weight. Apple silicon Macs perform extremely well and 8GB, if that’s the amount of memory you choose, is more than sufficient for many tasks. But if you’re putting a Mac under intense pressure, perhaps by opening loads of web browser tabs or editing video, then the more memory to hand, the better. It goes to show that, when it comes to memory usage, two plus two can sometimes equal five. But the takeaway here is that unified memory does help Macs to perform better. DAVID CROOKES

UNIFIED MEMORY BRINGS HIGH-BANDWIDTH, LOW-LATENCY MEMORY

The unified memory architecture of the M3 Max chip supports up to 128GB.

The more demands you make of your Mac, the more unified memory you should buy from the outset!
