Maximum PC

INTEL & AMD IN PERFECT HARMONY

What the big tech marriage means to you

AN AMD GRAPHICS CHIP inside an Intel processor? Not in a million years. But that is precisely what Intel recently announced: a whole new product line that combines an Intel CPU with an AMD GPU in a single processor package.

More specifically, it contains a quad-core CPU and a high-performance AMD GPU. The new chip is intended to enable a new generation of thin and light laptops with serious performance: enough to play AAA games, deliver a quality VR experience, and support pro-level content creation.

It may seem like an unholy alliance between confirmed enemies, but there are major upsides for both Intel and AMD, and the new chip also represents the logical next step for the PC as a platform in both mobile and desktop forms. However, being the logical next step does not necessarily mean the eventual outcome will be predictable. Intel and AMD’s alliance will almost certainly revolutionize both the physical architecture of the PC and the key relationships that have defined the industry for a generation. If the impact the new technology will have on the PC as a device is straightforward to grasp, it’s far from clear how the balance of power between the major players will shake out.

For starters, if the fusing of CPU and GPU functions into a single package is the future of not just entry-level systems but high-performance content-creation and gaming PCs, does this mean Intel will be reliant on AMD’s help going forward? Likewise, missing so far from this picture is the final member of the uneasy triumvirate that rules the PC: Nvidia.

At first glance, this development doesn’t look good for Team Green. Nvidia currently lacks an obvious route to do something similar and insert its graphics technology onto a CPU package. In the long run, if this new CPU-GPU approach ends up being the default configuration for performance PCs, it certainly seems like Nvidia won’t have a place at the table. But, as we’ll see, there’s much to play for. This is just the first move.

Are the political machinations that have enabled this new Intel-AMD mega chip to exist more interesting than the technical details? That might just be true, but let’s start by squaring away the details of what Intel announced, and also consider some of the secrets regarding its specifications that have since emerged. The basics involve a single processor package that combines four major elements. The first two are the previously mentioned Intel CPU and AMD GPU. Intel says that the CPU is one of its Kaby Lake-generation Core H-series mobile chips. That indicates a quad-core processor somewhat incongruously equipped with its own integrated graphics functionality.

It’s possible Intel has created a new spin on the Kaby Lake Core H, with the integrated graphics stripped out. However, given this is initially aimed at creating thin and light laptops, there are clear benefits to keeping the Intel integrated graphics. It allows for low-power operating modes and extended battery life when full graphics performance from the AMD GPU isn’t required. The probability, then, is that the CPU part of the package is more or less straight off Intel’s shelf. Hold that thought.

Part two is the AMD Radeon graphics. This is where the bespoke engineering begins. Our understanding is that the AMD GPU is a custom chip created expressly for Intel. In fact, it has to be precisely that, based on further elements we’ll come to shortly. What it isn’t, however, is a radically advanced chip from a 3D rendering perspective. It’s listed in leaked specification sheets as an AMD Vega M GH Graphics GPU, although what that actually means isn’t totally clear at this point.

Specification and benchmark leaks involving some early development systems based on the new CPU-GPU package suggest some further details for the graphics part of the package. Two iterations of the product have been seen, known as the Core i7-8705G and the higher-performing Core i7-8809G. In top spec, the AMD GPU looks to be a 24-compute-unit part. Two variants have been seen in the wild: one with a GPU clock speed of 1GHz, the other 1.2GHz. Do the math on the compute units, based on existing AMD GPUs, and you have 1,536 unified shaders.
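For the curious, here’s where that shader count comes from, along with a rough throughput estimate. This is a back-of-the-envelope sketch assuming AMD’s usual layout of 64 stream processors per compute unit and the leaked 1.2GHz clock; neither figure has been confirmed by Intel.

```python
# Rough math behind the leaked top-spec GPU figures.
# Assumptions: 64 stream processors ("shaders") per compute unit, as on
# AMD's GCN/Vega designs, and 2 FP32 operations per shader per clock
# (one fused multiply-add).

compute_units = 24        # leaked figure for the top-spec part
shaders_per_cu = 64       # standard for AMD's recent architectures
clock_ghz = 1.2           # higher of the two leaked GPU clocks

shaders = compute_units * shaders_per_cu
peak_tflops = shaders * 2 * clock_ghz / 1000

print(f"Unified shaders: {shaders:,}")                     # 1,536
print(f"Peak FP32 throughput: {peak_tflops:.1f} TFLOPS")   # ~3.7 TFLOPS
```

On paper, that puts the part in the same ballpark as midrange desktop graphics cards, which tallies with the gaming estimates below.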

For context, AMD’s Radeon RX 480 has 2,304 shaders, and the most powerful RX 470 cards have 1,792 shaders. Of course, neither of those is close to AMD’s top-performing GPUs, which are based on the newer Vega architecture. Another interesting reference point involves games consoles. Both the Sony PlayStation and Microsoft Xbox now use AMD graphics technology, albeit from a slightly earlier generation of AMD graphics than Polaris. The original Xbox One boasted 768 unified shaders, while the much-improved Xbox One X packs 2,560 shaders. The new Sony PlayStation 4 Pro, meanwhile, rocks 2,304 shaders.

Anyway, factor in the clock speeds of the higher-performing Core i7-8809G (which may be far from final), and the resulting gaming performance of the CPU-GPU package likely falls somewhere between the Nvidia GeForce GTX 1050 Ti and 1060 desktop graphics chips. That’s a solid gaming proposition, to be sure, especially for a mobile platform, but somewhat marginal given that Intel has specifically called out VR gaming as a core competence of the new product. That’s either a stretch by existing metrics of what constitutes VR-capable gaming performance (the GTX 1060 is pretty much the bare minimum), or Intel may have more powerful versions of the new product that have yet to be spotted in the wild.

Of course, however the precise performance eventually turns out, it will almost certainly be unprecedented for a GPU this tightly integrated into a thin and light laptop PC. That’s arguably only possible thanks to the last two major elements of the processor package: dedicated memory for the GPU, and an exotic new interface to connect the two.

MAKING CONNECTIONS

It’s memory bandwidth, of course, that has thus far prevented so-called CPU-GPU fusion chips from serving up serious gaming performance. Previously, all such chips have shared the CPU’s external memory controller with the GPU, and that has meant frankly pitiful memory bandwidth by GPU standards. But Intel’s new beast has a full 4GB of stacked High Bandwidth Memory 2, or HBM2, hooked up to the GPU via a proprietary interconnect known as the Embedded Multi-Die Interconnect Bridge, or EMIB for short, which is the fourth and final major element of the new product.
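To put numbers on that gap, here’s a rough comparison of one HBM2 stack against the dual-channel DDR4 an integrated GPU would normally share with the CPU. It’s a sketch: the 1,024-bit interface per HBM2 stack is standard, but the transfer rates are illustrative assumptions rather than confirmed figures for this product.

```python
# Peak memory bandwidth = bus width (bits) x transfer rate (GT/s) / 8 bits per byte.
def bandwidth_gb_s(bus_width_bits, transfer_rate_gt_s):
    return bus_width_bits * transfer_rate_gt_s / 8

# One HBM2 stack exposes a 1,024-bit interface; 1.6 GT/s per pin is a
# common HBM2 speed grade (an assumption, not a confirmed spec here).
hbm2 = bandwidth_gb_s(1024, 1.6)    # ~204.8 GB/s, dedicated to the GPU

# A typical thin and light laptop: dual-channel (128-bit) DDR4-2400,
# shared between the CPU and any integrated graphics.
ddr4 = bandwidth_gb_s(128, 2.4)     # ~38.4 GB/s, and it's shared

print(f"One HBM2 stack:          {hbm2:.1f} GB/s")
print(f"Dual-channel DDR4-2400:  {ddr4:.1f} GB/s")
```

Even on conservative assumptions, the dedicated HBM2 gives the GPU several times the bandwidth a shared memory controller could offer, which is exactly the bottleneck that has hobbled previous fusion chips.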

The advantage of EMIB is that it integrates the interconnect into the package substrate itself, while offering far higher connection density, and therefore bandwidth, than ever before. Previously, achieving high bandwidth through the package substrate wasn’t possible, so a silicon interposer sitting atop the substrate and underneath the chips had to be used instead. That’s more expensive, it makes for a thicker overall package, and it also has power management implications.

The full specifications of Intel’s HBM2 and EMIB solution haven’t yet been revealed, beyond the use of a single 4GB stack of HBM2 memory, although this is the first time HBM memory has been used in a mobile platform. What’s more, it’s this use of EMIB that dictates a bespoke GPU: support for EMIB has to be built into the GPU itself, a feature that thus far exists only in a handful of high-end AMD GPUs. Whatever the exact figures, the overall solution is clearly designed to deliver memory bandwidth on the order of discrete desktop graphics boards, and to deliver that ground-breaking performance from what could be described as an integrated graphics solution.

All told, and even without considering the politics of the Intel-AMD relationship, it’s fascinating from a technical perspective. The big question is whether it represents a glimpse of the future. Is this what not only laptops but also desktop PCs will look like before long? That’s been the assumption ever since Intel first integrated graphics into its CPU packages, but thus far, such fusion processors, or APUs, have been a low-cost solution. This is the first time that anyone has taken a genuine tilt at creating a single-package product with bona fide gaming and content-creation capabilities.

While we wait to discover just how fast it will be, and just how thin and light the laptops it enables turn out to be, the question of what it all means for the industry is plenty to keep us going. What would motivate AMD and Intel to get into bed like this? The answer is different for each company.

For Intel, using an AMD GPU is part of a two-stage plan, and a shortcut on its journey toward an in-house integrated solution. While the news did come as a shock, in hindsight, Intel’s integrated-graphics strategy has hinted at a move like this for a while. When Intel originally put graphics cores into its CPUs, bold claims were made about improving 3D rendering performance with each successive iteration. For a while, Intel more or less delivered on that promise. More recently, progress has slowed, and it became clear that Intel’s in-house graphics architecture had lost momentum. What’s more, while Intel’s integrated graphics performance improved, so did the performance of discrete 3D cards. Likewise, PC games only become more graphically demanding over time. The upshot is that Intel’s integrated graphics are no closer today to being truly gaming-capable than they ever were. Something had to change.

RIVAL REVELS

In the long run, the plan for Intel is to totally reboot its own graphics tech by creating its own competitive high-performance 3D architecture (see boxout, left). In theory, having competitive in-house graphics means Intel will eventually be able to create single-package solutions based exclusively on its own technology. In the meantime, however, Intel simply didn’t have suitable graphics technology to make a high-performance product possible. That Intel nevertheless felt compelled to press on, and was willing to use technology supplied by its arch-rival AMD, very likely points to the other major motivating factor: Intel wants to hurt, and just maybe kill, Nvidia.

Currently, Nvidia absolutely dominates the market for consumer discrete graphics chips. It sells around three quarters of such products bought by consumers, with AMD picking up the rest. That’s a high-value market in its own right. It’s also a market that’s growing while the rest of the PC market continues its gentle decline. Jon Peddie Research reckons the market for gaming PCs hit $30 billion for the first time in 2016, well up on the estimated $24.6 billion the market accounted for in 2015. As Jon Peddie Research puts it, “The average PC sale is increasingly motivated by the video game use model, which is important to understand in a stagnant or declining overall PC market. As basic computing functions become more entrenched with mobile devices, the PC ultimately becomes a power user’s tool.”

The problem for Intel in that context is that it currently doesn’t have a dog in the fight for one of the two key high-performance components in a PC: the graphics. With high-performance PCs increasingly becoming the industry’s cash cows, at least in terms of consumer boxes, that isn’t something Intel can tolerate. And given it’s Nvidia that dominates the graphics half of the market, it’s Nvidia that Intel has inevitably set its sights upon. Eventually, Intel will have its own graphics, but for now, the AMD GPU allows it to begin to squeeze Nvidia out of the high-performance consumer graphics market.

If the motivation for Intel is obvious, what about AMD? Surely the last thing AMD wants to do is aid and abet Intel as it plots to assimilate the PC graphics market? Up to a point, that’s true. Another competitor in graphics is hardly desirable, but the harsh truth is that AMD needs whatever money it can get. History shows that even when AMD has had fairly unambiguously superior graphics products, it has struggled to gain market share over Nvidia. AMD has never had the marketing clout or wits to compete with the slick self-promotional machine that is Nvidia.

HOBSON’S CHOICE?

So, what seems like an unholy alliance with its arch-enemy in the CPU market is actually one of the few realistic options AMD has for increasing its market share in consumer graphics. Intel will no doubt drop AMD like a stone the moment it has developed competitive graphics of its own, but in the meantime, the deal brings real money to AMD at the direct expense of its only current rival in graphics, Nvidia.

Further out, a stronger, better-financed AMD may well be able to compete with Intel with its own high-performance CPU-GPU package. In Ryzen, AMD finally has a competitive CPU product, and while its Vega graphics architecture has been something of a disappointment, as part of a CPU-GPU package, it will still be a tough combination for Intel to beat on its own. That AMD has gained experience with the GPU half of creating a high-performance unified package with HBM2 memory, courtesy of the Intel deal, will hardly hurt. In short, AMD is the great survivor, and the deal is expedient. It allows AMD to fight another day.

For us PC buyers, this new approach definitely has several upsides. Squeezing true high-performance CPU-GPU tech into smaller, slimmer packages than ever before will allow whole new classes of performance PC to be created. Thin and light laptops are part of the early sales pitch from Intel, and very welcome they will be, too. We also expect to see the new package pop up in Apple’s MacBook products sooner rather than later. Indeed, it’s exactly the kind of technology that may enable Apple to apply an extreme makeover to its iMac range, too, and create an all-in-one rig unlike anything we’ve seen before.

Leaked Intel road maps also reveal a new high-performance Intel NUC system, codenamed Hades Canyon. Due to go on sale at the start of the year, the Hades Canyon NUCs use the new Intel-AMD package, and the high-end SKU is being pitched as VR-capable to boot. Suddenly, Intel has a tiny, living-room-friendly box with genuine gaming capability. It’s probably a stretch to say Intel can take on the big beasts of console gaming, but a product like Hades Canyon is a decisive step in that direction.

As for when this technology will roll out on a broader scale, Intel is being fairly cagey, but you can expect to see the first laptops appearing in early 2018. Much will depend on how Intel prices the new package, and how aggressively it incentivizes laptop OEMs, but we reckon Intel wants to do serious damage to Nvidia, so numerous design wins seem likely. If anyone has the clout to supercharge adoption of a new class of processor, it’s surely Intel.

Intel pinched AMD’s graphics guru, Raja Koduri.
The fastest graphics cards, such as the Nvidia GeForce GTX 1080 Ti, probably have nothing to fear. For now.
Could a laptop as thin as the Apple 12-inch MacBook soon offer gaming-capable graphics?
The AMD graphics part of the new chip has been specifically designed for the task.
Intel’s tiny NUC system will benefit from the new CPU-GPU technology.
Will Intel deliver on its claims of VR-capable performance?
