IDF 2016: Powerful chips, super-fast data lasers and robot brains
Brad Chacos reveals the highlights from this year’s Intel Developer Forum
The Intel of today isn’t the Intel you know, and that truth was hammered home at the company’s recent annual Intel Developer Forum in San Francisco.
Sure, it’s still by far the most prodigious PC chipmaker in the world, but its focus has shifted away from computers alone to embrace the idea of bringing smarts to all sorts of devices. While the IDFs of yesteryear leaned heavily on PC processors and new tech designed to make computers more potent, at IDF 2016 PCs shared the stage with drones, DJ tables, robots, Raspberry Pi-esque maker boards, and even 5G networks.
And that’s not even mentioning the announcement about Intel and ARM – or the surprise mic-drop moment from AMD. The times are definitely changing, but at the same time, IDF has always been about what’s coming next in the world of computing – and IDF 2016 delivered wild visions of the future in spades. Let’s dig in, starting with some radical new PC hardware.
The star of IDF’s day one keynote wasn’t a ferocious new processor or an arcane Internet of Things invention. Instead, it was Project Alloy, a wireless VR headset created by Intel with help from Microsoft.
Project Alloy uses dual Intel RealSense 3D cameras to detect the outside world, offering ‘five-finger detection’ to help you manipulate virtual objects. Whereas the Oculus Rift and HTC Vive focus on straight virtual reality – placing you wholly inside virtual worlds – and Microsoft’s HoloLens uses augmented reality to overlay digital objects in the physical world, Project Alloy is a marriage of the two. Intel’s headset uses its cameras to display real-world objects inside a 3D-rendered virtual world.
Intel didn’t dive into specifics; we don’t know when Project Alloy will be released, or for how much, or even what chip powers it. But the company plans to open-source the design of this potential PC saviour sometime in mid-2017.
Open-sourced VR headsets are only part of the equation, though. Hardware is useless without software. Fortunately, Windows chief Terry Myerson strode on stage shortly after Project Alloy’s reveal to announce that Microsoft is bringing Windows Holographic to the masses. Windows Holographic, which powers Project Alloy and Microsoft’s own HoloLens, uses augmented reality to show digital objects overlaid on the physical world, such as Minecraft blocks or wall-sized calendars. Microsoft will push Windows Holographic to every Windows 10 PC sometime in 2017 – presumably around the time Intel open-sources Project Alloy’s design.
Virtual reality demands more computing performance than most tasks. At IDF, Intel formally showed off its next generation of CPUs, dubbed Kaby Lake.
Intel actually spent more time talking up the chips’ graphics performance than their computing chops, which may not be surprising when you consider that the chips were hastily added to Intel’s road map as Moore’s Law slows. The seventh-generation Core processors feature hardware-accelerated video decoding and graphics cores powerful enough to push 4K video, Intel says. The company also showed those integrated graphics cores running Overwatch smooth as silk, though Intel didn’t say which graphics settings or resolution were used in the demo. Don’t expect Kaby Lake’s built-in graphics to play games at 4K resolution, is what we’re saying.
Laptops based on Kaby Lake – like the Asus Transformer 3 pictured above – will start shipping sometime this autumn.
But Kaby Lake is evolution. AMD’s forthcoming Zen architecture is a CPU revolution for the company, and the firm piggybacked on IDF for a major reveal of its own. It’s been teasing Zen details for a while now, but pulled back the curtain pretty far at an evening event in San Francisco.
The highlight was a demonstration of two PCs – one powered by an octa-core Zen chip, the other by Intel’s octa-core Core i7-6900K – set to 3GHz clock speeds and facing off in a multithreaded Blender rendering task. AMD’s Zen chip beat out Intel’s latest, greatest octa-core processor by a hair.
Considering that the internet rumour mill pegged Zen performance as roughly on par with old-school Intel ‘Ivy Bridge’ chips, that’s incredibly exciting. For more details read our feature on page 80.
Intel’s truly awe-inspiring hardware is destined for data centres, though. During day two’s keynote, Intel took the wraps off ‘Knight’s Mill’ – a powerful, secretive new Xeon Phi chip loaded with dozens of CPU cores and cutting-edge stacked memory in order to chew through artificial intelligence tasks.
Knight’s Mill isn’t a direct replacement for the 72-core Knight’s Landing chip, nor for Knight’s Hill, Knight’s Landing’s eventual successor. Instead, the processor’s cores focus on ‘low-precision calculations’, which can be strung together for approximations that help the chip make decisions in neural networks. It’s a direct response to the meteoric rise of Nvidia GPUs for AI tasks.
After 16 long years of testing and teasing, Intel is finally making good on its promise to move beyond copper. At IDF 2016, the company announced that it has begun shipping silicon photonics modules, which use light and lasers to turbocharge data transfers between computers.
This initial broadside focuses on optical communications technology inside data centres, at blistering 100Gb/s rates. And while it’s based on the widely used Ethernet protocol, servers will require special switches to support silicon photonics.
But the really intriguing titbit is what lies beyond this rollout: over time, Intel will bake optical communications directly inside its chips, which means blazing-fast beams of light will push the data inside PCs.
The future of memory was explored, too. No, we’re not talking about Intel’s revolutionary 3D XPoint memory (though it made an appearance in an enterprise-only role). Instead, we’re talking about DDR5 RAM.
“What? Isn’t DDR4 memory just starting to roll out?” you ask. Yes indeed, dear hypothetical reader, but DDR5 isn’t expected until 2020. But seeing its mere existence on an IDF 2016 slide is eye-opening, as many hardware experts expected DDR4 to be the last major DDR RAM iteration before the technology gives way to better, brighter things (like the aforementioned 3D XPoint, or phase-change memory).
DDR5 DRAM will have many benefits: users will be able to cram more memory into PCs, and applications will run faster.
Intel CEO Brian Krzanich shows off the new Joule chip