Popular Mechanics (South Africa)
The immediate future of the smart car
Making the connections
THE CONTRAST BETWEEN the blinking screens of CES and the gleaming aluminium of Detroit’s North American Auto Show has never been starker. Mercedes-Benz, for instance, paraded its new MBUX in Vegas and had a G-Class preserved in amber in the Motor City. Cars have been unveiled at CES for years now; the car is, after all, the piece of technology many people spend the most time with.
BETTER SECURITY
If you thought Samsung bought Harman purely for its audio technology, you were wrong (and so were we, kind of). DRVLINE is the first major product of the $8 billion acquisition, and it follows the establishment of the Samsung Automotive Innovation Fund, a $300 million project. The new platform is said to be modular, scalable and open, allowing automakers to build advanced driving systems.
DRVLINE is a step towards an ecosystem to support full autonomy, with hardware and software partners filling the entire spectrum of advanced driving technology. These partners include Renovo Auto and AImotive (software), Graphcore and Infineon (in-car computing), Valens and Autotalks (communication) and Quanergy as one of the many sensor suppliers. Now add in Samsung’s Knox security platform and you have a near-impenetrable seal over the hardware and software platforms.
On the driving side, the platform is designed to scale from Level 3 automation up to Levels 4 and 5. The first hardware component is a forward-facing camera featuring lane-departure warning, adaptive cruise control, collision warning and pedestrian warning algorithms. This unit combines Samsung’s legacy camera technology with Harman’s ADAS 360 solution, blending machine learning and data science with augmented reality to create a self-learning, virtual co-passenger.
Harman’s digital cockpit platform leverages the company’s Ignite platform for a customisable in-car user experience. Ignite is an industry leader and was the first system to allow the Android OS to be integrated across four displays. Samsung will also be on hand to deploy its 5G-ready connectivity solution.
MIND MELDING
Nissan is one of those manufacturers that does things in leaps instead of incremental steps. Brain-to-Vehicle is one such leap. It’s an interesting idea to use the driver’s actual brain to train the autonomous artificial intelligence, but it’s genius to integrate that same idea into the manually controlled systems. This way the car can augment your own abilities by responding immediately if you miss something like, say, a pedestrian entering the roadway while you’re reading an advertising board.
If people are the weakest link in the traffic chain, using technology as an additive solution instead of handing over full control seems like the least offensive or intrusive approach. The technology is said to shorten reaction times and enhance driving pleasure, and it will feature in Nissan’s IMx concept as well as future iterations of the Leaf electric car.
HARDWARE ACCELERATION
Graphics processors are having a huge moment right now, and it’s thanks to artificial intelligence. AI is all about interpreting massive amounts of data from an array of sensors and intelligently making adjustments in reaction to it. A GPU is exceptionally good at crunching through masses of predictable or consistent data, which is why the world’s premier graphics processor manufacturer launched two new automobile-focused products and two new partnerships at CES. Uber has been trying to get the autonomous show on the road since 2015 and has now selected Nvidia as its hardware partner. Over this period, self-driving Ubers have completed more than