PMV Middle East

IS ALL INPUT ERROR IN AUTONOMOUS MOBILITY?

- BY DENNIS DANIEL

Elon Musk’s comment that users shouldn’t have to interact with self-driving cars and that, with interfaces, all input is error has stirred a debate about UX design in autonomous vehicles (AVs).

His argument is that with each software update, the car’s intuition will get better, the driver will need to press fewer buttons, and the software will know when to ignore accidental button presses.

This is easier said than done. And if this is the ultimate goal of autonomous vehicle development, there is the overarching question of who is responsible when a driverless vehicle crashes. Tesla faces increased scrutiny from regulatory authorities over Autopilot, its SAE Level 2 advanced driver assistance system (ADAS). In August 2021, the National Highway Traffic Safety Administration (NHTSA) launched a formal investigation into Autopilot following a series of crashes involving Tesla vehicles operating in either Autopilot or Traffic Aware Cruise Control between 2014 and 2021.

The cover story in this issue explores the hurdles facing the full implementation of autonomous mobility worldwide.

All things considered, the pursuit of autonomous mobility is worth the costs. Multiple initiatives are taking off around the world to automate vehicle fleets. The UAE recently approved a temporary license to test AVs, and Dubai aims for 25% of all transportation trips in the city to be smart and driverless. If fully implemented, this trend carries with it the promise of safer roads, lower greenhouse gas emissions, and more efficient mobility options. However, progress will slow if legal liability and socio-economic issues are not addressed by regulatory authorities without further delay.

There is much uncertainty about who is responsible for damages when a driverless vehicle crashes. Should we blame the manufacturer, the driver, the government, or all of them? How do we account for the role of the technology that replaces a human operator inside the vehicle? If the driverless vehicle is retrofitted with parts, is the component or technology supplier also liable? And how should victims be compensated? We need answers to a long list of questions. There are different legal frameworks under which companies can be held responsible; most vehicle accidents are evaluated under either a negligence or a products liability framework. It is therefore important that regulatory authorities and road agencies evaluate the current testing environments thoroughly, determine the risks, and set rules with regard to liability.

Who better to ask than Matthew Daus, founder and chair of the transportation practice group at US law firm Windels Marx Lane & Mittendorf. Speaking at the International Road Federation (IRF) World Meeting & Exhibition in Dubai, Matthew highlighted the need for regulations that define the duty of connected and autonomous vehicle (CAV) technology companies to ensure public safety, and the liability of CAV technology companies and/or OEMs for their products. There needs to be clarity on what constitutes a design defect for an AV, and the responsibilities of each party or entity involved must be defined.

Considering the complexity of these issues, governments must take the lead and work with the private sector in harmonising regulations. They also need to address socio-economic issues related to AV implementation, such as pedestrian safety, workforce equity, data privacy, accessibility for people with disabilities, government costs and liabilities, lawsuits, and services for underserved communities.
