PMV Middle East
IS ALL INPUT ERROR IN AUTONOMOUS MOBILITY?
Elon Musk’s comment that users shouldn’t have to interact with self-driving cars and that, with interfaces, “all input is error” has stirred a debate about UX design in autonomous vehicles (AVs).
His argument is that with each software update, the car’s intuition will get better, the driver will need to press fewer buttons, and the software will know when to ignore accidental button presses.
This is easier said than done. And if this is the ultimate goal of autonomous vehicle development, there’s the overarching question of who’s responsible when a driverless vehicle crashes. Tesla already faces increased scrutiny from regulators over Autopilot, its SAE Level 2 advanced driver assistance system (ADAS). In August 2021, the US National Highway Traffic Safety Administration (NHTSA) launched a formal investigation into Autopilot following a series of crashes during 2014–21 involving Tesla vehicles operating in either Autopilot or Traffic Aware Cruise Control.
The cover story in this issue explores the hurdles facing the full implementation of autonomous mobility worldwide.
All things considered, the pursuit of autonomous mobility is worth the costs. Multiple initiatives to automate vehicle fleets are taking off around the world. The UAE recently approved a temporary license to test AVs, and Dubai aims for 25% of all transportation trips in the city to be smart and driverless. If fully implemented, this trend carries the promise of safer roads, lower greenhouse gas emissions, and more efficient mobility options. However, progress will slow if regulatory authorities do not address the legal liability and socio-economic issues without further delay.
There is considerable uncertainty about who is responsible for damages when a driverless vehicle crashes. Should we blame the manufacturer, the driver, the government, or all of them? How do we account for the role of the technology that replaces a human operator inside the vehicle? If the driverless vehicle is retrofitted with parts, is the component or technology supplier also liable? And how should victims be compensated?

These questions need answers. There are different legal theories under which companies can be held responsible; most vehicle accidents are evaluated under either a negligence or a products liability framework. It is therefore important that regulatory authorities and road agencies thoroughly evaluate the current testing environments, determine the risks, and set rules regarding liability.

Who better to ask than Matthew Daus, founder and chair of the transportation practice group at US law firm Windels Marx Lane & Mittendorf? Speaking at the International Road Federation (IRF) World Meeting & Exhibition in Dubai, Daus highlighted the need for regulations that define the duty of connected and autonomous vehicle (CAV) technology companies to ensure public safety, and the liability of CAV technology companies and/or OEMs for their products. There needs to be clarity on what constitutes a design defect for an AV, and the responsibilities of each party or entity involved must be defined.
Considering the complexity of these issues, governments must take the lead and work with the private sector to harmonise regulations. They also need to address socio-economic issues related to AV implementation, such as pedestrian safety, workforce equity, data privacy, accessibility for people with disabilities, government costs and liabilities, lawsuits, and services for underserved communities.