The Jerusalem Post

BGU helps stop deception of self-driving cars

Researchers say advanced driving-assistance systems treat projections as real objects

By ILANIT CHERNICK

Researchers at Ben-Gurion University of the Negev’s Cyber Security Research Center have found that a phantom image projected on the road in front of a semi-autonomous vehicle can cause its autopilot to brake suddenly, endangering the lives of drivers and passengers.

“Autopilots and advanced driving-assistance systems (ADASs) in semi-autonomous or fully autonomous cars consider depth-less projections of objects (phantoms) real objects,” the researchers said in a statement released Tuesday by BGU.

In a bid to detect phantoms, the researchers are developing a “neural network that examines and detects an object’s reflected light, surface and context and decides whether the detected object is a real object or phantom.”

“This is done purely based on the video camera’s output,” Ben Nassi, a PhD student at the Cyber Security Research Center, told The Jerusalem Post.
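The article gives only a high-level description of that detector. As a minimal sketch, one could imagine a hypothetical three-branch design in which small networks separately score a detection’s reflected light, surface texture and surrounding context before a combined head votes real-or-phantom. The architecture, layer sizes and names below are illustrative assumptions, not the BGU team’s published model:

```python
# Hypothetical camera-only phantom detector: three small CNN branches
# score one visual cue each, and a combined head decides real vs. phantom.
# All sizes and names are illustrative, not the BGU team's actual model.
import torch
import torch.nn as nn

class Branch(nn.Module):
    """Tiny CNN that scores one cue (light, surface, or context)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 8), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class PhantomDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.light = Branch()    # brightness/reflection crop
        self.surface = Branch()  # close-up of the object's surface
        self.context = Branch()  # wider crop around the detection
        self.head = nn.Linear(3 * 8, 1)  # output near 1 = real, near 0 = phantom

    def forward(self, light, surface, context):
        feats = torch.cat(
            [self.light(light), self.surface(surface), self.context(context)],
            dim=1,
        )
        return torch.sigmoid(self.head(feats))

# Usage: three 64x64 crops derived from the same camera frame.
model = PhantomDetector()
crop = torch.rand(1, 3, 64, 64)
p_real = model(crop, crop, crop)
print(f"P(real object) = {p_real.item():.2f}")
```

The point of such a design is that it needs nothing beyond the camera feed, matching Nassi’s remark that the decision is made purely from the video camera’s output.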

Nassi is a student of Prof. Yuval Elovici, director of the BGU Cyber Security Research Center and of Deutsche Telekom Innovation Labs@BGU, whose team has been “able to train a model that accurately detects phantoms.”

The team demonstrated how attackers can exploit this perceptual challenge and manipulate a car into endangering its passengers, in a project they dubbed “Phantom of the ADAS.”

The group also demonstrated that attackers “can fool a driving assistance system into believing fake road signs are real by disguising phantoms for 125 milliseconds in advertisements presented on digital billboards located near roads.”
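For scale, an illustrative back-of-the-envelope calculation (not from the article) shows that 125 milliseconds corresponds to only a handful of frames at common billboard refresh rates, which is why the embedded phantom is easy for a human viewer to miss yet long enough for an ADAS camera to register:

```python
# How many video frames a 125 ms phantom occupies at common
# billboard refresh rates (illustrative arithmetic only).
PHANTOM_SECONDS = 0.125
for fps in (24, 30, 60):
    print(f"{fps} fps: {PHANTOM_SECONDS * fps:.1f} frames")
# 24 fps -> 3.0 frames; 30 fps -> 3.8 frames; 60 fps -> 7.5 frames
```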

While the deployment of semi/fully autonomous cars is already taking place in countries around the world, “the deployment of vehicular communication systems is delayed.

“Vehicular communication systems connect the car with other cars, pedestrians and surrounding infrastructure,” BGU said in a statement. “The lack of such systems creates a ‘validation gap,’ which prevents semi/fully autonomous vehicles from validating their virtual perception with a third party, requiring them to rely solely on their sensors.”
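To make the validation gap concrete, here is a hypothetical sketch of the missing cross-check. All names are invented for the illustration; no deployed V2X API is implied. With vehicular communication, a car could confirm a camera detection against independent reports before reacting; without it, the detection stands on its own:

```python
# Hypothetical illustration of the "validation gap" (names invented).
def validate_detection(camera_detection, v2x_reports):
    """Return the detection only if an independent source confirms it."""
    if v2x_reports is None:
        # Validation gap: no V2X deployed, so the car must rely
        # solely on its own sensors -- a phantom sails through.
        return camera_detection
    return camera_detection if camera_detection in v2x_reports else None

print(validate_detection("pedestrian", None))      # 'pedestrian' (no V2X)
print(validate_detection("pedestrian", {"car"}))   # None (rejected)
```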

“This is an ongoing research project” that has taken them almost a year, Nassi said.

“We were inspired by the fact that cars cannot validate their virtual perception with current infrastructure due to the absence of deployed vehicular communication protocols,” he said. “This is not a bug. This is not the result of poor code implementation. This is a fundamental flaw in object detectors that essentially use feature matching for detecting visual objects and were not trained to distinguish between real and fake objects.

“This type of attack is currently not taken into consideration by the automobile industry.”

Nassi said the attackers would be people with an interest in endangering a car by causing an accident using a phantom projection.

Phantom attacks “can be applied remotely using a drone equipped with a portable projector or by hacking digital billboards that face the Internet and are located close to roads, thereby eliminating the need to physically approach the attack scene, changing the exposure versus application balance,” he said.

Nassi said such attacks leave no evidence at the attack scene, require no complex preparation, and can be carried out with cheap equipment.

While previous attacks exploiting this validation gap required skilled attackers to approach the scene, he demonstrated that remote attacks can fool advanced systems using easily accessible equipment such as a drone and a projector.

In practice, depth-less objects projected on the road are considered real even though the vehicles are equipped with depth sensors.

The researchers, the university said, “believe that this is the result of a ‘better safe than sorry’ policy that causes the car to consider a visual 2-D object real.”
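The policy the researchers describe can be illustrated with a short sketch. This is assumed decision logic written for clarity, not documented vendor code: a camera detection triggers braking even when no depth sensor confirms it, which is precisely what a depth-less phantom exploits:

```python
# Illustrative "better safe than sorry" fusion policy (assumed logic,
# not documented vendor code): act on the camera alone rather than
# risk ignoring a real obstacle the depth sensor missed.
def should_brake(camera_sees_obstacle: bool, depth_confirms: bool) -> bool:
    if camera_sees_obstacle:
        # Safety-first: brake even without depth confirmation --
        # the opening a depth-less 2-D projection slips through.
        return True
    return depth_confirms  # depth-only detection still triggers braking

print(should_brake(camera_sees_obstacle=True, depth_confirms=False))  # True
```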

Addressing what automobile companies using automated technology should do to correct this major flaw, Nassi suggested a countermeasure “that is able to determine whether a detected object is real or not.”

(Elijah Nouvelage/Reuters) A PROTOTYPE of Google’s self-driving vehicle. BGU researchers have demonstrated how attackers can exploit an autonomous car’s perceptual challenge and manipulate it into endangering its passengers.
