Autonomous vehicles could know everything about us
What if your car looked up data on people you passed? asks Lorraine Sommerfeld.
I thought I could safely leave the topic of autonomous vehicles behind for at least a month or two. Until a news release landed in my inbox that made me shudder. A developer of autonomous software for the automotive industry — unnamed here for reasons that will be clear in a moment — contacted me to propose an ethical dilemma.

We know and understand the hardware needed to create a safe autonomous vehicle. The sensors, cameras, lidar (light detection and ranging) and other systems gather information about the car and its surroundings, which is fed into the brain of the car, where the software and mapping send the car down the road. When all goes well, the car will not smash through potholes or downed trees, won't go the wrong way down a one-way street, and most importantly, won't careen into pedestrians or hurtle its occupants off a cliff.

With driver error being the single biggest factor in collisions, we know that full autonomy will save millions of lives. Millions. But the hurdles are significant, and now people who ask me when we will be fully autonomous are greeted with a shrug. This technology has expanded at warp speed and in every direction, and I threw my crystal ball on the floor ages ago. Who knows?

Meanwhile, we get to consider all the implications of that full autonomy. Will manufacturers bother with steering wheels? Will we read, watch TV, be assaulted by projected ads? Can we shop online while on the road and hit a drive-thru to pick up everything we just ordered a few minutes ago? All these imaginings of what's taking place inside the car mean that computer programming is doing the deciding on everything taking place outside of it.

I thought MIT's Moral Machine online quiz was harsh, soul-searching, and ultimately revealing. It explores the concept of decision-making software for autonomous vehicles by asking you to choose who to sacrifice if you had the option of hitting, among other things, a dog, a pregnant woman, a doctor or a homeless person.
Wondering about who or what we would save or kill, depending on how that information had been coded, was disturbing. More disturbing? The company that contacted me suggested a further step: For all these systems to function, your vehicle will be integrated with the IoT.

What's that? Wikipedia says "The internet of things (IoT) is the network of physical devices, vehicles, home appliances, and other items embedded with electronics, software, sensors, actuators, and connectivity which enables these things to connect, collect and exchange data." Fair enough. Everything is joined at the hip, or at least the head. Your car will be part of the circuit we rely on every day.

But what if, I was asked to suppose, that MIT survey wasn't just a game? When you factor in the IoT, it means as your vehicle senses someone in its range, it could conceivably use advanced — and advancing — technology to know who that person was and everything about them that was available, just as if you'd Googled them.

The dark side of facial recognition software has already been challenged by the American Civil Liberties Union (ACLU) and thrown out of public places. If you're walking the streets of Prague to escape Interpol or an abusive relationship, the world just got far too small. And now cars could rat you out in a heartbeat? You could know you were driving past a drug dealer or a politician. This capacity spins the Moral Machine experiment out of the theoretical and into the practical.

Being suitably horrified and fascinated, I contacted the software developer, who will remain nameless. This capability could blow the concept of personal privacy — already teetering — to smithereens. Except ...

"No, it would never be used that way," I was told.

Wait. The news release that proposed exactly that was being instantly walked back? I wasn't to worry, the person said; laws and courts would keep that information private. I asked about North Korea and Russia, both hotbeds of privacy and reasonableness. Oh, no. Never.

Forgive me, but the line between information gathering and information using gets blurrier every day, and the corruption factor of political regimes leaves me unconvinced we won't be dealing with this at some point. Soon.