Weeks after my first ride in a Waymo, GM’s Cruise shut down all of its self-driving cars after a human-driven car knocked a pedestrian into a robotaxi’s path. The taxi, detecting that it had been in an accident, attempted to pull over — dragging the person pinned beneath it an additional 20 feet as it moved.
This accident is a horrible reminder of the challenging and unexpected edge cases humans handle while driving. As self-driving regulations evolve, we have to decide how cars should respond in these situations and, in some cases, whose lives a car should prioritize. In the 2004 movie I, Robot, a robot saves the main character instead of a child because its predictive analysis gives him a higher chance of survival. The problem is that a purely logical decision still requires a dose of human judgment and emotion — and even then, those decisions might not be correct.