A Waymo vehicle hit a child. What can we learn from the incident?
On January 23rd, outside an elementary school in Santa Monica, California, a Waymo vehicle hit a child.
That’s what we know for sure.
It sounds shocking, horrifying even. And it’s already giving plenty of groups cover to demand that California revoke Waymo’s license to operate its cars.
But the details matter. And once you start digging a bit, the scary headline about a kid struck down by a heartless robot clearly isn’t the whole story.
In fact, accidents like this provide a lens through which to improve both human and robot driving—and even save lives.
Braking Hard
The specifics of the incident in Santa Monica are still coming out. As it does with any potential safety incident involving a self-driving car, the National Highway Traffic Safety Administration (NHTSA) is actively investigating.
That investigation—as well as a voluntary statement from Waymo—is already revealing quite a lot of nuance.
The incident appears to have happened during dropoff time at the school, with the Waymo driving among vehicles operated by parents delivering their kids.
As often happens during stressful school dropoffs (I have three kids, so believe me, I know!), a large SUV had double-parked, blocking part of the roadway.
As the Waymo approached the double-parked SUV, a child ran out from behind the SUV and into the roadway, directly in front of the Waymo.
The next bit is crucial. Waymo says that its vehicle “…braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.”
Waymo doesn’t specify the exact distances involved. But dropping 11mph in a split second represents a slamming-on of the brakes, not a gentle slowdown. It’s an aggressive move. And it may very well have saved a life.
Waymo says that–because its vehicle was traveling under 6 mph when it made contact with the child–“the pedestrian stood up immediately” and “walked to the sidewalk” on their own.
Waymo called 911 and reported the incident to authorities. The company initially said that the child sustained “minor injuries,” but it’s not clear what injuries, if any, actually happened.
The Problem With People
To be clear, any time a child gets hit by a car, it’s a horrible incident. It’s good that the NHTSA is investigating. As a parent, I feel for the parents involved here–seeing your kid hit by any vehicle must be terrifying.
But before drawing any broader conclusions about the safety of self-driving cars, it’s important to consider the question: “Would a human driver have handled this situation any better?”
SafeKids, an advocacy organization, reports that between 2013 and 2022 almost 200 school-aged kids were killed in school zone accidents.
And that’s only kids. Just days before the Waymo incident, two parents were killed in a crosswalk after dropping their child off at a different California school.
Why do so many people die on the way to school? Speed and distraction are the two biggest factors.
SafeKids reports that as many as 10% of drivers are distracted while driving in school zones–mostly by phones and other devices.
Three percent of drivers observed by the group were even seen using two devices at the same time–perhaps fumbling with a Bluetooth headset while also trying to sign their kid into school on their cellphone.
And most school zones, the group reports, have speed limits that are way too high–under 20 mph is ideal, but most are set at 25 mph or above.
Not that drivers follow those, anyway–other data shows that when drivers hit kids in school zones, they’re traveling an average of 27 miles per hour.
Human drivers, in other words, make tons of mistakes. Especially with the stress of traffic and the pressure to avoid the dreaded “late pass,” it’s all too easy for parents to speed and to take their eyes off the road during dropoff.
Sadly, when kids are involved–with their propensity to dart into the road, as happened in Santa Monica–that combo of speed and distraction means that people die.
Worse With a Person?
Again, that raises the question, in the context of Waymo’s incident, of whether a person would have done better than an AI-powered robot.
Let’s assume, for a moment, that a human was behind the wheel of the vehicle in Santa Monica. What might have gone down differently?
The average human reaction time while driving is about ¾ of a second. At 17 mph, that means a human-driven car would have traveled roughly 19 feet after the child darted into the road before the driver even began to brake.
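That 19-foot figure covers only the distance traveled during the reaction time itself, before any braking begins. Here’s a quick back-of-the-envelope sketch of the arithmetic, assuming the ¾-second reaction time and 17 mph speed cited above–the numbers are illustrative, not drawn from Waymo or NHTSA data:

```python
# Rough check of the reaction-distance arithmetic above.
# Assumes a 0.75-second perception-reaction time and a 17 mph travel
# speed, per the figures cited in the article; illustrative only.

MPH_TO_FPS = 5280 / 3600  # 1 mph is roughly 1.47 feet per second

def reaction_distance_feet(speed_mph: float, reaction_time_s: float = 0.75) -> float:
    """Distance traveled (in feet) before the driver even begins to brake."""
    return speed_mph * MPH_TO_FPS * reaction_time_s

print(round(reaction_distance_feet(17), 1))  # ~18.7 feet, i.e. roughly 19 feet
```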
Perhaps they would have immediately slammed on the brakes. But the NHTSA itself says that most people don’t. Whether through surprise or simply a delay in processing, drivers consistently underbrake, even in potentially fatal accidents.
With a person behind the wheel, it’s thus likely that the child in Santa Monica would have been hit at a much higher speed.
Waymo says that its own independent models show “a fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
And again, most drivers in school zones aren’t “fully attentive.” As SafeKids points out, they’re distracted, rushing, and speeding.
Waymos aren’t perfect by any means. But they consistently follow speed limits–sometimes to a fault.
And because they’re constantly scanning the road, they react faster than people–and hit the brakes hard when they see something even remotely concerning. They never check their phones or try to shave while ferrying passengers around.
When a 5,000-pound robot hits a kid, there’s a natural human tendency to vilify the robot. But in this specific case, the question of whether a person could have done better is far from clear.
Optimize for Safety
That doesn’t mean we should crucify autonomous vehicles–nor does it mean we should let them off the hook.
The NHTSA’s investigation will probably come down to a question not of whether Waymo outperformed a human in this incident, but rather whether self-driving cars could do more to keep kids safe near schools.
Indeed, NHTSA says it’s specifically investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
Given that Waymos can be programmed to behave a certain way in specific circumstances—and will do so consistently once the parameters are set—they provide a unique opportunity to set even higher safety standards than we apply to humans.
Again, SafeKids says that most school zones have speed limits above the 20 mph ideal. There’s no reason, though, that Waymo couldn’t program its cars to consistently travel at a slower speed when in a school zone at pickup or dropoff times.
Perhaps Waymos could always travel at 15 mph when traversing an active school zone.
That might bug the hell out of parents navigating the pickup line, but it would keep kids safer in the event of an accident. Waymos near schools could even serve as moving “traffic calming” devices, forcing distracted, impatient human drivers behind them to slow down, too!
Likewise, Waymo could set parameters that instruct its vehicles to slow to a crawl when approaching a double-parked car near a school. SafeKids specifically calls out double parking as a big risk factor for accidents near schools.
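To make the idea concrete, here’s a minimal sketch of what such a rule might look like. It doesn’t reflect Waymo’s actual software; the function, inputs, and thresholds are hypothetical, chosen only to illustrate the kind of policy the last few paragraphs describe:

```python
# Hypothetical school-zone speed policy; not Waymo's actual code or API.
# The thresholds (15 mph cap, 5 mph crawl) mirror the suggestions above.

def school_zone_speed_cap_mph(
    in_active_school_zone: bool,       # within a school zone during pickup/dropoff hours
    near_double_parked_vehicle: bool,  # approaching a double-parked car that blocks sightlines
    posted_limit_mph: float,
) -> float:
    """Return the maximum speed the vehicle should allow itself (illustrative only)."""
    cap = posted_limit_mph
    if in_active_school_zone:
        cap = min(cap, 15.0)  # hard cap whenever the zone is active
        if near_double_parked_vehicle:
            cap = min(cap, 5.0)  # crawl past anything a child could dart out from behind
    return cap

# Example: a 25 mph street, active school zone, double-parked SUV ahead -> 5 mph cap
print(school_zone_speed_cap_mph(True, True, 25.0))
```

The point isn’t the specific numbers–it’s that once society agrees on a rule like this, a machine will apply it every single time.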
Thankfully–whether through Waymo’s ingenious driving (in the company’s telling) or dumb luck–this incident ended with a kid walking away alive. But that’s not a reason to dismiss what happened.
Rather, incidents like this provide a unique opportunity to define society’s rules for challenging circumstances like driving near kids–and then program them into a machine that (unlike people) will actually follow them.
Asking the tough questions required to set those guidelines–and holding the reality that scary incidents are also learning experiences–is a lot harder than simply blaming the robots and reverting to the human-powered status quo.
But with kids dying in school zones every year, learning the right lessons from accidents like this is absolutely crucial–even life-saving.