Waymo Robotaxi Hits Child Near Santa Monica School, Prompting Federal Safety Probe
On a clear Friday morning in Santa Monica, parents double-parked along a narrow residential street as they hurried children toward an elementary school. A crossing guard waved small clusters of students through gaps in traffic. From behind a tall SUV, a child suddenly ran into the road.
Rolling toward the scene was a white Waymo robotaxi, driving itself with no one behind the wheel.
The vehicle, traveling about 17 mph, braked hard when the child emerged and slowed to under 6 mph before making contact, according to the company. The child suffered minor injuries and was able to stand and walk to the sidewalk.
The collision, which occurred Jan. 23 within a few blocks of the school, has now triggered a federal safety investigation into how one of the nation’s most advanced self-driving systems behaves around children.
The National Highway Traffic Safety Administration has opened a preliminary evaluation into Waymo’s automated driving system, focusing on its conduct in school-zone conditions and near “young pedestrians and other vulnerable road users,” the agency said in a summary of the probe.
The review marks an escalation in federal scrutiny of Alphabet-owned Waymo at a sensitive moment for the autonomous vehicle industry, which has promoted robotaxis as safer than human drivers even as regulators confront a series of incidents involving children, school buses and pedestrians.
A minor injury, major questions
Waymo disclosed the Santa Monica crash publicly in a Jan. 29 blog post that described the episode as one “in which one of our vehicles made contact with a young pedestrian.”
The company said the child “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path.” According to Waymo, its system “immediately detected the individual and applied hard braking, reducing the vehicle’s speed from approximately 17 mph to under 6 mph before contact was made.”
“Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911,” the company wrote. Waymo said the vehicle remained stopped, then pulled to the side of the road and waited until law enforcement officers allowed it to leave.
The child’s name, age and specific injuries have not been released. Police and federal filings describe the injuries only as minor. Officials have also declined to publicly identify the exact intersection or school, citing privacy concerns and the ongoing investigation.
Waymo reported the incident to NHTSA the same day, the company said.
In its public account, Waymo framed the crash as an unavoidable event that its technology helped mitigate rather than prevent. Citing what it called a “peer-reviewed model,” the company said a fully attentive human driver in the same situation would likely have hit the child at about 14 mph, more than twice the speed of the Waymo vehicle at impact.
“We believe this incident demonstrates the material safety benefit of the Waymo Driver,” the company wrote.
The claim goes to the heart of the dispute now facing regulators and the public: whether vehicles marketed as safer than human drivers are exercising enough caution in one of the most sensitive environments on American roads.
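Waymo has not published the parameters behind that comparison. A minimal kinematics sketch, using assumed braking deceleration, emergence distance and reaction delays that are not drawn from the company’s filings, illustrates the mechanism the claim rests on: over a gap of only a few meters, even fractions of a second of delay before hard braking dominate the impact speed. The numbers below are illustrative only, not a reconstruction of Waymo’s model.

```python
# Illustrative kinematics only -- assumed parameters, not Waymo's published model.
# Shows how impact speed changes with the delay before hard braking begins
# when a pedestrian emerges a short distance ahead of the vehicle.

MPH_TO_MS = 0.44704  # miles per hour to meters per second

def impact_speed_mph(initial_mph: float, emergence_distance_m: float,
                     brake_delay_s: float, decel_ms2: float = 7.0) -> float:
    """Impact speed (mph) for a vehicle that coasts for brake_delay_s,
    then decelerates at decel_ms2 until contact or a full stop."""
    v0 = initial_mph * MPH_TO_MS
    distance_before_braking = v0 * brake_delay_s
    braking_distance = max(0.0, emergence_distance_m - distance_before_braking)
    v_sq = max(0.0, v0 ** 2 - 2.0 * decel_ms2 * braking_distance)
    return (v_sq ** 0.5) / MPH_TO_MS

# A child appears roughly 6 m ahead of a vehicle doing 17 mph (assumed distance).
for label, delay in [("automated system, ~0.3 s latency", 0.3),
                     ("attentive human, ~1.0 s reaction", 1.0)]:
    print(f"{label}: impact at ~{impact_speed_mph(17, 6.0, delay):.1f} mph")
```

With these assumed values, the briefer delay cuts the impact speed to a few miles per hour, while the longer one leaves almost no braking distance at all; the point is the sensitivity to timing, not the specific figures.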
Federal investigators focus on school zones
NHTSA’s Office of Defects Investigation has opened case number PE26-001 to examine the Santa Monica collision. A preliminary evaluation is the first formal step in the agency’s defect process and can lead to an engineering analysis and, ultimately, a recall order if a safety defect is found.
In a summary of the investigation, NHTSA said it will review whether the Waymo vehicle “exercised appropriate caution” given its proximity to an elementary school during morning drop-off, the presence of a crossing guard and the number of children and double-parked vehicles in the area.
Investigators will look at the system’s intended and actual behavior in “school zones and adjacent neighborhoods,” including how it handles speed limits, occluded views created by parked cars, and its response to pedestrians who may suddenly enter the roadway. The agency also plans to examine the robotaxi’s post-crash actions, including its stopping behavior and communications after the impact.
Preliminary evaluations typically run several months but can be extended. They may be closed without finding a defect, upgraded to a deeper engineering analysis, or resolved through a voluntary or ordered recall.
NHTSA did not immediately respond to questions about the Santa Monica case beyond the published summary.
A history of run-ins near schools
The Santa Monica incident comes as Waymo is already under federal scrutiny over how its vehicles interact with children in a different context: school buses.
In October, NHTSA opened a separate investigation after reports that a Waymo vehicle in the Atlanta area passed a school bus that was stopped with its red lights flashing and stop arm extended while students were disembarking. School officials in Austin, Texas, later documented more than a dozen similar episodes of Waymo robotaxis driving past stopped buses during the current school year.
On Dec. 5, Waymo said it would file a voluntary software recall to change how its vehicles behave around school buses, after what Chief Safety Officer Mauricio Peña described as incidents “related to appropriately slowing and stopping in these scenarios.” The company said it had already deployed an over-the-air update on Nov. 17 to address the behavior.
Federal attention has since broadened. On Jan. 23, the same day as the Santa Monica crash, the National Transportation Safety Board announced its first investigation into Waymo, focusing on the illegal passing of stopped school buses in Austin. The NTSB, which can only issue recommendations and not regulations, said it would send investigators to Texas and publish a report on its findings.
The combination of bus-related violations and a pedestrian injury in a school neighborhood has placed a spotlight on how autonomous systems make decisions in environments where children are present — and on what regulators will require of them.
How cautious is cautious enough?
Waymo operates fully driverless ride-hailing services in several U.S. cities, including the Phoenix area, San Francisco, Los Angeles, Austin and Atlanta. The company says its vehicles have driven more than 120 million miles without a human behind the wheel and touts sharply lower crash and injury rates than human drivers.
According to safety analyses the company has published in a scientific journal and on its website, its rider-only fleet has been involved in about 90% fewer crashes causing serious injury or worse and roughly 92% fewer pedestrian injury crashes than an equivalent human-driven fleet operating in the same cities.
Those aggregate numbers will be part of the backdrop as NHTSA reviews the Santa Monica collision. Safety researchers say the incident raises a more granular question: what speed and behavior should be considered acceptable when a driverless vehicle moves through an inherently unpredictable school environment.
Traffic engineers and child-safety advocates have long pushed for lower speeds in school zones, often 15 mph or less, along with strict enforcement and design changes that slow drivers down. The Waymo robotaxi in Santa Monica was traveling at about 17 mph, a speed that may have been within the posted limit but is now part of the federal review.
Another focus is likely to be how the system interpreted and responded to visual obstructions. Double-parked SUVs can block sight lines and create what engineers call occlusions — hidden areas from which someone or something can suddenly emerge. For a child running toward a school entrance, that space can shrink to almost nothing.
One question investigators are expected to examine is whether the automated driving system treated the area behind the double-parked SUV as an elevated risk zone, and whether its programming should have required a slower approach speed under those circumstances.
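Investigators have not said what approach speed they would consider appropriate near such obstructions. One way to frame the trade-off is to ask the highest speed from which a vehicle can still stop within the clear distance past an occlusion. The sketch below works that out with assumed system latency and braking deceleration, values chosen for illustration rather than taken from the investigation, to show how quickly the margin shrinks as the clear distance does.

```python
# Illustrative only -- assumed values, not NHTSA or Waymo criteria.
# Computes the highest approach speed from which a vehicle can stop before
# reaching a point `sight_distance_m` ahead (e.g., the edge of an occlusion
# created by a double-parked SUV), given braking latency and hard deceleration.

MPH_TO_MS = 0.44704  # miles per hour to meters per second

def max_stoppable_speed_mph(sight_distance_m: float,
                            latency_s: float = 0.3,
                            decel_ms2: float = 7.0) -> float:
    """Largest initial speed (mph) that still stops within sight_distance_m.
    Solves d = v * latency + v^2 / (2 * decel) for v (positive root)."""
    a = 1.0 / (2.0 * decel_ms2)
    b = latency_s
    c = -sight_distance_m
    v = (-b + (b * b - 4.0 * a * c) ** 0.5) / (2.0 * a)
    return v / MPH_TO_MS

for d in (4.0, 6.0, 10.0):  # assumed clear distances past a parked vehicle
    print(f"{d:>4.1f} m of clear road: stop possible from ~{max_stoppable_speed_mph(d):.0f} mph")
```

Under these assumptions, a vehicle can just stop within about six meters of clear road from roughly 16 mph, but from only around 13 mph when the clear distance falls to four meters, which is why occlusions and approach speed are likely to be examined together.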
A test of trust for driverless cars
The Santa Monica crash also lands in a changed regulatory climate following high-profile failures by another autonomous vehicle operator, Cruise.
In October 2023, a Cruise robotaxi in San Francisco ran over a pedestrian who had already been struck by a human-driven car, then dragged her as it attempted to pull over. Federal and state regulators later found that Cruise had failed to fully disclose the extent of the dragging in its initial report. In 2024, the company agreed to a criminal resolution with the Department of Justice and to strengthen its safety compliance program.
Waymo has repeatedly emphasized that it is taking a different approach. By reporting the Santa Monica crash to NHTSA on the day it occurred and publishing a detailed narrative less than a week later, the company is signaling that it intends to be transparent with regulators and the public.
Even so, the incident underscores how a relatively low-speed collision that leaves a child on their feet can carry outsized weight for an emerging technology.
As NHTSA sifts through sensor logs, braking data and software logic, the agency will be deciding more than whether a single robotaxi performed as designed. Its conclusions could shape how fast driverless vehicles are allowed to travel near schools, whether they can operate during the busiest drop-off and pickup windows, and what precautions they must take when parked cars and small children mix.
On the Santa Monica block where the crash occurred, the scene has already returned to routine: parents pulling to the curb, children spilling onto the sidewalk, a crossing guard holding up a stop sign. Hanging over that routine, Waymo’s safety claims and regulators’ findings are converging on a central question that will reach far beyond one California school: what level of risk from a driverless car is acceptable when the people crossing the street are children.