TL;DR:
- Waymo recalls 3,100 autonomous vehicles after illegal passes on stopped school buses in Texas.
- Software flaw caused vehicles to miss stop arms and flashing lights on school buses.
- NHTSA opens probe into safety gaps as Waymo pushes November software fixes.
- Texas reports 19 violations, raising concerns about AV readiness and regulatory oversight.
Alphabet’s autonomous driving subsidiary, Waymo, is facing intensified regulatory scrutiny after issuing a significant software recall linked to repeated failures to stop for school buses in Texas. According to new filings with the National Highway Traffic Safety Administration (NHTSA), the company recalled 3,100 vehicles equipped with its fifth-generation automated driving system following reports that the cars had illegally passed stopped school buses at least 19 times during the current school year.
The recall marks one of Waymo’s most serious public safety setbacks since expanding its driverless operations into new U.S. markets. While the company has emphasized its strong safety record and quickly rolled out a software update, the events in Texas reveal a deeper challenge for autonomous systems: reliably managing everyday, high-risk scenarios involving children.
Texas Incidents Trigger Federal Scrutiny
Texas transportation officials in Austin raised alarms after observing a pattern: Waymo vehicles repeatedly failed to respond to school bus stop arms and flashing red lights. The 19 documented violations amount to roughly 1.5 incidents per week, signaling a persistent system gap rather than isolated edge cases.
One of the incidents under review involved a fully driverless Waymo vehicle, meaning no trained human safety operator was present. That revelation increased pressure on federal regulators, who have tasked Waymo with providing detailed behavioral logs and system explanations by January 20, 2026.
Software Flaw at the Center
Waymo told regulators the failing behavior stemmed from a software misinterpretation of school bus cues, specifically stop-sign arms and roof-mounted red warning lights. The company pushed a software update to its fleet, which NHTSA confirmed had been fully installed by November 17, 2025.
While the update resolved the immediate defect, five violations reportedly occurred after Waymo notified Texas officials of the software issue on November 5. That timeline underscores how long the system struggled with a scenario that human drivers are routinely trained to handle.
Waymo frequently cites internal data showing 91% fewer serious-injury crashes compared to human drivers. But regulators note that a single category of repeated, preventable errors, especially those involving children boarding school buses, can outweigh broader statistical improvements when public trust is at stake.
Regulators, Companies Eye Connected Tech Solutions
The school bus failures are prompting renewed interest in Vehicle-to-Everything (V2X) communication technologies. Systems like C-V2X can send direct alerts from school buses to approaching vehicles, potentially allowing autonomous systems to react even when visual cues are unclear.
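The decision logic such an alert system implies can be sketched in a few lines. The message fields and function below are illustrative assumptions, not any vendor's actual API; real C-V2X deployments exchange standardized messages (the SAE J2735 message set), but the core idea is the same: the vehicle can act on a broadcast stop-arm alert even when its cameras cannot confirm the visual cue.

```python
from dataclasses import dataclass

# Hypothetical V2X alert payload. Field names are illustrative;
# production systems use standardized SAE J2735-style messages.
@dataclass
class SchoolBusAlert:
    bus_id: str
    stop_arm_deployed: bool
    red_lights_flashing: bool
    distance_m: float  # distance from the ego vehicle to the bus

def must_stop(alert: SchoolBusAlert, alert_range_m: float = 150.0) -> bool:
    """Decide whether the ego vehicle must stop for a school bus.

    Because the alert arrives over radio rather than through the
    camera pipeline, the decision can fire even when the stop arm
    is occluded or the flashing lights are washed out by glare.
    """
    if alert.distance_m > alert_range_m:
        return False  # bus is too far away for the stop rule to apply yet
    return alert.stop_arm_deployed or alert.red_lights_flashing

# Example: a bus 40 m ahead with its stop arm out requires a stop.
alert = SchoolBusAlert("bus-17", stop_arm_deployed=True,
                       red_lights_flashing=True, distance_m=40.0)
print(must_stop(alert))  # True
```

The point of the sketch is redundancy: the broadcast channel supplements, rather than replaces, visual perception, which is exactly the gap the Texas incidents exposed.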
Applied Information, a traffic-technology vendor, has already piloted “Connected School Bus Systems” in Fulton County, Georgia, demonstrating real-time alerts that prevent unsafe passes. Industry analysts say incidents like Waymo’s may accelerate national adoption.
Texas officials and other states may now consider expanding connected-infrastructure budgets. Some jurisdictions, including Newfoundland, Canada, already run school bus camera programs that pair enforcement with public safety campaigns, imposing fines ranging from $500 to $1,200 per violation.
For autonomous vehicle developers, incorporating V2X receivers could help reduce regulatory risk and ensure compliance in highly sensitive scenarios where visual recognition struggles.
A Pivotal Moment for AV Safety
Waymo’s recall is more than a technical correction: it highlights the fragility of public trust and the complexities of deploying fully autonomous vehicles in real communities.
As regulators deepen their investigation and states evaluate technology upgrades, the Texas school bus incidents may become a landmark case shaping how automated driving systems must interact with some of the most essential road-safety rules in America.