Waymo issuing recall over safety concerns involving school buses
Summary
Waymo is preparing a voluntary recall after videos from the Austin Independent School District showed its robotaxis driving past school buses with stop arms and crossing arms deployed. The company says it identified a software issue, pushed updates in mid-November and believes performance has since improved; nonetheless, it will file a recall to address the problem formally. The NHTSA has opened investigations into multiple Waymo vehicles following similar reports, and public scrutiny has intensified as school districts and regulators demand clearer safety assurances.
Key Points
- Austin ISD published videos and reported that Waymo vehicles illegally passed school buses — averaging about 1.5 violations per week in Austin as of 20 November.
- Waymo implemented software updates by 17 November and says performance has “meaningfully improved” but will still file a voluntary recall to fix the identified issue.
- The NHTSA opened a Preliminary Evaluation in October (covering ~2,000 Waymo 5th-gen ADS-equipped vehicles) and launched an additional probe after Austin ISD’s public release of footage.
- Austin ISD issued at least 20 citations during the school year, prompting public release of the video evidence and greater pressure on Waymo.
- Waymo stresses a strong overall safety record (claims of 12x fewer pedestrian-injury crashes than human drivers) but regulators and parts of the public remain sceptical.
- Wider context: autonomous vehicles face growing public pushback and evolving laws (e.g. California’s AB 1777) that shift liability and reporting duties toward manufacturers.
Content summary
Videos shared by the Austin Independent School District showed Waymo robotaxis passing stopped school buses with their stop arms extended. After weeks of communication between the district and Waymo — and following the company’s assurances that a software update had been applied — the district released the footage when incidents continued. Waymo announced it will file a voluntary recall and says a November software update has improved behaviour; federal investigators at the NHTSA are already looking into the issue across thousands of vehicles. The story highlights the tension between autonomous fleet deployment and real-world edge cases that can threaten public trust.
Context and relevance
This matters because school-bus interactions are a high-stakes, well-defined safety scenario where clear legal and ethical responsibilities exist. Regulators are watching closely: NHTSA probes and state-level rules (such as California’s AB 1777) are reshaping how autonomous vehicle behaviour is governed and who is held liable. For fleet operators, local authorities and insurers, this recall is a milestone — it shows that even mature AV programmes must quickly address compliance gaps to sustain deployments and public confidence.
Why should I read this?
Short answer: because kids and buses are involved — and that makes this one you shouldn’t skim past. The piece explains why Waymo is going back to the drawing board, what regulators are doing about it, and why a software patch alone may not restore trust. If you care about AV safety, regulation or how public services react when tech slips up, this saves you time by boiling the situation down fast.
Author style
Punchy: This is a clear red flag for AV deployments — the company is voluntarily recalling vehicles and facing federal scrutiny. Read the detail if you want to understand the likely regulatory and public-relations fallout.