Waymo has been forced to recall thousands of robotaxis across the US after one self-driving car was swept into a creek.
In a notice posted on the National Highway Traffic Safety Administration (NHTSA) website on Tuesday, the company announced it would begin a ‘voluntary recall’.
The recall notice affects nearly 3,800 robotaxis using Waymo's fifth- and sixth-generation self-driving systems.
Waymo says the affected cars have a software issue that allows them to drive onto flooded roads.
It follows an incident in San Antonio, Texas, on April 20, in which a Waymo drove into a flooded road and was swept away by the waters.
While no one was on board the vehicle at the time, and no injuries occurred, the company warns that this unexpected behaviour could lead to ‘a loss of vehicle control, increasing the risk of a crash or injury’.
Waymo, which is owned by Google's parent company Alphabet, says that a solution is 'under development' to prevent self-driving cars from accessing waterlogged areas.
It remains unclear whether the recall will affect Waymo's plans to launch in London this autumn.
Waymo’s robotaxis now operate across several US cities, such as San Francisco, Austin and Miami, providing more than 500,000 trips per week.
However, the company also has ambitions to expand its self-driving services to other markets.
Following a trial period, Waymo intends to operate the first-ever robotaxi service in London starting from September.
Dozens of driverless cabs are already roaming the capital’s streets, mapping out routes ahead of the launch.
For now, the taxis have a safety driver behind the wheel and are not accepting fares, but they will be fully autonomous once the service begins.
But with the September deadline looming, experts have warned that high-profile safety failures risk undermining public trust in this new technology.
Professor Jack Stilgoe, an expert on emerging technologies from University College London, told the Daily Mail: ‘Companies need to show that they aren’t reckless. How they respond to mishaps and crises is absolutely vital.’
In addition to the recall, Waymo has announced that its San Antonio service will remain temporarily suspended following the incident.
However, the company adds that it will resume public rides once the software flaw that allowed the incident has been fixed.
This follows several incidents in which self-driving taxis have become a public nuisance.
In December last year, a blackout in San Francisco caused Waymo’s taxis to stop in place, leading to significant traffic disruption.
Meanwhile, in London, residents of one Spitalfields cul-de-sac have been repeatedly woken at 04:00 on most weekdays by Waymo taxis getting trapped in their street.
Last month, a Waymo crashed through a police cordon put up by officers responding to a double stabbing incident in Harlesden, northwest London.
Incidents resulting in injury are far rarer, and none have yet occurred in the UK, but experts say that more regulation governing robotaxis is urgently needed.
In the UK, legal responsibility following an autonomous vehicle crash is governed by the 2024 Automated Vehicles Act, but elsewhere in the world, the rules are less clear.
‘Even though we’ve seen lots of these things moving around US and Chinese cities, responsibility is still a grey area,’ Professor Stilgoe says.
‘No technology will ever be perfectly safe. Self–driving vehicles have to operate in unpredictable public spaces with other road users, surrounded by all of the complexity of everyday life.
‘Things will go wrong. But regulators can help minimise the risks and put in place processes so that, when mistakes happen, everyone can learn from them.’
Likewise, Professor Siddartha Khastgir, head of Safe Autonomy at the University of Warwick, told the Daily Mail that more transparency is essential.
Professor Khastgir says: ‘Public trust is underpinned by transparent, honest and responsible communications.
‘Like any technology, the concept of absolute safety is a myth. But we can still have safe autonomous vehicles if we can accurately establish and communicate their capabilities and limitations to the users – we call this “informed safety”.’
A Waymo spokesperson told the Daily Mail: ‘We have identified an area of improvement regarding untraversable flooded lanes specific to higher-speed roadways, and have made the decision to file a voluntary software recall with NHTSA related to this scenario.
'We are working to implement additional software safeguards and have put mitigations in place, including refining our extreme weather operations during periods of intense rain, limiting access to areas where flash flooding might occur.'