An image from the San Bruno Police Department in California showing a bewildered officer who pulled over a Waymo robot taxi for an illegal U-turn.
A Waymo vehicle drives along Loop 101 in Tempe, Arizona on Nov. 11, 2025.
A Waymo vehicle waits for passengers in the parking lot of Chandler Fashion Center in Chandler on Nov. 11, 2025.

In September, a driverless taxi in Georgia was caught on video illegally passing a school bus as children were being let off. That same month, an autonomous vehicle pulled an illegal U-turn and was stopped by a bewildered police officer in California, and a robotaxi hit and killed a beloved neighborhood cat in San Francisco.

A recent string of mishaps and illegal maneuvers involving driverless vehicles has vexed law enforcement and state officials as they navigate how to hold autonomous cars – and their manufacturers – accountable for mistakes on the road.

Amid the increased scrutiny of autonomous vehicles, one central question has emerged: What happens when a driverless car violates traffic laws?

Vehicles that don't require a human behind the wheel are currently operating in California, Arizona, Texas, Nevada and Georgia. And companies like Waymo and Tesla have begun testing their vehicles in Florida.

Nationwide, over half of all states have passed legislation allowing either testing or public operation of driverless vehicles, though regulation looks different in each jurisdiction, especially when it comes to traffic violations.

Arizona and Texas, for example, have laws in place allowing police officers to cite autonomous vehicles for traffic violations, just like they would if a person was behind the wheel. In California, a new law set to take effect next year would allow officers to issue notices, but it’s unclear whether any penalties will be associated with the paperwork.

Other states, such as Georgia, don't have similar laws, leaving police confused and lawmakers scrambling for a way to regulate the emerging technology.

As companies like Waymo, Tesla and Amazon's Zoox seek to expand their fleets of driverless vehicles, other states will have to decide what restrictions to implement, if any. Many will also have to reconsider whether laws already in place are too lenient, which some safety experts say is the case.

Waymo manufactures the biggest fleet of robot taxis in the United States, with over 2,000 vehicles nationwide. The company says its taxis are designed to respect the rules of the road and that they can be held accountable for mishaps through citations and suspensions.

"We have seen some cases of local law enforcement not knowing what their options or authorities are, but we can and do receive citations and pay them in the places where we operate," the company said in a statement to USA TODAY. "State regulators can also suspend AV operations for valid safety reasons."

Some observers are concerned about the relationship between driverless car companies and the state governments tasked with keeping roads safe.

“For all practical purposes, the companies can do pretty much anything they want, and we just hope they’re sufficiently motivated to be safe,” said Phil Koopman, a professor at Carnegie Mellon University with decades of experience in self-driving car safety.

He added: “There’s no state in the U.S. which holds these companies perfectly accountable.”

‘No driver, no hands, no clue’

In late September, the San Bruno Police Department in California’s Bay Area posted several images on social media showing one of its officers peering into the open window of a driverless Waymo taxi.

The caption said that during a DUI enforcement operation, officers saw the Waymo make an illegal U-turn at a stoplight and pulled it over. Waymo vehicles are designed to stop when they detect police lights and sirens, according to Waymo, which is owned by Alphabet, Google's parent company.

“That’s right … no driver, no hands, no clue,” the police department said in a post about the incident. “Officers stopped the vehicle and contacted the company to let them know about the ‘glitch.’”

The statement added, “Since there was no human driver, a ticket couldn’t be issued (our citation books don’t have a box for ‘robot’).”

The traffic stop drew national headlines and became a prime example of the limbo police find themselves in when they stop a driverless car. There’s no one behind the wheel, so who is penalized for the error?

Beginning on July 1, 2026, officers in California will be able to issue “notices of noncompliance” to driverless car companies related to an alleged traffic violation. But so far, the state has not said what the penalties associated with those notices will look like, and experts are skeptical that there will be any consequences for manufacturers.

“As far as I can tell the notice of noncompliance doesn’t do anything,” Koopman said. “If the school writes a letter to mom and dad and they decide not to do anything, there’s no consequence.”

What is a traffic ticket to a billion-dollar company?

Even in states where officers can cite companies for the traffic violations of their driverless cars, those fines are just a drop in the bucket.

Koopman said while a ticket is a start at accountability – allowing for a written record of infractions and a way to track them – it’s far from a solution.

“For any of these companies, a $100 traffic fine isn’t going to get their attention,” he said.

Meanwhile, police departments say tickets against robot taxis are rarely issued.

Phoenix is the city where Waymo first launched its robot taxis and remains a primary location for the service. Phoenix Police Department spokesperson Brian Bower said in an email to USA TODAY, “I do not know of any incident where an autonomous vehicle has been cited.”

In Atlanta, lawmakers vow to draft regulations for robotaxis

In September, a Waymo taxi in Atlanta was recorded failing to stop as it approached a school bus that was letting children off. The incident drew immediate backlash and led the National Highway Traffic Safety Administration to launch an investigation into hundreds of the company’s self-driving cars.

Video of the incident shows a Waymo taxi moving around a school bus that had its red lights flashing and its stop arm deployed. In a statement, the NHTSA said Waymo taxis log an average of 2 million miles per week and that, based on the frequent use of robot taxis, “the likelihood of other prior similar incidents is high.”

The incident occurred in Georgia, which does not have any laws on the books that give police a way to cite driverless vehicles. But the video sparked outrage among state lawmakers, and some have vowed to take action.

State Sen. Rick Williams, a Republican, told KGW8 that he intends to introduce legislation to hold driverless car companies accountable. Williams was a sponsor of Addy’s Law, which introduced potential jail time and fines for those who illegally pass a school bus that has its lights flashing.

“Driverless cars should be stopped until it can be figured out,” he said, adding, “We should not have this on the road; it’s too dangerous for our children.”

Georgia State Rep. Clint Crowe, a cosponsor of Addy’s Law, said he would like to see companies face penalties for autonomous car violations, like any regular driver would.

“I'm a big fan of new technologies and emerging technologies … but we got to think about how they're going to comply with the law,” he told the outlet. “We're really going to have to rethink who's the responsible party, who's going to be responsible for being in control of that vehicle and who's going to be the operator of that vehicle.”

How safe are autonomous vehicles?

While Waymo, Tesla and other companies leading the self-driving revolution say their vehicles are safer than human-driven cars, experts say there’s not enough data to confirm such a claim.

A study released in December by Swiss Re, a large insurance provider, found that over 25.3 million miles driven, Waymo vehicles were involved in nine property damage claims and two bodily injury claims. Over the same number of miles, human-driven vehicles would be expected to generate 78 property damage claims and 26 bodily injury claims, according to the study.

But research published by the Rand Corp., a nonpartisan global think tank, said self-driving vehicles will need to drive 275 million failure-free miles to be considered as safe as humans. Waymo, which has the largest fleet of robot taxis, hit 100 million driven miles over the summer.

Citing the Rand study, Missy Cummings, a George Mason University professor and expert on driverless car safety, said assertions that autonomous vehicles are safer than humans are "gross statistical overclaiming."

"Humans drive in the trillions of miles every year," she said. "[Autonomous vehicles] are in the low millions, so you can't compare the two right now."

This is where regulators face another quandary: how to allow for innovation while maintaining safety measures. Cummings and Koopman told USA TODAY that autonomous vehicles improve as they encounter different scenarios and are updated. But the more autonomous vehicles there are on the road, and the more time they spend driving in communities nationwide, the more likely it is that an accident will occur.

This is something the companies themselves know and acknowledge.

At a tech conference in San Francisco last month, a reporter asked Tekedra Mawakana, the co-CEO of Waymo, if she believes society would "accept a death potentially caused by a robot."

“I think that society will,” Mawakana said, according to SFGate, adding that the question applies to all autonomous car companies, not only Waymo. “I think the challenge for us is making sure that society has a high enough bar on safety that companies are held to."

Contributing: Elizabeth Weise

This article originally appeared on USA TODAY: Can a robot get a ticket? Misbehaving robotaxis vex officials

Reporting by Christopher Cann, USA TODAY

USA TODAY Network via Reuters Connect