If a car breaks a traffic law but no one’s driving, who gets the ticket?
That’s the puzzle police in San Bruno, California, faced when they pulled over a Waymo robotaxi that made an illegal U-turn.
The scene was almost surreal: officers peering into an empty driver’s seat.
“Since there was no human driver, a ticket couldn’t be issued (our citation books don’t have a box for ‘robot’),” the department joked online.

Instead, they notified Waymo, hoping a software tweak would stop the car from repeating the stunt.
Waymo insists its “Driver” AI is built to follow traffic laws. Still, glitches happen.
Who’s Liable For Driverless Cars?
Earlier this year, the company recalled over 1,200 cars after they kept colliding with stationary objects like gates and chains.
And last year, regulators opened an investigation into reports of Waymos behaving erratically in traffic.
Help is on the way for law enforcement, though.

Starting in 2026, California police will be able to issue a “notice of noncompliance” directly to the company whenever a driverless car breaks the law.
Firms will also be required to provide first responders with a hotline and to move their vehicles within two minutes when ordered.
Until then, the question lingers: when your ride is a robot, who really takes the blame?