On 3 June 2022, in San Francisco, California, a Cruise vehicle operating in autonomous mode was involved in an accident with a Toyota Prius. People in both cars sustained injuries as a result of the crash.

According to Automotive News, the crash occurred only one day after regulators in California granted Cruise a critical permit for its autonomous driving operations. The California Public Utilities Commission ruled that Cruise could collect fares from passengers taking rides with no human safety driver behind the wheel. The permit was the first of its kind in The Golden State.

As the story goes, the Cruise car turned left in front of the oncoming Prius. It happened at the intersection of Geary Boulevard and Spruce Street. Officials from Cruise claim the autonomous vehicle stopped before proceeding with the left turn, and the self-driving car was actually sitting still when the Prius hit it.

Cruise went on to say that the driver of the Toyota was speeding. Moreover, the company's report claims the Prius was in a lane that required a right turn, though the human driver proceeded straight through the intersection. A spokesperson for the San Francisco Police Department couldn't confirm or deny Cruise's interpretation of the story since an incident report was either unavailable or not completed in the first place.

Clearly, there is much more information needed in order to sort out exactly what happened in the crash. Carnegie Mellon University professor and autonomous-vehicle safety expert Phil Koopman noted that the specific behaviour of both the human driver and the self-driving car will need to be taken into account. He shared:

"Many people have a word for a driver who cuts in front of them and then stops in the road, and it's not a polite or charitable word. There are a lot of unknowns here. We don't know if the Prius driver intended to turn right, but then swerved to try and avoid the crash with the stationary vehicle, for example. There are just a lot of unknowns."

Cruise shared that the car was in autonomous mode when the crash occurred. However, it didn't clarify whether the person or people in the car were safety drivers, employees, passengers, or a combination. We do know that the occupants were likely not charged for the ride, since the paid service hadn't yet launched at the time of the crash.

Whoever is ultimately found to be at fault, multiple people were injured, and someone is responsible. This just goes to show that even once self-driving cars seem ready for our roadways in some capacity, incidents are almost certain to occur. It will likely be a long and painstaking process to analyse the interaction between human drivers and robot cars to minimise incidents and determine fault.

Just weeks after the crash, a fleet of Cruise vehicles drove to the same intersection and blocked traffic for hours. Cruise apologised for the incident and called it a "technical issue."