“Available 24 Hours 7 Days A Week”


Waymo’s Self-Driving Cars Are Now Carrying Passengers in Miami — What That Means If You’re Injured by One

If you’ve driven through Miami recently, especially Downtown and Brickell, you’ve likely spotted a strange white car with large equipment on the roof. When you pass it, you notice something shocking: there is no one behind the wheel! Yet it’s driving calmly, obeying all traffic laws, and carrying a passenger.

Well, it’s not magic. These are autonomous self-driving vehicles owned and operated by Waymo, an American autonomous driving technology company headquartered in Mountain View, California. Waymo is a subsidiary of Alphabet Inc., Google’s parent company.

Waymo has officially opened its self-driving rides to the public in Miami. Thousands of people were on the waitlist. The rides cost about the same as Uber or Lyft. The cars operate within a defined service area and are marketed as safer, calmer, and more predictable than human drivers.
From a technology standpoint, it’s impressive.

However, if you’re reading this, it’s because you’re wondering what happens if one of these hurts you. And from an injury victim’s standpoint, it’s something you need to understand before something goes wrong. Because autonomous cars don’t eliminate accidents. They change how accidents happen — and how responsibility is decided afterward.

Waymo loves to tout that its vehicles have traveled well over 100 million autonomous miles across America. The company also points to reduced rates of serious-injury crashes compared to human drivers. This matters, and it shapes how people view these vehicles.

But history is also important, and Waymo’s history is a big reason why injury cases for self-driving cars are different in a legal sense.

Waymo vehicles have been involved in a number of reported incidents over the past few years in cities such as San Francisco, Phoenix, and Los Angeles. There have been minor incidents, such as low-speed crashes, fender-benders, and a vehicle stopping suddenly and then being rear-ended by a human driver. There have also been more problematic incidents.

For example, in one of the most publicized accidents in San Francisco, a pedestrian was hit by another vehicle and then struck by a Waymo. The Waymo did stop, but the incident raised concerns about how autonomous vehicles behave in chaotic, multi-vehicle accidents. Waymo vehicles have also reportedly been rear-ended after stopping suddenly in response to hazards other drivers did not see, and have been reported blocking traffic and struggling to navigate construction zones and emergency scenes.

These accidents do not mean that autonomous vehicles are reckless. What they point out, however, is an important fact: autonomous vehicles don’t think the way humans do. Instead, they follow rules, probabilities, and priorities. Sometimes this makes them safer. Sometimes it makes them more dangerous.

In addition, from a legal perspective, these past accidents have already affected the way that lawyers, insurers, and judges view crashes involving autonomous vehicles.

First, fault is no longer based on driver behavior. There is no distracted driver. No speeding driver. No drunk driver. Instead, we look to whether the system was safe given what it encountered on the road.

And that brings us to arguments of product liability. Was this system designed to recognize these types of hazards? Was this system able to interpret sensor data correctly? Was this system sufficiently trained to account for real-world driving conditions, such as Miami traffic, aggressive drivers, pedestrians stepping out into the road, or sudden lane closures?

Second, we look at previous incidents to determine whether a company should have foreseen these types of problems. Had the company seen, or should it have seen, that its cars were struggling with certain situations? Sudden braking. Confusion in construction zones. Difficulty interacting with emergency vehicles. These problems are no longer hypothetical. They have been seen.

Another significant legal problem in Waymo-related accidents is data control. In a normal car accident, everything is out in the open. In an accident involving an autonomous vehicle, some of the most critical evidence is hidden from plain sight. Sensor recordings, internal system logs, and timelines of what the vehicle “saw” and how it reacted are all under the company’s control. In previous Waymo-related accidents, access to such evidence has been a significant concern. To preserve it, speed is of the essence; delays may result in overwritten or incomplete evidence. For the victim of a car accident, this is a reminder to seek legal counsel early, because delays can quietly damage a case before it even begins.

Another consideration, based on previous Waymo-related accidents, is the role of insurance. This is not a normal car accident where an adjuster comes out and assesses the driver’s negligence. You’re dealing with corporate-level insurers, technical experts, and people who assess the situation based on technical reports rather than on what witnesses saw.

That’s not to say that victims won’t be compensated. But the process is more complicated, more technical, and sometimes more litigious.

As a passenger in a Waymo vehicle, you face risks of your own. One factor seen in previous incidents is passenger complacency: people trust the technology, so they’re more relaxed and less prepared for an accident, whether a low-speed collision or a sudden stop.

Pedestrians and cyclists are also part of this history. These cars are programmed to be careful, but cities are unpredictable places. Past incidents have shown that it is in edge cases, those unusual and rapidly changing circumstances, that these cars are most likely to go wrong. And it is in those circumstances that serious injury is most likely to happen.

And then there is Miami: traffic, tourists, construction, aggressive driving, narrow streets in places like Little Havana. Waymo’s past experience in other cities will guide how these cases are evaluated, but Miami is a different story.

The main thing that injured parties need to know is that autonomous cars do not let anyone off the hook. They merely shift the hook.

When a Waymo is involved in a collision, the question is not who was at fault or who was driving. The question is the technology, the corporate decisions, the past incidents, and whether the technology met the safety standards promised to the public.

That is not a fight most injured victims should take on alone.

Self-driving cars are part of the future, whether we like it or not. But the law still exists to protect real people when technology fails in the real world.

If you or a loved one have been injured in an accident, call Jaime “Mr. 786 Abogado” Suarez today to Get You Paid!

Free Case Evaluator
