Saturday, September 30, 2023

Cruise publishes a baseline for their safety analysis

Summary: a Cruise study suggests their robotaxis are safer than a young male ride hail driver in a leased vehicle. However, this result is an estimate, because there is not yet enough data to support a firm conclusion.


I am glad to see Cruise release a paper describing the methodology for computing the human driver baseline, which they had not previously done. The same goes for their "meaningful risk of injury" estimation method. And it is good to see a benchmark that is specific to a deployment rather than a US average.

Cruise has published a baseline study for their safety analysis here:
 blog post:  https://getcruise.com/news/blog/2023/human-ridehail-crash-rate-benchmark/
 baseline study: https://deepblue.lib.umich.edu/handle/2027.42/178179
(note that the baseline study is a white paper and not a peer-reviewed publication)

The important takeaways from this in terms of their robotaxi safety analysis are:
  • The baseline is leased ride hail vehicles, not ordinary privately owned vehicles
  • The baseline drivers skew young and male (almost a third are below 30 years old)
  • A "meaningful risk of injury" threshold is defined, but it is somewhat arbitrary. They apparently do not have enough data to measure actual injury rates with statistical confidence (see the injury-rate sketch below). Given that we have seen two injuries to Cruise passengers so far (and at least one other injury crash), this is not a hypothetical concern.
It should be no surprise if young males driving leased vehicles as Uber/Lyft drivers have a higher crash rate than drivers of other vehicles. That is their baseline comparison. In fairness, if their business model is to put all the Uber and Lyft drivers out of work, perhaps that is a useful baseline. But it does not generalize to the overall driving population.
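
To make the "not enough data" point about injury rates concrete, here is a minimal back-of-envelope sketch. This is my own illustration, not anything from the Cruise study; the inputs (about 3 injury crashes over about 5 million miles) are rough assumptions based on the counts mentioned in this post.

```python
# Illustrative only -- these inputs are rough assumptions, NOT figures
# from the Cruise study or any official crash database.
from scipy.stats import chi2

k = 3          # assumed injury crashes (two passenger injuries + one other)
miles = 5e6    # assumed robotaxi miles driven so far

# Exact 95% Poisson confidence interval on the expected event count,
# converted to a rate per million miles.
lower = 0.5 * chi2.ppf(0.025, 2 * k) / (miles / 1e6)
upper = 0.5 * chi2.ppf(0.975, 2 * k + 2) / (miles / 1e6)

print(f"95% CI: {lower:.2f} to {upper:.2f} injury crashes per million miles")
# -> roughly 0.12 to 1.75 per million miles -- a ~14x spread, far too
#    wide to declare the rate better or worse than any human baseline.
```

With so few events, the confidence interval spans more than an order of magnitude, which is why any injury-rate comparison at this point is an estimate rather than a conclusion.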

A conclusion that a Cruise robotaxi is safer (fewer injuries/fatalities) than an ordinary human driver is not quite supported by this study.
  • The baseline is not an "average" human driver unless you only care about Uber/Lyft. If that is the concern, then OK, yes, that is a reasonable comparison baseline.
  • I did not see controls for weather, time of day, congestion, and other conditions in the baseline. Road type and geo-fence were the ODD aspects being used.
  • There is insufficient data to reach a conclusion about injury rates, although that data will arrive reasonably soon
  • We are a long way from insight into how fatality rates will turn out, since the study and Cruise have about 5 million miles, while the San Francisco fatality rate is more like one per 100 million miles (see the fatality-rate sketch below)
  • The Cruise emphasis on "at fault" crashes is a distraction from crash outcomes, which necessarily include the contribution of defensive driving behavior (avoiding not-at-fault crashes)
This study could support a Cruise statement that they are on track to be safe according to their selected criteria. But we still don't know how that will turn out. This is not the same as a claim of proven safety in terms of harm reduction.
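
The fatality-rate situation is even more stark. As another hedged sketch (again my own arithmetic, not Cruise's), here is why roughly 5 million miles says essentially nothing about a hazard that occurs about once per 100 million miles:

```python
import math

miles = 5e6             # assumed robotaxi miles to date
human_rate = 1 / 100e6  # assumed SF baseline: ~1 fatality per 100M miles

# If the robotaxi fleet exactly matched the human fatality rate, the
# expected count in 5M miles is only 0.05, so seeing zero fatalities
# is nearly guaranteed and proves nothing either way.
p_zero = math.exp(-human_rate * miles)  # Poisson P(k = 0)
print(f"P(zero fatalities even at the human rate): {p_zero:.3f}")  # ~0.951

# One-sided 95% upper bound on the rate given zero observed fatalities:
# solve exp(-lam) = 0.05 for the expected count lam, then scale.
lam = -math.log(0.05)                 # ~3.0 expected events
upper = lam / miles * 100e6           # fatalities per 100M miles
print(f"95% upper bound: ~{upper:.0f} fatalities per 100M miles")
# -> ~60 per 100M miles: zero fatalities so far cannot rule out a rate
#    dozens of times worse than the human baseline.
```

In other words, statistical confidence about fatality rates will require on the order of hundreds of millions of miles, not single-digit millions.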

A different report does not build a model and estimate; instead, it compares actual crash reports for robotaxis with crash reports for ride hail cars. It concludes that Cruise and Waymo experienced 4 to 8 times the crash rate of average US drivers, but that their crash rate is comparable to that of ride hail vehicles in California.

https://www.researchgate.net/publication/373698259_Assessing_Readiness_of_Self-Driving_Vehicles

Thursday, September 28, 2023

No, Mercedes-Benz will NOT take the blame for a Drive Pilot crash

Summary: Anyone turning on Mercedes-Benz Level 3 Drive Pilot should presume they will be blamed for any crash, even though journalists are saying Mercedes-Benz will take responsibility.

Drive Pilot usage while playing Tetris. (Source)

There seems to be widespread confusion about who will take the blame if the shiny new Mercedes-Benz Drive Pilot feature is involved in a crash in the US. The media is awash with non-specific claims that amount to "probably Mercedes-Benz will take responsibility." (See here, here, here, and here)

But the short answer is: it will almost certainly be the human driver taking the initial blame, and they might well be stuck with it -- unless they can pony up serious resources to succeed at a multi-year engineering analysis effort to prove a design defect.

This one gets complicated. So it is understandable that journalists on deadline simply repeat misleading Mercedes-Benz (MB) marketing claims without necessarily understanding the nuances. This is a classic case of "the large print giveth, and the small print taketh away" lawyer phrasing.  The large print in this case is "MERCEDES-BENZ TAKES RESPONSIBILITY" and the small print is "but we're not talking about negligent driving behavior that causes a crash." 

The crux of the matter is that MB takes responsibility for product defect liability (which they have anyway -- they have no choice in the matter). But they are clearly not taking responsibility for tort liability related to a crash (i.e., "blame" and related concepts), which is the question everyone is trying to ask them.

Like I said, it is complicated.

The Crash

Here is a hypothetical scenario to set the stage. Our hero is the proud owner of a Drive Pilot-equipped vehicle, and has activated the feature in a responsible way. Now they are merrily playing Tetris on the dashboard, reading e-mail, watching a movie, or otherwise using the MB-installed software according to the driver manual. The car is driving itself, and will notify our hero if it needs help, as befits an SAE Level 3 "self-driving" car feature.

A crash involving another vehicle happens ahead on the road, but that other, crashed vehicle is out of the flow of traffic. So our hero's car sees a clear road ahead and does not issue a takeover request to the driver. Our hero is currently engrossed in a Netflix movie on the dashboard (using the relevant MB-approved app), full of explosions in an action scene, and does not notice.

Meanwhile, in the real world, a dazed child crash victim wanders out of their wrecked vehicle onto the roadway. But our hero's MB vehicle has a hypothetical design defect in that it can't detect children reliably.

Our hero's vehicle hits and kills that child. (I emphasize this is a hypothetical, but plausible, defect. The point is that we, and the owner, have no way of knowing if such a defect is present right now. Other crash scenarios will play out in a similar way.)

Our hero is then charged by the police with negligent homicide (or the equivalent charge, depending on the jurisdiction) for watching a movie while driving. Additionally, a $100M lawsuit is filed against both the driver and Mercedes-Benz for negligent driving under tort law. The judge determines that Mercedes-Benz did not have a duty of care to other road users under the relevant state law, so the tort lawsuit proceeds against the driver alone.

What happens to our hero next? 

Will MB step up and pay to settle the $100M lawsuit? Will they also volunteer to go to jail instead of our hero? They have not actually said they will do either of these things, because their idea of "responsibility" refers to something entirely different.

Tort Law

I will say right here I am not a lawyer. So this is an engineer's understanding of how the law works. But we are doing pretty basic legal stuff here, so probably this is about right. (If you are a journalist, contact law professor William Widen for his take.)

Tort law deals with compensation for a "harm" done to others. Put very simply, every driver has a so-called "duty of care" to other road users to avoid harming them. Failing to exercise reasonable care in a way that proximately causes harm can lead to a driver owing compensation under tort law. Unless there is a clear and unambiguous transfer of duty of care from our hero to the MB Drive Pilot feature, our hero remains on the hook under tort law.

The problem is that the duty of care remains with our hero even after activating Drive Pilot. Pressing a button does not magically make a human driver (who is still required to remain somewhat attentive in the driver seat) somehow not an actual driver under current law.

But, wait, you say. Mercedes says They Take Responsibility!  (This message is currently being splashed across the Internet in breathless reviews of how amazing this technology is. And the technology is truly amazing.)

Well, no -- they do not take responsibility under tort law. Instead, the MB position is that the human driver retains the duty of care for potential harm to other road users when using their Level 3 Drive Pilot system -- even while playing Tetris. Their representative admitted this on stage a couple of weeks ago in Vienna, Austria. I was in the crowd to hear it. Quoting from Junko Yoshida in that article: "Yes, you heard it right. Despite its own hype, Mercedes-Benz is saying that the driving responsibility still rests on the human driver in today’s L3 system."

So what does MB mean when they say they "accept responsibility"? The answer is -- they are not accepting the responsibility you think they are. In particular, they are not accepting liability that stems from negligent driving behavior.

Product Defect/Liability

A statement from Mercedes-Benz issued to the press makes it clear that MB accepts responsibility for product defects, but not tort law liability.  A key phrase is:  "In the context of Drive Pilot, this means that if a customer uses the system as intended and instructed and the system fails to perform as designed, we stand behind our product."  (Source here.)

This means that in our hypothetical scenario, our hero can hire a lawyer, who then hires a team of engineers to look at the engineering of Drive Pilot to see if it "performs as designed." Expect someone to pay $1M+ for this effort, because this is a full-on, multi-year product defect investigation. Maybe they find a defect. Maybe they don't, even if a defect is really there. Or maybe they find a reproducible defect, and it is so complex the jury doesn't buy the story. And maybe the performance as designed is that sensors will only detect 98% of pedestrians, and the victim in this tragic scenario is just one of that unlucky 2%. 

Perhaps the owner manual says our hero should have been alert to dazed crash victims wandering in the travel lane. Even while playing Tetris. (That owner manual is not released yet, so we'll just have to wait and see what it says.) In which case MB will blame our hero for being negligent -- as if one is likely to notice a crash while watching the latest war movie on a high-end sound system. And what if the owner didn't bother to read the manual and instead just took the dealer and dozens of car journalists saying "MB accepts responsibility" at face value? (OK, well maybe that's a different lawsuit. But we're talking about a legal mess here, not a clear acceptance of the duty of care for driving by MB.)

Who is holding the bag here?

It's probably our hero, and not Mercedes-Benz, holding the bag. Even though our hero believed the MB marketing that said it was OK to play video games instead of looking out the front window, believed their claims of superior engineering quality, and so on.

Maybe a product defect can be found. Finding one might influence a jury in a tort law case to blame MB instead of our hero. (But that's not guaranteed. And what if the jury finds a reason to blame both?) Perhaps our hero has robust umbrella insurance coverage, rather than the state minimum, that covers such a loss. Perhaps our hero is so broke they are "judgment proof" (no assets to collect -- but owning a Mercedes-Benz makes that less likely, I'd think).

In effect, what we'll see is that the human driver will almost certainly be presumed guilty of negligence in a crash if watching the dashboard instead of the road, even if the vehicle operational mode is telling them it is OK to do exactly that. The onus will be upon the driver to prove a design defect as a defense. This is difficult, expensive, time consuming, and overall not an experience I would wish on anyone.

And there is the specter of criminal liability for negligent homicide (or the equivalent). Depending on the state, our hero might be charged criminally based on being the operator or even just the vehicle owner (even if someone else is driving!). It depends. Nobody really knows what will happen until we see a case go through the system. But outcomes so far in Level 2 vehicle automation cases are not encouraging for our hero's prospects.

Driving a Level 3 vehicle is only for the adventurous

Perhaps you believe any claim by MB that their feature will never cause a crash while engaged. Ever. Pinkie swear. But they have had recalls for software defects before, and there is no reason to believe this new, complex software is exempt from defects or malfunctions.

We don't know how this will turn out until a few such crashes make their way through the court system. But there is little doubt that it will be a rough ride for both the drivers and the crash victims after a crash while we see how this sorts out.

Anyone turning on Drive Pilot should presume they will be blamed for any crash, no matter what Mercedes-Benz says. As even MB admits, the human driver retains the duty of care for safety to other road users at all times, just as in a Level 2 "autopilot"-style system. Marketing statements made by MB about giving time back to the driver don't change that.

Who knows, maybe MB will decide to stand behind their product and pay out on behalf of customers who find themselves embroiled in tort law cases. But they have not said they will do so. And maybe you believe that MB engineers are so good that this particular software will have no defects AND will never mistakenly cause a crash. But that's not a set of dice I'm eager to roll.

This legal ambiguity is completely avoidable by establishing a statutory duty of care for computer drivers that assigns the manufacturer as the responsible party under tort and criminal law. But until that happens, our hero is going to be left holding the bag.

(Things get even worse if you want to dig into the complexities of the handoff process... but that is a story for another post.)


Update -- The MB Drive Pilot US manual is out, and it doubles down on these issues. Drivers are apparently required to notice "irregularities ... in the traffic situation" -- which means paying attention to the road -- and to react to "obvious circumstances" as well.

US Drive pilot owners page: https://www.mbusa.com/en/owners/manuals/drive-pilot#owners

Drive Pilot supplement: https://www.mbusa.com/content/dam/mb-nafta/us/owners/drive-pilot/EQS_Sedan_Operators_Manual_US.pdf

US MB EQS manual: https://www.mbusa.com/content/dam/mb-nafta/us/owners/drive-pilot/EQS_Sedan_Operators_Manual_US.pdf

US MB S-Class manual: https://www.mbusa.com/content/dam/mb-nafta/us/owners/drive-pilot/S_Class_Operators_Manual_US.pdf

-------------------------

Phil Koopman has been working on self-driving car safety for more than 25 years. https://users.ece.cmu.edu/~koopman/ 

Wednesday, September 27, 2023

Five questions cities should ask when robotaxis come to town

If your city is getting robotaxis, here are five questions you should be asking yourself if you are in local government or an advocate for local stakeholders. Some of the answers to these questions might not be easy if you are operating under a state-imposed municipal preemption clause, which is quite common. But you should at least think through the situations up front instead of having to react later in crisis mode.

(There might well be benefits to your city. But the robotaxi companies will be plastering them over as much media as they can, so there is no real need to repeat them here. Instead, we're going to consider things that will matter if the rollout does not go ideally, informed by lessons San Francisco has learned the hard way.)

Local citizens have questions for the robotaxi / Dall-E 2

(1) How will you know that robotaxis are causing a disruption?

In San Francisco there has been a lot of concern about disruption of fire trucks, emergency response scenes, blocked traffic, and so on. The traffic blockages show up pretty quickly in other cities as well. While companies say "stopping the robotaxi is for safety," that is only half the story. A robotaxi that stays stopped for tens of minutes causes other safety issues, such as blocking emergency responders, as well as disrupting traffic flow.

How will you know this is happening? Do you plan to proactively collect data from emergency responders and traffic monitoring? Or will you wait for Twitter blow-ups and irate citizens to start coning cars? What is your plan if companies cause excessive disruption?

(2) How will you share data with companies that they should be using to limit testing?

For example, you might wish to ask companies not to test near parades, first amendment events, active school zones, or construction areas. Is it easy for companies to access this information, preferably electronically? Are they interested in doing that? Will they follow your requested testing/operational exclusion areas? What do you plan to do if they ignore your requests to restrict testing in sensitive areas?

(3) How will you ensure testing and operational equity?

What if disruption, testing incidents, and other issues are concentrated in historically disadvantaged areas? (This might happen due to company policy, but might instead be an unintended emergent result due to higher-than-average population density and emergency response activity in such areas.)

How will you know whether exposure to testing risk is being imposed in an equitable manner, especially if robotaxi companies claim that where they test is a trade secret?

If robotaxis are being sold based on service to the disabled and for other social goods, how will you be able to measure whether companies are living up to their promises?

(4) How will you issue traffic tickets to a robotaxi?

Some states require a moving violation citation to be issued to a natural person, but robotaxis don't have a driver. Consider proactively moving to get state-level regulations fixed sooner rather than later. Without the ability to ticket robotaxis you might find yourself without viable enforcement tools for the worst robotaxi behaviors that might occur.

(5) How can you productively engage with companies despite municipal preemption laws?

Apparently in some cities there is a good working relationship between city government and robotaxi operators. In San Francisco the city and the companies are practically at war. There are no magic solutions, but trying hard up front to build bridges before things get tense is better than reacting to bad news.

-----------------------------