The TuSimple crash raises safety culture questions

A WSJ scoop confirms the TuSimple April 6th crash video and an FMCSA investigation. See: https://www.wsj.com/articles/self-driving-truck-accident-draws-attention-to-safety-at-tusimple-11659346202?st

[Image: truck by open garage door]
This article validates what we saw in this video showing the crash: https://lnkd.in/ebGF4Quv

TuSimple blames human error, but it sounds more like a #moralcrumplezone situation, with serious safety culture concerns if the story reflects what is actually going on there.

A left-turn command was still pending from a disengagement minutes earlier. When the system was re-engaged, it began executing a sharp left turn at 65 mph on a multi-lane highway, crossing another traffic lane and the shoulder before hitting a concrete barrier.
The safety driver reacted about as quickly as one could realistically hope, but could not avoid the crash. Had an adjacent vehicle been positioned slightly differently, the truck would have hit another road user (a white pickup truck in an adjacent lane is visible in the video). That this was a non-injury crash was simply good luck.

If they really insist on blaming the driver for not following procedures perfectly, they should be explaining how their Safety Management System (SMS) will improve procedure compliance. (Was there a two-person pre-engagement checklist? Why was it not followed? Procedure compliance is never 100%, and this hazard was too dangerous to leave to procedural risk mitigation alone -- so what were they thinking?) Instead, we heard only a technical band-aid for this particular failure mode. That falls far short of a reasonable SMS response to such a severe incident.

The article gives signs of a problematic safety culture: "Safety drivers, meanwhile, have flagged concerns about failures in a mechanism that didn’t always enable them to shut off the self-driving system by turning the steering wheel, a standard safety feature, other people familiar with the matter said. Company management dismissed the safety drivers’ concerns, the people said." (This has not been independently investigated, at least not yet; one has to wonder whether it factored into the crash.)

TuSimple has said in its VSSA that it follows the safety standards ISO 26262 and ISO/PAS 21448. However, the article describes a lawsuit that raises doubts. The key bit is an allegation that "he was wrongfully fired after he refused to sign off on safety standards that he said the company had yet to meet."

TuSimple comment before this story ran: https://lnkd.in/evkWiNiy

Excellent reporting by Kate O'Keefe and Heather Somerville

TuSimple's report of the crash given to NHTSA:

"TuSimple's position is that this event does not meet the criteria for reporting under the SGO. Specifically, the L4 ADS was not in operation at any point in the 30 seconds prior to making contact with the concrete barrier.

The event occurred as follows: As the truck was being operated on the highway, within its mapped ODD, the driver and test engineer attempted to engage the ADS. However, the ADS was not functional at that moment due to the computer unit not having been initialized, and should not have been attempted to be activated. In short, this was a failed attempt to engage the system as a result of human error.

The system has now been modified to prevent the ADS from attempting to engage unless the health of the system is fully functional to prevent human error from repeating this event.

When the erroneous attempt to engage occurred, the uninitialized vehicle control unit rotated the steering to the left, causing the truck to veer left. The safety driver took control of the steering, and was able to steer accordingly, but not before the left front truck tire and left front quarter panel came into contact with the concrete barrier to the left of the lanes of travel.

The contact resulted in a scuff to the left tire and damage to the radar unit extending from the left quarter panel."
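The remediation TuSimple describes, refusing engagement unless the system is fully healthy, amounts to a pre-engagement interlock. A minimal sketch of that idea is below; this is purely illustrative (all names are hypothetical, and it is not TuSimple's actual design). Note that a complete fix would also clear stale commands, such as the pending left-turn command, on every disengagement:

```python
from enum import Enum

class SubsystemState(Enum):
    UNINITIALIZED = "uninitialized"
    READY = "ready"
    FAULT = "fault"

class EngagementInterlock:
    """Illustrative pre-engagement interlock: refuse to engage the ADS
    unless every monitored subsystem reports READY. Hypothetical sketch,
    not any vendor's actual implementation."""

    def __init__(self, subsystems):
        # subsystems: dict mapping subsystem name -> SubsystemState
        self.subsystems = subsystems

    def request_engagement(self):
        # Identify any subsystem not fully initialized and healthy
        not_ready = [name for name, state in self.subsystems.items()
                     if state is not SubsystemState.READY]
        if not_ready:
            return (False, "engagement refused: " + ", ".join(not_ready) + " not ready")
        return (True, "engaged")

# The crash scenario: vehicle control unit never initialized
interlock = EngagementInterlock({
    "vehicle_control_unit": SubsystemState.UNINITIALIZED,
    "perception": SubsystemState.READY,
})
ok, msg = interlock.request_engagement()
# ok is False: stale/uninitialized state blocks engagement
```

The point of such an interlock is that it removes reliance on the human operator remembering a checklist step, which is exactly the kind of procedural-only mitigation criticized above.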



