Friday, October 21, 2022

AV Safety with a Telepresent Driver or Remote Safety Operator

Some teams propose to test or even operate autonomous vehicles (AVs) with a telepresent driver or remote safety operator.  Making this safe is no easy thing.

[Image: woman wearing VR goggles holding a steering wheel]

Typically the remote human driver/supervisor is located at a remote operating base, although for cargo-only AV configurations they sometimes operate from a chase vehicle closely following the AV test platform.

Beyond the considerations for an in-vehicle safety driver, telepresent safety operators have to additionally contend with at least:

· Restricted sensory information, such as potentially limited visual coverage, lack of audio information, lack of road feel, and lack of other physical vehicle cues, depending on the particular vehicle involved. This could cause problems with reacting to emergency vehicle sirens and to physical vehicle damage that a physically present driver might detect, such as a tire blow-out, unusual vibration, or strange vehicle noise. Lack of road feel might also degrade the driver's ability to remotely drive the vehicle to perform a fallback operation in an extreme situation.

· Delayed reaction time due to round-trip transmission lag. In some situations, tenths or even hundredths of a second of additional lag might make the difference between a crash and a recovery from a risky situation (see the back-of-envelope sketch after this list).

· The possibility of wireless connectivity loss. Radio frequency interference or loss of a cell tower might interrupt an otherwise reliable connection to the vehicle. Using two different cell phone providers can still leave redundancy gaps due to shared infrastructure such as cell phone towers,[1] cell tower machine rooms (for some providers), and shared backhaul fiber bundles.[2] A single infrastructure failure or localized interference can disrupt multiple connectivity providers serving one or many AVs.
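To make the lag concern concrete, here is a back-of-envelope sketch of how far a vehicle travels while a command or video frame is still in flight. The speeds and lag values are illustrative assumptions, not measurements from any particular deployment:

```python
# Back-of-envelope sketch: distance traveled while a remote command or video
# frame is still in flight. The speeds and round-trip lag values below are
# illustrative assumptions, not measurements from any particular deployment.

def extra_travel_m(speed_kph: float, round_trip_lag_s: float) -> float:
    """Distance covered during the added round-trip transmission lag."""
    speed_mps = speed_kph / 3.6
    return speed_mps * round_trip_lag_s

for speed_kph in (40, 70, 100):          # assumed urban / suburban / highway speeds
    for lag_s in (0.05, 0.25, 0.75):     # assumed round-trip lags in seconds
        print(f"{speed_kph:>3} km/h, {lag_s * 1000:>3.0f} ms lag -> "
              f"{extra_travel_m(speed_kph, lag_s):5.1f} m of added travel")
```

With these assumed numbers, even a quarter second of lag at highway speed adds roughly seven meters of travel before the operator's input can take effect, on top of normal human perception-reaction time and the vehicle's own actuation delays.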

Role of remote safety operator

Achieving acceptable safety with remote operators depends heavily on the duties of the remote operator. Having human operators provide high-level guidance with soft deadlines is one thing: “Vehicle: I think that flag holder at the construction site is telling me to go, but my confidence is too low; did I get that right? Operator: Yes, that is a correct interpretation.” However, depending on a person to take full control of remotely driving a vehicle in real time with a remote steering wheel at speed is quite another, and makes ensuring safety quite difficult.
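As a concrete illustration of the first, "soft deadline" style of remote assistance, here is a minimal sketch. The message format, timeout, and function names are my own hypothetical choices, not anyone's actual interface; the key property is that the vehicle holds a safe state while it waits, and a late or missing answer defaults to the conservative option rather than to motion:

```python
# Hypothetical sketch of high-level guidance with a soft deadline. The names,
# message format, and 10 second timeout are illustrative assumptions.

from dataclasses import dataclass
import queue

@dataclass
class GuidanceRequest:
    question: str        # e.g., "Is the flag holder waving me through?"
    options: tuple       # allowed operator answers
    safe_default: str    # what to do if no timely, valid answer arrives

def ask_remote_operator(req: GuidanceRequest, answers: "queue.Queue[str]",
                        soft_deadline_s: float = 10.0) -> str:
    """Return the operator's answer, or the conservative default on timeout."""
    try:
        answer = answers.get(timeout=soft_deadline_s)
    except queue.Empty:
        return req.safe_default          # no answer: keep waiting safely
    return answer if answer in req.options else req.safe_default

# Usage: the vehicle stays stopped unless the operator explicitly confirms.
req = GuidanceRequest("Flag holder appears to wave me through - proceed?",
                      options=("proceed", "wait"), safe_default="wait")
answers: "queue.Queue[str]" = queue.Queue()
answers.put("proceed")                   # simulated operator confirmation
print(ask_remote_operator(req, answers)) # -> "proceed"
```

The contrast with real-time remote driving is that nothing safety-critical hinges on the answer arriving within a fraction of a second.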

A further challenge is the inexorable economic pressure to have remote operators monitor more than one vehicle. Beyond being bad at boring automation supervision tasks, humans are also inefficient at multitasking. Expecting a human supervisor to notice when an AV is getting itself into a tricky situation is made harder by monitoring multiple vehicles. Additionally, there will inevitably be a situation in which two vehicles under the control of a single supervisor need concurrent attention, even though the operator can only handle one AV in a crisis at a time.
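A rough probability sketch shows how quickly concurrent demands become expected rather than exceptional as more vehicles are assigned per operator. The per-minute attention probability used below is an assumption for illustration, not fleet data:

```python
# Illustrative only: if each of n supervised AVs independently needs operator
# attention during a given minute with probability p, how often do two or more
# need it in the same minute? The value p = 0.02 is an assumption, not data.

def p_two_or_more(n: int, p: float) -> float:
    """P(at least two of n vehicles demand attention in the same interval)."""
    p_none = (1 - p) ** n
    p_exactly_one = n * p * (1 - p) ** (n - 1)
    return 1.0 - p_none - p_exactly_one

for n in (2, 5, 10, 20):
    print(f"{n:>2} vehicles per operator: {p_two_or_more(n, 0.02):.4f} per minute")
```

Even with these modest assumed numbers, the chance of a clash grows rapidly with fleet size, and per-minute odds accumulate over the course of a full shift.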

There are additional legal issues to consider for remote operators. For example, how does an on-scene police officer give a field sobriety test to a remote operator after a crash if that operator is hundreds of miles away – possibly in a different country? These issues must be addressed to ensure that remote safety driver arrangements can be managed effectively.

Any claim of testing safety with a telepresent operator needs to address the issues of restricted sensory information, reaction time delays, and the inevitability of an eventual connectivity loss at the worst possible time. There are also hard questions to be asked about the accountability issues and law enforcement implications of such an approach.

Active vs. passive remote monitoring

A special remote monitoring concern is a safety argument that amounts to: "the vehicle will notify a human operator when it needs help, so there is no need for any human remote operator to continuously monitor driving safety." Potentially the most difficult part of AV safety is ensuring that the AV actually knows when it is in trouble and needs help. Any argument that the AV will call for help is unpersuasive unless it squarely addresses the issue of how the AV will know it is in a situation it has not been trained to handle.

The source of this concern is that machine learning-based systems are notorious for false confidence. In other words, saying an ML-based system will ask for help when it needs it assumes that the most difficult part to get right (knowing the system is encountering an unknown, unsafe condition) is already working perfectly during the very testing being performed to see whether, in fact, that most difficult part works. That type of circular dependency is a problem for ensuring safety.
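To see why this is circular, consider the kind of monitor such an argument implicitly assumes. This is a hypothetical sketch, not any company's actual code; the names and threshold are assumptions. A confidence threshold can only flag trouble if the confidence estimate itself is trustworthy on exactly the novel situations that are hardest to validate:

```python
# Minimal hypothetical sketch of the monitor that a "the AV will call for help"
# argument implicitly relies on. The weak point is the confidence estimate
# itself: ML classifiers are often highly confident precisely on inputs unlike
# anything in their training data, so this check inherits the very failure
# mode it is supposed to catch.

def should_call_for_help(class_confidences: list[float],
                         threshold: float = 0.90) -> bool:
    """Ask the remote operator when the top-class confidence is low.

    'threshold' is an illustrative assumption; a poorly calibrated model can
    report 0.99 confidence on a scene it has never been trained on, in which
    case this function silently returns False and no help is requested.
    """
    return max(class_confidences) < threshold

# A misclassified, never-seen-before scene can still produce [0.99, 0.01],
# so the vehicle proceeds without asking -- the circular dependency above.
print(should_call_for_help([0.99, 0.01]))   # -> False
```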

Even if such a system were completely reliable at asking for help when needed, the ability of a remote operator to acquire situational awareness and react to a crisis quickly is questionable. It is better for the AV to have a validated capability to perform Fallback operations entirely on its own rather than relying on a remote operator to jump in and save the day. Until autonomous Fallback capabilities are trustworthy, a human safety supervisor should continuously monitor and ensure safety.

Any remote operator road testing that claims the AV will inform the remote operator when attention is needed should be treated as an uncrewed road testing operation as discussed in book section 9.5.7. Any such AV should be fully capable of handling a Fallback operation completely on its own, and only ask a remote operator for help with recovery after the situation has been stabilized.
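In sketch form, the ordering argued for here looks something like the following. The interface names and one-second heartbeat budget are assumptions for illustration, not a real system's design:

```python
# Sketch of "Fallback first, remote help second": the vehicle detects trouble
# (or link loss) and completes its own Fallback before asking a remote
# operator to assist with recovery. Names and timeout are assumptions.

import time

HEARTBEAT_TIMEOUT_S = 1.0          # assumed link-watchdog budget

class VehicleInterface:
    """Hypothetical on-vehicle interface; method names are assumptions."""
    def needs_fallback(self) -> bool: return False
    def is_stabilized(self) -> bool: return True
    def execute_fallback(self) -> None: print("executing on-board Fallback")
    def request_remote_recovery(self) -> None: print("asking operator for recovery help")

def supervise(last_heartbeat_s: float, vehicle: VehicleInterface) -> None:
    """The vehicle stabilizes itself before any remote help is requested."""
    link_lost = (time.monotonic() - last_heartbeat_s) > HEARTBEAT_TIMEOUT_S
    if link_lost or vehicle.needs_fallback():
        vehicle.execute_fallback()             # e.g., reach a minimal risk condition
        if vehicle.is_stabilized():
            vehicle.request_remote_recovery()  # recovery help only after stabilizing
```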


[1] For example, a cell tower fire video shows the collapse of a tower with three antenna rows, suggesting it was hosting three different providers. 
See: https://www.youtube.com/watch?v=0cT5cXuyiYY

[2] While it is difficult to get public admissions of the mistake of routing both a primary and backup critical telecom service in the same fiber bundle, it does happen.
See: https://www.postindependent.com/news/local/the-goof-behind-losing-911-service-in-mays-big-outage/

This is an adapted excerpt (Section 9.5.3) from my book: How Safe is Safe Enough? Measuring and Predicting Autonomous Vehicle Safety

