Simplified Proposal for Vehicle Automation Modes
Vehicle Automation Modes emphasize the responsibilities of a self-driving vehicle user
By: Dr. Philip Koopman, Carnegie Mellon University
Now that the AV industry has backed away from SAE Levels, especially the highly problematic Level 3, it's time for a fresh look at operating modes of vehicle automation technology.
If you follow self-driving car technology it’s likely you’ve encountered the SAE Levels of automation. The SAE Levels range from 0 to 5, with higher numbers indicating driving automation technology with more control authority (but not a linear progression, and not necessarily higher levels of safety). Unfortunately, in public discussions there is significant confusion and misuse (even abuse) of that terminology. In large part that is because the SAE Levels are primarily based on an engineering view rather than the perspective of a person driving the car.
We need a different categorization approach. One that emphasizes how drivers and organizations will deploy these vehicles rather than the underlying technology. Such an approach needs to emphasize the practical aspects of the driver’s role in vehicle operation.
If you doubt that another set of terminology is needed, consider the common informal use of the term “Level 2+,” which is undefined by the underlying SAE J3016 standard that sets the SAE Levels. Consider also the fact that different companies mean significantly different things when they say “Level 3.” In some cases Level 3 follows SAE J3016, meaning that the driver is responsible for monitoring vehicle operation and being ready to jump in — even without any notice at all — to take over if something goes wrong. In other cases vehicles described as Level 3 are expected to safely bring themselves to a stop even if the driver does not notice a problem, which is more like a “Level 3+” concept (also undefined by SAE J3016).
Even more importantly, the SAE Levels say nothing about all the safety relevant tasks that a human driver does beyond actual driving. For example, someone has to make sure that the kids are buckled into their car seats. To actually deploy such vehicles, we need to cover the whole picture, in which driving is critical but only a piece of the safety puzzle.
With the recent apparent removal of support for the SAE J3016 level system by the Autonomous Vehicle Industry Association, the time is ripe for revisiting how we talk about the different operational modes for vehicle automation.
We start with the premise that for practical purposes all new vehicles will have some sort of active safety system such as Automated Emergency Braking (AEB), and so we skip a category specifically for vehicles with no driver assistance. (One could use a "No Assistance" mode if desired, but it adds unnecessary clutter for most purposes.) We also include a distinct category for testing to help close the SAE Level 2 Loophole, which lets companies test immature technology without regulatory oversight simply by (improperly) claiming that the presence of a safety driver makes an autonomous driving feature testbed SAE Level 2. There is no mapping to the SAE Levels, because that would import baggage that could compromise safety.
The Four Operational Modes
In creating a driver-centric description of capabilities, the most important thing is not the details of the technology, but rather what role and responsibility the driver is assigned in overall vehicle operation. We propose four categories of vehicle operation:
- Driver Assistance
- Supervised Automation
- Autonomous Operation
- Vehicle Testing
Driver Assistance: A licensed human driver drives, and the vehicle assists.
- Human Role: Licensed driver performs driving task
- Vehicle Role: Active Safety, Driver Support, Driving Convenience
The technology’s job is to help the driver do better by improving the vehicle’s ability to execute the driver’s commands and by mitigating potential harm from some types of impending crashes. Convenience features might also be provided, excluding sustained automated steering.
Driver assistance capabilities might include anti-lock brakes, stability control, cruise control, adaptive cruise control, and automatic emergency braking. The driver always remains in the steering loop, exerting at least some form of sustained control over lane keeping and turns to ensure active engagement and situational awareness.
Momentary intervention in steering by active safety and driver support functions, such as a steering wheel bump at lane boundaries, is considered driver support rather than steering automation. Active safety might momentarily intervene in steering in response to a specific situation, but should not permit itself to be used in lieu of continuous driver control of steering. Completely automated speed control is permitted (e.g., adaptive cruise control).
Supervised Automation: The vehicle controls speed and lane keeping. A human driver handles things the system is not designed to address.
- Human Role: Licensed driver keeps eyes on road, monitors for and intervenes in situations vehicle is not designed to handle, executes turns and other tasks beyond ordinary lane-keeping.
- Vehicle Role: Provides steady cruise functions of lane-keeping and speed control.
The technology normally provides a speed and lane-keeping "cruise" capability when the feature is activated. A licensed human driver is responsible for continuously monitoring driving and for intervening in any situation beyond the stated design capabilities of the system. Those design capabilities exclude turning at intersections and other scenarios beyond traversing the current roadway. The automation might not be capable of handling situations outside its stated capability, which the driver is aware of and accounts for in supervision. The driver is able to take over full control whenever appropriate.
An effective driver monitoring system is required to ensure driver remains situationally aware and is capable of taking over when required for safety. This does not have to mean hands on the wheel. Keeping hands on the wheel might be required for testing, and might be required in vehicles that do not have camera-based driver monitoring systems to ensure driver engagement. But the requirement for Supervised Automation is simply that the driver must be able to respond when needed, and it is up to the feature developer to determine how to accomplish that in an effective manner. In practice with current technology this is likely to mean a camera-based Driver Monitoring System (DMS).
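As a minimal sketch of what "effective driver monitoring" could mean in software, the logic below gates Supervised Automation on a recent attention confirmation from a DMS. The class name, the attention signal, and the three-second timeout are all illustrative assumptions by this author, not requirements of any standard; a real system would add escalating alerts and a safe disengagement sequence.

```python
import time

# Hypothetical threshold for illustration only; not taken from any standard.
ATTENTION_TIMEOUT_S = 3.0

class SupervisedAutomationGate:
    """Illustrative sketch: keep lane-keeping/speed automation engaged
    only while the driver monitoring system (DMS) has recently confirmed
    driver attention."""

    def __init__(self) -> None:
        self.last_attentive = time.monotonic()

    def dms_report(self, driver_attentive: bool) -> None:
        """Called periodically with the DMS attention estimate."""
        if driver_attentive:
            self.last_attentive = time.monotonic()

    def automation_allowed(self) -> bool:
        """True while driver attention was confirmed within the timeout.
        A real system would alert the driver and then disengage safely."""
        return (time.monotonic() - self.last_attentive) < ATTENTION_TIMEOUT_S
```

The key design point is that the gate measures driver engagement directly (per the text, not necessarily hands on the wheel), and automation availability follows from that measurement rather than from a proxy such as steering torque.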
Supervised automation should make it reasonable to expect a civilian driver without specialized training to achieve at least as good a safety record as would be the case without steering automation given comparable other vehicle capabilities and operational conditions. This means that any such vehicle that is not as safe as a human driver (including not only crashes, but violating traffic laws or exhibiting reckless driving at an elevated rate) should be considered to have a defective design. The scope of design relevant to safety is not only the car, but also the human/driver interface.
As a practical matter, this limits use to highway and straight road-following cruise-control style applications where the vehicle does both lane keeping and speed/separation control. If the vehicle can make turns at intersections, with current technology it is beyond what is reasonably safe for civilian driver supervision, and instead is likely to be a road test vehicle. (This paragraph might be considered controversial. However it is the author's best estimate of what is feasible for safe road use by the full demographic of drivers on public roads, assuming an effective DMS can be deployed.)
Autonomous Operation: The whole vehicle is completely capable of operation with no human monitoring.
- Human Role: No Human Driver; steering wheel optional depending on operational concept
- Vehicle Role: Responsible for all aspects of driving and driving-related safety.
The vehicle can complete an entire driving mission under normal circumstances without human supervision. If the operational design domain (ODD) is restricted, the vehicle is responsible for safely handling any exit from the ODD that might occur.
If something goes wrong, the vehicle is entirely responsible for alerting humans that it needs assistance, and for operating safely until that assistance is available. Things that might go wrong include not only encountering unforeseen situations and technology failures, but also flat tires, a battery fire, being hit by another vehicle, or all of these things at once. People in the vehicle, if there are any, might not be licensed drivers, and might not be capable of assuming the role of “captain of the ship.”
Examples of Autonomous vehicles might include uncrewed robo-taxis, driverless last mile delivery vehicles, and heavy trucks in which the driver is permitted to be asleep. A vehicle that received remote assistance would still be exhibiting Autonomous Operation if (a) the vehicle requests assistance whenever needed without any person being responsible for noticing there is a problem, and (b) the vehicle retains responsibility for safety even with assistance. In some cases autonomous operation might change mode to remotely supervised operation if a remote operator becomes responsible for safety.
Achieving safety will depend on the autonomous vehicle being able to handle everything that comes its way, for example according to the UL 4600 safety standard with additional conformance to ISO 26262 and ISO 21448.
Vehicle Testing: A trained safety driver supervises the operation of an automation testing platform.
- Human Role: Trained safety driver mitigates dangerous behaviors, and at times might perform driving.
- Vehicle Role: Automation being tested is expected to exhibit dangerous behaviors.
The vehicle is a test bed for vehicle automation features. Because it is immature technology, the driver must have specialized training and operating procedures to ensure public safety, for example according to the SAE J3018 road testing operator safety standard in accordance with a suitable Safety Management System (SMS).
Any vehicle that might exhibit dangerous behavior beyond the mitigation capability of an ordinary licensed driver (across the full driver demographic), or that requires special qualification and care due to potentially dangerous behavior, is an automation test platform. Anyone operating such a test platform is performing Vehicle Testing. (Alternately, such a platform is a defective Supervised Automation platform that should not be operating on public roads.)
An advantage of this classification approach is that it provides a straightforward way to address driver liability.
- Driver Assistance: As with conventional vehicles.
- Supervised Automation: Absent vehicle defects, the driver is responsible for safe operation. A vehicle defect is implicated when the automation does not perform as described to the driver, including an incorrect response to a scenario said to be handled automatically, and also a failure to respond to a situation the driver has been told is covered automatically. As an example, a vehicle that suddenly swerves into oncoming traffic while performing lane keeping likely has defective automation, absent other overriding considerations.
- Autonomous Operation: The vehicle automation is responsible for safety.
- Vehicle Testing: The organization performing testing is responsible for safety in accordance with a Safety Management System that includes driver qualification, driver training, and testing protocols.
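The liability mapping above can be summarized as a simple lookup. This is purely an illustrative sketch of the classification, not part of any standard; the mode names and responsibility labels below are taken from this article, while the code structure itself is hypothetical.

```python
from enum import Enum

class Mode(Enum):
    """The four operational modes proposed in this article."""
    DRIVER_ASSISTANCE = "Driver Assistance"
    SUPERVISED_AUTOMATION = "Supervised Automation"
    AUTONOMOUS_OPERATION = "Autonomous Operation"
    VEHICLE_TESTING = "Vehicle Testing"

# Primary responsibility for safe operation in each mode, per the
# liability discussion above (absent vehicle defects).
RESPONSIBLE_PARTY = {
    Mode.DRIVER_ASSISTANCE: "human driver",
    Mode.SUPERVISED_AUTOMATION: "human driver",
    Mode.AUTONOMOUS_OPERATION: "vehicle automation",
    Mode.VEHICLE_TESTING: "testing organization",
}

def responsible_party(mode: Mode) -> str:
    """Return the party primarily responsible for safety in a given mode."""
    return RESPONSIBLE_PARTY[mode]
```

One attraction of the scheme is visible in the table: responsibility is a total function of mode, with no "Level 2+"-style ambiguity about who is accountable.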
A single vehicle can operate in multiple modes during one trip. For example, a trip can start in Driver Assistance mode on local roads, switch to Supervised Automation on a limited access highway, and then switch to Autonomous Operation on a designated portion of roads (federal highway, urban downtown, parking garage), as compatible with its design restrictions.
All modes must have provisions for mitigating risk from foreseeable misuse and abuse. That includes ensuring operation of modes within their intended restrictions (e.g., enforcing the J3016 concept of an Operational Design Domain (ODD)).
Mode changes must be done safely. The principle should be that a human driver can take control in a situation in which that can be done safely, but a human driver can never be forced to assume control involuntarily. This implies, for example, that in Autonomous Operation the vehicle must safely stop in a reasonable location if it is unable to continue a mission without demanding human driver takeover. (A human driver, if present, might elect to assume control, but takeover cannot be required to ensure safety.)
Automation must make a best effort to provide the highest level of safety it is capable of even without human intervention, but it is not responsible beyond best effort for aspects of vehicle operation and control outside its currently active mode. The one exception is Vehicle Testing mode: because it involves immature technology, the automation cannot be counted on to provide any function beyond a high-integrity mechanism for the human test driver to assert vehicle control.
Mode confusion is a critical system safety issue. There must be an effective scheme for ensuring that any driver is aware of the current vehicle mode. A mode change should not be permitted without unambiguous determination that any human driver involved has shifted their mental model to match the actual vehicle mode in effect after the transition and is capable of fulfilling the expected human driver role for that mode.
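The two transition principles discussed above (no forced takeover from Autonomous Operation, and no mode change without the driver's confirmed awareness) can be sketched as a transition guard. This is a hypothetical illustration by this author; the class, method, and flag names are invented for the sketch, and a real system would need far richer driver-state evidence than a single boolean.

```python
from enum import Enum, auto

class Mode(Enum):
    DRIVER_ASSISTANCE = auto()
    SUPERVISED_AUTOMATION = auto()
    AUTONOMOUS_OPERATION = auto()

class ModeManager:
    """Illustrative mode-transition guard: a change takes effect only after
    the driver acknowledges the new mode, and a handoff out of Autonomous
    Operation must be initiated voluntarily by the driver."""

    def __init__(self, mode: Mode) -> None:
        self.mode = mode

    def request_transition(self, new_mode: Mode,
                           driver_acknowledged: bool,
                           driver_initiated: bool) -> bool:
        """Return True and switch modes only if the transition is permitted."""
        # A driver can never be forced to assume control involuntarily:
        # leaving Autonomous Operation must be driver-initiated.
        if self.mode is Mode.AUTONOMOUS_OPERATION and not driver_initiated:
            return False
        # Guard against mode confusion: no change without unambiguous
        # driver awareness of the mode that will be in effect afterward.
        if not driver_acknowledged:
            return False
        self.mode = new_mode
        return True
```

Note that the guard rejects, rather than defers, a vehicle-initiated exit from Autonomous Operation; under the principle above, the vehicle's fallback in that case is to stop safely, not to hand control to a human.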
For an earlier version of this approach with more detail relevant to regulators, see Section V of this paper: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3969214