Report: Automated systems need stronger safeguards to keep drivers focused on the road
ARLINGTON, Va. — March 12, 2020 (GLOBE NEWSWIRE) — The Insurance Institute for Highway Safety has issued a set of research-based safety recommendations on the design of partially automated driving systems. The guidelines emphasize how to keep drivers focused on the road even as the vehicle does more of the work.
Today’s partially automated systems still need the driver to be involved at all times. That means they need robust methods of monitoring driver engagement and more effective ways of regaining the driver’s attention when it wanders. Designs should also be based on a principle of shared control, and they should have built-in limits that prevent them from being used on roads and under conditions where it isn’t safe to do so, IIHS researchers say.
As part of that philosophy of shared control, partially automated systems shouldn’t change lanes or overtake other vehicles without driver input. They should also be responsive to driver steering input even when automatic lane centering is engaged.
“Unfortunately, the more sophisticated and reliable automation becomes, the more difficult it is for drivers to stay focused on what the vehicle is doing,” says IIHS President David Harkey. “That’s why systems should be designed to keep drivers actively engaged.”
Under the classification system developed by SAE International, there are six levels of automation, ranging from 0 (no automation) to 5 (fully self-driving). The highest level available in production vehicles today is Level 2. These systems continuously control acceleration, braking and steering to keep the vehicle traveling at a set speed in the center of its lane while maintaining a selected following distance from the vehicle ahead. They require the human driver to remain vigilant and ready to intervene in the event that the system encounters a situation it cannot handle.
Despite these limitations, some designs make it too easy for the driver to rely heavily on the system, and they lack robust methods to make sure he or she remains actively engaged in driving.
Some manufacturers already offer automated lane-changing, and others have announced plans to follow suit. Most systems use only the presence of the driver’s hands on the steering wheel to monitor whether he or she is paying attention. Some seem to discourage the driver from actively sharing in the driving when lane-centering support is engaged.
Only Cadillac’s Super Cruise uses GPS-enabled navigation to restrict its use to specific highways that its engineers believe it can handle. However, Super Cruise doesn’t require the driver’s hands to remain on the wheel at all. Instead it monitors where the driver is looking and issues an alert when the driver’s gaze is diverted for too long. Because the researchers recommend that driver attention be monitored through multiple modes, Super Cruise doesn’t meet all their recommendations either.
“These systems are amazing feats of engineering,” says IIHS Research Scientist Alexandra Mueller, lead author of the IIHS recommendations. “But they all suffer from the same problem: They don’t account enough for the behavior of the human being behind the wheel.”
Too much trust in partial automation is one issue. An IIHS survey suggests that many consumers think Level 2 systems are practically self-driving. But the problem does not disappear when drivers understand the limits of partial automation and consciously resolve to remain focused on the road.
Research has shown that the more sophisticated and reliable automation becomes, the harder it is for a driver to remain vigilant. Fatigue increases, as indicated by longer and more frequent eye blinks, and the driver’s mind is more likely to wander. It takes less physical effort to drive when these Level 2 systems are providing support, even though the driver must still be in full control at all times, supervising both the roadway and the system’s behavior. Studies have shown that this change in the driver’s role increases the temptation to do other things, such as text or check email.
Various high-profile fatal crashes have shown how dangerous such lapses can be. All these systems can fail to follow the road when confronted with situations as common as a hill or curve.
In a fatal crash involving Tesla’s Autopilot system, for instance, a Tesla Model X failed to properly detect the lane markings at an exit ramp and crashed into a highway divider. The Tesla driver, who was killed, was playing a game on his cell phone at the time of the crash.
Following an investigation, the National Transportation Safety Board (NTSB) concluded in February that Autopilot’s limitations, the driver’s overreliance on the technology and his own distraction led to the crash. The NTSB called for the development of standards for driver monitoring systems “to minimize driver disengagement, prevent automation complacency and account for foreseeable misuse of the automation.”
The new guidelines developed by IIHS are a step in that direction. The researchers reviewed dozens of academic studies to develop a series of recommendations for how manufacturers can better ensure that users remain focused on what’s happening on the road. These recommendations should be implemented together, as applying some of them and not others could make systems more dangerous instead of safer.
Some of the recommendations are based on the idea that just because technology can accomplish certain tasks that humans usually perform, that doesn’t mean it should.
The authors point specifically to automatic lane changing and overtaking. The Level 2 systems currently offered by BMW, Mercedes-Benz and Tesla can automatically change lanes when the driver triggers the function with the turn signal. Tesla’s system goes even further. In pre-mapped areas, its Navigate on Autopilot feature can change lanes and even exit the freeway without any trigger from the driver.
When Cadillac updates Super Cruise in 2021, that system will automatically change lanes without requiring the driver’s hands to be on the wheel, the company says.
Even if these systems are capable of performing such maneuvers safely in most situations, drivers are more likely to lose track of what is happening on the road when their role in lane changing and overtaking is reduced to the flick of a lever. A false sense of security may lead drivers to initiate the automatic procedure without first confirming that the adjacent lane is clear, as some user manuals instruct them to do.
More broadly, partially automated steering systems that help keep the vehicle in the center of the travel lane should be designed to share control with the driver as another proactive measure to prevent inattention, Mueller and her coauthors report.
Few of the current systems have this design philosophy. Tesla’s Autopilot discourages active driver participation by canceling the lane-centering function when the driver makes a minor steering adjustment. Several other systems resist driver steering adjustments even when they do not present a safety hazard.
Instead, lane-centering systems should be designed to allow the driver to make steering adjustments without prompting the function to switch off. When the vehicle is in a safe position near the center of the lane, the steering wheel should provide minimal feedback. The system should provide more insistent support when the vehicle drifts toward the edge of the road or drifts into an occupied adjacent lane.
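The shared-control behavior the researchers describe — minimal steering feedback near the lane center, increasingly insistent support toward the lane edge — can be sketched as a simple control curve. This is an illustrative model only; the function name, the cubic ramp and all parameter values are hypothetical and do not come from the IIHS recommendations or any manufacturer's implementation.

```python
def centering_torque(lateral_offset_m, lane_half_width_m=1.8, max_torque_nm=3.0):
    """Illustrative shared-control gain: gentle near the lane center,
    increasingly insistent as the vehicle drifts toward the lane edge.
    All thresholds and units are hypothetical."""
    # Normalized offset: 0.0 at lane center, 1.0 at the lane edge.
    x = min(abs(lateral_offset_m) / lane_half_width_m, 1.0)
    # A cubic ramp keeps feedback minimal near center, strong near the edge,
    # so small driver steering adjustments meet little resistance.
    torque = max_torque_nm * x ** 3
    # Steer back toward the center: torque opposes the direction of drift.
    return -torque if lateral_offset_m > 0 else torque
```

Under such a curve, a driver nudging the wheel near the lane center feels almost no counter-torque, so the system never needs to switch off to accommodate the input, while a drift toward an occupied adjacent lane meets firm resistance.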
This type of design has a secondary benefit.
“Drivers feel more comfortable with systems that don’t fight their input, especially when navigating curves and making other challenging maneuvers,” Harkey says.
Along with these proactive measures, the researchers recommend more robust methods of monitoring whether the driver is paying attention and re-engaging the driver when that focus wanders.
The systems that are currently available either assume the driver is paying attention when his or her hands are on the wheel or use a driver-facing camera to determine if the driver’s head is oriented toward the road, but neither is foolproof. The researchers recommend employing multiple monitoring methods, including using a driver-facing camera and measuring things like manual adjustments to the steering wheel and how quickly the driver responds to attention reminders.
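Fusing several engagement signals rather than relying on one could look something like the toy scoring function below. The signal names, weights and scaling are all assumptions made for illustration; the IIHS recommendations name the kinds of signals but do not prescribe any formula.

```python
def attention_score(gaze_on_road_frac, wheel_torque_events, reminder_response_s):
    """Toy fusion of several driver-engagement signals into one score in [0, 1].
    Signal definitions and weights are hypothetical, for illustration only."""
    # Camera-based estimate: fraction of recent time the gaze was on the road.
    gaze = max(0.0, min(gaze_on_road_frac, 1.0))
    # Manual steering adjustments observed recently (saturates at three).
    hands = min(wheel_torque_events / 3.0, 1.0)
    # How quickly the driver answered the last attention reminder, if any;
    # None means no reminder was needed, which counts as fully responsive.
    if reminder_response_s is None:
        response = 1.0
    else:
        response = max(0.0, 1.0 - reminder_response_s / 5.0)
    return 0.5 * gaze + 0.3 * hands + 0.2 * response
```

The point of combining modes is robustness: a hand resting on the wheel cannot mask eyes off the road, and a forward-facing head cannot mask a driver who never touches the wheel or answers reminders.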
When the driver monitoring system detects that the driver’s focus has wandered, that should trigger a series of escalating attention reminders. The first warning should be a brief visual reminder. If the driver doesn’t quickly respond, the system should rapidly add an audible or physical alert, such as seat vibration, and a more urgent visual message.
Moments later, all three types of warnings should be presented. Throughout this sequence, the urgency of each alert should continue to escalate. If the driver doesn’t respond to the alerts, the system should increase following distance from the vehicle ahead and pulse the brakes to provide a warning that is difficult to ignore.
If the driver still fails to respond, the system should deploy the hazard lights and gradually slow the vehicle to a stop — though a design capable of first moving the vehicle onto the shoulder would be preferable. The driver can interrupt this safe stop procedure at any time by resuming control over the throttle or brake pedal and steering. However, the driver should be locked out from accessing the Level 2 system for the remainder of the drive anytime that the safe stop procedure has been triggered or a maximum number of attention reminders has been reached.
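The escalation sequence described above amounts to a staged state machine: inattention first draws a visual reminder, then added audible or physical alerts, then all three modes, then a brake pulse, and finally the safe stop. A minimal sketch, with stage names and timing thresholds that are purely illustrative (the IIHS recommendations specify the ordering, not these numbers):

```python
from enum import Enum, auto

class Alert(Enum):
    VISUAL = auto()          # brief visual reminder
    VISUAL_AUDIBLE = auto()  # adds an audible or physical alert, e.g. seat vibration
    ALL_MODES = auto()       # visual + audible + physical, with rising urgency
    BRAKE_PULSE = auto()     # increase following distance and pulse the brakes
    SAFE_STOP = auto()       # hazard lights on, slow the vehicle to a stop

def escalate(seconds_inattentive):
    """Map continuous driver inattention time to an escalation stage.
    Thresholds are illustrative assumptions, not from the recommendations."""
    if seconds_inattentive < 4:
        return None  # driver considered attentive
    if seconds_inattentive < 8:
        return Alert.VISUAL
    if seconds_inattentive < 12:
        return Alert.VISUAL_AUDIBLE
    if seconds_inattentive < 16:
        return Alert.ALL_MODES
    if seconds_inattentive < 22:
        return Alert.BRAKE_PULSE
    return Alert.SAFE_STOP

def lockout_required(stage_reached, reminder_count, max_reminders=3):
    """Lock out Level 2 access for the rest of the drive once the safe stop
    has triggered or too many reminders have accumulated."""
    return stage_reached is Alert.SAFE_STOP or reminder_count >= max_reminders
```

One design note: the stages are ordered so that the system never jumps straight to the hardest intervention, but an attentive response at any stage resets the sequence, while reaching the final stage (or exhausting the reminder budget) makes the system unavailable for the rest of the trip rather than simply switching off.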
No manufacturer currently incorporates all these measures. Some systems only monitor one type of behind-the-wheel behavior and do not use an escalation process beyond a visual-audible attention reminder. Others switch themselves off if the driver fails to respond to repeated alerts. If the driver is incapacitated, that would mean that neither the driver nor the lane-centering system is actually steering.
“Because these systems still aren’t capable of driving without human supervision, they have to help prevent the driver from falling out of the loop,” Mueller says.