Don’t Get Left Behind

The race is on to produce fully autonomous vehicles with collision avoidance systems. But would your shop know how to repair them?

Every industry needs to prepare for its next generation of employees, which, ideally, flows in continually. The collision repair industry currently faces an aging workforce in more ways than one. Not only are long-term employees retiring faster than young ones are being hired, but technological advances on newer vehicles become standard so rapidly that current techs struggle to stay on top of them.

Self-driving and fully autonomous vehicles are no longer a fantasy of the future. Today’s technology is rapidly advancing beyond human capabilities, from faster reaction times to the ability to “see” objects around corners.

While the thought of futuristic self-driving cars sounds like we’re almost in the age of “The Jetsons,” the National Highway Traffic Safety Administration (NHTSA) also considers automation the pathway to safer roads and collision avoidance. Ensuring that it actually is will fall on the collision repair industry. Repair technicians are the ones who will need to know when and how calibration is necessary, whether repairs are possible and when parts need replacing.

The U.S. Department of Transportation (DOT) and the NHTSA say they are “fully committed to reaching an era of crash-free roadways through deployment of innovative lifesaving technologies.”

“Many or most of these technologies will find their way into cars well before we have cars that can do the driving completely,” says Donny Seyfer, executive officer at the National Automotive Service Task Force (NASTF). That means it’s time to start learning how to safely repair these systems now.

The collision repair industry already faces an automotive tech boom that requires an adaptive set of skills. Increasingly, autonomous vehicles will demand a new understanding of how technology works to protect our safety in a world that will rely so heavily on machinery and computers.


Technology developments currently underway would allow vehicles to detect environmental factors that even a human driver can’t see, such as a child running into the street from around the corner – even while the child is still behind a wall or a house.

Seyfer says, “The newest thing I have seen is using light radar [LIDAR] to bounce a signal off of a solid object to ‘see’ around a corner and identify objects.”

LIDAR isn’t new to autonomous vehicles. Velodyne and Google are already using it to map the objects surrounding their self-driving vehicles in Arizona. But researchers at Stanford University are extending this existing laser-based technique so that vehicles could see and understand objects around a corner before a driver could. While the technology could have many applications, the researchers are focusing on those for autonomous vehicles.

According to Matthew O’Toole, a postdoctoral scholar in electrical engineering at Stanford and researcher on the project, “Our current focus is on determining whether existing LIDAR systems used by autonomous cars will be able to support looking around corners. The key question is whether there is enough light signal detected by an off-the-shelf LIDAR system to enable this type of imaging. If this is determined to be the case, we hope to eventually reach the stage of working with automotive manufacturers and deploying this imaging technology in cars.”


Existing LIDAR technology works well in the immediate vicinity of the sensor, though with limited detail. The new technology is unique in that it would allow vehicles to “see” objects that are otherwise hidden from view. This requires an advanced system that excels at quickly and accurately processing the reflected light to create a final image.

“It sounds like magic, but the idea of non-line-of-sight imaging is actually feasible,” says Gordon Wetzstein, assistant professor of electrical engineering at Stanford. “You have a laser that shoots a very short pulse of light into the scene. Some of the light is directly reflected, but what we’re looking for are indirect reflections. The light scatters outside the line of sight of the camera.”

And, as David Lindell, a Ph.D. student in electrical engineering at Stanford adds, “As light reflects off the wall, interacts with this unknown object and then comes back to our sensor, we are picking up information about the geometry of this object that we can’t directly observe.”
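The underlying principle the researchers describe is time-of-flight: a light pulse travels at a known speed, so the delay before it returns encodes distance. The sketch below is a deliberately simplified illustration of that arithmetic (a single straight out-and-back bounce, no scattering losses) — it is not the Stanford team’s algorithm, which must untangle far messier indirect reflections.

```python
# Illustrative sketch only: how pulse timing encodes distance in
# time-of-flight sensing. Assumes an idealized single out-and-back bounce.

C = 299_792_458.0  # speed of light, m/s

def path_length_m(round_trip_seconds: float) -> float:
    """One-way path length implied by a round-trip pulse time."""
    return C * round_trip_seconds / 2.0

# Direct reflection off a wall 10 m away:
t_wall = 2 * 10.0 / C

# Indirect reflection: laser -> wall -> hidden object -> wall -> sensor.
# The extra delay beyond t_wall reveals how far the hidden object sits
# past the wall (here, a hypothetical object 3 m beyond it).
t_indirect = t_wall + 2 * 3.0 / C

extra_m = path_length_m(t_indirect) - path_length_m(t_wall)
print(round(extra_m, 6))  # distance from wall to hidden object, in meters
```

In practice the returning indirect signal is faint and smeared across many paths, which is why interpreting it once took hours of computation.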

The problem with the technology in use today, researchers say, is that the reflections that come back don’t accurately portray the objects they’re supposed to reveal, or they take too long to decipher. The Stanford team is developing a more efficient way to interpret that information, cutting the time it takes to decode the returning light signals from hours to seconds, with the goal of making it nearly instantaneous.

The algorithm they’ve developed is compatible with the LIDAR systems already used in self-driving car development to detect objects around a vehicle. But those existing systems are far from perfect.

“The big thing still appears to be cost,” Seyfer says. “Velodyne has a front mounted unit now … perfectly positioned for destruction by flying objects and accidents.” Most LIDAR systems were previously mounted on the roof, but Seyfer sees drawbacks there, too: “One has to wonder how they will do in the sun.”

Currently, the new around-the-corner LIDAR technology needs more work, especially when it comes to seeing objects that are dark in color or moving, like a person in black, a rolling ball or a child running into the street. “It looks to me as if LIDAR is taking so long to become financially viable that RADAR and cameras are grabbing the market share,” Seyfer says.


The Insurance Institute for Highway Safety (IIHS) has conducted studies showing that collision avoidance systems prevent and mitigate many car crashes better than human drivers. Fully autonomous self-driving vehicles have the potential to be safer than those on the road today. But to have fully automated self-driving cars, vehicles must have excellent collision avoidance systems, and collision repair techs need to be able to safely repair or replace these systems at an affordable price.

The need for pre- and post-repair diagnostic scans remains a contested debate within the industry, but it’s possible that developers of self-driving cars have autonomy in mind for more than just driving. “The current systems I’ve seen have self-diagnostic capability and do not require calibration other than that their position is where it was intended to be,” Seyfer notes.

Even if self-diagnosing systems are part of the future, how will we know they’re accurately diagnosing themselves? One day, it may be safe to trust cars to drive themselves and identify their own problems, but these changes come in stages. Today, it falls entirely on the collision repair industry.

According to Seyfer, “The thing that’s rarely discussed is that ‘autonomous’ does not mean that the car is necessarily driving itself. I prefer the terms ‘active driver assistance system’ and ‘self-driving’ to differentiate.”

Still, vehicles on the road now increasingly include autonomous driving and safety features as standard equipment instead of extras. And though self-driving cars aren’t yet safe for the road, active driver assistance systems, many of which are collision avoidance systems, are making their way into new vehicles.

As NHTSA puts it: “Recent negative trends in automotive crashes underscore the urgency to develop and deploy lifesaving technologies that can dramatically decrease the number of fatalities and injuries on our nation’s roadways. NHTSA believes that automated driving systems (ADSs), including those contemplating no driver at all, have the potential to significantly improve roadway safety in the United States.”


SAE International defines six levels of automation, from Zero to Five. At Level Zero, the driver is responsible for all driving and monitoring of the surrounding areas. It’s rare that new vehicles today are made at Level Zero, but there are many existing Level Zero cars on the road.

Level One, called Driver Assistance, may include some assistance systems, but still requires the driver’s full attention. Think of cruise control; the driver is still required to turn it on and off, set the speed and brake when necessary.

Often, vehicles at Levels Zero to One can be repaired mechanically and the majority of system errors can be diagnosed with a universal scan tool.

By Level Two, Partial Automation, the vehicle may have some automated functions, such as braking when it detects an obstacle ahead or steering to keep the car in its lane. However, the driver still needs to monitor the environment and be in control of all driving tasks. Most new vehicles today are at Level Two, and many auto manufacturers are making features at this level the new standard. Think of collision avoidance assistance, or safety tech systems, such as Nissan ProPilot, Honda Sensing, Subaru EyeSight or Toyota Safety Sense.

In March, Ford announced its new Co-Pilot360, which will come standard on all its new passenger cars, SUVs and F-150s. It includes lane-keep assist, blind spot monitoring, automatic emergency braking, automatic high beams and rain-sensing wipers. Standard fitment of these systems is spreading across new vehicles, supported by government policy and consumer purchasing trends.

The DOT, NHTSA and IIHS have organized a voluntary agreement among 20 automakers that pledged to “equip virtually all new passenger vehicles by Sept. 1, 2022, with a low-speed AEB system that includes forward collision warning (FCW), technology proven to prevent and mitigate front-to-rear-end crashes.”

Today’s vehicles can self-diagnose only to the extent of Levels Zero to One, and they often require specialized diagnostic scanning with updated software to identify issues. The pre- and post-repair scan debate arose when production of Level Two cars became more common than Levels Zero to One.


By Level Three, Conditional Automation, the vehicle begins to take responsibility for monitoring its surrounding environment and reacting by automatically braking, steering or accelerating. Often, at slow speeds, Level Three vehicles don’t require driver attention, but the driver always needs to be ready to take control. Some OEMs plan to release Level Three vehicles in the next two years.

At Level Four, High Automation, the car can alert the driver when it is safe to switch to automated driving, but situations such as traffic jams or merging might still require driver attention. In safe circumstances, a Level Four vehicle is capable of monitoring and responding to its surroundings, steering, braking, accelerating and determining when it’s safe to turn, change lanes and control lights and turn signals. Honda aims to have a Level Four vehicle by 2025.

By Level Five, Full Automation, the vehicle doesn’t require pedals or a steering wheel because it is fully autonomous in all conditions.
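For readers who want the levels above in one place, here is an informal cheat sheet expressed in a few lines of code (a paraphrase of the SAE descriptions as summarized in this article, not the standard’s official wording):

```python
# Informal paraphrase of the six SAE automation levels discussed above.
SAE_LEVELS = {
    0: ("No Automation", "Driver handles all driving and monitoring."),
    1: ("Driver Assistance", "Single assists like cruise control; driver fully attentive."),
    2: ("Partial Automation", "Automated braking or lane-keeping; driver monitors everything."),
    3: ("Conditional Automation", "Car monitors and reacts in some conditions; driver stands by."),
    4: ("High Automation", "Car drives itself in safe circumstances; driver may take over."),
    5: ("Full Automation", "No pedals or steering wheel needed; autonomous in all conditions."),
}

for level, (name, summary) in SAE_LEVELS.items():
    print(f"Level {level} - {name}: {summary}")
```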

Because few Level Four vehicles are on the road, most collision repair technicians haven’t had to consider repairs at these levels. Will these vehicles rely on self-diagnosing systems? Will they require even more labor-intensive scanning and advanced systems than those we’re beginning to see today? What will be the costs for parts and repair information?

To get ahead and avoid playing catch-up, the collision repair industry needs to start asking these questions today.