
Clear vision needed to make AVs a reality

So simple, yet so critical - keeping optical sensors clean means keeping vehicle occupants safe. By Megan Lampinen

Intelligent vehicles promise tremendous safety benefits and a more relaxing in-car experience, but their intelligence relies entirely on accurate data. Sensors are the star players in today’s highly automated vehicles, providing pivotal visual information about the surroundings. But if anything interferes with the supply of that data, none of those smart capabilities can function.

‘Sensor malfunction – active safety functions disabled’

The problem is, things do happen to these sensors in the course of vehicle use. Mud splatters up on them. Bird droppings fall from above. Even heavy rain has been known to prompt the message: ‘Sensor malfunction – active safety functions disabled’. Keeping sensors clean is a serious business, and one in which dlhBOWLES has invested significant time and resources. The company started out providing the nozzles that spray cleaning fluid on the windshield and headlamps but took a new direction in 2004, when the US government approached it with a request to help clean the sensors on autonomous military vehicles. This was part of the Defense Advanced Research Projects Agency (DARPA) Challenge, a competition to spur revolutionary research in support of autonomous ground vehicles.

“We started playing around and applying our traditional nozzle designs and technology to that application, but they never really reached a point of commercialisation,” explained dlhBOWLES’ Russell Hester, Director of Business Development. “It wasn’t until about five years ago that we started to see an interest in cleaning the cameras on modern vehicles.”

When a bug is splattered on the windshield, a human driver can move his head and look around the bug. But for a camera or sensor, that single insect can obscure the entire sensor or half the sensor, easily. If that’s a critical sensor that the vehicle is using to navigate, it’s driving blind

Early adopters

Ford was an early adopter and the first to launch the company’s camera wash nozzle. The system uses the same traditional washer fluid for cleaning the cameras as it does for cleaning the windshield. As this contains an antifreeze component, it can help with the removal of ice and snow as well as elements like mud, dirt, insects and pollen.

“Ford has now begun to proliferate that on many of its vehicle platforms,” Hester added. So, too, have a handful of others. Today, dlhBOWLES provides camera cleaning systems for about 25 vehicle platforms in production. New legislation in the US could prove a major boon to business – as of May 2018, all new vehicles are required to have a rear-view camera, and that camera will need to be kept clean somehow. Attention has recently begun to turn to other optical sensors, namely LiDAR, which are finding their way onto vehicles in support of advanced driver assistance systems (ADAS). “We are applying that same methodology and philosophy to clean those optical sensors,” Hester noted.

dlhBOWLES isn’t the only company out there providing cleaning technology, but it believes it has an edge on the competition: its system consumes, on average, about 30% less fluid than rival offerings. “That might not sound significant, but when you’re dealing with the weight associated with carrying around extra washer fluid in the sensor systems, that becomes a larger issue for the vehicle platform engineers,” he emphasised.

Towards autonomy

Most of the camera wash systems today are manually activated by the driver but this will change. “There will come a time when the sensors themselves will self-evaluate and they will be able to detect that there’s something obscuring their view,” predicted Hester. “At that point they will activate the washer pump, and then the washer pump will spray and clean off whatever that debris is.”

Those situations with graffiti or stickers on a road sign are clearly very difficult for the vehicle to sort out. The same problem exists if you have a smudge of dirt on the camera as it’s looking at a stop sign. That just highlights the importance of keeping that sensor clean all the time

Ford already offers automatic lens washers with the front and rear cameras on its Edge and Explorer models, and the technology was flagged by Forbes as one of the ‘hottest new-car features’ at the end of 2016. The responsibility for providing that functionality doesn’t lie with dlhBOWLES, however. “That would probably fall to the sensor supplier, the camera manufacturer or perhaps the OEMs… They would be writing that software and ensuring the camera algorithms are doing the object detection. That same programme would be responsible for deciding when the sensor was operating in a degraded state,” elaborated Zack Klein, dlhBOWLES’ Lead Engineer – Camera & Sensor Wash. “LiDAR is a little more straightforward, in that it is continually providing distance measurements. If you have a persistent reading one inch in front of a sensor, then it can determine there is something on the lens and it needs to self-clean.”
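Klein’s description of LiDAR self-diagnosis – a return that sits persistently an inch in front of the sensor must be on the lens itself – lends itself to a simple sketch. The class, thresholds and names below are purely illustrative assumptions, not any supplier’s actual implementation:

```python
# Illustrative sketch of the LiDAR blockage heuristic described above:
# if the closest return sits at near-zero range for enough consecutive
# frames, assume debris on the lens and trigger a wash cycle.
# All names and threshold values are hypothetical.

BLOCKAGE_RANGE_M = 0.03      # ~1 inch: a return this close is on the lens
PERSISTENCE_FRAMES = 50      # consecutive frames required before acting

class BlockageDetector:
    def __init__(self) -> None:
        self.consecutive = 0

    def update(self, min_range_m: float) -> bool:
        """Feed the closest return from each LiDAR frame.

        Returns True when a wash cycle should be triggered."""
        if min_range_m < BLOCKAGE_RANGE_M:
            self.consecutive += 1
        else:
            self.consecutive = 0   # one clear frame resets the counter
        return self.consecutive >= PERSISTENCE_FRAMES
```

Requiring persistence over many frames is the key design choice: a single spurious near-range return (spray from another vehicle, a passing insect) should not fire the washer pump.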

However, very little change would be required in the technical functionality of the cleaning system to make it applicable for fully autonomous vehicles. “If you took a current Ford or BMW sedan and you wanted to make that an autonomous vehicle, it would take very little from a sensor cleaning standpoint to get that up to snuff. That’s where our decades of experience comes into play – optimising the spray and integrating the new cleaning system into the pre-existing system,” said Hester. “That said, the specific way that the customers are packaging the sensors or how the OEMs themselves decide to style the sensors into the vehicle might pose their own challenges.”

No room for mistakes

There’s little room for mistakes in this area, as a malfunctioning cleaning system could have serious implications. “When a bug is splattered on the windshield, a human driver can move his head and look around the bug. But if you’re talking about a camera or a sensor, that single insect can obscure the entire sensor or half the sensor, easily. If that’s a critical sensor that the vehicle is using to navigate, it’s driving blind,” warned Klein.

Obscured sensors can be as dangerous as a malicious hack – or even more so. As an example, Klein likens mud on a sensor to deliberate road sign manipulation. Vehicle detection systems must be able to see road signs clearly before they can correctly classify them. Hacks have shown that something as simple as graffiti or stickers placed over a sign can trick an autonomous vehicle into misclassifying or ignoring it. “Those situations with graffiti or stickers on a road sign are clearly very difficult for the vehicle to sort out. The same problem exists if you have a smudge of dirt on the camera as it’s looking at the stop sign,” he explained. “And it’s not just that one stop sign, it’s every stop sign the vehicle sees. That just highlights the importance of keeping that sensor clean all the time.”

There will come a time when the sensors themselves will self-evaluate and they will be able to detect that there’s something obscuring their view, activate the washer pump, and the washer pump will spray off that debris

Sensor fusion is another big trend that could potentially shape dlhBOWLES’ strategy, and the redundancy provided could help with challenging situations.

“There could conceivably be a bit of debris within the view of the optical sensor and it could potentially misunderstand what an object is due to that. However, sensor fusion is being discussed, where you can generate a three-dimensional landscape of the surroundings based on using three LiDARs and six HD cameras at different views on different portions of the vehicle,” explained Hester. “That redundancy will be absolutely necessary in addition to what we believe is necessary with the cleaning systems that we produce.”
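The redundancy Hester describes can be sketched as a simple cross-check: sensors with overlapping fields of view effectively vote on what they see, and a sensor that dissents from the majority becomes a candidate for cleaning. The function and sensor names below are hypothetical, used only to illustrate the idea:

```python
# Hypothetical sketch of cross-sensor redundancy: sensors covering the
# same region vote on whether an object is present; a dissenting sensor
# is flagged as possibly obscured. Names are illustrative only.
from collections import Counter

def flag_suspect_sensors(detections: dict[str, bool]) -> list[str]:
    """detections maps a sensor id to whether it sees the candidate object.

    Returns the sensors that disagree with the majority view."""
    votes = Counter(detections.values())
    majority = votes.most_common(1)[0][0]
    return [sid for sid, seen in detections.items() if seen != majority]
```

In a real system a dissenting sensor would not be trusted or washed after a single frame; as with the debris example above, the disagreement would need to persist before the cleaning system intervened.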

In the interim

Before fully autonomous vehicles make their way onto public roads, however, there will be an interim period in which cars operate at lower levels of autonomy. In these cases, drivers will need to actively monitor the vehicle while it performs certain functions itself. It is in this interim environment that dlhBOWLES sees considerable potential for its technology.

“The vehicle will need to verify that the driver is awake or at the very least is present. There will be cameras on the inside of the vehicle to do that, and there could be some scenarios where we take our air-blow nozzles and clear potential dust or debris that occur on the inside for these sensors,” concluded Hester.

This article appeared in the Q1 2018 issue of Automotive Megatrends Magazine.