LiDAR has been a proven technology since the 1960s, with early applications including mapping for architectural and archaeological purposes – the Apollo 15 mission, NASA’s ninth crewed Apollo mission, made use of LiDAR to map the lunar surface. But it wasn’t until the turn of the millennium that the automotive industry started to pay heed. LiDAR’s potential for continuous and highly accurate 3D scanning made it attractive to those involved in early autonomous vehicle development.
Since then, the technology has improved, and top-mounted LiDAR has become a near-regular sight on the public roads used for testing by the likes of Waymo. These systems are far from mass-producible, however, at one point costing US$75,000 a unit. The issue of cost means the debate continues over whether LiDAR really has a place in the automotive industry. One notable voice that continues to snub the technology is Tesla Chief Executive Elon Musk – at a 2015 press conference he dismissed LiDAR as unnecessary, and in 2017 went on to claim that Tesla vehicles could achieve SAE Level 5 autonomy within two years using a combination of far cheaper sensors such as radar and cameras.
Predictably, sound bites from the Tesla chief have prompted strong reactions from some, including GM’s Scott Miller, Director of Autonomous Vehicle Integration, who in late 2017 said the Tesla chief’s autonomous driving claims were “full of crap”.
Essential for autonomy
Jason Eichenholz, Co-Founder and Chief Technology Officer at LiDAR developer Luminar Technologies, says there can no longer be any debate. “LiDAR is absolutely needed for truly autonomous driving,” he says. That statement was underlined by confirmation that Luminar LiDAR will appear in Toyota’s Platform 2.1 and 3.0 autonomous test vehicles. “As soon as autonomous vehicles start moving off closed tracks and into the real world, they need to be five nines reliable,” notes Eichenholz, referring to the need for 99.999% reliability. “For that, you need LiDAR. Cameras and radars lack the performance needed to react in time to the edge cases seen in real world driving scenarios – for this, cars need to be able to see in 3D like humans do.”
“Humans do not lend equal weight to everything. Our visual cortex means we can quickly evaluate, prioritise and focus on what’s relevant. iDAR does the same to dynamically track targets and objects of interest, whilst always critically assessing general surroundings.” – Luis Dussan, AEye
By edge cases, Eichenholz is referring to unexpected situations which can spontaneously arise on the road, such as a child walking out from between two parked cars, road-works shutting down a traffic lane, or a semi trailer painted the same colour as the sky pulling in front of a driver – a situation which a 2D scanner might have trouble interpreting correctly. The ability to adapt makes humans well-suited to dealing with these situations. The same cannot be said for machines.
“What happens if, for example, a car has pulled over on to the side of the road, but is sticking three feet out into the driving lane?” he continues. “As humans, we can see that, and make an adjustment. High-fidelity vision provided by LiDAR means that vehicles too can understand the world around them, and make decisions before it’s too late.”
Tom Laux, Head of Business Development and Sales for Continental’s Segment High Resolution Flash LiDAR, says that whilst it is not up to suppliers to define whether the industry requires LiDAR, it is clear that five nines performance can’t be achieved via a single sensing modality. “What’s needed is the combination of three sensors,” he suggests: “That is, a 2D sensor, like a camera, to image the colour of signals, traffic signs and lane markings. Then a radar to determine velocity, and finally a LiDAR to give accurate angular resolution and 3D imaging in a far more precise manner.”
But how to bring down the cost to a level OEMs could take seriously? The answer lies in solid-state solutions, which Laux defines simply as systems free of moving parts. Cost falls because the mechanical scanning assembly is replaced by components built on a microchip – and, as Laux explains, anything built using semiconductor manufacturing technology is subject to the same cost curve, driven by production scale. “In other words,” he says, “the more you produce, the less expensive these things become over time.” The other key benefit of eliminating mechanical parts is far better reliability.
But, says Eichenholz, the idea of ‘solid-state’ shouldn’t be over-emphasised – what matters is that suppliers deliver scalable systems that can be manufactured at volume while still hitting performance requirements. Laux agrees, suggesting costs can only be squeezed once the performance is there. Luminar’s system, explains Eichenholz, is built from the ground up, which has helped the company tackle the numerous challenges it encountered early on, including range issues.
“One of the key performance targets is detecting a target that’s only 10% reflective,” he says. “This means a dark object, like a tyre or a person wearing a black hoodie, at 200 metres. That number is one we receive from every customer, roughly equivalent to seven seconds.” Laux agrees that the industry’s figure falls between 200 and 250 metres. In addition to this, systems must be immune to outside interference from weather, and other LiDAR systems. To meet the performance requirements within cost, suppliers have to innovate.
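Eichenholz’s “seven seconds” follows from simple arithmetic, assuming it refers to warning time at roughly highway speed – the 100 km/h figure below is an illustrative assumption, not stated in the article:

```python
# Warning time a given detection range buys at a constant closing speed.
# The 100 km/h figure is an assumed highway speed, not from the article.
def warning_time_s(range_m: float, speed_kmh: float) -> float:
    """Seconds until a vehicle covers `range_m` at `speed_kmh`."""
    return range_m / (speed_kmh / 3.6)  # convert km/h to m/s

print(warning_time_s(200, 100))  # 200 m at ~100 km/h: 7.2 seconds
```

At the 250-metre upper end Laux mentions, the same assumed speed yields nine seconds.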
“The 3D FLASH LiDAR fires a single laser pulse for every frame. The laser pulse is like the flash on a camera. As a result, it doesn’t matter if it’s dark, or raining. The flash is an illuminator.” – Tom Laux, Continental
Building from the ground up has led Luminar to make a fundamental architecture change, increasing the wavelength of the laser from 905 nanometres to 1550 nanometres, enabling ten times more power whilst remaining within safety limits for the human eye. The original 905 nanometre requirement, says Eichenholz, was a result of using silicon receivers in the LiDAR system. It was assumed that moving to an alternative like indium gallium arsenide (InGaAs) was too expensive. That, says Eichenholz, was a false assumption.
“We’ve built our system from the ground up,” he says, “and have gone deep into the supply chain to optimise manufacturability. Our laser sources, for example, are very scalable because they’re the same that are used in telecommunications devices, hence why we’re at 1550, and using laser diodes and InGaAs receivers.”
Continental, too, is leveraging InGaAs in its Hi-Res 3D FLASH LiDAR system, originally developed by LiDAR developer Advanced Scientific Concepts prior to its acquisition by the Tier 1 giant. “The 3D FLASH LiDAR fires a single laser pulse for every frame,” explains Laux. “The laser pulse is like the flash on a camera, and as it leaves the rod, it passes through something called a diffuser, which spreads the light across the field of view of the receiver. As a result, it doesn’t matter if it’s dark, or raining. The flash is an illuminator.”
Meanwhile, San Francisco start-up AEye is drawing attention to the embedded artificial intelligence (AI) it uses in its solid-state offering. “LiDAR on its own lacks intelligence,” says Luis Dussan, Founder and Chief Executive. “Existing systems don’t take into account how a scene evolves or what the mission is. They look everywhere, always, in a fixed scan mode, collecting as much data as possible without discretion. 75% to 95% of data is discarded because it’s useless, and collecting unneeded data creates a huge strain on bandwidth, which translates into a delayed response time – a critical safety liability.”
AEye’s iDAR system (Intelligent Detection and Ranging) imitates the human visual cortex, bringing higher resolution to key objects; the system is not restricted to a fixed laser scan pattern. “As humans, we do not lend equal weight to everything around us,” says Dussan. “Our visual cortex means we can quickly evaluate, prioritise and focus on what’s relevant. iDAR does the same to dynamically track targets and objects of interest, whilst always critically assessing general surroundings.” The bandwidth savings, says Dussan, and the resulting improvement in safety mean his company hopes to catalyse the safe, timely rollout of failsafe commercial autonomous vehicles.
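Dussan’s bandwidth argument can be sketched numerically. The point rate and bytes-per-point below are assumed, illustrative figures – real sensors vary widely – and only the 75–95% discard range comes from his quote:

```python
# Illustrative raw-data load of a fixed-scan LiDAR stream.
POINTS_PER_SECOND = 1_200_000  # assumed point rate; varies by sensor
BYTES_PER_POINT = 16           # assumed: x, y, z, intensity, timestamp

raw_mb_per_s = POINTS_PER_SECOND * BYTES_PER_POINT / 1e6
print(f"raw stream: {raw_mb_per_s:.1f} MB/s")

# Per Dussan, 75-95% of that is discarded as useless downstream.
for discarded in (0.75, 0.95):
    kept = raw_mb_per_s * (1 - discarded)
    print(f"{discarded:.0%} discarded -> {kept:.2f} MB/s of useful data")
```

Under these assumptions, a sensor streams roughly 19 MB/s raw while only around 1–5 MB/s is ultimately useful, which is the strain on bandwidth and response time Dussan describes.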
The importance of LiDAR to the development of ADAS and autonomous drive technology is clear, with OEMs and suppliers making significant investments in LiDAR specialists. As well as the Toyota Research Institute’s (TRI) aforementioned work with Luminar, Ford and Baidu have invested in Velodyne, GM has acquired Pasadena start-up Strobe, Osram has taken a 25.1% stake in LeddarTech, and ZF controls 40% of Hamburg-based Ibeo Automotive Systems.
A material science
The 2018 Audi A8 grabbed headlines at its 2017 unveiling in Barcelona as the first production model to offer SAE Level 3 autonomy, and came fitted with a LiDAR supplied by Valeo – the first supplier to put such a system on the road. Since then, the discussion has revolved around just how far suppliers can bring down the price. What might an acceptable cost look like? Continental and ZF reportedly have their sights set on no more than a few hundred dollars per unit.
“LiDAR is absolutely needed for truly autonomous driving. As soon as autonomous vehicles start moving off closed tracks and into the real world, they need to be five nines reliable.” – Jason Eichenholz, Luminar Technologies
The potential drop in cost was a topic at CES 2018. TriLumina, a US-based laser start-up with backing from Denso, said its use of micro-lenses will mean cheaper laser arrays, and that total system costs could be brought down to US$200. Meanwhile Quanergy, a solid-state LiDAR developer, claimed it has a path to a sub-US$100 system, assuming production volumes are ramped up. The California-based company uses smaller sensors and a technology called optical phased arrays, allowing for easier integration and lower cost.
Laux is confident that acceptable costs can be achieved. “LiDAR is a material science,” he says, “and thus I expect we will see a real drop in pricing over the next five to seven years as manufacturing suppliers catch on. Even compared to five years ago there has been progress – at one point, components would cost in the region of four to five thousand US dollars. This is not the case nowadays.”
But, concludes Eichenholz, the industry must proceed with caution. “There are many people who are pushing for lower cost, with no idea of what the trade-offs are in terms of performance,” he says. “In terms of the base technology, there’s been little innovation in LiDAR for ten years, and costs have been brought down by sacrificing performance.”
As the industry moves towards autonomy, failure becomes less and less of an option. A race to the bottom in terms of price could produce systems which, where self-driving vehicles are concerned, are not fit for purpose.
This article appeared in the Q1 2018 issue of Automotive Megatrends Magazine.