Many of the major auto vendors, along with well-known non-auto companies such as Google, are devoting resources to developing autonomous vehicles (often called "self-driving cars"). These vehicles obviously need to see where they are going.

To properly assess the situation around the car in all directions and so make the appropriate driving decisions, the flat image that a basic camera could generate is inadequate. Instead, self-driving vehicles need a detailed, quantifiable 3D picture of their surroundings. Even using two cameras in a "stereo" mode would likely fall short of providing the required image detail, resolution and precision.

What about using radar or ultrasound, either of which can create detailed images in some circumstances? Getting a 3D image might seem a straightforward task for either conventional RF-based radar or acoustic-based ultrasound, but neither is really suitable. While today's autonomous cars use radar and sonar for basic side-to-side lane management as well as front-and-back collision avoidance, neither can provide the angular and distance resolution needed for the vehicle to image its surroundings in 3D. This shortcoming is an unavoidable byproduct of the longer wavelengths of radar and sonar signals.

Search for Precision

That's why LIDAR – light detection and ranging – is used. Instead of RF or acoustic power, LIDAR uses laser-generated pulses of light. Based on accurate measurements of the time it takes for any reflections to be received ("time of flight"), the system can determine what is in front of the laser.
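The time-of-flight arithmetic is simple enough to sketch. A minimal example (the 1 µs round trip is an illustrative number, not one from the text):

```python
# Time-of-flight ranging: a pulse travels to the target and back,
# so the one-way distance is half the round-trip time times c.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance (m) to a reflector from the measured round-trip time (s)."""
    return C * round_trip_s / 2.0

# A reflection received 1 microsecond after the pulse left
# corresponds to a target roughly 150 m away.
print(round(tof_distance(1e-6), 1))  # 149.9
```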

Just like radar and sonar, the concept is simple but the execution is difficult. After all, light is very different in behavior and handling than RF or acoustic energy. Still, LIDAR works well, and a light-based system can provide both precise distance and angular resolution due to the short wavelength of light, among other factors.

A LIDAR system goes far beyond just a basic pulse/reflection mode of sensing "if something is ahead, and, if so, how far ahead?" Instead, the role of a LIDAR system is to provide the raw data so that the car's image processors can create a 3D "point cloud" based on the reflections. This point cloud then can be further integrated by the vehicle's processors to provide a detailed sense of the surroundings in all directions, and at what distances. Creating this detailed image in real time requires a significant amount of dedicated, specialized computational effort after the LIDAR front-end capture. This is due both to the real-time nature of the situation and the sheer volume of reflection data (points) being returned.
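Each point in the cloud comes from one return: a measured range along a known beam direction. A minimal sketch of that conversion, with illustrative range and angle values:

```python
import math

def spherical_to_cartesian(r, azimuth_deg, elevation_deg):
    """Convert one LIDAR return (range in m, beam angles in degrees)
    into an (x, y, z) point relative to the sensor."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

# Three returns from one sweep become three points of the cloud.
returns = [(20.0, 0.0, 0.0), (20.0, 90.0, 0.0), (35.0, 45.0, 5.0)]
cloud = [spherical_to_cartesian(*ret) for ret in returns]
```

A real system repeats this for hundreds of thousands of returns per second, which is where the dedicated processing load comes from.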

Basic LIDAR operation is analogous to radar and sonar. Image source: NASA

The processing algorithms transform the huge amounts of raw reflection data into a volume and vector map relative to the vehicle's position, speed and direction. The resultant image insight is used for object identification, motion vectors, and collision prediction and avoidance. The algorithms are designed to define the image in different zones; typically, a medium distance of about 20-40 meters to the sides for angular imaging, and between 150-200 meters for front-and-rear long-distance imaging.
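A toy sketch of that zoning, using the ranges quoted above (the 45°/135° heading split is an assumption for illustration, not from the text):

```python
import math

# Zone limits taken from the ranges quoted in the text.
SIDE_LIMIT_M = 40.0    # side angular imaging, ~20-40 m
LONG_LIMIT_M = 200.0   # front-and-rear imaging, ~150-200 m

def classify_return(x, y):
    """Label a 2D return (x forward, y left, meters) by imaging zone."""
    dist = math.hypot(x, y)
    heading = abs(math.degrees(math.atan2(y, x)))  # 0 = dead ahead
    if 45.0 < heading < 135.0:                     # off to the side
        return "side" if dist <= SIDE_LIMIT_M else "out-of-zone"
    return "long-range" if dist <= LONG_LIMIT_M else "out-of-zone"
```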

Note that LIDAR is not just for autonomous vehicles with their relatively short-distance imaging needs and modest speeds. High-power versions are successfully used in aircraft for aerial mapping, creating high-resolution images of the Earth's surface and structures; by orbiting satellites to measure ocean levels and waves; and even for guidance in spacecraft docking.

LIDAR Basics: Light and Mirrors, or Not

The LIDAR system begins with a laser diode, which is directed to emit a brief pulse of infrared light approximately 100 picoseconds long. The system then switches to "receive" mode, in which a photoreceptor senses any reflections. Using the known speed of light, it's relatively easy to calculate the distance to the object that produced the reflection.
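That pulse width also bounds how finely the system can resolve distance: two echoes closer together than the pulse overlap and cannot be separated. Using the ~100 ps figure above:

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution(pulse_width_s: float) -> float:
    """Best-case range resolution set by pulse width: two targets
    closer than c * tau / 2 return overlapping echoes."""
    return C * pulse_width_s / 2.0

# The ~100 ps pulse quoted above gives roughly 1.5 cm of resolution.
res = range_resolution(100e-12)
```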

But that's where "easy" stops. A single pulse will tell you what is out there in one direction (a vector), but will not create an image. To create an image, LIDAR systems turn to an old optical instrument: the spinning-mirror scanner. This is the cylinder-shaped object seen on top of many autonomous cars. A mirror assembly rotating at 1-10 revolutions per second directs the pulsed light around the full 360° circle, and also directs the reflected beam back to the photosensor. Some mirror assemblies do not rotate around the full circle; instead, they scan back and forth over a limited angle. This allows for faster scanning but requires a more complicated mechanical arrangement.
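The rotation rate and the pulse rate together set the angular spacing between returns. A quick sketch, using the 10 rev/s figure above and an assumed pulse rate of 100,000 pulses/s (illustrative, not from the text):

```python
def azimuth_step_deg(rev_per_s: float, pulses_per_s: float) -> float:
    """Angular spacing between successive pulses of a spinning mirror."""
    pulses_per_rev = pulses_per_s / rev_per_s
    return 360.0 / pulses_per_rev

# 10 rev/s and 100,000 pulses/s: one return every 0.036 degrees.
step = azimuth_step_deg(10.0, 100_000.0)
```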

There's some irony that the relatively old-fashioned mechanical assembly is the optical front-end, working in conjunction with advanced components and electronics such as laser diodes and high-performance graphics computing engines. At present, the rotating mirror is the only commercially viable scanning technique. As a mature (but still evolving) device, a well-designed unit can be relatively reliable and compact, the result of decades of experience.

Multiple, compact LIDAR systems with no moving parts could be mounted at multiple places on a vehicle. Image source: DARPA

Still, a LIDAR with no moving parts would be both attractive and likely more compact, so research has focused on alternatives to the rotating mirror. Much of this research has been funded by the U.S. Defense Advanced Research Projects Agency (DARPA). Efforts also are underway to develop both electrically controlled emitter arrays as well as broad-area receiver arrays.

These arrays attempt to replicate for optical signals a topology that has been used successfully for several decades in radar and sonar/ultrasound: the active electronically steerable array (sometimes called a "phased array"). In this configuration, the pulse timing of each of the hundreds or even thousands of tiny antennas (for radar) or transducers (for sonar/ultrasound) in the array is individually controlled (which is equivalent to phase shifting); this allows the array to create fully steerable, synthesized wavefronts.
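The steering rule itself is the classic uniform-linear-array formula: delay each element by n·d·sin(θ)/c. A sketch with illustrative optical-scale numbers (the 2 µm spacing and 25° steer angle are assumptions, not from the text):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def element_delays(n_elements: int, spacing_m: float, steer_deg: float):
    """Per-element time delays (s) that tilt the synthesized wavefront
    by steer_deg for a uniform linear array: delay_n = n*d*sin(theta)/c.
    At optical frequencies these are applied as equivalent phase shifts."""
    theta = math.radians(steer_deg)
    return [n * spacing_m * math.sin(theta) / C for n in range(n_elements)]

# 8 emitters, 2 micrometers apart, steered 25 degrees off boresight.
delays = element_delays(8, 2e-6, 25.0)
```

Note how tiny the delays are (femtoseconds), which is why they are realized as phase shifts rather than literal time delays.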

Extreme Demands

While the steerable array is a valid and proven electronic concept, any implementation at optical wavelengths places extreme demands on the array. The elements must be precisely set to within a few microns of each other, and positional placement errors in manufacturing, or shifts due to environmental stresses, as small as 100 nanometers can degrade performance. Fortunately, the same photolithographic techniques used for MEMS devices and ICs can be adapted for building a steerable array and the complicated electronics to support it.
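To see why 100 nm matters, compare it to the wavelength. Assuming a 1550 nm laser (a common infrared wavelength, not stated in the text), a displacement along the beam maps directly to a phase error:

```python
def phase_error_deg(position_error_m: float, wavelength_m: float) -> float:
    """Phase error (degrees) contributed by one element displaced
    along the beam direction: 360 * delta / lambda."""
    return 360.0 * position_error_m / wavelength_m

# A 100 nm placement error at an assumed 1550 nm wavelength already
# shifts that element's contribution by roughly 23 degrees of phase.
err = phase_error_deg(100e-9, 1550e-9)
```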

In one of the DARPA projects, the synthesized beam can sweep over a 50° angle 100,000 times a second (10,000 times the mechanical mirror speed). However, the sweep angle may be too narrow for many aspects of the autonomous car; again, this is only the starting point of a new technology. Alternatively, if the electronic version can be made small and inexpensive enough, it may make sense to use several of these devices in an overlapping configuration, stitching together the individual images as needed.
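How many such units would full coverage take? A back-of-envelope sketch, using the 50° sweep quoted above and an assumed 5° of overlap between neighbors for stitching (the overlap figure is illustrative):

```python
import math

def units_needed(total_deg: float, unit_fov_deg: float,
                 overlap_deg: float) -> int:
    """Units required so adjacent fields of view overlap by
    overlap_deg while together covering total_deg."""
    effective = unit_fov_deg - overlap_deg
    return math.ceil(total_deg / effective)

# 360-degree coverage from 50-degree units with 5 degrees of overlap.
n = units_needed(360.0, 50.0, 5.0)  # 8 units
```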

In detector array LIDAR, the entire scene is "flooded" with pulsed light. Image source: NASA

LIDAR prototyping work also is being done using a high-power vertical-cavity surface-emitting laser (VCSEL) to flash nanosecond-wide pulses over a broad area and thus "flood" the field of view. The reflected light then would be captured by an array of multiple receiver elements in a focal-plane array configuration. This approach has issues of emitted-power safety, thermal concerns at the source, and consistency and timing accuracy at the receiver array, all critical factors for achieving a focused, effective point-cloud image.
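In this flash scheme, every pixel of the focal-plane array times its own echo, so a single pulse yields a full range image. A toy sketch (the 2x2 grid and timing values are illustrative):

```python
# Flash LIDAR: each pixel records its own round-trip time,
# so one flash produces a complete range image.
C = 299_792_458.0  # speed of light, m/s

def range_image(tof_grid):
    """Convert a 2D grid of per-pixel round-trip times (s) to ranges (m)."""
    return [[C * t / 2.0 for t in row] for row in tof_grid]

# A 2x2 toy detector: nearer surfaces report shorter round trips.
ranges = range_image([[1.0e-7, 1.2e-7],
                      [1.1e-7, 2.0e-7]])
```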

Next time you see or read about the autonomous car coming soon, remember that, both literally and figuratively, an old-fashioned, mechanically rotating mirror for the LIDAR system sits at the top of all those sophisticated sensors and actuators, backed by incredible processing power and advanced algorithms. At least for now.

But also remember that just 10 years ago, the idea of placing a 77-GHz radar system in a mass-market product such as an automobile was almost unthinkable. Even so, it is now available as a standard feature on some vehicles. Indeed, it's not impossible to imagine that all-electronic LIDAR may be the next disruptive technology for cars and other 3D imaging applications.
