If you're designing autonomous vehicle applications, you know how important LiDAR (light detection and ranging) is. A critical optical technology that performs distance sensing for autonomous vehicles, LiDAR is an important part of advanced driver assistance systems (ADAS) and essential for self-driving cars. Its performance depends on the optical front-end, as well as how the signal is transmitted through the signal chain and then processed. An important component in this signal chain is the transimpedance amplifier (TIA). In this blog post, I'll provide an overview of how LiDAR works and discuss what to look for in an effective TIA.
In LiDAR applications, a burst of nanosecond laser pulses is sent from a source and reflected from objects in the field of view. The reflected light is detected, and the time of flight (ToF) is used to create a distance map of the objects in the view (Figure 1).
Figure 1. The view of a car's laser/receiver system (A) captures items in a pie-shaped angle. The ToF data (B) measures the distances of objects from the laser/receiver system.
In Figure 1, a nanosecond pulsed laser burst from the car (A) sends light to the targeted area in the view. When the laser pulses hit an object, light is reflected back to the car. The electronics in the vehicle capture the returned signal and timestamp it; the ToF is the round-trip time from the laser/receiver system to the object and back. To collect complete ToF data, the laser/receiver system scans across the view.
The ToF data creates a distance map (B) that provides the distance of objects from the laser/receiver system. In Figure 1B, smaller distance values indicate closer items and larger distance values indicate that an item is farther away. Per the laser/receiver ToF data (B), the red figure (a woman) is the closest, followed by the yellow object (a tree), and, finally, the remote green object (a city bus). The area in the view is continuously scanned back and forth to gain full and constant coverage.
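The ToF-to-distance conversion described above is straightforward: since the measured time covers the round trip to the object and back, the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name and sample values are illustrative, not from any LiDAR API):

```python
# Convert LiDAR time-of-flight measurements to distances.
# The pulse travels to the object and back, so the one-way
# distance is half the round-trip time multiplied by c.

C = 299_792_458.0  # speed of light in m/s

def tof_to_distance(round_trip_s: float) -> float:
    """Return the one-way distance in meters for a round-trip ToF in seconds."""
    return C * round_trip_s / 2.0

# Example: a ~667 ns round trip corresponds to an object about 100 m away.
for tof_ns in (66.7, 667.0, 1334.0):
    print(f"{tof_ns:7.1f} ns -> {tof_to_distance(tof_ns * 1e-9):6.1f} m")
```

Note the timing precision this demands: resolving distance to 1.5 cm requires timestamping the return pulse to about 100 ps, which is why the nanosecond pulse edges must be preserved through the receiver signal chain.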
A Look at Laser/Receiver System Electronics
The laser/receiver system consists of multiple photodiodes, a laser diode, and supporting electronics to transmit and receive the light signal (Figure 2).
Figure 2. The laser/receiver system transmits light across the view to find objects with the reflection of the laser light.
In Figure 2, the laser driver initiates a laser light pulse transmission toward the object and feeds a time-reference signal back through D2, TIA2, and COMP2 to the microcontroller (MCU). The laser light travels to a glass plate that reflects a portion of the laser signal toward the reference path and passes the remainder through to the transmitter optics (specified lenses).
The emitted pulse reaches the object and reflects back to the laser/receiver system. Once past the receiver optics and mirror reflection, the light impinges on an InGaAs photodiode (D1). An InGaAs photodiode is a highly sensitive semiconductor device with an optical sensitivity range of 1310 nm to 1550 nm.
The light impinging on D1 may be bright or dim, depending on the distance traveled. Additionally, there may be contaminants in the atmosphere, as well as interfering phantom lights that can confuse the system. D1 converts light to current (ID1), which travels on to transimpedance amplifier #1 (TIA1) and comparator #1 (COMP1).
The Brains Behind What the Photodiode Sees
The TIA1 and COMP1 are the brains behind what the photodiode sees. TIA1's frequency bandwidth maintains the integrity of the rise and fall times of the returning pulses and allows the acquisition of more pixels. TIA1 also can safely withstand and quickly recover from large, transient input overload currents.
A low noise floor preserves the object's details, producing an accurate output for the following comparator. A programmable gain function further enhances the dynamic range. Also, since the optical front-end of a LiDAR system consumes a significant amount of power, placing TIA1 and TIA2 in low-power modes can be advantageous. The TIA design in Figure 3 specifically meets these challenges.
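As a rough first-order sketch (not the internal design of any particular TIA), the relationships at play can be summarized in two lines of arithmetic: the output voltage is the photocurrent times the transimpedance gain, and the achievable pulse rise time is bounded by the amplifier bandwidth via the familiar t_r ≈ 0.35/BW approximation. All values below are hypothetical examples:

```python
# First-order model of a transimpedance amplifier (TIA):
# output voltage = transimpedance gain * photodiode current,
# with rise time limited by the amplifier bandwidth.

def tia_output_mv(i_photo_ua: float, r_f_kohm: float) -> float:
    """Output voltage in mV for a photocurrent in uA and a gain in kOhm."""
    return i_photo_ua * r_f_kohm  # uA * kOhm = mV

def rise_time_ns(bandwidth_mhz: float) -> float:
    """Approximate 10%-90% rise time (ns) of a single-pole stage: 0.35/BW."""
    return 0.35 / bandwidth_mhz * 1e3

# A hypothetical 10 uA return pulse through a 25 kOhm transimpedance
# produces a 250 mV output pulse for the comparator to slice.
print(tia_output_mv(10.0, 25.0))
# A 490 MHz bandwidth supports sub-nanosecond rise times.
print(rise_time_ns(490.0))
```

This is why bandwidth, noise floor, and gain selection are traded off together: more gain gives the comparator a larger signal, but the bandwidth must still be high enough to keep the nanosecond pulse edges intact.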
Figure 3. MAX40660/MAX40661 transimpedance amplifiers for automotive LiDAR systems.
The MAX40661 and MAX40660 transimpedance amplifiers deliver high bandwidths of 160MHz and 490MHz, respectively. These devices provide high resolution by amplifying high-speed nanosecond input current pulses. A noise density of 2.1pA/√Hz provides higher system accuracy and operation across longer distances. Dynamic range can be adjusted through two pin-selectable transimpedance values of 25kΩ and 50kΩ, eliminating the need for an additional amplification stage when matching photodiodes with differing parasitic capacitance. These parts robustly withstand input currents up to 2A for 10ns and have a fast 2ns overload recovery time. When not in use, the low-power mode consumes only 26mW. These ICs are AEC-Q100 qualified, and their FMEDA results are available to assist ASIL-compliance calculations at the system level. They are available in a 3mm x 3mm, 10-lead TDFN package, and to facilitate manufacturing throughput, the package has side-wettable flanks.
The comparators in these circuits, COMP1 and COMP2, change the TIA's analog output to a clean digital signal (Figure 4).
Figure 4. MAX40025/MAX40026 high-speed comparators for automotive LiDAR systems.
The MAX40025 comes in a 1.218mm x 0.818mm, 6-bump wafer-level package (WLP), while the MAX40026 is available in a 2mm x 2mm, 8-pin TDFN side-wettable package and meets AEC-Q100 automotive qualification requirements.
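The comparator's role described above, turning the TIA's analog pulse into a clean digital edge, amounts to slicing the waveform at a threshold. A minimal sketch of that behavior (the threshold and sample values are hypothetical, and real parts add hysteresis that this model omits):

```python
# A comparator converts the TIA's analog output into a digital signal:
# the output is high whenever the input exceeds a fixed threshold.

def comparator(samples_mv, threshold_mv=100.0):
    """Return a 0/1 digital stream from analog samples in mV."""
    return [1 if v > threshold_mv else 0 for v in samples_mv]

# A returning pulse riding on a small noise baseline:
analog_mv = [5, 8, 60, 240, 250, 180, 40, 6]
print(comparator(analog_mv))  # only the pulse peak crosses the threshold
```

The MCU then timestamps the rising edge of this digital pulse, which is why the comparator's propagation delay and edge consistency matter as much as the TIA's bandwidth for overall ToF accuracy.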
Commercialization of autonomous cars represents an exciting journey ahead, and LiDAR technology in ADAS is poised to become a significant player in the future of autonomous vehicles. By focusing on the operation of the optical front-end, this blog examined how the receiver signal chain's electronic components play a significant role in system performance. The LiDAR-ready TIAs and comparators highlighted here provide the high levels of system accuracy that LiDAR applications demand.
For a closer look at how the technology fusion of cameras, radar, and LiDAR systems creates a near-human vision system for autonomous cars, read my blog post, "Guide to LiDAR Sensors for Self-Driving Cars."