The path of the photograph as we know it today began a thousand years ago, when the Iraqi scientist Abu Ali al-Hasan Ibn al-Haytham built a device, later known as the camera obscura, that projected images onto a wall. It wasn’t until the 1820s that Frenchman Joseph Nicéphore Niépce used a camera obscura to record the first lasting image. Through the mid-to-late 19th century – from Daguerreotypes to emulsion plates to dry plates – image quality improved. But photography remained the domain of experimenters and professionals until the 1880s, when George Eastman founded Kodak, a company that sold a camera for the everyman.

At the beginning of this 1,000-year technological journey, the goal was simply to capture two-dimensional images: historical events for posterity, backyard pictures for family albums. Today we are on the cusp of a revolution enabled by three-dimensional depth perception, valued not simply for the images it produces but for the information collected at the edge. That information lets machines interact with their environment (as on manufacturing floors, where machines are programmed to safely grasp objects), lets robots and autonomous vehicles perceive distances and avoid collisions, and even gives smartphones a new thrust as they entertain users with the next wave of games, augmented reality, and professional-quality photos and videos.

Just as fuzzy Daguerreotypes were supplanted by newer, sharper imaging, 3D imaging technology is now seeing dramatic improvements in resolution, range, and accuracy with the development of 3D time of flight (3D ToF). A 3D ToF camera is a type of LIDAR (Light Detection and Ranging): it bounces low-power laser pulses off a target, detects the reflected light with a high-resolution image sensor, and measures the distance to the target from the round-trip time. With 3D ToF, both higher resolution and information from all three dimensions are possible.
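The underlying relation is simple: light travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name `tof_distance` is our own, purely illustrative):

```python
# Core time-of-flight relation: distance = (speed of light * round-trip time) / 2.
# Illustrative only; real ToF cameras derive the delay from sensor phase/pulse data.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """One-way distance in meters from a measured round-trip time in seconds."""
    return C * round_trip_s / 2.0

# A 10 ns round trip corresponds to a target roughly 1.5 m away:
print(tof_distance(10e-9))  # ≈ 1.499 m
```

The nanosecond scale of these delays is what makes ToF hard: resolving centimeters of depth means resolving tens of picoseconds of delay.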

For years, Analog Devices has been delivering components for 3D time-of-flight (ToF) imaging. As we have taken on some of the deepest challenges in ToF, we have built strong system-level expertise. That experience, along with a series of collaborations across the industry, has allowed us to start building full systems around our recently developed CMOS imagers to deliver imaging with greater 3D detail, operating over longer distances, and performing robustly across environments. We are developing a comprehensive roadmap of imagers, laser drivers, software- and hardware-based depth compute, and full systems. The goal is an easy-to-use, high-performance platform for consumer electronics, automotive cabins, and industrial logistics. We anticipate bringing our first 3D ToF systems to market by the end of the year.

As we talk to our customers, two major themes recur: they want depth image capture that “just works” as easily as taking a 2D photo (hear that, Mr. Eastman?), and they want easily implemented systems so they can focus on enriching their applications. Three-dimensional depth imaging is complicated; many variables must be addressed for the measurement to come out as expected. It requires carefully bathing the target in infrared laser light, then precisely measuring the delay between the transmitted light and the reflected light, and doing so for every pixel in the image sensor. Like the pioneers before us in 2D imaging, we are taking the complicated work of 3D depth imaging off our customers’ plates.

In the coming months, this blog series will highlight the exciting achievements and news along the next-generation 3D ToF journey. Let us know what you think in the comments below or email us at