One hundred and eighty decisions. That’s what the average driver makes every minute. That’s three decisions every second. Pretty impressive.
Now imagine you’re driving and an oncoming vehicle drifts into your lane. At the same time, a man is walking his dog on the sidewalk. Do you brake, hoping to minimize the collision? Do you swerve and possibly hit the man and his dog? Do you veer the other way, into oncoming traffic? Whichever you choose, it’s unlikely you’ll escape without causing substantial harm or damage.
Even with the ability to make three decisions in a split second, this still becomes a no-win situation for just about all of us. Unless we can avoid it.
And that’s one of the main goals of autonomous driving: vehicles where sensing, communication, actuation, and artificial intelligence (AI) work together to gather information, analyze it, and make decisions faster and sooner than even the best human drivers.
Extending awareness beyond what the eye can see
Most of the information we get while driving comes from what we see. And that can be limited and impacted by many variables, such as weather, distance, and distraction. As a result, a lot of the decisions we make behind the wheel are reactive. Autonomous vehicles offer the promise of predictive driving. To realize that, they need sensing capabilities far greater than ours.
Three technologies that are central to sensing the external environment of autonomous vehicles are RADAR, LIDAR, and High Performance IMUs.
RADAR is already used extensively in Advanced Driver Assistance Systems (ADAS) applications such as collision warning and mitigation, blind spot monitoring, and lane change assistance. Approximately 50% of all recently produced RADAR modules contain technology from Analog Devices (ADI). With a 15-year track record in automotive RADAR, Analog Devices is now developing the innovative Drive360™ RADAR technology platform to deliver the highest level of performance and distance resolution available. The Drive360™ RADAR platform is engineered to support the full 76 GHz to 81 GHz frequency band and to provide platform longevity.
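To see why the full 76 GHz to 81 GHz band matters for distance resolution, consider the textbook relationship for a swept-frequency radar: the best-case range resolution is the speed of light divided by twice the swept bandwidth. The sketch below is illustrative arithmetic, not a specification of the Drive360 platform; the 5 GHz and 1 GHz figures are assumptions standing in for the full band and a narrower legacy band.

```python
# Illustrative sketch: theoretical range resolution of a swept-frequency
# radar, delta_R = c / (2 * B). Real-world resolution also depends on
# waveform design and signal processing.
C = 3.0e8  # speed of light, m/s


def range_resolution(bandwidth_hz: float) -> float:
    """Best-case range resolution in meters for a given swept bandwidth."""
    return C / (2.0 * bandwidth_hz)


full_band = range_resolution(5.0e9)  # assumed full 76-81 GHz sweep
narrow_band = range_resolution(1.0e9)  # assumed narrower 1 GHz sweep
print(f"5 GHz sweep: {full_band * 100:.1f} cm")  # 3.0 cm
print(f"1 GHz sweep: {narrow_band * 100:.1f} cm")  # 15.0 cm
```

Five times the bandwidth yields five times finer resolution, which is what makes the wider band attractive for separating small, closely spaced objects.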
Because Drive360™ RADAR is built around 28 nm CMOS technology, it provides a high degree of digital signal processing integration and flexibility, while ADI’s RF IP enables highly differentiated waveform and algorithm implementations. Drive360 RADAR-enabled products will reliably detect smaller, faster-moving objects (e.g., motorcycles, pedestrians, animals) at longer distances, providing the critical time needed to avoid injuries or fatalities.
While RADAR is central to the future of all-weather autonomous driving, it is only part of the solution for split-second decision making. Other sensors are needed, such as cameras and LIDAR. LIDAR (Light Detection and Ranging), with its range and accuracy, will be key to solving the most difficult ADAS challenges, and is an area of rapid and intense development. ADI is currently focused on solid-state LIDAR designs that scan light using the same material found in computer monitors. This design overcomes the prohibitive cost of current offerings and improves reliability by eliminating the moving parts found in conventional designs. It also improves key performance metrics such as range, resolution, frame rate, and power consumption.
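The ranging principle behind any LIDAR, solid-state or mechanical, is time of flight: a light pulse travels out to a target and back, so range is the speed of light times the round-trip time, divided by two. The numbers below are generic illustrations, not specifications of ADI’s design.

```python
# Illustrative sketch: LIDAR time-of-flight ranging, range = c * t / 2
# (the factor of 2 accounts for the out-and-back round trip).
C = 3.0e8  # speed of light, m/s


def tof_range_m(round_trip_s: float) -> float:
    """Range in meters for a measured round-trip pulse time."""
    return C * round_trip_s / 2.0


# A return arriving 1 microsecond after the pulse left:
print(f"{tof_range_m(1.0e-6):.0f} m")  # 150 m
```

Note how tight the timing must be: resolving targets a few centimeters apart requires timing the return to within a few hundred picoseconds, which is part of why LIDAR design is so demanding.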
Inertial Measurement Units
Along with seeing the surrounding environment, autonomous vehicles also need to feel and respond to the road in all weather conditions. Analog Devices Inertial Measurement Units (IMUs) combine multi-axis accelerometers and gyroscopes with processing and calibration in a single package. IMUs, in conjunction with on-board ADAS and satellite localization inputs, provide an accurate picture of a vehicle’s position and heading, while rejecting the shocks and vibrations of normal driving.
Just as RADAR, LIDAR, and High Performance IMUs can extend the sensing capabilities of autonomous vehicles beyond what we can see, Analog Devices is looking beyond how those technologies are used today. Think about this. In the 1990s, when a new cell phone came out, it may have had a better battery or a thinner profile. Then came the smartphone, which fundamentally and permanently transformed how we live. That’s how we’re looking at the technology we’re about to introduce for RADAR, LIDAR, and High Performance IMUs. These transformational technologies will be the foundation of the advanced safety and autonomous driving applications to come.