Autonomous vehicles of tomorrow will require sensor synthesis for precision navigation
Market research suggests that by 2035, consumers around the world will purchase 85 million autonomous-capable vehicles a year. But before the driverless revolution can take off, the automotive industry must solve some significant technological challenges—not the least of which is precision navigation.
“A normal GPS receiver is accurate to about 15 feet,” said Tim Harris, co-founder and CEO of Swift Navigation, a San Francisco-based company developing a high-accuracy GPS receiver for use in autonomous vehicles, including driverless cars, unmanned aerial vehicles, and ground robots. “That’s good enough to find the restaurant, but it’s not good enough for your car to know what lane it’s in or for a drone to drop a package on your doorstep instead of in the neighbor’s pool.”
Companies like Swift are pioneering affordable precision receivers that promise to help autonomous vehicles navigate with centimeter-level accuracy. They do so either by receiving dual signals (signals of different frequencies, or from separate global navigation satellite system, or GNSS, constellations) or by calculating position with an approach known as Real Time Kinematic (RTK) satellite navigation, which measures the carrier phase of the GNSS radio waves themselves rather than the code they transmit.
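A rough back-of-the-envelope calculation shows why tracking the carrier wave pays off. The figures below are standard textbook values for the GPS L1 signal, not Swift's specifications, and the 1%-of-feature resolution rule is a common approximation, not a guarantee:

```python
# Toy comparison of code-based vs carrier-phase GNSS ranging precision.
# Values are illustrative textbook numbers for GPS L1, not any vendor's specs.

SPEED_OF_LIGHT = 299_792_458.0   # m/s
GPS_L1_FREQ = 1_575.42e6         # Hz, GPS L1 carrier frequency
CODE_CHIP_RATE = 1.023e6         # chips/s, C/A ranging code

carrier_wavelength = SPEED_OF_LIGHT / GPS_L1_FREQ   # ~0.19 m per cycle
code_chip_length = SPEED_OF_LIGHT / CODE_CHIP_RATE  # ~293 m per chip

# A receiver can typically resolve about 1% of the feature it tracks.
code_precision_m = 0.01 * code_chip_length      # metre-level
phase_precision_m = 0.01 * carrier_wavelength   # millimetre-level

print(f"code ranging precision  ~ {code_precision_m:.1f} m")
print(f"phase ranging precision ~ {phase_precision_m * 1000:.1f} mm")
```

Because a carrier cycle is roughly 1,500 times shorter than a code chip, tracking the wave itself shrinks the measurement noise from metres to millimetres, which is what makes centimeter-level RTK fixes possible once the ambiguity in whole cycles is resolved.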
While the cost and performance of these receivers are constantly improving, the silver bullet will be an integrated network of diverse sensors, of which GNSS is only one.
“Do you rely on an external positioning system like GPS, or do you treat an autonomous car like a computer with sensors that do the driving like a human does, with no real knowledge of [absolute] location?” asked Brian Markwalter, senior vice president of research and standards at the Consumer Technology Association. “Those are two extremes, and what we’re seeing is an integration of the two—the idea of crowd-sourced integration, where cars will record the ground truth based on their sensors and integrate that with a mapping function in the cloud.”
This sensor network might include cameras, LiDAR, and radar, while the mapping function will likely belong to a solution like HD Live Map, unveiled in January by automotive mapping service HERE. Its cloud-based system supplies autonomous vehicles with a detailed, dynamic representation of the immediate road environment so they can anticipate obstacles in their path. In a virtuous circle, the maps are updated in real time with road information collected by the vehicles' complementary sensors.
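The "synthesis" of several noisy position sources can be sketched with a simple inverse-variance weighting, the one-dimensional core of Kalman-style fusion. This is a toy illustration under assumed noise levels, not any vendor's actual algorithm; the sensor names and variances below are hypothetical:

```python
# Minimal sketch of fusing two noisy position estimates by
# inverse-variance weighting. A stand-in for the multi-sensor
# "synthesis" described above; numbers are hypothetical.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Return the fused estimate and its variance.

    Each estimate is weighted by the inverse of its variance, so the
    more certain sensor dominates, and the fused variance is smaller
    than either input's.
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Hypothetical lane-position estimates (metres from lane centre):
gnss_est, gnss_var = 0.8, 1.0     # standard GNSS: metre-level noise
lidar_est, lidar_var = 0.1, 0.01  # lidar match against an HD map: cm-level

pos, var = fuse(gnss_est, gnss_var, lidar_est, lidar_var)
print(f"fused position: {pos:.2f} m (variance {var:.4f})")
```

The fused answer sits close to the low-noise lidar estimate while the GNSS fix anchors it in absolute coordinates, mirroring Markwalter's point that no single system is trusted for exact positioning.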
Concluded Markwalter, “Nobody’s counting on a single system for exact positioning. There’s going to be a lot of synthesis taking place within the car to help it understand where it is.”