Cognition for cars: Making sense of sensor data helps autonomous cars make better decisions
Autonomous vehicles are tasked with the tall order of exceeding the performance of all human drivers, all of the time. The systems designed to manage an autonomous vehicle are growing ever more complex as a myriad of new sensors come online and the computing power available to process the data they generate grows exponentially. One startup is positioned to provide the cognitive layer that helps a car process that data and improve its performance.
“We provide cognition for cars. If you think of the self-driving space as three categories, the eyes, the cognition and the decision … cognition is the layer that describes the world, the sensors see the world and the decision is the layer that interacts with the world. We provide that middle layer,” said Sravan Puttagunta, founder and chief executive officer of Civil Maps.
Puttagunta described his company’s value proposition to Jeff Frick (@JeffFrick), host of theCUBE, SiliconANGLE Media’s mobile livestreaming studio, during this year’s Auto Tech Council – Innovation in Motion in Milpitas, California. (* Disclosure below.)
Intelligent sensor fusion for better decision making
Combining sensor data from a wide range of high-resolution nodes may seem complex, but Puttagunta broke down the problem into an easy-to-comprehend analogy.
“We as humans have multiple senses. We have our eyes, we have our ears, our sense of touch, even our sense of orientation through our sinuses. Cars are very similar; they have different information from IMU [inertial measurement unit], GPS [Global Positioning System], radars, cameras and lidars, so they have to do a similar sensor fusion. Cognition takes care of all of that sensor fusion and makes sense of the world so the car can make decisions in a very confident way,” Puttagunta explained.
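The interview does not detail Civil Maps' actual algorithms, but the kind of sensor fusion Puttagunta describes can be illustrated with a deliberately simplified sketch: a one-dimensional complementary filter that blends dead-reckoned position from hypothetical IMU velocity readings with absolute (but noisy) GPS fixes. The function name, inputs, and blend weight below are assumptions for illustration only.

```python
# Minimal 1-D complementary-filter sketch of sensor fusion.
# Assumptions (not from the article): hypothetical GPS position fixes,
# IMU velocity readings, a fixed time step `dt`, and blend weight `alpha`.
def fuse(gps_positions, imu_velocities, dt=0.1, alpha=0.9):
    """Blend dead-reckoned IMU estimates with absolute GPS fixes."""
    estimate = gps_positions[0]          # initialize from the first GPS fix
    fused = [estimate]
    for gps, vel in zip(gps_positions[1:], imu_velocities):
        predicted = estimate + vel * dt  # dead reckoning from the IMU
        # Trust the smooth IMU prediction, but correct drift with GPS.
        estimate = alpha * predicted + (1 - alpha) * gps
        fused.append(estimate)
    return fused
```

In practice, production systems use far richer formulations (Kalman or particle filters over many sensor modalities, including lidar and radar), but the idea is the same: each sensor's weakness is offset by another's strength before the decision layer ever sees the data.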
Moving beyond helping cars make sense of sensor data, Puttagunta sees the next challenge as communicating the car’s perception back to its passengers.
“As a driver, if I’m giving up my ability to communicate with the car mechanically, I need to have some other interface where the car communicates to me visually; and augmented reality is that replacement. It communicates the car’s understanding of the world, its intentions … it’s developing my trust in the system over time,” Puttagunta concluded.
Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of Auto Tech Council – Innovation in Motion. (* Disclosure: TheCUBE is a paid media partner for Auto Tech Council – Innovation in Motion. Neither Western Digital Corp., the event sponsor, nor other sponsors have editorial influence on theCUBE or SiliconANGLE.)
Photo: SiliconANGLE
Since you’re here …
… We’d like to tell you about our mission and how you can help us fulfill it. SiliconANGLE Media Inc.’s business model is based on the intrinsic value of the content, not advertising. Unlike many online publications, we don’t have a paywall or run banner advertising, because we want to keep our journalism open, without influence or the need to chase traffic. The journalism, reporting and commentary on SiliconANGLE — along with live, unscripted video from our Silicon Valley studio and globe-trotting video teams at theCUBE — take a lot of hard work, time and money. Keeping the quality high requires the support of sponsors who are aligned with our vision of ad-free journalism content.
If you like the reporting, video interviews and other ad-free content here, please take a moment to check out a sample of the video content supported by our sponsors, tweet your support, and keep coming back to SiliconANGLE.