Perception-Driven Safe Autonomy in Uncertain Environments

ABOUT THE PROJECT

At a glance

Though fully end-to-end driving systems have been prototyped, they have not been vetted in high-performance or safety-critical environments. Perceptual sensors such as cameras suffer blur and distortion at high speeds, and predictions degrade in quality when a car skids, drifts, or encounters problematic terrain. How can an embodied learning system compensate for new experiences without damaging the vehicle?

This proposal investigates a shared perception, prediction, and control system for autonomous cars, in which the subsystems compensate for one another, adding robustness to the system as a whole. We propose a project that develops this robust interplay on a 1:10-scale model car in an autonomous racing scenario: the perception and prediction modules are iteratively improved as the car learns to maintain safety while driving more aggressively in a cluttered environment with varied terrain. The car restricts its sensing to a single on-board camera. Our approach builds on incorporating uncertainty into SLAM, learning inside model-predictive control loops, and estimating uncertainty in perception and dynamical systems with high-dimensional statistical methods.
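
To make the interplay concrete, the following is a minimal sketch (not the project's actual implementation) of learning inside a model-predictive control loop: an ensemble of learned dynamics models supplies both a prediction and an uncertainty signal, and a random-shooting planner penalizes that uncertainty so the controller drives aggressively only where the model is reliable. All names here (EnsembleDynamics, mpc_action, risk_weight) and the linear-ensemble model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class EnsembleDynamics:
    """Toy learned dynamics: an ensemble of linear models fit to observed
    transitions. Disagreement across the ensemble serves as an uncertainty
    estimate (a stand-in for the proposal's high-dimensional statistical
    methods)."""

    def __init__(self, state_dim, action_dim, n_models=5):
        self.models = [
            rng.normal(scale=0.1, size=(state_dim, state_dim + action_dim))
            for _ in range(n_models)
        ]

    def fit(self, states, actions, next_states):
        # Least-squares refit of each ensemble member on a bootstrap resample,
        # so members disagree where data is scarce.
        X = np.hstack([states, actions])
        for i in range(len(self.models)):
            idx = rng.integers(0, len(X), size=len(X))
            coeffs, *_ = np.linalg.lstsq(X[idx], next_states[idx], rcond=None)
            self.models[i] = coeffs.T

    def predict(self, state, action):
        x = np.concatenate([state, action])
        preds = np.stack([A @ x for A in self.models])
        # Return the mean prediction and total ensemble disagreement.
        return preds.mean(axis=0), preds.std(axis=0).sum()

def mpc_action(dynamics, state, goal, horizon=10, n_samples=256, risk_weight=5.0):
    """Random-shooting MPC: sample action sequences, roll them out through the
    learned model, and penalize predicted uncertainty so the planner avoids
    regions where the model is unreliable."""
    action_dim = dynamics.models[0].shape[1] - state.size
    best_cost, best_action = np.inf, None
    for _ in range(n_samples):
        seq = rng.uniform(-1.0, 1.0, size=(horizon, action_dim))
        s, cost = state, 0.0
        for a in seq:
            s, disagreement = dynamics.predict(s, a)
            cost += np.linalg.norm(s - goal) + risk_weight * disagreement
        if cost < best_cost:
            best_cost, best_action = cost, seq[0]
    return best_action

# Hypothetical usage: fit on synthetic transitions, then plan toward a goal.
s_dim, a_dim = 4, 2
dyn = EnsembleDynamics(s_dim, a_dim)
S = rng.normal(size=(200, s_dim))
A = rng.uniform(-1, 1, size=(200, a_dim))
dyn.fit(S, A, 0.9 * S + 0.1 * np.pad(A, ((0, 0), (0, s_dim - a_dim))))
a0 = mpc_action(dyn, np.zeros(s_dim), goal=np.ones(s_dim))
```

In a closed loop, each executed action yields a new transition that is appended to the dataset before refitting, which is one way the perception and prediction modules could be "iteratively improved" as the car gathers experience.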

PRINCIPAL INVESTIGATORS

Benjamin Recht

THEMES

learning control, model-predictive control, safe exploration