IEEE Open Journal of Control Systems (Jan 2024)

Learning Robust Output Control Barrier Functions From Safe Expert Demonstrations

  • Lars Lindemann,
  • Alexander Robey,
  • Lejun Jiang,
  • Satyajeet Das,
  • Stephen Tu,
  • Nikolai Matni

DOI: https://doi.org/10.1109/OJCSYS.2024.3385348
Journal volume & issue: Vol. 3, pp. 158–172

Abstract

This paper addresses learning safe output feedback control laws from partial observations of expert demonstrations. We assume that a model of the system dynamics and a state estimator are available, along with corresponding error bounds, e.g., estimated from data in practice. We first propose robust output control barrier functions (ROCBFs) as a means to guarantee safety, as defined through controlled forward invariance of a safe set. We then formulate an optimization problem to learn ROCBFs from expert demonstrations that exhibit safe system behavior, e.g., data collected from a human operator or an expert controller. When the parametrization of the ROCBF is linear, we show that, under mild assumptions, the optimization problem is convex. Along with the optimization problem, we provide verifiable conditions — in terms of the density of the data, the smoothness of the system model and state estimator, and the size of the error bounds — that guarantee validity of the obtained ROCBF. To obtain a practical control algorithm, we propose an algorithmic implementation of our theoretical framework that accounts in practice for the assumptions made in our analysis. We validate our algorithm in the autonomous driving simulator CARLA and demonstrate how to learn safe control laws from simulated RGB camera images.

Keywords