Slow Down, Move Over: A Case Study in Formal Verification, Refinement, and Testing of the Responsibility-Sensitive Safety Model for Self-Driving Cars

Times Cited: 2
Authors
Strauss, Megan [1]
Mitsch, Stefan [1]
Affiliations
[1] Carnegie Mellon Univ, Comp Sci Dept, Pittsburgh, PA 15213 USA
Source
TESTS AND PROOFS, TAP 2023 | 2023 / Vol. 14066
Funding
U.S. National Science Foundation
Keywords
differential dynamic logic; refinement; testing; self-driving cars; collision avoidance; theorem proving
DOI
10.1007/978-3-031-38828-6_9
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Advances in technology give us hope that self-driving cars will eliminate human driving error, reduce vehicle emissions, and simplify an everyday task. Ensuring that these vehicles are safe is essential to the continued progress of the field. In this paper, we formalize the Responsibility-Sensitive Safety model (RSS) for self-driving cars and prove the safety and optimality of this model in the longitudinal direction. We use the hybrid systems theorem prover KeYmaera X to formalize RSS as a hybrid system with its nondeterministic control choices and continuous motion model, and prove the absence of collisions. We then illustrate the practicality of RSS through refinement proofs that turn the verified nondeterministic control envelopes into deterministic ones, and through further verified compilation to Python. The refinement and compilation are safety-preserving; as a result, safety proofs of the formal model transfer to the compiled code, while counterexamples discovered when testing code compiled from an unverified model transfer back. The resulting Python code makes it possible to test the behavior of cars following the RSS motion model in simulation, to measure agreement between the model and the simulation with monitors derived from the formal model, and to report counterexamples from simulation back to the formal model.
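For illustration, the longitudinal rule at the heart of RSS can be sketched as a runtime monitor in Python. The minimum-gap formula below follows the published RSS definition (response time rho, maximum acceleration and guaranteed braking of the rear car, worst-case braking of the front car); the function names, parameter values, and monitor interface are illustrative assumptions for this sketch, not the verified Python code compiled from the KeYmaera X model in the paper.

```python
# Illustrative sketch of the RSS longitudinal safe-distance check (same-direction traffic).
# Parameter names follow the published RSS definition; this is an assumption-laden example,
# not the verified code compiled from the formal model.

def rss_min_longitudinal_gap(v_rear: float, v_front: float, rho: float,
                             a_max_accel: float, a_min_brake: float,
                             a_max_brake: float) -> float:
    """Minimum safe gap so the rear car can always stop, even if the front car brakes hard.

    v_rear, v_front -- current longitudinal speeds (m/s)
    rho             -- response time of the rear car (s)
    a_max_accel     -- max acceleration of the rear car during the response time (m/s^2)
    a_min_brake     -- guaranteed braking of the rear car after the response time (m/s^2)
    a_max_brake     -- worst-case braking of the front car (m/s^2)
    """
    v_rear_after_rho = v_rear + rho * a_max_accel
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho ** 2
         + v_rear_after_rho ** 2 / (2 * a_min_brake)
         - v_front ** 2 / (2 * a_max_brake))
    return max(d, 0.0)


def longitudinal_monitor(gap: float, v_rear: float, v_front: float, **params) -> bool:
    """Return True if the current gap satisfies the RSS longitudinal safety condition."""
    return gap >= rss_min_longitudinal_gap(v_rear, v_front, **params)


if __name__ == "__main__":
    # Example: rear car at 20 m/s, front car at 15 m/s, 30 m apart (illustrative values).
    safe = longitudinal_monitor(gap=30.0, v_rear=20.0, v_front=15.0,
                                rho=1.0, a_max_accel=2.0,
                                a_min_brake=4.0, a_max_brake=8.0)
    print("RSS longitudinal condition satisfied:", safe)
```

In the paper's workflow, monitors of this kind are not hand-written but derived from the formal model, so that a violation observed in simulation corresponds to a concrete counterexample that can be reported back to the KeYmaera X model.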
Pages: 149-167
Page count: 19