Formal Analysis of AI-Based Autonomy: From Modeling to Runtime Assurance
Published Web Location
https://doi.org/10.1007/978-3-030-88494-9_19
Abstract
Autonomous systems are increasingly deployed in safety-critical applications and rely ever more on high-performance components based on artificial intelligence (AI) and machine learning (ML). Runtime monitors play an important role in raising the level of assurance in AI/ML-based autonomous systems by ensuring that the system stays safe within its operating environment. In this tutorial, we present VerifAI, an open-source toolkit for the formal design and analysis of systems that include AI/ML components. VerifAI supports a variety of use cases, including formal modeling of the autonomous system and its environment, automatic falsification of system-level specifications along with other simulation-based verification and testing methods, automated diagnosis of errors, and automatic specification-driven parameter and component synthesis. In particular, we describe the use of VerifAI for generating runtime monitors that capture the safe operational environment of systems with AI/ML components. We illustrate the advantages and applicability of VerifAI in real-life applications using a case study from the domain of autonomous aviation.
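To make the falsification-and-monitoring workflow described in the abstract concrete, the sketch below shows the general idea in plain Python: sample environment parameters, run a closed-loop simulation, and check a system-level specification that can also serve as a runtime monitor. This is a minimal illustrative sketch only; the simulate function, the parameter ranges, and the distance-based safety specification are hypothetical placeholders and do not reflect VerifAI's actual API.

```python
import random

# Hypothetical closed-loop simulation of an autonomous system (e.g., an
# ML-based controller) in a parameterized environment. Returns a trajectory
# of (ego_position, intruder_position) pairs.
def simulate(initial_distance, intruder_speed, steps=100):
    ego, intruder = 0.0, initial_distance
    trajectory = []
    for _ in range(steps):
        ego += 1.0                  # ego advances at unit speed
        intruder -= intruder_speed  # intruder closes in
        trajectory.append((ego, intruder))
    return trajectory

# System-level specification, usable both offline (for falsification) and as
# a runtime monitor: the two agents must never come within 5 units.
def spec_satisfied(trajectory):
    return all(abs(ego - intr) > 5.0 for ego, intr in trajectory)

# Falsification loop: randomly sample environment parameters and record
# counterexamples, i.e., environments in which the specification fails.
counterexamples = []
for _ in range(1000):
    params = {
        "initial_distance": random.uniform(50.0, 200.0),
        "intruder_speed": random.uniform(0.0, 3.0),
    }
    if not spec_satisfied(simulate(**params)):
        counterexamples.append(params)

print(f"Found {len(counterexamples)} unsafe environment configurations")
```

The set of parameter configurations that do (or do not) lead to violations is what a specification-driven runtime monitor can then approximate, flagging at runtime when the system leaves the environment region in which it was shown to operate safely.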