Adapting Static Analysis Tools to Meet User Expectations

Abstract

Traditionally, static analysis tools for catching program errors and security vulnerabilities were designed as verification tools. Hence, soundness, the criterion of never missing an error, was the primary goal. In practice, however, most users are significantly more concerned about false positives, analysis time, and repairability than about unsoundness; they expect a tool to report fewer than 20% false-positive warnings and to take at most a few minutes to run. Since most static analysis tools are optional to run, users give up on tools that do not meet these expectations. To meet them, a few newer tools have been designed from the ground up to prioritize these three criteria, but each requires a redesign for every static analysis; they cannot reuse existing, mature, soundness-focused tools. The question, then, is: can we adapt existing soundness-focused static analysis tools to meet user expectations?

This thesis shows that the answer is yes by introducing three new tools, CGPruner, QueryMax, and RLFixer, which address the three criteria users care about most: false positives, analysis time, and automated repair suggestions. The central idea underlying each tool is to identify opportunities where a little soundness can be traded for large improvements on these three criteria. The tools are designed as pre-processors and post-processors around a black-box static analysis, and are therefore applicable to many analyses. Our experiments show that they significantly improve the results of several existing soundness-focused static analysis tools on the three critical user criteria.
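The abstract does not spell out an interface, but the pre-/post-processor architecture it describes can be illustrated with a minimal sketch: a hypothetical Python wrapper that shrinks the input handed to an unmodified, soundness-focused analyzer and then filters the warnings it emits. Every name below (run_pipeline, toy_analyzer, drop_low_confidence, the AnalysisWarning record, and the 0.5 confidence threshold) is an illustrative assumption, not the actual interface of CGPruner, QueryMax, or RLFixer.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class AnalysisWarning:
    file: str
    line: int
    message: str
    confidence: float  # score a learned post-processor might attach to each warning


def run_pipeline(
    program_paths: List[str],
    preprocess: Callable[[List[str]], List[str]],
    analyze: Callable[[List[str]], List[AnalysisWarning]],
    postprocess: Callable[[List[AnalysisWarning]], List[AnalysisWarning]],
) -> List[AnalysisWarning]:
    """Wrap an unmodified analyzer: shrink its input, run it as a black box,
    then filter or rank the warnings it produces."""
    reduced_input = preprocess(program_paths)
    raw_warnings = analyze(reduced_input)  # the existing tool, untouched
    return postprocess(raw_warnings)


# Illustrative stand-ins; a real deployment would plug in an existing analyzer here.
def keep_all(paths: List[str]) -> List[str]:
    return paths


def toy_analyzer(paths: List[str]) -> List[AnalysisWarning]:
    return [
        AnalysisWarning(paths[0], 10, "possible null dereference", 0.8),
        AnalysisWarning(paths[0], 42, "possible resource leak", 0.3),
    ]


def drop_low_confidence(
    warnings: List[AnalysisWarning], threshold: float = 0.5
) -> List[AnalysisWarning]:
    return [w for w in warnings if w.confidence >= threshold]


if __name__ == "__main__":
    print(run_pipeline(["Example.java"], keep_all, toy_analyzer, drop_low_confidence))
```

Because the wrapper only touches the analyzer's input and output, the same pattern can, in principle, be applied to many existing soundness-focused tools without modifying them.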
