Finding and Mitigating Memory Corruption Errors in Systems Software
Building secure systems software is notoriously hard. One reason is the continued use of unsafe languages, chosen for their efficiency, direct control over hardware resources, and developer familiarity. However, this fine-grained control creates opportunities for mistakes and therefore invites bugs such as memory corruption errors. In the absence of an adequate defense, these bugs can be readily exploited by attackers. This fact, combined with the assumed inevitability of exploitable bugs in unsafe languages, has spurred a large body of research, with code-reuse attacks and defenses against such attacks being the most prominent line of work.
In this dissertation, we apply the principles of code reuse (which usually targets static or JIT-compiled code) in a dynamic context, sidestepping all existing defenses. Concretely, we demonstrate state-of-the-art, whole-function code reuse by abusing the dynamic dispatch mechanism found in languages such as Objective-C. We also devise a defense scheme and apply it to the Objective-C runtime. The result is a low-overhead, drop-in replacement for the Objective-C runtime that prevents our attack as well as other metadata-corruption attacks.
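To make the attack surface concrete, the following C sketch models the essence of Objective-C-style dynamic dispatch: method calls resolve a selector in mutable runtime metadata and jump through the stored function pointer. All names here (method_entry, msg_send, gadget) are illustrative, not the real Objective-C runtime API; the point is only that an attacker who corrupts the metadata redirects every call site to a whole function of their choosing.

```c
#include <string.h>
#include <stddef.h>

typedef long (*imp_t)(void);

/* Selector-to-implementation mapping stored in writable memory,
 * loosely analogous to a class's method metadata. */
struct method_entry {
    const char *selector;   /* method name, looked up at call time */
    imp_t imp;              /* function pointer actually invoked */
};

static long hello(void)  { return 1; }
static long gadget(void) { return 42; }  /* stands in for attacker-chosen code */

static struct method_entry table[] = {
    { "hello", hello },
};

/* Dispatch: resolve the selector, then call through the stored pointer.
 * Overwriting table[i].imp (a metadata-corruption attack) hijacks every
 * call that uses this selector, without touching any return address. */
static long msg_send(const char *selector) {
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(table[i].selector, selector) == 0)
            return table[i].imp();
    return -1;  /* unknown selector */
}
```

A defense like the one described above must ensure this metadata cannot be silently rewritten, e.g., by keeping it read-only or authenticated at dispatch time.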
The assumed inevitability of exploitable bugs and the stringent performance constraints of systems software have steered much of the previous research on systems security towards mitigating the exploitation phase. Sanitization and fuzzing are industry trends that instead try to weed out the bugs themselves, i.e., they tackle the cause rather than mitigating the consequences. In this dissertation, we present a novel technique that increases the efficiency of sanitizers and extends their applicability via run-time partitioning. We significantly reduce the overhead of two popular compiler-based sanitizers, extending their usage scenarios and increasing fuzzing throughput. Together with other recent work, our research challenges the assumption that bugs in systems software are inevitable.
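For readers unfamiliar with sanitizers, the sketch below shows, in deliberately simplified form, what a compiler-based sanitizer conceptually does: it instruments memory accesses with a validity check so that an out-of-bounds write is reported instead of silently corrupting memory. This is a hand-written illustration under assumed names (checked_buf, checked_store), not the actual instrumentation emitted by any real sanitizer; real tools also track redzones, use-after-free state, and more, which is where their run-time overhead comes from.

```c
#include <stdint.h>
#include <stddef.h>

/* A buffer paired with its bounds, standing in for the shadow metadata
 * a sanitizer maintains for every allocation. */
struct checked_buf {
    uint8_t *data;
    size_t len;
};

/* What instrumented code conceptually does before a store: verify the
 * access is in bounds; on violation, report (here: return 0) instead of
 * letting the corruption happen. Returns 1 on a valid store. */
static int checked_store(struct checked_buf *b, size_t idx, uint8_t v) {
    if (idx >= b->len)
        return 0;   /* the bug a sanitizer surfaces during fuzzing */
    b->data[idx] = v;
    return 1;
}
```

Because every access pays for such a check, reducing this per-access cost, for instance by partitioning at run time which code runs sanitized, directly translates into higher fuzzing throughput.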