Oral Comprehensive Exam Details
Toolkits for the Automatic Construction of Data Flow Analyzers
Date: June 7, 2004
Committee: Michal Young (Chair)
Data flow analysis is a form of static program analysis that determines, for each instruction in a program, a property that is approximate but guaranteed to hold at that instruction at runtime. The technique is characterized by a nondeterministic model of a program's control flow and by a lattice-theoretic encoding of the data flow problem itself. In addition to enabling formal guarantees of crucial properties, this abstraction over the key elements of the analysis facilitates the development of generic solution algorithms.
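As an illustration (my own sketch, not taken from any particular toolkit), the generic solution algorithm mentioned above is typically an iterative worklist fixpoint computation. Here the lattice elements are sets of variable names with union as the join, applied to a hypothetical three-instruction program and the classic liveness problem; all names and the example program are invented for illustration.

```python
# Hypothetical three-instruction program, one instruction per node:
#   0: a = input()    (defines a)
#   1: b = a + 1      (uses a, defines b)
#   2: print(b)       (uses b)
succ = {0: [1], 1: [2], 2: []}          # control flow successors
defs = {0: {"a"}, 1: {"b"}, 2: set()}   # variables written per node
uses = {0: set(), 1: {"a"}, 2: {"b"}}   # variables read per node

def liveness(succ, defs, uses):
    """Backward data flow: iterate transfer functions to a fixpoint.

    Lattice: sets of variable names ordered by inclusion; join = union.
    A variable is live at a node if some later instruction may read it
    before it is rewritten.
    """
    live_in = {n: set() for n in succ}
    worklist = list(succ)
    while worklist:
        n = worklist.pop()
        # Join over successors (union of their live-in sets) ...
        live_out = set().union(*(live_in[s] for s in succ[n]))
        # ... then apply the node's transfer function.
        new_in = uses[n] | (live_out - defs[n])
        if new_in != live_in[n]:
            live_in[n] = new_in
            # Re-examine predecessors whose input may have changed.
            worklist.extend(p for p in succ if n in succ[p])
    return live_in

print(liveness(succ, defs, uses))
```

Termination follows from the framework's guarantees: the lattice has finite height and the transfer functions are monotone, so each `live_in` set can only grow a bounded number of times.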
Just as parsing theory led naturally to parser generators, a unifying foundation for data flow analysis makes analyzer construction tools an obvious application and suggests a straightforward approach to their development. The lattice structures of the framework guide the construction of data structures in the toolkit; the solution algorithm, properly instantiated, becomes the heart of the generated analyzer; and framework abstractions become user contracts, constrained by simple interfaces.
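One way such a user contract might look (a hypothetical sketch of my own, not the interface of any actual toolkit): the client supplies only the lattice operations and the transfer function, and the toolkit supplies the generic solver. The `GenKill` class below shows how a classic gen/kill bit-vector problem would instantiate that contract.

```python
from abc import ABC, abstractmethod

class DataFlowProblem(ABC):
    """Hypothetical user contract between a client analysis and a toolkit."""

    @abstractmethod
    def bottom(self):
        """Least lattice element: the initial value at each node."""

    @abstractmethod
    def join(self, a, b):
        """Least upper bound: merges facts at control flow joins."""

    @abstractmethod
    def transfer(self, node, value):
        """Effect of one instruction on a lattice element."""

class GenKill(DataFlowProblem):
    """A gen/kill problem (e.g., reaching definitions) as one instance."""

    def __init__(self, gen, kill):
        self.gen, self.kill = gen, kill   # per-node gen and kill sets

    def bottom(self):
        return frozenset()

    def join(self, a, b):
        return a | b                      # union: a "may" analysis

    def transfer(self, node, value):
        return (value - self.kill[node]) | self.gen[node]
```

Because the solver depends only on the abstract interface, one generated analyzer skeleton serves every problem the client can express this way, which is precisely the economy the toolkit approach aims for.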
In this talk, I will survey developments in the state of the art of data flow analysis construction tools, from those designed for integration with production compilers to those designed as pure research systems. The style of presentation will be conceptual rather than historical, focusing on the kinds of problems that face the designers of such tools.