Isomorphism, Normalizing Flows, and Density Estimation: Preserving Relationships Between Data
Steven Walton
Committee: Humphrey Shi (chair), Daniel Lown, Thien Nguyen
Area Exam (Jul 2023)
Keywords: Normalizing Flow, isomorphism, machine learning, AI

Normalizing Flows are a powerful class of generative model that transform an intractable data distribution into a more tractable one through the use of bijective functions. The concept is simple to understand, yet it yields powerful models that allow one to work with distributions far simpler than that of the underlying data. In essence, our motivation in studying Normalizing Flows is to preserve relationships between data. Normalizing Flows have a wide variety of uses and applications, providing important statistical information about data as well as enabling more interpretable control over latent structures. They can be used for mathematical applications, such as variational inference, density estimation, anomaly detection, and manifold analysis, as well as for more application-focused work, such as pose estimation, speech generation, image generation, and more. An important property of Normalizing Flows is that they preserve the latent structure of the data they are trained on, which gives them the power to perform the aforementioned tasks. In this review we provide an introduction to Normalizing Flows, clarify how they differ from other popular generative models, give an updated overview of the current literature, discuss their applications, and consider the future of these models and how they can play a critical role in AI research. We also aim to distinguish between different generative models more clearly and precisely than many other works. We intend this work to serve as a self-contained introduction to Normalizing Flows, requiring little to no background.
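The bijective transformation described above is governed by the standard change-of-variables formula, which is the foundation of Normalizing Flows (this statement of it is a general fact, not a result specific to this review): if a bijection $f$ maps data $x$ to a latent variable $z = f(x)$ with a simple base density $p_Z$, the data density can be evaluated exactly as

```latex
% Change-of-variables formula underlying Normalizing Flows.
% f is a bijection mapping data x to latent z = f(x);
% p_Z is a simple base density (e.g., a standard Gaussian).
p_X(x) = p_Z\bigl(f(x)\bigr)\,
         \left\lvert \det \frac{\partial f(x)}{\partial x} \right\rvert
```

The Jacobian determinant accounts for how $f$ locally stretches or compresses volume, which is what makes exact density estimation possible with these models.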