Colloquium Details

Visualization and Analysis of Very Large Scientific Simulations

Author: Hank Childs, Lawrence Berkeley National Lab
Date: April 3, 2012
Time: 15:30
Location: 220 Deschutes
Host: Andrzej Proskurowski

Abstract

Visualization and analysis are critical to the success of the simulation process; they help realize the value of computing by increasing the rate at which new science is discovered. These techniques are used to confirm that simulations are running correctly, to communicate simulation results to an audience, and, most importantly, to explore data, which is often where new insights are obtained.

As supercomputers grow ever larger, simulations are producing increasingly massive data sets, and visualization and analysis techniques must keep pace with these data sizes. In this presentation, I will describe the most common approach for visualizing massive data: parallelization. I will describe the strategies and complexities of a data-parallel approach, and discuss specifics for particle advection, one of the most challenging algorithms in our field to parallelize and the basis for techniques such as streamlines, Poincaré plots, and Finite-Time Lyapunov Exponents (FTLE). Finally, I will describe several upcoming challenges, including strategies for visualization and analysis in the context of power-constrained high performance computing.
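
To illustrate the computation that underlies these techniques, the sketch below advects a single particle through a steady 2D vector field using fourth-order Runge-Kutta integration; the field, step size, and function names are hypothetical stand-ins for illustration, not code from any tool discussed in the talk. In a data-parallel setting, the difficulty alluded to above is that a particle's path can cross the subdomains owned by different processors, forcing communication and load balancing that a single-node version like this one avoids.

    import numpy as np

    def velocity(p):
        # Hypothetical steady 2D vector field (a simple circulating flow).
        x, y = p
        return np.array([-y, x])

    def advect_particle(seed, dt=0.01, steps=1000):
        # Trace one streamline by integrating the velocity field with RK4.
        path = [np.asarray(seed, dtype=float)]
        for _ in range(steps):
            p = path[-1]
            k1 = velocity(p)
            k2 = velocity(p + 0.5 * dt * k1)
            k3 = velocity(p + 0.5 * dt * k2)
            k4 = velocity(p + dt * k3)
            path.append(p + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4))
        return np.array(path)

    streamline = advect_particle([1.0, 0.0])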

Biography

Hank Childs is a researcher at the intersection of visualization and high performance computing. He received his Ph.D. from the University of California, Davis in 2006 and is currently employed at Lawrence Berkeley National Lab. Hank is best known as the architect of the "VisIt" project, an end-user visualization and analysis tool for large data that is used worldwide, with over 200,000 total downloads. Hank has served as the chief software architect of three different $5M+ multi-institution efforts for the SciDAC program, the XSEDE program, and the Department of Energy's climate program.