Fakultät Informatik

tum.3D wins IEEE Visualization Contest 2006

We are very proud to announce that our entry to the IEEE 2006 Visualization Contest, held at the world's largest and most renowned scientific conference on visualization, was awarded 1st place by an expert jury.

"California Streaming"

Jens Schneider, Jens Krüger, Kai Bürger, Polina Kondratieva, Rüdiger Westermann


PDF: [840KB]
Videos: DivX AVI [High 262MB][Medium 131MB][Low 64MB]


Download a demo version of the award-winning Particle Engine.
Download the heightfield: [.x ASCII format (130MB)], [OBJ ASCII format (65MB)].
Download the topographic texture: [Windows bmp (1.5MB)].

Contest Entry Visualization System

The task of this contest was to design an effective visualization system for the Terashake 2.1 data. This data is the result of a magnitude 7.7 earthquake simulation performed at the San Diego Supercomputer Center and comprises over 10TB of data; a downsampled version (still 70GB) was provided for the contest. Inspection of the quake maps of the USGS confirmed the common knowledge that earthquakes are very frequent in California. This led us to design the visualization system with short response times in mind, so that in the best case it could visualize measured data between the strong phase and the aftershock of an earthquake, thus providing efficient means to understand the phenomenon and to take preventive counter-measures. This means that the system should not require long preprocessing (in fact, we merely attach a small header to the data and read it), and it should run at interactive frame rates, ideally in real time. Most importantly, the system should provide meaningful insight into the data, including various other layers that give the user positional cues (such as heightfields, topographic maps, or the basins provided with the contest data). Nevertheless, the visualization system should remain affordable in order to enable as many scientists as possible to explore the data in parallel, ruling out expensive special-purpose hardware.

In this contribution we addressed the aforementioned issues, presenting a rapid visualization system that runs at interactive frame rates on commodity PC hardware. To achieve the best possible performance, we modified our GPU-based particle engine, which runs on consumer-class graphics hardware. To deal with large real-world or simulated data sets, we stream huge datasets from hard disk. This allows us to present all of the time steps publicly available at the San Diego Supercomputer Center site. User-friendly and interactive control over these time steps is provided by an interface similar to those used in traditional media players. We achieve high quality by applying well-established flow visualization modes, such as particle tracing, streamlines, and stream ribbons. The necessary integration for these primitives is performed using higher-order numerical schemes in full float precision on the latest NVIDIA graphics hardware. To provide an intuitive visualization, we built on the experience gained with our ClearView focus+context approach and applied it to the visualization of unsteady vector fields. Additional data layers can be visualized together with the time-resolved wave propagation field to give better positional cues. To demonstrate the capabilities of our system, we obtained a heightfield of southern California from the USGS, reconstructed the basins as described in the meta-data description, and obtained an additional topographic texture map. The user can toggle these data layers interactively to co-locate features in the wave propagation data with precise geographic features. ClearView automatically blends the selected layers based on only a few user-selected parameters. To further increase flexibility, a custom-tailored high-level fragment shader can be used for each layer, with full access to the multi-modal dataset as well as the state of the visualization system. For instance, it takes only a few lines of shader code to obtain a color coding of the near-surface layers of the velocity data.
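As a rough illustration of the higher-order integration used for particle tracing and streamlines, the sketch below advects a particle with one classical fourth-order Runge-Kutta step per iteration. The analytic velocity field, step size, and sampling function are illustrative stand-ins, not the Terashake data or the GPU implementation described above:

```python
# Minimal sketch: RK4 particle advection through a velocity field.
# The field here is a simple analytic rotation, standing in for a
# sampled (and, on the GPU, texture-interpolated) simulation field.

def velocity(p):
    x, y = p
    return (-y, x)  # rigid rotation about the origin

def rk4_step(p, h):
    """Advance position p by one step of size h using 4th-order Runge-Kutta."""
    k1 = velocity(p)
    k2 = velocity((p[0] + 0.5 * h * k1[0], p[1] + 0.5 * h * k1[1]))
    k3 = velocity((p[0] + 0.5 * h * k2[0], p[1] + 0.5 * h * k2[1]))
    k4 = velocity((p[0] + h * k3[0], p[1] + h * k3[1]))
    return (p[0] + h / 6.0 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            p[1] + h / 6.0 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

# Trace a particle; for this rotational field the radius should stay ~1,
# which is where the fourth-order accuracy pays off over simple Euler steps.
p = (1.0, 0.0)
for _ in range(1000):
    p = rk4_step(p, 0.01)
radius = (p[0] ** 2 + p[1] ** 2) ** 0.5
```

On the GPU, each particle performs this step in a fragment (or vertex) program, with `velocity` replaced by a texture fetch into the current time step of the vector field.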
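The per-layer color coding mentioned at the end of the paragraph lives in a fragment shader; purely as an illustration of the idea (the function names, the normalization, and the blue-to-red ramp are hypothetical, not taken from our system), the same mapping looks like this in Python:

```python
# Illustrative sketch of a velocity color coding, as a fragment shader would
# compute it per fragment: normalize the velocity magnitude and map it onto
# a simple blue (slow) to red (fast) ramp. All names and the ramp are
# hypothetical stand-ins for the actual shader code.

def magnitude(v):
    return (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5

def color_code(v, v_max):
    """Map a velocity vector to an RGB triple in [0, 1]."""
    t = min(magnitude(v) / v_max, 1.0)   # normalized magnitude, clamped
    return (t, 0.0, 1.0 - t)             # interpolate blue -> red

# Example: a slow and a fast sample from a (made-up) near-surface layer.
slow = color_code((0.1, 0.0, 0.0), v_max=2.0)
fast = color_code((2.0, 0.0, 0.0), v_max=2.0)
```

In the actual system such a mapping has access not only to the velocity layer but to the full multi-modal dataset and the visualization state, so the ramp can be restricted to the near-surface layers only.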

These are the figures that accompanied our entry, but much more insight can be gained from the movie. Please click to enlarge; the original images are approximately 7-12MB each.



Bachelor's and Master's thesis topics are available in the following areas:
- A remote rendering system for point cloud data (in collaboration with industry)

- Deep learning for improved weather forecasting

- Learning trajectory clustering using neural networks
- Learning Level-of-Detail representations for point clouds

- In collaboration with partners from industry, we have a number of thesis topics available in the areas of point-based rendering, geo-localization using public data, and scene fusion from different viewpoints. If you are interested, please contact westermann(at)tum.de


- One PhD position on Turbulence Visualization is available at the Computer Graphics & Visualization group.