Research

Time-of-Flight Range Data

As real-time range cameras (Time-of-Flight or Kinect cameras) become increasingly available on the market, new 3D applications arise, such as scene understanding, augmented reality, and human-computer interaction.

A growing number of modern applications such as position determination, object recognition and collision prevention depend on accurate scene analysis.

An important field of research in computer vision is the 3D analysis and reconstruction of objects and scenes.

Simulation of Time-of-Flight sensors, specifically PMD sensors, with emphasis on dynamic scene handling.
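
As a minimal sketch of the measurement principle such a simulator has to reproduce, the following shows standard four-phase (four-bucket) demodulation, where depth is recovered from the phase shift of the reflected modulation signal. The modulation frequency and the signal model are illustrative assumptions, not parameters of the actual simulator.

    import numpy as np

    # Four-phase demodulation: the sensor correlates the received signal with the
    # emitted modulation at phase offsets of 0, 90, 180 and 270 degrees and
    # recovers depth from the resulting phase shift.
    C = 299_792_458.0      # speed of light [m/s]
    F_MOD = 20e6           # modulation frequency [Hz], illustrative value

    def depth_from_phases(a0, a1, a2, a3):
        """Recover phase, amplitude and depth from the four correlation samples."""
        phase = np.mod(np.arctan2(a1 - a3, a0 - a2), 2 * np.pi)
        amplitude = 0.5 * np.hypot(a1 - a3, a0 - a2)
        depth = C * phase / (4 * np.pi * F_MOD)   # unambiguous range: C / (2 * F_MOD)
        return depth, amplitude

    # A target at 3 m causes a round-trip phase shift of 4*pi*F_MOD*d / C.
    d_true = 3.0
    phi = 4 * np.pi * F_MOD * d_true / C
    samples = [np.cos(phi - k * np.pi / 2) for k in range(4)]
    print(depth_from_phases(*samples)[0])   # ~3.0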

Particle Systems

The main research approaches in computational fluid dynamics (CFD) are grid-based and particle-based methods.

Physically correct particle systems (PS) are designed to add essential dynamic properties to the virtual world. This work introduces a real-time GPU implementation of both the simulation and the rendering of a dynamically growing particle system of up to one million particles.
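
At its core, such a system amounts to a per-particle state update every frame, which the GPU executes for all particles in parallel before rendering the particle buffer directly. A minimal CPU sketch of the update loop (emission rates, forces, and lifetimes are illustrative):

    import numpy as np

    # Minimal per-frame particle update (emission, gravity, aging); the GPU
    # implementation runs the same update for every particle in parallel.
    MAX_PARTICLES = 1_000_000
    GRAVITY = np.array([0.0, -9.81, 0.0])

    class ParticleSystem:
        def __init__(self):
            self.pos = np.zeros((0, 3))
            self.vel = np.zeros((0, 3))
            self.life = np.zeros(0)

        def emit(self, n, origin, lifetime=5.0):
            n = min(n, MAX_PARTICLES - len(self.pos))
            self.pos = np.vstack([self.pos, np.tile(origin, (n, 1))])
            self.vel = np.vstack([self.vel, np.random.randn(n, 3)])
            self.life = np.concatenate([self.life, np.full(n, lifetime)])

        def step(self, dt):
            self.vel += GRAVITY * dt
            self.pos += self.vel * dt
            self.life -= dt
            alive = self.life > 0.0               # drop expired particles
            self.pos, self.vel, self.life = self.pos[alive], self.vel[alive], self.life[alive]

    ps = ParticleSystem()
    for _ in range(120):                          # grow the system over two seconds
        ps.emit(5_000, origin=[0.0, 1.0, 0.0])
        ps.step(1.0 / 60.0)
    print(len(ps.pos), "particles alive")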

Efficient SPH-based simulation of convective-diffusive flows and surfactant dynamics, and interactive on-the-fly rendering of the resulting particle data.
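
The central SPH operation is to evaluate field quantities as kernel-weighted sums over neighbouring particles. A brute-force sketch of the density estimate with the commonly used poly6 kernel (a real solver would use a spatial grid for the neighbour search, and the same pattern extends to the convective-diffusive and surfactant quantities):

    import numpy as np

    # SPH density estimate: sum kernel-weighted mass contributions of neighbours.
    def poly6(r2, h):
        """Poly6 smoothing kernel, evaluated on squared distances r2 within radius h."""
        coeff = 315.0 / (64.0 * np.pi * h**9)
        w = np.zeros_like(r2)
        inside = r2 < h * h
        w[inside] = coeff * (h * h - r2[inside]) ** 3
        return w

    def sph_density(positions, mass, h):
        rho = np.zeros(len(positions))
        for i in range(len(positions)):
            d = positions - positions[i]
            r2 = np.einsum('ij,ij->i', d, d)      # squared distances to all particles
            rho[i] = np.sum(mass * poly6(r2, h))
        return rho

    pts = np.random.rand(500, 3) * 0.5            # particles in a 0.5 m box
    print(sph_density(pts, mass=0.02, h=0.1)[:5])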

Level sets are used to represent and evolve closed surfaces. The particle level set (PLS) method combines the advantages of grid-based level set approaches and particle-based methods by exchanging information between the grid and the particles.
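
A sketch of the grid side of such a scheme: a semi-Lagrangian advection step for a signed-distance field, which is the step the PLS marker particles are later used to correct near the interface. Grid resolution, velocity field, and the use of scipy interpolation are illustrative choices.

    import numpy as np
    from scipy.ndimage import map_coordinates

    # Semi-Lagrangian level set advection: trace each grid point backwards along
    # the velocity field and interpolate the signed-distance field there.
    def advect_level_set(phi, vx, vy, dt):
        ny, nx = phi.shape
        yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
        src_x = xx - dt * vx                      # backtraced sample positions
        src_y = yy - dt * vy
        return map_coordinates(phi, [src_y, src_x], order=1, mode='nearest')

    # Example: a circle of radius 20 advected by a constant flow to the right.
    ny, nx = 128, 128
    yy, xx = np.mgrid[0:ny, 0:nx]
    phi = np.hypot(xx - 40, yy - 64) - 20.0       # signed distance to the circle
    vx = np.full((ny, nx), 2.0)                   # 2 cells per time unit
    vy = np.zeros((ny, nx))
    for _ in range(10):
        phi = advect_level_set(phi, vx, vy, dt=1.0)
    print("interface centre after advection:", np.unravel_index(np.argmin(phi), phi.shape))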

Flow visualizations help, for example, climate researchers explore flow characteristics. In this work, the Particle Level Set (PLS) method is adapted to volumetric dye advection via streak volumes and to visualization by time surfaces and path volumes. In addition, a complete GPU-based particle engine and framework for the real-time visualization of unsteady climate flow data sets is presented.

Scientific Visualization and Medical Data

GPU-based visualization of unsteady mass and concentration flows using stream, path, and streak lines.
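
The three line types differ only in how the time-dependent velocity field is sampled during integration: stream lines freeze the field at one instant, path lines follow a single particle through time, and streak lines connect all particles released from a fixed seed point. A small sketch with an illustrative analytic field and simple Euler integration:

    import numpy as np

    def v(x, t):
        # illustrative unsteady 2D field: a rotation whose speed oscillates in time
        return (1.0 + 0.5 * np.sin(t)) * np.array([-x[1], x[0]])

    def integrate(x0, t0, dt, steps, frozen_time=None):
        """Euler integration; if frozen_time is set, the field is sampled at that fixed time."""
        x, t, pts = np.array(x0, float), t0, []
        for _ in range(steps):
            ts = frozen_time if frozen_time is not None else t
            x = x + dt * v(x, ts)
            t += dt
            pts.append(x.copy())
        return np.array(pts)

    seed, dt, steps = [1.0, 0.0], 0.01, 400
    t_end = steps * dt
    streamline = integrate(seed, t0=0.0, dt=dt, steps=steps, frozen_time=0.0)
    pathline   = integrate(seed, t0=0.0, dt=dt, steps=steps)
    # streak line: particles released from the seed at successive times, all
    # advected up to the common end time t_end
    release_times = np.linspace(0.0, t_end - 10 * dt, 40)
    streakline = np.array([
        integrate(seed, t0=tr, dt=dt, steps=int(round((t_end - tr) / dt)))[-1]
        for tr in release_times
    ])
    print(streamline[-1], pathline[-1], streakline[-1])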

Rendering of volume data for medical applications, and related techniques.

Top view of a vertebra

One of the routine examination procedures for patients with back pain is magnetic resonance imaging (MRI).

Volume Rendering and related techniques.
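
The core of direct volume rendering is ray marching with a transfer function and front-to-back compositing. A sketch for a single viewing ray; the volume, transfer function, and step size below are placeholders:

    import numpy as np

    # Direct volume rendering along one ray: sample the scalar volume, map samples
    # to colour and opacity, and composite front to back with early termination.
    def transfer_function(s):
        colour = np.array([s, s, s])                       # greyscale, illustrative
        alpha = np.clip((s - 0.3) * 2.0, 0.0, 1.0) * 0.1
        return colour, alpha

    def ray_march(volume, origin, direction, step=0.5, max_steps=512):
        c_acc, a_acc = np.zeros(3), 0.0
        pos = np.array(origin, float)
        d = np.array(direction, float)
        d /= np.linalg.norm(d)
        for _ in range(max_steps):
            idx = np.round(pos).astype(int)                # nearest-neighbour sampling
            if np.any(idx < 0) or np.any(idx >= volume.shape):
                break
            colour, alpha = transfer_function(volume[tuple(idx)])
            c_acc += (1.0 - a_acc) * alpha * colour        # front-to-back compositing
            a_acc += (1.0 - a_acc) * alpha
            if a_acc > 0.99:                               # early ray termination
                break
            pos += step * d
        return c_acc, a_acc

    vol = np.random.rand(64, 64, 64)                       # placeholder scalar volume
    print(ray_march(vol, origin=[0, 32, 32], direction=[1, 0, 0]))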

Computer Vision and Sensor Data Analysis

The DFG project "Interactive Analysis and Visualization of Multi-modal Confocal Raman Microscopy Data" involves the analysis and visualization of the measurement data obtained from our multispectral confocal Raman microscopy setup.

Multi- and hyperspectral imaging allows the acquisition of image series over a specified wavelength range.

A SAR image overlaid on other remote sensing data.

Interactive visualization and analysis of Synthetic Aperture Radar (SAR) images.

Fast separation of direct and global scene components is subject to a loss of resolution. We therefore developed a constrained upscaling technique to perform high-resolution acquisitions of time-critical scenes.
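
The underlying per-pixel separation uses a stack of images captured under shifted high-frequency illumination patterns: with roughly half of the pixels lit, the maximum over the stack approximates the direct component plus half of the global component, and the minimum approximates half of the global component. A sketch of this step, with a placeholder image stack:

    import numpy as np

    # Per-pixel direct/global separation from images under shifted high-frequency
    # patterns: L_max ~ L_direct + L_global / 2, L_min ~ L_global / 2, hence
    # L_direct = L_max - L_min and L_global = 2 * L_min.
    def separate_direct_global(image_stack):
        """image_stack: array of shape (num_patterns, height, width)."""
        l_max = image_stack.max(axis=0)
        l_min = image_stack.min(axis=0)
        return l_max - l_min, 2.0 * l_min

    stack = np.random.rand(25, 120, 160)          # placeholder captures
    direct, global_comp = separate_direct_global(stack)
    print(direct.shape, global_comp.shape)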

Inverse rendering refers to the estimation of intrinsic scene characteristics (depth, shape, surface orientation, incident light, reflectance, ...) from a single photo or a set of photos of the scene.

Processing THz data makes it possible to determine object properties in a frequency range that is not yet well explored; current imaging technologies are not usable in this part of the spectrum. For example, security screening and material detection can be improved.

Scene flow is the motion of every single point in three-dimensional space (typically captured as RGB-D data). We focus on real-time, high-accuracy algorithms for RGB-D data.
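
As a sketch of what scene flow means for RGB-D data: back-project a pixel of both frames into 3D using the camera intrinsics and take the displacement, here assuming a dense 2D correspondence (optical flow) is already given. The intrinsics and inputs below are illustrative:

    import numpy as np

    # Scene flow of a pixel = 3D displacement of its back-projected point between
    # two RGB-D frames, given per-pixel depth and a dense 2D optical flow.
    FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5    # illustrative pinhole intrinsics

    def backproject(u, v, z):
        """Pixel (u, v) with depth z -> 3D point in camera coordinates."""
        return np.array([(u - CX) * z / FX, (v - CY) * z / FY, z])

    def scene_flow(u, v, depth0, depth1, flow):
        du, dv = flow[v, u]
        u1, v1 = int(round(u + du)), int(round(v + dv))
        p0 = backproject(u, v, depth0[v, u])
        p1 = backproject(u1, v1, depth1[v1, u1])
        return p1 - p0

    h, w = 480, 640
    depth0 = np.full((h, w), 2.0)                  # flat scene at 2 m
    depth1 = np.full((h, w), 1.9)                  # moved 10 cm towards the camera
    flow = np.zeros((h, w, 2))                     # no lateral image motion
    print(scene_flow(320, 240, depth0, depth1, flow))   # roughly [0, 0, -0.1]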

Computer Graphics

Real-time Rendering and Collision Handling of grass fields

Light field rendering (part 1) and relighting (part 2).

Progressive online integration of uncalibrated image sequences with substantial geometric and/or photometric discrepancies into a single, geometrically and photometrically consistent image.

NASA Video Frame 000

Using depth maps, stereoscopic 3D video can be synthesized from conventional 2D video, and viewing conditions for 3D video can be improved by on-the-fly retargeting to different display geometries.
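
The synthesis step can be sketched as depth-image-based rendering: each pixel is shifted horizontally by a disparity derived from its depth to produce the second view of the stereo pair. Baseline and focal length below are illustrative, and hole filling as well as proper occlusion handling are omitted:

    import numpy as np

    # Depth-image-based rendering: warp each pixel by its depth-derived disparity
    # to synthesize the right view from an image plus depth map.
    def synthesize_right_view(image, depth, baseline=0.06, focal=700.0):
        h, w = depth.shape
        right = np.zeros_like(image)
        disparity = focal * baseline / np.maximum(depth, 1e-6)   # in pixels
        for y in range(h):
            for x in range(w):
                xr = int(round(x - disparity[y, x]))
                if 0 <= xr < w:
                    # later writes overwrite earlier ones; proper occlusion
                    # handling would keep the nearer pixel instead
                    right[y, xr] = image[y, x]
        return right

    img = np.random.rand(120, 160, 3)
    dep = np.full((120, 160), 3.0)                 # placeholder: flat scene at 3 m
    print(synthesize_right_view(img, dep).shape)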

The well-known NASA Blue Marble Next Generation (BMNG) data set. You are looking at the University of Siegen.

Accurate terrain rendering for dynamically changing terrain data from remote sensing systems.

The Virtual Reality lab at the University of Siegen.

Out-of-core voxelization on GPU to create sparse voxel octrees from triangle meshes.
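
A CPU sketch of the basic voxelization step, marking every voxel touched by a triangle via dense barycentric sampling; the actual pipeline performs this on the GPU, streams the mesh out of core, and assembles a sparse voxel octree from the occupied voxels. Sampling density and data layout are illustrative:

    import numpy as np

    # Mark voxels covered by each triangle by sampling the triangle surface
    # densely (about half a voxel apart) with barycentric coordinates.
    def voxelize(triangles, voxel_size):
        occupied = set()
        for a, b, c in triangles:
            n = int(np.ceil(max(np.linalg.norm(b - a), np.linalg.norm(c - a))
                            / (0.5 * voxel_size))) + 1
            for i in range(n + 1):
                for j in range(n + 1 - i):
                    u, v = i / n, j / n
                    p = (1.0 - u - v) * a + u * b + v * c
                    occupied.add(tuple(np.floor(p / voxel_size).astype(int)))
        return occupied

    tri = [(np.array([0.0, 0.0, 0.0]),
            np.array([1.0, 0.0, 0.0]),
            np.array([0.0, 1.0, 0.0]))]
    print(len(voxelize(tri, voxel_size=0.1)), "occupied voxels")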