Understanding 3D Sensors and Measurement Systems

99.9% of all optical 3D sensors for rough surfaces fall into four main sensing principles¹:

  • triangulation (e.g. stereo vision, structured-light approaches; see the sketch below)
  • focus-based sensing (e.g. confocal microscopy and depth-from-defocus)
  • short-coherence interferometry and holography (e.g. white-light interferometry (WLI), optical coherence tomography (OCT))
  • time-of-flight sensing (e.g. Lidar)

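As a simple illustration of the triangulation principle, the following sketch computes depth from disparity for a rectified stereo pair. The focal length, baseline, and disparity values are assumed for illustration only and do not correspond to any particular sensor.

```python
# Minimal sketch: depth from stereo triangulation (rectified cameras).
# Focal length, baseline, and disparities are illustrative values,
# not parameters of any specific sensor.

import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Return depth in metres for a rectified stereo pair.

    depth = focal_length * baseline / disparity
    A disparity of zero (point at infinity / no match) maps to inf.
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return focal_length_px * baseline_m / disparity_px

if __name__ == "__main__":
    f_px = 1200.0    # assumed focal length in pixels
    b_m = 0.10       # assumed baseline of 10 cm
    disparities = np.array([60.0, 30.0, 12.0])            # pixels
    print(depth_from_disparity(disparities, f_px, b_m))   # -> [ 2.  4. 10.] m
```
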
Within the group, we are working towards an improved understanding of these sensing principles with respect to real-world applications. Simulation and optimization of the sensors and their evaluation algorithms, as well as practical experimentation, are the key methods we apply.
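
As one example of what such a simulation can look like, the sketch below generates an idealized white-light interferometry correlogram: a Gaussian coherence envelope modulating a fringe pattern over the scan position. The centre wavelength, coherence length, contrast, and noise level are assumed values chosen purely for illustration, not a reproduction of the signals shown in Fig. 1.

```python
# Minimal sketch: idealized white-light interferometry (WLI) correlogram.
# Centre wavelength, coherence length, contrast, and noise level are
# assumed values for illustration, not parameters of a real instrument.

import numpy as np

def wli_correlogram(z, z0=0.0, wavelength=0.6e-6, coherence_length=1.5e-6,
                    contrast=0.8, noise_std=0.01, rng=None):
    """Simulated intensity over scan position z (metres).

    Gaussian coherence envelope centred at the surface height z0,
    modulating fringes with period wavelength / 2.
    """
    rng = np.random.default_rng() if rng is None else rng
    envelope = np.exp(-((z - z0) / coherence_length) ** 2)
    fringes = np.cos(4.0 * np.pi * (z - z0) / wavelength)
    signal = 1.0 + contrast * envelope * fringes
    return signal + rng.normal(0.0, noise_std, size=z.shape)

if __name__ == "__main__":
    z = np.linspace(-5e-6, 5e-6, 2001)    # 10 µm scan range
    sig = wli_correlogram(z, z0=0.5e-6)
    # crude height estimate: scan position of the strongest fringe peak
    print(f"estimated surface height ≈ {z[np.argmax(sig)] * 1e6:.2f} µm")
```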

In particular, we are interested in:

  • simulation and optimization of 3D sensors
  • characterization and comparison of different sensing methods
  • real-world applications
  • expert systems for the planning of sensors and their configurations
  • image processing
Fig. 1: Simulation of different measurement signals

¹ The classification into these four fields is partly arbitrary; one might, for example, argue that interferometry and holography are also time-of-flight-based techniques, or that confocal microscopy is a special case of triangulation.
