Multi-Sensor Data Fusion for Land Cover Mapping

Project Leader
Dr. Pedram Ghamisi

Host institute
Prof. Dr. Xiaoxiang Zhu
Remote Sensing Technology Institute, German Aerospace Center
Signal Processing in Earth Observation, Technische Universität München

Cooperation Partners
Prof. Dr. Bernhard Höfle, Ruprecht-Karls-Universität Heidelberg

Multi-Sensor Data Fusion

The increasing availability of data acquired over the same scene by different satellite and airborne sensors makes it desirable to jointly exploit multiple data sources for improved information extraction, hazard monitoring, and land cover/land use mapping. In this context, hyperspectral sensors provide detailed spectral information that can be used to discriminate different classes of interest, but they do not capture structural and elevation information. LiDAR data, on the other hand, characterize the size, structure, and elevation of objects, but cannot model the spectral characteristics of different materials. The main objective of this project is to develop efficient approaches for the integration of LiDAR and hyperspectral data.

Multi-sensor image classification

Figure 1. A schematic drawing of multi-sensor image classification using extinction profiles and kernel PCA.
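The pipeline outlined in Figure 1 can be sketched roughly as follows: per-pixel features from the two sources are stacked, kernel PCA fuses them into a compact nonlinear representation, and a supervised classifier maps the fused features to land cover classes. This is only an illustrative sketch, not the project's actual implementation: the synthetic arrays stand in for real hyperspectral bands and LiDAR-derived features, the extinction-profile extraction step is omitted, and all parameter choices (number of components, RBF `gamma`, the random forest classifier) are assumptions for the example.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for real inputs: per-pixel hyperspectral spectra
# and LiDAR-derived structure/elevation features (hypothetical shapes).
n_pixels = 200
hyperspectral = rng.normal(size=(n_pixels, 50))  # 50 spectral bands
lidar = rng.normal(size=(n_pixels, 4))           # 4 elevation/structure features
labels = rng.integers(0, 3, size=n_pixels)       # 3 land-cover classes

# Step 1: stack the two sources into one feature vector per pixel.
# (In the actual pipeline, extinction profiles would be computed first.)
stacked = np.hstack([hyperspectral, lidar])

# Step 2: kernel PCA performs nonlinear fusion / dimensionality reduction.
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=0.05)
fused = kpca.fit_transform(stacked)

# Step 3: a supervised classifier assigns each pixel a class label.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(fused, labels)
pred = clf.predict(fused)
```

In practice the classifier would be trained on labeled ground-truth pixels and evaluated on a held-out set; the fit-and-predict on the same data above is only to show the data flow.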