With the increasing availability of remote sensing data recorded by multiple sources, comprehensive digitalization of urbanization processes becomes possible, providing quantitative information for smart city development. Most existing approaches to urbanization monitoring, however, rely on individual sensors with limited observational capability. The overarching goal of this joint project is to integrate the complementary information from multisource data for reliable urbanization monitoring. The data sources include high-resolution RGB images, hyperspectral data, nighttime remote sensing images, and LiDAR measurements. Citizen data, as part of big data, provide auxiliary information and high-level semantic features that serve as indicators of human settlements, regional population capacity, and migration patterns; this data type will also be included in the project. Conducting data fusion and information extraction effectively across these multimodal images is a major challenge that the project will address.

The research will focus on three areas: (1) image quality improvement by multisource data fusion; (2) fine-scale 3D urban feature mapping and change detection; and (3) road mapping and light pollution analysis.

School: School of Engineering & IT

Research Area: Imaging | AI for Space

Supervisor