Spatiotemporal fusion of remote sensing images based on multi-level feature compensation
Author: LIU Wenjie, LI Yujia, BAI Menghao, ZHANG Liping, LEI Dajiang

Affiliation: 1. School of Computer, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; 2. Chongqing Key Laboratory of Image Cognition, Chongqing University of Posts and Telecommunications, Chongqing 400065, China

Funding:

Ethical statement:

Abstract:

Large amounts of earth observation data with high spatial and temporal resolution are employed in many earth science applications. Spatiotemporal image fusion provides a feasible and economical way to generate such high-spatiotemporal-resolution data. However, some existing learning-based methods are poor at extracting deep image features and at exploiting the detail features of high-resolution images. A spatiotemporal fusion method for remote sensing images based on multi-level feature compensation is therefore proposed. It uses two branches to perform multi-level feature compensation and adopts a residual module fused with a channel attention mechanism as the basic unit of the network, which extracts and exploits the deep features of the high-resolution input images in finer detail. An edge loss based on the Laplacian operator is also proposed, which avoids the computational cost of pre-training while achieving a good fusion effect. The proposed method is evaluated experimentally on Landsat and Moderate-resolution Imaging Spectroradiometer (MODIS) satellite images collected from two regions in Shandong and Guangdong. Experimental results show that the proposed method achieves higher quality in both visual appearance and objective metrics.
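To make the two named components concrete, below is a minimal PyTorch sketch, not the authors' released code: the squeeze-and-excitation form of the channel attention, the block layout, the reduction ratio, and the 4-neighbour Laplacian kernel are illustrative assumptions consistent with the abstract's description.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention: global average pooling
    followed by a two-layer bottleneck that yields per-channel weights."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        mid = max(channels // reduction, 1)
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),               # squeeze: B x C x 1 x 1
            nn.Conv2d(channels, mid, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, kernel_size=1),
            nn.Sigmoid(),                          # excitation weights in (0, 1)
        )

    def forward(self, x):
        return x * self.fc(x)                      # reweight channels


class CAResBlock(nn.Module):
    """Residual block with channel attention applied to the residual branch."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        self.ca = ChannelAttention(channels)

    def forward(self, x):
        return x + self.ca(self.body(x))           # identity shortcut


def edge_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """L1 distance between Laplacian responses of prediction and target,
    computed band by band with a fixed 4-neighbour Laplacian kernel."""
    c = pred.shape[1]
    kernel = torch.tensor([[0., 1., 0.],
                           [1., -4., 1.],
                           [0., 1., 0.]],
                          device=pred.device, dtype=pred.dtype)
    kernel = kernel.view(1, 1, 3, 3).repeat(c, 1, 1, 1)

    def lap(t):
        return F.conv2d(t, kernel, padding=1, groups=c)  # depthwise filtering

    return F.l1_loss(lap(pred), lap(target))


# Toy usage: a 32-channel feature map through one block, and the edge loss
# between two six-band (Landsat-like) image tensors.
feat = torch.randn(2, 32, 64, 64)
print(CAResBlock(32)(feat).shape)                  # torch.Size([2, 32, 64, 64])

pred = torch.randn(1, 6, 128, 128)
ref = torch.randn(1, 6, 128, 128)
print(edge_loss(pred, ref).item())
```

Because the Laplacian kernel is fixed rather than learned, such an edge loss adds no trainable parameters and requires no pre-trained feature extractor, which is consistent with the abstract's claim that the edge loss saves the computational cost of pre-training.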

Citation:

LIU Wenjie, LI Yujia, BAI Menghao, ZHANG Liping, LEI Dajiang. Spatiotemporal fusion of remote sensing images based on multi-level feature compensation[J]. Journal of Terahertz Science and Electronic Information Technology, 2023, 21(7): 939-951.

History
  • Received: September 30, 2022
  • Revised: November 10, 2022
  • Accepted:
  • Online: July 27, 2023
  • Published: