Edge-aware guidance saliency detection based on multi-modal remote sensing images
Author: 连远锋, 石旭, 江澄
Affiliation:

1. Department of Computer Science and Technology; 2. Beijing Key Laboratory of Petroleum Data Mining, China University of Petroleum, Beijing 102249, China; 3. Beijing Institute of Space Mechanics & Electricity, Beijing 100094, China


    Abstract:

    To address the poor robustness and low detection accuracy of multi-modal remote sensing image saliency detection, this paper proposes a novel and efficient Multi-modal Edge-aware Guidance Network (MEGNet), which mainly consists of a saliency detection backbone network for multi-modal remote sensing images, a cross-modal feature sharing module, and an edge-aware guidance network. First, a Cross-modal Feature Sharing Module (CFSM) is used during feature extraction from remote sensing image pairs, encouraging the different modalities to complement each other and suppressing the influence of defective features from either modality. Second, based on the Edge-Aware Guidance Network (EAGN), the effectiveness of the edge features is verified through an edge map supervision module, so that the final saliency detection map has clear boundaries. Finally, experiments are carried out on three salient object detection datasets of remote sensing images. The average Fβ, Mean Absolute Error (MAE) and Sm scores are 0.9176, 0.0095 and 0.9199, respectively. The experimental results show that the proposed MEGNet is suitable for saliency detection in multi-modal scenes.
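    The evaluation metrics reported above can be illustrated with a minimal sketch. The β² = 0.3 weighting and the fixed binarization threshold below are common salient-object-detection benchmark conventions, assumed here rather than taken from the paper; Sm (the structure measure) is omitted for brevity:

```python
import numpy as np

def mae(pred, gt):
    """Mean Absolute Error between a predicted saliency map and the
    ground-truth mask, both assumed normalized to [0, 1]."""
    return np.mean(np.abs(pred - gt))

def f_beta(pred, gt, beta2=0.3, threshold=0.5):
    """F-measure for salient object detection. beta2 = 0.3 and a fixed
    binarization threshold are assumed benchmark conventions."""
    binary = pred >= threshold          # binarize the prediction
    gt_bin = gt >= 0.5                  # binarize the ground truth
    tp = np.logical_and(binary, gt_bin).sum()
    precision = tp / (binary.sum() + 1e-8)
    recall = tp / (gt_bin.sum() + 1e-8)
    return (1 + beta2) * precision * recall / (beta2 * precision + recall + 1e-8)

# Toy 2x2 saliency map and ground-truth mask
pred = np.array([[0.9, 0.1], [0.8, 0.2]])
gt = np.array([[1.0, 0.0], [1.0, 0.0]])
print(mae(pred, gt))                       # 0.15
print(round(float(f_beta(pred, gt)), 4))   # 1.0
```

    Lower MAE and higher Fβ/Sm indicate better agreement with the ground truth, which is why the reported MAE of 0.0095 alongside Fβ of 0.9176 indicates strong performance.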

Citation:

连远锋, 石旭, 江澄. Edge-aware guidance saliency detection based on multi-modal remote sensing images[J]. Journal of Terahertz Science and Electronic Information Technology, 2023, 21(3): 360-370.

History
  • Received: November 01, 2022
  • Revised: December 26, 2022
  • Online: March 31, 2023