MDMASNet: A Dual-task Interactive Semi-supervised Remote Sensing Image Segmentation Method
Abstract
Remote sensing image (RSI) segmentation is widely used in urban planning, natural disaster detection, and many other fields. Compared with natural scene images, RSIs have higher resolution, more complex imaging conditions, and more diverse object shapes and sizes, while deep-learning-based semantic segmentation methods typically require large amounts of labeled data. In this paper, we propose a semi-supervised RSI segmentation network with a multi-scale deformable threshold feature extraction module and mixed attention (MDMASNet). First, a pyramid ensemble structure incorporating deformable convolution and dilated (hole) convolution is used to extract features of objects with different shapes and sizes and to reduce the influence of redundant features. Meanwhile, a mixed attention (MA) module is proposed to aggregate long-range contextual relationships and fuse low-level features with high-level features. Second, an FCN-based fully convolutional discriminator network is designed as an auxiliary task to help evaluate the reliability of predictions on unlabeled images. We performed experimental validation on three datasets, and the results show that MDMASNet achieves greater improvement in segmentation accuracy and better generalization than existing segmentation networks.
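To make the two ideas summarized above more concrete, the following PyTorch snippet is a minimal illustrative sketch, not the authors' implementation: a pyramid block that combines dilated (hole) and deformable convolutions for multi-scale feature extraction, and a small fully convolutional discriminator that outputs a per-pixel confidence map for a predicted segmentation. All class names, channel counts, and layer choices are assumptions made for illustration.

```python
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class MultiScalePyramid(nn.Module):
    """Parallel dilated (hole) and deformable 3x3 branches fused by a 1x1 conv.
    A rough sketch of a pyramid-style multi-scale feature extractor."""

    def __init__(self, in_ch, out_ch, dilations=(1, 2, 4)):
        super().__init__()
        # Dilated branches capture context at several receptive-field sizes.
        self.dilated = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, 3, padding=d, dilation=d) for d in dilations
        )
        # Two offsets per kernel tap let the deformable branch adapt to irregular shapes.
        self.offset = nn.Conv2d(in_ch, 2 * 3 * 3, 3, padding=1)
        self.deform = DeformConv2d(in_ch, out_ch, 3, padding=1)
        self.fuse = nn.Conv2d(out_ch * (len(dilations) + 1), out_ch, 1)

    def forward(self, x):
        feats = [conv(x) for conv in self.dilated]
        feats.append(self.deform(x, self.offset(x)))
        return self.fuse(torch.cat(feats, dim=1))


class FCNDiscriminator(nn.Module):
    """Fully convolutional discriminator that scores a predicted segmentation map
    with a low-resolution, per-location confidence map."""

    def __init__(self, num_classes, base_ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(num_classes, base_ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(base_ch, base_ch * 2, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(base_ch * 2, 1, 4, stride=2, padding=1),
        )

    def forward(self, seg_probs):
        return self.net(seg_probs)


if __name__ == "__main__":
    feats = MultiScalePyramid(64, 64)(torch.randn(1, 64, 128, 128))
    conf = FCNDiscriminator(num_classes=6)(torch.softmax(torch.randn(1, 6, 256, 256), dim=1))
    print(feats.shape, conf.shape)  # (1, 64, 128, 128) and (1, 1, 32, 32)
```

In a semi-supervised setup of this kind, the discriminator's confidence map can be used to weight or select regions of unlabeled-image predictions that are trusted enough to serve as pseudo-labels; the exact interaction between the two tasks in MDMASNet is described in the full paper.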
Recommended Citation
L. Zhang et al., "MDMASNet: A Dual-task Interactive Semi-supervised Remote Sensing Image Segmentation Method," Signal Processing, vol. 212, article no. 109152, Elsevier; European Association for Signal Processing, Nov 2023.
The definitive version is available at https://doi.org/10.1016/j.sigpro.2023.109152
Department(s)
Civil, Architectural and Environmental Engineering
Keywords and Phrases
Attention mechanism; GAN; Semantic segmentation; Semi-supervised learning
International Standard Serial Number (ISSN)
0165-1684
Document Type
Article - Journal
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2023 Elsevier; European Association for Signal Processing, All rights reserved.
Publication Date
01 Nov 2023