Chinese Academy of Sciences Institutional Repositories Grid
Learning transferable cross-modality representations for few-shot hyperspectral and LiDAR collaborative classification

Document type: Journal article

Authors: Dai, Mofan; Xing, Shuai; Xu, Qing; Wang, Hanyun; Li, Pengcheng; Sun, Yifan; Pan, Jiechen; Li, Yuqiong (李玉琼)
Journal: INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION
Publication date: 2024-02-01
Volume: 126; Pages: 11
ISSN: 1569-8432
Keywords: Multimodal remote sensing data; Meta-learning; Few-shot learning; Cross-modality feature learning
DOI: 10.1016/j.jag.2023.103640
Corresponding authors: Xing, Shuai (xing972403@163.com); Li, Yuqiong (liyuqiong@imech.ac.cn)
Abstract: Hyperspectral image (HSI) classification, which incorporates both spatial and spectral information, is a crucial topic in earth observation and land cover analysis. However, ground objects with similar spectral attributes remain a challenge for finer classification. Recently, deep learning-based multimodality fusion has provided promising solutions by exploiting the geometric information in LiDAR data to complement spectral attributes. However, labor-intensive and time-consuming multimodal data annotation limits the performance of supervised deep learning methods, and addressing the semantic disparity between LiDAR data and HSIs while learning transferable representations for cross-scene classification remains challenging. In this paper, we propose a multimodal fusion relational network with meta-learning (MFRN-ML) to address these challenges. Specifically, MFRN-ML incorporates multimodal learning and few-shot learning (FSL) into a three-stage task-based learning framework to learn transferable cross-modality representations for few-shot HSI and LiDAR collaborative classification. First, a multimodal fusion relational network, composed of a cross-modality feature fusion module and a relation learning module, addresses the challenge of limited annotations in multimodal learning in a data-adaptive way. Then, the three-stage task-based learning framework trains the network to learn transferable representations from few labeled samples for cross-scene classification. We perform experiments on four multimodal datasets collected by different sensors. Compared with existing supervised, semi-supervised, and meta-learning methods, MFRN-ML attains state-of-the-art performance in few-shot tasks. In particular, our method shows promising generalization to unseen categories across different domains.
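As a concrete illustration of the kind of pipeline the abstract describes, below is a minimal PyTorch sketch of a relation-network-style few-shot classifier over fused HSI and LiDAR patches. This is a sketch under stated assumptions, not the authors' MFRN-ML implementation: the band counts, patch size, concatenation-plus-1x1-conv fusion, and the names FusionEncoder and RelationModule are invented here for self-containment.

```python
# Illustrative sketch only: a relation-network-style few-shot classifier over
# fused HSI + LiDAR patches. NOT the paper's MFRN-ML; all sizes and module
# names below are assumptions chosen to keep the example self-contained.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """3x3 conv -> BatchNorm -> ReLU, a common few-shot embedding block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class FusionEncoder(nn.Module):
    """Encodes an HSI patch and a LiDAR patch separately, then fuses them by
    concatenation followed by a 1x1 conv (a simple stand-in for a
    cross-modality feature fusion module)."""

    def __init__(self, hsi_bands=144, lidar_bands=1, dim=64):
        super().__init__()
        self.hsi_net = nn.Sequential(conv_block(hsi_bands, dim), conv_block(dim, dim))
        self.lidar_net = nn.Sequential(conv_block(lidar_bands, dim), conv_block(dim, dim))
        self.fuse = nn.Conv2d(2 * dim, dim, 1)

    def forward(self, hsi, lidar):
        f = torch.cat([self.hsi_net(hsi), self.lidar_net(lidar)], dim=1)
        return self.fuse(f)  # (B, dim, H, W)


class RelationModule(nn.Module):
    """Scores a (support, query) feature pair with a learned similarity,
    in the spirit of relation networks, instead of a fixed metric."""

    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(conv_block(2 * dim, dim), nn.AdaptiveAvgPool2d(1))
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(dim, 1), nn.Sigmoid())

    def forward(self, support_f, query_f):
        return self.head(self.net(torch.cat([support_f, query_f], dim=1)))


# One episodic forward pass: N-way 1-shot with a single query, on random data.
if __name__ == "__main__":
    n_way, patch = 3, 9
    enc, rel = FusionEncoder(), RelationModule()
    sup_hsi = torch.randn(n_way, 144, patch, patch)  # one support patch per class
    sup_lid = torch.randn(n_way, 1, patch, patch)
    qry_hsi = torch.randn(1, 144, patch, patch)
    qry_lid = torch.randn(1, 1, patch, patch)
    sup_f = enc(sup_hsi, sup_lid)                      # (N, dim, H, W)
    qry_f = enc(qry_hsi, qry_lid).expand(n_way, -1, -1, -1)
    scores = rel(sup_f, qry_f).squeeze(-1)             # one relation score per class
    print("predicted class:", scores.argmax().item())
```

In episodic training, each episode would sample a small support/query split per class and optimize the relation scores against one-hot targets, which is what lets such a model transfer to unseen categories with few labeled samples.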
Classification: Class 1
WOS keywords: LAND-COVER CLASSIFICATION; IMAGE CLASSIFICATION; NETWORK
Funding projects: National Natural Science Foundation of China [42271457]; National Natural Science Foundation of China [41876105]; Henan Province [202300410535]; Joint Fund of Collaborative Innovation Center of Geo-Information Technology for Smart Central Plains, Henan Province; Key Laboratory of Spatiotemporal Perception and Intelligent Processing, Ministry of Natural Resources [212108]
WOS research area: Remote Sensing
Language: English
WOS record number: WOS:001152464700001
Funding agencies: National Natural Science Foundation of China; Henan Province; Joint Fund of Collaborative Innovation Center of Geo-Information Technology for Smart Central Plains, Henan Province; Key Laboratory of Spatiotemporal Perception and Intelligent Processing, Ministry of Natural Resources
Other contributors: Xing, Shuai; Li, Yuqiong
Source URL: http://dspace.imech.ac.cn/handle/311007/94228
Collection: Institute of Mechanics_Key Laboratory for Mechanics in Fluid-Solid Coupling Systems (2012-)
Recommended citation formats:
GB/T 7714: Dai, Mofan, Xing, Shuai, Xu, Qing, et al. Learning transferable cross-modality representations for few-shot hyperspectral and LiDAR collaborative classification[J]. INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION, 2024, 126: 11.
APA: Dai, Mofan, Xing, Shuai, Xu, Qing, Wang, Hanyun, Li, Pengcheng, ... & Li, Yuqiong. (2024). Learning transferable cross-modality representations for few-shot hyperspectral and LiDAR collaborative classification. INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION, 126, 11.
MLA: Dai, Mofan, et al. "Learning transferable cross-modality representations for few-shot hyperspectral and LiDAR collaborative classification." INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION 126 (2024): 11.

Deposit method: OAI harvesting

Source: Institute of Mechanics

