Application of deep learning in ecological resource research: Theories, methods, and challenges
Document Type: Journal Article
Author | Guo, Qinghua; Jin, Shichao; Li, Min; et al. |
Journal | SCIENCE CHINA-EARTH SCIENCES |
Publication Date | 2020 |
Volume | 63 |
Issue | 10 |
Pages | 1457-1474 |
Keywords | Ecological resources; Deep learning; Neural network; Big data; Theory and tools; Application and challenge |
ISSN | 1674-7313 |
DOI | 10.1007/s11430-019-9584-9 |
Document Subtype | Review |
Abstract | Ecological resources are an important material foundation for the survival, development, and self-realization of human beings. In-depth and comprehensive research on and understanding of ecological resources are beneficial for the sustainable development of human society. Advances in observation technology have improved the ability to acquire long-term, cross-scale, massive, heterogeneous, and multi-source data, and ecological resource research is entering a new era driven by big data. Traditional statistical learning and machine learning algorithms suffer from performance saturation when dealing with big data. Deep learning is a method for automatically extracting complex, high-dimensional, nonlinear features; it is increasingly used for scientific and industrial data processing because of its ability to avoid such saturation on big data. To promote the application of deep learning in ecological resource research, we first introduce the relationship between deep learning theory and ecological resource research, along with common tools and datasets. Second, applications of deep learning in classification and recognition, detection and localization, semantic segmentation, instance segmentation, and graph neural networks for typical spatially discrete data are presented through three cases: species classification, crop breeding, and vegetation mapping. Finally, challenges and opportunities for the application of deep learning in ecological resource research in the era of big data are summarized by considering the characteristics of ecological resource data and the development status of deep learning. It is anticipated that cross-disciplinary cooperation and the training of cross-disciplinary talents will promote the standardization and sharing of ecological resource data, improve the universality and interpretability of algorithms, and enrich applications as hardware develops. |
Subject Category | Geosciences, Multidisciplinary |
Place of Publication | BEIJING |
eISSN | 1869-1897 |
WOS Keywords | CONVOLUTIONAL NEURAL-NETWORK ; SEMANTIC SEGMENTATION ; CLOUD DETECTION ; POINT CLOUDS ; CLASSIFICATION ; IMAGERY ; LIDAR ; IDENTIFICATION ; RECOGNITION ; ALGORITHM |
WOS Research Area | Geology |
Language | English |
WOS Accession Number | WOS:000524962500002 |
Publisher | SCIENCE PRESS |
Funding | Strategic Priority Research Program of the Chinese Academy of Sciences [XDA19050401] ; National Natural Science Foundation of China (NSFC) [31971575, 41871332] |
Source URL | http://ir.ibcas.ac.cn/handle/2S10CLM1/21822 |
Collection | State Key Laboratory of Vegetation and Environmental Change |
Author Affiliations | 1. Chinese Acad Sci, Inst Bot, State Key Lab Systemat & Evolutionary Bot, Beijing 100093, Peoples R China; 2. Chengdu Univ Technol, State Key Lab Geohazard Prevent & Geoenvironm, Chengdu 610059, Peoples R China; 3. Univ Chinese Acad Sci, Beijing 100049, Peoples R China; 4. Chinese Acad Sci, Inst Bot, State Key Lab Vegetat & Environm Change, Beijing 100093, Peoples R China; 5. Peking Univ, Inst Remote Sensing & Geog Informat Syst, Beijing 100871, Peoples R China |
Recommended Citation (GB/T 7714) | Guo, Qinghua, Jin, Shichao, Li, Min, et al. Application of deep learning in ecological resource research: Theories, methods, and challenges[J]. SCIENCE CHINA-EARTH SCIENCES, 2020, 63(10): 1457-1474. |
APA | Guo, Qinghua, Jin, Shichao, Li, Min, Yang, Qiuli, Xu, Kexin, ... & Liu, Yu. (2020). Application of deep learning in ecological resource research: Theories, methods, and challenges. SCIENCE CHINA-EARTH SCIENCES, 63(10), 1457-1474. |
MLA | Guo, Qinghua, et al. "Application of deep learning in ecological resource research: Theories, methods, and challenges." SCIENCE CHINA-EARTH SCIENCES 63.10 (2020): 1457-1474. |
Ingest Method: OAI harvesting
Source: Institute of Botany