Investigating the geometric structure of neural activation spaces with convex hull approximations
Document Type: Journal Article
Authors | Jia, Yuting (1); Zhang, Shao (1); Wang, Haiwen (1); Wen, Ying (1); Fu, Luoyi (1); Long, Huan (1); Wang, Xinbing (1); Zhou, Chenghu (2) |
Journal | NEUROCOMPUTING
Publication Date | 2022-08-14 |
Volume | 499 |
Pages | 93-105 |
Keywords | Neural networks; Activation space understanding; Convex hull |
ISSN | 0925-2312
DOI | 10.1016/j.neucom.2022.05.019 |
Corresponding Author | Wen, Ying (ying.wen@sjtu.edu.cn) |
Abstract | Neural networks have achieved great success in many tasks, including data classification and pattern recognition. However, how neural networks work and what representations they learn are still not fully understood. For any data sample fed into a neural network, we wondered how its corresponding vectors expanded by activated neurons change throughout the layers and why the final output vector could be classified or clustered. To formally answer these questions, we define the data sample outputs of each layer as activation vectors and the space expanded by them as the activation space. Then, we investigate the geometric structure of the high-dimensional activation spaces of neural networks by studying the geometric characteristics of the massive activation vectors through approximated convex hulls. We find that the different layers of neural networks have different roles, where the former and latter layers can disperse and gather data points, respectively. Moreover, we also propose a novel classification method based on the geometric structures of activation spaces, called nearest convex hull (NCH) classification, for the activation vectors in each layer of a neural network. The empirical results show that the geometric structure can indeed be utilized for classification and often outperforms original neural networks. Finally, we demonstrate that the relationship among the convex hulls of different classes could be a good metric to help us optimize neural networks in terms of over-fitting detection and network structure simplification. © 2022 Elsevier B.V. All rights reserved. |
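The nearest convex hull (NCH) idea from the abstract can be illustrated with a short sketch. This is not the authors' implementation: the paper works with approximated convex hulls of per-layer activation vectors, while the toy version below approximates the distance from a query activation vector to each class hull with a Frank-Wolfe projection onto the simplex of hull weights and assigns the label of the nearest hull. The helper names (`hull_distance`, `nch_classify`) and the synthetic activation data are hypothetical.

```python
import numpy as np

def hull_distance(x, class_points, n_iters=200):
    """Approximate the Euclidean distance from x to the convex hull of
    class_points (one activation vector per row) via Frank-Wolfe."""
    A = np.asarray(class_points, dtype=float)   # (n_samples, dim)
    x = np.asarray(x, dtype=float)              # (dim,)
    n = A.shape[0]
    w = np.full(n, 1.0 / n)                     # start at the barycenter of the hull
    for t in range(n_iters):
        residual = A.T @ w - x                  # current hull point minus query
        grad = 2.0 * (A @ residual)             # gradient of ||A^T w - x||^2 w.r.t. w
        s = np.argmin(grad)                     # vertex picked by the linear oracle
        gamma = 2.0 / (t + 2.0)                 # standard Frank-Wolfe step size
        w *= (1.0 - gamma)
        w[s] += gamma
    return np.linalg.norm(A.T @ w - x)

def nch_classify(x, hulls):
    """Assign x to the class whose (approximate) convex hull is nearest."""
    labels = list(hulls)
    dists = [hull_distance(x, hulls[c]) for c in labels]
    return labels[int(np.argmin(dists))]

# Toy usage: synthetic "activations" of a hypothetical hidden layer, grouped by class.
rng = np.random.default_rng(0)
hulls = {0: rng.normal(0.0, 1.0, size=(50, 16)),
         1: rng.normal(3.0, 1.0, size=(50, 16))}
query = rng.normal(3.0, 1.0, size=16)
print(nch_classify(query, hulls))               # should print 1 for this toy data
```

Frank-Wolfe keeps the combination weights on the simplex without ever constructing the hull explicitly, which matters because exact convex hulls are intractable in high-dimensional activation spaces; the paper's own approximation scheme may differ.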
WOS Keywords | ALGORITHM; NETWORKS |
Funding Projects | National Natural Science Foundation of China[42050105]; National Natural Science Foundation of China[62020106005]; National Natural Science Foundation of China[62061146002]; National Natural Science Foundation of China[61960206002]; National Natural Science Foundation of China[61822206]; National Natural Science Foundation of China[61832013]; National Natural Science Foundation of China[61829201] |
WOS Research Area | Computer Science |
Language | English |
WOS Accession Number | WOS:000802967100009 |
Publisher | ELSEVIER |
Funding Organization | National Natural Science Foundation of China |
Source URL | http://ir.igsnrr.ac.cn/handle/311030/179001 |
Collection | Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences
Author Affiliations | 1. Shanghai Jiao Tong Univ, Shanghai, Peoples R China; 2. Chinese Acad Sci, Inst Geog Sci & Nat Resources Res, Beijing, Peoples R China
Recommended Citation (GB/T 7714) | Jia, Yuting, Zhang, Shao, Wang, Haiwen, et al. Investigating the geometric structure of neural activation spaces with convex hull approximations[J]. NEUROCOMPUTING, 2022, 499: 93-105.
APA | Jia, Yuting, Zhang, Shao, Wang, Haiwen, Wen, Ying, Fu, Luoyi, ... & Zhou, Chenghu. (2022). Investigating the geometric structure of neural activation spaces with convex hull approximations. NEUROCOMPUTING, 499, 93-105.
MLA | Jia, Yuting, et al. "Investigating the geometric structure of neural activation spaces with convex hull approximations." NEUROCOMPUTING 499 (2022): 93-105.
Deposit Method: OAI Harvesting
Source: Institute of Geographic Sciences and Natural Resources Research