Elastic Information Bottleneck
Document Type: Journal Article
作者 Authors | Ni, Yuyan; Lan, Yanyan; Liu, Ao; Ma, Zhiming |
Journal | MATHEMATICS |
Publication Date | 2022-09-01 |
Volume | 10 | Issue | 18 | Pages | 26 |
Keywords | information bottleneck; transfer learning; generalization bound |
DOI | 10.3390/math10183352 |
Abstract | Information bottleneck is an information-theoretic principle of representation learning that aims to learn a maximally compressed representation that preserves as much information about labels as possible. Under this principle, two different methods have been proposed, i.e., information bottleneck (IB) and deterministic information bottleneck (DIB), and they have made significant progress in explaining the representation mechanisms of deep learning algorithms. However, these theoretical and empirical successes are only valid under the assumption that training and test data are drawn from the same distribution, which is clearly not satisfied in many real-world applications. In this paper, we study their generalization abilities within a transfer learning scenario, where the target error can be decomposed into three components, i.e., the source empirical error, the source generalization gap (SG), and the representation discrepancy (RD). Comparing IB and DIB on these terms, we prove that DIB's SG bound is tighter than IB's while DIB's RD is larger than IB's. Therefore, it is difficult to tell which one is better. To balance the trade-off between SG and RD, we propose an elastic information bottleneck (EIB) to interpolate between the IB and DIB regularizers, which guarantees a Pareto frontier within the IB framework. Additionally, simulations and real data experiments show that EIB is able to achieve better domain adaptation results than IB and DIB, which validates the correctness of our theories. |
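The abstract describes EIB as interpolating between the IB and DIB regularizers to trade a tighter source generalization (SG) bound against a smaller representation discrepancy (RD). As a rough illustration only, the sketch below writes the standard IB and DIB Lagrangians and one natural interpolation between them; the interpolation coefficient alpha and the exact form of the EIB objective are assumptions here, not taken from the paper.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% T is a (stochastic) representation of input X with label Y;
% beta > 0 trades compression against preserved label information.
\begin{align*}
\mathcal{L}_{\mathrm{IB}}  &= I(X;T) - \beta\, I(T;Y) \\
\mathcal{L}_{\mathrm{DIB}} &= H(T)   - \beta\, I(T;Y)
\end{align*}
% Since I(X;T) = H(T) - H(T|X), one natural interpolation with
% alpha in [0,1] recovers DIB at alpha = 0 and IB at alpha = 1:
\begin{align*}
\mathcal{L}_{\mathrm{EIB}} &= H(T) - \alpha\, H(T|X) - \beta\, I(T;Y)
\end{align*}
\end{document}
```

Under this reading, pushing alpha toward the DIB end tightens the SG bound at the cost of a larger RD, and pushing it toward the IB end does the reverse, which is exactly the trade-off the abstract attributes to the choice between IB and DIB.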
Funding | National Key R&D Program of China [2021YFF1201600]; Vanke Special Fund for Public Health and Health Discipline Development, Tsinghua University [2022-1080053]; Beijing Academy of Artificial Intelligence (BAAI) |
WOS Research Area | Mathematics |
Language | English |
WOS Accession Number | WOS:000859604800001 |
Publisher | MDPI |
Source URL | http://ir.amss.ac.cn/handle/2S8OKBNM/60938 |
Collection | Institute of Applied Mathematics |
Corresponding Author | Lan, Yanyan |
Author Affiliations | 1. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100049, Peoples R China; 2. Chinese Acad Sci, Acad Math & Syst Sci, Beijing 100190, Peoples R China; 3. Tsinghua Univ, Inst AI Ind Res, Beijing 100084, Peoples R China |
Recommended Citation (GB/T 7714) | Ni, Yuyan, Lan, Yanyan, Liu, Ao, et al. Elastic Information Bottleneck[J]. MATHEMATICS, 2022, 10(18): 26. |
APA | Ni, Yuyan, Lan, Yanyan, Liu, Ao, & Ma, Zhiming. (2022). Elastic Information Bottleneck. MATHEMATICS, 10(18), 26. |
MLA | Ni, Yuyan, et al. "Elastic Information Bottleneck". MATHEMATICS 10.18 (2022): 26. |
Ingestion Method: OAI harvesting
Source: Academy of Mathematics and Systems Science