Chinese Academy of Sciences Institutional Repositories Grid
Item Response Theory Based Ensemble in Machine Learning

Document Type: Journal Article

Authors: Ziheng Chen; Hongshik Ahn
Journal: International Journal of Automation and Computing
Publication Date: 2020
Volume: 17, Issue: 5, Pages: 621-636
Keywords: classification; ensemble learning; item response theory; machine learning; expectation maximization (EM) algorithm
ISSN: 1476-8186
DOI: 10.1007/s11633-020-1239-y
Abstract: In this article, we propose a novel probabilistic framework to improve the accuracy of a weighted majority voting algorithm. In order to assign higher weights to the classifiers that can correctly classify hard-to-classify instances, we introduce the item response theory (IRT) framework to evaluate the samples' difficulty and the classifiers' ability simultaneously. We assign weights to classifiers based on their abilities. Three models are created, with different assumptions suitable for different cases. When making an inference, we balance accuracy against complexity. In our experiment, all base models are single trees constructed via bootstrap. To explain the models, we illustrate how the IRT ensemble model constructs the classification boundary. We also compare its performance with other widely used methods and show that our model performs well on 19 datasets.
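The core idea in the abstract can be sketched in a few lines of numpy. This is not the authors' implementation: it assumes the simplest IRT variant (a Rasch/1PL model, P(correct) = sigmoid(ability − difficulty)) fitted by plain gradient ascent rather than the paper's EM algorithm, and the function names `fit_rasch` and `irt_weighted_vote` and the `exp(ability)` weighting scheme are illustrative choices, not from the paper.

```python
import numpy as np

def fit_rasch(R, n_iter=500, lr=0.2):
    """Fit a Rasch (1PL) model P(correct) = sigmoid(theta_j - b_i) to a
    binary response matrix R (classifiers x instances, 1 = correct),
    by plain gradient ascent on the Bernoulli log-likelihood."""
    J, N = R.shape
    theta = np.zeros(J)              # classifier abilities
    b = np.zeros(N)                  # instance difficulties
    for _ in range(n_iter):
        P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        resid = R - P                # per-cell gradient of the log-likelihood
        theta += lr * resid.mean(axis=1)
        b -= lr * resid.mean(axis=0)
        b -= b.mean()                # pin the scale's location (identifiability)
    return theta, b

def irt_weighted_vote(preds, theta):
    """Weighted majority vote over {0,1} base-classifier predictions
    (rows of preds); each classifier's vote is weighted by exp(ability)."""
    w = np.exp(theta)
    score = w @ (2 * preds - 1)      # signed, weighted vote per instance
    return (score > 0).astype(int)
```

With this weighting, one strong classifier can overrule several weak ones: given abilities (-1.5, 0, 1.5) and votes (0, 0, 1) on an instance, the weighted vote returns 1, whereas an unweighted majority vote would return 0.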
Source URL: http://ir.ia.ac.cn/handle/173211/42263
Collection: Institute of Automation, Academic Journals, International Journal of Automation and Computing
Affiliation: Department of Applied Mathematics and Statistics, Stony Brook University, New York 11794-3600, USA
Recommended Citation
GB/T 7714
Ziheng Chen, Hongshik Ahn. Item Response Theory Based Ensemble in Machine Learning[J]. International Journal of Automation and Computing, 2020, 17(5): 621-636.
APA: Ziheng Chen, & Hongshik Ahn. (2020). Item Response Theory Based Ensemble in Machine Learning. International Journal of Automation and Computing, 17(5), 621-636.
MLA: Ziheng Chen, et al. "Item Response Theory Based Ensemble in Machine Learning". International Journal of Automation and Computing 17.5 (2020): 621-636.

Deposit Method: OAI harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.