Chinese Academy of Sciences Institutional Repositories Grid
ListOPT: Learning to Optimize for XML Ranking

Document Type: Conference Paper

Authors: Gao Ning; Deng Zhi-Hong; Yu Hang; Jiang Jia-Jian
Publication Date: 2011
Conference: 15th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2011
Conference Date: 24 May 2011
Conference Location: Shenzhen, China
Keywords: Adaptive boosting; Data mining; Information retrieval; Neural networks; XML
Pages: 482-492
Abstract: Many machine learning classification techniques, such as boosting, support vector machines, and neural networks, have been applied to the ranking problem in information retrieval. However, because these learning-to-rank methods aim to produce sorted results directly from document features, they cannot incorporate existing ranking methods that have proven effective, such as BM25 and PageRank. To address this limitation, we study learning-to-optimize, which constructs a learning model or method for optimizing the free parameters of ranking functions. This paper proposes ListOPT, a listwise learning-to-optimize process, and introduces three alternative differentiable query-level loss functions. Experimental results on the English Wikipedia XML dataset show that these approaches can be successfully applied to tuning the parameters of the widely cited ranking function BM25. Furthermore, the formulas with optimized parameters indeed improve retrieval effectiveness compared with the original ones. © 2011 Springer-Verlag.
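The record above is metadata only; as an illustration of the learning-to-optimize idea described in the abstract, the following is a minimal Python sketch, not the authors' implementation. It tunes BM25's free parameters k1 and b by gradient descent on a ListNet-style listwise, query-level loss over synthetic data. The choice of loss, the finite-difference gradients, and all function names and data are assumptions made for illustration.

# Minimal sketch (not the paper's code): tune BM25's free parameters k1 and b
# by gradient descent on a listwise, query-level loss. A ListNet-style top-one
# cross entropy stands in for the paper's loss functions; data is synthetic.
import numpy as np

def bm25_score(tf, dl, avgdl, idf, k1, b):
    """BM25 term score for one document: term frequency tf, document length dl,
    average document length avgdl, term idf, and free parameters k1, b."""
    denom = tf + k1 * (1.0 - b + b * dl / avgdl)
    return idf * tf * (k1 + 1.0) / denom

def listwise_loss(scores, labels):
    """Cross entropy between top-one probability distributions induced by
    model scores and relevance labels (ListNet-style)."""
    p_model = np.exp(scores - scores.max()); p_model /= p_model.sum()
    p_label = np.exp(labels - labels.max()); p_label /= p_label.sum()
    return -np.sum(p_label * np.log(p_model + 1e-12))

def query_loss(params, query):
    """Query-level loss of BM25 with the current (k1, b)."""
    k1, b = params
    scores = np.array([bm25_score(d["tf"], d["dl"], query["avgdl"],
                                  d["idf"], k1, b) for d in query["docs"]])
    labels = np.array([d["rel"] for d in query["docs"]], dtype=float)
    return listwise_loss(scores, labels)

def numeric_grad(params, queries, eps=1e-4):
    """Finite-difference gradient of the summed query-level loss."""
    grad = np.zeros_like(params)
    for i in range(len(params)):
        up, down = params.copy(), params.copy()
        up[i] += eps; down[i] -= eps
        grad[i] = sum(query_loss(up, q) - query_loss(down, q)
                      for q in queries) / (2 * eps)
    return grad

# Synthetic toy data: two queries, each with eight candidate documents.
rng = np.random.default_rng(0)
queries = [{"avgdl": 120.0,
            "docs": [{"tf": rng.integers(1, 10), "dl": rng.uniform(50, 300),
                      "idf": rng.uniform(1, 5), "rel": rng.integers(0, 3)}
                     for _ in range(8)]}
           for _ in range(2)]

params = np.array([1.2, 0.75])  # common BM25 defaults as the starting point
for step in range(200):
    params -= 0.05 * numeric_grad(params, queries)
    params[0] = max(params[0], 1e-3)          # keep k1 positive
    params[1] = np.clip(params[1], 0.0, 1.0)  # keep b in [0, 1]
print("tuned k1=%.3f, b=%.3f" % (params[0], params[1]))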
Indexed By: EI
Proceedings: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Proceedings Published In: Germany
Language: English
ISSN: 0302-9743
ISBN: 978-3-642-20846-1
Source URL: http://124.16.136.157/handle/311060/14269
Collection: Institute of Software_Institute of Software Library_Conference Papers
Recommended Citation (GB/T 7714):
Gao Ning, Deng Zhi-Hong, Yu Hang, et al. ListOPT: Learning to Optimize for XML Ranking[C]. In: 15th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2011. Shenzhen, China, 24 May 2011.

Deposit Method: OAI harvesting

Source: Institute of Software

