Chinese Academy of Sciences Institutional Repositories Grid
Efficient Cross-modal Retrieval Using Social Tag Information Towards Mobile Applications

Document Type: Conference Paper

Authors: Jianfeng He; Qingming Huang; Weigang Zhang; Qiang Qu; Shuhui Wang
Publication Date: 2017
Conference Date: 2017
Conference Venue: Munich, Germany
Abstract: With the prevalence of mobile devices, millions of multimedia items, represented as a combination of visual, aural, and textual modalities, are produced every second. To facilitate better information retrieval on mobile devices, it becomes imperative to develop efficient models that retrieve heterogeneous content modalities from a specific query input, e.g., text-to-image or image-to-text retrieval. Unfortunately, previous works address the problem without considering the hardware constraints of mobile devices. In this paper, we propose a novel method named Trigonal Partial Least Squares (TPLS) for the task of cross-modal retrieval on mobile devices. Specifically, TPLS works under the hardware constraints of mobile devices, i.e., limited memory size and no GPU acceleration. To take advantage of users' tags for model training, we treat the label information provided by the users as a third modality. Then, each pair among the three modalities of texts, images, and labels is used to build a Kernel PLS model. As a result, TPLS is a joint model of three Kernel PLS models, together with a proposed constraint that narrows the distance between the label spaces of images and texts. To learn the model efficiently, we use stochastic parallel gradient descent (SGD) to accelerate learning while reducing memory consumption. To show the effectiveness of TPLS, experiments are conducted on popular cross-modal retrieval benchmark datasets, and competitive results have been obtained.
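The abstract describes TPLS as a joint model of three pairwise Kernel PLS models over the text, image, and label modalities. As a rough illustration of the core building block, below is a minimal sketch of a single two-view Kernel PLS (NIPALS-style iteration over a centered kernel, following the standard Rosipal–Trejo formulation), assuming an RBF kernel; the full TPLS coupling of three such models, the label-space constraint, and the SGD-based training are not reproduced here, and all function names are illustrative.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def center_kernel(K):
    # Center the kernel matrix in feature space: J K J with J = I - 11^T/n
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    return J @ K @ J

def kernel_pls(K, Y, n_components=2, n_iter=200, tol=1e-10):
    """Two-view Kernel PLS via NIPALS-style iteration.
    K: centered n x n kernel of one modality (e.g., images);
    Y: n x m feature matrix of the other modality (e.g., text tags).
    Returns the latent score matrix T (n x n_components)."""
    K = K.copy()
    Y = Y.copy()
    n = K.shape[0]
    T = np.zeros((n, n_components))
    for k in range(n_components):
        u = Y[:, [0]].copy()          # initialize with first Y column
        for _ in range(n_iter):
            t = K @ u                  # latent score of the kernel view
            t /= np.linalg.norm(t)
            c = Y.T @ t                # Y-side weights
            u_new = Y @ c
            u_new /= np.linalg.norm(u_new)
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        # Deflate K and Y so the next component is orthogonal to t
        P = np.eye(n) - t @ t.T
        K = P @ K @ P
        Y = P @ Y
        T[:, [k]] = t
    return T
```

In a cross-modal setting, a model of this kind could be fit for each modality pair (image–text, image–label, text–label), and retrieval performed by nearest-neighbor search in the shared latent space; successive score columns of `T` are mutually orthogonal by construction of the deflation step.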
Language: English
Source URL: http://ir.siat.ac.cn:8080/handle/172644/11930
Collection: Shenzhen Institutes of Advanced Technology_Other
Author Affiliation: 2017
Recommended Citation (GB/T 7714):
Jianfeng He, Qingming Huang, Weigang Zhang, et al. Efficient Cross-modal Retrieval Using Social Tag Information Towards Mobile Applications[C]. In: . Munich, Germany. 2017.

Deposit Method: OAI Harvesting

Source: Shenzhen Institutes of Advanced Technology


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.