Chinese Academy of Sciences Institutional Repositories Grid
A Text Localization Method Based on Weak Supervision

Document type: Conference paper

Authors: Jiyuan Zhang1,2; Chen Du1,2; Zipeng Feng1,2; Yanna Wang2; Chunheng Wang2
Publication date: 2020-02-03
Conference dates: 20-25 September 2019
Conference location: Sydney, Australia
Keywords: weak supervision; fully convolutional network; text localization map
DOI: 10.1109/ICDAR.2019.00129
Abstract

Recently, numerous deep learning based scene text detection methods have achieved promising performance on a variety of text detection tasks. Most of these methods are trained in a supervised way, which requires a large amount of annotated data. In this paper, we explore a weakly supervised method to locate text regions in scene images. We propose a fully convolutional network (FCN) architecture that performs binary classification. The training data do not require any text location annotation; we only need to divide the training images into two categories according to whether they contain text or not. We obtain the text localization map (TLM) directly from the last convolutional layer. By applying a fixed threshold, the TLM is converted into a mask map. Then connected component analysis and a text proposal method based on Maximally Stable Extremal Regions (MSERs) are used to obtain the text region bounding boxes. We conduct comprehensive experiments on standard text datasets. The results show that our text localization method achieves recall comparable to other methods while being more stable.
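The post-processing pipeline described in the abstract (threshold the TLM into a mask, then extract component bounding boxes) can be sketched as follows. This is a minimal illustration, not the authors' code: the function name `tlm_to_boxes`, the threshold value, and the use of a plain BFS for connected component analysis are all assumptions; the paper's MSER-based text proposal step is omitted here.

```python
import numpy as np

def tlm_to_boxes(tlm, threshold=0.5):
    """Threshold a text localization map (TLM) into a binary mask,
    then return the bounding box (x0, y0, x1, y1) of each
    4-connected component found by BFS.

    Hypothetical sketch of the post-processing step; the actual
    method also applies MSER-based text proposals afterwards.
    """
    mask = tlm >= threshold          # fixed-threshold mask map
    h, w = mask.shape
    visited = np.zeros_like(mask, dtype=bool)
    boxes = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                # Flood-fill this connected component.
                stack = [(y, x)]
                visited[y, x] = True
                ys, xs = [y], [x]
                while stack:
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            ys.append(ny)
                            xs.append(nx)
                            stack.append((ny, nx))
                boxes.append((min(xs), min(ys), max(xs), max(ys)))
    return boxes
```

For example, a TLM with two separate high-response blobs yields two boxes, one per text region candidate.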

Proceedings publisher: IEEE
Language: English
Source URL: http://ir.ia.ac.cn/handle/173211/39227
Collection: Institute of Automation, State Key Laboratory of Management and Control for Complex Systems, Image Analysis and Machine Vision Team
Corresponding author: Chunheng Wang
Author affiliations:
1. University of Chinese Academy of Sciences
2. Institute of Automation, Chinese Academy of Sciences
Recommended citation (GB/T 7714):
Jiyuan Zhang, Chen Du, Zipeng Feng, et al. A Text Localization Method Based on Weak Supervision[C]. Sydney, Australia, 20-25 September 2019.

Deposit method: OAI harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.