Chinese Academy of Sciences Institutional Repositories Grid
Gesture Design of Hand-to-Speech Converter derived from Speech-to-Hand Converter based on Probabilistic Integration Model

Document Type: Conference Paper

Authors: Aki Kunikoshi; Yu Qiao; Daisuke Saito; Nobuaki Minematsu; Keikichi Hirose
Publication Date: 2011
Conference: 12th Annual Conference of the International Speech Communication Association (INTERSPEECH 2011)
Conference Location: Florence, Italy
Abstract: When dysarthrics, individuals with speaking disabilities, try to communicate using speech, they often have no choice but to use speech synthesizers that require them to type word symbols or sound symbols. Input by this method often makes real-time communication troublesome, and dysarthric users struggle to have smooth, flowing conversations. In this study, we are developing a novel speech synthesizer in which speech is generated through hand motions rather than symbol input. In recent years, statistical voice conversion techniques have been proposed based on space mapping between given parallel utterances. By applying these methods, a hand space was mapped to a vowel space, and a converter from hand motions to vowel transitions was developed. The proposed method was reported to be effective enough to generate the five Japanese vowels. In this paper, we discuss the extension of this system to consonant generation. To create gestures for consonants, a Speech-to-Hand conversion system is first developed using parallel data for vowels, in which consonants are not included. We are then able to search automatically for candidate consonant gestures for a Hand-to-Speech system.
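As a rough sketch of the mapping technique the abstract refers to (not the authors' actual implementation; their probabilistic integration model is detailed in the paper), the Python below fits a joint Gaussian mixture on time-aligned hand-feature/vowel-feature pairs and converts new hand features to speech features via the standard GMM conditional-expectation mapping used in statistical voice conversion. All function names, feature dimensions, and the component count are illustrative assumptions.

import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.stats import multivariate_normal

def fit_joint_gmm(hand_feats, speech_feats, n_components=8):
    # hand_feats: (T, dx) hand-shape features, time-aligned with
    # speech_feats: (T, dy) vowel spectral features (parallel data).
    joint = np.hstack([hand_feats, speech_feats])
    return GaussianMixture(n_components=n_components,
                           covariance_type="full").fit(joint)

def hand_to_speech(gmm, x, dx):
    # MMSE mapping E[y | x] under the joint GMM (hand features -> speech).
    mu_x, mu_y = gmm.means_[:, :dx], gmm.means_[:, dx:]
    Sxx = gmm.covariances_[:, :dx, :dx]
    Syx = gmm.covariances_[:, dx:, :dx]
    # Posterior over mixture components given the hand features alone.
    log_p = np.array([multivariate_normal.logpdf(x, mu_x[k], Sxx[k])
                      for k in range(gmm.n_components)]) + np.log(gmm.weights_)
    post = np.exp(log_p - log_p.max())
    post /= post.sum()
    # Blend per-component conditional means by the posterior weights.
    y = np.zeros(mu_y.shape[1])
    for k in range(gmm.n_components):
        y += post[k] * (mu_y[k] + Syx[k] @ np.linalg.solve(Sxx[k], x - mu_x[k]))
    return y

Read in the reverse direction (E[x | y], with the roles of the two blocks swapped), the same kind of joint model yields the Speech-to-Hand direction that the abstract uses to propose consonant-gesture candidates.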
Indexed By: EI
Language: English
Source URL: http://ir.siat.ac.cn:8080/handle/172644/3264
Collection: 深圳先进技术研究院_集成所
Recommended Citation (GB/T 7714):
Aki Kunikoshi, Yu Qiao, Daisuke Saito, et al. Gesture Design of Hand-to-Speech Converter derived from Speech-to-Hand Converter based on Probabilistic Integration Model[C]. In: 12th Annual Conference of the International Speech Communication Association (INTERSPEECH 2011). Florence, Italy.

Deposit Method: OAI harvesting

Source: 深圳先进技术研究院 (Shenzhen Institute of Advanced Technology)
