Chinese Academy of Sciences Institutional Repositories Grid (CAS IR Grid)
Institutions:
Institute of Automation [29]
Institute of Computing Technology [5]
Hefei Institutes of Physical Science [2]
Institute of Software [2]
Institute of Physics [1]
Academy of Mathematics and Systems Science [1]
Collection method:
OAI harvesting [43]
Content type:
Journal articles [23]
Theses [17]
Conference papers [3]
Publication year:
2024 [1]
2023 [3]
2022 [1]
2020 [3]
2019 [6]
2017 [1]
Subject:
Other appl... [1]
Browse/search results: 43 items in total, showing 1-10
Dual-View Curricular Optimal Transport for Cross-Lingual Cross-Modal Retrieval
Journal article | OAI harvest
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, Vol. 33, pp. 1522-1533
Authors: Wang, Yabing; Wang, Shuhui; Luo, Hao; Dong, Jianfeng; Wang, Fan
Views/Downloads: 0/0 | Submitted: 2024/05/20
Keywords: Visualization; Noise measurement; Estimation; Costs; Transportation; Training; Task analysis; Cross-modal retrieval; noise correspondence learning; cross-lingual transfer; optimal transport; machine translation
Contrastive Adversarial Training for Multi-Modal Machine Translation
Journal article | OAI harvest
ACM Transactions on Asian and Low-Resource Language Information Processing, 2023, Vol. 22, No. 6, pp. 157:1-18
Authors: Huang X (黄鑫); Zhang JJ (张家俊); Zong CQ (宗成庆)
Views/Downloads: 11/0 | Submitted: 2023/06/26
Keywords: contrastive learning; adversarial training; multi-modal machine translation
Transformer: A General Framework from Machine Translation to Others
Journal article | OAI harvest
Machine Intelligence Research, 2023, Vol. 20, No. 4, pp. 514-538
Authors: Yang Zhao
Views/Downloads: 10/0 | Submitted: 2023/08/02
Keywords: Neural machine translation; Transformer; document neural machine translation (NMT); multimodal NMT; low-resource NMT
Towards Unified Multi-Domain Machine Translation With Mixture of Domain Experts
Journal article | OAI harvest
IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, Vol. 31, pp. 3488-3498
Authors: Lu, Jinliang; Zhang, Jiajun
Views/Downloads: 3/0 | Submitted: 2023/12/21
Keywords: Training; Adaptation models; Transformers; Task analysis; Speech processing; Machine translation; Switches; Multi-domain; Mixture-of-expert
Enhancing Lexical Translation Consistency for Document-Level Neural Machine Translation
Journal article | OAI harvest
ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2022, Vol. 21, No. 3, p. 21
Authors: Kang, Xiaomian; Zhao, Yang; Zhang, Jiajun; Zong, Chengqing
Views/Downloads: 24/0 | Submitted: 2022/06/10
Keywords: Document-level translation; neural machine translation; lexical consistency; discourse phenomena
Dynamic Context Selection for Document-level Neural Machine Translation via Reinforcement Learning
Conference paper | OAI harvest
Online, November 16-20, 2020
Authors: Kang, Xiaomian; Zhao, Yang; Zhang, Jiajun; Zong, Chengqing
Views/Downloads: 23/0 | Submitted: 2021/05/26
Keywords: Document-level NMT; Neural Machine Translation; Reinforcement Learning; Context Selection
Mongolian-Chinese Neural Machine Translation Based on Monolingual Corpora and Word Embedding Alignment (基于单语语料和词向量对齐的蒙汉神经机器翻译研究)
Journal article | OAI harvest
Journal of Chinese Information Processing (中文信息学报), 2020, Vol. 34
Authors: 曹宜超; 高翊; 李淼; 冯韬; 王儒敬
Views/Downloads: 20/0 | Submitted: 2020/10/26
Keywords: Mongolian-Chinese neural machine translation; monolingual corpora; word embedding alignment
Synchronous Bidirectional Inference for Neural Sequence Generation
Journal article | OAI harvest
Artificial Intelligence, 2020, Vol. 281, Article 103234, pp. 1-19
Authors: Zhang, Jiajun; Zhou, Long; Zhao, Yang; Zong, Chengqing
Views/Downloads: 15/0 | Submitted: 2020/06/23
Keywords: Sequence to sequence learning; Bidirectional inference; Beam search; Machine translation; Summarization
Machine Translation Evaluation Metric Based on Dependency Parsing Model
Journal article | OAI harvest
ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2019, Vol. 18, No. 4, p. 15
Authors: Yu, Hui; Xu, Weizhi; Lin, Shouxun; Liu, Qun
Views/Downloads: 16/0 | Submitted: 2020/12/10
Keywords: Automatic evaluation metric; dependency parsing model; machine translation
Effectively training neural machine translation models with monolingual data
Journal article | OAI harvest
NEUROCOMPUTING, 2019, Vol. 333, pp. 240-247
Authors: Yang, Zhen; Chen, Wei; Wang, Feng; Xu, Bo
Views/Downloads: 28/0 | Submitted: 2019/07/12
Keywords: Neural machine translation; Monolingual data; Gate-enhanced; Source-side and target-side; Effectively