Chinese Academy of Sciences Institutional Repositories Grid
Adapter Tuning With Task-Aware Attention Mechanism

Document Type: Conference Paper

Authors: Lu JL (陆金梁) 1,2; Zhang JJ (张家俊) 1,2
Publication Date: 2023-06
Conference Date: 3-10 June, 2023
Conference Venue: Rhodes Island, Greece
Abstract (English)

Adapter tuning inserts simple feed-forward layers (adapters) into pre-trained language models (PLMs) and tunes only the adapters when transferring to downstream tasks, and has become the state-of-the-art parameter-efficient tuning (PET) strategy. Although the adapters aim to learn task-related representations, their inputs are still produced by the task-independent, frozen multi-head attention (MHA) modules, leading to insufficient use of contextual information across downstream tasks. Intuitively, MHA should be task-dependent and able to attend to different contexts for different downstream tasks. This paper therefore proposes the task-aware attention mechanism (TAM) to enhance adapter tuning. Specifically, we first use the task-dependent adapter to generate a token-wise task embedding. We then apply the task embedding to influence MHA so that it aggregates contextual information in a task-dependent manner. Experimental results on a wide range of natural language understanding and generation tasks demonstrate the effectiveness of our method. Furthermore, extensive analyses show that the generated task embeddings correspond with the difficulty of the tasks.
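The record does not include code. As a minimal sketch of the mechanism described in the abstract, the PyTorch snippet below shows a bottleneck adapter and one plausible way a token-wise task embedding could modulate a frozen multi-head attention module. All names and design choices here (Bottleneck, TaskAwareAttention, bottleneck_dim, adding the task embedding to queries and keys) are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the paper's code): a bottleneck adapter plus one plausible
# way a token-wise task embedding could influence frozen multi-head attention.
import torch
import torch.nn as nn


class Bottleneck(nn.Module):
    """Adapter-style feed-forward bottleneck: down-project, nonlinearity, up-project."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.up(self.act(self.down(x)))


class TaskAwareAttention(nn.Module):
    """Hypothetical layer: the MHA weights stay frozen (as in adapter tuning),
    while a small trainable adapter produces a token-wise task embedding that
    is added to the queries and keys, making context aggregation task-dependent."""

    def __init__(self, hidden_dim: int, num_heads: int, bottleneck_dim: int = 64):
        super().__init__()
        self.mha = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        for p in self.mha.parameters():  # frozen pre-trained attention
            p.requires_grad = False
        self.task_adapter = Bottleneck(hidden_dim, bottleneck_dim)  # trainable

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        task_emb = self.task_adapter(x)   # token-wise task embedding
        q = x + task_emb                  # task embedding biases the queries...
        k = x + task_emb                  # ...and keys of the frozen MHA
        out, _ = self.mha(q, k, x, need_weights=False)
        return x + out                    # residual connection


if __name__ == "__main__":
    layer = TaskAwareAttention(hidden_dim=768, num_heads=12)
    hidden = torch.randn(2, 16, 768)      # (batch, tokens, hidden)
    print(layer(hidden).shape)            # torch.Size([2, 16, 768])
```

How the task embedding actually enters MHA in TAM may differ (for example, by modulating attention scores or values); the sketch only illustrates the general idea of combining adapter tuning with task-dependent attention.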

Proceedings Publisher: IEEE
Source URL: http://ir.ia.ac.cn/handle/173211/57387
Collection: Zidong Taichu Large Model Research Center
Corresponding Author: Zhang JJ (张家俊)
Author Affiliations:
1. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
2. Institute of Automation, Chinese Academy of Sciences, Beijing, China
Recommended Citation (GB/T 7714):
Lu JL, Zhang JJ. Adapter Tuning With Task-Aware Attention Mechanism[C]. Rhodes Island, Greece, 3-10 June, 2023.

Deposit Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.