Transformer Paper Collection – Download


Date: 2025-05-26 09:57  Source: http://www.java1234.com  Author: reposted

 
 
Main content:
 

1 Introduction
Transformer has been the most widely used architecture for machine translation (Vaswani et al., 2017). Despite its strong performance, Transformer's decoding is inefficient because its probability model adopts a sequential auto-regressive factorization (Figure 1a). Recent work, such as the non-autoregressive Transformer (NAT), aims to decode target tokens in parallel to speed up generation (Gu et al., 2018). However, the vanilla NAT still lags behind Transformer in translation quality, with a gap of about 7.0 BLEU points. NAT assumes that the target tokens are conditionally independent given the source sentence. We suspect that this conditional independence assumption prevents NAT from learning word interdependency in the target sentence. Such word interdependency is crucial, and the Transformer captures it explicitly by decoding from left to right (Figure 1a).
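The two factorizations contrasted above can be written out explicitly. As a sketch (notation assumed here: x is the source sentence and y = y_1 … y_T the target sentence; the excerpt itself does not fix these symbols):

```latex
% Autoregressive Transformer: each target token is conditioned on all
% previously generated tokens, forcing sequential left-to-right decoding.
p_{\mathrm{AT}}(y \mid x) = \prod_{t=1}^{T} p\!\left(y_t \mid y_{<t},\, x\right)

% Vanilla NAT: target tokens are assumed conditionally independent given
% the source, so all T tokens can be predicted in one parallel pass.
p_{\mathrm{NAT}}(y \mid x) = \prod_{t=1}^{T} p\!\left(y_t \mid x\right)
```

The speedup of NAT comes from dropping the y_{<t} term in every factor, which is exactly the conditional independence assumption that, the authors suspect, prevents the model from learning target-side word interdependency.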


 

