Theory explanations:
1. Attention Is All You Need (the Transformer)
https://blog.csdn.net/weixin_38937984/article/details/101159195
2. The Seq2Seq model and the attention mechanism
https://blog.csdn.net/weixin_38937984/article/details/101111619
3. A detailed explanation of how attention works
https://blog.csdn.net/weixin_38937984/article/details/101119917
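The core operation behind the posts above is scaled dot-product attention from "Attention Is All You Need". As a quick orientation before reading them, here is a minimal NumPy sketch of that formula (softmax(QK^T / sqrt(d_k)) V); it is illustrative only and not taken from any of the linked articles:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_q, d_k) queries, K: (n_k, d_k) keys, V: (n_k, d_v) values.
    Returns the attended values with shape (n_q, d_v).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over the keys
    return weights @ V

# Tiny example: 2 queries attending over 3 key/value pairs, d_k = d_v = 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

Multi-head attention, as covered in the first link, simply runs this operation several times in parallel on learned linear projections of Q, K, and V.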
Code walkthroughs:
1. A detailed walkthrough of the Transformer code
https://blog.csdn.net/mijiaoxiaosan/article/details/74909076?utm_source=app
2. German-English (de-en) dataset download
https://wit3.fbk.eu/archive/2016-01//texts/de/en/de-en.tgz
3. Source code
https://github.com/Kyubyong/transformer/tree/master/tf1.2_legacy
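One building block worth understanding before reading the source code above is the sinusoidal positional encoding, which the Transformer adds to token embeddings since attention itself is order-agnostic. A small NumPy sketch of the paper's formula (sine on even dimensions, cosine on odd), written independently of the linked repository:

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encoding from "Attention Is All You Need".

    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    Returns an array of shape (max_len, d_model).
    """
    pos = np.arange(max_len)[:, None]   # positions, shape (max_len, 1)
    i = np.arange(d_model)[None, :]     # dimension indices, shape (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    # Even dimensions use sine, odd dimensions use cosine
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

pe = positional_encoding(50, 8)
print(pe.shape)  # (50, 8)
```

Because the encoding depends only on position and dimension, the same table can be precomputed once and added to every batch of embeddings.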