Transformer Architecture

Blog: https://www.cnblogs.com/zylyehuo/

Reference: highly recommended! NTU Hung-yi Lee's (李宏毅) detailed walkthrough of self-attention and the Transformer.

Inference

Encoder

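The encoder stacks blocks of multi-head self-attention plus a position-wise feed-forward network, each wrapped with residual connections and layer normalization. Below is a minimal single-head sketch of the attention step only, with toy NumPy dimensions; the weight matrices are made-up placeholders, not learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    """x: (seq_len, d_model); wq/wk/wv: (d_model, d_k) projections."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])  # (seq_len, seq_len) similarities
    return softmax(scores) @ v               # each row: weighted sum of values

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                  # 4 tokens, d_model = 8 (toy sizes)
wq, wk, wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, wq, wk, wv)          # same shape as the input: (4, 8)
```

A full encoder block would follow this with `x + out` (residual), layer norm, and a two-layer feed-forward network applied per position.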

Decoder

AT (Autoregressive)

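An autoregressive (AT) decoder emits one token at a time: a causal mask stops each position from attending to later positions, and at inference the model's own previous outputs are fed back in until an end-of-sequence token appears. A sketch of the mask and the greedy loop; `next_token_logits` here is a hypothetical stand-in, since a real decoder would run masked self-attention plus cross-attention over the encoder output.

```python
import numpy as np

def causal_mask(n):
    """Lower-triangular mask: position i may attend only to positions <= i."""
    return np.tril(np.ones((n, n), dtype=bool))

def greedy_decode(next_token_logits, bos=0, eos=3, max_len=10):
    """Feed the growing prefix back in, pick argmax each step, stop at EOS."""
    seq = [bos]
    for _ in range(max_len):
        tok = int(np.argmax(next_token_logits(seq)))
        seq.append(tok)
        if tok == eos:
            break
    return seq

# Toy scorer that just favors token (prefix length mod 4), to show the loop.
demo = greedy_decode(lambda seq: np.eye(4)[len(seq) % 4])
```

Beam search replaces the argmax with a set of top-scoring prefixes, but the step-by-step dependency stays the same.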

NAT (Non-Autoregressive)

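A non-autoregressive (NAT) decoder instead predicts every output position in one parallel pass, typically after a separate component predicts the target length; decoding is much faster, but quality usually trails AT. A sketch of the parallel step, where `position_logits` is a hypothetical stand-in for the model:

```python
import numpy as np

def nat_decode(position_logits, length):
    """One shot: score all `length` positions at once, then argmax per position."""
    logits = position_logits(length)          # (length, vocab) in a single pass
    return [int(t) for t in np.argmax(logits, axis=-1)]

# Toy scorer: position i favors token i mod 4, to show the parallel argmax.
demo = nat_decode(lambda n: np.eye(4)[np.arange(n) % 4], length=4)
```

Because every position is decided independently, NAT can mix tokens from different valid outputs (the multi-modality problem discussed in the lecture).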

Encoder-Decoder

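The decoder connects to the encoder through cross-attention: queries come from the decoder states, while keys and values come from the encoder output, so the two sequences may have different lengths. A sketch mirroring the self-attention above, again with toy shapes and made-up weight matrices:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(dec_x, enc_out, wq, wk, wv):
    """Queries from the decoder; keys/values from the encoder output."""
    q = dec_x @ wq                             # (dec_len, d_k)
    k, v = enc_out @ wk, enc_out @ wv          # (enc_len, d_k) each
    scores = q @ k.T / np.sqrt(k.shape[-1])    # (dec_len, enc_len)
    return softmax(scores) @ v                 # (dec_len, d_k)

rng = np.random.default_rng(0)
dec_x = rng.normal(size=(3, 8))                # 3 decoder positions
enc_out = rng.normal(size=(5, 8))              # 5 encoder positions
wq, wk, wv = (rng.normal(size=(8, 8)) for _ in range(3))
ca = cross_attention(dec_x, enc_out, wq, wk, wv)
```

Each decoder position thus pulls in a weighted summary of the whole source sentence before producing its output distribution.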

Train

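Training uses teacher forcing: the ground-truth sequence (shifted right) is fed to the decoder regardless of what the model would have predicted, and the per-token cross-entropy between the predicted distributions and the target tokens is minimized. A minimal sketch of that loss:

```python
import numpy as np

def cross_entropy(logits, targets):
    """logits: (seq_len, vocab); targets: (seq_len,) ground-truth token ids."""
    logits = logits - logits.max(axis=-1, keepdims=True)          # stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=-1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()    # avg NLL

# Uniform logits over a vocab of 4 give exactly log(4) nats per token.
uniform_loss = cross_entropy(np.zeros((4, 4)), np.arange(4))
```

Because the true prefix is always supplied at train time but not at test time, the mismatch (exposure bias) is often softened by occasionally feeding the model wrong tokens, i.e. scheduled sampling.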

posted @ 2025-07-12 23:35  zylyehuo