Abstract:
1. Visualize the attention weights of multiple heads in this experiment. from matplotlib import pyplot as plt; out = attention.attention.attention_weights.d… (Read more)
posted @ 2021-05-27 17:37
哈哈哈喽喽喽
Views (97)
Comments (0)
Recommended (0)
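The truncated snippet above ends mid-expression, so the post's exact code is not recoverable here. As a minimal, self-contained sketch of the same idea (plotting one heatmap per head), the following uses random softmax weights with hypothetical shapes in place of the post's real `attention.attention.attention_weights` tensor:

```python
# Hypothetical sketch: heatmap of per-head attention weights.
# Shapes and the random weights are illustrative, not the post's model.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
from matplotlib import pyplot as plt

num_heads, num_queries, num_keys = 4, 6, 10
rng = np.random.default_rng(0)
scores = rng.normal(size=(num_heads, num_queries, num_keys))
# Softmax over the key axis so each query's weights sum to 1
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)

fig, axes = plt.subplots(1, num_heads, figsize=(3 * num_heads, 3))
for h, ax in enumerate(axes):
    ax.imshow(weights[h], cmap="Reds", vmin=0, vmax=1)
    ax.set_title(f"Head {h}")
    ax.set_xlabel("Keys")
    ax.set_ylabel("Queries")
fig.savefig("multihead_attention_weights.png")
```

With a real model, `weights` would instead be the attention-weight tensor reshaped to `(num_heads, num_queries, num_keys)` before plotting.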
Abstract:
1. Modify the keys in the toy example and visualize the attention weights. Do additive attention and scaled dot-product attention still output the same attention… (Read more)
posted @ 2021-05-27 17:32
哈哈哈喽喽喽
Views (48)
Comments (0)
Recommended (0)
Abstract:
#2. What is the value of our learned w in the parametric attention pooling experiment? Why does it make the weighted region sharper when visualizing the… (Read more)
posted @ 2021-05-27 17:30
哈哈哈喽喽喽
Views (104)
Comments (0)
Recommended (0)
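The sharpening effect the question refers to can be seen directly in the parametric Nadaraya-Watson kernel, where the weights are softmax(-((q - k) * w)^2 / 2): a larger learned w scales up the squared distances, so the softmax concentrates more mass on the nearest keys. A minimal sketch with made-up key locations (not the post's trained value of w):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

x_keys = np.linspace(0, 5, 50)   # hypothetical training inputs (keys)
x_query = np.array([2.5])        # a single query location

def pooling_weights(w):
    # Parametric kernel: softmax(-((q - k) * w)^2 / 2) over the keys
    return softmax(-(((x_query[:, None] - x_keys[None, :]) * w) ** 2) / 2)

for w in (1.0, 5.0):
    wt = pooling_weights(w)
    print(f"w={w}: max weight = {wt.max():.3f}")
```

The maximum weight grows with w, which is exactly the "sharper weighted region" seen in the heatmap: the same keys are ranked the same way, but the distribution over them becomes more peaked.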
Abstract:
#1. What can be the volitional cue when decoding a sequence token by token in machine translation? What are the nonvolitional cues and the sensory inputs… (Read more)
posted @ 2021-05-27 17:26
哈哈哈喽喽喽
Views (86)
Comments (0)
Recommended (0)
