Attention Mechanism: Collected Resources

一、 Information decays as it is passed along

  • Anyone can learn to write RNN-LSTM code in Python!
    http://www.toutiao.com/i6402514355929219586/?tt_from=weixin&utm_campaign=client_share&from=groupmessage&app=news_article&utm_source=weixin&iid=9019236026&utm_medium=toutiao_android&wxshare_count=1
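The heading above names the motivation for attention: in a plain RNN, information fades as it is carried across time steps. A minimal NumPy sketch (illustrative only; all weights and sizes are made up, not taken from the linked article) shows the hidden-state signal shrinking when the recurrent weight has spectral radius below 1:

```python
import numpy as np

# Illustrative: with no new input, a vanilla RNN's hidden signal decays
# step by step — the vanishing problem that motivates LSTM and attention.
np.random.seed(0)

W_h = 0.5 * np.eye(4)               # recurrent weight, spectral radius < 1
W_x = np.random.randn(4, 3) * 0.1   # input weight (unused here: inputs are zero)
h = np.ones(4)                      # initial hidden state

norms = []
for t in range(10):
    x = np.zeros(3)                 # no new input: watch the old signal fade
    h = np.tanh(W_h @ h + W_x @ x)
    norms.append(np.linalg.norm(h))

print(norms[0], norms[-1])          # the norm shrinks toward zero
```

LSTM's gated cell state, and later attention, are two ways around this decay: the first protects information inside the cell, the second lets the decoder look back at all encoder states directly.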

二、 Superorganism (超智能体): LSTM in Practice

  • Recurrent neural networks — implementing LSTM with scan (Zhihu column) https://zhuanlan.zhihu.com/p/25821063
  • Recurrent neural networks — bidirectional LSTM & GRU (Zhihu column) https://zhuanlan.zhihu.com/p/25858226
  • Code demo LV3 · Superorganism https://yjango.gitbooks.io/superorganism/content/%E4%BB%A3%E7%A0%81%E6%BC%94%E7%A4%BAlv3.html
  • Awesome-rnn https://github.com/kjw0612/awesome-rnn
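The tutorials above implement LSTMs with framework primitives like `scan`; as a framework-free reference point, here is a minimal sketch of a single LSTM time step in NumPy (hypothetical toy weights and sizes; gate layout is one common convention, not the code from any linked post):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. Stacked gate order in W, U, b: i, f, g, o."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # all four gate pre-activations at once
    i = sigmoid(z[:n])              # input gate
    f = sigmoid(z[n:2*n])           # forget gate
    g = np.tanh(z[2*n:3*n])         # candidate cell update
    o = sigmoid(z[3*n:])            # output gate
    c = f * c_prev + i * g          # cell state: gated keep + gated write
    h = o * np.tanh(c)              # hidden state exposed to the next layer
    return h, c

# toy usage: hidden size 4, input size 3, a 5-step random sequence
rng = np.random.default_rng(0)
W = rng.normal(size=(16, 3)) * 0.1
U = rng.normal(size=(16, 4)) * 0.1
b = np.zeros(16)
h, c = np.zeros(4), np.zeros(4)
for x in rng.normal(size=(5, 3)):
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

The additive cell update `f * c_prev + i * g` is the key difference from the decaying vanilla RNN: when the forget gate stays near 1, information can survive many steps.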

三、 Blog Collection

  • https://blog.heuritech.com/2016/01/20/attention-mechanism/
  • http://smerity.com/articles/2016/google_nmt_arch.html
  • http://blog.evjang.com/2016/06/understanding-and-implementing.html
  • http://www.wildml.com/2016/01/attention-and-memory-in-deep-learning-and-nlp/
  • http://distill.pub/2016/augmented-rnns/
  • https://devblogs.nvidia.com/parallelforall/introduction-neural-machine-translation-with-gpus/
  • https://www.quora.com/What-is-exactly-the-attention-mechanism-introduced-to-RNN-recurrent-neural-network-It-would-be-nice-if-you-could-make-it-easy-to-understand
  • https://explosion.ai/blog/deep-learning-formula-nlp
  • https://research.googleblog.com/2016/09/a-neural-network-for-machine.html
  • https://blog.themusio.com/2016/03/25/attentionmemory-in-deep-learning/
  • http://torch.ch/blog/2015/09/21/rmva.html
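Most of these posts describe the same core computation: score each encoder state against the decoder's query, normalize the scores with softmax, and take the weighted sum as the context vector. A minimal dot-product sketch in NumPy (toy data; dot-product scoring is one choice among several the posts cover, e.g. Bahdanau's additive scoring):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())     # shift for numerical stability
    return e / e.sum()

def dot_attention(query, keys, values):
    """Score keys against the query, normalize, return the weighted sum."""
    scores = keys @ query        # (T,) one score per encoder state
    weights = softmax(scores)    # attention weights, sum to 1
    context = weights @ values   # (d,) context vector for the decoder
    return context, weights

# toy usage: 6 encoder states of dim 4; keys double as values here
rng = np.random.default_rng(1)
keys = rng.normal(size=(6, 4))
values = keys
query = rng.normal(size=4)       # decoder hidden state
context, weights = dot_attention(query, keys, values)
print(weights.sum())             # weights form a distribution
```

Because the context is rebuilt at every decoding step from all encoder states, nothing has to survive a long chain of recurrent updates — this is the fix for the decay problem in section 一.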

四、 Summary of Attention Models

  • Applications of the attention mechanism in natural language processing - robert_ai - cnblogs http://www.cnblogs.com/robert-dlut/p/5952032.html
posted @ 2022-11-13 22:35 dlhl