Abstract:
Translation | Placing Search in Context: The Concept Revisited. Original abstract [1]: Keyword-based search engines are in widespread use today as a popular means for Web-based …
Abstract:
Translation | Improving Distributional Similarity with Lessons Learned from Word Embeddings. Teacher Ye Na says, "The best way to truly read a paper is to translate it." I think this is excellent research training, and it is especially well suited to exploring an unfamiliar field, because when I cannot understand a paper, it usually comes down to …
Abstract:
Derivative of the Softmax Loss Function. A softmax classifier: $$ p_j = \frac{\exp(o_j)}{\sum_{k}\exp(o_k)} $$ It has been used in a loss function of the form …
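For reference, assuming the loss the truncated snippet refers to is the usual cross-entropy $L = -\sum_j y_j \log p_j$ with one-hot targets $y$ (an assumption, since the post is cut off here), the chain rule gives

$$ \frac{\partial p_j}{\partial o_i} = p_j\,(\delta_{ij} - p_i), \qquad \frac{\partial L}{\partial o_i} = \sum_j \frac{\partial L}{\partial p_j}\,\frac{\partial p_j}{\partial o_i} = p_i - y_i, $$

so the gradient with respect to the logits is simply the predicted distribution minus the target distribution.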
Abstract:
Getting Started with Word2Vec. 1. Source by Google. Project with code: https://code.google.com/archive/p/word2vec/ Blog: "Learning Meaning Behind Words" …
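The post points to Google's original C implementation; as a quick way to try the same idea from Python, here is a minimal sketch assuming the third-party gensim library (4.x API) and a toy corpus invented purely for illustration:

```python
# Minimal sketch (not from the original post): gensim's Word2Vec as a Python
# stand-in for Google's C implementation.
from gensim.models import Word2Vec

# Toy corpus for illustration: each sentence is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# Train skip-gram vectors (sg=1); vector_size and window are illustrative.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Look up the learned vector and nearest neighbours for a word.
vec = model.wv["king"]
print(model.wv.most_similar("king", topn=3))
```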
Abstract:
Paper reading notes: Word Embeddings: A Survey. Takeaways: the definition of a word embedding: dense, distributed, fixed-length word vectors, built using word co-occurrence statistics as per …
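The "word co-occurrence statistics" part of that definition can be made concrete with a small sketch (mine, not the survey's): count how often each word appears within a fixed window of every other word, which is the raw statistic that count-based embedding methods start from.

```python
# Minimal sketch (not from the survey): window-based co-occurrence counts.
from collections import defaultdict

def cooccurrence_counts(sentences, window=2):
    counts = defaultdict(int)
    for tokens in sentences:
        for i, word in enumerate(tokens):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[(word, tokens[j])] += 1
    return counts

# Toy corpus for illustration; a row of these counts (optionally reweighted,
# e.g. with PMI) is a sparse vector that factorization can then compress into
# a dense, fixed-length embedding.
sentences = [["the", "cat", "sat", "on", "the", "mat"],
             ["the", "dog", "sat", "on", "the", "rug"]]
print(cooccurrence_counts(sentences)[("cat", "sat")])  # -> 1
```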
Abstract:
A Summary of Multi-task Learning, by Yubo Feng. Intro: In this paper [0], multi-task learning is introduced by way of the data-hungry …