Abstract:
Lab: Batch Normalization Layer. What is a batch normalization layer? It is a layer that normalizes the output before the activation layer. "The original p… Read more
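The excerpt describes batch normalization as normalizing a layer's output before the activation is applied. Below is a minimal NumPy sketch of that idea, not code from the original post; the `batch_norm` helper, the batch shape, and the ReLU activation are assumptions for illustration:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize pre-activation outputs over the batch, then scale and shift.
    (Illustrative helper, not from the original post.)"""
    mean = x.mean(axis=0)                      # per-feature batch mean
    var = x.var(axis=0)                        # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)    # zero mean, unit variance
    return gamma * x_hat + beta                # learnable scale and shift

# Example: normalize a linear layer's output, then apply the activation (ReLU)
x = np.random.randn(32, 4) * 3 + 5             # assumed pre-activation batch
out = np.maximum(0, batch_norm(x, np.ones(4), np.zeros(4)))
```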
Abstract:
author: yangjing  date: 2018-10-24  KMeans  1. Process: The algorithm alternates between two steps: 1) assigning each data point to the closest cluster center, 2) setting each cluster center as the mean of the data points assigned to it… Read more
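A minimal sketch of those two alternating steps on toy data follows. It is illustrative only; the `kmeans` function name, the random initialization, and the convergence check are assumptions, not the original post's code:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Alternate the two steps from the excerpt. (Illustrative, not the post's code.)"""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]   # assumed random init
    for _ in range(n_iter):
        # Step 1: assign each data point to the closest cluster center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Step 2: move each center to the mean of its assigned points
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):                # stop when stable
            break
        centers = new_centers
    return labels, centers

# Toy usage: two well-separated 2-D blobs
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
labels, centers = kmeans(X, k=2)
```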
Abstract:
author: yangjing  time: 2018-10-22  Gradient Boosting Decision Tree  1. Main idea: The main idea behind GBDT is to combine many simple models (also known as weak learners)… Read more
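A minimal sketch of that idea for squared-error regression, fitting shallow scikit-learn trees sequentially on the residuals; the `gbdt_fit`/`gbdt_predict` helpers, learning rate, and tree depth are assumptions for illustration, not the original post's code:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gbdt_fit(X, y, n_trees=100, learning_rate=0.1, max_depth=2):
    """Fit shallow trees one after another on the residuals (the negative
    gradient of squared loss). Illustrative only, not the original post's code."""
    base = y.mean()                              # start from a constant prediction
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_trees):
        residual = y - pred                      # what the current ensemble misses
        tree = DecisionTreeRegressor(max_depth=max_depth)  # a simple (weak) model
        tree.fit(X, residual)
        pred += learning_rate * tree.predict(X)  # add a small correction
        trees.append(tree)
    return base, trees

def gbdt_predict(X, base, trees, learning_rate=0.1):
    pred = np.full(len(X), base)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

# Toy usage on noisy sine data
X = np.random.rand(200, 1) * 6
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(200)
base, trees = gbdt_fit(X, y)
y_hat = gbdt_predict(X, base, trees)
```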