Locally weighted regression

Locally weighted regression: for each prediction at a query point $x$, we need to find the $\theta$ that minimizes

$$\sum_i w^{(i)} \big(y^{(i)} - \theta^T x^{(i)}\big)^2, \qquad w^{(i)} = \exp\!\left(-\frac{(x^{(i)} - x)^2}{2\tau^2}\right)$$

Locally weighted regression is computationally costly: the algorithm must fit a new theta vector from scratch for every single prediction.

Besides, locally weighted regression is a non-parametric learning algorithm, which means the number of parameters is not fixed: what we must keep around grows with the size of the training set.

Adding weights in this way means that training samples close to the query input get weights near 1, while samples far from the query input get weights near 0. Approximately, we can ignore the effect of the low-weight samples and consider only the high-weight ones, which turns the original problem into a local linear regression around the query point. Intuitively, this should lead to a good prediction.
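A minimal NumPy sketch of this idea (the bandwidth tau, the Gaussian weight form, and the toy data are illustrative assumptions following the standard formulation from Ng's notes):

```python
# Minimal sketch of locally weighted regression with NumPy.
# tau (bandwidth) and the toy data are assumptions for illustration.
import numpy as np

def lwr_predict(X, y, x_query, tau=0.5):
    """Fit theta by weighted least squares around x_query, then predict."""
    # Gaussian weights: nearby samples get weight near 1, far ones near 0
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2 * tau ** 2))
    W = np.diag(w)
    # Weighted normal equations: theta = (X^T W X)^{-1} X^T W y
    theta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return x_query @ theta

# Toy usage: noisy sine curve, with a bias column added to X
rng = np.random.default_rng(0)
x = np.linspace(0, 6, 100)
X = np.column_stack([np.ones_like(x), x])          # intercept term
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)
print(lwr_predict(X, y, np.array([1.0, 3.0])))     # prediction near x = 3
```

Note that theta is recomputed inside every call, which is exactly the per-prediction cost described above.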

As Andrew Ng mentioned, a KD-tree may help improve the efficiency of this algorithm. That makes sense: a KD-tree is very good at finding nearest neighbors, which are exactly the samples that receive non-negligible weight.
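A sketch of that speedup using SciPy's cKDTree (the neighbor count k and the helper name lwr_predict_kdtree are assumptions, not from the original post): only the k nearest neighbors get meaningful weight, so the weighted least squares is fit on them alone.

```python
# Assumed sketch: KD-tree restricts the fit to the k nearest samples.
import numpy as np
from scipy.spatial import cKDTree

def lwr_predict_kdtree(X, y, x_query, tau=0.5, k=20):
    tree = cKDTree(X)                        # in practice, build once and reuse
    dist, idx = tree.query(x_query, k=k)     # k nearest training samples
    Xn, yn = X[idx], y[idx]
    w = np.exp(-dist ** 2 / (2 * tau ** 2))  # Gaussian weights on neighbors only
    W = np.diag(w)
    theta = np.linalg.solve(Xn.T @ W @ Xn, Xn.T @ W @ yn)
    return x_query @ theta
```

The tree is rebuilt inside the function here only to keep the sketch self-contained; a real implementation would build it once over the training set and reuse it for every query.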

 

It is generally not a good idea to use regression algorithms to solve classification problems.
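A small illustration of why (an assumed example, not from the original post): thresholding a least-squares fit at 0.5 gives a sensible decision boundary on clean data, but a single far-away positive example drags the line and shifts the boundary badly.

```python
# Assumed example: least-squares regression is fragile for classification.
import numpy as np

def boundary(x, y):
    X = np.column_stack([np.ones_like(x), x])
    theta = np.linalg.lstsq(X, y, rcond=None)[0]
    return (0.5 - theta[0]) / theta[1]       # where theta0 + theta1*x = 0.5

x = np.array([0., 1., 2., 3., 4., 5.])
y = np.array([0., 0., 0., 1., 1., 1.])
print(boundary(x, y))                                  # 2.5, a sensible boundary
print(boundary(np.append(x, 50.), np.append(y, 1.)))   # ~3.7, now misclassifies x = 3
```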

posted on 2013-05-19 20:53 flytomylife