From summation to matrix

$x, y \in \mathbb{R}^n$

$x^T y = \sum_{i=1}^n x_i y_i \in \mathbb{R}$. Note how this identity extends below.

X is an $m \times n$ matrix.

The $j$th diagonal element of $X^T X$ is $\sum_i X_{ij}^2$.

$\sum_i \sum_j X_{ij}^2 = \sum_j (X^T X)_{jj} = \operatorname{tr}(X^T X)$
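A quick numerical check of the trace identity, sketched in NumPy (the 4×3 shape and the seed are arbitrary illustrative choices):

```python
import numpy as np

# An arbitrary 4x3 matrix to test the identity numerically.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))

sum_of_squares = (X ** 2).sum()   # sum_i sum_j X_ij^2
trace_form = np.trace(X.T @ X)    # tr(X^T X)

assert np.isclose(sum_of_squares, trace_form)
```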

Each $x^{(i)}$ is an $n \times 1$ vector, and $\vec y \in \mathbb{R}^m$.

$X^T = [x^{(1)}\; x^{(2)}\; \dots\; x^{(m)}]$, where $m$ is the number of training examples and $n$ is the number of features.

$X^T \vec y = \sum_{i=1}^m x^{(i)} y^{(i)} \in \mathbb{R}^n$
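The same identity checked numerically; the data here are hypothetical, with rows of `X` playing the role of $x^{(i)T}$:

```python
import numpy as np

# Hypothetical data: m = 5 examples, n = 3 features.
rng = np.random.default_rng(1)
m, n = 5, 3
X = rng.standard_normal((m, n))   # row i is x^(i)T
y = rng.standard_normal(m)

# X^T y as one matrix-vector product ...
direct = X.T @ y
# ... equals the y^(i)-weighted sum of the vectors x^(i).
summed = sum(y[i] * X[i] for i in range(m))

assert np.allclose(direct, summed)
```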

 

$(X^TX)_{ij} = \sum_{k=1}^m x_i^{(k)} x_j^{(k)}$, because

$X^TX = x^{(1)}x^{(1)T} + x^{(2)}x^{(2)T} + \dots + x^{(m)}x^{(m)T}$, and each $x^{(k)}x^{(k)T}$ is an $n \times n$ matrix whose $(i, j)$ element is $x_i^{(k)} x_j^{(k)}$,

so $X^TX = \left[ \sum_{k=1}^m x_i^{(k)} x_j^{(k)} \right]$.
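The decomposition of $X^TX$ into rank-one outer products can be verified directly (again with arbitrary hypothetical data):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 5, 3
X = rng.standard_normal((m, n))   # row k is x^(k)T

# X^T X as a sum of m rank-one outer products x^(k) x^(k)T.
outer_sum = sum(np.outer(X[k], X[k]) for k in range(m))

assert np.allclose(X.T @ X, outer_sum)
```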

 

$\sum_{i=1}^n \sum_{j=1}^n x_i x_j = \sum_{i=1}^n x_i \sum_{j=1}^n x_j = \left( \sum_{i=1}^n x_i \right)^2$
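A tiny sanity check on a concrete vector: all pairwise products sum to the square of the total.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])

# sum_i sum_j x_i * x_j over all n^2 pairs.
double_sum = sum(x[i] * x[j] for i in range(len(x)) for j in range(len(x)))

assert np.isclose(double_sum, x.sum() ** 2)   # (1+2+3+4)^2 = 100
```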

 

$w = \sum_{i=1}^m \alpha_i y^{(i)} x^{(i)}$; the $j$th element of $w$ is $w_j = \sum_{i=1}^m \alpha_i y^{(i)} x_j^{(i)}$.

Put another way, the same sum can be written with either dummy index: $w = \sum_{i=1}^m \alpha_i y^{(i)} x^{(i)} = \sum_{j=1}^m \alpha_j y^{(j)} x^{(j)}$.

$\left\| w \right\|^2 = w^Tw = \left( \sum_{i=1}^m \alpha_i y^{(i)} x^{(i)} \right)^T \sum_{j=1}^m \alpha_j y^{(j)} x^{(j)} = \sum_{i=1}^m \sum_{j=1}^m \alpha_i \alpha_j y^{(i)} y^{(j)} x^{(i)T} x^{(j)}$
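The double-sum expansion of $\|w\|^2$ can be checked numerically as well; the labels and multipliers below are hypothetical placeholders, not fitted SVM values:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 6, 3
X = rng.standard_normal((m, n))        # row i is x^(i)T
y = rng.choice([-1.0, 1.0], size=m)    # hypothetical labels y^(i)
alpha = rng.random(m)                  # hypothetical multipliers alpha_i

# w = sum_i alpha_i y^(i) x^(i)
w = sum(alpha[i] * y[i] * X[i] for i in range(m))

# ||w||^2 via the double sum over (i, j) pairs.
double_sum = sum(
    alpha[i] * alpha[j] * y[i] * y[j] * (X[i] @ X[j])
    for i in range(m)
    for j in range(m)
)

assert np.isclose(w @ w, double_sum)
```

The double sum depends on the data only through the inner products $x^{(i)T} x^{(j)}$, which is exactly what makes the kernel trick possible.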

 

posted @ 2012-09-28 22:59  sidereal