
A review of matrix basics

Matrix transpose:     $A^T = (a_{ji})$,    where $A = (a_{ij})$

Matrix conjugate:     $A^* = (\bar{a}_{ij})$,    where $A = (a_{ij})$; combining the two gives the conjugate transpose $A^{*T} = (\bar{a}_{ji})$, which appears throughout the derivation below.
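To make these definitions concrete, here is a minimal NumPy sketch; the matrix `A` and its entries are arbitrary illustration data, not part of the original text:

```python
import numpy as np

# An arbitrary complex matrix A = (a_ij), used only as illustration data.
A = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 + 0j]])

print(A.T)          # transpose            A^T    = (a_ji)
print(A.conj())     # conjugate            A^*    = (conj(a_ij))
print(A.conj().T)   # conjugate transpose  A^{*T} = (conj(a_ji))
```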

//----------------------------------------------------------------------------------------------------

 

The following is reproduced from: http://fourier.eng.hmc.edu/e161/lectures/klt/node3.html

Explanations and notes have been added to it in the text below.

Karhunen-Loeve Transform (KLT)

Now we consider the Karhunen-Loeve Transform (KLT) (also known as the Hotelling Transform and the Eigenvector Transform), which is closely related to Principal Component Analysis (PCA) and widely used in data analysis in many fields.

Let ${\bf\phi}_k$ be the eigenvector corresponding to the kth eigenvalue $\lambda_k$ of the covariance matrix ${\bf\Sigma}_x$, i.e.,

\begin{displaymath}{\bf\Sigma}_x {\bf\phi}_k=\lambda_k{\bf\phi}_k\;\;\;\;\;\;(k=1,\cdots,N) \end{displaymath}


or in matrix form:

\begin{displaymath}
\left[ \begin{array}{ccc}
\sigma_{11} & \cdots & \sigma_{1N} \\
\vdots & \ddots & \vdots \\
\sigma_{N1} & \cdots & \sigma_{NN}
\end{array} \right]
\left[ \begin{array}{c} \phi_{k1} \\ \vdots \\ \phi_{kN} \end{array} \right]
=\lambda_k
\left[ \begin{array}{c} \phi_{k1} \\ \vdots \\ \phi_{kN} \end{array} \right]
\;\;\;\;\;\;(k=1,\cdots,N)
\end{displaymath}
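As an illustration, here is a minimal NumPy sketch of this eigen-equation, assuming a small synthetic real-valued signal; the data, the mixing matrix, and all variable names are assumptions made for this example only:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative data: 500 samples of a correlated 3-dimensional real signal.
X = rng.standard_normal((500, 3)) @ np.array([[2.0, 0.5, 0.0],
                                              [0.0, 1.0, 0.3],
                                              [0.0, 0.0, 0.2]])
Sigma_x = np.cov(X, rowvar=False)    # estimated covariance matrix Sigma_x

# eigh is for Hermitian/symmetric matrices; eigenvalues are returned in
# ascending order, with the eigenvectors phi_k as the columns of Phi.
lam, Phi = np.linalg.eigh(Sigma_x)

# Each column satisfies the eigen-equation Sigma_x phi_k = lambda_k phi_k:
for k in range(len(lam)):
    assert np.allclose(Sigma_x @ Phi[:, k], lam[k] * Phi[:, k])
```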


As the covariance matrix ${\bf\Sigma}_x={\bf\Sigma}_x^{*T}$ is Hermitian (symmetric if ${\bf x}$ is real), its eigenvectors ${\bf\phi}_i$ are orthogonal, and after normalization orthonormal:

(A Hermitian matrix is a generalization of a symmetric matrix.)

\begin{displaymath}
\langle {\bf\phi}_i,{\bf\phi}_j\rangle={\bf\phi}^{*T}_i {\bf\phi}_j
=\left\{ \begin{array}{ll} 1 & i=j \\ 0 & i\ne j \end{array} \right.
\end{displaymath}


and we can construct an $N \times N$ unitary (orthogonal if ${\bf x}$ is real) matrix ${\bf\Phi}$

\begin{displaymath}{\bf\Phi}\stackrel{\triangle}{=}[{\bf\phi}_1, \cdots,{\bf\phi}_{N}] \end{displaymath}


satisfying

(A unitary matrix is a generalization of an orthogonal matrix.)

\begin{displaymath}{\bf\Phi}^{*T} {\bf\Phi} = {\bf I},\;\;\;\;\mbox{i.e.,}\;\;\;\;
{\bf\Phi}^{-1}={\bf\Phi}^{*T} \end{displaymath}
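Continuing the numerical sketch (synthetic data as before), we can verify that the eigenvector matrix ${\bf\Phi}$ is unitary (orthogonal here, since the data is real); `np.linalg.eigh` already returns orthonormal eigenvectors as columns:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))      # 500 samples of a 3-D real signal
Sigma_x = np.cov(X, rowvar=False)      # covariance matrix Sigma_x
lam, Phi = np.linalg.eigh(Sigma_x)     # columns of Phi are phi_1, ..., phi_N

# Orthonormality <phi_i, phi_j> = delta_ij, i.e. Phi^{*T} Phi = I:
assert np.allclose(Phi.conj().T @ Phi, np.eye(3))
# Hence the inverse of Phi is its conjugate transpose:
assert np.allclose(np.linalg.inv(Phi), Phi.conj().T)
```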


The $N$ eigen-equations above can be combined into a single matrix equation:

\begin{displaymath}{\bf\Sigma}_x{\bf\Phi}={\bf\Phi}{\bf\Lambda} \end{displaymath}


or in matrix form:

\begin{displaymath}
\left[ \begin{array}{ccc}
\sigma_{11} & \cdots & \sigma_{1N} \\
\vdots & \ddots & \vdots \\
\sigma_{N1} & \cdots & \sigma_{NN}
\end{array} \right]
\left[ {\bf\phi}_1,\cdots,{\bf\phi}_{N} \right]
=\left[ {\bf\phi}_1,\cdots,{\bf\phi}_{N} \right]
\left[ \begin{array}{ccc}
\lambda_1 & \cdots & 0 \\
\vdots & \ddots & \vdots \\
0 & \cdots & \lambda_{N}
\end{array} \right]
\end{displaymath}


Here ${\bf\Lambda}$ is a diagonal matrix ${\bf\Lambda}=\mbox{diag}(\lambda_1, \cdots,
\lambda_{N})$. Left-multiplying both sides by ${\bf\Phi}^{*T}={\bf\Phi}^{-1}$, the covariance matrix ${\bf\Sigma}_x$ can be diagonalized:

\begin{displaymath}
{\bf\Phi}^{*T}{\bf\Sigma}_x{\bf\Phi}={\bf\Phi}^{-1}{\bf\Sigma}_x{\bf\Phi}
={\bf\Phi}^{-1}{\bf\Phi}{\bf\Lambda}={\bf\Lambda}
\end{displaymath}
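The diagonalization can be checked numerically the same way (synthetic data again, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
Sigma_x = np.cov(X, rowvar=False)
lam, Phi = np.linalg.eigh(Sigma_x)

# Phi^{*T} Sigma_x Phi = Lambda = diag(lambda_1, ..., lambda_N):
Lambda = Phi.conj().T @ Sigma_x @ Phi
assert np.allclose(Lambda, np.diag(lam))
```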


Now, given a signal vector ${\bf x}$, we can define a unitary (orthogonal if ${\bf x}$ is real) Karhunen-Loeve Transform of ${\bf x}$ as:

\begin{displaymath}
{\bf y}=\left[ \begin{array}{c} y_1 \\ \vdots \\ y_{N} \end{array} \right]
\stackrel{\triangle}{=}{\bf\Phi}^{*T}{\bf x}
=\left[ \begin{array}{c} {\bf\phi}_1^{*T} \\ \vdots \\ {\bf\phi}_{N}^{*T} \end{array} \right]
\left[ \begin{array}{c} x_1 \\ \vdots \\ x_N \end{array} \right]
\end{displaymath}


where the $i$th component $y_i$ of the transformed vector is the projection of ${\bf x}$ onto ${\bf\phi}_i$:

\begin{displaymath}
y_i=\langle {\bf\phi}_i,{\bf x} \rangle={\bf\phi}_i^{*T}{\bf x}
\end{displaymath}
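Here is a minimal sketch of the forward transform applied to a single sample vector; `x` and `y` mirror the symbols ${\bf x}$ and ${\bf y}$ above, and the data is synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
Sigma_x = np.cov(X, rowvar=False)
lam, Phi = np.linalg.eigh(Sigma_x)

x = X[0]                    # one signal vector
y = Phi.conj().T @ x        # forward KLT: y = Phi^{*T} x

# Each component y_i is the projection of x onto phi_i:
for i in range(len(y)):
    assert np.allclose(y[i], Phi[:, i].conj() @ x)
```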


Left-multiplying both sides of the transform ${\bf y}={\bf\Phi}^{*T} {\bf x}$ by ${\bf\Phi}=({\bf\Phi}^{*T})^{-1}$, we get the inverse transform:

\begin{displaymath}
{\bf x}={\bf\Phi} {\bf y}
=\left[ {\bf\phi}_1,\cdots,{\bf\phi}_{N} \right]
\left[ \begin{array}{c} y_1 \\ \vdots \\ y_{N} \end{array} \right]
=\sum_{i=1}^{N} y_i {\bf\phi}_i
\end{displaymath}
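And a sketch of the inverse transform, verifying both the matrix form ${\bf x}={\bf\Phi}{\bf y}$ and the expansion $\sum_i y_i{\bf\phi}_i$ on the same synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
Sigma_x = np.cov(X, rowvar=False)
lam, Phi = np.linalg.eigh(Sigma_x)

x = X[0]
y = Phi.conj().T @ x        # forward transform

# Inverse transform x = Phi y, i.e. the expansion sum_i y_i phi_i:
x_rec = Phi @ y
assert np.allclose(x_rec, x)
assert np.allclose(sum(y[i] * Phi[:, i] for i in range(len(y))), x)
```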


We see that by this transform, the signal vector ${\bf x}$ is now expressed in an $N$-dimensional space spanned by the $N$ eigenvectors ${\bf\phi}_i$ ($i=1,\cdots,N$), which serve as the basis vectors of the space.

 