## Inserting Math Formulas in a Blog with LaTeX

2016-12-12 09:07  Fururur

## LaTeX Formula Basics

### Common Greek Letters

Lowercase Greek letters are written as \alpha, \beta, …, \omega, producing α, β, …, ω. For uppercase letters, use \Gamma, \Delta, …, \Omega, producing Γ, Δ, …, Ω.
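Before the symbols are useful, the formula has to be delimited. Assuming a MathJax-style renderer (which most blog platforms use), $…$ produces inline math and $$…$$ produces display math; a minimal sketch:

```latex
% Inline: rendered within the flow of the sentence
The identity $\sin^2\alpha + \cos^2\alpha = 1$ holds for every $\alpha$.

% Display: centered on its own line
$$\Gamma(n) = (n-1)!$$
```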

### Operations

• Fractions: \frac{}{}. For example, \frac{1+1}{2}+1: $$\frac{1+1}{2}+1$$
• Sums: \sum_1^n: $$\sum_1^n$$
• Integrals: \int_1^n: $$\int_1^n$$
• Limits: \lim_{x \to \infty}: $$\lim_{x \to \infty}$$
• Matrices: $$\begin{matrix}…\end{matrix}$$, with & separating elements within a row and \\ starting a new row. For example:
$$\begin{matrix} 1 & x & x^2 \\ 1 & y & y^2 \\ 1 & z & z^2 \\ \end{matrix}$$


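These operations compose freely inside one formula. Two small combined examples (my own, not from the original list):

```latex
% Limit of a sum (the Basel problem), using \lim, \sum, and \frac together
$$\lim_{n \to \infty} \sum_{k=1}^{n} \frac{1}{k^2} = \frac{\pi^2}{6}$$

% A definite integral with explicit bounds
$$\int_0^1 x^2 \, dx = \frac{1}{3}$$
```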

## Miscellaneous Examples

• Linear model: $$h(\theta)=\sum_{j=0}^n \theta_jx_j$$

• Mean squared error (cost function): $$J(\theta)=\frac1{2m}\sum_{i=0}^m(y^i-h_\theta(x^i))^2$$

• Gradient used in batch gradient descent: $$\frac{\partial J(\theta)}{\partial\theta_j}=-\frac1m\sum_{i=0}^m(y^i-h_\theta(x^i))x^i_j$$

• A piecewise function with the cases environment:
$$f(n) = \begin{cases} n/2, & \text{if n is even} \\ 3n+1, & \text{if n is odd} \end{cases}$$



• A system of equations, using \left\{ with the invisible closing delimiter \right.:
$$\left\{ \begin{array}{c} a_1x+b_1y+c_1z=d_1 \\ a_2x+b_2y+c_2z=d_2 \\ a_3x+b_3y+c_3z=d_3 \end{array} \right.$$



• A data matrix written element-wise and as a stack of row vectors:
$$X=\left( \begin{matrix} x_{11} & x_{12} & \cdots & x_{1d}\\ x_{21} & x_{22} & \cdots & x_{2d}\\ \vdots & \vdots & \ddots & \vdots\\ x_{m1} & x_{m2} & \cdots & x_{md}\\ \end{matrix} \right) =\left( \begin{matrix} x_1^T \\ x_2^T \\ \vdots\\ x_m^T \\ \end{matrix} \right)$$


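If the renderer supports the AMS matrix environments (MathJax does by default), pmatrix supplies the parentheses itself, so the \left( … \right) wrapper above can be dropped; a sketch:

```latex
$$X = \begin{pmatrix}
x_{11} & x_{12} & \cdots & x_{1d} \\
\vdots & \vdots & \ddots & \vdots \\
x_{m1} & x_{m2} & \cdots & x_{md}
\end{pmatrix}$$
```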

• A multi-line derivation of the gradient above, using the align environment:
\begin{align}
\frac{\partial J(\theta)}{\partial\theta_j}
& = \frac1m\sum_{i=0}^m(y^i-h_\theta(x^i)) \frac{\partial}{\partial\theta_j}(y^i-h_\theta(x^i)) \\
& = -\frac1m\sum_{i=0}^m(y^i-h_\theta(x^i)) \frac{\partial}{\partial\theta_j}(\sum_{j=0}^n\theta_jx_j^i-y^i) \\
& = -\frac1m\sum_{i=0}^m(y^i-h_\theta(x^i))x^i_j
\end{align}


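In align, each & marks the column the rows line up on and \\ ends a row; a minimal standalone sketch:

```latex
\begin{align}
(x+1)^2 &= (x+1)(x+1) \\
        &= x^2 + 2x + 1
\end{align}
```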