Summary of Machine Learning

Linear Regression with One Variable

Model Representation
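
For reference, the hypothesis used throughout this section is the single-variable linear model, which the cost function and update rules below assume:

$$
h_\theta \left( x \right) = \theta_0 + \theta_1 x
$$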

Cost Function

$$
J \left( \theta_0, \theta_1 \right) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta \left( x^{(i)} \right) - y^{(i)} \right)^2
$$
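
A minimal NumPy sketch of this cost computation (the function name `compute_cost` and the explicit `theta0`, `theta1` parameters are illustrative, not part of the original notes):

```python
import numpy as np

def compute_cost(x, y, theta0, theta1):
    """Squared-error cost J(theta0, theta1) for univariate linear regression."""
    m = len(y)                           # number of training examples
    predictions = theta0 + theta1 * x    # h_theta(x^(i)) for every example
    errors = predictions - y             # h_theta(x^(i)) - y^(i)
    return (1.0 / (2 * m)) * np.sum(errors ** 2)

# Example: a perfect fit y = 1 + 2x gives zero cost
x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 5.0, 7.0])
print(compute_cost(x, y, 1.0, 2.0))  # 0.0
```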

Gradient Descent

Repeat until convergence {

$$
\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J \left( \theta_0, \theta_1 \right) \qquad (\text{for } j = 0 \text{ and } j = 1)
$$
}
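
Substituting the cost function above into this rule gives the two partial derivatives used in the next section (a short derivation step, assuming $h_\theta(x) = \theta_0 + \theta_1 x$):

$$
\frac{\partial}{\partial \theta_0} J \left( \theta_0, \theta_1 \right) = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta \left( x^{(i)} \right) - y^{(i)} \right)
$$

$$
\frac{\partial}{\partial \theta_1} J \left( \theta_0, \theta_1 \right) = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta \left( x^{(i)} \right) - y^{(i)} \right) \cdot x^{(i)}
$$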

Gradient Descent for Linear Regression with One Variable

Repeat {

$$
\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta \left( x^{(i)} \right) - y^{(i)} \right)
$$

$$
\theta_1 := \theta_1 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta \left( x^{(i)} \right) - y^{(i)} \right) \cdot x^{(i)}
$$
}
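
A minimal NumPy sketch of these updates with simultaneous assignment of both parameters (the function name `gradient_descent`, the learning rate, and the iteration count are illustrative choices, not from the original notes):

```python
import numpy as np

def gradient_descent(x, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent for h_theta(x) = theta0 + theta1 * x."""
    m = len(y)
    theta0, theta1 = 0.0, 0.0
    for _ in range(num_iters):
        errors = (theta0 + theta1 * x) - y      # h_theta(x^(i)) - y^(i)
        grad0 = (1.0 / m) * np.sum(errors)      # partial derivative w.r.t. theta0
        grad1 = (1.0 / m) * np.sum(errors * x)  # partial derivative w.r.t. theta1
        # simultaneous update of both parameters
        theta0, theta1 = theta0 - alpha * grad0, theta1 - alpha * grad1
    return theta0, theta1

# Example: recover y = 1 + 2x from four points
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])
print(gradient_descent(x, y, alpha=0.1, num_iters=2000))  # approx (1.0, 2.0)
```

Note that both gradients are computed from the same current values of $\theta_0$ and $\theta_1$ before either parameter is overwritten, which is what the simultaneous update in the algorithm requires.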