[regression] updated exercise to new chapter
parent 3d600e6ab7
commit b7f6abfc94
@@ -62,13 +62,13 @@
 data in the file \emph{lin\_regression.mat}.
 
 In the lecture we already prepared the cost function
-(\code{lsqError()}), and the gradient (\code{lsqGradient()}) (read
-chapter 8 ``Optimization and gradient descent'' in the script, in
-particular section 8.4 and exercise 8.4!). With these functions in
-place we here want to implement a gradient descend algorithm that
-finds the minimum of the cost function and thus the slope and
-intercept of the straigth line that minimizes the squared distance
-to the data values.
+(\code{meanSquaredError()}), and the gradient
+(\code{meanSquaredGradient()}) (read chapter 8 ``Optimization and
+gradient descent'' in the script, in particular section 8.4 and
+exercise 8.4!). With these functions in place we here want to
+implement a gradient descent algorithm that finds the minimum of the
+cost function and thus the slope and intercept of the straight line
+that minimizes the squared distance to the data values.
 
 The algorithm for the descent towards the minimum of the cost
 function is as follows:
@@ -86,7 +86,7 @@
 why we just require the gradient to be sufficiently small
 (e.g. \code{norm(gradient) < 0.1}).
 \item \label{gradientstep} Move against the gradient by a small step
-($\epsilon = 0.01$):
+$\epsilon = 0.01$:
 \[\vec p_{i+1} = \vec p_i - \epsilon \cdot \nabla f_{cost}(m_i, b_i)\]
 \item Repeat steps \ref{computegradient} -- \ref{gradientstep}.
 \end{enumerate}
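For orientation, a minimal MATLAB sketch of the descent loop described by the steps above follows. The call signatures of meanSquaredGradient() and meanSquaredError() (data vectors plus a parameter vector p = [m; b]) and the variable names x and y inside lin_regression.mat are assumptions for illustration, not taken from the diff:

% Minimal gradient descent sketch for the exercise (assumed signatures).
load('lin_regression.mat');               % assumed to provide vectors x and y

p = [-2.0; 10.0];                         % arbitrary start values for slope m and intercept b
epsilon = 0.01;                           % step size from the exercise
gradient = meanSquaredGradient(x, y, p);  % compute the gradient at the current point
while norm(gradient) > 0.1                % stop when the gradient is sufficiently small
    p = p - epsilon * gradient;           % move against the gradient by a small step
    gradient = meanSquaredGradient(x, y, p);
end
fprintf('m = %.3f, b = %.3f, cost = %.3f\n', ...
        p(1), p(2), meanSquaredError(x, y, p));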
BIN  regression/exercises/lin_regression.mat  Normal file
Binary file not shown.