[regression] updated exercise to new chapter

Jan Benda 2019-12-10 22:33:48 +01:00
parent 3d600e6ab7
commit b7f6abfc94
2 changed files with 8 additions and 8 deletions


@@ -62,13 +62,13 @@
 data in the file \emph{lin\_regression.mat}.
 In the lecture we already prepared the cost function
-(\code{lsqError()}), and the gradient (\code{lsqGradient()}) (read
-chapter 8 ``Optimization and gradient descent'' in the script, in
-particular section 8.4 and exercise 8.4!). With these functions in
-place we now want to implement a gradient descent algorithm that
-finds the minimum of the cost function and thus the slope and
-intercept of the straight line that minimizes the squared distance
-to the data values.
+(\code{meanSquaredError()}), and the gradient
+(\code{meanSquaredGradient()}) (read chapter 8 ``Optimization and
+gradient descent'' in the script, in particular section 8.4 and
+exercise 8.4!). With these functions in place we now want to
+implement a gradient descent algorithm that finds the minimum of the
+cost function and thus the slope and intercept of the straight line
+that minimizes the squared distance to the data values.
 The algorithm for the descent towards the minimum of the cost
 function is as follows:
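For orientation (not part of this commit): the bodies of the renamed helper functions are prepared in the lecture, so the following MATLAB sketch of what meanSquaredError() and meanSquaredGradient() might compute for a straight line y = m*x + b is an assumption about their behavior, not the script's actual code. Here p = [m, b] holds slope and intercept, and x, y are the data vectors.

    function mse = meanSquaredError(x, y, p)
        % mean squared distance between the data and the line m*x + b
        mse = mean((y - (p(1)*x + p(2))).^2);
    end

    function gradient = meanSquaredGradient(x, y, p)
        % analytic partial derivatives of the mean squared error
        residuals = y - (p(1)*x + p(2));
        dmsedm = -2.0*mean(residuals.*x);  % derivative with respect to the slope
        dmsedb = -2.0*mean(residuals);     % derivative with respect to the intercept
        gradient = [dmsedm, dmsedb];
    end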
@@ -86,7 +86,7 @@
 why we just require the gradient to be sufficiently small
 (e.g. \code{norm(gradient) < 0.1}).
 \item \label{gradientstep} Move against the gradient by a small step
-($\epsilon = 0.01$):
+$\epsilon = 0.01$:
 \[\vec p_{i+1} = \vec p_i - \epsilon \cdot \nabla f_{cost}(m_i, b_i)\]
 \item Repeat steps \ref{computegradient} -- \ref{gradientstep}.
 \end{enumerate}
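The enumerated steps map directly onto a loop. A minimal MATLAB sketch, assuming the data file provides vectors x and y and that meanSquaredGradient() takes the parameter vector p = [m, b] as above (both assumptions, since the actual interface is defined in the lecture):

    load('lin_regression.mat');               % assumed to provide x and y
    p = [-2.0, 10.0];                         % arbitrary starting values for slope and intercept
    epsilon = 0.01;                           % step size from the exercise
    gradient = meanSquaredGradient(x, y, p);
    while norm(gradient) >= 0.1               % stop once the gradient is sufficiently small
        p = p - epsilon*gradient;             % step against the gradient
        gradient = meanSquaredGradient(x, y, p);
    end

The stopping criterion and step size are the ones given in the exercise; a fixed $\epsilon$ keeps the sketch simple, at the price that convergence speed depends on its value.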

Binary file not shown.