[regression] improved exercise

parent fd7d78cb54
commit 49f5687bfe

@@ -58,18 +58,20 @@
\begin{questions}

\question Implement the gradient descent for finding the parameters
of a straight line \[ y = mx+b \] that we want to fit to the data in
the file \emph{lin\_regression.mat}.

In the lecture we already prepared most of the necessary functions:
1. the cost function (\code{lsqError()}), and 2. the gradient
(\code{lsqGradient()}). Read chapter 8 ``Optimization and gradient
descent'' in the script, in particular section 8.4 and exercise 8.4!

\question We want to fit the straight line \[ y = mx+b \] to the
data in the file \emph{lin\_regression.mat}.

In the lecture we already prepared the cost function
(\code{lsqError()}) and the gradient (\code{lsqGradient()}) (read
chapter 8 ``Optimization and gradient descent'' in the script, in
particular section 8.4 and exercise 8.4!). With these functions in
place we now want to implement a gradient descent algorithm that
finds the minimum of the cost function and thus the slope and
intercept of the straight line that minimizes the squared distance
to the data values.

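One possible shape of such a descent loop is sketched below in Python (the course code itself is MATLAB); \code{lsq\_error()} and \code{lsq\_gradient()} are stand-ins assumed to mirror the lecture's \code{lsqError()} and \code{lsqGradient()}, and the starting values are arbitrary:

```python
import numpy as np

def lsq_error(x, y, p):
    # mean squared distance between the data and the line y = m*x + b
    m, b = p
    return np.mean((y - (m * x + b)) ** 2)

def lsq_gradient(x, y, p):
    # partial derivatives of the cost with respect to slope m and intercept b
    m, b = p
    resid = (m * x + b) - y
    return np.array([2.0 * np.mean(resid * x),   # d(cost)/dm
                     2.0 * np.mean(resid)])      # d(cost)/db

def descent_fit(x, y, eps=0.01, grad_thresh=1e-4, max_iter=10000):
    # gradient descent: repeatedly step against the gradient of the cost
    p = np.array([0.0, 0.0])  # arbitrary start: slope m0, intercept b0
    for _ in range(max_iter):
        g = lsq_gradient(x, y, p)
        if np.linalg.norm(g) < grad_thresh:  # stop when the cost is flat
            break
        p = p - eps * g  # step of size eps against the gradient
    return p
```

Note that $\epsilon$ must be small enough that the steps do not overshoot the minimum, otherwise the iteration diverges.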
The algorithm for the descent towards the minimum of the cost
function is as follows:

\begin{enumerate}
\item Start with some arbitrary parameter values (intercept $b_0$
and slope $m_0$, $\vec p_0 = (b_0, m_0)$ for the slope and the
@@ -106,9 +108,11 @@
\lstinputlisting{../code/descentfit.m}
\end{solution}

\part Find the position of the minimum of the cost function by
means of the \code{min()} function. Compare with the result of the
gradient descent method. Vary the value of $\epsilon$ and the

\part For checking the gradient descent method from (a) compare
its result for slope and intercept with the position of the
minimum of the cost function that you get when computing the cost
function for many values of the slope and intercept and then using
the \code{min()} function. Vary the value of $\epsilon$ and the
minimum gradient. What are good values such that the gradient
descent gets closest to the true minimum of the cost function?
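The grid-based check described here can be sketched as follows, again in Python rather than the course's MATLAB; the data are made up for illustration (standing in for \emph{lin\_regression.mat}), and \code{np.argmin}/\code{np.unravel\_index} take the role of MATLAB's \code{min()} with index output:

```python
import numpy as np

def lsq_error(x, y, m, b):
    # mean squared distance between the data and the line y = m*x + b
    return np.mean((y - (m * x + b)) ** 2)

# hypothetical data standing in for lin_regression.mat
x = np.linspace(0, 5, 50)
y = 2 * x + 1

# evaluate the cost function on a grid of slopes and intercepts
slopes = np.linspace(0, 4, 201)
intercepts = np.linspace(-2, 4, 201)
cost = np.array([[lsq_error(x, y, m, b) for b in intercepts]
                 for m in slopes])

# position of the smallest cost value on the grid
i, j = np.unravel_index(np.argmin(cost), cost.shape)
print(slopes[i], intercepts[j])  # grid estimate of slope and intercept
```

With the real data the grid estimate and the descent result should agree up to the grid resolution and the descent's stopping threshold.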
\begin{solution}