[regression] updated exercise

Jan Benda 2018-12-11 16:04:53 +01:00
parent 6391c63765
commit 8676cdee2e


@@ -20,7 +20,7 @@
 \else
 \newcommand{\stitle}{}
 \fi
-\header{{\bfseries\large Exercise 11\stitle}}{{\bfseries\large Gradient descent}}{{\bfseries\large January 9th, 2018}}
+\header{{\bfseries\large Exercise 10\stitle}}{{\bfseries\large Gradient descent}}{{\bfseries\large December 17th, 2018}}
 \firstpagefooter{Dr. Jan Grewe}{Phone: 29 74588}{Email:
 jan.grewe@uni-tuebingen.de}
 \runningfooter{}{\thepage}{}
@@ -59,12 +59,11 @@
 \begin{questions}
 \question Implement the gradient descent for finding the parameters
-of a straight line that we want to fit to the data in the file
-\emph{lin\_regression.mat}.
+of a straight line \[ y = mx+b \] that we want to fit to the data in
+the file \emph{lin\_regression.mat}.
 In the lecture we already prepared most of the necessary functions:
-1. the error function (\code{meanSquareError()}), 2. the cost
-function (\code{lsqError()}), and 3. the gradient
+1. the cost function (\code{lsqError()}), and 2. the gradient
 (\code{lsqGradient()}). Read chapter 8 ``Optimization and gradient
 descent'' in the script, in particular section 8.4 and exercise 8.4!
@@ -72,8 +71,9 @@
 function is as follows:
 \begin{enumerate}
-\item Start with some arbitrary parameter values $\vec p_0 = (m_0, b_0)$
-for the slope and the intercept of the straight line.
+\item Start with some arbitrary parameter values (intercept $b_0$
+and slope $m_0$), $\vec p_0 = (b_0, m_0)$, for the straight line.
 \item \label{computegradient} Compute the gradient of the cost function
 at the current values of the parameters $\vec p_i$.
 \item If the magnitude (length) of the gradient is smaller than some