[regression] updated exercise
commit 8676cdee2e
parent 6391c63765
@@ -20,7 +20,7 @@
 \else
 \newcommand{\stitle}{}
 \fi
-\header{{\bfseries\large Exercise 11\stitle}}{{\bfseries\large Gradient descent}}{{\bfseries\large January 9th, 2018}}
+\header{{\bfseries\large Exercise 10\stitle}}{{\bfseries\large Gradient descent}}{{\bfseries\large December 17th, 2018}}
 \firstpagefooter{Dr. Jan Grewe}{Phone: 29 74588}{Email:
 jan.grewe@uni-tuebingen.de}
 \runningfooter{}{\thepage}{}
@@ -59,12 +59,11 @@
 \begin{questions}
 
 \question Implement the gradient descent for finding the parameters
-of a straigth line that we want to fit to the data in the file
-\emph{lin\_regression.mat}.
+of a straigth line \[ y = mx+b \] that we want to fit to the data in
+the file \emph{lin\_regression.mat}.
 
 In the lecture we already prepared most of the necessary functions:
-1. the error function (\code{meanSquareError()}), 2. the cost
-function (\code{lsqError()}), and 3. the gradient
+1. the cost function (\code{lsqError()}), and 2. the gradient
 (\code{lsqGradient()}). Read chapter 8 ``Optimization and gradient
 descent'' in the script, in particular section 8.4 and exercise 8.4!
 
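For orientation, here is a minimal sketch of what the cost and gradient functions named in this hunk (\code{lsqError()} and \code{lsqGradient()}) could look like for fitting the straight line y = m*x + b with parameters p = (b, m). The course material provides its own versions (likely not in Python); this sketch uses Python/NumPy, and the signatures are assumptions rather than the script's actual interface.

import numpy as np

def lsq_error(p, x, y):
    # mean squared error of the straight line y = m*x + b, with p = (b, m)
    b, m = p
    return np.mean((m * x + b - y) ** 2)

def lsq_gradient(p, x, y, h=1e-6):
    # numerical gradient of the cost function via central differences
    p = np.asarray(p, dtype=float)
    grad = np.zeros(2)
    for i in range(2):
        step = np.zeros(2)
        step[i] = h
        grad[i] = (lsq_error(p + step, x, y) - lsq_error(p - step, x, y)) / (2 * h)
    return grad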
@@ -72,8 +71,9 @@
 function is as follows:
 
 \begin{enumerate}
-\item Start with some arbitrary parameter values $\vec p_0 = (m_0, b_0)$
-for the slope and the intercept of the straight line.
+\item Start with some arbitrary parameter values (intercept $b_0$
+and slope $m_0$, $\vec p_0 = (b_0, m_0)$ for the slope and the
+intercept of the straight line.
 \item \label{computegradient} Compute the gradient of the cost function
 at the current values of the parameters $\vec p_i$.
 \item If the magnitude (length) of the gradient is smaller than some
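Read together, the enumerated steps describe the usual descent loop: start at some arbitrary $\vec p_0$, compute the gradient at the current parameters, stop once its magnitude falls below some threshold, and otherwise take a small step against the gradient. A sketch under the same assumptions as the block above (Python/NumPy, reusing the lsq_gradient sketch; the variable names 'x' and 'y' inside lin_regression.mat, the learning rate, and the threshold are guesses):

import numpy as np
from scipy.io import loadmat

# load the data; the variable names 'x' and 'y' inside the file are an assumption
data = loadmat('lin_regression.mat')
x = data['x'].ravel()
y = data['y'].ravel()

p = np.array([0.0, 0.0])  # step 1: arbitrary start values (b0, m0)
eps = 0.01                # learning rate (assumed)
threshold = 1e-4          # stop once the gradient is shorter than this (assumed)

while True:
    gradient = lsq_gradient(p, x, y)          # step 2: gradient at the current parameters p_i
    if np.linalg.norm(gradient) < threshold:  # step 3: done if the gradient is small enough
        break
    p = p - eps * gradient                    # otherwise walk a small step downhill

print(f'intercept b = {p[0]:.3f}, slope m = {p[1]:.3f}')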