\documentclass[a4paper,12pt,pdftex]{exam}

\newcommand{\ptitle}{Mutual information}
\input{../header.tex}
\firstpagefooter{Supervisor: Jan Benda}{phone: 29 74573}%
{email: jan.benda@uni-tuebingen.de}

\begin{document}

\input{../instructions.tex}

%%%%%%%%%%%%%% Questions %%%%%%%%%%%%%%%%%%%%%%%%%
\begin{questions}

\question A subject was presented one of two possible objects for a
very brief time ($50$\,ms). The task of the subject was to report
which of the two objects was shown. In {\tt decisions.mat} you find
an array that stores which object was presented in each trial and
which object was reported by the subject.

\begin{parts}
\part Plot the data appropriately.
\part Compute a 2-d histogram that shows how often each
combination of presented and reported object came up (see the
first sketch below).
\part Normalize the histogram such that it sums to one (i.e.\ make
it a probability distribution $P(x,y)$, where $x$ is the presented
object and $y$ is the reported object). Compute the marginal
probability distributions $P(x)$ and $P(y)$ in the same way
(equivalently, by summing $P(x,y)$ over the other variable; also
covered by the first sketch below).
\part Use these probability distributions to compute the mutual
information $$I[x:y] = \sum_{x\in\{1,2\}}\sum_{y\in\{1,2\}} P(x,y)
\log_2\frac{P(x,y)}{P(x)P(y)}$$ that the subject's answers provide
about the actually presented object (see the second sketch below).

The mutual information is a measure from information theory that is
used in neuroscience to quantify, for example, how much information
a spike train carries about a sensory stimulus.
\part What is the maximally achievable mutual information? Try to
find out by generating your own dataset that, by construction,
should yield maximal information (see the third sketch below).
\part Use bootstrapping to compute the $95\%$ confidence interval
of the mutual information estimate for that dataset (see the last
sketch below).
\end{parts}
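
Hint: the following is a minimal sketch in Python (the data file is
in MATLAB format; the array name {\tt decisions} and its $n\times 2$
layout, presented object in the first column and reported object in
the second, are assumptions):
\begin{verbatim}
import numpy as np
from scipy.io import loadmat

data = loadmat('decisions.mat')
decisions = data['decisions']    # assumed n x 2 array
presented = decisions[:, 0].astype(int)
reported = decisions[:, 1].astype(int)

# 2-d histogram of presented/reported combinations
# (objects assumed to be labeled 1 and 2):
counts = np.zeros((2, 2))
for x, y in zip(presented, reported):
    counts[x - 1, y - 1] += 1

pxy = counts / counts.sum()  # joint distribution P(x,y)
px = pxy.sum(axis=1)         # P(x): sum over reported y
py = pxy.sum(axis=0)         # P(y): sum over presented x
\end{verbatim}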
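
Continuing the sketch, the mutual information is a direct
translation of the formula above, skipping empty cells since
$0\log 0 = 0$ by convention:
\begin{verbatim}
def mutual_information(pxy):
    """Mutual information in bits of a joint table P(x,y)."""
    px = pxy.sum(axis=1)     # P(x): sum over reported y
    py = pxy.sum(axis=0)     # P(y): sum over presented x
    mi = 0.0
    for i in range(pxy.shape[0]):
        for j in range(pxy.shape[1]):
            if pxy[i, j] > 0.0:  # convention: 0*log(0) = 0
                mi += pxy[i, j] * np.log2(pxy[i, j]
                                          / (px[i] * py[j]))
    return mi
\end{verbatim}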
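
For the maximal mutual information, a dataset in which the reported
object always matches the presented one and both objects are equally
likely should, by construction, yield the maximum; with two objects
this approaches one bit. A sketch, reusing
{\tt mutual\_information()} from above:
\begin{verbatim}
ntrials = 10000
presented = np.random.randint(1, 3, ntrials)  # objects 1 and 2
reported = presented.copy()                   # perfect reports
counts = np.zeros((2, 2))
for x, y in zip(presented, reported):
    counts[x - 1, y - 1] += 1
print(mutual_information(counts / counts.sum()))  # ~1 bit
\end{verbatim}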
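
For the bootstrap, resample trials with replacement, recompute the
mutual information for each resampled dataset, and take percentiles
of the resulting distribution:
\begin{verbatim}
nresamples = 1000
ntrials = len(presented)
mis = np.zeros(nresamples)
for k in range(nresamples):
    idx = np.random.randint(0, ntrials, ntrials)  # resample
    counts = np.zeros((2, 2))
    for x, y in zip(presented[idx], reported[idx]):
        counts[x - 1, y - 1] += 1
    mis[k] = mutual_information(counts / counts.sum())

ci = np.percentile(mis, [2.5, 97.5])  # 95% confidence interval
\end{verbatim}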
\end{questions}
\end{document}