\documentclass[a4paper,12pt,pdftex]{exam}

\newcommand{\ptitle}{Mutual information}

\input{../header.tex}

\firstpagefooter{Supervisor: Jan Benda}{phone: 29 74573}%
{email: jan.benda@uni-tuebingen.de}

\begin{document}

\input{../instructions.tex}

%%%%%%%%%%%%%% Questions %%%%%%%%%%%%%%%%%%%%%%%%%
\begin{questions}
\question In each trial, a subject was briefly ($50$\,ms) presented
with one of two possible objects and had to report which of the two
objects was shown. In {\tt decisions.mat} you find an array that
stores, for each trial, which object was presented and which object
the subject reported.
\begin{parts}

\part Plot the data appropriately.
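
For loading and plotting the data, a minimal sketch in Python could
look like this (the variable name {\tt decisions} inside the file and
the one-row-per-trial layout are assumptions):
\begin{verbatim}
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import loadmat

# Load the trials.  The variable name 'decisions' and the layout
# (one row per trial: presented object, reported object) are
# assumptions about the contents of the file.
data = loadmat('decisions.mat')
decisions = np.asarray(data['decisions'])
presented = decisions[:, 0].astype(int)
reported = decisions[:, 1].astype(int)

# Jitter the two discrete values so overlapping trials stay visible.
jx = presented + 0.1 * np.random.randn(presented.size)
jy = reported + 0.1 * np.random.randn(reported.size)
plt.scatter(jx, jy, s=10, alpha=0.4)
plt.xlabel('presented object')
plt.ylabel('reported object')
plt.show()
\end{verbatim}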
\part Compute a 2-d histogram that counts how often each combination
of presented and reported object occurred.
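
A possible implementation of the 2-d histogram, assuming the two
objects are coded as $1$ and $2$:
\begin{verbatim}
import numpy as np

def joint_counts(presented, reported):
    """2-d histogram: counts[i, j] is the number of trials in
    which object i+1 was presented and object j+1 reported."""
    counts = np.zeros((2, 2))
    for x, y in zip(presented, reported):
        counts[int(x) - 1, int(y) - 1] += 1.0
    return counts
\end{verbatim}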
\part Normalize the histogram such that it sums to one, i.e., make it
a probability distribution $P(x,y)$, where $x$ is the presented
object and $y$ is the reported object. Also compute the marginal
distributions $P(x)$ and $P(y)$.
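
A sketch of the normalization and the marginal distributions:
\begin{verbatim}
import numpy as np

def joint_pdf(counts):
    """Normalize a 2-d histogram to a joint distribution P(x,y)
    and return it together with the marginals P(x) and P(y)."""
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1)   # marginal P(x): sum over reported y
    py = pxy.sum(axis=0)   # marginal P(y): sum over presented x
    return pxy, px, py
\end{verbatim}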
\part Use these probability distributions to compute the mutual
information
\[ I[x:y] = \sum_{x\in\{1,2\}}\sum_{y\in\{1,2\}} P(x,y)
\log_2\frac{P(x,y)}{P(x)P(y)} \]
that the subject's answers provide about the actually presented
object.

The mutual information is a measure from information theory that is
used in neuroscience to quantify, for example, how much information
a spike train carries about a sensory stimulus.
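
A direct translation of the formula into Python; it assumes that all
entries of $P(x,y)$ are positive (zeros are treated in the next
part):
\begin{verbatim}
import numpy as np

def mutual_information(pxy):
    """Mutual information in bits for a joint distribution pxy;
    assumes that all entries of pxy are larger than zero."""
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    return np.sum(pxy * np.log2(pxy / np.outer(px, py)))
\end{verbatim}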
\part What is the maximally achievable mutual information?

Show this numerically by generating your own datasets that by
construction should yield maximal information. Consider different
distributions $P(x)$.

Here you may run into a problem when computing the mutual
information: terms with $P(x,y) = 0$ are undefined. To treat this
special case, think about (and plot) the limit of $x \log x$ as $x$
approaches zero, and use this limit to fix the computation of the
mutual information, as sketched below.
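
A sketch of a zero-safe computation together with a dataset that by
construction yields maximal information:
\begin{verbatim}
import numpy as np

def mutual_information(pxy):
    """Zero-safe mutual information: terms with P(x,y) = 0 are
    skipped, because x*log2(x) approaches 0 as x approaches 0."""
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    nz = pxy > 0.0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

# A perfect observer always reports the presented object.  With a
# uniform P(x) the mutual information is maximal (1 bit for two
# objects); a skewed P(x) lowers it to the entropy of P(x).
n = 100000
presented = np.random.randint(1, 3, n)   # objects 1 and 2, uniform
reported = presented.copy()              # error-free reports
counts = np.zeros((2, 2))
for x, y in zip(presented, reported):
    counts[x - 1, y - 1] += 1.0
print(mutual_information(counts / n))    # prints a value close to 1
\end{verbatim}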
\part Use a permutation test to compute the $95\%$ confidence
interval for the mutual information estimate in the dataset from
{\tt decisions.mat}. Does the measured mutual information indicate
significant information transmission?
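
A sketch of the permutation test; it builds on {\tt presented},
{\tt reported}, and the zero-safe {\tt mutual\_information} from the
sketches above ({\tt mi\_from\_trials} is just a hypothetical
helper):
\begin{verbatim}
import numpy as np

def mi_from_trials(presented, reported):
    """Mutual information computed directly from the trial data."""
    counts = np.zeros((2, 2))
    for x, y in zip(presented, reported):
        counts[int(x) - 1, int(y) - 1] += 1.0
    return mutual_information(counts / counts.sum())

rng = np.random.default_rng()
measured = mi_from_trials(presented, reported)

# Shuffling the reports destroys any relation to the presented
# objects, so the shuffled datasets sample the null distribution.
null_mi = np.array([mi_from_trials(presented,
                                   rng.permutation(reported))
                    for _ in range(10000)])

lo, hi = np.percentile(null_mi, [2.5, 97.5])   # 95 % interval
print('significant' if measured > hi else 'not significant')
\end{verbatim}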
\end{parts}

\end{questions}

\end{document}