improved text for mutual information project

Jan Benda 2018-02-01 09:37:59 +01:00
parent 40dbc3cfb9
commit 8f545f997c

@@ -20,25 +20,33 @@
 \begin{parts}
   \part Plot the data appropriately.
   \part Compute a 2-d histogram that shows how often different
     combinations of reported and presented came up.
   \part Normalize the histogram such that it sums to one (i.e. make
     it a probability distribution $P(x,y)$ where $x$ is the presented
     object and $y$ is the reported object). Compute the probability
     distributions $P(x)$ and $P(y)$ in the same way.
   \part Use that probability distribution to compute the mutual
-    information $$I[x:y] = \sum_{x\in\{1,2\}}\sum_{y\in\{1,2\}} P(x,y)
-    \log_2\frac{P(x,y)}{P(x)P(y)}$$ that the answers provide about the
-    actually presented object.
+    information
+    \[ I[x:y] = \sum_{x\in\{1,2\}}\sum_{y\in\{1,2\}} P(x,y)
+    \log_2\frac{P(x,y)}{P(x)P(y)}\]
+    that the answers provide about the actually presented object.
     The mutual information is a measure from information theory that is
     used in neuroscience to quantify, for example, how much information
     a spike train carries about a sensory stimulus.
   \part What is the maximally achievable mutual information (try to
     find out by generating your own dataset which naturally should
     yield maximal information)?
-  \part Use bootstrapping to compute the $95\%$ confidence interval
-    for the mutual information estimate in that dataset.
+  \part Use bootstrapping (permutation test) to compute the $95\%$
+    confidence interval for the mutual information estimate in the
+    dataset from {\tt decisions.mat}.
 \end{parts}
 \end{questions}
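
The exercise text itself contains no solution code. A minimal Python sketch of parts (b) through (d) could look as follows, assuming that decisions.mat holds two vectors, presented and reported, coded as 1 or 2; these variable names are assumptions and may differ from the actual file contents.

import numpy as np
from scipy.io import loadmat

def mutual_information(x, y):
    # 2-d histogram of all combinations of presented (x) and reported (y) objects:
    counts, _, _ = np.histogram2d(x, y, bins=[[0.5, 1.5, 2.5], [0.5, 1.5, 2.5]])
    # normalize to a joint probability distribution P(x,y) and compute the marginals:
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1, keepdims=True)   # P(x), presented object
    py = pxy.sum(axis=0, keepdims=True)   # P(y), reported object
    # I[x:y] = sum_x sum_y P(x,y) * log2( P(x,y) / (P(x)P(y)) ), skipping empty cells:
    nz = pxy > 0.0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

# variable names inside decisions.mat are assumptions, adjust to the real file:
data = loadmat('decisions.mat')
presented = np.asarray(data['presented'], dtype=float).ravel()
reported = np.asarray(data['reported'], dtype=float).ravel()
print(f'I[x:y] = {mutual_information(presented, reported):.3f} bits')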
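
For the maximally achievable mutual information, a synthetic dataset in which every report matches the presented object can be fed through the same estimator; with two equally likely objects the result should approach 1 bit.

import numpy as np

# reuses mutual_information() from the sketch above
rng = np.random.default_rng(42)
presented = rng.integers(1, 3, 10000)   # objects 1 or 2, equally likely
reported = presented.copy()             # perfect performance
print(mutual_information(presented, reported))   # close to the maximum of 1 bit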
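
The last part asks for a 95% confidence interval of the mutual-information estimate. The sketch below implements this as a plain bootstrap that resamples trials with replacement; a permutation test (shuffling the reports relative to the presented objects) would instead give the distribution expected for zero mutual information.

import numpy as np

# reuses presented, reported, and mutual_information() from the first sketch
rng = np.random.default_rng()
n = len(presented)
mis = np.empty(1000)
for k in range(1000):
    idx = rng.integers(0, n, n)          # resample trials with replacement
    mis[k] = mutual_information(presented[idx], reported[idx])
low, high = np.percentile(mis, [2.5, 97.5])
print(f'95% confidence interval: {low:.3f} to {high:.3f} bits')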