object was reported by the subject.

\begin{parts}

\part Plot the data appropriately.

\part Compute a 2-d histogram that shows how often different
combinations of reported and presented came up.
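A minimal sketch of this step in Python/NumPy (the exercise data would come from {\tt decisions.mat}; the variable names \texttt{presented}/\texttt{reported} and the simulated 80\,\% correct-report rate here are illustrative assumptions, not part of the exercise):

```python
import numpy as np

# Hypothetical stand-in for the decisions.mat data: object labels 1 or 2.
rng = np.random.default_rng(1)
n = 200
presented = rng.integers(1, 3, size=n)
# Assume the subject reports the correct object on 80% of trials:
correct = rng.random(n) < 0.8
reported = np.where(correct, presented, 3 - presented)

# 2-d histogram: counts for each (presented, reported) combination.
# Bin edges 0.5/1.5/2.5 put label 1 in the first bin and label 2 in the second.
counts, xedges, yedges = np.histogram2d(presented, reported,
                                        bins=[[0.5, 1.5, 2.5]] * 2)
print(counts)        # 2x2 matrix of counts
print(counts.sum())  # total number of trials
```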

\part Normalize the histogram such that it sums to one (i.e., make
it a probability distribution $P(x,y)$, where $x$ is the presented
object and $y$ is the reported object). Compute the marginal
distributions $P(x)$ and $P(y)$ in the same way.

\part Use that probability distribution to compute the mutual
information
\[ I[x:y] = \sum_{x\in\{1,2\}}\sum_{y\in\{1,2\}} P(x,y)
\log_2\frac{P(x,y)}{P(x)P(y)} \]
that the answers provide about the actually presented object.
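The normalization and the sum above can be sketched as follows (a Python/NumPy illustration, assuming a $2\times2$ count matrix like the one from the histogram part; the function name is hypothetical). Terms with $P(x,y)=0$ are skipped, since $p\log p \to 0$ as $p \to 0$:

```python
import numpy as np

def mutual_information(counts):
    """Mutual information in bits, computed from a 2-d count histogram."""
    pxy = counts / counts.sum()          # joint distribution P(x, y)
    px = pxy.sum(axis=1, keepdims=True)  # marginal P(x), column vector
    py = pxy.sum(axis=0, keepdims=True)  # marginal P(y), row vector
    nz = pxy > 0                         # skip log2(0) terms (they contribute 0)
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px * py)[nz]))

# Independent presented/reported counts carry no information:
print(mutual_information(np.array([[25., 25.], [25., 25.]])))  # -> 0.0
# Perfectly correlated counts: one full bit.
print(mutual_information(np.array([[50., 0.], [0., 50.]])))    # -> 1.0
```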

The mutual information is a measure from information theory that is
used in neuroscience to quantify, for example, how much information
a spike train carries about a sensory stimulus.

\part What is the maximally achievable mutual information? (Try to
find out by generating your own dataset, which naturally should
yield maximal information.)
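One way to generate such a dataset is to let the report always equal the presented object, with both objects equally likely (a Python/NumPy sketch under those assumptions; with two equiprobable objects the mutual information is bounded by $\log_2 2 = 1$ bit):

```python
import numpy as np

# Perfect reports: reported always equals presented, both objects equally likely.
presented = np.repeat([1, 2], 100)
reported = presented.copy()

counts, _, _ = np.histogram2d(presented, reported, bins=[[0.5, 1.5, 2.5]] * 2)
pxy = counts / counts.sum()
px = pxy.sum(axis=1, keepdims=True)
py = pxy.sum(axis=0, keepdims=True)
nz = pxy > 0
mi = np.sum(pxy[nz] * np.log2(pxy[nz] / (px * py)[nz]))
print(mi)  # -> 1.0 bit, i.e. log2 of the number of equally likely objects
```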

\part Use bootstrapping to compute the $95\%$ confidence interval
for the mutual information estimate in that dataset.
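The bootstrap idea — resample trials with replacement and recompute the statistic each time — could be sketched like this (Python/NumPy; the helper name and the perfect-report dataset are illustrative assumptions):

```python
import numpy as np

def mi_from_pairs(presented, reported):
    """Mutual information in bits from paired label vectors (labels 1 or 2)."""
    counts, _, _ = np.histogram2d(presented, reported, bins=[[0.5, 1.5, 2.5]] * 2)
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px * py)[nz]))

rng = np.random.default_rng(2)
n = 200
presented = np.repeat([1, 2], n // 2)
reported = presented.copy()  # the "maximal information" dataset from above

# Bootstrap: draw n trials with replacement, recompute MI, repeat.
nboot = 1000
boot = np.empty(nboot)
for i in range(nboot):
    idx = rng.integers(0, n, size=n)
    boot[i] = mi_from_pairs(presented[idx], reported[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(lo, hi)  # 95% confidence interval of the MI estimate
```

Note that even for this perfect dataset the resampled MI falls slightly below 1 bit whenever a resample has unequal numbers of the two objects, since the MI is then limited by the entropy of the resampled marginal.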

\part Use bootstrapping (permutation test) to compute the $95\%$
confidence interval for the mutual information estimate in the
dataset from {\tt decisions.mat}.
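For the permutation test, shuffling the reported labels relative to the presented ones destroys any association between the two, so the shuffled MIs estimate the interval expected by chance. A sketch (Python/NumPy; the simulated 80\,\% correct data stand in for {\tt decisions.mat} and are an assumption):

```python
import numpy as np

def mi_from_pairs(presented, reported):
    """Mutual information in bits from paired label vectors (labels 1 or 2)."""
    counts, _, _ = np.histogram2d(presented, reported, bins=[[0.5, 1.5, 2.5]] * 2)
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px * py)[nz]))

rng = np.random.default_rng(3)
n = 200
presented = rng.integers(1, 3, size=n)
correct = rng.random(n) < 0.8                       # assumed 80% correct
reported = np.where(correct, presented, 3 - presented)

# Permutation test: shuffle the reports, recompute MI, repeat.
nperm = 1000
null = np.empty(nperm)
for i in range(nperm):
    null[i] = mi_from_pairs(presented, rng.permutation(reported))

lo, hi = np.percentile(null, [2.5, 97.5])
observed = mi_from_pairs(presented, reported)
print(lo, hi, observed)  # observed MI well above the chance interval
```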

\end{parts}

\end{questions}