[projects] updated mutual information and noisy ficurves
This commit is contained in:
parent a385cab925
commit 46830d54be
@@ -11,6 +11,49 @@
%%%%%%%%%%%%%% Questions %%%%%%%%%%%%%%%%%%%%%%%%%

The mutual information is a measure from information theory that is
used in neuroscience to quantify, for example, how much information a
spike train carries about a sensory stimulus. It quantifies the
dependence of an output $y$ (e.g. a spike train) on some input $x$
(e.g. a sensory stimulus).

The probability of each of $n$ input values $x \in \{x_1, x_2, \ldots, x_n\}$
is given by the corresponding probability distribution $P(x)$. The entropy
\begin{equation}
  \label{entropy}
  H[x] = - \sum_{x} P(x) \log_2 P(x)
\end{equation}
is a measure of the average surprise, or uncertainty, about the value
of $x$. For example, if from two possible values '1' and '2' the
probability of getting a '1' is close to one ($P(1) \approx 1$), then
the probability of getting a '2' is close to zero ($P(2) \approx 0$).
In this case the entropy, the average surprise, is almost zero,
because both $0 \log 0 = 0$ (in the limit) and $1 \log 1 = 0$. It is
not surprising at all that you almost always get a '1'. The entropy
is largest for equally likely outcomes of $x$. If getting a '1' or a
'2' is equally likely, then you will be most surprised by each new
number you get, because you cannot predict it.

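As a quick illustration of eq.~\eqref{entropy} (a minimal sketch, not
part of the project files; the variable names are made up), the
entropy of a discrete probability vector can be computed like this:
\begin{lstlisting}
p = [0.5, 0.5];            % probability distribution P(x)
nz = p(p > 0);             % drop zero entries, since x*log2(x) goes to 0
H = -sum(nz .* log2(nz));  % entropy in bits; 1 bit for two equally likely values
\end{lstlisting}
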
Mutual information measures the information transmitted between an
input and an output. It is computed from the probability
distributions of the input, $P(x)$, of the output, $P(y)$, and from
their joint distribution $P(x,y)$:
\begin{equation}
  \label{mi}
  I[x:y] = \sum_{x}\sum_{y} P(x,y) \log_2\frac{P(x,y)}{P(x)P(y)}
\end{equation}
where the sums go over all possible values of $x$ and $y$. The mutual
information can also be expressed in terms of entropies: it is the
entropy of the outputs $y$ reduced by the entropy of the outputs
given the input,
\begin{equation}
  \label{mientropy}
  I[x:y] = H[y] - H[y|x]
\end{equation}

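To make eq.~\eqref{mi} concrete, here is a minimal sketch (not part
of the project text) of how the mutual information could be evaluated
in MATLAB for a hypothetical $2\times 2$ joint distribution
\texttt{pxy}:
\begin{lstlisting}
pxy = [0.4, 0.1; 0.1, 0.4];   % hypothetical joint distribution P(x,y)
px = sum(pxy, 2);             % marginal P(x): sum over y (columns)
py = sum(pxy, 1);             % marginal P(y): sum over x (rows)
pind = px * py;               % P(x)*P(y) for all combinations
nz = pxy > 0;                 % skip zero entries (their contribution is zero)
mi = sum(pxy(nz) .* log2(pxy(nz) ./ pind(nz)));
\end{lstlisting}
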
The following project is meant to explore the concept of mutual
information with the help of a simple example.

\begin{questions}
\question A subject was presented two possible objects for a very
brief time ($50$\,ms). The task of the subject was to report which of
@@ -19,50 +62,56 @@
object was reported by the subject.

\begin{parts}
\part Plot the raw data (no sums or probabilities) appropriately.

\part Compute and plot the probability distributions of presented
and reported objects.

\part Compute a 2-d histogram that shows how often different
combinations of reported and presented objects came up.

\part Normalize the histogram such that it sums to one (i.e. make
it a probability distribution $P(x,y)$ where $x$ is the presented
object and $y$ is the reported object).

\part Use the computed probability distributions to compute the
mutual information \eqref{mi} that the answers provide about the
actually presented object.

\part Use a permutation test to compute the $95\%$ confidence
interval for the mutual information estimate in the dataset from
{\tt decisions.mat}. Does the measured mutual information indicate
significant information transmission? (One way to set up the
shuffling is sketched right after this list of parts.)

\end{parts}

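A minimal sketch of the permutation idea for the last part, assuming
hypothetical label vectors \texttt{presented} and \texttt{reported}
from the dataset and a hypothetical helper \texttt{mutualinfo()}
implementing eq.~\eqref{mi}:
\begin{lstlisting}
% presented, reported: hypothetical label vectors from the dataset
% mutualinfo(x, y):    hypothetical helper returning the mutual information
nperm = 1000;
misurro = zeros(nperm, 1);
for k = 1:nperm
    shuffled = reported(randperm(length(reported)));  % destroy any correlation
    misurro(k) = mutualinfo(presented, shuffled);
end
misorted = sort(misurro);
bound = misorted(round(0.95 * nperm));   % 95% bound of the shuffled values
\end{lstlisting}
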
\question What is the maximally achievable mutual information?

\begin{parts}
\part Show this numerically by generating your own datasets which
naturally should yield maximal information (one possible dataset is
sketched below). Consider different distributions of $P(x)$.

\part Compare the maximal mutual information with the corresponding
entropy \eqref{entropy}.
\end{parts}

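A sketch of such a self-generated dataset (names hypothetical): if
the report always equals the presented object and both objects are
equally likely, the mutual information should be maximal and equal to
the entropy of $P(x)$, i.e. 1 bit:
\begin{lstlisting}
n = 1000;
presented = randi(2, n, 1);   % presented objects 1 and 2, equally likely
reported = presented;         % error-free report: y always equals x
% running the mutual information computation on this dataset
% should give 1 bit, the entropy of P(x) for equal probabilities
\end{lstlisting}
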
\question What is the minimum possible mutual information?

This is the mutual information obtained when the output is
independent of the input.

How is the joint distribution $P(x,y)$ related to the marginals
$P(x)$ and $P(y)$ if $x$ and $y$ are independent? What is the value
of the logarithm in eqn.~\eqref{mi} in this case? So what is the
resulting value for the mutual information?

\end{questions}

Hint: You may encounter a problem when computing the mutual
information whenever $P(x,y)$ equals zero. For treating this special
case think about (plot it) what the limit of $x \log x$ is for $x$
approaching zero. Use this information to fix the computation of the
mutual information.

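A quick way to look at this limit (a throwaway snippet, not part of
the project files):
\begin{lstlisting}
x = linspace(1e-6, 1.0, 1000);
plot(x, x .* log2(x));   % the curve approaches 0 as x approaches 0
\end{lstlisting}
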
\end{document}

@@ -9,49 +9,50 @@
\input{../instructions.tex}

You are recording the activity of neurons that differ in the strength
of their intrinsic noise in response to constant stimuli of intensity
$I$ (think of that, for example, as a current $I$ injected via a
patch-electrode into the neuron).

We first characterize the neurons by their tuning curves (also called
intensity-response curves). That is, what is the mean firing rate of
the neuron's response as a function of the constant input current $I$?

In the second part we demonstrate how intrinsic noise can be useful
for encoding stimuli, using the example of the so-called
``subthreshold stochastic resonance''.

The neuron is implemented in the file \texttt{lifspikes.m}. Call it
with the following parameters:\\[-7ex]
\begin{lstlisting}
trials = 10;
tmax = 50.0;
current = 10.0; % the constant input current I
Dnoise = 1.0; % noise strength
spikes = lifspikes(trials, current, tmax, Dnoise);
\end{lstlisting}
The returned \texttt{spikes} is a cell array with \texttt{trials}
elements, each being a vector of spike times (in seconds) computed
for a duration of \texttt{tmax} seconds. The input current is set
via the \texttt{current} variable, the strength of the intrinsic
noise via \texttt{Dnoise}. If \texttt{current} is a single number,
then an input current of that intensity is simulated for
\texttt{tmax} seconds. Alternatively, \texttt{current} can be a
vector containing an input current that changes in time. In this
case, \texttt{tmax} is ignored, and you have to provide a value for
the input current every 0.0001\,seconds.

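For example, the mean firing rate across all trials could be
estimated from the returned cell array like this (a minimal sketch,
not part of the project files):
\begin{lstlisting}
nspikes = 0;
for k = 1:trials
    nspikes = nspikes + length(spikes{k});  % spike count of trial k
end
rate = nspikes / (trials * tmax);           % mean firing rate in Hertz
\end{lstlisting}
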
Think of calling the \texttt{lifspikes()} function as a simple way
of doing an electrophysiological experiment. You are presenting a
stimulus with a constant intensity $I$ that you set. The neuron
responds to this stimulus, and you record this response. After
detecting the timepoints of the spikes in your recordings you get
what the \texttt{lifspikes()} function returns. In addition you
can record from different neurons with different noise properties
by setting the \texttt{Dnoise} parameter to different values.

|
||||||
|
\question Tuning curves
|
||||||
\begin{parts}
|
\begin{parts}
|
||||||
\part First set the noise \texttt{Dnoise=0} (no noise). Compute
|
\part First set the noise \texttt{Dnoise=0} (no noise). Compute
|
||||||
and plot the neuron's $f$-$I$ curve, i.e. the mean firing rate
|
and plot the neuron's $f$-$I$ curve, i.e. the mean firing rate
|
||||||
@@ -64,37 +65,43 @@ spikes = lifspikes(trials, current, tmax, Dnoise);
\part Compute the $f$-$I$ curves of neurons with various noise
strengths \texttt{Dnoise}. Use for example $D_{noise} = 10^{-3}$,
$10^{-2}$, and $10^{-1}$. Depending on the resulting curves you
might want to try additional noise levels. (One way to set up the
measurement loop is sketched after this list of parts.)

How does the intrinsic noise level influence the tuning curves?

What are possible sources of this intrinsic noise?

\part Show spike raster plots and interspike interval histograms
of the responses for some interesting values of the input and the
noise strength. For example, you might want to compare the
responses of the different neurons to the same input, or responses
that result in the same mean firing rate.

How do the responses differ?

\end{parts}

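A possible skeleton for measuring an $f$-$I$ curve (a sketch under
the assumption that \texttt{lifspikes()} behaves as described above;
the tested current range is an arbitrary choice):
\begin{lstlisting}
currents = 0:1:20;             % range of constant input currents to test
rates = zeros(size(currents));
trials = 10; tmax = 50.0;
for i = 1:length(currents)
    spikes = lifspikes(trials, currents(i), tmax, 0.0);  % Dnoise = 0
    nspikes = 0;
    for k = 1:trials
        nspikes = nspikes + length(spikes{k});
    end
    rates(i) = nspikes / (trials * tmax);  % mean firing rate in Hertz
end
plot(currents, rates);
xlabel('input current I');
ylabel('firing rate [Hz]');
\end{lstlisting}
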
\question Subthreshold stochastic resonance

Let's now use as an input to the neuron a 1\,s long sine wave $I(t)
= I_0 + A \sin(2\pi f t)$ with offset current $I_0$, amplitude $A$,
and frequency $f$. Set $I_0=5$, $A=4$, and $f=5$\,Hz.

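Since \texttt{current} may be a vector with one value per
0.0001\,s time step, the stimulus can be constructed like this (a
minimal sketch):
\begin{lstlisting}
dt = 0.0001;                        % time step expected by lifspikes()
t = 0:dt:1.0;                       % 1 s of time
I0 = 5.0; A = 4.0; f = 5.0;         % offset, amplitude, frequency in Hz
current = I0 + A * sin(2*pi*f*t);   % time-dependent input current
spikes = lifspikes(10, current, 1.0, 0.0);  % tmax is ignored for vector input
\end{lstlisting}
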
\begin{parts}
\part Do you get a response of the noiseless ($D_{noise}=0$) neuron?

\part What happens if you increase the noise strength?

\part What happens at really large noise strengths?

\part Generate some example plots that illustrate your findings.

\part Explain the encoding of the sine wave based on your findings
regarding the $f$-$I$ curves.

\part Why is this phenomenon called ``subthreshold stochastic resonance''?
\end{parts}

\end{questions}