improved projects

Jan Benda 2018-02-06 10:40:16 +01:00
parent 68622b3f9e
commit db1133352e
7 changed files with 86 additions and 51 deletions

View File

@@ -28,6 +28,8 @@
\subsection{Polar plot} \subsection{Polar plot}
\subsection{print instead of saveas????}
\subsection{Movies and animations} \subsection{Movies and animations}
\section{TODO} \section{TODO}

View File

@@ -7,7 +7,7 @@
%%%%% layout %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%% layout %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\usepackage[left=20mm,right=20mm,top=25mm,bottom=25mm]{geometry} \usepackage[left=20mm,right=20mm,top=25mm,bottom=25mm]{geometry}
\pagestyle{headandfoot} \pagestyle{headandfoot}
\header{{\bfseries\large Scientific Computing}}{{\bfseries\large Project: \ptitle}}{{\bfseries\large Januar 18th, 2018}} \header{\textbf{\large Scientific Computing Project: \ptitle}}{}{\textbf{\large January 18th, 2018}}
\runningfooter{}{\thepage}{} \runningfooter{}{\thepage}{}
\setlength{\baselineskip}{15pt} \setlength{\baselineskip}{15pt}
@@ -15,6 +15,8 @@
\setlength{\parskip}{0.3cm} \setlength{\parskip}{0.3cm}
\renewcommand{\baselinestretch}{1.15} \renewcommand{\baselinestretch}{1.15}
\setcounter{secnumdepth}{-1}
%%%%% listings %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% %%%%% listings %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\usepackage{listings} \usepackage{listings}
\lstset{ \lstset{

View File

@@ -1,31 +1,31 @@
\setlength{\fboxsep}{2ex} \setlength{\fboxsep}{2ex}
\fbox{\parbox{1\linewidth}{\small \fbox{\parbox{0.95\linewidth}{\small
{\bf Evaluation criteria:} \textbf{Evaluation criteria:}
Each project has three elements that are graded: (i) the code, Each project has three elements that are graded: (i) the code,
(ii) the slides/figures, and (iii) the presentation. (ii) the quality of the figures, and (iii) the presentation (see below).
\vspace{1ex} \vspace{1ex}
{\bf Dates:} \textbf{Dates:}
The {\bf code} and the {\bf presentation} should be uploaded to The code and the presentation should be uploaded to
ILIAS by Sunday, February 4th, 23:59h, at the latest. We will ILIAS by Sunday, February 4th, 23:59h, at the latest. We will
store all presentations on one computer to allow fast store all presentations on one computer to allow fast
transitions between talks. The presentations start on Monday, transitions between talks. The presentations start on Monday,
February 5th at 9:15h. February 5th at 9:15h.
\vspace{1ex} \vspace{1ex}
{\bf Files:} \textbf{Files:}
Please hand in your presentation as a pdf file. Bundle Please hand in your presentation as a pdf file. Bundle
everything (the pdf, the code, and the data) into a {\em single} everything (the pdf, the code, and the data) into a {\em single}
zip-file. zip-file.
\vspace{1ex} \vspace{1ex}
{\bf Code:} \textbf{Code:}
The {\bf code} should be executable without any further The code should be executable without any further
adjustments from our side. A single \texttt{main.m} script adjustments from our side. A single \texttt{main.m} script
should coordinate the analysis by calling functions and should coordinate the analysis by calling functions and
sub-scripts and should produce the {\em same} figures sub-scripts and should produce the {\em same} figures
@@ -43,17 +43,15 @@
\vspace{1ex} \vspace{1ex}
{\bf Presentation:} \textbf{Presentation:}
The {\bf presentation} should be {\em at most} 10min long and be The presentation should be {\em at most} 10min long and be held
held in English. In the presentation you should (i) briefly in English. In the presentation you should present figures
describe the problem, (ii) present figures introducing, showing, introducing, explaining, showing, and discussing your data,
and discussing your results, and (iii) explain how you solved methods, and results. All data-related figures you show in the
the problem algorithmically (don't show your entire code). All presentation should be produced by your program --- no editing
data-related figures you show in the presentation should be or labeling by PowerPoint or other software. It is always a good
produced by your program --- no editing or labeling by idea to illustrate the problem with basic plots of the
PowerPoint or other software. It is always a good idea to raw-data. Make sure the axis labels are large enough!
illustrate the problem with basic plots of the raw-data. Make
sure the axis labels are large enough!
}} }}

View File

@@ -11,10 +11,10 @@
%%%%%%%%%%%%%% Questions %%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%% Questions %%%%%%%%%%%%%%%%%%%%%%%%%
\section*{Estimating the adaptation time-constant.} \section{Estimating the adaptation time-constant}
Stimulating a neuron with a constant stimulus for an extended period of time Stimulating a neuron with a constant stimulus for an extended period of time
often leads to a strong initial response that relaxes over time. This often leads to a strong initial response that relaxes over time. This
process is called adaptation and is ubiquitous. Your task here is to process is called adaptation. Your task here is to
estimate the time-constant of the firing-rate adaptation in P-unit estimate the time-constant of the firing-rate adaptation in P-unit
electroreceptors of the weakly electric fish \textit{Apteronotus electroreceptors of the weakly electric fish \textit{Apteronotus
leptorhynchus}. leptorhynchus}.
@@ -26,27 +26,41 @@ electroreceptors of the weakly electric fish \textit{Apteronotus
in the file. The contrast of the stimulus is a measure relative to in the file. The contrast of the stimulus is a measure relative to
the amplitude of the fish's field and has no unit. The data is sampled the amplitude of the fish's field and has no unit. The data is sampled
with 20\,kHz sampling frequency and spike times are given in with 20\,kHz sampling frequency and spike times are given in
milliseconds relative to the stimulus onset. milliseconds (not seconds!) relative to the stimulus onset.
\begin{parts} \begin{parts}
\part Estimate for each stimulus intensity the PSTH and plot \part Estimate for each stimulus intensity the PSTH. You will see
it. You will see that there are three parts. (i) The first that there are three parts: (i) The first 200\,ms is the baseline
200\,ms is the baseline (no stimulus) activity. (ii) During the (no stimulus) activity. (ii) During the next 1000\,ms the stimulus
next 1000\,ms the stimulus was switched on. (iii) After stimulus was switched on. (iii) After stimulus offset the neuronal activity
offset the neuronal activity was recorded for a further 825\,ms. was recorded for a further 825\,ms. Find an appropriate bin-width
for the PSTH.
\part Estimate the adaptation time-constant for both the stimulus \part Estimate the adaptation time-constant for both the stimulus
on- and offset. To do this fit an exponential function to the on- and offset. To do this, fit an exponential function
data. For the decay use: $f_{A,\tau,y_0}(t)$ to appropriate regions of the data:
\begin{equation}
f_{A,\tau,y_0}(t) = y_0 + A \cdot e^{-\frac{t}{\tau}},
\end{equation}
where $y_0$ the offset, $A$ the amplitude, $t$ the time, $\tau$
the time-constant.
For the increasing phases use an exponential of the form:
\begin{equation} \begin{equation}
f_{A,\tau, y_0}(t) = y_0 + A \cdot \left(1 - e^{-\frac{t}{\tau}}\right ), f_{A,\tau,y_0}(t) = A \cdot e^{-\frac{t}{\tau}} + y_0,
\end{equation} \end{equation}
\part Plot the best fits into the data. where $t$ is time, $A$ the (positive or negative) amplitude of the
\part Plot the estimated time-constants as a function of stimulus intensity. exponential decay, $\tau$ the adaptation time-constant, and $y_0$
an offset.
Before you do the fitting, familiarize yourself with the three
parameters of the exponential function. What is the value of
$f_{A,\tau,y_0}(t)$ at $t=0$? What is the value for large times? How does
$f_{A,\tau,y_0}(t)$ change if you change any of the parameters?
Which of the parameters could you estimate directly from the data
(without fitting)?
How could you get good estimates for the other parameters?
Do the fit and show the resulting exponential function together
with the data.
\part Do the estimated time-constants depend on stimulus intensity?
Use an appropriate statistical test to support your observation.
\end{parts} \end{parts}
\end{questions} \end{questions}
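A minimal sketch of how the PSTH and the exponential fit could be set up in MATLAB is given below. The variable names (\texttt{spikes} as a cell array of spike-time vectors in milliseconds), the bin width, and the time axis (assumed here to run from $-200$\,ms to 1825\,ms relative to stimulus onset) are assumptions for illustration, not part of the project files; \texttt{lsqcurvefit} requires the Optimization Toolbox.

\begin{lstlisting}
% Sketch: PSTH from trials of spike times (in ms) and an exponential fit
% to the onset response. Assumes spikes{k} holds the spike times of trial k
% for one stimulus intensity.
binwidth = 10.0;                                 % ms, assumed bin width
edges = -200:binwidth:1825;                      % adjust to the actual time axis
counts = zeros(1, length(edges) - 1);
for k = 1:length(spikes)
    counts = counts + histcounts(spikes{k}, edges);
end
time = edges(1:end-1) + binwidth/2;              % bin centers in ms
rate = 1000.0 * counts / (length(spikes) * binwidth);   % firing rate in Hz

% Exponential fit f(t) = A*exp(-t/tau) + y0 to the stimulus-on phase:
fexp = @(p, t) p(1) .* exp(-t ./ p(2)) + p(3);   % p = [A, tau, y0]
onset = time >= 0 & time < 1000;                 % region after stimulus onset
y0 = mean(rate(onset & time > 800));             % steady-state estimate
A = max(rate(onset)) - y0;                       % amplitude estimate
p0 = [A, 50.0, y0];                              % start value for tau: 50 ms (guess)
pest = lsqcurvefit(fexp, p0, time(onset), rate(onset));
plot(time, rate, 'b', time(onset), fexp(pest, time(onset)), 'r', 'LineWidth', 2);
xlabel('Time [ms]'); ylabel('Firing rate [Hz]');
\end{lstlisting}

The off-response can be fitted the same way on the region after stimulus offset, with a negative amplitude $A$.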

View File

@@ -11,7 +11,7 @@
%%%%%%%%%%%%%% Questions %%%%%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%% Questions %%%%%%%%%%%%%%%%%%%%%%%%%
\section*{Quantifying the responsiveness of a neuron using the F-I curve.} \section{Quantifying the responsiveness of a neuron by its F-I curve}
The responsiveness of a neuron is often quantified using an F-I The responsiveness of a neuron is often quantified using an F-I
curve. The F-I curve plots the \textbf{F}iring rate of the neuron as a curve. The F-I curve plots the \textbf{F}iring rate of the neuron as a
function of the stimulus \textbf{I}ntensity. function of the stimulus \textbf{I}ntensity.
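As a rough sketch of the idea (the variable names and data layout are assumptions, since the data format of this project is not shown in this excerpt): compute the firing rate of every trial and average it for each stimulus intensity.

\begin{lstlisting}
% Sketch: F-I curve. Assumes spikes{k} holds the spike times (in s) of trial k,
% intensity(k) the stimulus intensity of that trial, and duration the length
% of the stimulus in seconds.
rates = cellfun(@(t) sum(t >= 0 & t < duration) / duration, spikes);
[ints, ~, idx] = unique(intensity);                 % group trials by intensity
ficurve = accumarray(idx(:), rates(:), [], @mean);  % mean rate per intensity
plot(ints, ficurve, 'o-');
xlabel('Stimulus intensity'); ylabel('Firing rate [Hz]');
\end{lstlisting}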

View File

@@ -16,10 +16,9 @@
\question P-unit electroreceptor afferents of the gymnotiform weakly \question P-unit electroreceptor afferents of the gymnotiform weakly
electric fish \textit{Apteronotus leptorhynchus} are spontaneously electric fish \textit{Apteronotus leptorhynchus} are spontaneously
active when the fish is not electrically stimulated. active when the fish is not electrically stimulated.
\begin{itemize}
\item How do the firing rates and the serial correlations of the How do the firing rates and the serial correlations of the
interspike intervals vary between different cells? interspike intervals vary between different cells?
\end{itemize}
In the file \texttt{baselinespikes.mat} you find two variables: In the file \texttt{baselinespikes.mat} you find two variables:
\texttt{cells} is a cell-array with the names of the recorded cells \texttt{cells} is a cell-array with the names of the recorded cells
@@ -34,7 +33,7 @@
this project. this project.
By just looking at the spike rasters, what are the differences By just looking at the spike rasters, what are the differences
betwen the cells? between the cells?
\part Compute the firing rate of each cell, i.e.\ the number of spikes per unit time. \part Compute the firing rate of each cell, i.e.\ the number of spikes per unit time.
@@ -46,15 +45,18 @@
correlations similar between the cells? How do they differ? correlations similar between the cells? How do they differ?
\part Implement a permutation test for computing the significance \part Implement a permutation test for computing the significance
at a 1\,\% level of the serial correlations. Illustrate for a few at an appropriate significance level of the serial
cells the computed serial correlations and the 1\,\% and 99\,\% correlations. Keep in mind that you test the correlations at 10
percentile from the permutation test. At which lag are the serial different lags. At which lags are the serial correlations
correlations clearly significant? statistically significant?
\part Are the serial correlations somehow dependent on the firing rate? \part Are the serial correlations somehow dependent on the firing rate?
Plot the significant correlations against the firing rate. Do you Plot the significant correlations against the firing rate. Do you
observe any dependence? observe any dependence?
Use an appropriate statistical test to support your observation.
\end{parts} \end{parts}
\end{questions} \end{questions}
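A minimal sketch of such a permutation test in MATLAB is given below; the variable \texttt{isis} (a vector with the interspike intervals of one cell) and the number of permutations are assumptions for illustration.

\begin{lstlisting}
% Sketch: permutation test for the lag-1 serial correlation of the
% interspike intervals of one cell. Assumes isis is a column vector of ISIs.
nperm = 1000;                                   % number of permutations
cm = corrcoef(isis(1:end-1), isis(2:end));
rdata = cm(1, 2);                               % observed lag-1 correlation
rperm = zeros(nperm, 1);
for k = 1:nperm
    shuffled = isis(randperm(length(isis)));    % shuffling destroys serial order
    cm = corrcoef(shuffled(1:end-1), shuffled(2:end));
    rperm(k) = cm(1, 2);
end
p = mean(abs(rperm) >= abs(rdata));             % two-sided p-value
% With 10 lags tested per cell, a corrected criterion such as 0.05/10
% (Bonferroni) could be used to judge significance.
\end{lstlisting}

The same test applies at higher lags by correlating \texttt{isis(1:end-lag)} with \texttt{isis(1+lag:end)}.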

View File

@@ -16,8 +16,25 @@
\input{regression} \input{regression}
Example for fit with matlab functions lsqcurvefit, polyfit \section{Fitting in practice}
Fit with the MATLAB functions \texttt{lsqcurvefit} and \texttt{polyfit}
\subsection{Non-linear fits}
\begin{itemize}
\item Example that illustrates the problem of local minima (with an error surface); see the sketch after this list.
\item You need good initial values for the parameters!
\item Example that fitting gets harder the more parameters you have.
\item Try to fix as many parameters as possible before doing the fit.
\item How to test the quality of a fit? Residuals. $\chi^2$ test. Runs test.
\end{itemize}
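As a possible illustration for these points (synthetic data; the model and numbers are assumptions): the same \texttt{lsqcurvefit} call started from two different initial values can converge to different local minima of the error surface.

\begin{lstlisting}
% Sketch: local minima in a non-linear fit. Model f(t) = A*sin(2*pi*f0*t).
t = (0:0.001:1)';
y = 2.0 * sin(2*pi*5.0*t) + 0.2*randn(size(t));  % synthetic data, f0 = 5 Hz
fsine = @(p, t) p(1) .* sin(2*pi*p(2)*t);        % p = [A, f0]
pgood = lsqcurvefit(fsine, [1.0, 4.5], t, y);    % start close to the true frequency
pbad  = lsqcurvefit(fsine, [1.0, 9.0], t, y);    % may get stuck in a local minimum
disp([pgood; pbad]);                             % compare the two solutions
\end{lstlisting}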
\subsection{Linear fits}
\begin{itemize}
\item Polyfit is easy: unique solution!
\item Example for overfitting with polyfit of a high order (= number of data points); see the sketch below.
\end{itemize}
Example for overfitting with polyfit of a high order (=number of data points)
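A possible sketch for the overfitting example (synthetic data; the numbers are assumptions): a polynomial whose order approaches the number of data points follows the noise instead of the underlying trend.

\begin{lstlisting}
% Sketch: overfitting with polyfit when the order approaches the number of
% data points (10 points, order 9 interpolates the noise).
x = linspace(0, 1, 10)';
y = x.^2 + 0.1*randn(size(x));          % noisy quadratic
plow = polyfit(x, y, 2);                % reasonable model order
phigh = polyfit(x, y, 9);               % order = n-1, badly conditioned
xx = linspace(0, 1, 500)';
plot(x, y, 'o', xx, polyval(plow, xx), '-', xx, polyval(phigh, xx), '--');
legend('data', 'order 2', 'order 9');
\end{lstlisting}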
\end{document} \end{document}