diff --git a/README.md b/README.md index 24a0519..959d09a 100644 --- a/README.md +++ b/README.md @@ -1,64 +1,248 @@ -# Chirp detection - GP2023 -## Git-Repository and commands - -- Go to the [Bendalab Git-Server](https://whale.am28.uni-tuebingen.de/git/) (https://whale.am28.uni-tuebingen.de/git/) -- Create your own account (and tell me ;D) - * I'll invite you the repository -- Clone the repository -- -```sh -git clone https://whale.am28.uni-tuebingen.de/git/raab/GP2023_chirp_detection.git -``` - -## Basic git commands - -- pull changes in git -```shell -git pull origin -``` -- commit chances -```shell -git commit -m '' file # commit one file -git commit -a -m '' # commit all files -``` -- push commits -```shell -git push origin -``` - -## Branches -Use branches to work on specific topics (e.g. 'algorithm', 'analysis', 'writing', ore even more specific ones) and merge -them into Master-Branch when it works are up to your expectations. - -The "master" branch should always contain a working/correct version of your project. - -- Create/change into branches -```shell -# list all branches (highlight active branch) -git banch -a -# switch into existing -git checkout -# switch into new branch -git checkout master -git checkout -b -``` - - -- Re-merging with master branch -1) get current version of master and implement it into branch -```shell -git checkout master -git pull origin master -git checkout -git rebase master -``` -This resets you branch to the fork-point, executes all commits of the current master before adding the commits of you -branch. You may have to resolve potential conflicts. Afterwards commit the corrected version and push it to your branch. - -2) Update master branch master -- correct way: Create -```shell -git checkout master -git merge -git push origin master -``` + + + + + + + + + + + + + + + + + + +
<div align="center">
  <img src="assets/logo.png" alt="Logo" width="80" height="80">

  <h3 align="center">chirpdetector</h3>

  <p align="center">
    An algorithm to detect the chirps of weakly electric fish.
    <br />
    <a href="#about-the-project"><strong>Explore the docs »</strong></a>
    <br />
    <a href="#usage">View Demo</a>
    ·
    <a href="https://github.com/github_username/repo_name/issues">Report Bug</a>
    ·
    <a href="https://github.com/github_username/repo_name/issues">Request Feature</a>
  </p>
</div>

<details>
  <summary>Table of Contents</summary>
  <ol>
    <li><a href="#about-the-project">About The Project</a></li>
    <li><a href="#getting-started">Getting Started</a></li>
    <li><a href="#usage">Usage</a></li>
    <li><a href="#roadmap">Roadmap</a></li>
    <li><a href="#contributing">Contributing</a></li>
    <li><a href="#license">License</a></li>
    <li><a href="#contact">Contact</a></li>
    <li><a href="#acknowledgments">Acknowledgments</a></li>
  </ol>
</details>
## About The Project

[![Product Name Screen Shot][product-screenshot]](https://example.com)

Here's a blank template to get started. To avoid retyping too much info, do a search and replace with your text editor for the following placeholders: `github_username`, `repo_name`, `twitter_handle`, `linkedin_username`, `email_client`, `email`, `project_title`, `project_description`.
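The search-and-replace step described above can also be scripted. A minimal sketch — the placeholder values below are hypothetical examples, not part of this repository:

```python
# Hypothetical example values -- substitute your own before running.
REPLACEMENTS = {
    "github_username": "your-user",
    "repo_name": "chirpdetector",
    "project_title": "chirpdetector",
    "project_description": "An algorithm to detect the chirps of weakly electric fish.",
}


def fill_template(text: str, replacements: dict) -> str:
    """Replace every known template placeholder that occurs in text."""
    for placeholder, value in replacements.items():
        text = text.replace(placeholder, value)
    return text
```

Applied over each file of the template, this performs the substitution the paragraph describes in one pass.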

(back to top)

### Built With

* [![Next][Next.js]][Next-url]
* [![React][React.js]][React-url]
* [![Vue][Vue.js]][Vue-url]
* [![Angular][Angular.io]][Angular-url]
* [![Svelte][Svelte.dev]][Svelte-url]
* [![Laravel][Laravel.com]][Laravel-url]
* [![Bootstrap][Bootstrap.com]][Bootstrap-url]
* [![JQuery][JQuery.com]][JQuery-url]

(back to top)

## Getting Started

This is an example of how you may give instructions on setting up your project locally.
To get a local copy up and running, follow these simple example steps.

### Installation

1. Get a free API Key at [https://example.com](https://example.com)
2. Clone the repo
   ```sh
   git clone https://github.com/github_username/repo_name.git
   ```
3. Install NPM packages
   ```sh
   npm install
   ```
4. Enter your API in `config.js`
   ```js
   const API_KEY = 'ENTER YOUR API';
   ```
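The numbered steps above can also be assembled programmatically. A sketch that only builds the shell commands — the URL and key are the template's placeholders, and nothing is executed here:

```python
import shlex


def install_commands(repo_url: str, api_key: str) -> list:
    """Build (but do not run) the shell commands for install steps 2-4
    above; pass them to subprocess.run or a shell yourself."""
    return [
        f"git clone {shlex.quote(repo_url)}",
        "npm install",
        f"echo \"const API_KEY = '{api_key}';\" > config.js",
    ]
```

Keeping the commands as data makes the install sequence easy to log or dry-run before anything touches the filesystem.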

(back to top)

## Usage

Use this space to show useful examples of how a project can be used. Additional screenshots, code examples and demos work well in this space. You may also link to more resources.

_For more examples, please refer to the [Documentation](https://example.com)_
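For this repository specifically, the entry point added later in this diff is `main(datapath, plot)` in `code/chirpdetection.py`, which asserts that `plot` is one of `"save"`, `"show"` or `"false"`. A minimal sketch mirroring that contract — the commented call is hypothetical:

```python
def validate_plot_mode(plot: str) -> str:
    """Mirror the argument check at the top of main() in
    code/chirpdetection.py: plot selects whether figures are saved,
    shown interactively, or suppressed."""
    assert plot in [
        "save",
        "show",
        "false",
    ], "plot must be 'save', 'show' or 'false'"
    return plot


# hypothetical call, using the datapath pattern seen in code/behavior.py:
# main("../data/mount_data/2020-05-13-10_00/", plot="save")
```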

(back to top)

## Roadmap

- [ ] Feature 1
- [ ] Feature 2
- [ ] Feature 3
    - [ ] Nested Feature

See the [open issues](https://github.com/github_username/repo_name/issues) for a full list of proposed features (and known issues).

(back to top)

## Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are **greatly appreciated**.

## License

Distributed under the project license. See `LICENSE.txt` for more information.

## Contact

Your Name - [@twitter_handle](https://twitter.com/twitter_handle) - email@email_client.com

Project Link: [https://github.com/github_username/repo_name](https://github.com/github_username/repo_name)

(back to top)

## Acknowledgments

* []()
* []()
* []()

(back to top)

[contributors-shield]: https://img.shields.io/github/contributors/github_username/repo_name.svg?style=for-the-badge
[contributors-url]: https://github.com/github_username/repo_name/graphs/contributors
[forks-shield]: https://img.shields.io/github/forks/github_username/repo_name.svg?style=for-the-badge
[forks-url]: https://github.com/github_username/repo_name/network/members
[stars-shield]: https://img.shields.io/github/stars/github_username/repo_name.svg?style=for-the-badge
[stars-url]: https://github.com/github_username/repo_name/stargazers
[issues-shield]: https://img.shields.io/github/issues/github_username/repo_name.svg?style=for-the-badge
[issues-url]: https://github.com/github_username/repo_name/issues
[license-shield]: https://img.shields.io/github/license/github_username/repo_name.svg?style=for-the-badge
[license-url]: https://github.com/github_username/repo_name/blob/master/LICENSE.txt
[linkedin-shield]: https://img.shields.io/badge/-LinkedIn-black.svg?style=for-the-badge&logo=linkedin&colorB=555
[linkedin-url]: https://linkedin.com/in/linkedin_username
[product-screenshot]: images/screenshot.png
[Next.js]: https://img.shields.io/badge/next.js-000000?style=for-the-badge&logo=nextdotjs&logoColor=white
[Next-url]: https://nextjs.org/
[React.js]: https://img.shields.io/badge/React-20232A?style=for-the-badge&logo=react&logoColor=61DAFB
[React-url]: https://reactjs.org/
[Vue.js]: https://img.shields.io/badge/Vue.js-35495E?style=for-the-badge&logo=vuedotjs&logoColor=4FC08D
[Vue-url]: https://vuejs.org/
[Angular.io]: https://img.shields.io/badge/Angular-DD0031?style=for-the-badge&logo=angular&logoColor=white
[Angular-url]: https://angular.io/
[Svelte.dev]: https://img.shields.io/badge/Svelte-4A4A55?style=for-the-badge&logo=svelte&logoColor=FF3E00
[Svelte-url]: https://svelte.dev/
[Laravel.com]: https://img.shields.io/badge/Laravel-FF2D20?style=for-the-badge&logo=laravel&logoColor=white
[Laravel-url]: https://laravel.com
[Bootstrap.com]: https://img.shields.io/badge/Bootstrap-563D7C?style=for-the-badge&logo=bootstrap&logoColor=white
[Bootstrap-url]: https://getbootstrap.com
[JQuery.com]: https://img.shields.io/badge/jQuery-0769AD?style=for-the-badge&logo=jquery&logoColor=white
[JQuery-url]: https://jquery.com

diff --git a/README1.md b/README1.md
new file mode 100644
index 0000000..24a0519
--- /dev/null
+++ b/README1.md
@@ -0,0 +1,64 @@
+# Chirp detection - GP2023
+## Git-Repository and commands
+
+- Go to the [Bendalab Git-Server](https://whale.am28.uni-tuebingen.de/git/)
+- Create your own account (and tell me ;D)
+  * I'll invite you to the repository
+- Clone the repository:
+```sh
+git clone https://whale.am28.uni-tuebingen.de/git/raab/GP2023_chirp_detection.git
+```
+
+## Basic git commands
+
+- pull changes from the remote
+```shell
+git pull origin
+```
+- commit changes
+```shell
+git commit -m '<message>' <file>  # commit one file
+git commit -a -m '<message>'      # commit all changed files
+```
+- push commits
+```shell
+git push origin
+```
+
+## Branches
+Use branches to work on specific topics (e.g. 'algorithm', 'analysis', 'writing', or even more specific ones) and merge
+them into the master branch once the work meets your expectations.
+
+The "master" branch should always contain a working/correct version of your project.
+
+- Create/switch branches
+```shell
+# list all branches (the active branch is highlighted)
+git branch -a
+# switch to an existing branch
+git checkout <branchname>
+# create and switch to a new branch off master
+git checkout master
+git checkout -b <newbranch>
+```
+
+- Re-merging with the master branch
+1) Get the current version of master and rebase your branch onto it
+```shell
+git checkout master
+git pull origin master
+git checkout <branchname>
+git rebase master
+```
+This resets your branch to the fork point, replays all commits of the current master, and then adds the commits of your
+branch on top. You may have to resolve conflicts. Afterwards, commit the corrected version and push it to your branch.
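The rebase workflow just described can be captured as an ordered command list; a sketch in which the branch name is a placeholder:

```python
def rebase_commands(branch: str) -> list:
    """The four commands of the rebase workflow above: update local
    master from the remote, then replay the branch's commits on top of
    it. Conflicts still have to be resolved by hand when `git rebase`
    stops."""
    return [
        "git checkout master",
        "git pull origin master",
        f"git checkout {branch}",
        "git rebase master",
    ]
```

The order matters: pulling master first guarantees the rebase replays your commits onto the latest upstream state.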
+ +2) Update master branch master +- correct way: Create +```shell +git checkout master +git merge +git push origin master +``` diff --git a/assets/logo.png b/assets/logo.png new file mode 100644 index 0000000..234652f Binary files /dev/null and b/assets/logo.png differ diff --git a/assets/logo.svg b/assets/logo.svg new file mode 100644 index 0000000..b34ed6c --- /dev/null +++ b/assets/logo.svg @@ -0,0 +1,1184 @@ + + + + diff --git a/code/behavior.py b/code/behavior.py index bbfda4b..71c0926 100644 --- a/code/behavior.py +++ b/code/behavior.py @@ -1,4 +1,5 @@ import os +import os import numpy as np import matplotlib.pyplot as plt @@ -268,4 +269,5 @@ def main(datapath: str): if __name__ == '__main__': # Path to the data datapath = '../data/mount_data/2020-05-13-10_00/' + datapath = '../data/mount_data/2020-05-13-10_00/' main(datapath) diff --git a/code/chirpdetection.py b/code/chirpdetection.py old mode 100644 new mode 100755 index 200a6a4..c4a04e7 --- a/code/chirpdetection.py +++ b/code/chirpdetection.py @@ -4,17 +4,21 @@ from dataclasses import dataclass import numpy as np from IPython import embed import matplotlib.pyplot as plt +import matplotlib.gridspec as gr from scipy.signal import find_peaks -from scipy.ndimage import gaussian_filter1d -from thunderfish.dataloader import DataLoader from thunderfish.powerspectrum import spectrogram, decibel from sklearn.preprocessing import normalize from modules.filters import bandpass_filter, envelope, highpass_filter from modules.filehandling import ConfLoader, LoadData, make_outputdir -from modules.datahandling import flatten, purge_duplicates, group_timestamps from modules.plotstyle import PlotStyle from modules.logger import makeLogger +from modules.datahandling import ( + flatten, + purge_duplicates, + group_timestamps, + instantaneous_frequency, +) logger = makeLogger(__name__) @@ -23,6 +27,12 @@ ps = PlotStyle() @dataclass class PlotBuffer: + + """ + Buffer to save data that is created in the main detection loop + 
and plot it outside the detecion loop. + """ + config: ConfLoader t0: float dt: float @@ -32,9 +42,12 @@ class PlotBuffer: time: np.ndarray baseline: np.ndarray + baseline_envelope_unfiltered: np.ndarray baseline_envelope: np.ndarray baseline_peaks: np.ndarray + search_frequency: float search: np.ndarray + search_envelope_unfiltered: np.ndarray search_envelope: np.ndarray search_peaks: np.ndarray @@ -49,140 +62,223 @@ class PlotBuffer: # make data for plotting - # # get index of track data in this time window - # window_idx = np.arange(len(self.data.idx))[ - # (self.data.ident == self.track_id) & (self.data.time[self.data.idx] >= self.t0) & ( - # self.data.time[self.data.idx] <= (self.t0 + self.dt)) - # ] + # get index of track data in this time window + window_idx = np.arange(len(self.data.idx))[ + (self.data.ident == self.track_id) + & (self.data.time[self.data.idx] >= self.t0) + & (self.data.time[self.data.idx] <= (self.t0 + self.dt)) + ] # get tracked frequencies and their times - # freq_temp = self.data.freq[window_idx] - # time_temp = self.data.times[window_idx] + freq_temp = self.data.freq[window_idx] + # time_temp = self.data.time[ + # self.data.idx[self.data.ident == self.track_id]][ + # (self.data.time >= self.t0) + # & (self.data.time <= (self.t0 + self.dt)) + # ] + + # remake the band we filtered in + q25, q50, q75 = np.percentile(freq_temp, [25, 50, 75]) + search_upper, search_lower = ( + q50 + self.search_frequency + self.config.minimal_bandwidth / 2, + q50 + self.search_frequency - self.config.minimal_bandwidth / 2, + ) # get indices on raw data - start_idx = self.t0 * self.data.raw_rate - window_duration = self.dt * self.data.raw_rate + start_idx = (self.t0 - 5) * self.data.raw_rate + window_duration = (self.dt + 10) * self.data.raw_rate stop_idx = start_idx + window_duration # get raw data data_oi = self.data.raw[start_idx:stop_idx, self.electrode] - fig, axs = plt.subplots( - 7, - 1, - figsize=(20 / 2.54, 12 / 2.54), - constrained_layout=True, - 
sharex=True, - sharey='row', + self.time = self.time - self.t0 + self.frequency_time = self.frequency_time - self.t0 + chirps = np.asarray(chirps) - self.t0 + self.t0_old = self.t0 + self.t0 = 0 + + fig = plt.figure( + figsize=(14 / 2.54, 20 / 2.54) ) + gs0 = gr.GridSpec( + 3, 1, figure=fig, height_ratios=[1, 1, 1] + ) + gs1 = gs0[0].subgridspec(1, 1) + gs2 = gs0[1].subgridspec(3, 1, hspace=0.4) + gs3 = gs0[2].subgridspec(3, 1, hspace=0.4) + # gs4 = gs0[5].subgridspec(1, 1) + + ax6 = fig.add_subplot(gs3[2, 0]) + ax0 = fig.add_subplot(gs1[0, 0], sharex=ax6) + ax1 = fig.add_subplot(gs2[0, 0], sharex=ax6) + ax2 = fig.add_subplot(gs2[1, 0], sharex=ax6) + ax3 = fig.add_subplot(gs2[2, 0], sharex=ax6) + ax4 = fig.add_subplot(gs3[0, 0], sharex=ax6) + ax5 = fig.add_subplot(gs3[1, 0], sharex=ax6) + # ax7 = fig.add_subplot(gs4[0, 0], sharex=ax0) + + # ax_leg = fig.add_subplot(gs0[1, 0]) + + waveform_scaler = 1000 + lw = 1.5 + # plot spectrogram - plot_spectrogram(axs[0], data_oi, self.data.raw_rate, self.t0) + _ = plot_spectrogram( + ax0, + data_oi, + self.data.raw_rate, + self.t0 - 5, + [np.max(self.frequency) - 200, np.max(self.frequency) + 200] + ) + + for track_id in self.data.ids: + + t0_track = self.t0_old - 5 + dt_track = self.dt + 10 + window_idx = np.arange(len(self.data.idx))[ + (self.data.ident == track_id) + & (self.data.time[self.data.idx] >= t0_track) + & (self.data.time[self.data.idx] <= (t0_track + dt_track)) + ] + + # get tracked frequencies and their times + f = self.data.freq[window_idx] + t = self.data.time[ + self.data.idx[self.data.ident == self.track_id]] + tmask = (t >= t0_track) & (t <= (t0_track + dt_track)) + if track_id == self.track_id: + ax0.plot(t[tmask]-self.t0_old, f, lw=lw, + zorder=10, color=ps.gblue1) + else: + ax0.plot(t[tmask]-self.t0_old, f, lw=lw, + zorder=10, color=ps.gray, alpha=0.5) + + ax0.fill_between( + np.arange(self.t0, self.t0 + self.dt, 1 / self.data.raw_rate), + q50 - self.config.minimal_bandwidth / 2, + q50 + 
self.config.minimal_bandwidth / 2, + color=ps.gblue1, + lw=1, + ls="dashed", + alpha=0.5, + ) + + ax0.fill_between( + np.arange(self.t0, self.t0 + self.dt, 1 / self.data.raw_rate), + search_lower, + search_upper, + color=ps.gblue2, + lw=1, + ls="dashed", + alpha=0.5, + ) + # ax0.axhline(q50, spec_times[0], spec_times[-1], + # color=ps.gblue1, lw=2, ls="dashed") + # ax0.axhline(q50 + self.search_frequency, + # spec_times[0], spec_times[-1], + # color=ps.gblue2, lw=2, ls="dashed") for chirp in chirps: - axs[0].scatter(chirp, np.median(self.frequency), c=ps.red) + ax0.scatter( + chirp, np.median(self.frequency) + 150, c=ps.black, marker="v" + ) # plot waveform of filtered signal - axs[1].plot(self.time, self.baseline, c=ps.green) + ax1.plot(self.time, self.baseline * waveform_scaler, + c=ps.gray, lw=lw, alpha=0.5) + ax1.plot(self.time, self.baseline_envelope_unfiltered * + waveform_scaler, c=ps.gblue1, lw=lw, label="baseline envelope") # plot waveform of filtered search signal - axs[2].plot(self.time, self.search) + ax2.plot(self.time, self.search * waveform_scaler, + c=ps.gray, lw=lw, alpha=0.5) + ax2.plot(self.time, self.search_envelope_unfiltered * + waveform_scaler, c=ps.gblue2, lw=lw, label="search envelope") - # plot baseline instantaneos frequency - axs[3].plot(self.frequency_time, self.frequency) + # plot baseline instantaneous frequency + ax3.plot(self.frequency_time, self.frequency, + c=ps.gblue3, lw=lw, label="baseline inst. 
freq.") # plot filtered and rectified envelope - axs[4].plot(self.time, self.baseline_envelope) - axs[4].scatter( + ax4.plot(self.time, self.baseline_envelope, c=ps.gblue1, lw=lw) + ax4.scatter( (self.time)[self.baseline_peaks], self.baseline_envelope[self.baseline_peaks], - c=ps.red, + edgecolors=ps.red, + zorder=10, + marker="o", + facecolors="none", ) # plot envelope of search signal - axs[5].plot(self.time, self.search_envelope) - axs[5].scatter( + ax5.plot(self.time, self.search_envelope, c=ps.gblue2, lw=lw) + ax5.scatter( (self.time)[self.search_peaks], self.search_envelope[self.search_peaks], - c=ps.red, + edgecolors=ps.red, + zorder=10, + marker="o", + facecolors="none", ) # plot filtered instantaneous frequency - axs[6].plot(self.frequency_time, self.frequency_filtered) - axs[6].scatter( + ax6.plot(self.frequency_time, + self.frequency_filtered, c=ps.gblue3, lw=lw) + ax6.scatter( self.frequency_time[self.frequency_peaks], self.frequency_filtered[self.frequency_peaks], - c=ps.red, + edgecolors=ps.red, + zorder=10, + marker="o", + facecolors="none", ) - axs[0].set_ylim(np.max(self.frequency)-200, - top=np.max(self.frequency)+200) - axs[6].set_xlabel("Time [s]") - axs[0].set_title("Spectrogram") - axs[1].set_title("Fitered baseline") - axs[2].set_title("Fitered above") - axs[3].set_title("Fitered baseline instanenous frequency") - axs[4].set_title("Filtered envelope of baseline envelope") - axs[5].set_title("Search envelope") - axs[6].set_title( - "Filtered absolute instantaneous frequency") - - if plot == 'show': - plt.show() - elif plot == 'save': - make_outputdir(self.config.outputdir) - out = make_outputdir(self.config.outputdir + - self.data.datapath.split('/')[-2] + '/') - - plt.savefig(f"{out}{self.track_id}_{self.t0}.pdf") - plt.close() - - -def instantaneos_frequency( - signal: np.ndarray, samplerate: int -) -> tuple[np.ndarray, np.ndarray]: - """ - Compute the instantaneous frequency of a signal. 
- - Parameters - ---------- - signal : np.ndarray - Signal to compute the instantaneous frequency from. - samplerate : int - Samplerate of the signal. - - Returns - ------- - tuple[np.ndarray, np.ndarray] - """ - # calculate instantaneos frequency with zero crossings - roll_signal = np.roll(signal, shift=1) - time_signal = np.arange(len(signal)) / samplerate - period_index = np.arange(len(signal))[( - roll_signal < 0) & (signal >= 0)][1:-1] + ax0.set_ylabel("frequency [Hz]") + ax1.set_ylabel("a.u.") + ax2.set_ylabel("a.u.") + ax3.set_ylabel("Hz") + ax5.set_ylabel("a.u.") + ax6.set_xlabel("time [s]") - upper_bound = np.abs(signal[period_index]) - lower_bound = np.abs(signal[period_index - 1]) - upper_time = np.abs(time_signal[period_index]) - lower_time = np.abs(time_signal[period_index - 1]) + plt.setp(ax0.get_xticklabels(), visible=False) + plt.setp(ax1.get_xticklabels(), visible=False) + plt.setp(ax2.get_xticklabels(), visible=False) + plt.setp(ax3.get_xticklabels(), visible=False) + plt.setp(ax4.get_xticklabels(), visible=False) + plt.setp(ax5.get_xticklabels(), visible=False) - # create ratio - lower_ratio = lower_bound / (lower_bound + upper_bound) + # ps.letter_subplots([ax0, ax1, ax4], xoffset=-0.21) - # appy to time delta - time_delta = upper_time - lower_time - true_zero = lower_time + lower_ratio * time_delta + # ax7.set_xticks(np.arange(0, 5.5, 1)) + # ax7.spines.bottom.set_bounds((0, 5)) - # create new time array - inst_freq_time = true_zero[:-1] + 0.5 * np.diff(true_zero) + ax0.set_xlim(0, self.config.window) + plt.subplots_adjust(left=0.165, right=0.975, + top=0.98, bottom=0.074, hspace=0.2) + fig.align_labels() - # compute frequency - inst_freq = gaussian_filter1d(1 / np.diff(true_zero), 5) + if plot == "show": + plt.show() + elif plot == "save": + make_outputdir(self.config.outputdir) + out = make_outputdir( + self.config.outputdir + self.data.datapath.split("/")[-2] + "/" + ) - return inst_freq_time, inst_freq + 
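For reference, the zero-crossing estimator that this diff removes (it is now imported as `instantaneous_frequency` from `modules.datahandling`) can be sketched as a standalone function; this sketch omits the `gaussian_filter1d` smoothing of the removed version:

```python
import numpy as np


def zero_crossing_frequency(signal: np.ndarray, samplerate: float) -> np.ndarray:
    """Instantaneous frequency from rising zero crossings, with the
    crossing times refined by linear interpolation between the two
    samples surrounding each crossing."""
    prev = np.roll(signal, 1)
    # indices where the signal goes from negative to non-negative,
    # dropping the first and last crossing as in the removed version
    idx = np.arange(len(signal))[(prev < 0) & (signal >= 0)][1:-1]
    lower = np.abs(signal[idx - 1])
    upper = np.abs(signal[idx])
    t = np.arange(len(signal)) / samplerate
    # linearly interpolated zero-crossing times
    true_zero = t[idx - 1] + lower / (lower + upper) * (t[idx] - t[idx - 1])
    # one frequency estimate per full period between crossings
    return 1.0 / np.diff(true_zero)
```

On a clean 100 Hz sine sampled at 10 kHz this recovers the carrier frequency to well under 1 Hz, which is the property the chirp detector relies on when it looks for frequency excursions.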
plt.savefig(f"{out}{self.track_id}_{self.t0_old}.pdf") + plt.savefig(f"{out}{self.track_id}_{self.t0_old}.svg") + plt.close() -def plot_spectrogram(axis, signal: np.ndarray, samplerate: float, t0: float) -> None: +def plot_spectrogram( + axis, + signal: np.ndarray, + samplerate: float, + window_start_seconds: float, + ylims: list[float] +) -> np.ndarray: """ Plot a spectrogram of a signal. @@ -194,7 +290,7 @@ def plot_spectrogram(axis, signal: np.ndarray, samplerate: float, t0: float) -> Signal to plot the spectrogram from. samplerate : float Samplerate of the signal. - t0 : float + window_start_seconds : float Start time of the signal. """ @@ -204,21 +300,36 @@ def plot_spectrogram(axis, signal: np.ndarray, samplerate: float, t0: float) -> spec_power, spec_freqs, spec_times = spectrogram( signal, ratetime=samplerate, - freq_resolution=50, - overlap_frac=0.2, + freq_resolution=10, + overlap_frac=0.5, ) - axis.pcolormesh( - spec_times + t0, - spec_freqs, - decibel(spec_power), + fmask = np.zeros(spec_freqs.shape, dtype=bool) + fmask[(spec_freqs > ylims[0]) & (spec_freqs < ylims[1])] = True + + axis.imshow( + decibel(spec_power[fmask, :]), + extent=[ + spec_times[0] + window_start_seconds, + spec_times[-1] + window_start_seconds, + spec_freqs[fmask][0], + spec_freqs[fmask][-1], + ], + aspect="auto", + origin="lower", + interpolation="gaussian", + alpha=1, ) - - axis.set_ylim(200, 1200) + # axis.use_sticky_edges = False + return spec_times -def double_bandpass( - data: DataLoader, samplerate: int, freqs: np.ndarray, search_freq: float +def extract_frequency_bands( + raw_data: np.ndarray, + samplerate: int, + baseline_track: np.ndarray, + searchband_center: float, + minimal_bandwidth: float, ) -> tuple[np.ndarray, np.ndarray]: """ Apply a bandpass filter to the baseline of a signal and a second bandpass @@ -226,14 +337,16 @@ def double_bandpass( Parameters ---------- - data : DataLoader + raw_data : np.ndarray Data to apply the filter to. 
samplerate : int Samplerate of the signal. - freqs : np.ndarray + baseline_track : np.ndarray Tracked fundamental frequencies of the signal. - search_freq : float + searchband_center: float Frequency to search for above or below the baseline. + minimal_bandwidth : float + Minimal bandwidth of the filter. Returns ------- @@ -241,66 +354,211 @@ def double_bandpass( """ # compute boundaries to filter baseline - q25, q75 = np.percentile(freqs, [25, 75]) + q25, q50, q75 = np.percentile(baseline_track, [25, 50, 75]) # check if percentile delta is too small - if q75 - q25 < 5: - median = np.median(freqs) - q25, q75 = median - 2.5, median + 2.5 + if q75 - q25 < 10: + q25, q75 = q50 - minimal_bandwidth / 2, q50 + minimal_bandwidth / 2 # filter baseline - filtered_baseline = bandpass_filter(data, samplerate, lowf=q25, highf=q75) + filtered_baseline = bandpass_filter( + raw_data, samplerate, lowf=q25, highf=q75 + ) # filter search area filtered_search_freq = bandpass_filter( - data, samplerate, lowf=q25 + search_freq, highf=q75 + search_freq + raw_data, + samplerate, + lowf=searchband_center + q50 - minimal_bandwidth / 2, + highf=searchband_center + q50 + minimal_bandwidth / 2, ) - return (filtered_baseline, filtered_search_freq) + return filtered_baseline, filtered_search_freq -def freqmedian_allfish(data: LoadData, t0: float, dt: float) -> tuple[float, list[int]]: +def window_median_all_track_ids( + data: LoadData, window_start_seconds: float, window_duration_seconds: float +) -> tuple[list[tuple[float, float, float]], list[int]]: """ - Calculate the median frequency of all fish in a given time window. + Calculate the median and quantiles of the frequency of all fish in a + given time window. 
+ + Iterate over all track ids and calculate the 25, 50 and 75 percentile + in a given time window to pass this data to 'find_searchband' function, + which then determines whether other fish in the current window fall + within the searchband of the current fish and then determine the + gaps that are outside of the percentile ranges. Parameters ---------- data : LoadData Data to calculate the median frequency from. - t0 : float + window_start_seconds : float Start time of the window. - dt : float + window_duration_seconds : float Duration of the window. Returns ------- - tuple[float, list[int]] + tuple[list[tuple[float, float, float]], list[int]] """ - median_freq = [] + frequency_percentiles = [] track_ids = [] for _, track_id in enumerate(np.unique(data.ident[~np.isnan(data.ident)])): + + # the window index combines the track id and the time window window_idx = np.arange(len(data.idx))[ - (data.ident == track_id) & (data.time[data.idx] >= t0) & ( - data.time[data.idx] <= (t0 + dt)) + (data.ident == track_id) + & (data.time[data.idx] >= window_start_seconds) + & ( + data.time[data.idx] + <= (window_start_seconds + window_duration_seconds) + ) ] if len(data.freq[window_idx]) > 0: - median_freq.append(np.median(data.freq[window_idx])) + frequency_percentiles.append( + np.percentile(data.freq[window_idx], [25, 50, 75])) track_ids.append(track_id) # convert to numpy array - median_freq = np.asarray(median_freq) + frequency_percentiles = np.asarray(frequency_percentiles) track_ids = np.asarray(track_ids) - return median_freq, track_ids + return frequency_percentiles, track_ids + + +def find_searchband( + current_frequency: np.ndarray, + percentiles_ids: np.ndarray, + frequency_percentiles: np.ndarray, + config: ConfLoader, + data: LoadData, +) -> float: + """ + Find the search frequency band for each fish by checking which fish EODs + are above the current EOD and finding a gap in them. 
+ + Parameters + ---------- + current_frequency : np.ndarray + Current EOD frequency array / the current fish of interest. + percentiles_ids : np.ndarray + Array of track IDs of the medians of all other fish in the current + window. + frequency_percentiles : np.ndarray + Array of percentiles frequencies of all other fish in the current window. + config : ConfLoader + Configuration file. + data : LoadData + Data to find the search frequency from. + + Returns + ------- + float + + """ + # frequency window where second filter filters is potentially allowed + # to filter. This is the search window, in which we want to find + # a gap in the other fish's EODs. + + search_window = np.arange( + np.median(current_frequency) + config.search_df_lower, + np.median(current_frequency) + config.search_df_upper, + config.search_res, + ) + + # search window in boolean + search_window_bool = np.ones_like(len(search_window), dtype=bool) + + # make seperate arrays from the qartiles + q25 = np.asarray([i[0] for i in frequency_percentiles]) + q75 = np.asarray([i[2] for i in frequency_percentiles]) + + # get tracks that fall into search window + check_track_ids = percentiles_ids[ + (q25 > search_window[0]) & ( + q75 < search_window[-1]) + ] + + # iterate through theses tracks + if check_track_ids.size != 0: + + for j, check_track_id in enumerate(check_track_ids): + + q25_temp = q25[percentiles_ids == check_track_id] + q75_temp = q75[percentiles_ids == check_track_id] + + print(q25_temp, q75_temp) + + search_window_bool[ + (search_window > q25_temp) & (search_window < q75_temp) + ] = False + + # find gaps in search window + search_window_indices = np.arange(len(search_window)) + + # get search window gaps + # taking the diff of a boolean array gives non zero values where the + # array changes from true to false or vice versa + + search_window_gaps = np.diff(search_window_bool, append=np.nan) + nonzeros = search_window_gaps[np.nonzero(search_window_gaps)[0]] + nonzeros = 
nonzeros[~np.isnan(nonzeros)] + + # if the first value is -1, the array starst with true, so a gap + if nonzeros[0] == -1: + stops = search_window_indices[search_window_gaps == -1] + starts = np.append( + 0, search_window_indices[search_window_gaps == 1] + ) + + # if the last value is -1, the array ends with true, so a gap + if nonzeros[-1] == 1: + stops = np.append( + search_window_indices[search_window_gaps == -1], + len(search_window) - 1, + ) + + # else it starts with false, so no gap + if nonzeros[0] == 1: + stops = search_window_indices[search_window_gaps == -1] + starts = search_window_indices[search_window_gaps == 1] + + # if the last value is -1, the array ends with true, so a gap + if nonzeros[-1] == 1: + stops = np.append( + search_window_indices[search_window_gaps == -1], + len(search_window), + ) + + # get the frequency ranges of the gaps + search_windows = [search_window[x:y] for x, y in zip(starts, stops)] + search_windows_lens = [len(x) for x in search_windows] + longest_search_window = search_windows[np.argmax(search_windows_lens)] + + # the center of the search frequency band is then the center of + # the longest gap + + search_freq = ( + longest_search_window[-1] - longest_search_window[0] + ) / 2 + + return search_freq + + return config.default_search_freq def main(datapath: str, plot: str) -> None: - assert plot in ["save", "show", "false"] + assert plot in [ + "save", + "show", + "false", + ], "plot must be 'save', 'show' or 'false'" # load raw file data = LoadData(datapath) @@ -313,13 +571,15 @@ def main(datapath: str, plot: str) -> None: window_overlap = config.overlap * data.raw_rate window_edge = config.edge * data.raw_rate - # check if window duration is even + # check if window duration and window ovelap is even, otherwise the half + # of the duration or window overlap would return a float, thus an + # invalid index + if window_duration % 2 == 0: window_duration = int(window_duration) else: raise ValueError("Window duration must be 
even.") - # check if window ovelap is even if window_overlap % 2 == 0: window_overlap = int(window_overlap) else: @@ -328,339 +588,408 @@ def main(datapath: str, plot: str) -> None: # make time array for raw data raw_time = np.arange(data.raw.shape[0]) / data.raw_rate - # # good chirp times for data: 2022-06-02-10_00 - # t0 = (3 * 60 * 60 + 6 * 60 + 43.5) * data.raw_rate - # dt = 60 * data.raw_rate + # good chirp times for data: 2022-06-02-10_00 + # window_start_index = (3 * 60 * 60 + 6 * 60 + 43.5 + 5) * data.raw_rate + # window_duration_index = 60 * data.raw_rate + + # t0 = 0 + # dt = data.raw.shape[0] + # window_start_seconds = (23495 + ((28336-23495)/3)) * data.raw_rate + # window_duration_seconds = (28336 - 23495) * data.raw_rate - t0 = 0 - dt = data.raw.shape[0] + window_start_index = 0 + window_duration_index = data.raw.shape[0] # generate starting points of rolling window - window_starts = np.arange( - t0, - t0 + dt, + window_start_indices = np.arange( + window_start_index, + window_start_index + window_duration_index, window_duration - (window_overlap + 2 * window_edge), - dtype=int + dtype=int, ) # ititialize lists to store data - chirps = [] - fish_ids = [] + multiwindow_chirps = [] + multiwindow_ids = [] - for st, start_index in enumerate(window_starts): + for st, window_start_index in enumerate(window_start_indices): - logger.info(f"Processing window {st} of {len(window_starts)}") + logger.info(f"Processing window {st+1} of {len(window_start_indices)}") - # make t0 and dt - t0 = start_index / data.raw_rate - dt = window_duration / data.raw_rate + window_start_seconds = window_start_index / data.raw_rate + window_duration_seconds = window_duration / data.raw_rate # set index window - stop_index = start_index + window_duration + window_stop_index = window_start_index + window_duration # calucate median of fish frequencies in window - median_freq, median_ids = freqmedian_allfish(data, t0, dt) + median_freq, median_ids = window_median_all_track_ids( + 
data, window_start_seconds, window_duration_seconds + ) # iterate through all fish - for tr, track_id in enumerate(np.unique(data.ident[~np.isnan(data.ident)])): + for tr, track_id in enumerate( + np.unique(data.ident[~np.isnan(data.ident)]) + ): logger.debug(f"Processing track {tr} of {len(data.ids)}") # get index of track data in this time window - window_idx = np.arange(len(data.idx))[ - (data.ident == track_id) & (data.time[data.idx] >= t0) & ( - data.time[data.idx] <= (t0 + dt)) + track_window_index = np.arange(len(data.idx))[ + (data.ident == track_id) + & (data.time[data.idx] >= window_start_seconds) + & ( + data.time[data.idx] + <= (window_start_seconds + window_duration_seconds) + ) ] # get tracked frequencies and their times - freq_temp = data.freq[window_idx] - powers_temp = data.powers[window_idx, :] + current_frequencies = data.freq[track_window_index] + current_powers = data.powers[track_window_index, :] # approximate sampling rate to compute expected durations if there # is data available for this time window for this fish id + track_samplerate = np.mean(1 / np.diff(data.time)) - expected_duration = ((t0 + dt) - t0) * track_samplerate + expected_duration = ( + (window_start_seconds + window_duration_seconds) + - window_start_seconds + ) * track_samplerate # check if tracked data available in this window - if len(freq_temp) < expected_duration * 0.5: + if len(current_frequencies) < expected_duration / 2: logger.warning( - f"Track {track_id} has no data in window {st}, skipping.") + f"Track {track_id} has no data in window {st}, skipping." 
+ ) continue # check if there are powers available in this window - nanchecker = np.unique(np.isnan(powers_temp)) - if (len(nanchecker) == 1) and nanchecker[0] == True: + nanchecker = np.unique(np.isnan(current_powers)) + if (len(nanchecker) == 1) and nanchecker[0] is True: logger.warning( - f"No powers available for track {track_id} window {st}, skipping.") + f"No powers available for track {track_id} window {st}," + " skipping." + ) continue - best_electrodes = np.argsort(np.nanmean( - powers_temp, axis=0))[-config.number_electrodes:] - - # frequency where second filter filters - search_window = np.arange( - np.median(freq_temp)+config.search_df_lower, np.median( - freq_temp)+config.search_df_upper, config.search_res) - - # search window in boolean - search_window_bool = np.ones(len(search_window), dtype=bool) + # find the strongest electrodes for the current fish in the current + # window - # get tracks that fall into search window - check_track_ids = median_ids[(median_freq > search_window[0]) & ( - median_freq < search_window[-1])] + best_electrode_index = np.argsort( + np.nanmean(current_powers, axis=0) + )[-config.number_electrodes:] - # iterate through theses tracks - if check_track_ids.size != 0: + # find a frequency above the baseline of the current fish in which + # no other fish is active to search for chirps there - for j, check_track_id in enumerate(check_track_ids): + search_frequency = find_searchband( + config=config, + current_frequency=current_frequencies, + percentiles_ids=median_ids, + data=data, + frequency_percentiles=median_freq, + ) - q1, q2 = np.percentile( - data.freq[data.ident == check_track_id], - config.search_freq_percentiles - ) - - search_window_bool[(search_window > q1) & ( - search_window < q2)] = False - - # find gaps in search window - search_window_indices = np.arange(len(search_window)) - - # get search window gaps - search_window_gaps = np.diff(search_window_bool, append=np.nan) - nonzeros = search_window_gaps[np.nonzero( - 
search_window_gaps)[0]] - nonzeros = nonzeros[~np.isnan(nonzeros)] - - # if the first value is -1, the array starst with true, so a gap - if nonzeros[0] == -1: - stops = search_window_indices[search_window_gaps == -1] - starts = np.append( - 0, search_window_indices[search_window_gaps == 1]) - - # if the last value is -1, the array ends with true, so a gap - if nonzeros[-1] == 1: - stops = np.append( - search_window_indices[search_window_gaps == -1], - len(search_window) - 1 - ) - - # else it starts with false, so no gap - if nonzeros[0] == 1: - stops = search_window_indices[search_window_gaps == -1] - starts = search_window_indices[search_window_gaps == 1] - - # if the last value is -1, the array ends with true, so a gap - if nonzeros[-1] == 1: - stops = np.append( - search_window_indices[search_window_gaps == -1], - len(search_window) - ) - - # get the frequency ranges of the gaps - search_windows = [search_window[x:y] - for x, y in zip(starts, stops)] - search_windows_lens = [len(x) for x in search_windows] - longest_search_window = search_windows[np.argmax( - search_windows_lens)] - - search_freq = ( - longest_search_window[1] - longest_search_window[0]) / 2 + # add all chirps that are detected on multiple electrodes for one + # fish in one window to this list - else: - search_freq = config.default_search_freq - - # ----------- chrips on the two best electrodes----------- - chirps_electrodes = [] + multielectrode_chirps = [] # iterate through electrodes - for el, electrode in enumerate(best_electrodes): + for el, electrode_index in enumerate(best_electrode_index): logger.debug( - f"Processing electrode {el} of {len(best_electrodes)}") + f"Processing electrode {el+1} of " + f"{len(best_electrode_index)}" + ) + + # LOAD DATA FOR CURRENT ELECTRODE AND CURRENT FISH ------------ # load region of interest of raw data file - data_oi = data.raw[start_index:stop_index, :] - time_oi = raw_time[start_index:stop_index] + current_raw_data = data.raw[ + 
window_start_index:window_stop_index, electrode_index + ] + current_raw_time = raw_time[ + window_start_index:window_stop_index + ] + + # EXTRACT FEATURES -------------------------------------------- # filter baseline and above - baseline, search = double_bandpass( - data_oi[:, electrode], - data.raw_rate, - freq_temp, - search_freq + baselineband, searchband = extract_frequency_bands( + raw_data=current_raw_data, + samplerate=data.raw_rate, + baseline_track=current_frequencies, + searchband_center=search_frequency, + minimal_bandwidth=config.minimal_bandwidth, ) - # compute instantaneous frequency on broad signal - broad_baseline = bandpass_filter( - data_oi[:, electrode], - data.raw_rate, - lowf=np.mean(freq_temp)-5, - highf=np.mean(freq_temp)+100 - ) + # compute envelope of baseline band to find dips + # in the baseline envelope - # compute instantaneous frequency on narrow signal - baseline_freq_time, baseline_freq = instantaneos_frequency( - baseline, data.raw_rate + baseline_envelope_unfiltered = envelope( + signal=baselineband, + samplerate=data.raw_rate, + cutoff_frequency=config.baseline_envelope_cutoff, ) - # compute envelopes - baseline_envelope_unfiltered = envelope( - baseline, data.raw_rate, config.envelope_cutoff) - search_envelope = envelope( - search, data.raw_rate, config.envelope_cutoff) - - # highpass filter envelopes - baseline_envelope = highpass_filter( - baseline_envelope_unfiltered, - data.raw_rate, - config.envelope_highpass_cutoff + # highpass filter baseline envelope to remove slower + # fluctuations, e.g. due to motion + + baseline_envelope = bandpass_filter( + signal=baseline_envelope_unfiltered, + samplerate=data.raw_rate, + lowf=config.baseline_envelope_bandpass_lowf, + highf=config.baseline_envelope_bandpass_highf, ) - # envelopes of filtered envelope of filtered baseline + # highpass filter introduced filter effects, i.e. oscillations + # around peaks. 
Compute the envelope of the highpass filtered + # and inverted baseline envelope to remove these oscillations + + baseline_envelope = -baseline_envelope + baseline_envelope = envelope( - np.abs(baseline_envelope), - data.raw_rate, - config.envelope_envelope_cutoff + signal=baseline_envelope, + samplerate=data.raw_rate, + cutoff_frequency=config.baseline_envelope_envelope_cutoff, ) - # bandpass filter the instantaneous - inst_freq_filtered = bandpass_filter( - baseline_freq, - data.raw_rate, - lowf=config.instantaneous_lowf, - highf=config.instantaneous_highf + # compute the envelope of the search band. Peaks in the search + # band envelope correspond to troughs in the baseline envelope + # during chirps + + search_envelope_unfiltered = envelope( + signal=searchband, + samplerate=data.raw_rate, + cutoff_frequency=config.search_envelope_cutoff, + ) + search_envelope = search_envelope_unfiltered + + # compute instantaneous frequency of the baseline band to find + # anomalies during a chirp, i.e. a frequency jump upwards or + # sometimes downwards. We do not fully understand why the + # instantaneous frequency can also jump downwards during a + # chirp. This phenomenon is only observed on chirps on a narrow + # filtered baseline such as the one we are working with. + + ( + baseline_frequency_time, + baseline_frequency, + ) = instantaneous_frequency( + signal=baselineband, + samplerate=data.raw_rate, + smoothing_window=config.baseline_frequency_smoothing, ) - # CUT OFF OVERLAP --------------------------------------------- + # bandpass filter the instantaneous frequency to remove slow + # fluctuations. 
Just as with the baseline envelope, we then + # compute the envelope of the signal to remove the oscillations + # around the peaks - # cut off first and last 0.5 * overlap at start and end - valid = np.arange( - int(window_edge), len(baseline_envelope) - - int(window_edge) + baseline_frequency_samplerate = np.mean( + np.diff(baseline_frequency_time) ) - baseline_envelope_unfiltered = baseline_envelope_unfiltered[valid] - baseline_envelope = baseline_envelope[valid] - search_envelope = search_envelope[valid] - - # get inst freq valid snippet - valid_t0 = int(window_edge) / data.raw_rate - valid_t1 = baseline_freq_time[-1] - \ - (int(window_edge) / data.raw_rate) - - inst_freq_filtered = inst_freq_filtered[ - (baseline_freq_time >= valid_t0) & ( - baseline_freq_time <= valid_t1) - ] - baseline_freq = baseline_freq[ - (baseline_freq_time >= valid_t0) & ( - baseline_freq_time <= valid_t1) - ] + baseline_frequency_filtered = np.abs( + baseline_frequency - np.median(baseline_frequency) + ) + + baseline_frequency_filtered = highpass_filter( + signal=baseline_frequency_filtered, + samplerate=baseline_frequency_samplerate, + cutoff=config.baseline_frequency_highpass_cutoff, + ) + + baseline_frequency_filtered = envelope( + signal=-baseline_frequency_filtered, + samplerate=baseline_frequency_samplerate, + cutoff_frequency=config.baseline_frequency_envelope_cutoff, + ) - baseline_freq_time = baseline_freq_time[ - (baseline_freq_time >= valid_t0) & ( - baseline_freq_time <= valid_t1) - ] + t0 + # CUT OFF OVERLAP --------------------------------------------- + + # cut off snippet at start and end of each window to remove + # filter effects + + # get arrays with raw samplerate without edges + no_edges = np.arange( + int(window_edge), len(baseline_envelope) - int(window_edge) + ) + current_raw_time = current_raw_time[no_edges] + baselineband = baselineband[no_edges] + baseline_envelope_unfiltered = baseline_envelope_unfiltered[no_edges] + searchband = searchband[no_edges] + 
baseline_envelope = baseline_envelope[no_edges] + search_envelope_unfiltered = search_envelope_unfiltered[no_edges] + search_envelope = search_envelope[no_edges] + + # get instantaneous frequency without edges + no_edges_t0 = int(window_edge) / data.raw_rate + no_edges_t1 = baseline_frequency_time[-1] - ( + int(window_edge) / data.raw_rate + ) + no_edges = (baseline_frequency_time >= no_edges_t0) & ( + baseline_frequency_time <= no_edges_t1 + ) - # overwrite raw time to valid region - time_oi = time_oi[valid] - baseline = baseline[valid] - broad_baseline = broad_baseline[valid] - search = search[valid] + baseline_frequency_filtered = baseline_frequency_filtered[ + no_edges + ] + baseline_frequency = baseline_frequency[no_edges] + baseline_frequency_time = ( + baseline_frequency_time[no_edges] + window_start_seconds + ) # NORMALIZE --------------------------------------------------- + # normalize all three feature arrays to the same range to make + # peak detection simpler + baseline_envelope = normalize([baseline_envelope])[0] search_envelope = normalize([search_envelope])[0] - inst_freq_filtered = normalize([np.abs(inst_freq_filtered)])[0] + baseline_frequency_filtered = normalize( + [baseline_frequency_filtered] + )[0] # PEAK DETECTION ---------------------------------------------- # detect peaks baseline_envelope - prominence = np.percentile( - baseline_envelope, config.baseline_prominence_percentile) - baseline_peaks, _ = find_peaks( - baseline_envelope, prominence=prominence) - + baseline_peak_indices, _ = find_peaks( + baseline_envelope, prominence=config.prominence + ) # detect peaks search_envelope - prominence = np.percentile( - search_envelope, config.search_prominence_percentile) - search_peaks, _ = find_peaks( - search_envelope, prominence=prominence) - - # detect peaks inst_freq_filtered - prominence = np.percentile( - inst_freq_filtered, - config.instantaneous_prominence_percentile + search_peak_indices, _ = find_peaks( + search_envelope, 
prominence=config.prominence ) - inst_freq_peaks, _ = find_peaks( - inst_freq_filtered, - prominence=prominence + # detect peaks inst_freq_filtered + frequency_peak_indices, _ = find_peaks( + baseline_frequency_filtered, prominence=config.prominence ) - # DETECT CHIRPS IN SEARCH WINDOW ------------------------------- + # DETECT CHIRPS IN SEARCH WINDOW ------------------------------ + + # get the peak timestamps from the peak indices + baseline_peak_timestamps = current_raw_time[ + baseline_peak_indices + ] + search_peak_timestamps = current_raw_time[ + search_peak_indices] + + frequency_peak_timestamps = baseline_frequency_time[ + frequency_peak_indices + ] - baseline_ts = time_oi[baseline_peaks] - search_ts = time_oi[search_peaks] - freq_ts = baseline_freq_time[inst_freq_peaks] + # check if one list is empty and if so, skip to the next + # electrode because a chirp cannot be detected if one is empty + + one_feature_empty = ( + len(baseline_peak_timestamps) == 0 + or len(search_peak_timestamps) == 0 + or len(frequency_peak_timestamps) == 0 + ) - # check if one list is empty - if len(baseline_ts) == 0 or len(search_ts) == 0 or len(freq_ts) == 0: + if one_feature_empty: continue - current_chirps = group_timestamps( - [list(baseline_ts), list(search_ts), list(freq_ts)], 3, config.chirp_window_threshold) - # for checking if there are chirps on multiple electrodes - if len(current_chirps) == 0: + # group peaks across feature arrays but only if they + # occur in all 3 feature arrays + + sublists = [ + list(baseline_peak_timestamps), + list(search_peak_timestamps), + list(frequency_peak_timestamps), + ] + + singleelectrode_chirps = group_timestamps( + sublists=sublists, + at_least_in=3, + difference_threshold=config.chirp_window_threshold, + ) + + # check if there are chirps detected after grouping, continue + # with the loop if not + + if len(singleelectrode_chirps) == 0: continue - chirps_electrodes.append(current_chirps) + # append chirps from this electrode to the 
multielectrode list + multielectrode_chirps.append(singleelectrode_chirps) + + # only initialize the plotting buffer if chirps are detected + chirp_detected = ( + (el == config.number_electrodes - 1) + & (len(singleelectrode_chirps) > 0) + & (plot in ["show", "save"]) + ) - if (el == config.number_electrodes - 1) & \ - (len(current_chirps) > 0) & \ - (plot in ["show", "save"]): + if chirp_detected: logger.debug("Detected chirp, initialize buffer ...") # save data to Buffer buffer = PlotBuffer( config=config, - t0=t0, - dt=dt, - electrode=electrode, + t0=window_start_seconds, + dt=window_duration_seconds, + electrode=electrode_index, track_id=track_id, data=data, - time=time_oi, - baseline=baseline, + time=current_raw_time, + baseline_envelope_unfiltered=baseline_envelope_unfiltered, + baseline=baselineband, baseline_envelope=baseline_envelope, - baseline_peaks=baseline_peaks, - search=search, + baseline_peaks=baseline_peak_indices, + search_frequency=search_frequency, + search=searchband, + search_envelope_unfiltered=search_envelope_unfiltered, search_envelope=search_envelope, - search_peaks=search_peaks, - frequency_time=baseline_freq_time, - frequency=baseline_freq, - frequency_filtered=inst_freq_filtered, - frequency_peaks=inst_freq_peaks, + search_peaks=search_peak_indices, + frequency_time=baseline_frequency_time, + frequency=baseline_frequency, + frequency_filtered=baseline_frequency_filtered, + frequency_peaks=frequency_peak_indices, ) logger.debug("Buffer initialized!") logger.debug( - f"Processed all electrodes for fish {track_id} for this window, sorting chirps ...") + f"Processed all electrodes for fish {track_id} for this" + " window, sorting chirps ..." + ) - if len(chirps_electrodes) == 0: + # check if there are chirps detected in multiple electrodes and + # continue the loop if not + + if len(multielectrode_chirps) == 0: continue - the_real_chirps = group_timestamps(chirps_electrodes, 2, 0.05) + # validate multielectrode chirps, i.e. 
check if they are + detected in at least 'config.minimum_electrodes' electrodes + + multielectrode_chirps_validated = group_timestamps( + sublists=multielectrode_chirps, + at_least_in=config.minimum_electrodes, + difference_threshold=config.chirp_window_threshold, + ) - chirps.append(the_real_chirps) - fish_ids.append(track_id) + # add validated chirps to the list that tracks chirps across the + # rolling time windows - logger.debug('Found %d chirps, starting plotting ... ' % - len(the_real_chirps)) - if len(the_real_chirps) > 0: + multiwindow_chirps.append(multielectrode_chirps_validated) + multiwindow_ids.append(track_id) + + logger.info( + f"Found {len(multielectrode_chirps_validated)}" + f" chirps for fish {track_id} in this window!" + ) + # if chirps are detected and the plot flag is set, plot the + # chirps, otherwise try to delete the buffer if it exists + + if len(multielectrode_chirps_validated) > 0: try: - buffer.plot_buffer(the_real_chirps, plot) + buffer.plot_buffer(multielectrode_chirps_validated, plot) except NameError: pass else: @@ -669,29 +998,55 @@ def main(datapath: str, plot: str) -> None: except NameError: pass - chirps_new = [] - chirps_ids = [] - for tr in np.unique(fish_ids): - tr_index = np.asarray(fish_ids) == tr - ts = flatten(list(compress(chirps, tr_index))) - chirps_new.extend(ts) - chirps_ids.extend(list(np.ones_like(ts)*tr)) + # flatten list of lists containing chirps and create + # an array of fish ids that correspond to the chirps + + multiwindow_chirps_flat = [] + multiwindow_ids_flat = [] + for track_id in np.unique(multiwindow_ids): + + # get chirps for this fish and flatten the list + current_track_bool = np.asarray(multiwindow_ids) == track_id + current_track_chirps = flatten( + list(compress(multiwindow_chirps, current_track_bool)) + ) + + # add flattened chirps to the list + multiwindow_chirps_flat.extend(current_track_chirps) + multiwindow_ids_flat.extend( + list(np.ones_like(current_track_chirps) * track_id) + ) + + # 
purge duplicates, i.e. chirps that are very close to each other + # duplicates arise due to overlapping windows - # purge duplicates purged_chirps = [] - purged_chirps_ids = [] - for tr in np.unique(fish_ids): - tr_chirps = np.asarray(chirps_new)[np.asarray(chirps_ids) == tr] + purged_ids = [] + for track_id in np.unique(multiwindow_ids_flat): + tr_chirps = np.asarray(multiwindow_chirps_flat)[ + np.asarray(multiwindow_ids_flat) == track_id + ] if len(tr_chirps) > 0: tr_chirps_purged = purge_duplicates( - tr_chirps, config.chirp_window_threshold) + tr_chirps, config.chirp_window_threshold + ) purged_chirps.extend(list(tr_chirps_purged)) - purged_chirps_ids.extend(list(np.ones_like(tr_chirps_purged)*tr)) + purged_ids.extend(list(np.ones_like(tr_chirps_purged) * track_id)) + + # sort chirps by time + purged_chirps = np.asarray(purged_chirps) + purged_ids = np.asarray(purged_ids) + purged_ids = purged_ids[np.argsort(purged_chirps)] + purged_chirps = purged_chirps[np.argsort(purged_chirps)] - np.save(datapath + 'chirps.npy', purged_chirps) - np.save(datapath + 'chirps_ids.npy', purged_chirps_ids) + # save them into the data directory + np.save(datapath + "chirps.npy", purged_chirps) + np.save(datapath + "chirp_ids.npy", purged_ids) if __name__ == "__main__": + # datapath = "/home/weygoldt/Data/uni/chirpdetection/GP2023_chirp_detection/data/mount_data/2020-05-13-10_00/" datapath = "../data/2022-06-02-10_00/" + # datapath = "/home/weygoldt/Data/uni/efishdata/2016-colombia/fishgrid/2016-04-09-22_25/" + # datapath = "/home/weygoldt/Data/uni/chirpdetection/GP2023_chirp_detection/data/mount_data/2020-03-13-10_00/" main(datapath, plot="save") diff --git a/code/chirpdetector_conf.yml b/code/chirpdetector_conf.yml index 2c30fa7..2f4fc9a 100755 --- a/code/chirpdetector_conf.yml +++ b/code/chirpdetector_conf.yml @@ -1,48 +1,46 @@ +# directory setup dataroot: "../data/" outputdir: "../output/" # Duration and overlap of the analysis window in seconds -window: 5 +window: 10 overlap: 1 
edge: 0.25 # Number of electrodes to go over number_electrodes: 3 +minimum_electrodes: 2 -# Boundary for search frequency in Hz -search_boundary: 100 +# Search window bandwidth and minimal baseline bandwidth +minimal_bandwidth: 20 -# Cutoff frequency for envelope estimation by lowpass filter -envelope_cutoff: 25 +# Instantaneous frequency smoothing using a Gaussian kernel of this width +baseline_frequency_smoothing: 5 -# Cutoff frequency for envelope highpass filter -envelope_highpass_cutoff: 3 +# Baseline processing parameters +baseline_envelope_cutoff: 25 +baseline_envelope_bandpass_lowf: 4 +baseline_envelope_bandpass_highf: 100 +baseline_envelope_envelope_cutoff: 4 -# Cutoff frequency for envelope of envelope -envelope_envelope_cutoff: 5 +# search envelope processing parameters +search_envelope_cutoff: 5 # Instantaneous frequency bandpass filter cutoff frequencies -instantaneous_lowf: 15 -instantaneous_highf: 8000 +baseline_frequency_highpass_cutoff: 0.000005 +baseline_frequency_envelope_cutoff: 0.000005 -# Baseline envelope peak detection parameters -baseline_prominence_percentile: 90 - -# Search envelope peak detection parameters -search_prominence_percentile: 90 - -# Instantaneous frequency peak detection parameters -instantaneous_prominence_percentile: 90 +# peak detection parameters +prominence: 0.005 # search freq parameter -search_df_lower: 25 +search_df_lower: 20 search_df_upper: 100 search_res: 1 -search_freq_percentiles: - - 5 - - 95 +search_bandwidth: 10 default_search_freq: 50 +# Classify events as chirps if they are less than this time apart chirp_window_threshold: 0.05 diff --git a/code/modules/datahandling.py b/code/modules/datahandling.py index 1de68d8..a1e9f18 100644 --- a/code/modules/datahandling.py +++ b/code/modules/datahandling.py @@ -1,5 +1,78 @@ import numpy as np from typing import List, Any +from scipy.ndimage import gaussian_filter1d +from scipy.stats import gamma, norm + + +def scale01(data): + """ + Normalize data to [0, 1] + 
Parameters + ---------- + data : np.ndarray + Data to normalize. + + Returns + ------- + np.ndarray + Normalized data. + + """ + return (data - np.min(data)) / (np.max(data) - np.min(data)) + + +def instantaneous_frequency( + signal: np.ndarray, + samplerate: int, + smoothing_window: int, +) -> tuple[np.ndarray, np.ndarray]: + """ + Compute the instantaneous frequency of a signal that is approximately + sinusoidal and symmetric around 0. + + Parameters + ---------- + signal : np.ndarray + Signal to compute the instantaneous frequency from. + samplerate : int + Samplerate of the signal. + smoothing_window : int + Window size for the gaussian filter. + + Returns + ------- + tuple[np.ndarray, np.ndarray] + + """ + # calculate instantaneous frequency with zero crossings + roll_signal = np.roll(signal, shift=1) + time_signal = np.arange(len(signal)) / samplerate + period_index = np.arange(len(signal))[(roll_signal < 0) & (signal >= 0)][ + 1:-1 + ] + + upper_bound = np.abs(signal[period_index]) + lower_bound = np.abs(signal[period_index - 1]) + upper_time = np.abs(time_signal[period_index]) + lower_time = np.abs(time_signal[period_index - 1]) + + # create ratio + lower_ratio = lower_bound / (lower_bound + upper_bound) + + # apply to time delta + time_delta = upper_time - lower_time + true_zero = lower_time + lower_ratio * time_delta + + # create new time array + instantaneous_frequency_time = true_zero[:-1] + 0.5 * np.diff(true_zero) + + # compute frequency + instantaneous_frequency = gaussian_filter1d( + 1 / np.diff(true_zero), smoothing_window + ) + + return instantaneous_frequency_time, instantaneous_frequency def purge_duplicates( @@ -64,7 +137,7 @@ def purge_duplicates( def group_timestamps( - sublists: List[List[float]], n: int, threshold: float + sublists: List[List[float]], at_least_in: int, difference_threshold: float ) -> List[float]: """ Groups timestamps that are less than `difference_threshold` seconds apart from @@ -100,7 +173,7 @@ 
# Group timestamps that are less than difference_threshold seconds apart for i in range(1, len(timestamps)): - if timestamps[i] - timestamps[i - 1] < threshold: + if timestamps[i] - timestamps[i - 1] < difference_threshold: current_group.append(timestamps[i]) else: groups.append(current_group) @@ -111,7 +184,7 @@ # Retain only groups that contain at least n timestamps final_groups = [] for group in groups: - if len(group) >= n: + if len(group) >= at_least_in: final_groups.append(group) # Calculate the mean of each group @@ -137,6 +210,117 @@ def flatten(list: List[List[Any]]) -> List: return [item for sublist in list for item in sublist] +def causal_kde1d(spikes, time, width, shape=2): + """ + causal_kde1d computes a kernel density estimate using a causal kernel (i.e. exponential or gamma distribution). + A shape of 1 turns the gamma distribution into an exponential. + + Parameters + ---------- + spikes : array-like + spike times + time : array-like + sampling time + width : float + kernel width + shape : int, optional + shape of gamma distribution, by default 2 + + Returns + ------- + rate : array-like + instantaneous firing rate + """ + + # compute dt + dt = time[1] - time[0] + + # time on which to compute kernel: + tmax = 10 * width + + # kernel not wider than time + if 2 * tmax > time[-1] - time[0]: + tmax = 0.5 * (time[-1] - time[0]) + + # kernel time + ktime = np.arange(-tmax, tmax, dt) + + # gamma kernel centered in ktime: + kernel = gamma.pdf( + x=ktime, + a=shape, + loc=0, + scale=width, + ) + + # indices of spikes in time array: + indices = np.asarray((spikes - time[0]) / dt, dtype=int) + + # binary spike train: + brate = np.zeros(len(time)) + brate[indices[(indices >= 0) & (indices < len(time))]] = 1.0 + + # convolution with kernel: + rate = np.convolve(brate, kernel, mode="same") + + return rate + + +def acausal_kde1d(spikes, time, width): + """ + acausal_kde1d computes a kernel density estimate using an acausal kernel (i.e. 
a symmetric Gaussian distribution) of the given width. + + Parameters + ---------- + spikes : array-like + spike times + time : array-like + sampling time + width : float + kernel width + + Returns + ------- + rate : array-like + instantaneous firing rate + """ + + # compute dt + dt = time[1] - time[0] + + # time on which to compute kernel: + tmax = 10 * width + + # kernel not wider than time + if 2 * tmax > time[-1] - time[0]: + tmax = 0.5 * (time[-1] - time[0]) + + # kernel time + ktime = np.arange(-tmax, tmax, dt) + + # Gaussian kernel centered in ktime: + kernel = norm.pdf( + x=ktime, + loc=0, + scale=width, + ) + + # indices of spikes in time array: + indices = np.asarray((spikes - time[0]) / dt, dtype=int) + + # binary spike train: + brate = np.zeros(len(time)) + brate[indices[(indices >= 0) & (indices < len(time))]] = 1.0 + + # convolution with kernel: + rate = np.convolve(brate, kernel, mode="same") + + return rate + + if __name__ == "__main__": timestamps = [ diff --git a/code/modules/filters.py b/code/modules/filters.py index 5192cdc..e6d9896 100644 --- a/code/modules/filters.py +++ b/code/modules/filters.py @@ -3,8 +3,8 @@ import numpy as np def bandpass_filter( - data: np.ndarray, - rate: float, + signal: np.ndarray, + samplerate: float, lowf: float, highf: float, ) -> np.ndarray: @@ -12,7 +12,7 @@ Parameters ---------- - data : np.ndarray + signal : np.ndarray The data to be filtered rate : float The sampling rate @@ -26,21 +26,22 @@ np.ndarray The filtered data """ - sos = butter(2, (lowf, highf), "bandpass", fs=rate, output="sos") - fdata = sosfiltfilt(sos, data) - return fdata + sos = butter(2, (lowf, highf), "bandpass", fs=samplerate, output="sos") + filtered_signal = sosfiltfilt(sos, signal) + + return filtered_signal def highpass_filter( - data: np.ndarray, - rate: float, + signal: np.ndarray, + 
samplerate: float, cutoff: float, ) -> np.ndarray: """Highpass filter a signal. Parameters ---------- - data : np.ndarray + signal : np.ndarray The data to be filtered rate : float The sampling rate @@ -52,14 +53,15 @@ def highpass_filter( np.ndarray The filtered data """ - sos = butter(2, cutoff, "highpass", fs=rate, output="sos") - fdata = sosfiltfilt(sos, data) - return fdata + sos = butter(2, cutoff, "highpass", fs=samplerate, output="sos") + filtered_signal = sosfiltfilt(sos, signal) + + return filtered_signal def lowpass_filter( - data: np.ndarray, - rate: float, + signal: np.ndarray, + samplerate: float, cutoff: float ) -> np.ndarray: """Lowpass filter a signal. @@ -78,21 +80,25 @@ def lowpass_filter( np.ndarray The filtered data """ - sos = butter(2, cutoff, "lowpass", fs=rate, output="sos") - fdata = sosfiltfilt(sos, data) - return fdata + sos = butter(2, cutoff, "lowpass", fs=samplerate, output="sos") + filtered_signal = sosfiltfilt(sos, signal) + return filtered_signal -def envelope(data: np.ndarray, rate: float, freq: float) -> np.ndarray: + +def envelope(signal: np.ndarray, + samplerate: float, + cutoff_frequency: float + ) -> np.ndarray: """Calculate the envelope of a signal using a lowpass filter. 
Parameters ---------- - data : np.ndarray + signal : np.ndarray The signal to calculate the envelope of - rate : float + samplerate : float The sampling rate of the signal - freq : float + cutoff_frequency : float The cutoff frequency of the lowpass filter Returns @@ -100,6 +106,7 @@ np.ndarray The envelope of the signal """ - sos = butter(2, freq, "lowpass", fs=rate, output="sos") - envelope = np.sqrt(2) * sosfiltfilt(sos, np.abs(data)) + sos = butter(2, cutoff_frequency, "lowpass", fs=samplerate, output="sos") + envelope = np.sqrt(2) * sosfiltfilt(sos, np.abs(signal)) + return envelope diff --git a/code/modules/plotstyle.py b/code/modules/plotstyle.py index 9e382a7..2325f62 100644 --- a/code/modules/plotstyle.py +++ b/code/modules/plotstyle.py @@ -30,10 +30,14 @@ def PlotStyle() -> None: purple = "#cba6f7" pink = "#f5c2e7" lavender = "#b4befe" + gblue1 = "#8cb8ff" + gblue2 = "#7cdcdc" + gblue3 = "#82e896" @classmethod def lims(cls, track1, track2): - """Helper function to get frequency y axis limits from two fundamental frequency tracks. + """Helper function to get frequency y axis limits from two + fundamental frequency tracks. 
Args: track1 (array): First track @@ -91,6 +95,16 @@ def PlotStyle() -> None: ax.tick_params(left=False, labelleft=False) ax.patch.set_visible(False) + @classmethod + def hide_xax(cls, ax): + ax.xaxis.set_visible(False) + ax.spines["bottom"].set_visible(False) + + @classmethod + def hide_yax(cls, ax): + ax.yaxis.set_visible(False) + ax.spines["left"].set_visible(False) + @classmethod def set_boxplot_color(cls, bp, color): plt.setp(bp["boxes"], color=color) @@ -216,8 +230,8 @@ def PlotStyle() -> None: plt.rc("figure", titlesize=BIGGER_SIZE) # fontsize of the figure title plt.rcParams["image.cmap"] = 'cmo.haline' - # plt.rcParams["axes.xmargin"] = 0.1 - # plt.rcParams["axes.ymargin"] = 0.15 + plt.rcParams["axes.xmargin"] = 0.05 + plt.rcParams["axes.ymargin"] = 0.1 plt.rcParams["axes.titlelocation"] = "left" plt.rcParams["axes.titlesize"] = BIGGER_SIZE # plt.rcParams["axes.titlepad"] = -10 @@ -230,9 +244,9 @@ def PlotStyle() -> None: plt.rcParams["legend.borderaxespad"] = 0.5 plt.rcParams["legend.fancybox"] = False - # specify the custom font to use - plt.rcParams["font.family"] = "sans-serif" - plt.rcParams["font.sans-serif"] = "Helvetica Now Text" + # # specify the custom font to use + # plt.rcParams["font.family"] = "sans-serif" + # plt.rcParams["font.sans-serif"] = "Helvetica Now Text" # dark mode modifications plt.rcParams["boxplot.flierprops.color"] = white @@ -271,7 +285,7 @@ def PlotStyle() -> None: plt.rcParams["ytick.color"] = gray # color of the ticks plt.rcParams["grid.color"] = dark_gray # grid color plt.rcParams["figure.facecolor"] = black # figure face color - plt.rcParams["figure.edgecolor"] = "#555169" # figure edge color + plt.rcParams["figure.edgecolor"] = black # figure edge color plt.rcParams["savefig.facecolor"] = black # figure face color when saving return style diff --git a/code/plot_introduction_specs.py b/code/plot_introduction_specs.py new file mode 100644 index 0000000..3f8395e --- /dev/null +++ b/code/plot_introduction_specs.py @@ -0,0 
+1,121 @@ +import numpy as np +import matplotlib.pyplot as plt +from thunderfish.powerspectrum import spectrogram, decibel + +from modules.filehandling import LoadData +from modules.datahandling import instantaneous_frequency +from modules.filters import bandpass_filter +from modules.plotstyle import PlotStyle + +ps = PlotStyle() + + +def main(): + + # Load data + datapath = "../data/2022-06-02-10_00/" + data = LoadData(datapath) + + # good chirp times for data: 2022-06-02-10_00 + window_start_seconds = 3 * 60 * 60 + 6 * 60 + 43.5 + 9 + 6.25 + window_start_index = int(window_start_seconds * data.raw_rate) + window_duration_seconds = 0.2 + window_duration_index = int(window_duration_seconds * data.raw_rate) + + timescaler = 1000 + + raw = data.raw[window_start_index:window_start_index + + window_duration_index, 10] + + fig, (ax1, ax2, ax3) = plt.subplots( + 3, 1, figsize=(12 * ps.cm, 10*ps.cm), sharex=True, sharey=True) + + # plot instantaneous frequency + filtered1 = bandpass_filter( + signal=raw, lowf=750, highf=1200, samplerate=data.raw_rate) + filtered2 = bandpass_filter( + signal=raw, lowf=550, highf=700, samplerate=data.raw_rate) + + freqtime1, freq1 = instantaneous_frequency( + filtered1, data.raw_rate, smoothing_window=3) + freqtime2, freq2 = instantaneous_frequency( + filtered2, data.raw_rate, smoothing_window=3) + + ax1.plot(freqtime1*timescaler, freq1, color=ps.gblue1, + lw=2, label=f"fish 1, {np.median(freq1):.0f} Hz") + ax1.plot(freqtime2*timescaler, freq2, color=ps.gblue3, + lw=2, label=f"fish 2, {np.median(freq2):.0f} Hz") + ax1.legend(bbox_to_anchor=(0, 1.02, 1, 0.2), loc="lower center", + mode="normal", borderaxespad=0, ncol=2) + ps.hide_xax(ax1) + + # plot fine spectrogram + spec_power, spec_freqs, spec_times = spectrogram( + raw, + ratetime=data.raw_rate, + freq_resolution=150, + overlap_frac=0.2, + ) + + ylims = [300, 1200] + fmask = np.zeros(spec_freqs.shape, dtype=bool) + fmask[(spec_freqs > ylims[0]) & (spec_freqs < ylims[1])] = True + + ax2.imshow( + 
decibel(spec_power[fmask, :]), + extent=[ + spec_times[0]*timescaler, + spec_times[-1]*timescaler, + spec_freqs[fmask][0], + spec_freqs[fmask][-1], + ], + aspect="auto", + origin="lower", + interpolation="gaussian", + alpha=1, + ) + ps.hide_xax(ax2) + + # plot coarse spectrogram + spec_power, spec_freqs, spec_times = spectrogram( + raw, + ratetime=data.raw_rate, + freq_resolution=10, + overlap_frac=0.3, + ) + fmask = np.zeros(spec_freqs.shape, dtype=bool) + fmask[(spec_freqs > ylims[0]) & (spec_freqs < ylims[1])] = True + ax3.imshow( + decibel(spec_power[fmask, :]), + extent=[ + spec_times[0]*timescaler, + spec_times[-1]*timescaler, + spec_freqs[fmask][0], + spec_freqs[fmask][-1], + ], + aspect="auto", + origin="lower", + interpolation="gaussian", + alpha=1, + ) + # ps.hide_xax(ax3) + + ax3.set_xlabel("time [ms]") + ax2.set_ylabel("frequency [Hz]") + + ax1.set_yticks(np.arange(400, 1201, 400)) + ax1.spines.left.set_bounds((400, 1200)) + ax2.set_yticks(np.arange(400, 1201, 400)) + ax2.spines.left.set_bounds((400, 1200)) + ax3.set_yticks(np.arange(400, 1201, 400)) + ax3.spines.left.set_bounds((400, 1200)) + + plt.subplots_adjust(left=0.17, right=0.98, top=0.9, + bottom=0.14, hspace=0.35) + + plt.savefig('../poster/figs/introplot.pdf') + plt.show() + + +if __name__ == '__main__': + main() diff --git a/poster/figs/Untitled.png b/poster/figs/Untitled.png new file mode 100644 index 0000000..3259ce2 Binary files /dev/null and b/poster/figs/Untitled.png differ diff --git a/poster/figs/algorithm.pdf b/poster/figs/algorithm.pdf new file mode 100644 index 0000000..2e2c453 Binary files /dev/null and b/poster/figs/algorithm.pdf differ diff --git a/poster/figs/introplot.pdf b/poster/figs/introplot.pdf new file mode 100644 index 0000000..cbead3e Binary files /dev/null and b/poster/figs/introplot.pdf differ diff --git a/poster/figs/logo.png b/poster/figs/logo.png new file mode 100644 index 0000000..234652f Binary files /dev/null and b/poster/figs/logo.png differ diff --git 
a/poster/figs/logo.svg b/poster/figs/logo.svg new file mode 100644 index 0000000..b34ed6c --- /dev/null +++ b/poster/figs/logo.svg @@ -0,0 +1,1184 @@ + + + + diff --git a/poster/figs/placeholder1.png b/poster/figs/placeholder1.png new file mode 100644 index 0000000..2dc3349 Binary files /dev/null and b/poster/figs/placeholder1.png differ diff --git a/poster/main.pdf b/poster/main.pdf index 06827b3..4c1a7c1 100644 Binary files a/poster/main.pdf and b/poster/main.pdf differ diff --git a/poster/main.tex b/poster/main.tex index da4bff1..ca20bb3 100644 --- a/poster/main.tex +++ b/poster/main.tex @@ -7,65 +7,101 @@ blockverticalspace=2mm, colspace=20mm, subcolspace=0mm]{tikzposter} %Default val \begin{document} \renewcommand{\baselinestretch}{1} -\title{\parbox{1900pt}{A dark template to make colorful figures pop}} +\title{\parbox{1900pt}{Pushing the limits of time-frequency uncertainty in the +detection of transient communication signals in weakly electric fish}} \author{Sina Prause, Alexander Wendt, Patrick Weygoldt} -\institute{Supervised by Till Raab \& Jan Benda} +\institute{Supervised by Till Raab \& Jan Benda, Neurothology Group, +University of Tübingen} \usetitlestyle[]{sampletitle} \maketitle \renewcommand{\baselinestretch}{1.4} \begin{columns} -\column{0.3} +\column{0.5} \myblock[TranspBlock]{Introduction}{ - \lipsum[1][1-5] + \begin{minipage}[t]{0.55\linewidth} + The time-frequency tradeoff makes reliable signal detecion and simultaneous + sender identification of freely interacting individuals impossible. + This profoundly limits our current understanding of chirps to experiments + with single - or physically separated - individuals. 
+    \end{minipage} \hfill
+    \begin{minipage}[t]{0.40\linewidth}
+    \vspace{-1.5cm}
     \begin{tikzfigure}[]
-        \label{griddrawing}
-        \includegraphics[width=\linewidth]{example-image-a}
+        \label{tradeoff}
+        \includegraphics[width=\linewidth]{figs/introplot}
     \end{tikzfigure}
+    \end{minipage}
 }
 
-\myblock[TranspBlock]{Methods}{
-    \begin{tikzfigure}[]
-        \label{detector}
-        \includegraphics[width=\linewidth]{example-image-b}
-    \end{tikzfigure}
+\myblock[TranspBlock]{A chirp detection algorithm}{
+    \begin{tikzfigure}[]
+        \label{modulations}
+        \includegraphics[width=\linewidth]{figs/algorithm}
+    \end{tikzfigure}
 }
 
-\column{0.4}
-\myblock[TranspBlock]{Results}{
-    \lipsum[3][1-5]
+\column{0.5}
+\myblock[TranspBlock]{Chirps and dyadic competitions}{
+    \begin{minipage}[t]{0.7\linewidth}
     \begin{tikzfigure}[]
         \label{modulations}
-        \includegraphics[width=\linewidth]{example-image-c}
+        \includegraphics[width=\linewidth]{figs/placeholder1}
     \end{tikzfigure}
-}
+    \end{minipage} \hfill
+    \begin{minipage}[t]{0.25\linewidth}
+    \lipsum[3][1-3]
+    \end{minipage}
 
-\myblock[TranspBlock]{More Stuff}{
-    \lipsum[3][1-9]
-}
+    \begin{minipage}[t]{0.7\linewidth}
+    \begin{tikzfigure}[]
+        \label{modulations}
+        \includegraphics[width=\linewidth]{figs/placeholder1}
+    \end{tikzfigure}
+    \end{minipage} \hfill
+    \begin{minipage}[t]{0.25\linewidth}
+    \lipsum[3][1-3]
+    \end{minipage}
 
-\column{0.3}
-\myblock[TranspBlock]{More Results}{
+    \begin{minipage}[t]{0.7\linewidth}
     \begin{tikzfigure}[]
-        \label{results}
-        \includegraphics[width=\linewidth]{example-image-a}
+        \label{modulations}
+        \includegraphics[width=\linewidth]{figs/placeholder1}
     \end{tikzfigure}
+    \end{minipage} \hfill
+    \begin{minipage}[t]{0.25\linewidth}
+    \lipsum[3][1-3]
+    \end{minipage}
 
-    \begin{multicols}{2}
-        \lipsum[5][1-8]
-    \end{multicols}
-    \vspace{-1cm}
 }
 
 \myblock[TranspBlock]{Conclusion}{
-    \begin{itemize}
-    \setlength\itemsep{0.5em}
-        \item \lipsum[1][1]
-        \item \lipsum[1][1]
-        \item \lipsum[1][1]
-    \end{itemize}
-    \vspace{0.2cm}
-    }
+    \lipsum[3][1-9]
+}
+
+% \column{0.3}
+% \myblock[TranspBlock]{More Results}{
+%     \begin{tikzfigure}[]
+%         \label{results}
+%         \includegraphics[width=\linewidth]{example-image-a}
+%     \end{tikzfigure}
+
+%     \begin{multicols}{2}
+%         \lipsum[5][1-8]
+%     \end{multicols}
+%     \vspace{-1cm}
+% }
+
+% \myblock[TranspBlock]{Conclusion}{
+%     \begin{itemize}
+%     \setlength\itemsep{0.5em}
+%         \item \lipsum[1][1]
+%         \item \lipsum[1][1]
+%         \item \lipsum[1][1]
+%     \end{itemize}
+%     \vspace{0.2cm}
+% }
 \end{columns}
 
 \node[
@@ -78,6 +114,6 @@ blockverticalspace=2mm, colspace=20mm, subcolspace=0mm]{tikzposter} %Default val
     fill=boxes,
     color=boxes,
 ] at (-0.51\paperwidth,-43.5) {
-    \textcolor{text}{\normalsize Contact: name.surname@student.uni-tuebingen.de}};
+\textcolor{text}{\normalsize Contact: \{name\}.\{surname\}@student.uni-tuebingen.de}};
 \end{document}
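The poster title and the new introduction text both lean on the time-frequency tradeoff, which the two spectrogram settings in `plot_introduction_specs.py` (`freq_resolution=150` vs `freq_resolution=10`) illustrate: sharpening frequency necessarily smears time, and vice versa. For reference, the underlying Gabor limit (not spelled out in the diff itself) can be stated as:

```latex
% Time-frequency uncertainty (Gabor limit): with \sigma_t and \sigma_f the
% standard deviations of a signal's spread in time and frequency,
\begin{equation*}
    \sigma_t \, \sigma_f \ge \frac{1}{4\pi}
\end{equation*}
```

This is why transient chirps are visible as sharp excursions in the instantaneous-frequency trace but blur into broadband smears in the high-frequency-resolution spectrogram.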
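A note for reviewers: the new `plot_introduction_specs.py` imports `bandpass_filter` and `instantaneous_frequency` from the repository's `modules` package, which is not part of this diff. A minimal, hypothetical sketch of what such helpers could look like, with signatures mirroring the keyword arguments used in the script (the actual implementations in `modules/filters.py` and `modules/datahandling.py` may differ):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert


def bandpass_filter(signal, samplerate, lowf, highf):
    """Zero-phase Butterworth bandpass between lowf and highf (Hz).

    Hypothetical stand-in for modules.filters.bandpass_filter.
    """
    sos = butter(4, [lowf, highf], btype="bandpass",
                 fs=samplerate, output="sos")
    return sosfiltfilt(sos, signal)


def instantaneous_frequency(signal, samplerate, smoothing_window=3):
    """Instantaneous frequency from the analytic signal's phase derivative.

    Hypothetical stand-in for modules.datahandling.instantaneous_frequency.
    """
    phase = np.unwrap(np.angle(hilbert(signal)))
    freq = np.diff(phase) / (2 * np.pi) * samplerate
    # light moving-average smoothing, mirroring the smoothing_window argument
    kernel = np.ones(smoothing_window) / smoothing_window
    freq = np.convolve(freq, kernel, mode="same")
    time = np.arange(len(freq)) / samplerate
    return time, freq
```

Filtering each fish's EOD band first, then taking the Hilbert-transform phase derivative, is what lets the script trace each individual's frequency at sample resolution, unlike the spectrograms below it.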