Merge branch 'master' into chirp_simulation

This commit is contained in:
wendtalexander 2023-01-25 23:31:01 +01:00
commit 7b74a7c024
46 changed files with 7526 additions and 611 deletions

4
.gitignore vendored

@ -1,9 +1,10 @@
# Created by https://www.toptal.com/developers/gitignore/api/python,visualstudiocode
# Created by https://www.toptal.com/developers/gitignore/api/python,visualstudiocode
# Edit at https://www.toptal.com/developers/gitignore?templates=python,visualstudiocode
# Own stuff
data
env
output
# Mac Stuff
*.DS_Store
@ -13,6 +14,7 @@ env
__pycache__/
*.py[cod]
*$py.class
poster/main.pdf
# C extensions
*.so

312
README.md

@ -1,64 +1,248 @@
# Chirp detection - GP2023
## Git-Repository and commands
- Go to the [Bendalab Git-Server](https://whale.am28.uni-tuebingen.de/git/) (https://whale.am28.uni-tuebingen.de/git/)
- Create your own account (and tell me ;D)
* I'll invite you to the repository
- Clone the repository
```sh
git clone https://whale.am28.uni-tuebingen.de/git/raab/GP2023_chirp_detection.git
```
## Basic git commands
- pull changes in git
```shell
git pull origin <branch>
```
- commit changes
```shell
git commit -m '<explanation>' file # commit one file
git commit -a -m '<explanation>' # commit all files
```
- push commits
```shell
git push origin <branch>
```
## Branches
Use branches to work on specific topics (e.g. 'algorithm', 'analysis', 'writing', or even more specific ones) and merge
them into the master branch once they work up to your expectations.
The "master" branch should always contain a working/correct version of your project.
- Create/change into branches
```shell
# list all branches (highlight active branch)
git branch -a
# switch into existing
git checkout <existing branch>
# switch into new branch
git checkout master
git checkout -b <new branch>
```
- Re-merging with master branch
1) get the current version of master and integrate it into your branch
```shell
git checkout master
git pull origin master
git checkout <branch>
git rebase master
```
This resets your branch to the fork point and replays all commits of the current master before adding the commits of your
branch. You may have to resolve potential conflicts. Afterwards, commit the corrected version and push it to your branch.
2) Update the master branch
- correct way: merge your branch into master and push
```shell
git checkout master
git merge <branch>
git push origin master
```
<!-- Improved compatibility of back to top link: See: https://github.com/othneildrew/Best-README-Template/pull/73 -->
<a name="readme-top"></a>
<!--
*** Thanks for checking out the Best-README-Template. If you have a suggestion
*** that would make this better, please fork the repo and create a pull request
*** or simply open an issue with the tag "enhancement".
*** Don't forget to give the project a star!
*** Thanks again! Now go create something AMAZING! :D
-->
<!-- PROJECT SHIELDS -->
<!--
*** I'm using markdown "reference style" links for readability.
*** Reference links are enclosed in brackets [ ] instead of parentheses ( ).
*** See the bottom of this document for the declaration of the reference variables
*** for contributors-url, forks-url, etc. This is an optional, concise syntax you may use.
*** https://www.markdownguide.org/basic-syntax/#reference-style-links
-->
<!-- [![Contributors][contributors-shield]][contributors-url] -->
<!-- [![Forks][forks-shield]][forks-url] -->
<!-- [![Stargazers][stars-shield]][stars-url] -->
<!-- [![Issues][issues-shield]][issues-url] -->
<!-- [![MIT License][license-shield]][license-url] -->
<!-- [![LinkedIn][linkedin-shield]][linkedin-url] -->
<!-- PROJECT LOGO -->
<br />
<div align="center">
<a href="https://github.com/github_username/repo_name">
<img src="assets/logo.png" alt="Logo" width="150" height="100">
</a>
<h3 align="center">chirpdetector</h3>
<p align="center">
An algorithm to detect the chirps of weakly electric fish.
<br />
<a href="https://github.com/github_username/repo_name"><strong>Explore the docs »</strong></a>
<br />
<br />
<a href="https://github.com/github_username/repo_name">View Demo</a>
·
<a href="https://github.com/github_username/repo_name/issues">Report Bug</a>
·
<a href="https://github.com/github_username/repo_name/issues">Request Feature</a>
</p>
</div>
<!-- TABLE OF CONTENTS -->
<details>
<summary>Table of Contents</summary>
<ol>
<li>
<a href="#about-the-project">About The Project</a>
<ul>
<li><a href="#built-with">Built With</a></li>
</ul>
</li>
<li>
<a href="#getting-started">Getting Started</a>
<ul>
<li><a href="#prerequisites">Prerequisites</a></li>
<li><a href="#installation">Installation</a></li>
</ul>
</li>
<li><a href="#usage">Usage</a></li>
<li><a href="#roadmap">Roadmap</a></li>
<li><a href="#contributing">Contributing</a></li>
<li><a href="#license">License</a></li>
<li><a href="#contact">Contact</a></li>
<li><a href="#acknowledgments">Acknowledgments</a></li>
</ol>
</details>
<!-- ABOUT THE PROJECT -->
## About The Project
[![Product Name Screen Shot][product-screenshot]](https://example.com)
Here's a blank template to get started: To avoid retyping too much info. Do a search and replace with your text editor for the following: `github_username`, `repo_name`, `twitter_handle`, `linkedin_username`, `email_client`, `email`, `project_title`, `project_description`
<p align="right">(<a href="#readme-top">back to top</a>)</p>
<!-- ### Built With -->
<!-- * [![Next][Next.js]][Next-url] -->
<!-- * [![React][React.js]][React-url] -->
<!-- * [![Vue][Vue.js]][Vue-url] -->
<!-- * [![Angular][Angular.io]][Angular-url] -->
<!-- * [![Svelte][Svelte.dev]][Svelte-url] -->
<!-- * [![Laravel][Laravel.com]][Laravel-url] -->
<!-- * [![Bootstrap][Bootstrap.com]][Bootstrap-url] -->
<!-- * [![JQuery][JQuery.com]][JQuery-url] -->
<p align="right">(<a href="#readme-top">back to top</a>)</p>
<!-- GETTING STARTED -->
## Getting Started
This is an example of how you may give instructions on setting up your project locally.
To get a local copy up and running follow these simple example steps.
<!-- ### Prerequisites -->
<!-- This is an example of how to list things you need to use the software and how to install them. -->
<!-- * npm -->
<!-- ```sh -->
<!-- npm install npm@latest -g -->
<!-- ``` -->
### Installation
1. Get a free API Key at [https://example.com](https://example.com)
2. Clone the repo
```sh
git clone https://github.com/github_username/repo_name.git
```
3. Install NPM packages
```sh
npm install
```
4. Enter your API in `config.js`
```js
const API_KEY = 'ENTER YOUR API';
```
<p align="right">(<a href="#readme-top">back to top</a>)</p>
<!-- USAGE EXAMPLES -->
## Usage
Use this space to show useful examples of how a project can be used. Additional screenshots, code examples and demos work well in this space. You may also link to more resources.
_For more examples, please refer to the [Documentation](https://example.com)_
<p align="right">(<a href="#readme-top">back to top</a>)</p>
<!-- ROADMAP -->
## Roadmap
- [ ] Feature 1
- [ ] Feature 2
- [ ] Feature 3
- [ ] Nested Feature
See the [open issues](https://github.com/github_username/repo_name/issues) for a full list of proposed features (and known issues).
<p align="right">(<a href="#readme-top">back to top</a>)</p>
<!-- <!-1- CONTRIBUTING -1-> -->
<!-- ## Contributing -->
<!-- Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are **greatly appreciated**. -->
<!-- If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". -->
<!-- Don't forget to give the project a star! Thanks again! -->
<!-- 1. Fork the Project -->
<!-- 2. Create your Feature Branch (`git checkout -b feature/AmazingFeature`) -->
<!-- 3. Commit your Changes (`git commit -m 'Add some AmazingFeature'`) -->
<!-- 4. Push to the Branch (`git push origin feature/AmazingFeature`) -->
<!-- 5. Open a Pull Request -->
<!-- <p align="right">(<a href="#readme-top">back to top</a>)</p> -->
<!-- <!-1- LICENSE -1-> -->
<!-- ## License -->
<!-- Distributed under the MIT License. See `LICENSE.txt` for more information. -->
<!-- <p align="right">(<a href="#readme-top">back to top</a>)</p> -->
<!-- CONTACT -->
## Contact
Your Name - [@twitter_handle](https://twitter.com/twitter_handle) - email@email_client.com
Project Link: [https://github.com/github_username/repo_name](https://github.com/github_username/repo_name)
<p align="right">(<a href="#readme-top">back to top</a>)</p>
<!-- ACKNOWLEDGMENTS -->
## Acknowledgments
* []()
* []()
* []()
<p align="right">(<a href="#readme-top">back to top</a>)</p>
<!-- MARKDOWN LINKS & IMAGES -->
<!-- https://www.markdownguide.org/basic-syntax/#reference-style-links -->
[contributors-shield]: https://img.shields.io/github/contributors/github_username/repo_name.svg?style=for-the-badge
[contributors-url]: https://github.com/github_username/repo_name/graphs/contributors
[forks-shield]: https://img.shields.io/github/forks/github_username/repo_name.svg?style=for-the-badge
[forks-url]: https://github.com/github_username/repo_name/network/members
[stars-shield]: https://img.shields.io/github/stars/github_username/repo_name.svg?style=for-the-badge
[stars-url]: https://github.com/github_username/repo_name/stargazers
[issues-shield]: https://img.shields.io/github/issues/github_username/repo_name.svg?style=for-the-badge
[issues-url]: https://github.com/github_username/repo_name/issues
[license-shield]: https://img.shields.io/github/license/github_username/repo_name.svg?style=for-the-badge
[license-url]: https://github.com/github_username/repo_name/blob/master/LICENSE.txt
[linkedin-shield]: https://img.shields.io/badge/-LinkedIn-black.svg?style=for-the-badge&logo=linkedin&colorB=555
[linkedin-url]: https://linkedin.com/in/linkedin_username
[product-screenshot]: images/screenshot.png
[Next.js]: https://img.shields.io/badge/next.js-000000?style=for-the-badge&logo=nextdotjs&logoColor=white
[Next-url]: https://nextjs.org/
[React.js]: https://img.shields.io/badge/React-20232A?style=for-the-badge&logo=react&logoColor=61DAFB
[React-url]: https://reactjs.org/
[Vue.js]: https://img.shields.io/badge/Vue.js-35495E?style=for-the-badge&logo=vuedotjs&logoColor=4FC08D
[Vue-url]: https://vuejs.org/
[Angular.io]: https://img.shields.io/badge/Angular-DD0031?style=for-the-badge&logo=angular&logoColor=white
[Angular-url]: https://angular.io/
[Svelte.dev]: https://img.shields.io/badge/Svelte-4A4A55?style=for-the-badge&logo=svelte&logoColor=FF3E00
[Svelte-url]: https://svelte.dev/
[Laravel.com]: https://img.shields.io/badge/Laravel-FF2D20?style=for-the-badge&logo=laravel&logoColor=white
[Laravel-url]: https://laravel.com
[Bootstrap.com]: https://img.shields.io/badge/Bootstrap-563D7C?style=for-the-badge&logo=bootstrap&logoColor=white
[Bootstrap-url]: https://getbootstrap.com
[JQuery.com]: https://img.shields.io/badge/jQuery-0769AD?style=for-the-badge&logo=jquery&logoColor=white
[JQuery-url]: https://jquery.com

64
README1.md Normal file

@ -0,0 +1,64 @@
# Chirp detection - GP2023
## Git-Repository and commands
- Go to the [Bendalab Git-Server](https://whale.am28.uni-tuebingen.de/git/) (https://whale.am28.uni-tuebingen.de/git/)
- Create your own account (and tell me ;D)
* I'll invite you to the repository
- Clone the repository
```sh
git clone https://whale.am28.uni-tuebingen.de/git/raab/GP2023_chirp_detection.git
```
## Basic git commands
- pull changes in git
```shell
git pull origin <branch>
```
- commit changes
```shell
git commit -m '<explanation>' file # commit one file
git commit -a -m '<explanation>' # commit all files
```
- push commits
```shell
git push origin <branch>
```
## Branches
Use branches to work on specific topics (e.g. 'algorithm', 'analysis', 'writing', or even more specific ones) and merge
them into the master branch once they work up to your expectations.
The "master" branch should always contain a working/correct version of your project.
- Create/change into branches
```shell
# list all branches (highlight active branch)
git branch -a
# switch into existing
git checkout <existing branch>
# switch into new branch
git checkout master
git checkout -b <new branch>
```
- Re-merging with master branch
1) get the current version of master and integrate it into your branch
```shell
git checkout master
git pull origin master
git checkout <branch>
git rebase master
```
This resets your branch to the fork point and replays all commits of the current master before adding the commits of your
branch. You may have to resolve potential conflicts. Afterwards, commit the corrected version and push it to your branch.
2) Update the master branch
- correct way: merge your branch into master and push
```shell
git checkout master
git merge <branch>
git push origin master
```

BIN
assets/logo.png Normal file

Binary file not shown.

Size: 40 KiB

1184
assets/logo.svg Normal file

File diff suppressed because it is too large

Size: 84 KiB

View File

@ -1,16 +1,21 @@
from pathlib import Path
import os
import os
import numpy as np
import matplotlib.pyplot as plt
from IPython import embed
from pandas import read_csv
from modules.logger import makeLogger
from scipy.ndimage import gaussian_filter1d
logger = makeLogger(__name__)
class Behavior:
"""Load behavior data from csv file as class attributes
Attributes
----------
behavior: 0: chasing onset, 1: chasing offset, 2: physical contact
behavior_type:
behavioral_category:
comment_start:
@ -20,22 +25,36 @@ class Behavior:
media_file:
observation_date:
observation_id:
start_s:
stop_s:
start_s: start time of the event in seconds
stop_s: stop time of the event in seconds
total_length:
"""
def __init__(self, datapath: str) -> None:
csv_file = str(sorted(Path(datapath).glob('**/*.csv'))[0])
self.dataframe = read_csv(csv_file, delimiter=',')
for key in self.dataframe:
def __init__(self, folder_path: str) -> None:
LED_on_time_BORIS = np.load(os.path.join(folder_path, 'LED_on_time.npy'), allow_pickle=True)
self.time = np.load(os.path.join(folder_path, "times.npy"), allow_pickle=True)
csv_filename = [f for f in os.listdir(folder_path) if f.endswith('.csv')][0] # check if there are more than one csv file
self.dataframe = read_csv(os.path.join(folder_path, csv_filename))
self.chirps = np.load(os.path.join(folder_path, 'chirps.npy'), allow_pickle=True)
self.chirps_ids = np.load(os.path.join(folder_path, 'chirps_ids.npy'), allow_pickle=True)
for k, key in enumerate(self.dataframe.keys()):
key = key.lower()
if ' ' in key:
new_key = key.replace(' ', '_')
if '(' in new_key:
new_key = new_key.replace('(', '')
new_key = new_key.replace(')', '')
new_key = new_key.lower()
setattr(self, new_key, np.array(self.dataframe[key]))
key = key.replace(' ', '_')
if '(' in key:
key = key.replace('(', '')
key = key.replace(')', '')
setattr(self, key, np.array(self.dataframe[self.dataframe.keys()[k]]))
last_LED_t_BORIS = LED_on_time_BORIS[-1]
real_time_range = self.time[-1] - self.time[0]
factor = 1.034141
shift = last_LED_t_BORIS - real_time_range * factor
self.start_s = (self.start_s - shift) / factor
self.stop_s = (self.stop_s - shift) / factor
"""
1 - chasing onset
@ -64,12 +83,191 @@ temporal encoding needs to be corrected ... not exactly 25FPS.
behavior = data['Behavior']
"""
def correct_chasing_events(
category: np.ndarray,
timestamps: np.ndarray
) -> tuple[np.ndarray, np.ndarray]:
onset_ids = np.arange(
len(category))[category == 0]
offset_ids = np.arange(
len(category))[category == 1]
# Check whether on- or offset is longer and calculate length difference
if len(onset_ids) > len(offset_ids):
len_diff = len(onset_ids) - len(offset_ids)
longer_array = onset_ids
shorter_array = offset_ids
logger.info(f'Onsets are greater than offsets by {len_diff}')
elif len(onset_ids) < len(offset_ids):
len_diff = len(offset_ids) - len(onset_ids)
longer_array = offset_ids
shorter_array = onset_ids
logger.info(f'Offsets are greater than onsets by {len_diff}')
elif len(onset_ids) == len(offset_ids):
logger.info('Chasing events are equal')
return category, timestamps
# Correct the wrong chasing events; delete double events
wrong_ids = []
for i in range(len(longer_array)-(len_diff+1)):
if (shorter_array[i] > longer_array[i]) & (shorter_array[i] < longer_array[i+1]):
pass
else:
wrong_ids.append(longer_array[i])
longer_array = np.delete(longer_array, i)
category = np.delete(
category, wrong_ids)
timestamps = np.delete(
timestamps, wrong_ids)
return category, timestamps
def event_triggered_chirps(
event: np.ndarray,
chirps:np.ndarray,
time_before_event: int,
time_after_event: int
)-> tuple[np.ndarray, np.ndarray]:
event_chirps = [] # chirps that are in specified window around event
centered_chirps = [] # timestamps of chirps around event centered on the event timepoint
for event_timestamp in event:
start = event_timestamp - time_before_event # timepoint of window start
stop = event_timestamp + time_after_event # timepoint of window ending
chirps_around_event = [c for c in chirps if (c >= start) & (c <= stop)] # get chirps that are in a -5 to +5 sec window around event
event_chirps.append(chirps_around_event)
if len(chirps_around_event) == 0:
continue
else:
centered_chirps.append(chirps_around_event - event_timestamp)
centered_chirps = np.concatenate(centered_chirps, axis=0) # convert list of arrays to one array for plotting
return event_chirps, centered_chirps
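# Minimal usage sketch (added for illustration, not part of the original
# script): with synthetic timestamps, event_triggered_chirps collects the
# chirps falling into the window around each event and returns them centered
# on the event time.
# example_events = np.array([100.0, 200.0])
# example_chirps = np.array([98.0, 101.5, 150.0, 203.0])
# _, example_centered = event_triggered_chirps(example_events, example_chirps, 5, 5)
# # example_centered -> array([-2. , 1.5, 3. ])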
def main(datapath: str):
# behabvior is pandas dataframe with all the data
behavior = Behavior(datapath)
# behavior is pandas dataframe with all the data
bh = Behavior(datapath)
# chirps are not sorted in time (presumably due to prior groupings)
# get and sort chirps and corresponding fish_ids of the chirps
chirps = bh.chirps[np.argsort(bh.chirps)]
chirps_fish_ids = bh.chirps_ids[np.argsort(bh.chirps)]
category = bh.behavior
timestamps = bh.start_s
# Correct for doubles in chasing on- and offsets to get the right on-/offset pairs
# Get rid of tracking faults (two onsets or two offsets after another)
category, timestamps = correct_chasing_events(category, timestamps)
# split categories
chasing_onset = timestamps[category == 0]
chasing_offset = timestamps[category == 1]
physical_contact = timestamps[category == 2]
# First overview plot
fig1, ax1 = plt.subplots()
ax1.scatter(chirps, np.ones_like(chirps), marker='*', color='royalblue', label='Chirps')
ax1.scatter(chasing_onset, np.ones_like(chasing_onset)*2, marker='.', color='forestgreen', label='Chasing onset')
ax1.scatter(chasing_offset, np.ones_like(chasing_offset)*2.5, marker='.', color='firebrick', label='Chasing offset')
ax1.scatter(physical_contact, np.ones_like(physical_contact)*3, marker='x', color='black', label='Physical contact')
plt.legend()
# plt.show()
plt.close()
# Get fish ids
fish_ids = np.unique(chirps_fish_ids)
##### Chasing triggered chirps CTC #####
# Evaluate how many chirps were emitted in specific time window around the chasing onset events
# Iterate over chasing onsets (later over fish)
time_around_event = 5 # time window around the event in which chirps are counted, 5 = -5 to +5 sec around event
#### Loop crashes at concatenate in function ####
# for i in range(len(fish_ids)):
# fish = fish_ids[i]
# chirps = chirps[chirps_fish_ids == fish]
# print(fish)
chasing_chirps, centered_chasing_chirps = event_triggered_chirps(chasing_onset, chirps, time_around_event, time_around_event)
physical_chirps, centered_physical_chirps = event_triggered_chirps(physical_contact, chirps, time_around_event, time_around_event)
# Kernel density estimation ???
# centered_chasing_chirps_convolved = gaussian_filter1d(centered_chasing_chirps, 5)
# centered_chasing = chasing_onset[0] - chasing_onset[0] ## get the 0 timepoint for plotting; set one chasing event to 0
offsets = [0.5, 1]
fig4, ax4 = plt.subplots(figsize=(20 / 2.54, 12 / 2.54), constrained_layout=True)
ax4.eventplot(np.array([centered_chasing_chirps, centered_physical_chirps]), lineoffsets=offsets, linelengths=0.25, colors=['g', 'r'])
ax4.vlines(0, 0, 1.5, 'tab:grey', 'dashed', 'Timepoint of event')
# ax4.plot(centered_chasing_chirps_convolved)
ax4.set_yticks(offsets)
ax4.set_yticklabels(['Chasings', 'Physical \n contacts'])
ax4.set_xlabel('Time[s]')
ax4.set_ylabel('Type of event')
plt.show()
# Associate chirps to individual fish
fish1 = chirps[chirps_fish_ids == fish_ids[0]]
fish2 = chirps[chirps_fish_ids == fish_ids[1]]
fish = [len(fish1), len(fish2)]
### Plots:
# 1. All recordings, all fish, all chirps
# One CTC, one PTC
# 2. All recordings, only winners
# One CTC, one PTC
# 3. All recordings, all losers
# One CTC, one PTC
#### Chirp counts per fish general #####
fig2, ax2 = plt.subplots()
x = ['Fish1', 'Fish2']
width = 0.35
ax2.bar(x, fish, width=width)
ax2.set_ylabel('Chirp count')
# plt.show()
plt.close()
##### Count chirps emitted during chasing events and chirps emitted out of chasing events #####
chirps_in_chasings = []
for onset, offset in zip(chasing_onset, chasing_offset):
chirps_in_chasing = [c for c in chirps if (c > onset) & (c < offset)]
chirps_in_chasings.append(chirps_in_chasing)
# chirps out of chasing events
counts_chirps_chasings = 0
chasings_without_chirps = 0
for i in chirps_in_chasings:
if i:
chasings_without_chirps += 1
else:
counts_chirps_chasings += 1
# chirps in chasing events
fig3 , ax3 = plt.subplots()
ax3.bar(['Chirps in chasing events', 'Chasing events without Chirps'], [counts_chirps_chasings, chasings_without_chirps], width=width)
plt.ylabel('Count')
# plt.show()
plt.close()
# comparison between chasing events with and without chirps
embed()
exit()
if __name__ == '__main__':
# Path to the data
datapath = '../data/mount_data/2020-03-13-10_00/'
datapath = '../data/mount_data/2020-05-13-10_00/'
datapath = '../data/mount_data/2020-05-13-10_00/'
main(datapath)

1268
code/chirpdetection.py Normal file → Executable file

File diff suppressed because it is too large

61
code/chirpdetector_conf.yml Normal file → Executable file

@ -1,42 +1,41 @@
# Duration and overlap of the analysis window in seconds
window: 5
overlap: 1
edge: 0.25
# Path setup ------------------------------------------------------------------
# Number of electrodes to go over
electrodes: 3
dataroot: "../data/" # path to data
outputdir: "../output/" # path to save plots to
# Boundary for search frequency in Hz
search_boundary: 100
# Rolling window parameters ---------------------------------------------------
# Cutoff frequency for envelope estimation by lowpass filter
envelope_cutoff: 25
window: 5 # rolling window length in seconds
overlap: 1 # window overlap in seconds
edge: 0.25 # window edge cutoffs to mitigate filter edge effects
# Cutoff frequency for envelope highpass filter
envelope_highpass_cutoff: 3
# Electrode iteration parameters ----------------------------------------------
# Cutoff frequency for envelope of envelope
envelope_envelope_cutoff: 5
number_electrodes: 2 # number of electrodes to go over
minimum_electrodes: 1 # minimum number of electrodes a chirp must be on
# Instantaneous frequency bandpass filter cutoff frequencies
instantaneous_lowf: 15
instantaneous_highf: 8000
# Feature extraction parameters -----------------------------------------------
# Baseline envelope peak detection parameters
baseline_prominence_percentile: 90
search_df_lower: 20 # start searching this far above the baseline
search_df_upper: 100 # stop searching this far above the baseline
search_res: 1 # search window resolution
default_search_freq: 60 # search here if no specific search frequency is needed
minimal_bandwidth: 10 # minimal bandpass filter width for baseline
search_bandwidth: 10 # minimal bandpass filter width for search frequency
baseline_frequency_smoothing: 10 # instantaneous frequency smoothing
# Search envelope peak detection parameters
search_prominence_percentile: 75
# Feature processing parameters -----------------------------------------------
# Instantaneous frequency peak detection parameters
instantaneous_prominence_percentile: 90
baseline_envelope_cutoff: 25 # envelope estimation cutoff
baseline_envelope_bandpass_lowf: 2 # envelope bandpass lower cutoff
baseline_envelope_bandpass_highf: 100 # envelope bandpass higher cutoff
search_envelope_cutoff: 10 # search envelope estimation cutoff
# search freq parameter
search_df_lower: 25
search_df_upper: 100
search_res: 1
search_freq_percentiles:
- 5
- 95
default_search_freq: 50
# Peak detection parameters ----------------------------------------------------
baseline_prominence: 0.00005 # peak prominence threshold for baseline envelope
search_prominence: 0.000004 # peak prominence threshold for search envelope
frequency_prominence: 2 # peak prominence threshold for baseline freq
# Classify events as chirps if they are less than this time apart
chirp_window_threshold: 0.02
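# How this config is consumed is not shown in this diff; as a rough,
# hypothetical sketch (assuming PyYAML is available), the keys simply become
# dictionary entries in the detection code:
#   import yaml
#   with open("chirpdetector_conf.yml") as f:
#       conf = yaml.safe_load(f)
#   conf["minimal_bandwidth"]   # -> 10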

593
code/eventchirpsplots.py Normal file

@ -0,0 +1,593 @@
import os
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from tqdm import tqdm
from IPython import embed
from pandas import read_csv
from modules.logger import makeLogger
from modules.plotstyle import PlotStyle
from modules.datahandling import causal_kde1d, acausal_kde1d, flatten
logger = makeLogger(__name__)
ps = PlotStyle()
class Behavior:
"""Load behavior data from csv file as class attributes
Attributes
----------
behavior: 0: chasing onset, 1: chasing offset, 2: physical contact
behavior_type:
behavioral_category:
comment_start:
comment_stop:
dataframe: pandas dataframe with all the data
duration_s:
media_file:
observation_date:
observation_id:
start_s: start time of the event in seconds
stop_s: stop time of the event in seconds
total_length:
"""
def __init__(self, folder_path: str) -> None:
print(f'{folder_path}')
LED_on_time_BORIS = np.load(os.path.join(
folder_path, 'LED_on_time.npy'), allow_pickle=True)
self.time = np.load(os.path.join(
folder_path, "times.npy"), allow_pickle=True)
csv_filename = [f for f in os.listdir(folder_path) if f.endswith(
'.csv')][0] # check if there are more than one csv file
self.dataframe = read_csv(os.path.join(folder_path, csv_filename))
self.chirps = np.load(os.path.join(
folder_path, 'chirps.npy'), allow_pickle=True)
self.chirps_ids = np.load(os.path.join(
folder_path, 'chirp_ids.npy'), allow_pickle=True)
for k, key in enumerate(self.dataframe.keys()):
key = key.lower()
if ' ' in key:
key = key.replace(' ', '_')
if '(' in key:
key = key.replace('(', '')
key = key.replace(')', '')
setattr(self, key, np.array(
self.dataframe[self.dataframe.keys()[k]]))
last_LED_t_BORIS = LED_on_time_BORIS[-1]
real_time_range = self.time[-1] - self.time[0]
factor = 1.034141
shift = last_LED_t_BORIS - real_time_range * factor
self.start_s = (self.start_s - shift) / factor
self.stop_s = (self.stop_s - shift) / factor
"""
1 - chasing onset
2 - chasing offset
3 - physical contact event
temporal encoding needs to be corrected ... not exactly 25FPS.
### corresponding python code ###
factor = 1.034141
LED_on_time_BORIS = np.load(os.path.join(folder_path, 'LED_on_time.npy'), allow_pickle=True)
last_LED_t_BORIS = LED_on_time_BORIS[-1]
real_time_range = times[-1] - times[0]
shift = last_LED_t_BORIS - real_time_range * factor
data = pd.read_csv(os.path.join(folder_path, file[1:-7] + '.csv'))
boris_times = data['Start (s)']
data_times = []
for Cevent_t in boris_times:
Cevent_boris_times = (Cevent_t - shift) / factor
data_times.append(Cevent_boris_times)
data_times = np.array(data_times)
behavior = data['Behavior']
"""
def correct_chasing_events(
category: np.ndarray,
timestamps: np.ndarray
) -> tuple[np.ndarray, np.ndarray]:
onset_ids = np.arange(
len(category))[category == 0]
offset_ids = np.arange(
len(category))[category == 1]
wrong_bh = np.arange(len(category))[
category != 2][:-1][np.diff(category[category != 2]) == 0]
if onset_ids[0] > offset_ids[0]:
offset_ids = np.delete(offset_ids, 0)
help_index = offset_ids[0]
wrong_bh = np.append(wrong_bh, help_index)
category = np.delete(category, wrong_bh)
timestamps = np.delete(timestamps, wrong_bh)
# Check whether on- or offset is longer and calculate length difference
if len(onset_ids) > len(offset_ids):
len_diff = len(onset_ids) - len(offset_ids)
logger.info(f'Onsets are greater than offsets by {len_diff}')
elif len(onset_ids) < len(offset_ids):
len_diff = len(offset_ids) - len(onset_ids)
logger.info(f'Offsets are greater than onsets by {len_diff}')
elif len(onset_ids) == len(offset_ids):
logger.info('Chasing events are equal')
return category, timestamps
def event_triggered_chirps(
event: np.ndarray,
chirps: np.ndarray,
time_before_event: int,
time_after_event: int,
dt: float,
width: float,
) -> tuple[np.ndarray, np.ndarray, np.ndarray]:
event_chirps = [] # chirps that are in specified window around event
# timestamps of chirps around event centered on the event timepoint
centered_chirps = []
for event_timestamp in event:
start = event_timestamp - time_before_event
stop = event_timestamp + time_after_event
chirps_around_event = [c for c in chirps if (c >= start) & (c <= stop)]
event_chirps.append(chirps_around_event)
if len(chirps_around_event) == 0:
continue
else:
centered_chirps.append(chirps_around_event - event_timestamp)
time = np.arange(-time_before_event, time_after_event, dt)
# Kernel density estimation with some if's
if len(centered_chirps) == 0:
centered_chirps = np.array([])
centered_chirps_convolved = np.zeros(len(time))
else:
# convert list of arrays to one array for plotting
centered_chirps = np.concatenate(centered_chirps, axis=0)
centered_chirps_convolved = (acausal_kde1d(
centered_chirps, time, width)) / len(event)
return event_chirps, centered_chirps, centered_chirps_convolved
def main(datapath: str):
foldernames = [
datapath + x + '/' for x in os.listdir(datapath) if os.path.isdir(datapath + x)]
nrecording_chirps = []
nrecording_chirps_fish_ids = []
nrecording_chasing_onsets = []
nrecording_chasing_offsets = []
nrecording_physicals = []
# Iterate over all recordings and save chirp- and event-timestamps
for folder in foldernames:
# exclude folder with empty LED_on_time.npy
if folder == '../data/mount_data/2020-05-12-10_00/':
continue
bh = Behavior(folder)
# Chirps are already sorted
category = bh.behavior
timestamps = bh.start_s
chirps = bh.chirps
nrecording_chirps.append(chirps)
chirps_fish_ids = bh.chirps_ids
nrecording_chirps_fish_ids.append(chirps_fish_ids)
fish_ids = np.unique(chirps_fish_ids)
# Correct for doubles in chasing on- and offsets to get the right on-/offset pairs
# Get rid of tracking faults (two onsets or two offsets after another)
category, timestamps = correct_chasing_events(category, timestamps)
# Split categories
chasing_onsets = timestamps[category == 0]
nrecording_chasing_onsets.append(chasing_onsets)
chasing_offsets = timestamps[category == 1]
nrecording_chasing_offsets.append(chasing_offsets)
physical_contacts = timestamps[category == 2]
nrecording_physicals.append(physical_contacts)
# Define time window for chirps around event analysis
time_before_event = 30
time_after_event = 60
dt = 0.01
width = 1.5 # width of kernel for all recordings, currently gaussian kernel
recording_width = 2 # width of kernel for each recording
time = np.arange(-time_before_event, time_after_event, dt)
##### Chirps around events, all fish, all recordings #####
# Centered chirps per event type
nrecording_centered_onset_chirps = []
nrecording_centered_offset_chirps = []
nrecording_centered_physical_chirps = []
# Bootstrapped chirps per recording and per event: 27[1000[n]] 27 recs, 1000 shuffles, n chirps
nrecording_shuffled_convolved_onset_chirps = []
nrecording_shuffled_convolved_offset_chirps = []
nrecording_shuffled_convolved_physical_chirps = []
nbootstrapping = 100
for i in range(len(nrecording_chirps)):
chirps = nrecording_chirps[i]
chasing_onsets = nrecording_chasing_onsets[i]
chasing_offsets = nrecording_chasing_offsets[i]
physical_contacts = nrecording_physicals[i]
# Chirps around chasing onsets
_, centered_chasing_onset_chirps, cc_chasing_onset_chirps = event_triggered_chirps(
chasing_onsets, chirps, time_before_event, time_after_event, dt, recording_width)
# Chirps around chasing offsets
_, centered_chasing_offset_chirps, cc_chasing_offset_chirps = event_triggered_chirps(
chasing_offsets, chirps, time_before_event, time_after_event, dt, recording_width)
# Chirps around physical contacts
_, centered_physical_chirps, cc_physical_chirps = event_triggered_chirps(
physical_contacts, chirps, time_before_event, time_after_event, dt, recording_width)
nrecording_centered_onset_chirps.append(centered_chasing_onset_chirps)
nrecording_centered_offset_chirps.append(
centered_chasing_offset_chirps)
nrecording_centered_physical_chirps.append(centered_physical_chirps)
## Shuffled chirps ##
nshuffled_onset_chirps = []
nshuffled_offset_chirps = []
nshuffled_physical_chirps = []
# for j in tqdm(range(nbootstrapping)):
# # Calculate interchirp intervals; add first chirp timestamp in beginning to get equal lengths
# interchirp_intervals = np.append(np.array([chirps[0]]), np.diff(chirps))
# np.random.shuffle(interchirp_intervals)
# shuffled_chirps = np.cumsum(interchirp_intervals)
# # Shuffled chasing onset chirps
# _, _, cc_shuffled_onset_chirps = event_triggered_chirps(chasing_onsets, shuffled_chirps, time_before_event, time_after_event, dt, recording_width)
# nshuffled_onset_chirps.append(cc_shuffled_onset_chirps)
# # Shuffled chasing offset chirps
# _, _, cc_shuffled_offset_chirps = event_triggered_chirps(chasing_offsets, shuffled_chirps, time_before_event, time_after_event, dt, recording_width)
# nshuffled_offset_chirps.append(cc_shuffled_offset_chirps)
# # Shuffled physical contact chirps
# _, _, cc_shuffled_physical_chirps = event_triggered_chirps(physical_contacts, shuffled_chirps, time_before_event, time_after_event, dt, recording_width)
# nshuffled_physical_chirps.append(cc_shuffled_physical_chirps)
# rec_shuffled_q5_onset, rec_shuffled_median_onset, rec_shuffled_q95_onset = np.percentile(
# nshuffled_onset_chirps, (5, 50, 95), axis=0)
# rec_shuffled_q5_offset, rec_shuffled_median_offset, rec_shuffled_q95_offset = np.percentile(
# nshuffled_offset_chirps, (5, 50, 95), axis=0)
# rec_shuffled_q5_physical, rec_shuffled_median_physical, rec_shuffled_q95_physical = np.percentile(
# nshuffled_physical_chirps, (5, 50, 95), axis=0)
# #### Recording plots ####
# fig, ax = plt.subplots(1, 3, figsize=(28*ps.cm, 16*ps.cm, ), constrained_layout=True, sharey='all')
# ax[0].set_xlabel('Time[s]')
# # Plot chasing onsets
# ax[0].set_ylabel('Chirp rate [Hz]')
# ax[0].plot(time, cc_chasing_onset_chirps, color=ps.yellow, zorder=2)
# ax0 = ax[0].twinx()
# ax0.eventplot(centered_chasing_onset_chirps, linelengths=0.2, colors=ps.gray, alpha=0.25, zorder=1)
# ax0.vlines(0, 0, 1.5, ps.white, 'dashed')
# ax[0].set_zorder(ax0.get_zorder()+1)
# ax[0].patch.set_visible(False)
# ax0.set_yticklabels([])
# ax0.set_yticks([])
# ######## median - q5, median + q95
# ax[0].fill_between(time, rec_shuffled_q5_onset, rec_shuffled_q95_onset, color=ps.gray, alpha=0.5)
# ax[0].plot(time, rec_shuffled_median_onset, color=ps.black)
# # Plot chasing offets
# ax[1].set_xlabel('Time[s]')
# ax[1].plot(time, cc_chasing_offset_chirps, color=ps.orange, zorder=2)
# ax1 = ax[1].twinx()
# ax1.eventplot(centered_chasing_offset_chirps, linelengths=0.2, colors=ps.gray, alpha=0.25, zorder=1)
# ax1.vlines(0, 0, 1.5, ps.white, 'dashed')
# ax[1].set_zorder(ax1.get_zorder()+1)
# ax[1].patch.set_visible(False)
# ax1.set_yticklabels([])
# ax1.set_yticks([])
# ax[1].fill_between(time, rec_shuffled_q5_offset, rec_shuffled_q95_offset, color=ps.gray, alpha=0.5)
# ax[1].plot(time, rec_shuffled_median_offset, color=ps.black)
# # Plot physical contacts
# ax[2].set_xlabel('Time[s]')
# ax[2].plot(time, cc_physical_chirps, color=ps.maroon, zorder=2)
# ax2 = ax[2].twinx()
# ax2.eventplot(centered_physical_chirps, linelengths=0.2, colors=ps.gray, alpha=0.25, zorder=1)
# ax2.vlines(0, 0, 1.5, ps.white, 'dashed')
# ax[2].set_zorder(ax2.get_zorder()+1)
# ax[2].patch.set_visible(False)
# ax2.set_yticklabels([])
# ax2.set_yticks([])
# ax[2].fill_between(time, rec_shuffled_q5_physical, rec_shuffled_q95_physical, color=ps.gray, alpha=0.5)
# ax[2].plot(time, rec_shuffled_median_physical, ps.black)
# fig.suptitle(f'Recording: {i}')
# # plt.show()
# plt.close()
# nrecording_shuffled_convolved_onset_chirps.append(nshuffled_onset_chirps)
# nrecording_shuffled_convolved_offset_chirps.append(nshuffled_offset_chirps)
# nrecording_shuffled_convolved_physical_chirps.append(nshuffled_physical_chirps)
#### New shuffle approach ####
bootstrap_onset = []
bootstrap_offset = []
bootstrap_physical = []
# New bootstrapping approach
for n in range(nbootstrapping):
diff_onset = np.diff(
np.sort(flatten(nrecording_centered_onset_chirps)))
diff_offset = np.diff(
np.sort(flatten(nrecording_centered_offset_chirps)))
diff_physical = np.diff(
np.sort(flatten(nrecording_centered_physical_chirps)))
np.random.shuffle(diff_onset)
shuffled_onset = np.cumsum(diff_onset)
np.random.shuffle(diff_offset)
shuffled_offset = np.cumsum(diff_offset)
np.random.shuffle(diff_physical)
shuffled_physical = np.cumsum(diff_physical)
kde_onset = (acausal_kde1d(shuffled_onset, time, width))/(27*100)
kde_offset = (acausal_kde1d(shuffled_offset, time, width))/(27*100)
kde_physical = (acausal_kde1d(shuffled_physical, time, width))/(27*100)
bootstrap_onset.append(kde_onset)
bootstrap_offset.append(kde_offset)
bootstrap_physical.append(kde_physical)
# New shuffle approach q5, q50, q95
onset_q5, onset_median, onset_q95 = np.percentile(
bootstrap_onset, [5, 50, 95], axis=0)
offset_q5, offset_median, offset_q95 = np.percentile(
bootstrap_offset, [5, 50, 95], axis=0)
physical_q5, physical_median, physical_q95 = np.percentile(
bootstrap_physical, [5, 50, 95], axis=0)
# vstack to cut away the 1st dimension
# nrecording_shuffled_convolved_onset_chirps = np.vstack(nrecording_shuffled_convolved_onset_chirps)
# nrecording_shuffled_convolved_offset_chirps = np.vstack(nrecording_shuffled_convolved_offset_chirps)
# nrecording_shuffled_convolved_physical_chirps = np.vstack(nrecording_shuffled_convolved_physical_chirps)
# shuffled_q5_onset, shuffled_median_onset, shuffled_q95_onset = np.percentile(
# nrecording_shuffled_convolved_onset_chirps, (5, 50, 95), axis=0)
# shuffled_q5_offset, shuffled_median_offset, shuffled_q95_offset = np.percentile(
# nrecording_shuffled_convolved_offset_chirps, (5, 50, 95), axis=0)
# shuffled_q5_physical, shuffled_median_physical, shuffled_q95_physical = np.percentile(
# nrecording_shuffled_convolved_physical_chirps, (5, 50, 95), axis=0)
# Flatten all chirps
all_chirps = np.concatenate(nrecording_chirps).ravel() # not centered
# Flatten event timestamps
all_onsets = np.concatenate(
nrecording_chasing_onsets).ravel() # not centered
all_offsets = np.concatenate(
nrecording_chasing_offsets).ravel() # not centered
all_physicals = np.concatenate(
nrecording_physicals).ravel() # not centered
# Flatten all chirps around events
all_onset_chirps = np.concatenate(
nrecording_centered_onset_chirps).ravel() # centered
all_offset_chirps = np.concatenate(
nrecording_centered_offset_chirps).ravel() # centered
all_physical_chirps = np.concatenate(
nrecording_centered_physical_chirps).ravel() # centered
# Convolute all chirps
# Divide by total number of each event over all recordings
all_onset_chirps_convolved = (acausal_kde1d(
all_onset_chirps, time, width)) / len(all_onsets)
all_offset_chirps_convolved = (acausal_kde1d(
all_offset_chirps, time, width)) / len(all_offsets)
all_physical_chirps_convolved = (acausal_kde1d(
all_physical_chirps, time, width)) / len(all_physicals)
# Plot all events with all shuffled
fig, ax = plt.subplots(1, 3, figsize=(
28*ps.cm, 16*ps.cm, ), constrained_layout=True, sharey='all')
# offsets = np.arange(1,28,1)
ax[0].set_xlabel('Time[s]')
# Plot chasing onsets
ax[0].set_ylabel('Chirp rate [Hz]')
ax[0].plot(time, all_onset_chirps_convolved, color=ps.yellow, zorder=2)
ax0 = ax[0].twinx()
nrecording_centered_onset_chirps = np.asarray(
nrecording_centered_onset_chirps, dtype=object)
ax0.eventplot(np.array(nrecording_centered_onset_chirps),
linelengths=0.5, colors=ps.gray, alpha=0.25, zorder=1)
ax0.vlines(0, 0, 1.5, ps.white, 'dashed')
ax[0].set_zorder(ax0.get_zorder()+1)
ax[0].patch.set_visible(False)
ax0.set_yticklabels([])
ax0.set_yticks([])
# ax[0].fill_between(time, shuffled_q5_onset, shuffled_q95_onset, color=ps.gray, alpha=0.5)
# ax[0].plot(time, shuffled_median_onset, color=ps.black)
ax[0].fill_between(time, onset_q5, onset_q95, color=ps.gray, alpha=0.5)
ax[0].plot(time, onset_median, color=ps.black)
# Plot chasing offets
ax[1].set_xlabel('Time[s]')
ax[1].plot(time, all_offset_chirps_convolved, color=ps.orange, zorder=2)
ax1 = ax[1].twinx()
nrecording_centered_offset_chirps = np.asarray(
nrecording_centered_offset_chirps, dtype=object)
ax1.eventplot(np.array(nrecording_centered_offset_chirps),
linelengths=0.5, colors=ps.gray, alpha=0.25, zorder=1)
ax1.vlines(0, 0, 1.5, ps.white, 'dashed')
ax[1].set_zorder(ax1.get_zorder()+1)
ax[1].patch.set_visible(False)
ax1.set_yticklabels([])
ax1.set_yticks([])
# ax[1].fill_between(time, shuffled_q5_offset, shuffled_q95_offset, color=ps.gray, alpha=0.5)
# ax[1].plot(time, shuffled_median_offset, color=ps.black)
ax[1].fill_between(time, offset_q5, offset_q95, color=ps.gray, alpha=0.5)
ax[1].plot(time, offset_median, color=ps.black)
# Plot physical contacts
ax[2].set_xlabel('Time[s]')
ax[2].plot(time, all_physical_chirps_convolved, color=ps.maroon, zorder=2)
ax2 = ax[2].twinx()
nrecording_centered_physical_chirps = np.asarray(
nrecording_centered_physical_chirps, dtype=object)
ax2.eventplot(np.array(nrecording_centered_physical_chirps),
linelengths=0.5, colors=ps.gray, alpha=0.25, zorder=1)
ax2.vlines(0, 0, 1.5, ps.white, 'dashed')
ax[2].set_zorder(ax2.get_zorder()+1)
ax[2].patch.set_visible(False)
ax2.set_yticklabels([])
ax2.set_yticks([])
# ax[2].fill_between(time, shuffled_q5_physical, shuffled_q95_physical, color=ps.gray, alpha=0.5)
# ax[2].plot(time, shuffled_median_physical, ps.black)
ax[2].fill_between(time, physical_q5, physical_q95,
color=ps.gray, alpha=0.5)
ax[2].plot(time, physical_median, color=ps.black)
fig.suptitle('All recordings')
plt.show()
plt.close()
embed()
# chasing_durations = []
# # Calculate chasing duration to evaluate a nice time window for kernel density estimation
# for onset, offset in zip(chasing_onsets, chasing_offsets):
# duration = offset - onset
# chasing_durations.append(duration)
# fig, ax = plt.subplots()
# ax.boxplot(chasing_durations)
# plt.show()
# plt.close()
# # Associate chirps to individual fish
# fish1 = chirps[chirps_fish_ids == fish_ids[0]]
# fish2 = chirps[chirps_fish_ids == fish_ids[1]]
# fish = [len(fish1), len(fish2)]
# Convolution over all recordings
# Rasterplot for each recording
# #### Chirps around events, winner VS loser, one recording ####
# # Load file with fish ids and winner/loser info
# meta = pd.read_csv('../data/mount_data/order_meta.csv')
# current_recording = meta[meta.index == 43]
# fish1 = current_recording['rec_id1'].values
# fish2 = current_recording['rec_id2'].values
# # Implement check if fish_ids from meta and chirp detection are the same???
# winner = current_recording['winner'].values
# if winner == fish1:
# loser = fish2
# elif winner == fish2:
# loser = fish1
# winner_chirps = chirps[chirps_fish_ids == winner]
# loser_chirps = chirps[chirps_fish_ids == loser]
# # Event triggered winner chirps
# _, winner_centered_onset, winner_cc_onset = event_triggered_chirps(chasing_onsets, winner_chirps, time_before_event, time_after_event, dt, width)
# _, winner_centered_offset, winner_cc_offset = event_triggered_chirps(chasing_offsets, winner_chirps, time_before_event, time_after_event, dt, width)
# _, winner_centered_physical, winner_cc_physical = event_triggered_chirps(physical_contacts, winner_chirps, time_before_event, time_after_event, dt, width)
# # Event triggered loser chirps
# _, loser_centered_onset, loser_cc_onset = event_triggered_chirps(chasing_onsets, loser_chirps, time_before_event, time_after_event, dt, width)
# _, loser_centered_offset, loser_cc_offset = event_triggered_chirps(chasing_offsets, loser_chirps, time_before_event, time_after_event, dt, width)
# _, loser_centered_physical, loser_cc_physical = event_triggered_chirps(physical_contacts, loser_chirps, time_before_event, time_after_event, dt, width)
# ########## Winner VS Loser plot ##########
# fig, ax = plt.subplots(2, 3, figsize=(50 / 2.54, 15 / 2.54), constrained_layout=True, sharey='row')
# offset = [1.35]
# ax[1][0].set_xlabel('Time[s]')
# ax[1][1].set_xlabel('Time[s]')
# ax[1][2].set_xlabel('Time[s]')
# # Plot winner chasing onsets
# ax[0][0].set_ylabel('Chirp rate [Hz]')
# ax[0][0].plot(time, winner_cc_onset, color='tab:blue', zorder=100)
# ax0 = ax[0][0].twinx()
# ax0.eventplot(np.array([winner_centered_onset]), lineoffsets=offset, linelengths=0.1, colors=['tab:green'], alpha=0.25, zorder=-100)
# ax0.set_ylabel('Event')
# ax0.vlines(0, 0, 1.5, 'tab:grey', 'dashed')
# ax[0][0].set_zorder(ax0.get_zorder()+1)
# ax[0][0].patch.set_visible(False)
# ax0.set_yticklabels([])
# ax0.set_yticks([])
# # Plot winner chasing offets
# ax[0][1].plot(time, winner_cc_offset, color='tab:blue', zorder=100)
# ax1 = ax[0][1].twinx()
# ax1.eventplot(np.array([winner_centered_offset]), lineoffsets=offset, linelengths=0.1, colors=['tab:purple'], alpha=0.25, zorder=-100)
# ax1.vlines(0, 0, 1.5, 'tab:grey', 'dashed')
# ax[0][1].set_zorder(ax1.get_zorder()+1)
# ax[0][1].patch.set_visible(False)
# ax1.set_yticklabels([])
# ax1.set_yticks([])
# # Plot winner physical contacts
# ax[0][2].plot(time, winner_cc_physical, color='tab:blue', zorder=100)
# ax2 = ax[0][2].twinx()
# ax2.eventplot(np.array([winner_centered_physical]), lineoffsets=offset, linelengths=0.1, colors=['tab:red'], alpha=0.25, zorder=-100)
# ax2.vlines(0, 0, 1.5, 'tab:grey', 'dashed')
# ax[0][2].set_zorder(ax2.get_zorder()+1)
# ax[0][2].patch.set_visible(False)
# ax2.set_yticklabels([])
# ax2.set_yticks([])
# # Plot loser chasing onsets
# ax[1][0].set_ylabel('Chirp rate [Hz]')
# ax[1][0].plot(time, loser_cc_onset, color='tab:blue', zorder=100)
# ax3 = ax[1][0].twinx()
# ax3.eventplot(np.array([loser_centered_onset]), lineoffsets=offset, linelengths=0.1, colors=['tab:green'], alpha=0.25, zorder=-100)
# ax3.vlines(0, 0, 1.5, 'tab:grey', 'dashed')
# ax[1][0].set_zorder(ax3.get_zorder()+1)
# ax[1][0].patch.set_visible(False)
# ax3.set_yticklabels([])
# ax3.set_yticks([])
# # Plot loser chasing offsets
# ax[1][1].plot(time, loser_cc_offset, color='tab:blue', zorder=100)
# ax4 = ax[1][1].twinx()
# ax4.eventplot(np.array([loser_centered_offset]), lineoffsets=offset, linelengths=0.1, colors=['tab:purple'], alpha=0.25, zorder=-100)
# ax4.vlines(0, 0, 1.5, 'tab:grey', 'dashed')
# ax[1][1].set_zorder(ax4.get_zorder()+1)
# ax[1][1].patch.set_visible(False)
# ax4.set_yticklabels([])
# ax4.set_yticks([])
# # Plot loser physical contacts
# ax[1][2].plot(time, loser_cc_physical, color='tab:blue', zorder=100)
# ax5 = ax[1][2].twinx()
# ax5.eventplot(np.array([loser_centered_physical]), lineoffsets=offset, linelengths=0.1, colors=['tab:red'], alpha=0.25, zorder=-100)
# ax5.vlines(0, 0, 1.5, 'tab:grey', 'dashed')
# ax[1][2].set_zorder(ax5.get_zorder()+1)
# ax[1][2].patch.set_visible(False)
# ax5.set_yticklabels([])
# ax5.set_yticks([])
# plt.show()
# plt.close()
# for i in range(len(fish_ids)):
# fish = fish_ids[i]
# chirps_temp = chirps[chirps_fish_ids == fish]
# print(fish)
#### Chirps around events, only losers, one recording ####
if __name__ == '__main__':
# Path to the data
datapath = '../data/mount_data/'
main(datapath)

58
code/extract_chirps.py Normal file

@ -0,0 +1,58 @@
import os
import pandas as pd
import numpy as np
from chirpdetection import chirpdetection
from IPython import embed
# check rec ../data/mount_data/2020-03-25-10_00/ starting at 3175
def get_valid_datasets(dataroot):
datasets = sorted([name for name in os.listdir(dataroot) if os.path.isdir(
os.path.join(dataroot, name))])
valid_datasets = []
for dataset in datasets:
path = os.path.join(dataroot, dataset)
csv_name = '-'.join(dataset.split('-')[:3]) + '.csv'
if os.path.exists(os.path.join(path, csv_name)) is False:
continue
if os.path.exists(os.path.join(path, 'ident_v.npy')) is False:
continue
ident = np.load(os.path.join(path, 'ident_v.npy'))
number_of_fish = len(np.unique(ident[~np.isnan(ident)]))
if number_of_fish != 2:
continue
valid_datasets.append(dataset)
datapaths = [os.path.join(dataroot, dataset) +
'/' for dataset in valid_datasets]
return datapaths, valid_datasets
def main(datapaths):
for path in datapaths:
chirpdetection(path, plot='show')
if __name__ == '__main__':
dataroot = '../data/mount_data/'
datapaths, valid_datasets= get_valid_datasets(dataroot)
recs = pd.DataFrame(columns=['recording'], data=valid_datasets)
recs.to_csv('../recs.csv', index=False)
# datapaths = ['../data/mount_data/2020-03-25-10_00/']
main(datapaths)
# window 1524 + 244 in dataset index 4 is nice example

35
code/get_behaviour.py Normal file

@ -0,0 +1,35 @@
import os
from paramiko import SSHClient
from scp import SCPClient
from IPython import embed
from pandas import read_csv
ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect(hostname='kraken',
username='efish',
password='fwNix4U',
)
# SCPCLient takes a paramiko transport as its only argument
scp = SCPClient(ssh.get_transport())
data = read_csv('../recs.csv')
foldernames = data['recording'].values
directory = f'/Users/acfw/Documents/uni_tuebingen/chirpdetection/GP2023_chirp_detection/data/mount_data/'
for foldername in foldernames:
if not os.path.exists(directory+foldername):
os.makedirs(directory+foldername)
files = [('-').join(foldername.split('-')[:3])+'.csv','chirp_ids.npy', 'chirps.npy', 'fund_v.npy', 'ident_v.npy', 'idx_v.npy', 'times.npy', 'spec.npy', 'LED_on_time.npy', 'sign_v.npy']
for f in files:
scp.get(f'/home/efish/behavior/2019_tube_competition/{foldername}/{f}',
directory+foldername)
scp.close()

View File

@ -0,0 +1,169 @@
import numpy as np
import os
from IPython import embed
from pandas import read_csv
from modules.logger import makeLogger
from modules.datahandling import causal_kde1d, acausal_kde1d, flatten
logger = makeLogger(__name__)
class Behavior:
"""Load behavior data from csv file as class attributes
Attributes
----------
behavior: 0: chasing onset, 1: chasing offset, 2: physical contact
behavior_type:
behavioral_category:
comment_start:
comment_stop:
dataframe: pandas dataframe with all the data
duration_s:
media_file:
observation_date:
observation_id:
start_s: start time of the event in seconds
stop_s: stop time of the event in seconds
total_length:
"""
def __init__(self, folder_path: str) -> None:
LED_on_time_BORIS = np.load(os.path.join(
folder_path, 'LED_on_time.npy'), allow_pickle=True)
csv_filename = os.path.split(folder_path[:-1])[-1]
csv_filename = '-'.join(csv_filename.split('-')[:-1]) + '.csv'
# embed()
# csv_filename = [f for f in os.listdir(
# folder_path) if f.endswith('.csv')][0]
# logger.info(f'CSV file: {csv_filename}')
self.dataframe = read_csv(os.path.join(folder_path, csv_filename))
self.chirps = np.load(os.path.join(
folder_path, 'chirps.npy'), allow_pickle=True)
self.chirps_ids = np.load(os.path.join(
folder_path, 'chirp_ids.npy'), allow_pickle=True)
self.ident = np.load(os.path.join(
folder_path, 'ident_v.npy'), allow_pickle=True)
self.idx = np.load(os.path.join(
folder_path, 'idx_v.npy'), allow_pickle=True)
self.freq = np.load(os.path.join(
folder_path, 'fund_v.npy'), allow_pickle=True)
self.time = np.load(os.path.join(
folder_path, "times.npy"), allow_pickle=True)
self.spec = np.load(os.path.join(
folder_path, "spec.npy"), allow_pickle=True)
for k, key in enumerate(self.dataframe.keys()):
key = key.lower()
if ' ' in key:
key = key.replace(' ', '_')
if '(' in key:
key = key.replace('(', '')
key = key.replace(')', '')
setattr(self, key, np.array(
self.dataframe[self.dataframe.keys()[k]]))
last_LED_t_BORIS = LED_on_time_BORIS[-1]
real_time_range = self.time[-1] - self.time[0]
factor = 1.034141
shift = last_LED_t_BORIS - real_time_range * factor
self.start_s = (self.start_s - shift) / factor
self.stop_s = (self.stop_s - shift) / factor
def correct_chasing_events(
category: np.ndarray,
timestamps: np.ndarray
) -> tuple[np.ndarray, np.ndarray]:
onset_ids = np.arange(
len(category))[category == 0]
offset_ids = np.arange(
len(category))[category == 1]
wrong_bh = np.arange(len(category))[
category != 2][:-1][np.diff(category[category != 2]) == 0]
if category[category != 2][-1] == 0:
wrong_bh = np.append(
wrong_bh,
np.arange(len(category))[category != 2][-1])
if onset_ids[0] > offset_ids[0]:
offset_ids = np.delete(offset_ids, 0)
help_index = offset_ids[0]
wrong_bh = np.append(wrong_bh, help_index)
category = np.delete(category, wrong_bh)
timestamps = np.delete(timestamps, wrong_bh)
new_onset_ids = np.arange(
len(category))[category == 0]
new_offset_ids = np.arange(
len(category))[category == 1]
# Check whether on- or offset is longer and calculate length difference
if len(new_onset_ids) > len(new_offset_ids):
embed()
logger.warning('Onsets are greater than offsets')
elif len(new_onset_ids) < len(new_offset_ids):
logger.warning('Offsets are greater than onsets')
elif len(new_onset_ids) == len(new_offset_ids):
# logger.info('Chasing events are equal')
pass
return category, timestamps
def center_chirps(
events: np.ndarray,
chirps: np.ndarray,
time_before_event: int,
time_after_event: int,
# dt: float,
# width: float,
) -> tuple[np.ndarray, np.ndarray, np.ndarray]:
event_chirps = [] # chirps that are in specified window around event
# timestamps of chirps around event centered on the event timepoint
centered_chirps = []
for event_timestamp in events:
start = event_timestamp - time_before_event
stop = event_timestamp + time_after_event
chirps_around_event = [c for c in chirps if (c >= start) & (c <= stop)]
if len(chirps_around_event) == 0:
continue
centered_chirps.append(chirps_around_event - event_timestamp)
event_chirps.append(chirps_around_event)
centered_chirps = np.sort(flatten(centered_chirps))
event_chirps = np.sort(flatten(event_chirps))
if len(centered_chirps) != len(event_chirps):
raise ValueError(
'Non centered chirps and centered chirps are not equal')
# time = np.arange(-time_before_event, time_after_event, dt)
# # Kernel density estimation with some if's
# if len(centered_chirps) == 0:
# centered_chirps = np.array([])
# centered_chirps_convolved = np.zeros(len(time))
# else:
# # convert list of arrays to one array for plotting
# centered_chirps = np.concatenate(centered_chirps, axis=0)
# centered_chirps_convolved = (acausal_kde1d(
# centered_chirps, time, width)) / len(event)
return centered_chirps

View File

@ -0,0 +1,338 @@
import numpy as np
from typing import List, Any
from scipy.ndimage import gaussian_filter1d
from scipy.stats import gamma, norm
def minmaxnorm(data):
"""
Normalize data to [0, 1]
Parameters
----------
data : np.ndarray
Data to normalize.
Returns
-------
np.ndarray
Normalized data.
"""
return (data - np.min(data)) / (np.max(data) - np.min(data))
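# Minimal usage sketch (added for illustration, not part of the module):
# minmaxnorm(np.array([2.0, 4.0, 6.0]))
# # -> array([0. , 0.5, 1. ])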
def instantaneous_frequency(
signal: np.ndarray,
samplerate: int,
smoothing_window: int,
) -> tuple[np.ndarray, np.ndarray]:
"""
Compute the instantaneous frequency of a signal that is approximately
sinusoidal and symmetric around 0.
Parameters
----------
signal : np.ndarray
Signal to compute the instantaneous frequency from.
samplerate : int
Samplerate of the signal.
smoothing_window : int
Window size for the gaussian filter.
Returns
-------
tuple[np.ndarray, np.ndarray]
Timepoints (midpoints between detected zero crossings) and the
instantaneous frequency at those timepoints.
"""
# calculate instantaneous frequency with zero crossings
roll_signal = np.roll(signal, shift=1)
time_signal = np.arange(len(signal)) / samplerate
period_index = np.arange(len(signal))[(roll_signal < 0) & (signal >= 0)][
1:-1
]
upper_bound = np.abs(signal[period_index])
lower_bound = np.abs(signal[period_index - 1])
upper_time = np.abs(time_signal[period_index])
lower_time = np.abs(time_signal[period_index - 1])
# create ratio
lower_ratio = lower_bound / (lower_bound + upper_bound)
# apply to time delta
time_delta = upper_time - lower_time
true_zero = lower_time + lower_ratio * time_delta
# create new time array
instantaneous_frequency_time = true_zero[:-1] + 0.5 * np.diff(true_zero)
# compute frequency
instantaneous_frequency = gaussian_filter1d(
1 / np.diff(true_zero), smoothing_window
)
return instantaneous_frequency_time, instantaneous_frequency
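# Minimal usage sketch (added for illustration, not part of the module): for
# a clean 100 Hz sine sampled at 20 kHz, the zero-crossing estimate should
# stay close to 100 Hz over the whole signal.
# example_rate = 20000
# example_t = np.arange(0, 1, 1 / example_rate)
# example_sine = np.sin(2 * np.pi * 100 * example_t)
# example_ftime, example_freq = instantaneous_frequency(example_sine, example_rate, 5)
# # example_freq is approximately 100 everywhere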
def purge_duplicates(
timestamps: List[float], threshold: float = 0.5
) -> List[float]:
"""
Compute the mean of groups of timestamps that are closer to the previous
or consecutive timestamp than the threshold, and return all timestamps that
are further apart from the previous or consecutive timestamp than the
threshold in a single list.
Parameters
----------
timestamps : List[float]
A list of sorted timestamps
threshold : float, optional
The threshold to group the timestamps by, default is 0.5
Returns
-------
List[float]
A flat list of the stand-alone timestamps followed by the means of the
groups of timestamps that are closer together than the threshold.
"""
# Initialize an empty list to store the groups of timestamps that are
# closer to the previous or consecutive timestamp than the threshold
groups = []
# initialize the first group with the first timestamp
group = [timestamps[0]]
for i in range(1, len(timestamps)):
# check the difference between current timestamp and previous
# timestamp is less than the threshold
if timestamps[i] - timestamps[i - 1] < threshold:
# add the current timestamp to the current group
group.append(timestamps[i])
else:
# if the difference is greater than the threshold
# append the current group to the groups list
groups.append(group)
# start a new group with the current timestamp
group = [timestamps[i]]
# after iterating through all the timestamps, add the last group to the
# groups list
groups.append(group)
# get the mean of each group and only include the ones that have more
# than 1 timestamp
means = [np.mean(group) for group in groups if len(group) > 1]
# get the timestamps that are outliers, i.e. the ones that are alone
# in a group
outliers = [ts for group in groups for ts in group if len(group) == 1]
# return the outliers and means in a single list
return outliers + means
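
A tiny illustration with made-up numbers of the flat return value:

print(purge_duplicates([1.0, 1.02, 1.04, 2.0, 3.0], 0.1))
# -> [2.0, 3.0, 1.02]  (stand-alone timestamps first, then the group means)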
def group_timestamps(
sublists: List[List[float]], at_least_in: int, difference_threshold: float
) -> List[float]:
"""
Pool the timestamps of all sublists, group those that are less than
`difference_threshold` apart, and keep only groups that contain at least
`at_least_in` timestamps. Returns the mean of each group.
Empty sublists are ignored.
Parameters
----------
sublists : List[List[float]]
a list of sublists, each containing timestamps
at_least_in : int
minimum number of timestamps a group must contain in order to be kept
difference_threshold : float
maximum difference between consecutive timestamps for them to be
grouped together
Returns
-------
List[float]
a list of the mean of each group.
"""
# Flatten the sublists and sort the timestamps
timestamps = [
timestamp for sublist in sublists if sublist for timestamp in sublist
]
timestamps.sort()
if len(timestamps) == 0:
return []
groups = []
current_group = [timestamps[0]]
# Group timestamps that are less than threshold milliseconds apart
for i in range(1, len(timestamps)):
if timestamps[i] - timestamps[i - 1] < difference_threshold:
current_group.append(timestamps[i])
else:
groups.append(current_group)
current_group = [timestamps[i]]
groups.append(current_group)
# Retain only groups that contain at least n timestamps
final_groups = []
for group in groups:
if len(group) >= at_least_in:
final_groups.append(group)
# Calculate the mean of each group
means = [np.mean(group) for group in final_groups]
return means
def flatten(nested_list: List[List[Any]]) -> List:
"""
Flattens a list / array of lists.
Parameters
----------
nested_list : array or list of lists
The list to be flattened
Returns
-------
list
The flattened list
"""
return [item for sublist in nested_list for item in sublist]
def causal_kde1d(spikes, time, width, shape=2):
"""
causalkde computes a kernel density estimate using a causal kernel (i.e. exponential or gamma distribution).
A shape of 1 turns the gamma distribution into an exponential.
Parameters
----------
spikes : array-like
spike times
time : array-like
sampling time
width : float
kernel width
shape : int, optional
shape of the gamma distribution, by default 2
Returns
-------
rate : array-like
instantaneous firing rate
"""
# compute dt
dt = time[1] - time[0]
# time on which to compute kernel:
tmax = 10 * width
# kernel not wider than time
if 2 * tmax > time[-1] - time[0]:
tmax = 0.5 * (time[-1] - time[0])
# kernel time
ktime = np.arange(-tmax, tmax, dt)
# gamma kernel centered in ktime:
kernel = gamma.pdf(
x=ktime,
a=shape,
loc=0,
scale=width,
)
# indices of spikes in time array:
indices = np.asarray((spikes - time[0]) / dt, dtype=int)
# binary spike train:
brate = np.zeros(len(time))
brate[indices[(indices >= 0) & (indices < len(time))]] = 1.0
# convolution with kernel:
rate = np.convolve(brate, kernel, mode="same")
return rate
def acausal_kde1d(spikes, time, width):
"""
acausal_kde1d computes a kernel density estimate using a symmetric
(acausal) Gaussian kernel.
Parameters
----------
spikes : array-like
spike times
time : array-like
sampling time
width : float
kernel width (standard deviation of the Gaussian)
Returns
-------
rate : array-like
instantaneous firing rate
"""
# compute dt
dt = time[1] - time[0]
# time on which to compute kernel:
tmax = 10 * width
# kernel not wider than time
if 2 * tmax > time[-1] - time[0]:
tmax = 0.5 * (time[-1] - time[0])
# kernel time
ktime = np.arange(-tmax, tmax, dt)
# gaussian kernel centered in ktime:
kernel = norm.pdf(
x=ktime,
loc=0,
scale=width,
)
# indices of spikes in time array:
indices = np.asarray((spikes - time[0]) / dt, dtype=int)
# binary spike train:
brate = np.zeros(len(time))
brate[indices[(indices >= 0) & (indices < len(time))]] = 1.0
# convolution with kernel:
rate = np.convolve(brate, kernel, mode="same")
return rate
if __name__ == "__main__":
timestamps = [
[1.2, 1.5, 1.3],
[],
[1.21, 1.51, 1.31],
[1.19, 1.49, 1.29],
[1.22, 1.52, 1.32],
[1.2, 1.5, 1.3],
]
print(group_timestamps(timestamps, 2, 0.05))
print(purge_duplicates([1, 2, 3, 4, 5, 6, 6.02, 7, 8, 8.02], 0.05))
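
A hedged sketch, using synthetic event times and an arbitrary kernel width, contrasting the two estimators defined above:

import numpy as np

time = np.arange(0, 10, 0.001)  # 10 s at 1 ms resolution
events = np.array([2.0, 2.1, 2.3, 5.0, 5.05, 7.5])

rate_causal = causal_kde1d(events, time, width=0.2)    # gamma kernel, rises only after each event
rate_acausal = acausal_kde1d(events, time, width=0.2)  # gaussian kernel, symmetric around each event

print(time[np.argmax(rate_acausal)])  # peak near the densest cluster around 2.1 s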

View File

@ -3,6 +3,7 @@ import os
import yaml
import numpy as np
from thunderfish.dataloader import DataLoader
import matplotlib.pyplot as plt
class ConfLoader:
@ -36,6 +37,7 @@ class LoadData:
def __init__(self, datapath: str) -> None:
# load raw data
self.datapath = datapath
self.file = os.path.join(datapath, "traces-grid1.raw")
self.raw = DataLoader(self.file, 60.0, 0, channel=-1)
self.raw_rate = self.raw.samplerate
@ -53,3 +55,23 @@ class LoadData:
def __str__(self) -> str:
return f"LoadData({self.file})"
def make_outputdir(path: str) -> str:
"""
Creates a new directory where the path leads if it does not already exist.
Parameters
----------
path : string
path to the new output directory
Returns
-------
string
path of the newly created output directory
"""
if not os.path.isdir(path):
os.mkdir(path)
return path

View File

@ -3,8 +3,8 @@ import numpy as np
def bandpass_filter(
data: np.ndarray,
rate: float,
signal: np.ndarray,
samplerate: float,
lowf: float,
highf: float,
) -> np.ndarray:
@ -12,7 +12,7 @@ def bandpass_filter(
Parameters
----------
data : np.ndarray
signal : np.ndarray
The data to be filtered
rate : float
The sampling rate
@ -26,21 +26,22 @@ def bandpass_filter(
np.ndarray
The filtered data
"""
sos = butter(2, (lowf, highf), "bandpass", fs=rate, output="sos")
fdata = sosfiltfilt(sos, data)
return fdata
sos = butter(2, (lowf, highf), "bandpass", fs=samplerate, output="sos")
filtered_signal = sosfiltfilt(sos, signal)
return filtered_signal
def highpass_filter(
data: np.ndarray,
rate: float,
signal: np.ndarray,
samplerate: float,
cutoff: float,
) -> np.ndarray:
"""Highpass filter a signal.
Parameters
----------
data : np.ndarray
signal : np.ndarray
The data to be filtered
rate : float
The sampling rate
@ -52,14 +53,15 @@ def highpass_filter(
np.ndarray
The filtered data
"""
sos = butter(2, cutoff, "highpass", fs=rate, output="sos")
fdata = sosfiltfilt(sos, data)
return fdata
sos = butter(2, cutoff, "highpass", fs=samplerate, output="sos")
filtered_signal = sosfiltfilt(sos, signal)
return filtered_signal
def lowpass_filter(
data: np.ndarray,
rate: float,
signal: np.ndarray,
samplerate: float,
cutoff: float
) -> np.ndarray:
"""Lowpass filter a signal.
@ -78,21 +80,25 @@ def lowpass_filter(
np.ndarray
The filtered data
"""
sos = butter(2, cutoff, "lowpass", fs=rate, output="sos")
fdata = sosfiltfilt(sos, data)
return fdata
sos = butter(2, cutoff, "lowpass", fs=samplerate, output="sos")
filtered_signal = sosfiltfilt(sos, signal)
return filtered_signal
def envelope(data: np.ndarray, rate: float, freq: float) -> np.ndarray:
def envelope(signal: np.ndarray,
samplerate: float,
cutoff_frequency: float
) -> np.ndarray:
"""Calculate the envelope of a signal using a lowpass filter.
Parameters
----------
data : np.ndarray
signal : np.ndarray
The signal to calculate the envelope of
rate : float
samplerate : float
The sampling rate of the signal
freq : float
cutoff_frequency : float
The cutoff frequency of the lowpass filter
Returns
@ -100,6 +106,7 @@ def envelope(data: np.ndarray, rate: float, freq: float) -> np.ndarray:
np.ndarray
The envelope of the signal
"""
sos = butter(2, freq, "lowpass", fs=rate, output="sos")
envelope = np.sqrt(2) * sosfiltfilt(sos, np.abs(data))
sos = butter(2, cutoff_frequency, "lowpass", fs=samplerate, output="sos")
envelope = np.sqrt(2) * sosfiltfilt(sos, np.abs(signal))
return envelope
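
A usage sketch with made-up numbers (samplerate and band edges are placeholders, not values from the recordings) showing how these filters are typically chained:

import numpy as np

samplerate = 20000
t = np.arange(0, 2, 1 / samplerate)
# synthetic 600 Hz carrier with a slow amplitude modulation
raw = (1 + 0.3 * np.sin(2 * np.pi * 2 * t)) * np.sin(2 * np.pi * 600 * t)

band = bandpass_filter(raw, samplerate, lowf=550, highf=650)  # isolate one EOD band
env = envelope(band, samplerate, cutoff_frequency=20)         # slow amplitude envelope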

41
code/modules/logger.py Normal file
View File

@ -0,0 +1,41 @@
import logging
def makeLogger(name: str):
# create logger formats for file and terminal
file_formatter = logging.Formatter(
"[ %(levelname)s ] ~ %(asctime)s ~ %(module)s.%(funcName)s: %(message)s")
console_formatter = logging.Formatter(
"[ %(levelname)s ] in %(module)s.%(funcName)s: %(message)s")
# log warnings and above to a file
file_handler = logging.FileHandler("gridtools_log.log", mode="w")
file_handler.setLevel(logging.WARNING)
file_handler.setFormatter(file_formatter)
# create stream handler for terminal output
console_handler = logging.StreamHandler()
console_handler.setFormatter(console_formatter)
console_handler.setLevel(logging.INFO)
# create script specific logger
logger = logging.getLogger(name)
logger.addHandler(file_handler)
logger.addHandler(console_handler)
logger.setLevel(logging.INFO)
return logger
if __name__ == "__main__":
# initiate logger
mylogger = makeLogger(__name__)
# test logger levels
mylogger.debug("This is for debugging!")
mylogger.info("This is an info.")
mylogger.warning("This is a warning.")
mylogger.error("This is an error.")
mylogger.critical("This is a critical error!")

View File

@ -23,17 +23,21 @@ def PlotStyle() -> None:
sky = "#89dceb"
teal = "#94e2d5"
green = "#a6e3a1"
yellow = "#f9e2af"
orange = "#fab387"
maroon = "#eba0ac"
red = "#f38ba8"
purple = "#cba6f7"
pink = "#f5c2e7"
yellow = "#f9d67f"
orange = "#faa472"
maroon = "#eb8486"
red = "#f37588"
purple = "#d89bf7"
pink = "#f59edb"
lavender = "#b4befe"
gblue1 = "#89b4fa"
gblue2 = "#89dceb"
gblue3 = "#a6e3a1"
@classmethod
def lims(cls, track1, track2):
"""Helper function to get frequency y axis limits from two fundamental frequency tracks.
"""Helper function to get frequency y axis limits from two
fundamental frequency tracks.
Args:
track1 (array): First track
@ -91,12 +95,23 @@ def PlotStyle() -> None:
ax.tick_params(left=False, labelleft=False)
ax.patch.set_visible(False)
@classmethod
def hide_xax(cls, ax):
ax.xaxis.set_visible(False)
ax.spines["bottom"].set_visible(False)
@classmethod
def hide_yax(cls, ax):
ax.yaxis.set_visible(False)
ax.spines["left"].set_visible(False)
@classmethod
def set_boxplot_color(cls, bp, color):
plt.setp(bp["boxes"], color=color)
plt.setp(bp["whiskers"], color=color)
plt.setp(bp["caps"], color=color)
plt.setp(bp["medians"], color=color)
plt.setp(bp["whiskers"], color=white)
plt.setp(bp["caps"], color=white)
plt.setp(bp["medians"], color=white)
@classmethod
def label_subplots(cls, labels, axes, fig):
@ -215,9 +230,9 @@ def PlotStyle() -> None:
plt.rc("legend", fontsize=SMALL_SIZE) # legend fontsize
plt.rc("figure", titlesize=BIGGER_SIZE) # fontsize of the figure title
plt.rcParams["image.cmap"] = 'cmo.haline'
# plt.rcParams["axes.xmargin"] = 0.1
# plt.rcParams["axes.ymargin"] = 0.15
plt.rcParams["image.cmap"] = "cmo.haline"
plt.rcParams["axes.xmargin"] = 0.05
plt.rcParams["axes.ymargin"] = 0.1
plt.rcParams["axes.titlelocation"] = "left"
plt.rcParams["axes.titlesize"] = BIGGER_SIZE
# plt.rcParams["axes.titlepad"] = -10
@ -230,9 +245,9 @@ def PlotStyle() -> None:
plt.rcParams["legend.borderaxespad"] = 0.5
plt.rcParams["legend.fancybox"] = False
# specify the custom font to use
plt.rcParams["font.family"] = "sans-serif"
plt.rcParams["font.sans-serif"] = "Helvetica Now Text"
# # specify the custom font to use
# plt.rcParams["font.family"] = "sans-serif"
# plt.rcParams["font.sans-serif"] = "Helvetica Now Text"
# dark mode modifications
plt.rcParams["boxplot.flierprops.color"] = white
@ -253,25 +268,27 @@ def PlotStyle() -> None:
plt.rcParams["axes.spines.top"] = False
plt.rcParams["axes.spines.right"] = False
plt.rcParams["axes.prop_cycle"] = cycler(
'color', [
'#b4befe',
'#89b4fa',
'#74c7ec',
'#89dceb',
'#94e2d5',
'#a6e3a1',
'#f9e2af',
'#fab387',
'#eba0ac',
'#f38ba8',
'#cba6f7',
'#f5c2e7',
])
"color",
[
"#b4befe",
"#89b4fa",
"#74c7ec",
"#89dceb",
"#94e2d5",
"#a6e3a1",
"#f9e2af",
"#fab387",
"#eba0ac",
"#f38ba8",
"#cba6f7",
"#f5c2e7",
],
)
plt.rcParams["xtick.color"] = gray # color of the ticks
plt.rcParams["ytick.color"] = gray # color of the ticks
plt.rcParams["grid.color"] = dark_gray # grid color
plt.rcParams["figure.facecolor"] = black # figure face color
plt.rcParams["figure.edgecolor"] = "#555169" # figure edge color
plt.rcParams["figure.edgecolor"] = black # figure edge color
plt.rcParams["savefig.facecolor"] = black # figure face color when saving
return style
@ -281,12 +298,11 @@ if __name__ == "__main__":
s = PlotStyle()
import matplotlib.pyplot as plt
import matplotlib.cbook as cbook
import matplotlib.cm as cm
import matplotlib.pyplot as plt
import matplotlib.cbook as cbook
from matplotlib.path import Path
from matplotlib.patches import PathPatch
from matplotlib.path import Path
# Fixing random state for reproducibility
np.random.seed(19680801)
@ -294,14 +310,20 @@ if __name__ == "__main__":
delta = 0.025
x = y = np.arange(-3.0, 3.0, delta)
X, Y = np.meshgrid(x, y)
Z1 = np.exp(-X**2 - Y**2)
Z2 = np.exp(-(X - 1)**2 - (Y - 1)**2)
Z1 = np.exp(-(X**2) - Y**2)
Z2 = np.exp(-((X - 1) ** 2) - (Y - 1) ** 2)
Z = (Z1 - Z2) * 2
fig1, ax = plt.subplots()
im = ax.imshow(Z, interpolation='bilinear', cmap=cm.RdYlGn,
origin='lower', extent=[-3, 3, -3, 3],
vmax=abs(Z).max(), vmin=-abs(Z).max())
im = ax.imshow(
Z,
interpolation="bilinear",
cmap=cm.RdYlGn,
origin="lower",
extent=[-3, 3, -3, 3],
vmax=abs(Z).max(),
vmin=-abs(Z).max(),
)
plt.show()
@ -314,22 +336,21 @@ if __name__ == "__main__":
all_data = [np.random.normal(0, std, 100) for std in range(6, 10)]
# plot violin plot
axs[0].violinplot(all_data,
showmeans=False,
showmedians=True)
axs[0].set_title('Violin plot')
axs[0].violinplot(all_data, showmeans=False, showmedians=True)
axs[0].set_title("Violin plot")
# plot box plot
axs[1].boxplot(all_data)
axs[1].set_title('Box plot')
axs[1].set_title("Box plot")
# adding horizontal grid lines
for ax in axs:
ax.yaxis.grid(True)
ax.set_xticks([y + 1 for y in range(len(all_data))],
labels=['x1', 'x2', 'x3', 'x4'])
ax.set_xlabel('Four separate samples')
ax.set_ylabel('Observed values')
ax.set_xticks(
[y + 1 for y in range(len(all_data))], labels=["x1", "x2", "x3", "x4"]
)
ax.set_xlabel("Four separate samples")
ax.set_ylabel("Observed values")
plt.show()
@ -341,24 +362,42 @@ if __name__ == "__main__":
theta = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
radii = 10 * np.random.rand(N)
width = np.pi / 4 * np.random.rand(N)
colors = cmo.cm.haline(radii / 10.)
colors = cmo.cm.haline(radii / 10.0)
ax = plt.subplot(projection='polar')
ax = plt.subplot(projection="polar")
ax.bar(theta, radii, width=width, bottom=0.0, color=colors, alpha=0.5)
plt.show()
methods = [None, 'none', 'nearest', 'bilinear', 'bicubic', 'spline16',
'spline36', 'hanning', 'hamming', 'hermite', 'kaiser', 'quadric',
'catrom', 'gaussian', 'bessel', 'mitchell', 'sinc', 'lanczos']
methods = [
None,
"none",
"nearest",
"bilinear",
"bicubic",
"spline16",
"spline36",
"hanning",
"hamming",
"hermite",
"kaiser",
"quadric",
"catrom",
"gaussian",
"bessel",
"mitchell",
"sinc",
"lanczos",
]
# Fixing random state for reproducibility
np.random.seed(19680801)
grid = np.random.rand(4, 4)
fig, axs = plt.subplots(nrows=3, ncols=6, figsize=(9, 6),
subplot_kw={'xticks': [], 'yticks': []})
fig, axs = plt.subplots(
nrows=3, ncols=6, figsize=(9, 6), subplot_kw={"xticks": [], "yticks": []}
)
for ax, interp_method in zip(axs.flat, methods):
ax.imshow(grid, interpolation=interp_method)

View File

@ -0,0 +1,277 @@
import os
import numpy as np
import matplotlib.pyplot as plt
from extract_chirps import get_valid_datasets
from thunderfish.powerspectrum import decibel
from IPython import embed
from pandas import read_csv
from modules.logger import makeLogger
from modules.plotstyle import PlotStyle
from modules.behaviour_handling import Behavior, correct_chasing_events
ps = PlotStyle()
logger = makeLogger(__name__)
def get_chirp_winner_loser(folder_name, Behavior, order_meta_df):
foldername = folder_name.split('/')[-2]
winner_row = order_meta_df[order_meta_df['recording'] == foldername]
winner = winner_row['winner'].values[0].astype(int)
winner_fish1 = winner_row['fish1'].values[0].astype(int)
winner_fish2 = winner_row['fish2'].values[0].astype(int)
if winner > 0:
if winner == winner_fish1:
winner_fish_id = winner_row['rec_id1'].values[0]
loser_fish_id = winner_row['rec_id2'].values[0]
elif winner == winner_fish2:
winner_fish_id = winner_row['rec_id2'].values[0]
loser_fish_id = winner_row['rec_id1'].values[0]
chirp_winner = len(
Behavior.chirps[Behavior.chirps_ids == winner_fish_id])
chirp_loser = len(
Behavior.chirps[Behavior.chirps_ids == loser_fish_id])
return chirp_winner, chirp_loser
else:
return np.nan, np.nan
def get_chirp_size(folder_name, Behavior, order_meta_df, id_meta_df):
foldername = folder_name.split('/')[-2]
folder_row = order_meta_df[order_meta_df['recording'] == foldername]
fish1 = folder_row['fish1'].values[0].astype(int)
fish2 = folder_row['fish2'].values[0].astype(int)
group = folder_row['group'].values[0].astype(int)
size_fish1_row = id_meta_df[(id_meta_df['group'] == group) & (
id_meta_df['fish'] == fish1)]
size_fish2_row = id_meta_df[(id_meta_df['group'] == group) & (
id_meta_df['fish'] == fish2)]
size_winners = [size_fish1_row[col].values[0]
for col in ['l1', 'l2', 'l3']]
mean_size_winner = np.nanmean(size_winners)
size_losers = [size_fish2_row[col].values[0] for col in ['l1', 'l2', 'l3']]
mean_size_loser = np.nanmean(size_losers)
if mean_size_winner > mean_size_loser:
size_diff = mean_size_winner - mean_size_loser
winner_fish_id = folder_row['rec_id1'].values[0]
loser_fish_id = folder_row['rec_id2'].values[0]
elif mean_size_winner < mean_size_loser:
size_diff = mean_size_loser - mean_size_winner
winner_fish_id = folder_row['rec_id2'].values[0]
loser_fish_id = folder_row['rec_id1'].values[0]
else:
size_diff = np.nan
winner_fish_id = np.nan
loser_fish_id = np.nan
chirp_diff = len(Behavior.chirps[Behavior.chirps_ids == winner_fish_id]) - len(
Behavior.chirps[Behavior.chirps_ids == loser_fish_id])
return size_diff, chirp_diff
def get_chirp_freq(folder_name, Behavior, order_meta_df):
foldername = folder_name.split('/')[-2]
folder_row = order_meta_df[order_meta_df['recording'] == foldername]
fish1 = folder_row['rec_id1'].values[0].astype(int)
fish2 = folder_row['rec_id2'].values[0].astype(int)
chirp_freq_fish1 = np.nanmedian(
Behavior.freq[Behavior.ident == fish1])
chirp_freq_fish2 = np.nanmedian(
Behavior.freq[Behavior.ident == fish2])
if chirp_freq_fish1 > chirp_freq_fish2:
freq_diff = chirp_freq_fish1 - chirp_freq_fish2
winner_fish_id = folder_row['rec_id1'].values[0]
loser_fish_id = folder_row['rec_id2'].values[0]
elif chirp_freq_fish1 < chirp_freq_fish2:
freq_diff = chirp_freq_fish2 - chirp_freq_fish1
winner_fish_id = folder_row['rec_id2'].values[0]
loser_fish_id = folder_row['rec_id1'].values[0]
chirp_diff = len(Behavior.chirps[Behavior.chirps_ids == winner_fish_id]) - len(
Behavior.chirps[Behavior.chirps_ids == loser_fish_id])
return freq_diff, chirp_diff
def main(datapath: str):
foldernames = [
datapath + x + '/' for x in os.listdir(datapath) if os.path.isdir(datapath+x)]
foldernames, _ = get_valid_datasets(datapath)
path_order_meta = (
'/').join(foldernames[0].split('/')[:-2]) + '/order_meta.csv'
order_meta_df = read_csv(path_order_meta)
order_meta_df['recording'] = order_meta_df['recording'].str[1:-1]
path_id_meta = (
'/').join(foldernames[0].split('/')[:-2]) + '/id_meta.csv'
id_meta_df = read_csv(path_id_meta)
chirps_winner = []
size_diffs = []
size_chirps_diffs = []
chirps_loser = []
freq_diffs = []
freq_chirps_diffs = []
for foldername in foldernames:
# behavior is a pandas dataframe with all the data
if foldername == '../data/mount_data/2020-05-12-10_00/':
continue
bh = Behavior(foldername)
# chirps are not sorted in time (presumably due to prior groupings)
# get and sort chirps and corresponding fish_ids of the chirps
category = bh.behavior
timestamps = bh.start_s
# Correct for doubles in chasing on- and offsets to get the right on-/offset pairs
# Get rid of tracking faults (two onsets or two offsets after another)
category, timestamps = correct_chasing_events(category, timestamps)
# winner_chirp, loser_chirp = get_chirp_winner_loser(
# foldername, bh, order_meta_df)
# chirps_winner.append(winner_chirp)
# chirps_loser.append(loser_chirp)
# size_diff, chirp_diff = get_chirp_size(
# foldername, bh, order_meta_df, id_meta_df)
# size_diffs.append(size_diff)
# size_chirps_diffs.append(chirp_diff)
# freq_diff, freq_chirps_diff = get_chirp_freq(
# foldername, bh, order_meta_df)
# freq_diffs.append(freq_diff)
# freq_chirps_diffs.append(freq_chirps_diff)
folder_name = foldername.split('/')[-2]
winner_row = order_meta_df[order_meta_df['recording'] == folder_name]
winner = winner_row['winner'].values[0].astype(int)
winner_fish1 = winner_row['fish1'].values[0].astype(int)
winner_fish2 = winner_row['fish2'].values[0].astype(int)
group = winner_row['group'].values[0].astype(int)
size_rows = id_meta_df[id_meta_df['group'] == group]
if winner == winner_fish1:
winner_fish_id = winner_row['rec_id1'].values[0]
loser_fish_id = winner_row['rec_id2'].values[0]
size_winners = []
for l in ['l1', 'l2', 'l3']:
size_winner = size_rows[size_rows['fish']
== winner_fish1][l].values[0]
size_winners.append(size_winner)
mean_size_winner = np.nanmean(size_winners)
size_losers = []
for l in ['l1', 'l2', 'l3']:
size_loser = size_rows[size_rows['fish']
== winner_fish2][l].values[0]
size_losers.append(size_loser)
mean_size_loser = np.nanmean(size_losers)
size_diffs.append(mean_size_winner - mean_size_loser)
elif winner == winner_fish2:
winner_fish_id = winner_row['rec_id2'].values[0]
loser_fish_id = winner_row['rec_id1'].values[0]
size_winners = []
for l in ['l1', 'l2', 'l3']:
size_winner = size_rows[size_rows['fish']
== winner_fish2][l].values[0]
size_winners.append(size_winner)
mean_size_winner = np.nanmean(size_winners)
size_losers = []
for l in ['l1', 'l2', 'l3']:
size_loser = size_rows[size_rows['fish']
== winner_fish1][l].values[0]
size_losers.append(size_loser)
mean_size_loser = np.nanmean(size_losers)
size_diffs.append(mean_size_winner - mean_size_loser)
else:
continue
print(foldername)
all_fish_ids = np.unique(bh.chirps_ids)
chirp_winner = len(bh.chirps[bh.chirps_ids == winner_fish_id])
chirp_loser = len(bh.chirps[bh.chirps_ids == loser_fish_id])
freq_winner = np.nanmedian(bh.freq[bh.ident == winner_fish_id])
freq_loser = np.nanmedian(bh.freq[bh.ident == loser_fish_id])
chirps_winner.append(chirp_winner)
chirps_loser.append(chirp_loser)
size_chirps_diffs.append(chirp_winner - chirp_loser)
freq_diffs.append(freq_winner - freq_loser)
fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(
22*ps.cm, 12*ps.cm), width_ratios=[1.5, 1, 1])
plt.subplots_adjust(left=0.098, right=0.945, top=0.94, wspace=0.343)
scatterwinner = 1.15
scatterloser = 1.85
chirps_winner = np.asarray(chirps_winner)[~np.isnan(chirps_winner)]
chirps_loser = np.asarray(chirps_loser)[~np.isnan(chirps_loser)]
bplot1 = ax1.boxplot(chirps_winner, positions=[
1], showfliers=False, patch_artist=True)
bplot2 = ax1.boxplot(chirps_loser, positions=[
2], showfliers=False, patch_artist=True)
ax1.scatter(np.ones(len(chirps_winner)) *
scatterwinner, chirps_winner, color='r')
ax1.scatter(np.ones(len(chirps_loser)) *
scatterloser, chirps_loser, color='r')
ax1.set_xticklabels(['winner', 'loser'])
ax1.text(0.1, 0.9, f'n = {len(chirps_winner)}',
transform=ax1.transAxes, color=ps.white)
for w, l in zip(chirps_winner, chirps_loser):
ax1.plot([scatterwinner, scatterloser], [w, l],
color='r', alpha=0.5, linewidth=0.5)
ax1.set_ylabel('Chirps [n]', color=ps.white)
colors1 = ps.red
ps.set_boxplot_color(bplot1, colors1)
colors1 = ps.orange
ps.set_boxplot_color(bplot2, colors1)
ax2.scatter(size_diffs, size_chirps_diffs, color='r')
ax2.set_xlabel('Size difference [mm]')
ax2.set_ylabel('Chirps [n]')
ax3.scatter(freq_diffs, size_chirps_diffs, color='r')
# ax3.scatter(freq_diffs, freq_chirps_diffs, color='r')
ax3.set_xlabel('Frequency difference [Hz]')
ax3.set_yticklabels([])
plt.savefig('../poster/figs/chirps_winner_loser.pdf')
plt.show()
if __name__ == '__main__':
# Path to the data
datapath = '../data/mount_data/'
main(datapath)

307
code/plot_chirp_size.py Normal file
View File

@ -0,0 +1,307 @@
import os
import numpy as np
import matplotlib.pyplot as plt
from extract_chirps import get_valid_datasets
from scipy.stats import pearsonr, spearmanr, wilcoxon
from thunderfish.powerspectrum import decibel
from IPython import embed
from pandas import read_csv
from modules.logger import makeLogger
from modules.plotstyle import PlotStyle
from modules.behaviour_handling import Behavior, correct_chasing_events
ps = PlotStyle()
logger = makeLogger(__name__)
def get_chirp_winner_loser(folder_name, Behavior, order_meta_df):
foldername = folder_name.split('/')[-2]
winner_row = order_meta_df[order_meta_df['recording'] == foldername]
winner = winner_row['winner'].values[0].astype(int)
winner_fish1 = winner_row['fish1'].values[0].astype(int)
winner_fish2 = winner_row['fish2'].values[0].astype(int)
if winner > 0:
if winner == winner_fish1:
winner_fish_id = winner_row['rec_id1'].values[0]
loser_fish_id = winner_row['rec_id2'].values[0]
elif winner == winner_fish2:
winner_fish_id = winner_row['rec_id2'].values[0]
loser_fish_id = winner_row['rec_id1'].values[0]
chirp_winner = len(
Behavior.chirps[Behavior.chirps_ids == winner_fish_id])
chirp_loser = len(
Behavior.chirps[Behavior.chirps_ids == loser_fish_id])
return chirp_winner, chirp_loser
else:
return np.nan, np.nan
def get_chirp_size(folder_name, Behavior, order_meta_df, id_meta_df):
foldername = folder_name.split('/')[-2]
folder_row = order_meta_df[order_meta_df['recording'] == foldername]
fish1 = folder_row['fish1'].values[0].astype(int)
fish2 = folder_row['fish2'].values[0].astype(int)
winner = folder_row['winner'].values[0].astype(int)
group = folder_row['group'].values[0].astype(int)
size_fish1_row = id_meta_df[(id_meta_df['group'] == group) & (
id_meta_df['fish'] == fish1)]
size_fish2_row = id_meta_df[(id_meta_df['group'] == group) & (
id_meta_df['fish'] == fish2)]
size_winners = [size_fish1_row[col].values[0]
for col in ['l1', 'l2', 'l3']]
size_fish1 = np.nanmean(size_winners)
size_losers = [size_fish2_row[col].values[0] for col in ['l1', 'l2', 'l3']]
size_fish2 = np.nanmean(size_losers)
if winner == fish1:
if size_fish1 > size_fish2:
size_diff_bigger = size_fish1 - size_fish2
size_diff_smaller = size_fish2 - size_fish1
elif size_fish1 < size_fish2:
size_diff_bigger = size_fish1 - size_fish2
size_diff_smaller = size_fish2 - size_fish1
else:
size_diff_bigger = np.nan
size_diff_smaller = np.nan
winner_fish_id = np.nan
loser_fish_id = np.nan
return size_diff_bigger, size_diff_smaller, winner_fish_id, loser_fish_id
winner_fish_id = folder_row['rec_id1'].values[0]
loser_fish_id = folder_row['rec_id2'].values[0]
elif winner == fish2:
if size_fish2 > size_fish1:
size_diff_bigger = size_fish2 - size_fish1
size_diff_smaller = size_fish1 - size_fish2
elif size_fish2 < size_fish1:
size_diff_bigger = size_fish2 - size_fish1
size_diff_smaller = size_fish1 - size_fish2
else:
size_diff_bigger = np.nan
size_diff_smaller = np.nan
winner_fish_id = np.nan
loser_fish_id = np.nan
return size_diff_bigger, size_diff_smaller, winner_fish_id, loser_fish_id
winner_fish_id = folder_row['rec_id2'].values[0]
loser_fish_id = folder_row['rec_id1'].values[0]
else:
size_diff_bigger = np.nan
size_diff_smaller = np.nan
winner_fish_id = np.nan
loser_fish_id = np.nan
return size_diff_bigger, size_diff_smaller, winner_fish_id, loser_fish_id
chirp_winner = len(
Behavior.chirps[Behavior.chirps_ids == winner_fish_id])
chirp_loser = len(
Behavior.chirps[Behavior.chirps_ids == loser_fish_id])
return size_diff_bigger, chirp_winner, size_diff_smaller, chirp_loser
def get_chirp_freq(folder_name, Behavior, order_meta_df):
foldername = folder_name.split('/')[-2]
folder_row = order_meta_df[order_meta_df['recording'] == foldername]
fish1 = folder_row['fish1'].values[0].astype(int)
fish2 = folder_row['fish2'].values[0].astype(int)
fish1_freq = folder_row['rec_id1'].values[0].astype(int)
fish2_freq = folder_row['rec_id2'].values[0].astype(int)
winner = folder_row['winner'].values[0].astype(int)
chirp_freq_fish1 = np.nanmedian(
Behavior.freq[Behavior.ident == fish1_freq])
chirp_freq_fish2 = np.nanmedian(
Behavior.freq[Behavior.ident == fish2_freq])
if winner == fish1:
if chirp_freq_fish1 > chirp_freq_fish2:
freq_diff_higher = chirp_freq_fish1 - chirp_freq_fish2
freq_diff_lower = chirp_freq_fish2 - chirp_freq_fish1
elif chirp_freq_fish1 < chirp_freq_fish2:
freq_diff_higher = chirp_freq_fish1 - chirp_freq_fish2
freq_diff_lower = chirp_freq_fish2 - chirp_freq_fish1
else:
freq_diff_higher = np.nan
freq_diff_lower = np.nan
winner_fish_id = np.nan
loser_fish_id = np.nan
winner_fish_id = folder_row['rec_id1'].values[0]
loser_fish_id = folder_row['rec_id2'].values[0]
elif winner == fish2:
if chirp_freq_fish2 > chirp_freq_fish1:
freq_diff_higher = chirp_freq_fish2 - chirp_freq_fish1
freq_diff_lower = chirp_freq_fish1 - chirp_freq_fish2
elif chirp_freq_fish2 < chirp_freq_fish1:
freq_diff_higher = chirp_freq_fish2 - chirp_freq_fish1
freq_diff_lower = chirp_freq_fish1 - chirp_freq_fish2
else:
freq_diff_higher = np.nan
freq_diff_lower = np.nan
winner_fish_id = np.nan
loser_fish_id = np.nan
winner_fish_id = folder_row['rec_id2'].values[0]
loser_fish_id = folder_row['rec_id1'].values[0]
else:
freq_diff_higher = np.nan
freq_diff_lower = np.nan
winner_fish_id = np.nan
loser_fish_id = np.nan
chirp_winner = len(
Behavior.chirps[Behavior.chirps_ids == winner_fish_id])
chirp_loser = len(
Behavior.chirps[Behavior.chirps_ids == loser_fish_id])
return freq_diff_higher, chirp_winner, freq_diff_lower, chirp_loser
def main(datapath: str):
foldernames = [
datapath + x + '/' for x in os.listdir(datapath) if os.path.isdir(datapath+x)]
foldernames, _ = get_valid_datasets(datapath)
path_order_meta = (
'/').join(foldernames[0].split('/')[:-2]) + '/order_meta.csv'
order_meta_df = read_csv(path_order_meta)
order_meta_df['recording'] = order_meta_df['recording'].str[1:-1]
path_id_meta = (
'/').join(foldernames[0].split('/')[:-2]) + '/id_meta.csv'
id_meta_df = read_csv(path_id_meta)
chirps_winner = []
size_diffs_winner = []
size_diffs_loser = []
size_chirps_winner = []
size_chirps_loser = []
freq_diffs_higher = []
freq_diffs_lower = []
freq_chirps_winner = []
freq_chirps_loser = []
chirps_loser = []
freq_diffs = []
freq_chirps_diffs = []
for foldername in foldernames:
# behavior is a pandas dataframe with all the data
if foldername == '../data/mount_data/2020-05-12-10_00/':
continue
bh = Behavior(foldername)
# chirps are not sorted in time (presumably due to prior groupings)
# get and sort chirps and corresponding fish_ids of the chirps
category = bh.behavior
timestamps = bh.start_s
# Correct for doubles in chasing on- and offsets to get the right on-/offset pairs
# Get rid of tracking faults (two onsets or two offsets after another)
category, timestamps = correct_chasing_events(category, timestamps)
winner_chirp, loser_chirp = get_chirp_winner_loser(
foldername, bh, order_meta_df)
chirps_winner.append(winner_chirp)
chirps_loser.append(loser_chirp)
size_diff_bigger, chirp_winner, size_diff_smaller, chirp_loser = get_chirp_size(
foldername, bh, order_meta_df, id_meta_df)
freq_diff_higher, chirp_freq_winner, freq_diff_lower, chirp_freq_loser = get_chirp_freq(
foldername, bh, order_meta_df)
freq_diffs_higher.append(freq_diff_higher)
freq_diffs_lower.append(freq_diff_lower)
freq_chirps_winner.append(chirp_freq_winner)
freq_chirps_loser.append(chirp_freq_loser)
if np.isnan(size_diff_bigger):
continue
size_diffs_winner.append(size_diff_bigger)
size_diffs_loser.append(size_diff_smaller)
size_chirps_winner.append(chirp_winner)
size_chirps_loser.append(chirp_loser)
size_winner_pearsonr = pearsonr(size_diffs_winner, size_chirps_winner)
size_loser_pearsonr = pearsonr(size_diffs_loser, size_chirps_loser)
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(
13*ps.cm, 10*ps.cm), sharey=True)
plt.subplots_adjust(left=0.098, right=0.945, top=0.94, wspace=0.343)
scatterwinner = 1.15
scatterloser = 1.85
chirps_winner = np.asarray(chirps_winner)[~np.isnan(chirps_winner)]
chirps_loser = np.asarray(chirps_loser)[~np.isnan(chirps_loser)]
stat = wilcoxon(chirps_winner, chirps_loser)
print(stat)
bplot1 = ax1.boxplot(chirps_winner, positions=[
0.9], showfliers=False, patch_artist=True)
bplot2 = ax1.boxplot(chirps_loser, positions=[
2.1], showfliers=False, patch_artist=True)
ax1.scatter(np.ones(len(chirps_winner)) *
scatterwinner, chirps_winner, color=ps.red)
ax1.scatter(np.ones(len(chirps_loser)) *
scatterloser, chirps_loser, color=ps.orange)
ax1.set_xticklabels(['winner', 'loser'])
ax1.text(0.1, 0.9, f'n = {len(chirps_winner)}',
transform=ax1.transAxes, color=ps.white)
for w, l in zip(chirps_winner, chirps_loser):
ax1.plot([scatterwinner, scatterloser], [w, l],
color=ps.white, alpha=1, linewidth=0.5)
ax1.set_ylabel('chirps [n]', color=ps.white)
ax1.set_xlabel('outcome', color=ps.white)
colors1 = ps.red
ps.set_boxplot_color(bplot1, colors1)
colors1 = ps.orange
ps.set_boxplot_color(bplot2, colors1)
ax2.scatter(size_diffs_winner, size_chirps_winner,
color=ps.red, label='winner')
ax2.scatter(size_diffs_loser, size_chirps_loser,
color=ps.orange, label='loser')
ax2.set_xlabel('size difference [cm]')
# ax2.set_xticks(np.arange(-10, 10.1, 2))
handles, labels = ax2.get_legend_handles_labels()
fig.legend(handles, labels, loc='upper center', ncol=2)
plt.subplots_adjust(left=0.162, right=0.97, top=0.85, bottom=0.176)
# pearson r
plt.savefig('../poster/figs/chirps_winner_loser.pdf')
plt.show()
if __name__ == '__main__':
# Path to the data
datapath = '../data/mount_data/'
main(datapath)

View File

@ -0,0 +1,81 @@
import os
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import pearsonr, spearmanr
from thunderfish.powerspectrum import decibel
from IPython import embed
from pandas import read_csv
from modules.logger import makeLogger
from modules.plotstyle import PlotStyle
from modules.behaviour_handling import Behavior, correct_chasing_events
from modules.datahandling import flatten
ps = PlotStyle()
logger = makeLogger(__name__)
def main(datapath: str):
foldernames = [
datapath + x + '/' for x in os.listdir(datapath) if os.path.isdir(datapath+x)]
time_precents = []
chirps_percents = []
for foldername in foldernames:
# behavior is a pandas dataframe with all the data
if foldername == '../data/mount_data/2020-05-12-10_00/':
continue
bh = Behavior(foldername)
category = bh.behavior
timestamps = bh.start_s
# Correct for doubles in chasing on- and offsets to get the right on-/offset pairs
# Get rid of tracking faults (two onsets or two offsets after another)
category, timestamps = correct_chasing_events(category, timestamps)
chasing_onset = timestamps[category == 0]
chasing_offset = timestamps[category == 1]
if len(chasing_onset) != len(chasing_offset):
embed()
chirps_in_chasings = []
for onset, offset in zip(chasing_onset, chasing_offset):
chirps_in_chasing = [c for c in bh.chirps if (c > onset) & (c < offset)]
chirps_in_chasings.append(chirps_in_chasing)
try:
time_chasing = np.sum(chasing_offset[chasing_offset<3*60*60] - chasing_onset[chasing_onset<3*60*60])
except:
time_chasing = np.sum(chasing_offset[chasing_offset<3*60*60] - chasing_onset[chasing_onset<3*60*60][:-1])
time_chasing_percent = (time_chasing/(3*60*60))*100
chirps_chasing = np.asarray(flatten(chirps_in_chasings))
chirps_chasing_new = chirps_chasing[chirps_chasing<3*60*60]
chirps_percent = (len(chirps_chasing_new)/len(bh.chirps))*100
time_precents.append(time_chasing_percent)
chirps_percents.append(chirps_percent)
fig, ax = plt.subplots(1, 1, figsize=(14*ps.cm, 10*ps.cm))
ax.boxplot([time_precents, chirps_percents])
ax.set_xticklabels(['Time Chasing', 'Chirps in Chasing'])
ax.set_ylabel('Percent')
ax.scatter(np.ones(len(time_precents))*1.25, time_precents, color=ps.white)
ax.scatter(np.ones(len(chirps_percents))*1.75, chirps_percents, color=ps.white)
plt.savefig('../poster/figs/chirps_in_chasing.pdf')
plt.show()
if __name__ == '__main__':
# Path to the data
datapath = '../data/mount_data/'
main(datapath)

123
code/plot_event_timeline.py Normal file
View File

@ -0,0 +1,123 @@
import os
import numpy as np
import matplotlib.pyplot as plt
from thunderfish.powerspectrum import decibel
from IPython import embed
from pandas import read_csv
from modules.logger import makeLogger
from modules.plotstyle import PlotStyle
from modules.behaviour_handling import Behavior, correct_chasing_events
from extract_chirps import get_valid_datasets
ps = PlotStyle()
logger = makeLogger(__name__)
def main(datapath: str):
foldernames = [
datapath + x + '/' for x in os.listdir(datapath) if os.path.isdir(datapath+x)]
foldernames, _ = get_valid_datasets(datapath)
for foldername in foldernames[1:2]:
# foldername = foldernames[0]
if foldername == '../data/mount_data/2020-05-12-10_00/':
continue
# behavior is a pandas dataframe with all the data
bh = Behavior(foldername)
# 2020-06-11-10
category = bh.behavior
timestamps = bh.start_s
# Correct for doubles in chasing on- and offsets to get the right on-/offset pairs
# Get rid of tracking faults (two onsets or two offsets after another)
category, timestamps = correct_chasing_events(category, timestamps)
# split categories
chasing_onset = (timestamps[category == 0] / 60) / 60
chasing_offset = (timestamps[category == 1] / 60) / 60
physical_contact = (timestamps[category == 2] / 60) / 60
all_fish_ids = np.unique(bh.chirps_ids)
fish1_id = all_fish_ids[0]
fish2_id = all_fish_ids[1]
# Associate chirps to individual fish
fish1 = (bh.chirps[bh.chirps_ids == fish1_id] / 60) / 60
fish2 = (bh.chirps[bh.chirps_ids == fish2_id] / 60) / 60
fish1_color = ps.purple
fish2_color = ps.lavender
fig, ax = plt.subplots(5, 1, figsize=(
21*ps.cm, 10*ps.cm), height_ratios=[0.5, 0.5, 0.5, 0.2, 6], sharex=True)
# marker size
s = 80
ax[0].scatter(physical_contact, np.ones(
len(physical_contact)), color=ps.maroon, marker='|', s=s)
ax[1].scatter(chasing_onset, np.ones(len(chasing_onset)),
color=ps.orange, marker='|', s=s)
ax[2].scatter(fish1, np.ones(len(fish1))-0.25,
color=fish1_color, marker='|', s=s)
ax[2].scatter(fish2, np.zeros(len(fish2))+0.25,
color=fish2_color, marker='|', s=s)
freq_temp = bh.freq[bh.ident == fish1_id]
time_temp = bh.time[bh.idx[bh.ident == fish1_id]]
ax[4].plot((time_temp / 60) / 60, freq_temp, color=fish1_color)
freq_temp = bh.freq[bh.ident == fish2_id]
time_temp = bh.time[bh.idx[bh.ident == fish2_id]]
ax[4].plot((time_temp / 60) / 60, freq_temp, color=fish2_color)
# ax[3].imshow(decibel(bh.spec), extent=[bh.time[0]/60/60, bh.time[-1]/60/60, 0, 2000], aspect='auto', origin='lower')
# Hide grid lines
ax[0].grid(False)
ax[0].set_frame_on(False)
ax[0].set_xticks([])
ax[0].set_yticks([])
ps.hide_ax(ax[0])
ax[0].yaxis.set_label_coords(-0.1, 0.5)
ax[1].grid(False)
ax[1].set_frame_on(False)
ax[1].set_xticks([])
ax[1].set_yticks([])
ps.hide_ax(ax[1])
ax[2].grid(False)
ax[2].set_frame_on(False)
ax[2].set_yticks([])
ax[2].set_xticks([])
ps.hide_ax(ax[2])
ax[4].axvspan(3, 6, 0, 5, facecolor='grey', alpha=0.5)
ax[4].set_xticks(np.arange(0, 6.1, 0.5))
ps.hide_ax(ax[3])
labelpad = 30
fsize = 12
ax[0].set_ylabel('contact', rotation=0,
labelpad=labelpad, fontsize=fsize)
ax[1].set_ylabel('chasing', rotation=0,
labelpad=labelpad, fontsize=fsize)
ax[2].set_ylabel('chirps', rotation=0,
labelpad=labelpad, fontsize=fsize)
ax[4].set_ylabel('EODf')
ax[4].set_xlabel('time [h]')
# ax[0].set_title(foldername.split('/')[-2])
# 2020-03-31-9_59
plt.subplots_adjust(left=0.158, right=0.987, top=0.918)
# plt.savefig('../poster/figs/timeline.pdf')
plt.show()
# plot chirps
if __name__ == '__main__':
# Path to the data
datapath = '../data/mount_data/'
main(datapath)

View File

@ -0,0 +1,121 @@
import numpy as np
import matplotlib.pyplot as plt
from thunderfish.powerspectrum import spectrogram, decibel
from modules.filehandling import LoadData
from modules.datahandling import instantaneous_frequency
from modules.filters import bandpass_filter
from modules.plotstyle import PlotStyle
ps = PlotStyle()
def main():
# Load data
datapath = "../data/2022-06-02-10_00/"
data = LoadData(datapath)
# good chirp times for data: 2022-06-02-10_00
window_start_seconds = 3 * 60 * 60 + 6 * 60 + 43.5 + 9 + 6.25
window_start_index = int(window_start_seconds * data.raw_rate)  # sample index must be an integer
window_duration_seconds = 0.2
window_duration_index = int(window_duration_seconds * data.raw_rate)
timescaler = 1000
raw = data.raw[window_start_index:window_start_index +
window_duration_index, 10]
fig, (ax1, ax2, ax3) = plt.subplots(
3, 1, figsize=(12 * ps.cm, 10*ps.cm), sharex=True, sharey=True)
# plot instantaneous frequency
filtered1 = bandpass_filter(
signal=raw, lowf=750, highf=1200, samplerate=data.raw_rate)
filtered2 = bandpass_filter(
signal=raw, lowf=550, highf=700, samplerate=data.raw_rate)
freqtime1, freq1 = instantaneous_frequency(
filtered1, data.raw_rate, smoothing_window=3)
freqtime2, freq2 = instantaneous_frequency(
filtered2, data.raw_rate, smoothing_window=3)
ax1.plot(freqtime1*timescaler, freq1, color=ps.red,
lw=2, label=f"fish 1, {np.median(freq1):.0f} Hz")
ax1.plot(freqtime2*timescaler, freq2, color=ps.orange,
lw=2, label=f"fish 2, {np.median(freq2):.0f} Hz")
ax1.legend(bbox_to_anchor=(0, 1.02, 1, 0.2), loc="lower center",
mode="normal", borderaxespad=0, ncol=2)
ps.hide_xax(ax1)
# plot fine spectrogram
spec_power, spec_freqs, spec_times = spectrogram(
raw,
ratetime=data.raw_rate,
freq_resolution=150,
overlap_frac=0.2,
)
ylims = [300, 1200]
fmask = np.zeros(spec_freqs.shape, dtype=bool)
fmask[(spec_freqs > ylims[0]) & (spec_freqs < ylims[1])] = True
ax2.imshow(
decibel(spec_power[fmask, :]),
extent=[
spec_times[0]*timescaler,
spec_times[-1]*timescaler,
spec_freqs[fmask][0],
spec_freqs[fmask][-1],
],
aspect="auto",
origin="lower",
interpolation="gaussian",
alpha=1,
)
ps.hide_xax(ax2)
# plot coarse spectrogram
spec_power, spec_freqs, spec_times = spectrogram(
raw,
ratetime=data.raw_rate,
freq_resolution=10,
overlap_frac=0.3,
)
fmask = np.zeros(spec_freqs.shape, dtype=bool)
fmask[(spec_freqs > ylims[0]) & (spec_freqs < ylims[1])] = True
ax3.imshow(
decibel(spec_power[fmask, :]),
extent=[
spec_times[0]*timescaler,
spec_times[-1]*timescaler,
spec_freqs[fmask][0],
spec_freqs[fmask][-1],
],
aspect="auto",
origin="lower",
interpolation="gaussian",
alpha=1,
)
# ps.hide_xax(ax3)
ax3.set_xlabel("time [ms]")
ax2.set_ylabel("frequency [Hz]")
ax1.set_yticks(np.arange(400, 1201, 400))
ax1.spines.left.set_bounds((400, 1200))
ax2.set_yticks(np.arange(400, 1201, 400))
ax2.spines.left.set_bounds((400, 1200))
ax3.set_yticks(np.arange(400, 1201, 400))
ax3.spines.left.set_bounds((400, 1200))
plt.subplots_adjust(left=0.17, right=0.98, top=0.9,
bottom=0.14, hspace=0.35)
plt.savefig('../poster/figs/introplot.pdf')
plt.show()
if __name__ == '__main__':
main()

471
code/plot_kdes.py Normal file
View File

@ -0,0 +1,471 @@
from extract_chirps import get_valid_datasets
import os
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from tqdm import tqdm
from IPython import embed
from pandas import read_csv
from modules.logger import makeLogger
from modules.datahandling import flatten, causal_kde1d, acausal_kde1d
from modules.behaviour_handling import (
Behavior, correct_chasing_events, center_chirps)
from modules.plotstyle import PlotStyle
logger = makeLogger(__name__)
ps = PlotStyle()
def bootstrap(data, nresamples, kde_time, kernel_width, event_times, time_before, time_after):
bootstrapped_kdes = []
data = data[data <= 3*60*60] # only night time
# diff_data = np.diff(np.sort(data), prepend=0)
# if len(data) != 0:
# mean_chirprate = (len(data) - 1) / (data[-1] - data[0])
for i in tqdm(range(nresamples)):
# np.random.shuffle(diff_data)
# bootstrapped_data = np.cumsum(diff_data)
bootstrapped_data = data + np.random.randn(len(data)) * 10
bootstrap_data_centered = center_chirps(
bootstrapped_data, event_times, time_before, time_after)
bootstrapped_kde = acausal_kde1d(
bootstrap_data_centered, time=kde_time, width=kernel_width)
# bootstrapped_kdes = list(np.asarray(
# bootstrapped_kdes) / len(event_times))
bootstrapped_kdes.append(bootstrapped_kde)
return bootstrapped_kdes
def jackknife(data, nresamples, subsetsize, kde_time, kernel_width, event_times, time_before, time_after):
jackknife_kdes = []
data = data[data <= 3*60*60]  # only night time
subsetsize = int(len(data) * subsetsize)
for i in tqdm(range(nresamples)):
# draw a random subset of the chirp times without replacement
# (np.random.sample does not take these arguments; np.random.choice does)
subset = np.random.choice(data, subsetsize, replace=False)
subset_centered = center_chirps(
subset, event_times, time_before, time_after)
subset_kde = acausal_kde1d(
subset_centered, time=kde_time, width=kernel_width)
jackknife_kdes.append(subset_kde)
return jackknife_kdes
def get_chirp_winner_loser(folder_name, Behavior, order_meta_df):
foldername = folder_name.split('/')[-2]
winner_row = order_meta_df[order_meta_df['recording'] == foldername]
winner = winner_row['winner'].values[0].astype(int)
winner_fish1 = winner_row['fish1'].values[0].astype(int)
winner_fish2 = winner_row['fish2'].values[0].astype(int)
if winner > 0:
if winner == winner_fish1:
winner_fish_id = winner_row['rec_id1'].values[0]
loser_fish_id = winner_row['rec_id2'].values[0]
elif winner == winner_fish2:
winner_fish_id = winner_row['rec_id2'].values[0]
loser_fish_id = winner_row['rec_id1'].values[0]
chirp_winner = Behavior.chirps[Behavior.chirps_ids == winner_fish_id]
chirp_loser = Behavior.chirps[Behavior.chirps_ids == loser_fish_id]
return chirp_winner, chirp_loser
return None, None
def main(dataroot):
foldernames, _ = get_valid_datasets(dataroot)
plot_all = True
time_before = 60
time_after = 60
dt = 0.001
kernel_width = 1
kde_time = np.arange(-time_before, time_after, dt)
nbootstraps = 2
meta_path = (
'/').join(foldernames[0].split('/')[:-2]) + '/order_meta.csv'
meta = pd.read_csv(meta_path)
meta['recording'] = meta['recording'].str[1:-1]
winner_onsets = []
winner_offsets = []
winner_physicals = []
loser_onsets = []
loser_offsets = []
loser_physicals = []
winner_onsets_boot = []
winner_offsets_boot = []
winner_physicals_boot = []
loser_onsets_boot = []
loser_offsets_boot = []
loser_physicals_boot = []
onset_count = 0
offset_count = 0
physical_count = 0
# Iterate over all recordings and save chirp- and event-timestamps
for folder in tqdm(foldernames):
foldername = folder.split('/')[-2]
# logger.info('Loading data from folder: {}'.format(foldername))
broken_folders = ['../data/mount_data/2020-05-12-10_00/']
if folder in broken_folders:
continue
bh = Behavior(folder)
category, timestamps = correct_chasing_events(bh.behavior, bh.start_s)
category = category[timestamps < 3*60*60] # only night time
timestamps = timestamps[timestamps < 3*60*60] # only night time
winner, loser = get_chirp_winner_loser(folder, bh, meta)
if winner is None:
continue
onsets = (timestamps[category == 0])
offsets = (timestamps[category == 1])
physicals = (timestamps[category == 2])
onset_count += len(onsets)
offset_count += len(offsets)
physical_count += len(physicals)
winner_onsets.append(center_chirps(
winner, onsets, time_before, time_after))
winner_offsets.append(center_chirps(
winner, offsets, time_before, time_after))
winner_physicals.append(center_chirps(
winner, physicals, time_before, time_after))
loser_onsets.append(center_chirps(
loser, onsets, time_before, time_after))
loser_offsets.append(center_chirps(
loser, offsets, time_before, time_after))
loser_physicals.append(center_chirps(
loser, physicals, time_before, time_after))
# bootstrap
# chirps = [winner, winner, winner, loser, loser, loser]
winner_onsets_boot.append(bootstrap(
winner,
nresamples=nbootstraps,
kde_time=kde_time,
kernel_width=kernel_width,
event_times=onsets,
time_before=time_before,
time_after=time_after))
winner_offsets_boot.append(bootstrap(
winner,
nresamples=nbootstraps,
kde_time=kde_time,
kernel_width=kernel_width,
event_times=offsets,
time_before=time_before,
time_after=time_after))
winner_physicals_boot.append(bootstrap(
winner,
nresamples=nbootstraps,
kde_time=kde_time,
kernel_width=kernel_width,
event_times=physicals,
time_before=time_before,
time_after=time_after))
loser_onsets_boot.append(bootstrap(
loser,
nresamples=nbootstraps,
kde_time=kde_time,
kernel_width=kernel_width,
event_times=onsets,
time_before=time_before,
time_after=time_after))
loser_offsets_boot.append(bootstrap(
loser,
nresamples=nbootstraps,
kde_time=kde_time,
kernel_width=kernel_width,
event_times=offsets,
time_before=time_before,
time_after=time_after))
loser_physicals_boot.append(bootstrap(
loser,
nresamples=nbootstraps,
kde_time=kde_time,
kernel_width=kernel_width,
event_times=physicals,
time_before=time_before,
time_after=time_after))
if plot_all:
winner_onsets_conv = acausal_kde1d(
winner_onsets[-1], kde_time, kernel_width)
winner_offsets_conv = acausal_kde1d(
winner_offsets[-1], kde_time, kernel_width)
winner_physicals_conv = acausal_kde1d(
winner_physicals[-1], kde_time, kernel_width)
loser_onsets_conv = acausal_kde1d(
loser_onsets[-1], kde_time, kernel_width)
loser_offsets_conv = acausal_kde1d(
loser_offsets[-1], kde_time, kernel_width)
loser_physicals_conv = acausal_kde1d(
loser_physicals[-1], kde_time, kernel_width)
fig, ax = plt.subplots(2, 3, figsize=(
21*ps.cm, 10*ps.cm), sharey=True, sharex=True)
ax[0, 0].set_title(
f"{foldername}, onsets {len(onsets)}, offsets {len(offsets)}, physicals {len(physicals)},winner {len(winner)}, looser {len(loser)} , onsets")
ax[0, 0].plot(kde_time, winner_onsets_conv/len(onsets))
ax[0, 1].plot(kde_time, winner_offsets_conv/len(offsets))
ax[0, 2].plot(kde_time, winner_physicals_conv/len(physicals))
ax[1, 0].plot(kde_time, loser_onsets_conv/len(onsets))
ax[1, 1].plot(kde_time, loser_offsets_conv/len(offsets))
ax[1, 2].plot(kde_time, loser_physicals_conv/len(physicals))
# # plot bootstrap lines
for kde in winner_onsets_boot[-1]:
ax[0, 0].plot(kde_time, kde/len(onsets),
color='gray')
for kde in winner_offsets_boot[-1]:
ax[0, 1].plot(kde_time, kde/len(offsets),
color='gray')
for kde in winner_physicals_boot[-1]:
ax[0, 2].plot(kde_time, kde/len(physicals),
color='gray')
for kde in loser_onsets_boot[-1]:
ax[1, 0].plot(kde_time, kde/len(onsets),
color='gray')
for kde in loser_offsets_boot[-1]:
ax[1, 1].plot(kde_time, kde/len(offsets),
color='gray')
for kde in loser_physicals_boot[-1]:
ax[1, 2].plot(kde_time, kde/len(physicals),
color='gray')
# plot bootstrap percentiles
# ax[0, 0].fill_between(
# kde_time,
# np.percentile(winner_onsets_boot[-1], 5, axis=0),
# np.percentile(winner_onsets_boot[-1], 95, axis=0),
# color='gray',
# alpha=0.5)
# ax[0, 1].fill_between(
# kde_time,
# np.percentile(winner_offsets_boot[-1], 5, axis=0),
# np.percentile(
# winner_offsets_boot[-1], 95, axis=0),
# color='gray',
# alpha=0.5)
# ax[0, 2].fill_between(
# kde_time,
# np.percentile(
# winner_physicals_boot[-1], 5, axis=0),
# np.percentile(
# winner_physicals_boot[-1], 95, axis=0),
# color='gray',
# alpha=0.5)
# ax[1, 0].fill_between(
# kde_time,
# np.percentile(loser_onsets_boot[-1], 5, axis=0),
# np.percentile(loser_onsets_boot[-1], 95, axis=0),
# color='gray',
# alpha=0.5)
# ax[1, 1].fill_between(
# kde_time,
# np.percentile(loser_offsets_boot[-1], 5, axis=0),
# np.percentile(loser_offsets_boot[-1], 95, axis=0),
# color='gray',
# alpha=0.5)
# ax[1, 2].fill_between(
# kde_time,
# np.percentile(
# loser_physicals_boot[-1], 5, axis=0),
# np.percentile(
# loser_physicals_boot[-1], 95, axis=0),
# color='gray',
# alpha=0.5)
# ax[0, 0].plot(kde_time, np.median(winner_onsets_boot[-1], axis=0),
# color='black', linewidth=2)
# ax[0, 1].plot(kde_time, np.median(winner_offsets_boot[-1], axis=0),
# color='black', linewidth=2)
# ax[0, 2].plot(kde_time, np.median(winner_physicals_boot[-1], axis=0),
# color='black', linewidth=2)
# ax[1, 0].plot(kde_time, np.median(loser_onsets_boot[-1], axis=0),
# color='black', linewidth=2)
# ax[1, 1].plot(kde_time, np.median(loser_offsets_boot[-1], axis=0),
# color='black', linewidth=2)
# ax[1, 2].plot(kde_time, np.median(loser_physicals_boot[-1], axis=0),
# color='black', linewidth=2)
ax[0, 0].set_xlim(-30, 30)
plt.show()
winner_onsets = np.sort(flatten(winner_onsets))
winner_offsets = np.sort(flatten(winner_offsets))
winner_physicals = np.sort(flatten(winner_physicals))
loser_onsets = np.sort(flatten(loser_onsets))
loser_offsets = np.sort(flatten(loser_offsets))
loser_physicals = np.sort(flatten(loser_physicals))
winner_onsets_conv = acausal_kde1d(
winner_onsets, kde_time, kernel_width)
winner_offsets_conv = acausal_kde1d(
winner_offsets, kde_time, kernel_width)
winner_physicals_conv = acausal_kde1d(
winner_physicals, kde_time, kernel_width)
loser_onsets_conv = acausal_kde1d(
loser_onsets, kde_time, kernel_width)
loser_offsets_conv = acausal_kde1d(
loser_offsets, kde_time, kernel_width)
loser_physicals_conv = acausal_kde1d(
loser_physicals, kde_time, kernel_width)
winner_onsets_conv = winner_onsets_conv / onset_count
winner_offsets_conv = winner_offsets_conv / offset_count
winner_physicals_conv = winner_physicals_conv / physical_count
loser_onsets_conv = loser_onsets_conv / onset_count
loser_offsets_conv = loser_offsets_conv / offset_count
loser_physicals_conv = loser_physicals_conv / physical_count
winner_onsets_boot = np.concatenate(
winner_onsets_boot)
winner_offsets_boot = np.concatenate(
winner_offsets_boot)
winner_physicals_boot = np.concatenate(
winner_physicals_boot)
loser_onsets_boot = np.concatenate(
loser_onsets_boot)
loser_offsets_boot = np.concatenate(
loser_offsets_boot)
loser_physicals_boot = np.concatenate(
loser_physicals_boot)
percs = [5, 50, 95]
winner_onsets_boot_quarts = np.percentile(
winner_onsets_boot, percs, axis=0)
winner_offsets_boot_quarts = np.percentile(
winner_offsets_boot, percs, axis=0)
winner_physicals_boot_quarts = np.percentile(
winner_physicals_boot, percs, axis=0)
loser_onsets_boot_quarts = np.percentile(
loser_onsets_boot, percs, axis=0)
loser_offsets_boot_quarts = np.percentile(
loser_offsets_boot, percs, axis=0)
loser_physicals_boot_quarts = np.percentile(
loser_physicals_boot, percs, axis=0)
fig, ax = plt.subplots(2, 3, figsize=(
21*ps.cm, 10*ps.cm), sharey=True, sharex=True)
ax[0, 0].plot(kde_time, winner_onsets_conv)
ax[0, 1].plot(kde_time, winner_offsets_conv)
ax[0, 2].plot(kde_time, winner_physicals_conv)
ax[1, 0].plot(kde_time, loser_onsets_conv)
ax[1, 1].plot(kde_time, loser_offsets_conv)
ax[1, 2].plot(kde_time, loser_physicals_conv)
ax[0, 0].plot(kde_time, winner_onsets_boot_quarts[1], c=ps.black)
ax[0, 1].plot(kde_time, winner_offsets_boot_quarts[1], c=ps.black)
ax[0, 2].plot(kde_time, winner_physicals_boot_quarts[1], c=ps.black)
ax[1, 0].plot(kde_time, loser_onsets_boot_quarts[1], c=ps.black)
ax[1, 1].plot(kde_time, loser_offsets_boot_quarts[1], c=ps.black)
ax[1, 2].plot(kde_time, loser_physicals_boot_quarts[1], c=ps.black)
# for kde in winner_onsets_boot:
# ax[0, 0].plot(kde_time, kde,
# color='gray')
# for kde in winner_offsets_boot:
# ax[0, 1].plot(kde_time, kde,
# color='gray')
# for kde in winner_physicals_boot:
# ax[0, 2].plot(kde_time, kde,
# color='gray')
# for kde in loser_onsets_boot:
# ax[1, 0].plot(kde_time, kde,
# color='gray')
# for kde in loser_offsets_boot:
# ax[1, 1].plot(kde_time, kde,
# color='gray')
# for kde in loser_physicals_boot:
# ax[1, 2].plot(kde_time, kde,
# color='gray')
ax[0, 0].fill_between(kde_time,
winner_onsets_boot_quarts[0],
winner_onsets_boot_quarts[2],
color=ps.gray,
alpha=0.5)
ax[0, 1].fill_between(kde_time,
winner_offsets_boot_quarts[0],
winner_offsets_boot_quarts[2],
color=ps.gray,
alpha=0.5)
ax[0, 2].fill_between(kde_time,
winner_physicals_boot_quarts[0],
winner_physicals_boot_quarts[2],
color=ps.gray,
alpha=0.5)
ax[1, 0].fill_between(kde_time,
loser_onsets_boot_quarts[0],
loser_onsets_boot_quarts[2],
color=ps.gray,
alpha=0.5)
ax[1, 1].fill_between(kde_time,
loser_offsets_boot_quarts[0],
loser_offsets_boot_quarts[2],
color=ps.gray,
alpha=0.5)
ax[1, 2].fill_between(kde_time,
loser_physicals_boot_quarts[0],
loser_physicals_boot_quarts[2],
color=ps.gray,
alpha=0.5)
plt.show()
if __name__ == '__main__':
main('../data/mount_data/')
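The calls above rely on an `acausal_kde1d` helper defined elsewhere in this repository. A minimal sketch of what such a helper might look like, assuming a symmetric Gaussian kernel evaluated on a regular time grid (the function name and kernel choice here are assumptions, not the repository's actual implementation):

```python
import numpy as np


def acausal_kde1d_sketch(events, time, width):
    """Hypothetical stand-in for the repository's acausal_kde1d helper.

    Smooths a sorted array of event times with a symmetric (acausal)
    Gaussian kernel of standard deviation `width`, evaluated on the
    regular grid `time`.
    """
    rate = np.zeros_like(time, dtype=float)
    for t in events:
        # add a Gaussian bump centered on each event
        rate += np.exp(-0.5 * ((time - t) / width) ** 2)
    # normalize the kernel so the curve integrates to the number of events
    return rate / (np.sqrt(2.0 * np.pi) * width)
```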

BIN
poster/figs/algorithm.pdf Normal file

Binary file not shown.

BIN
poster/figs/algorithm1.pdf Normal file

Binary file not shown.

Binary file not shown.

Binary file not shown.

BIN
poster/figs/efishlogo.pdf Normal file

Binary file not shown.

BIN
poster/figs/introplot.pdf Normal file

Binary file not shown.

529
poster/figs/logo_all.pdf Normal file

File diff suppressed because one or more lines are too long

BIN
poster/figs/timeline.pdf Normal file

Binary file not shown.

Binary file not shown.

View File

@ -1,4 +1,4 @@
\documentclass[25pt, a0paper, landscape, margin=0mm, innermargin=20mm,
\documentclass[25pt, a0paper, portrait, margin=0mm, innermargin=20mm,
blockverticalspace=2mm, colspace=20mm, subcolspace=0mm]{tikzposter} %Default values for poster format options.
\input{packages}
@ -7,77 +7,98 @@ blockverticalspace=2mm, colspace=20mm, subcolspace=0mm]{tikzposter} %Default val
\begin{document}
\renewcommand{\baselinestretch}{1}
\title{\parbox{1900pt}{A dark template to make colorful figures pop}}
\author{Sina Prause, Alexander Wendt, Patrick Weygoldt}
\institute{Supervised by Till Raab \& Jan Benda}
\title{\parbox{1500pt}{Bypassing time-frequency uncertainty in the detection of transient communication signals in weakly electric fish}}
\author{Sina Prause, Alexander Wendt, and Patrick Weygoldt}
\institute{Supervised by Till Raab \& Jan Benda, Neuroethology Lab, University of Tuebingen}
\usetitlestyle[]{sampletitle}
\maketitle
\renewcommand{\baselinestretch}{1.4}
\begin{columns}
\column{0.3}
\myblock[TranspBlock]{Introduction}{
\lipsum[1][1-5]
\column{0.4}
\myblock[GrayBlock]{Introduction}{
The time-frequency tradeoff makes reliable signal detection and simultaneous
sender identification by simple Fourier decomposition impossible in freely
interacting weakly electric fish. This profoundly limits our current
understanding of chirps to experiments with single - or physically
separated - individuals.
% \begin{tikzfigure}[]
% \label{griddrawing}
% \includegraphics[width=0.8\linewidth]{figs/introplot}
% \end{tikzfigure}
}
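% For reference, the tradeoff invoked above is quantitative: for any
% analysis window the temporal and spectral spreads obey the Gabor limit
% \( \sigma_t \, \sigma_f \ge \tfrac{1}{4\pi} \)
% (equality only for Gaussian windows), so a window short enough to resolve
% a transient chirp cannot at the same time resolve the small EOD$f$
% differences that identify the sender.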
\myblock[TranspBlock]{Chirp detection}{
\begin{tikzfigure}[]
\label{griddrawing}
\includegraphics[width=\linewidth]{example-image-a}
\label{fig:alg1}
\includegraphics[width=0.9\linewidth]{figs/algorithm1}
\end{tikzfigure}
}
\myblock[TranspBlock]{Methods}{
\vspace{2cm}
\begin{tikzfigure}[]
\label{detector}
\includegraphics[width=\linewidth]{example-image-b}
\label{fig:alg2}
\includegraphics[width=1\linewidth]{figs/algorithm}
\end{tikzfigure}
\vspace{0cm}
}
\column{0.4}
\myblock[TranspBlock]{Results}{
\lipsum[3][1-5]
\column{0.6}
\myblock[TranspBlock]{Chirps during competition}{
\begin{tikzfigure}[]
\label{modulations}
\includegraphics[width=\linewidth]{example-image-c}
\label{fig:example_b}
\includegraphics[width=\linewidth]{figs/timeline.pdf}
\end{tikzfigure}
}
\noindent
\begin{itemize}
\setlength\itemsep{0.5em}
\item Two fish competed for one hiding place in one tank.
\item The experiment had a 3-hour dark phase and a 3-hour light phase.
\end{itemize}
\myblock[TranspBlock]{More Stuff}{
\lipsum[3][1-9]
\noindent
\begin{minipage}[c]{0.7\linewidth}
\begin{tikzfigure}[]
\label{fig:example_b}
\includegraphics[width=\linewidth]{figs/chirps_winner_loser.pdf}
\end{tikzfigure}
\end{minipage} % no space if you would like to put them side by side
\begin{minipage}[c]{0.2\linewidth}
\begin{itemize}
\setlength\itemsep{0.5em}
\item The fish that won the competition chirped more often than the fish that lost.
\item
\end{itemize}
\end{minipage}
}
\column{0.3}
\myblock[TranspBlock]{More Results}{
\myblock[TranspBlock]{Interactions at modulations}{
\vspace{-1.2cm}
\begin{tikzfigure}[]
\label{results}
\includegraphics[width=\linewidth]{example-image-a}
\label{fig:example_c}
\includegraphics[width=0.5\linewidth]{example-image-c}
\end{tikzfigure}
\begin{multicols}{2}
\lipsum[5][1-8]
\end{multicols}
\vspace{-1cm}
}
\myblock[TranspBlock]{Conclusion}{
\myblock[GrayBlock]{Conclusion}{
\begin{itemize}
\setlength\itemsep{0.5em}
\item \lipsum[1][1]
\item \lipsum[1][1]
\item \lipsum[1][1]
\item Our analysis is the first to indicate that \textit{A. leptorhynchus} uses long, diffuse and synchronized EOD$f$ signals to communicate in addition to chirps and rises.
\item The recorded fish do not exhibit jamming avoidance behavior while in close proximity during synchronous modulations.
\item Synchronous signals \textbf{initiate} spatio-temporal interactions.
\end{itemize}
\vspace{0.2cm}
}
\end{columns}
\end{columns}
\node[
above right,
\node [above right,
text=white,
outer sep=45pt,
minimum width=\paperwidth,
align=center,
draw,
fill=boxes,
color=boxes,
] at (-0.51\paperwidth,-43.5) {
\textcolor{text}{\normalsize Contact: name.surname@student.uni-tuebingen.de}};
color=boxes] at (-43.6,-61) {
\textcolor{white}{
\normalsize Contact: \{name\}.\{surname\}@student.uni-tuebingen.de}};
\end{document}

View File

@ -8,4 +8,3 @@
\setlength{\columnsep}{1.5cm}
\usepackage{xspace}
\usepackage{tikz}
\usepackage{lipsum}

View File

@ -16,10 +16,11 @@
\colorlet{notefgcolor}{background}
\colorlet{notebgcolor}{background}
% Title setup
\settitle{
% Rearrange the order of the minipages to e.g. center the title between the logos
\begin{minipage}[c]{0.6\paperwidth}
\begin{minipage}[c]{0.8\paperwidth}
% \centering
\vspace{2.5cm}\hspace{1.5cm}
\color{text}{\Huge{\textbf{\@title}} \par}
@ -30,26 +31,28 @@
\vspace{2.5cm}
\end{minipage}
\begin{minipage}[c]{0.2\paperwidth}
% \centering
\vspace{1cm}\hspace{1cm}
\includegraphics[scale=1]{example-image-a}
\end{minipage}
\begin{minipage}[c]{0.2\paperwidth}
% \vspace{1cm}\hspace{1cm}
\centering
\includegraphics[scale=1]{example-image-a}
% \vspace{1cm}
\hspace{-10cm}
\includegraphics[width=0.8\linewidth]{figs/efishlogo.pdf}
\end{minipage}}
% \begin{minipage}[c]{0.2\paperwidth}
% \vspace{1cm}\hspace{1cm}
% \centering
% \includegraphics[width=\linewidth]{example-image-a}
% \end{minipage}}
% definie title style with background box
% define title style with background box (currently white)
\definetitlestyle{sampletitle}{
width=1189mm,
width=841mm,
roundedcorners=0,
linewidth=0pt,
innersep=15pt,
titletotopverticalspace=0mm,
titletoblockverticalspace=5pt
}{
\begin{scope}[line width=\titlelinewidth, rounded corners=\titleroundedcorners]
\begin{scope}[line width=\titlelinewidth,
rounded corners=\titleroundedcorners]
\draw[fill=text, color=boxes]
(\titleposleft,\titleposbottom)
rectangle

Binary file not shown.


Binary file not shown.

Binary file not shown.

BIN
poster_old/figs/logo.png Normal file

Binary file not shown.


1184
poster_old/figs/logo.svg Normal file

File diff suppressed because it is too large


Binary file not shown.


BIN
poster_old/main.pdf Normal file

Binary file not shown.

119
poster_old/main.tex Normal file
View File

@ -0,0 +1,119 @@
\documentclass[25pt, a0paper, landscape, margin=0mm, innermargin=20mm,
blockverticalspace=2mm, colspace=20mm, subcolspace=0mm]{tikzposter} %Default values for poster format options.
\input{packages}
\input{style}
\begin{document}
\renewcommand{\baselinestretch}{1}
\title{\parbox{1900pt}{Pushing the limits of time-frequency uncertainty in the
detection of transient communication signals in weakly electric fish}}
\author{Sina Prause, Alexander Wendt, Patrick Weygoldt}
\institute{Supervised by Till Raab \& Jan Benda, Neuroethology Group,
University of Tübingen}
\usetitlestyle[]{sampletitle}
\maketitle
\renewcommand{\baselinestretch}{1.4}
\begin{columns}
\column{0.5}
\myblock[TranspBlock]{Introduction}{
\begin{minipage}[t]{0.55\linewidth}
The time-frequency tradeoff makes reliable signal detection and simultaneous
sender identification of freely interacting individuals impossible.
This profoundly limits our current understanding of chirps to experiments
with single - or physically separated - individuals.
\end{minipage} \hfill
\begin{minipage}[t]{0.40\linewidth}
\vspace{-1.5cm}
\begin{tikzfigure}[]
\label{tradeoff}
\includegraphics[width=\linewidth]{figs/introplot}
\end{tikzfigure}
\end{minipage}
}
\myblock[TranspBlock]{A chirp detection algorithm}{
\begin{tikzfigure}[]
\label{modulations}
\includegraphics[width=\linewidth]{figs/algorithm}
\end{tikzfigure}
}
\column{0.5}
\myblock[TranspBlock]{Chirps and dyadic competitions}{
\begin{minipage}[t]{0.7\linewidth}
\begin{tikzfigure}[]
\label{modulations}
\includegraphics[width=\linewidth]{figs/placeholder1}
\end{tikzfigure}
\end{minipage} \hfill
\begin{minipage}[t]{0.25\linewidth}
\lipsum[3][1-3]
\end{minipage}
\begin{minipage}[t]{0.7\linewidth}
\begin{tikzfigure}[]
\label{modulations}
\includegraphics[width=\linewidth]{figs/placeholder1}
\end{tikzfigure}
\end{minipage} \hfill
\begin{minipage}[t]{0.25\linewidth}
\lipsum[3][1-3]
\end{minipage}
\begin{minipage}[t]{0.7\linewidth}
\begin{tikzfigure}[]
\label{modulations}
\includegraphics[width=\linewidth]{figs/placeholder1}
\end{tikzfigure}
\end{minipage} \hfill
\begin{minipage}[t]{0.25\linewidth}
\lipsum[3][1-3]
\end{minipage}
}
\myblock[TranspBlock]{Conclusion}{
\lipsum[3][1-9]
}
% \column{0.3}
% \myblock[TranspBlock]{More Results}{
% \begin{tikzfigure}[]
% \label{results}
% \includegraphics[width=\linewidth]{example-image-a}
% \end{tikzfigure}
% \begin{multicols}{2}
% \lipsum[5][1-8]
% \end{multicols}
% \vspace{-1cm}
% }
% \myblock[TranspBlock]{Conclusion}{
% \begin{itemize}
% \setlength\itemsep{0.5em}
% \item \lipsum[1][1]
% \item \lipsum[1][1]
% \item \lipsum[1][1]
% \end{itemize}
% \vspace{0.2cm}
% }
\end{columns}
\node[
above right,
text=white,
outer sep=45pt,
minimum width=\paperwidth,
align=center,
draw,
fill=boxes,
color=boxes,
] at (-0.51\paperwidth,-43.5) {
\textcolor{text}{\normalsize Contact: \{name\}.\{surname\}@student.uni-tuebingen.de}};
\end{document}

11
poster_old/packages.tex Normal file
View File

@ -0,0 +1,11 @@
\usepackage[utf8]{inputenc}
\usepackage[scaled]{helvet}
\renewcommand\familydefault{\sfdefault}
\usepackage[T1]{fontenc}
\usepackage{wrapfig}
\usepackage{setspace}
\usepackage{multicol}
\setlength{\columnsep}{1.5cm}
\usepackage{xspace}
\usepackage{tikz}
\usepackage{lipsum}

119
poster_old/style.tex Normal file
View File

@ -0,0 +1,119 @@
\tikzposterlatexaffectionproofoff
\usetheme{Default}
\definecolor{text}{HTML}{e0e4f7}
\definecolor{background}{HTML}{111116}
\definecolor{boxes}{HTML}{2a2a32}
\definecolor{unired}{HTML}{a51e37}
\colorlet{blocktitlefgcolor}{text}
\colorlet{backgroundcolor}{background}
\colorlet{blocktitlebgcolor}{background}
\colorlet{blockbodyfgcolor}{text}
\colorlet{innerblocktitlebgcolor}{background}
\colorlet{innerblocktitlefgcolor}{text}
\colorlet{notefrcolor}{text}
\colorlet{notefgcolor}{background}
\colorlet{notebgcolor}{background}
% Title setup
\settitle{
% Rearrange the order of the minipages to e.g. center the title between the logos
\begin{minipage}[c]{0.6\paperwidth}
% \centering
\vspace{2.5cm}\hspace{1.5cm}
\color{text}{\Huge{\textbf{\@title}} \par}
\vspace*{2em}\hspace{1.5cm}
\color{text}{\LARGE \@author \par}
\vspace*{2em}\hspace{1.5cm}
\color{text}{\Large \@institute}
\vspace{2.5cm}
\end{minipage}
\begin{minipage}[c]{0.2\paperwidth}
% \centering
\vspace{1cm}\hspace{1cm}
\includegraphics[scale=1]{example-image-a}
\end{minipage}
\begin{minipage}[c]{0.2\paperwidth}
% \vspace{1cm}\hspace{1cm}
\centering
\includegraphics[scale=1]{example-image-a}
\end{minipage}}
% define title style with background box
\definetitlestyle{sampletitle}{
width=1189mm,
roundedcorners=0,
linewidth=0pt,
innersep=15pt,
titletotopverticalspace=0mm,
titletoblockverticalspace=5pt
}{
\begin{scope}[line width=\titlelinewidth, rounded corners=\titleroundedcorners]
\draw[fill=text, color=boxes]
(\titleposleft,\titleposbottom)
rectangle
(\titleposright,\titlepostop);
\end{scope}
}
% define custom block style for visible blocks
\defineblockstyle{GrayBlock}{
titlewidthscale=1,
bodywidthscale=1,
% titlecenter,
titleleft,
titleoffsetx=0pt,
titleoffsety=-30pt,
bodyoffsetx=0pt,
bodyoffsety=-40pt,
bodyverticalshift=0mm,
roundedcorners=25,
linewidth=1pt,
titleinnersep=20pt,
bodyinnersep=38pt
}{
\draw[rounded corners=\blockroundedcorners, inner sep=\blockbodyinnersep,
line width=\blocklinewidth, color=background,
top color=boxes, bottom color=boxes,
]
(blockbody.south west) rectangle (blockbody.north east); %
\ifBlockHasTitle%
\draw[rounded corners=\blockroundedcorners, inner sep=\blocktitleinnersep,
top color=background, bottom color=background,
line width=2, color=background, %fill=blocktitlebgcolor
]
(blocktitle.south west) rectangle (blocktitle.north east); %
\fi%
}
\newcommand\myblock[3][GrayBlock]{\useblockstyle{#1}\block{#2}{#3}\useblockstyle{Default}}
% Define block style for transparent blocks
\defineblockstyle{TranspBlock}{
titlewidthscale=0.99,
bodywidthscale=0.99,
titleleft,
titleoffsetx=15pt,
titleoffsety=-40pt,
bodyoffsetx=0pt,
bodyoffsety=-40pt,
bodyverticalshift=0mm,
roundedcorners=25,
linewidth=1pt,
titleinnersep=20pt,
bodyinnersep=38pt
}{
\draw[rounded corners=\blockroundedcorners, inner sep=\blockbodyinnersep,
line width=\blocklinewidth, color=background,
top color=background, bottom color=background,
]
(blockbody.south west) rectangle (blockbody.north east); %
\ifBlockHasTitle%
\draw[rounded corners=\blockroundedcorners, inner sep=\blocktitleinnersep,
top color=background, bottom color=background,
line width=2, color=background, %fill=blocktitlebgcolor
]
(blocktitle.south west) rectangle (blocktitle.north east); %
\fi%
}
\renewcommand\myblock[3][TranspBlock]{\useblockstyle{#1}\block{#2}{#3}\useblockstyle{Default}}

29
recs.csv Normal file
View File

@ -0,0 +1,29 @@
recording
2020-03-13-10_00
2020-03-16-10_00
2020-03-19-10_00
2020-03-20-10_00
2020-03-23-09_58
2020-03-24-10_00
2020-03-25-10_00
2020-03-31-09_59
2020-05-11-10_00
2020-05-12-10_00
2020-05-13-10_00
2020-05-14-10_00
2020-05-15-10_00
2020-05-18-10_00
2020-05-19-10_00
2020-05-21-10_00
2020-05-25-10_00
2020-05-27-10_00
2020-05-28-10_00
2020-05-29-10_00
2020-06-02-10_00
2020-06-03-10_10
2020-06-04-10_00
2020-06-05-10_00
2020-06-08-10_00
2020-06-09-10_00
2020-06-10-10_00
2020-06-11-10_00