VoDEx: a Python library for time annotation and management of volumetric functional imaging data


Bibliographic Details
Published in: ArXiv.org
Main Authors: Nadtochiy, Anna; Luu, Peter; Fraser, Scott E; Truong, Thai V
Format: Journal Article
Language: English
Published: Cornell University, United States, 11.05.2023

Summary: In functional imaging studies, accurately synchronizing the time course of experimental manipulations and stimulus presentations with the resulting imaging data is crucial for analysis. Current software tools lack such functionality, requiring manual processing of the experimental and imaging data, which is error-prone and potentially non-reproducible. We present VoDEx, an open-source Python library that streamlines the data management and analysis of functional imaging data. VoDEx synchronizes the experimental timeline and events (e.g., presented stimuli, recorded behavior) with the imaging data. VoDEx provides tools for logging and storing the timeline annotation, and enables retrieval of imaging data based on specific time-based and manipulation-based experimental conditions. Availability and Implementation: VoDEx is an open-source Python library and can be installed via the "pip install" command. It is released under a BSD license, and its source code is publicly accessible on GitHub at https://github.com/LemonJust/vodex. A graphical interface is available as a napari-vodex plugin, which can be installed through the napari plugins menu or using "pip install." The source code for the napari plugin is available on GitHub at https://github.com/LemonJust/napari-vodex.
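The core idea described above, annotating a timeline of experimental events and then retrieving imaging frames by condition, can be sketched in plain Python. This is an illustrative sketch only, not VoDEx's actual API: the function names `annotate` and `select_frames` are hypothetical, introduced here to show the concept of condition-based frame retrieval.

```python
# Illustrative sketch (hypothetical names, not the VoDEx API):
# annotate a recording's frames with experimental conditions,
# then retrieve the frame indices matching a given condition.

def annotate(n_frames, events):
    """Build a per-frame label list from (label, start, end) event spans.

    `events` uses half-open frame ranges [start, end).
    Frames outside any span remain labeled None.
    """
    labels = [None] * n_frames
    for label, start, end in events:
        for i in range(start, min(end, n_frames)):
            labels[i] = label
    return labels

def select_frames(labels, condition):
    """Return indices of frames annotated with the given condition."""
    return [i for i, lab in enumerate(labels) if lab == condition]

# Example: a 10-frame recording with a stimulus epoch and a rest epoch.
timeline = annotate(10, [("stimulus", 2, 5), ("rest", 5, 8)])
print(select_frames(timeline, "stimulus"))  # [2, 3, 4]
print(select_frames(timeline, "rest"))     # [5, 6, 7]
```

In VoDEx itself this retrieval is driven by logged timeline annotations rather than hand-built label lists; consult the project's GitHub repository for the real interface.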
Bibliography:ObjectType-Article-2
SourceType-Scholarly Journals-1
ObjectType-Working Paper/Pre-Print-1
ISSN:2331-8422