
HDMF: Hierarchical Data Modeling Framework for Modern Science Data Standards

Authors :
Kristofer E. Bouchard
Oliver Rübel
Ryan Ly
Loren M. Frank
Benjamin Dichter
Edward F. Chang
Andrew Tritt
Donghe Kang
Source :
Proceedings of the 2019 IEEE International Conference on Big Data (Big Data)
Publication Year :
2019
Publisher :
IEEE

Abstract

A ubiquitous problem in aggregating data across different experimental and observational data sources is a lack of software infrastructure that enables flexible and extensible standardization of data and metadata. To address this challenge, we developed HDMF, a hierarchical data modeling framework for modern science data standards. With HDMF, we separate the process of data standardization into three main components: (1) data modeling and specification, (2) data I/O and storage, and (3) data interaction and data APIs. To enable standards to support the complex requirements and varying use cases throughout the data life cycle, HDMF provides object mapping infrastructure to insulate and integrate these various components. This approach supports the flexible development of data standards and extensions, optimized storage backends, and data APIs, while allowing the other components of the data standards ecosystem to remain stable. To meet the demands of modern, large-scale science data, HDMF provides advanced data I/O functionality for iterative data write, lazy data load, and parallel I/O. It also enables optimization of data storage via chunking, compression, linking, and modular data storage. We demonstrate the application of HDMF in practice to design NWB 2.0 [13], a modern data standard for collaborative science across the neurophysiology community.
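As a concrete illustration of how this layering surfaces in practice, the sketch below uses the HDMF Python package to build a DynamicTable (the data API layer), wrap one column in H5DataIO to request chunked, compressed storage (the storage layer), and round-trip it through HDF5IO, where object mapping translates between the in-memory containers and the HDF5 file. This is a minimal sketch rather than code from the paper; the table and column names ('trials', 'response_time') are invented for illustration, and the calls follow the hdmf package's public Python API.

    import numpy as np

    from hdmf.backends.hdf5 import H5DataIO, HDF5IO
    from hdmf.common import DynamicTable, VectorData, get_manager

    # Data modeling / data API layer: a standardized table container.
    # 'trials' and 'response_time' are hypothetical names for illustration.
    table = DynamicTable(
        name='trials',
        description='example trial table',
        columns=[
            VectorData(
                name='response_time',
                description='response time in seconds',
                # Storage layer: H5DataIO wraps the array to request
                # chunked, gzip-compressed storage in the HDF5 backend.
                data=H5DataIO(np.arange(1000.0), chunks=True, compression='gzip'),
            ),
        ],
    )

    # I/O layer: object mapping translates the in-memory containers
    # into the standardized HDF5 layout on write.
    with HDF5IO('trials.h5', manager=get_manager(), mode='w') as io:
        io.write(table)

    # Lazy data load: columns come back as h5py datasets that are read
    # from disk only when sliced.
    with HDF5IO('trials.h5', manager=get_manager(), mode='r') as io:
        table_in = io.read()
        print(table_in['response_time'][:5])

For data too large to hold in memory, the in-memory array can likewise be replaced with hdmf's DataChunkIterator to stream values during write, which corresponds to the iterative-write functionality the abstract describes.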

Details

Database :
OpenAIRE
Journal :
2019 IEEE International Conference on Big Data (Big Data)
Accession number :
edsair.doi.dedup.....518e285f0327e4d24b8e316625dfced6