
TimeX++: Learning Time-Series Explanations with Information Bottleneck

Authors :
Liu, Zichuan
Wang, Tianchun
Shi, Jimeng
Zheng, Xu
Chen, Zhuomin
Song, Lei
Dong, Wenqian
Obeysekera, Jayantha
Shirani, Farhad
Luo, Dongsheng
Publication Year :
2024

Abstract

Explaining deep learning models operating on time series data is crucial in various applications of interest which require interpretable and transparent insights from time series signals. In this work, we investigate this problem from an information theoretic perspective and show that most existing measures of explainability may suffer from trivial solutions and distributional shift issues. To address these issues, we introduce a simple yet practical objective function for time series explainable learning. The design of the objective function builds upon the principle of information bottleneck (IB), and modifies the IB objective function to avoid trivial solutions and distributional shift issues. We further present TimeX++, a novel explanation framework that leverages a parametric network to produce explanation-embedded instances that are both in-distributed and label-preserving. We evaluate TimeX++ on both synthetic and real-world datasets comparing its performance against leading baselines, and validate its practical efficacy through case studies in a real-world environmental application. Quantitative and qualitative evaluations show that TimeX++ outperforms baselines across all datasets, demonstrating a substantial improvement in explanation quality for time series data. The source code is available at \url{https://github.com/zichuan-liu/TimeXplusplus}.

Comment: Accepted by International Conference on Machine Learning (ICML 2024)
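For background, the information bottleneck principle the abstract refers to is the standard formulation due to Tishby et al.; the paper's modified objective is not reproduced in this record. In the classic form, one seeks a representation \(Z\) of the input \(X\) that is maximally compressed while remaining predictive of the label \(Y\), with \(\beta\) a trade-off coefficient:

```latex
\min_{p(z \mid x)} \; I(X; Z) \;-\; \beta \, I(Z; Y)
```

Here \(I(\cdot;\cdot)\) denotes mutual information; the first term penalizes how much of \(X\) the representation retains, and the second rewards how much it reveals about \(Y\). The abstract states that directly applying such measures to explanation can admit trivial solutions and distributional shift, which motivates the modified objective in the paper itself.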

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2405.09308
Document Type :
Working Paper