
EQ-CBM: A Probabilistic Concept Bottleneck with Energy-based Models and Quantized Vectors

Authors:
Kim, Sangwon
Ahn, Dasom
Ko, Byoung Chul
Jang, In-su
Kim, Kwang-Ju
Publication Year:
2024

Abstract

The demand for reliable AI systems has intensified the need for interpretable deep neural networks. Concept bottleneck models (CBMs) have gained attention as an effective approach by leveraging human-understandable concepts to enhance interpretability. However, existing CBMs face challenges due to deterministic concept encoding and reliance on inconsistent concepts, leading to inaccuracies. We propose EQ-CBM, a novel framework that enhances CBMs through probabilistic concept encoding using energy-based models (EBMs) with quantized concept activation vectors (qCAVs). EQ-CBM effectively captures uncertainties, thereby improving prediction reliability and accuracy. By employing qCAVs, our method selects homogeneous vectors during concept encoding, enabling more decisive task performance and facilitating higher levels of human intervention. Empirical results using benchmark datasets demonstrate that our approach outperforms the state-of-the-art in both concept and task accuracy.

Comment: Accepted by ACCV 2024
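To make the quantized concept activation idea concrete, the sketch below (not the authors' implementation; the shapes, codebook sizes, the nearest-neighbour lookup, and the single linear task head are all illustrative assumptions) snaps each continuous concept embedding to the closest vector in a per-concept codebook before the bottleneck feeds a task predictor:

```python
# Minimal sketch of a concept bottleneck with quantized concept activation
# vectors (qCAV-style lookup). All dimensions and the linear task head are
# hypothetical choices for illustration, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

num_concepts = 4     # hypothetical number of human-interpretable concepts
codebook_size = 8    # hypothetical number of quantized vectors per concept
dim = 16             # hypothetical concept embedding dimension

# One codebook of candidate activation vectors per concept.
codebooks = rng.normal(size=(num_concepts, codebook_size, dim))

def quantize_concepts(concept_embeddings):
    """Snap each concept embedding to its nearest codebook vector.

    concept_embeddings: (num_concepts, dim) continuous outputs of a
    concept encoder (random stand-ins here).
    Returns the quantized vectors and the chosen codebook indices.
    """
    quantized = np.empty_like(concept_embeddings)
    indices = np.empty(num_concepts, dtype=int)
    for c in range(num_concepts):
        dists = np.linalg.norm(codebooks[c] - concept_embeddings[c], axis=1)
        indices[c] = int(np.argmin(dists))
        quantized[c] = codebooks[c, indices[c]]
    return quantized, indices

# Stand-in for a concept encoder's output on a single input image.
z = rng.normal(size=(num_concepts, dim))
q, idx = quantize_concepts(z)

# Assumed task head: a linear layer over the concatenated quantized concepts,
# completing the bottleneck -> label path.
num_classes = 3
W = rng.normal(size=(num_concepts * dim, num_classes))
logits = q.reshape(-1) @ W
print("chosen codebook indices per concept:", idx)
print("task logits:", np.round(logits, 3))
```

Because the task head only ever sees codebook entries, a human can intervene by overriding a concept's chosen index, which is one plausible reading of the higher intervention levels the abstract mentions.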

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2409.14630
Document Type:
Working Paper