
The Curious Case of Nonverbal Abstract Reasoning with Multi-Modal Large Language Models

Authors:
Ahrabian, Kian
Sourati, Zhivar
Sun, Kexuan
Zhang, Jiarui
Jiang, Yifan
Morstatter, Fred
Pujara, Jay
Publication Year: 2024

Abstract

While large language models (LLMs) are still being adopted in new domains and applied to novel tasks, we are experiencing an influx of a new generation of foundation models, namely multi-modal large language models (MLLMs). These models integrate verbal and visual information, opening new possibilities for more complex reasoning at the intersection of the two modalities. However, despite the transformative promise of MLLMs, our understanding of their reasoning abilities remains limited. In this study, we assess the nonverbal abstract reasoning abilities of open-source and closed-source MLLMs using variations of Raven's Progressive Matrices. Our experiments reveal how challenging such problems are for MLLMs and expose a wide gap between open-source and closed-source models. We also uncover critical shortcomings in visual and textual perception that impose low performance ceilings on the models. Finally, to improve MLLMs' performance, we experiment with different methods, such as Chain-of-Thought prompting, yielding significant (up to 100%) boosts in performance. Our code and datasets are available at https://github.com/usc-isi-i2/isi-mmlm-rpm.

Comment: 21 pages
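The abstract names Chain-of-Thought prompting as one of the methods evaluated; the sketch below illustrates the general shape of such a prompt for a Raven's-Progressive-Matrices-style item sent to a multimodal chat model. It is not the authors' pipeline (see the linked repository for that): the model name, prompt wording, and image path are illustrative assumptions.

    # Illustrative Chain-of-Thought prompt for an RPM-style puzzle image,
    # sent to a multimodal chat model via the OpenAI Python SDK.
    # The model name, prompt wording, and file name are assumptions for
    # this sketch, not the paper's actual setup.
    import base64
    from openai import OpenAI

    COT_PROMPT = (
        "The image shows a 3x3 Raven's Progressive Matrices puzzle with the "
        "bottom-right cell missing, followed by labeled answer choices. "
        "Reason step by step: first describe the pattern along each row and "
        "column, then deduce what the missing cell must contain, and only "
        "then state the letter of the matching answer choice."
    )

    def solve_rpm_item(image_path: str, model: str = "gpt-4o") -> str:
        # Encode the puzzle image so it can be embedded in the request.
        with open(image_path, "rb") as f:
            image_b64 = base64.b64encode(f.read()).decode("utf-8")

        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        response = client.chat.completions.create(
            model=model,
            messages=[{
                "role": "user",
                "content": [
                    {"type": "text", "text": COT_PROMPT},
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
                ],
            }],
        )
        return response.choices[0].message.content

    if __name__ == "__main__":
        print(solve_rpm_item("rpm_item_001.png"))  # hypothetical file name

The "reason step by step ... only then state the letter" structure is the essence of Chain-of-Thought prompting: the model is asked to produce intermediate reasoning before committing to a final answer.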

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2401.12117
Document Type: Working Paper