TabMoE: A General Framework for Diverse Table-Based Reasoning with Mixture-of-Experts
- Author
- Wu, Jie and Hou, Mengshu
- Subjects
- Natural language processing; Algorithms
- Abstract
Tables are a widely adopted data format, and the semantic understanding and logical inference of tabular data have attracted considerable academic interest. In recent years, the pre-training-and-fine-tuning paradigm has become increasingly prominent in research on table understanding. However, existing table-based pre-training methods are often limited to a single task while requiring substantial computational resources, which hinders their efficiency and applicability. In this paper, we introduce TabMoE, a novel framework based on mixture-of-experts, designed to handle a wide range of tasks involving logical reasoning over tabular data. Each expert within the model specializes in a distinct logical function and is trained with a hard Expectation-Maximization (EM) algorithm. Notably, this framework removes the dependency on tabular pre-training, instead relying exclusively on limited task-specific data to significantly enhance models' inferential capabilities. We conduct empirical experiments across three typical tasks related to tabular data: table-based question answering, table-based fact verification, and table-to-text generation. The experimental results underscore the innovation and feasibility of our framework.
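The abstract's core training idea, experts that each specialize in one function and are fit with a hard EM loop, can be sketched on a toy problem. This is an illustrative assumption, not the paper's actual architecture: the "experts" here are simple linear predictors, and each example is assigned to the single expert with the lowest loss (hard E-step) before each expert is refit on only the examples it won (M-step).

```python
# Hypothetical hard-EM mixture-of-experts sketch (toy linear experts,
# not TabMoE's actual model). Each expert should end up owning one
# data regime, mirroring the "one logical function per expert" idea.
import random

random.seed(0)

# Two toy experts: y = w * x, with randomly initialized slopes.
experts = [{"w": random.uniform(-1, 1)} for _ in range(2)]

# Toy data drawn from two distinct regimes (slopes 2 and -3).
data = [(x, 2 * x) for x in range(1, 6)] + [(x, -3 * x) for x in range(1, 6)]

def loss(expert, x, y):
    return (expert["w"] * x - y) ** 2

for _ in range(50):
    # Hard E-step: route each example to its lowest-loss expert.
    buckets = [[] for _ in experts]
    for x, y in data:
        k = min(range(len(experts)), key=lambda i: loss(experts[i], x, y))
        buckets[k].append((x, y))
    # M-step: refit each expert only on the examples it won
    # (closed-form least squares for the toy linear model).
    for expert, bucket in zip(experts, buckets):
        if bucket:
            expert["w"] = sum(x * y for x, y in bucket) / sum(x * x for x, _ in bucket)

slopes = sorted(round(e["w"], 2) for e in experts)
print(slopes)  # → [-3.0, 2.0]: each expert recovers one regime
```

The hard assignment is what drives specialization: because an expert only ever trains on examples it already wins, the experts partition the data rather than averaging over it.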
- Published
- 2024