
Few-Shot Classification of Interactive Activities of Daily Living (InteractADL)

Authors:
Durante, Zane
Harries, Robathan
Vendrow, Edward
Luo, Zelun
Kyuragi, Yuta
Kozuka, Kazuki
Fei-Fei, Li
Adeli, Ehsan
Publication Year:
2024

Abstract

Understanding Activities of Daily Living (ADLs) is a crucial step for many applications, including assistive robots, smart homes, and healthcare. However, to date, few benchmarks and methods have focused on complex ADLs, especially those involving multi-person interactions in home environments. In this paper, we propose a new dataset and benchmark, InteractADL, for understanding complex ADLs that involve interaction between humans (and objects). Complex ADLs occurring in home environments follow a challenging long-tailed distribution, due to the rarity of multi-person interactions, and pose fine-grained visual recognition challenges, due to the presence of semantically and visually similar classes. To address these issues, we propose a novel method for fine-grained few-shot video classification, called Name Tuning, that enables greater semantic separability by learning optimal class name vectors. We show that Name Tuning can be combined with existing prompt tuning strategies to learn the entire input text (rather than only the prompt or the class names), and demonstrate improved performance for few-shot classification on InteractADL and four other fine-grained visual classification benchmarks. For transparency and reproducibility, we release our code at https://github.com/zanedurante/vlm_benchmark.
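The core idea of Name Tuning, as described in the abstract, is to keep the text encoder frozen and learn the embedding vectors that stand in for the class names, rather than (or in addition to) the prompt context. The sketch below is a minimal, self-contained illustration of that idea under stated assumptions: the "frozen text encoder" is a stand-in linear layer, the video features are synthetic, and all shapes and hyperparameters are illustrative — this is not the authors' implementation (see their released code for that).

```python
# Hedged sketch of Name Tuning: learn class-name vectors against a frozen
# text encoder so that text features separate few-shot support examples.
# Everything here (encoder, features, sizes) is a synthetic stand-in.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
dim, n_classes, n_shots = 32, 4, 5

# Stand-in for a frozen text encoder (a real system would use e.g. a
# CLIP-style encoder); its parameters receive no gradient updates.
frozen_encoder = torch.nn.Linear(dim, dim)
for p in frozen_encoder.parameters():
    p.requires_grad_(False)

# Fixed prompt context embedding ("a video of a ...") plus learnable
# class-name vectors -- the only trainable parameters in this sketch.
prompt = torch.randn(dim)
name_vecs = torch.nn.Parameter(0.1 * torch.randn(n_classes, dim))

# Synthetic few-shot support set: n_shots video features per class.
support = torch.randn(n_classes * n_shots, dim)
labels = torch.arange(n_classes).repeat_interleave(n_shots)

opt = torch.optim.Adam([name_vecs], lr=0.1)
for step in range(100):
    text_feats = frozen_encoder(prompt + name_vecs)  # (n_classes, dim)
    # Cosine-similarity logits between support features and text features.
    logits = F.normalize(support, dim=-1) @ F.normalize(text_feats, dim=-1).T
    loss = F.cross_entropy(10.0 * logits, labels)    # temperature-scaled
    opt.zero_grad()
    loss.backward()
    opt.step()

acc = (logits.argmax(dim=-1) == labels).float().mean().item()
```

Combining this with prompt tuning, as the abstract describes, would amount to also making `prompt` a `torch.nn.Parameter` so that the entire input text embedding is learned.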

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2406.01662
Document Type:
Working Paper