Few-Shot Classification of Interactive Activities of Daily Living (InteractADL)

Understanding Activities of Daily Living (ADLs) is a crucial step for different applications including assistive robots, smart homes, and healthcare. However, to date, few benchmarks and methods have focused on complex ADLs, especially those involving multi-person interactions in home environments. In this paper, we propose a new dataset and benchmark, InteractADL, for understanding complex ADLs that involve interaction between humans (and objects). Furthermore, complex ADLs occurring in home environments comprise a challenging long-tailed distribution due to the rarity of multi-person interactions, and pose fine-grained visual recognition tasks due to the presence of semantically and visually similar classes. To address these issues, we propose a novel method for fine-grained few-shot video classification called Name Tuning that enables greater semantic separability by learning optimal class name vectors. We show that Name Tuning can be combined with existing prompt tuning strategies to learn the entire input text (rather than only learning the prompt or class names) and demonstrate improved performance for few-shot classification on InteractADL and 4 other fine-grained visual classification benchmarks. For transparency and reproducibility, we release our code at https://github.com/zanedurante/vlm_benchmark.
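
The sketch below illustrates the core idea described in the abstract: keeping a vision-language model frozen while learning the class name token vectors (optionally alongside fixed or tuned prompt context) and classifying by cosine similarity between text and video features. It is a minimal, hypothetical sketch, not the released implementation; the names (`NameTuner`, `text_encoder`, `prompt_ctx`) and the assumption that the frozen text encoder maps token embeddings of shape (classes, length, dim) to per-class features are illustrative placeholders.

```python
# Illustrative sketch of "Name Tuning": learn class name vectors against a
# frozen CLIP-style text encoder; all module names and shapes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NameTuner(nn.Module):
    def __init__(self, num_classes: int, name_len: int, embed_dim: int,
                 prompt_ctx: torch.Tensor, text_encoder: nn.Module):
        super().__init__()
        # Learnable "class name" token vectors: one short sequence per class.
        self.name_vectors = nn.Parameter(
            0.02 * torch.randn(num_classes, name_len, embed_dim))
        # Prompt context (e.g. the embedded tokens of "a video of a"); kept
        # fixed here, but could itself be learned when combined with prompt tuning.
        self.register_buffer("prompt_ctx", prompt_ctx)  # (ctx_len, embed_dim)
        self.text_encoder = text_encoder  # assumed frozen, maps (C, L, D) -> (C, D)
        for p in self.text_encoder.parameters():
            p.requires_grad_(False)

    def class_embeddings(self) -> torch.Tensor:
        # Concatenate the prompt context with the learnable name vectors.
        ctx = self.prompt_ctx.unsqueeze(0).expand(self.name_vectors.size(0), -1, -1)
        tokens = torch.cat([ctx, self.name_vectors], dim=1)
        return F.normalize(self.text_encoder(tokens), dim=-1)  # (C, embed_dim)

    def forward(self, video_feats: torch.Tensor) -> torch.Tensor:
        # video_feats: (B, embed_dim) from a frozen video/image encoder.
        v = F.normalize(video_feats, dim=-1)
        return 100.0 * v @ self.class_embeddings().t()  # logits over classes
```

In a few-shot setting, only `name_vectors` (and, if desired, the prompt context) would be optimized with a standard cross-entropy loss over the labeled support clips, leaving both encoders frozen.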