Distribution Matching for Multi-Task Learning of Classification Tasks: a Large-Scale Study on Faces & Beyond

Kollias, Dimitrios ; Sharmanska, Viktoriia ; Zafeiriou, Stefanos
Abstract

Multi-Task Learning (MTL) is a framework in which multiple related tasks are learned jointly and benefit from a shared representation space or parameter transfer. To provide sufficient learning support, modern MTL uses annotated data with full, or sufficiently large, overlap across tasks, i.e., each input sample is annotated for all, or most, of the tasks. However, collecting such annotations is prohibitive in many real applications and cannot benefit from datasets available for individual tasks. In this work, we challenge this setup and show that MTL can succeed on classification tasks with little or non-overlapping annotations, or when there is a large discrepancy in the amount of labeled data per task. We explore task-relatedness for co-annotation and co-training, and propose a novel approach in which knowledge exchange between the tasks is enabled via distribution matching. To demonstrate the general applicability of our method, we conducted diverse case studies in the domains of affective computing, face recognition, species recognition, and shopping item classification using nine datasets. Our large-scale study of affective tasks for basic expression recognition and facial action unit detection illustrates that our approach is network-agnostic and brings large performance improvements compared to the state-of-the-art in both tasks and across all studied databases. In all case studies, we show that co-training via task-relatedness is advantageous and prevents negative transfer (which occurs when a multi-task model's performance is worse than that of at least one single-task model).
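To give a rough idea of the kind of objective the abstract refers to, the sketch below shows one common way to implement distribution matching between two classification tasks: a KL-divergence term that pulls one task head's predicted distribution toward a soft "co-annotation" distribution derived from a related task. This is a minimal illustration under our own assumptions (function names, the KL direction, and the soft-label source are hypothetical), not the paper's actual formulation.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distribution_matching_loss(logits_task_a, soft_labels_from_task_b, eps=1e-12):
    """Mean KL(soft_labels || predictions) over a batch.

    logits_task_a: (N, C) raw outputs of task A's head on samples that
        lack task-A annotations.
    soft_labels_from_task_b: (N, C) soft target distributions inferred
        from a related task B (a hypothetical co-annotation step).
    """
    p = np.clip(soft_labels_from_task_b, eps, 1.0)  # targets
    q = np.clip(softmax(logits_task_a), eps, 1.0)   # predictions
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1)))

# Example: the loss vanishes when predictions already match the targets,
# and grows as the two distributions diverge.
targets = np.array([[0.7, 0.2, 0.1]])
matched = distribution_matching_loss(np.log(targets), targets)
mismatched = distribution_matching_loss(np.array([[0.0, 0.0, 5.0]]), targets)
```

In a training loop, a term like this would be added (with a weighting coefficient) to each task's supervised cross-entropy, letting annotations from one task provide a learning signal for the other on samples where its own labels are missing.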