Exemplar-free Class Incremental Learning via Discriminative and Comparable One-class Classifiers

Exemplar-free class incremental learning requires classification models to learn new class knowledge incrementally without retaining any old samples. Recently, the framework based on parallel one-class classifiers (POC), which trains a one-class classifier (OCC) independently for each category, has attracted extensive attention, since it naturally avoids catastrophic forgetting. POC, however, suffers from weak discriminability and comparability due to its independent training strategy for different OCCs. To meet this challenge, we propose a new framework, named Discriminative and Comparable One-class classifiers for Incremental Learning (DisCOIL). DisCOIL follows the basic principle of POC, but it adopts variational auto-encoders (VAEs) instead of other well-established one-class classifiers (e.g., deep SVDD), because a trained VAE can not only estimate the probability that an input sample belongs to a class but also generate pseudo samples of the class to assist in learning new tasks. With this advantage, DisCOIL trains a new-class VAE in contrast with the old-class VAEs, which forces the new-class VAE to reconstruct new-class samples better but old-class pseudo samples worse, thus enhancing comparability. Furthermore, DisCOIL introduces a hinge reconstruction loss to ensure discriminability. We evaluate our method extensively on MNIST, CIFAR10, and Tiny-ImageNet. The experimental results show that DisCOIL achieves state-of-the-art performance.
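The contrastive training idea in the abstract (reconstruct new-class samples well, old-class pseudo samples poorly, with a hinge on the latter) can be sketched as a simple loss function. This is a minimal illustration, not the paper's implementation: the function name, the margin value, and the use of per-sample reconstruction errors as inputs are all assumptions.

```python
import numpy as np

def hinge_contrastive_recon_loss(err_new, err_old_pseudo, margin=1.0):
    """Hypothetical sketch of the contrastive hinge objective.

    err_new        -- per-sample reconstruction errors of the new-class
                      VAE on genuine new-class samples
    err_old_pseudo -- per-sample reconstruction errors of the new-class
                      VAE on pseudo samples drawn from old-class VAEs
    margin         -- assumed hinge margin (not specified in the abstract)
    """
    # Pull term: minimize reconstruction error on new-class samples.
    pull = np.mean(err_new)
    # Push term: hinge penalty that fires only when an old-class pseudo
    # sample is reconstructed too well (error below the margin).
    push = np.mean(np.maximum(0.0, margin - err_old_pseudo))
    return pull + push

# Example: old-class pseudo samples already reconstruct badly (errors
# above the margin), so only the pull term contributes.
loss = hinge_contrastive_recon_loss(
    np.array([0.1, 0.2]), np.array([2.0, 3.0]), margin=1.0
)
```

Under this sketch, reconstruction error itself becomes a comparable score across classifiers: at test time a sample can be assigned to the class whose VAE reconstructs it best.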