Tackling Catastrophic Forgetting and Background Shift in Continual Semantic Segmentation

Deep learning approaches are nowadays ubiquitously used to tackle computer vision tasks such as semantic segmentation, requiring large datasets and substantial computational power. Continual learning for semantic segmentation (CSS) is an emerging trend in which an old model is updated by sequentially adding new classes. However, continual learning methods are usually prone to catastrophic forgetting. This issue is further aggravated in CSS where, at each step, old classes from previous iterations are collapsed into the background. In this paper, we propose Local POD, a multi-scale pooling distillation scheme that preserves long- and short-range spatial relationships at the feature level. Furthermore, we design an entropy-based pseudo-labelling of the background w.r.t. classes predicted by the old model, to deal with background shift and avoid catastrophic forgetting of the old classes. Finally, we introduce a novel rehearsal method that is particularly suited for segmentation. Our approach, called PLOP, significantly outperforms state-of-the-art methods in existing CSS scenarios, as well as in newly proposed challenging benchmarks.
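As a rough illustration of the multi-scale pooling distillation idea (a minimal sketch under stated assumptions, not the paper's exact formulation), the PyTorch snippet below compares width- and height-pooled statistics of old and new feature maps over an s x s grid of regions at several scales. The names pod_embedding and local_pod_loss, the scale set (1, 2, 4), and the plain mean-squared-error comparison are illustrative assumptions; the paper additionally normalizes the pooled embeddings.

import torch
import torch.nn.functional as F

def pod_embedding(feats: torch.Tensor) -> torch.Tensor:
    # Pool a (B, C, H, W) feature map along height and along width,
    # then concatenate the two pooled views into a (B, C, H + W) embedding.
    height_pool = feats.mean(dim=2)  # (B, C, W)
    width_pool = feats.mean(dim=3)   # (B, C, H)
    return torch.cat([width_pool, height_pool], dim=-1)

def local_pod_loss(feats_new, feats_old, scales=(1, 2, 4)):
    # At each scale s, split both maps into an s x s grid and match the
    # pooled embeddings of corresponding regions with an L2 penalty.
    loss, count = 0.0, 0
    _, _, H, W = feats_new.shape
    for s in scales:
        h_step, w_step = H // s, W // s
        for i in range(s):
            for j in range(s):
                rows = slice(i * h_step, (i + 1) * h_step)
                cols = slice(j * w_step, (j + 1) * w_step)
                emb_new = pod_embedding(feats_new[:, :, rows, cols])
                emb_old = pod_embedding(feats_old[:, :, rows, cols])
                loss = loss + F.mse_loss(emb_new, emb_old)
                count += 1
    return loss / count

The entropy-based pseudo-labelling of the background can be sketched similarly. Here a single fixed threshold tau stands in for the adaptive, class-dependent thresholding described in the paper; the function name and default values are assumptions for illustration.

import torch

def pseudo_label_background(targets, logits_old, bg_index=0, tau=1.0):
    # Background pixels for which the old model is confident (low entropy)
    # are relabelled with the old model's predicted class, mitigating
    # background shift; uncertain pixels keep the background label.
    probs = logits_old.softmax(dim=1)                            # (B, K, H, W)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)  # (B, H, W)
    old_pred = probs.argmax(dim=1)                               # (B, H, W)
    relabel = (targets == bg_index) & (entropy < tau) & (old_pred != bg_index)
    new_targets = targets.clone()
    new_targets[relabel] = old_pred[relabel]
    return new_targets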