Stochastic Subsampling With Average Pooling

Regularization of deep neural networks has been an important issue for achieving higher generalization performance without overfitting. Although the popular Dropout method provides a regularization effect, it causes inconsistent properties in the output, which may degrade the performance of deep neural networks. In this study, we propose a new module called stochastic average pooling, which incorporates Dropout-like stochasticity into pooling. We describe the properties of stochastic subsampling and average pooling and leverage them to design a module free of the inconsistency problem. Stochastic average pooling achieves a regularization effect without the potential performance degradation caused by the inconsistency issue and can easily be plugged into existing architectures of deep neural networks. Experiments demonstrate that replacing existing average pooling with stochastic average pooling yields consistent improvements across a variety of tasks, datasets, and models.
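The abstract does not spell out the mechanics of the module, but the general idea of combining random subsampling with averaging in a drop-in pooling layer can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the class name `StochasticAvgPool2d`, the `keep_ratio` parameter, and the per-window masking scheme are all assumptions made for illustration.

```python
# Illustrative sketch (assumed details, not the authors' exact method):
# during training, each pooling window averages a random subset of its
# elements; at evaluation time the layer falls back to plain average pooling.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticAvgPool2d(nn.Module):
    def __init__(self, kernel_size=2, keep_ratio=0.5):
        super().__init__()
        self.kernel_size = kernel_size      # assumes H and W are divisible by kernel_size
        self.keep_ratio = keep_ratio        # fraction of window elements kept (assumed)

    def forward(self, x):
        if not self.training:
            # Deterministic average pooling at inference time.
            return F.avg_pool2d(x, self.kernel_size)
        n, c, h, w = x.shape
        k = self.kernel_size
        # Unfold each k x k window into its own trailing dimension.
        windows = x.unfold(2, k, k).unfold(3, k, k)           # (n, c, h//k, w//k, k, k)
        windows = windows.reshape(n, c, h // k, w // k, k * k)
        # Randomly keep a subset of elements per window and average only those,
        # so the output stays on the same scale as plain average pooling.
        mask = (torch.rand_like(windows) < self.keep_ratio).float()
        kept = (windows * mask).sum(dim=-1)
        count = mask.sum(dim=-1).clamp(min=1.0)               # avoid division by zero
        return kept / count


# Usage: drop-in replacement for nn.AvgPool2d(2) in an existing network.
pool = StochasticAvgPool2d(kernel_size=2, keep_ratio=0.5)
out = pool(torch.randn(8, 64, 32, 32))                        # -> shape (8, 64, 16, 16)
```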