
BinaryConnect: Training Deep Neural Networks with binary weights during propagations

Matthieu Courbariaux, Yoshua Bengio, Jean-Pierre David
Abstract

Deep Neural Networks (DNN) have achieved state-of-the-art results in a wide range of tasks, with the best results obtained with large training sets and large models. In the past, GPUs enabled these breakthroughs because of their greater computational speed. In the future, faster computation at both training and test time is likely to be crucial for further progress and for consumer applications on low-power devices. As a result, there is much interest in research and development of dedicated hardware for Deep Learning (DL). Binary weights, i.e., weights which are constrained to only two possible values (e.g. -1 or 1), would bring great benefits to specialized DL hardware by replacing many multiply-accumulate operations with simple accumulations, as multipliers are the most space- and power-hungry components of the digital implementation of neural networks. We introduce BinaryConnect, a method which consists in training a DNN with binary weights during the forward and backward propagations, while retaining the precision of the stored weights in which gradients are accumulated. Like other dropout schemes, we show that BinaryConnect acts as a regularizer, and we obtain near state-of-the-art results with BinaryConnect on the permutation-invariant MNIST, CIFAR-10, and SVHN.
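The core training loop described in the abstract is easy to sketch. Below is a minimal NumPy illustration of BinaryConnect for a single linear layer with a squared-error loss; the layer sizes, learning rate, and data are made up for the example, while the sign binarization, propagation through binary weights, and updates accumulated in the real-valued weights follow the method's description (the clipping step reflects the paper's practice of keeping real-valued weights within the binarization range). This is a sketch under those assumptions, not the authors' reference implementation.

```python
import numpy as np

def binarize(w):
    # Deterministic binarization: the sign of the real-valued weight.
    return np.where(w >= 0, 1.0, -1.0)

rng = np.random.default_rng(0)
W_real = rng.normal(scale=0.1, size=(4, 3))  # real-valued weights in which gradients accumulate
x = rng.normal(size=(8, 4))                  # toy batch of inputs
y = rng.normal(size=(8, 3))                  # toy batch of targets
lr = 0.01

for step in range(100):
    W_bin = binarize(W_real)         # binary weights used during both propagations
    out = x @ W_bin                  # forward pass with binary weights
    grad_out = (out - y) / len(x)    # gradient of the mean squared-error loss w.r.t. the output
    grad_W = x.T @ grad_out          # backward pass, also through the binary weights
    W_real -= lr * grad_W            # update accumulated in the real-valued weights
    W_real = np.clip(W_real, -1, 1)  # keep real-valued weights inside the binarization range
```

Note that only the forward and backward propagations use the binarized weights; the parameter update itself is applied to the full-precision copy, which is what allows small gradient contributions to accumulate across steps. The paper also describes a stochastic binarization variant, where each weight is sampled to be +1 or -1 with a probability derived from its real value.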
