Progressive Stochastic Binarization of Deep Networks
- Resource Type: Conference
- Authors: Hartmann, David; Wand, Michael
- Source: 2019 Fifth Workshop on Energy Efficient Machine Learning and Cognitive Computing - NeurIPS Edition (EMC2-NIPS), pp. 31-35, Dec. 2019
- Subjects: Computing and Processing; Training; Quantization (signal); Stochastic processes; Focusing; Machine learning; Tools; Licenses; neural networks; stochastic binarization; hardware
- Abstract: We propose a stochastic binarization scheme for deep networks that approximates scalar products of weights and activations using progressive sampling of stochastic shifts. This representation has bounded relative error and thereby permits high accuracies at moderate sampling costs. Further, it allows for the first time a fully dynamic and localized control of accuracy. This not only enables a choice of accuracy at run-time, but also provides a new tool for adaptively focusing computational attention. A reference implementation is provided under a free license (https://github.com/JGU-VC/progressive_stochastic_binarization).
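The abstract's central idea — approximating a scalar product by averaging dot products taken with stochastically quantized, shift-based (power-of-two) weights, where adding samples progressively tightens the estimate — can be sketched as follows. This is a hypothetical illustration of the general technique, not the authors' implementation (see the linked repository for that): the names `stochastic_shift` and `progressive_dot` are invented, and the unbiased log-domain rounding rule used here is an assumption, not necessarily the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_shift(x, rng):
    """Quantize each entry of x to a signed power of two (a bit shift).

    The exponent log2|x| is rounded up or down stochastically, with the
    round-up probability chosen so the quantized magnitude is unbiased:
    E[2^q] == |x|. (Assumed rule for illustration, not the paper's code.)
    """
    sign = np.sign(x)
    mag = np.where(x == 0, 1e-12, np.abs(x))  # avoid log2(0)
    e = np.log2(mag)
    lo = np.floor(e)
    # Solve (1 - p) * 2^lo + p * 2^(lo + 1) = |x| for the round-up probability p.
    p_up = (2.0 ** e - 2.0 ** lo) / (2.0 ** lo)  # lies in [0, 1)
    q = lo + (rng.random(x.shape) < p_up)
    return sign * 2.0 ** q

def progressive_dot(w, a, n_samples, rng):
    """Approximate <w, a> by averaging over stochastically shifted weights.

    Each sample replaces w by power-of-two values (cheap shift arithmetic
    in hardware); averaging more samples reduces the variance, which is
    the "progressive" accuracy control described in the abstract.
    """
    return np.mean([stochastic_shift(w, rng) @ a for _ in range(n_samples)])

w = rng.standard_normal(256)
a = rng.standard_normal(256)
exact = w @ a
approx = progressive_dot(w, a, n_samples=64, rng=rng)
```

Because each quantized weight is unbiased, the estimator converges to the exact scalar product as the sample count grows, so accuracy can be traded against sampling cost at run-time simply by choosing `n_samples` per layer or per input.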