Use of Stochastic Switching in the Training of Binarized Neural Networks
Most prior research on Binarized Neural Networks (BNNs) has been motivated by a desire to optimize Deep Learning computations on conventional hardware. These methods rely on bit-wise operations to drastically reduce computation time; however, to address the resulting loss in accuracy, it is common practice to reintroduce continuous parameters and train using batch normalization. Here
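One widely used form of stochastic switching in BNN training binarizes each real-valued weight to +1 with a probability given by a hard sigmoid of the weight, and to -1 otherwise, so the binary weight equals the real weight in expectation. The following is a minimal NumPy sketch under that assumption; the function names (`hard_sigmoid`, `stochastic_binarize`) are illustrative, not from the source.

```python
import numpy as np

def hard_sigmoid(x):
    # Clip (x + 1) / 2 to [0, 1]: a cheap, common choice of switching probability.
    return np.clip((x + 1.0) / 2.0, 0.0, 1.0)

def stochastic_binarize(w, rng):
    # Each weight becomes +1 with probability hard_sigmoid(w) and -1 otherwise,
    # so the expected binary weight tracks the underlying real value.
    p = hard_sigmoid(w)
    return np.where(rng.random(w.shape) < p, 1.0, -1.0)

rng = np.random.default_rng(0)
w = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
b = stochastic_binarize(w, rng)
# Weights with |w| >= 1 are binarized deterministically (p is 0 or 1);
# weights near 0 switch sign stochastically from draw to draw.
```

During the forward pass only the binary values `b` are used, which is what allows multiplications to be replaced by bit-wise operations, while the real-valued `w` is retained and updated by the gradient step.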
