Journal articles

Energy Efficient Learning With Low Resolution Stochastic Domain Wall Synapse for Deep Neural Networks

Abstract: We demonstrate that extremely low-resolution, quantized (nominally 5-state) synapses with large stochastic variations in synaptic weights can be energy efficient and achieve reasonably high testing accuracies compared to Deep Neural Networks (DNNs) of similar size that use floating-point-precision synaptic weights. Specifically, voltage-controlled domain wall (DW) devices exhibit stochastic behavior and can encode only a limited number of states; however, they are extremely energy efficient during both training and inference. In this study, we propose both in-situ and ex-situ training algorithms, based on a modification of the algorithm proposed by Hubara et al., 2017, which works well with quantized synaptic weights, and train several 5-layer DNNs on the MNIST dataset using 2-, 3- and 5-state DW devices as synapses. For in-situ training, a separate high-precision memory unit preserves and accumulates the weight gradients, which prevents accuracy loss due to weight quantization. For ex-situ training, a precursor DNN is first trained based on weight quantization and the DW device model. Moreover, a noise tolerance margin is included in both training methods to account for the intrinsic device noise. The highest inference accuracies we obtain after in-situ and ex-situ training are ∼96.67% and ∼96.63%, respectively, very close to the baseline accuracy of ∼97.1% obtained from a DNN of similar topology with floating-point-precision weights and no stochasticity. The large interstate intervals due to quantized weights and the noise tolerance margin enable in-situ training with a significantly lower number of programming attempts. Our proposed approach demonstrates the possibility of at least two orders of magnitude in energy savings compared to a floating-point approach implemented in CMOS.
This approach is particularly attractive for low-power intelligent edge devices, where ex-situ learning can be utilized for energy-efficient non-adaptive tasks and in-situ learning can provide the opportunity to adapt and learn in a dynamically evolving environment.
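The in-situ scheme described in the abstract (high-precision gradient accumulation, low-resolution stochastic device writes, and a noise tolerance margin) can be illustrated with a minimal NumPy sketch. This is a hypothetical model, not the paper's implementation: the 5 evenly spaced weight levels, the Gaussian write-noise standard deviation, the margin value, and all function names (`quantize`, `program_device`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(w, n_states=5, w_max=1.0):
    # Map high-precision weights onto n_states evenly spaced levels
    # in [-w_max, w_max] (illustrative level spacing, not the paper's).
    levels = np.linspace(-w_max, w_max, n_states)
    idx = np.abs(w[..., None] - levels).argmin(axis=-1)
    return levels[idx]

def program_device(w_target, sigma=0.05):
    # Toy model of a stochastic DW write: the stored weight deviates
    # from the target by Gaussian noise (assumed noise model).
    return w_target + rng.normal(0.0, sigma, size=w_target.shape)

# High-precision shadow weights preserve and accumulate gradients,
# as in the in-situ scheme; only quantized, noisy device values
# would be used in the forward pass.
w_hp = rng.uniform(-1.0, 1.0, size=(4,))
grad = np.array([0.1, -0.2, 0.05, 0.0])  # placeholder gradient
lr = 0.1

w_hp -= lr * grad                 # update accumulated at full precision
w_q = quantize(w_hp)              # nominal 5-state programming target
w_dev = program_device(w_q)       # stochastic device write

# Noise tolerance margin: reprogram a device only if its written weight
# falls outside the margin around the target, reducing write attempts.
margin = 0.1                      # illustrative value
needs_reprogram = np.abs(w_dev - w_q) > margin
```

Because the interstate intervals are large relative to the margin, most writes land within tolerance on the first attempt, which is the mechanism the abstract credits for the reduced number of programming attempts.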
Contributor: Damien Querlioz
Submitted on: Saturday, November 19, 2022 - 11:43:37 AM
Last modification on: Monday, November 28, 2022 - 11:20:26 AM





Walid Al Misba, Mark Lozano, D. Querlioz, Jayasimha Atulasimha. Energy Efficient Learning With Low Resolution Stochastic Domain Wall Synapse for Deep Neural Networks. IEEE Access, 2022, 10, pp.84946 - 84959. ⟨10.1109/access.2022.3196688⟩. ⟨hal-03861118⟩


