Leveraging Highly Approximated Multipliers in DNN Inference
Abstract:
In this work, we present our control variate approximation technique, which enables the exploitation of highly approximate multipliers in Deep Neural Network (DNN) accelerators. Our approach requires no retraining and significantly decreases the error induced by approximate multiplications, improving overall inference accuracy. As a result, control variate approximation satisfies tight accuracy-loss constraints while boosting power savings. Our experimental evaluation, across six different DNNs and several approximate multipliers, demonstrates the versatility of the control variate technique and shows that, compared to the accurate design, it achieves the same performance, 45% power reduction, and less than 1% average accuracy loss. Compared to the corresponding approximate designs without our technique, the error correction of the control variate method improves accuracy by 1.9x on average.
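To make the idea concrete, the following minimal Python sketch illustrates a control-variate style of error correction under illustrative assumptions: the toy approximate multiplier simply drops the low bits of the activation, and the neuron's accumulated error is corrected with a cheap term built from the exactly computed sum of weights scaled by the multiplier's pre-characterized mean error. The multiplier model, bit-widths, and constants below are assumptions for illustration, not the exact designs evaluated in the paper, and no retraining is involved.

```python
import numpy as np

# Illustrative sketch of control-variate error correction for one neuron.
# Assumption: the approximate multiplier zeroes the K low activation bits,
# so each product's error is (a_low * w). The accumulated error therefore
# correlates with sum(w), which is known exactly and cheaply; scaling it by
# the pre-characterized mean of a_low gives the correction term.

K = 4                      # low activation bits dropped by the toy multiplier
MASK = (1 << K) - 1


def approx_mul(a: int, w: int) -> int:
    """Toy approximate multiplier: discard the K LSBs of the activation."""
    return (a & ~MASK) * w


def neuron_approx(acts, weights, mean_a_low):
    """Approximate MAC plus control-variate correction.

    mean_a_low is E[a & MASK], characterized offline; sum(weights) is known
    at compile time, so the correction costs one multiply-add per neuron.
    """
    approx_sum = sum(approx_mul(int(a), int(w)) for a, w in zip(acts, weights))
    correction = mean_a_low * int(np.sum(weights))   # control-variate term
    return approx_sum + correction


rng = np.random.default_rng(0)
acts = rng.integers(0, 256, size=1024)        # assumed 8-bit activations
weights = rng.integers(-128, 128, size=1024)  # assumed 8-bit weights

exact = int(np.dot(acts, weights))
plain = sum(approx_mul(int(a), int(w)) for a, w in zip(acts, weights))
corrected = neuron_approx(acts, weights, mean_a_low=MASK / 2)  # E[a_low] for uniform a

print(f"exact              = {exact}")
print(f"approximate error  = {exact - plain}")
print(f"corrected error    = {exact - corrected:.0f}")
```

Running the sketch shows the corrected accumulation lying much closer to the exact result than the plain approximate one, which is the effect the control variate exploits at the neuron level.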