Leveraging Highly Approximated Multipliers in DNN Inference
Abstract:
In this work, we present a control variate approximation technique that enables the exploitation of highly approximate multipliers in Deep Neural Network (DNN) accelerators. Our approach requires no retraining and significantly reduces the error induced by approximate multiplications, improving both power efficiency and accuracy. As a result, control variate approximation satisfies tight error-resilience requirements while preserving the power savings. Our experimental evaluation, across six different DNNs and several approximate multiplier designs, demonstrates the versatility of the control variate technique: compared to accurate designs, it achieves the same performance with 45% power reduction and less than 1% average accuracy loss. Compared to the corresponding approximate designs without our technique, the error correction of the control variate method improves accuracy by 1.9x on average.
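To illustrate the general idea, the sketch below shows a toy form of additive error compensation for a truncation-based approximate multiplier in a dot product (the core operation of DNN inference). This is a minimal, hypothetical Python model, not the paper's actual hardware design: the multiplier `approx_mul`, the truncation width `K`, and the mean-error correction term are all illustrative assumptions. The correction exploits the fact that the truncation error of each product is statistically predictable, so its expected contribution can be added back to the accumulated sum at negligible cost.

```python
import random

K = 4  # hypothetical number of LSBs truncated by the approximate multiplier

def approx_mul(a, b, k=K):
    # Illustrative approximate multiplier: drop the k LSBs of operand a,
    # trading accuracy for (in hardware) a smaller, lower-power circuit.
    return (a >> k << k) * b

def exact_dot(xs, ws):
    return sum(a * b for a, b in zip(xs, ws))

def approx_dot(xs, ws):
    return sum(approx_mul(a, b) for a, b in zip(xs, ws))

def corrected_dot(xs, ws, k=K):
    # Error compensation in the control-variate spirit: each product's
    # truncation error is r * b with r = a mod 2^k. For roughly uniform
    # operands, E[r] = (2^k - 1) / 2, so the expected total error can be
    # added back using only one extra multiply-accumulate per dot product.
    mean_r = (2**k - 1) / 2
    return approx_dot(xs, ws) + mean_r * sum(ws)

random.seed(0)
xs = [random.randrange(256) for _ in range(64)]
ws = [random.randrange(256) for _ in range(64)]

exact = exact_dot(xs, ws)
approx = approx_dot(xs, ws)
corrected = corrected_dot(xs, ws)
```

On random 8-bit operands, the corrected sum lands much closer to the exact result than the raw approximate one, since only the zero-mean residual error remains, mirroring the accuracy recovery the abstract reports without sacrificing the cheap multiplications.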
Index Terms:
Approximate computing, approximate multipliers, control variate, deep neural networks, error correction, low power.