Abstract:
The increasing demand for mobile devices and high performance computing has made energy consumption a central concern in computer technology. Mobile devices require extended battery life, but current battery technology still limits how long a device can run before it must be recharged. High performance computing, in turn, carries a high energy cost for compute-intensive applications such as data mining. As a result, optimizations at various layers of the computing platform are becoming necessary to minimize energy usage or extend the time between battery charges. This paper focuses on the back-propagation neural network algorithm, one of the popular compute-intensive data mining algorithms. The goal is to present a design methodology for developing an energy-aware algorithm. The key idea is to identify frequently used operations, called kernels, that can be implemented in hardware; optimizing these kernels for performance or energy then has a major impact on the overall application. These kernels are analyzed for their contribution to overall application energy using energy-based asymptotic analysis. The methodology then considers additional optimizations that are not related to kernels but are specific to the back-propagation algorithm. Suggestions are provided to improve performance and reduce energy consumption. Experiments show significant potential for energy reduction through the use of alternative, lower-energy kernels or through custom optimizations that trade off accuracy of the results. © 2011 IEEE.