Towards a sustainable artificial intelligence: A case study of energy efficiency in decision tree algorithms
Artificial intelligence has shown accelerated growth due to its use in solving problems across several application domains. This success results from the convergence of large amounts of data, high-performance computing, and the accuracy of machine learning (ML) algorithms. Despite the relevance of ML algorithms, little is known about their computational requirements and power consumption, and understanding them has become an important task for achieving greener computing. This work aims to evaluate the energy efficiency of ML algorithms in order to identify their energy hotspots. It also investigates what influences the energy consumption (EC) of these algorithms and how parameter design choices affect it. To answer these questions, we conducted a series of experiments using 27 different datasets, 2 decision tree algorithms, and 2 ensembles for classification and regression tasks. Our results show interesting findings, such as that simple parameter choices can have a high impact on EC, pointing toward greener strategies for AI.
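The kind of experiment the abstract describes, varying a decision tree parameter and observing its cost, can be sketched as follows. This is a minimal illustration, not the paper's methodology: it uses process CPU time as a rough proxy for energy (a real study would read hardware energy counters, e.g. Intel RAPL via tools such as pyRAPL or CodeCarbon), and the synthetic dataset and `max_depth` grid are assumptions chosen for the example.

```python
# Sketch: how a single tree parameter (max_depth) changes training cost.
# CPU time is used here as a crude proxy for energy consumption; the
# dataset and parameter values are illustrative, not from the paper.
import time
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)

costs = {}
for depth in (2, 8, None):  # shallow, medium, unbounded
    clf = DecisionTreeClassifier(max_depth=depth, random_state=0)
    t0 = time.process_time()
    clf.fit(X, y)
    costs[depth] = time.process_time() - t0

for depth, cpu_s in costs.items():
    print(f"max_depth={depth}: {cpu_s:.4f} CPU-seconds")
```

Swapping the timer for an energy meter (and the single tree for an ensemble such as a random forest) turns this skeleton into the kind of parameter-versus-EC comparison the study performs.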