Recent trends toward sustainable and energy-efficient AI and big data implementations in cloud-fog systems: A survey-based inquiry
Cloud-fog based industries today incur substantial energy costs, driven by the rapid proliferation of AI models and distributed big data (BD) framework implementations. This paper conducts a survey-based inquiry into the extent to which the IT community is conscious of the energy footprint of its hardware and software implementations, and whether practitioners' choices align with sustainability goals for efficient AI and BD deployments. Through an analysis of responses to the first set of survey questions, we characterize the interoperability among AI models, distributed BD frameworks, and cloud-fog systems. Unfortunately, only 10% of respondents reported adopting energy metrics when evaluating their implementations. Even worse, multi-level energy-consumption measurement techniques were unfamiliar to most respondents. Accordingly, we provide a practical guideline covering multi-level energy and power estimation approaches. Drawing on both the survey responses and the literature, we then devote dedicated sections to analyzing emerging efficient DNN and distributed BD implementations, chiefly in the form of reconfigurable accelerator designs based on Processing-In-Memory (PIM) and Processing-Near-Memory (PNM) architectures. Finally, to offer the IT community tangible solutions, we propose two roadmaps for sustainable actions spanning the hardware, software, and data levels.
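To make the notion of software-level energy metrics concrete, the sketch below samples the cumulative package energy counter that Intel CPUs expose on Linux through the RAPL powercap interface. This is one illustrative measurement path, not the paper's own methodology; the sysfs paths assume an Intel system with powercap enabled, and `run_workload` is a hypothetical placeholder.

```python
# Minimal sketch: package-level energy measurement via Linux RAPL
# (Running Average Power Limit counters exposed through powercap).
# Assumes an Intel CPU and a readable /sys/class/powercap hierarchy.

RAPL_ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"
RAPL_MAX_FILE = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"

def read_counter_uj(path):
    """Read a cumulative RAPL counter, in microjoules."""
    with open(path) as f:
        return int(f.read())

def energy_delta_uj(start_uj, end_uj, max_range_uj):
    """Energy consumed between two samples, handling counter wraparound."""
    if end_uj >= start_uj:
        return end_uj - start_uj
    # Counter wrapped past max_energy_range_uj between the two samples.
    return (max_range_uj - start_uj) + end_uj

# Usage (on a machine with RAPL support):
#   max_range = read_counter_uj(RAPL_MAX_FILE)
#   e0 = read_counter_uj(RAPL_ENERGY_FILE)
#   run_workload()                          # hypothetical workload
#   e1 = read_counter_uj(RAPL_ENERGY_FILE)
#   joules = energy_delta_uj(e0, e1, max_range) / 1e6
```

Dividing the energy delta by the elapsed wall-clock time yields average package power, which is how per-run energy metrics of the kind the survey asks about are typically derived at the software level.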