{"id":39377,"date":"2023-12-15T07:57:14","date_gmt":"2023-12-15T07:57:14","guid":{"rendered":"https:\/\/www.innovationnewsnetwork.com\/?p=39377"},"modified":"2023-12-15T08:32:01","modified_gmt":"2023-12-15T08:32:01","slug":"green-dat-ai-enabling-energy-efficient-ai-services","status":"publish","type":"post","link":"https:\/\/www.innovationnewsnetwork.com\/green-dat-ai-enabling-energy-efficient-ai-services\/39377\/","title":{"rendered":"GREEN.DAT.AI: Enabling energy-efficient AI services"},"content":{"rendered":"

<h2>Accelerating the green energy transition through the development of energy-efficient AI services and data spaces.<\/h2>\n

<p>The energy demand of Artificial Intelligence (AI) systems is a growing concern, especially with the increasing use of Deep Learning and other computationally intensive algorithms. This is becoming increasingly evident with the wide adoption of large language models, which use terabytes of data for training<sup>5<\/sup> and therefore require a large amount of computing power and capital investment, which in turn translates into significant carbon dioxide emissions.<\/p>\n

<p>With the increased use of AI across several domains, from manufacturing and banking to intelligent transport systems, Europe urgently needs to develop new data management solutions that harness the transformative potential of AI while meeting the European Green Deal objectives.<sup>1<\/sup><\/p>\n

<h3>Energy demand of AI services<\/h3>\n

<p>Addressing the energy demands of AI services extends beyond model training, encompassing inference, data storage, retrieval, and data centre cooling.<\/p>\n

<p>A key mitigation strategy involves designing AI algorithms with lower energy consumption. Energy efficiency, quantified as the ratio of energy consumed to output or work produced (e.g., energy per prediction or per transaction), requires a multifaceted evaluation that accounts for the hardware, the workload, and the deployment context.<\/p>\n

<p>Measuring the energy required to train an AI model, for example, involves instrumenting the hardware, or using power models that estimate consumption from hardware specifications, and collecting energy-consumption data throughout the training process. The data is then normalised by the size and complexity of the model, and the results are reported alongside other performance metrics.<\/p>\n
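The measurement-and-normalisation workflow described above can be sketched as follows. This is a minimal illustration, not the project's tooling: `read_power_watts` is a hypothetical stand-in for a real hardware counter (e.g. a GPU's on-board power sensor), simulated here so the sketch runs anywhere.

```python
import random

def read_power_watts() -> float:
    """Hypothetical power probe. In practice this would query a hardware
    counter; here we simulate readings around a 250 W draw."""
    return 250.0 + random.uniform(-10.0, 10.0)

def measure_training_energy(train_step, n_steps,
                            sample_fn=read_power_watts, step_seconds=1.0):
    """Sample power once per training step and integrate over time
    (trapezoidal rule) to get total energy in joules."""
    samples = []
    for _ in range(n_steps):
        samples.append(sample_fn())
        train_step()            # one unit of training work
    samples.append(sample_fn())
    return sum((a + b) / 2.0 * step_seconds
               for a, b in zip(samples, samples[1:]))

# Normalise by workload size so runs of different models are comparable.
energy_j = measure_training_energy(lambda: None, n_steps=100)
n_examples = 50_000
joules_per_example = energy_j / n_examples
```

Reporting joules per training example (or per prediction, at inference time) alongside accuracy is what makes energy figures comparable across models of different sizes.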

<p>Operational energy consumption can be monitored using energy monitoring tools, power meters, or specialised software, facilitating estimation of the service\u2019s carbon footprint based on the energy source.<\/p>\n
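Converting monitored energy into a carbon footprint is a simple multiplication by the carbon intensity of the energy source. A minimal sketch, where the 300 gCO2e/kWh grid figure is purely illustrative:

```python
def carbon_footprint_kg(energy_kwh: float, grid_gco2_per_kwh: float) -> float:
    """Convert consumed energy (kWh) into kg of CO2-equivalent, given the
    carbon intensity of the energy source (gCO2e per kWh)."""
    return energy_kwh * grid_gco2_per_kwh / 1000.0

# e.g. 120 kWh of monitored load on a grid assumed to emit 300 gCO2e/kWh
footprint = carbon_footprint_kg(120.0, 300.0)  # -> 36.0 kg CO2e
```

The same energy figure can therefore yield very different footprints depending on the energy source, which is why the estimate must be tied to the actual grid mix.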

<p>Additionally, energy-efficient AI systems must be designed with scalability in mind, allowing demand to be met flexibly without a significant rise in energy consumption. The key to progress is balancing efficiency and performance: a service should consume less energy while maintaining, or surpassing, acceptable levels of accuracy and speed.<\/p>\n

<h3>Developing energy-efficient AI<\/h3>\n

<p>Techniques for improving the energy efficiency of algorithms include hardware acceleration, model optimisation, and data compression.<sup>2-10<\/sup> AI developers now have a responsibility to mitigate the environmental impact of their technology.<\/p>\n
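To make the model-optimisation idea concrete, here is a minimal sketch of post-training quantisation, one common technique of this kind (not a method specific to any project cited above): storing weights as 8-bit integers rather than 32-bit floats cuts memory traffic, and hence energy, roughly fourfold, at the cost of a small, bounded reconstruction error.

```python
def quantize(weights, bits=8):
    """Map float weights onto a symmetric integer grid (int8 by default)."""
    qmax = 2 ** (bits - 1) - 1               # 127 for 8 bits
    scale = max(abs(w) for w in weights) / qmax or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from the integer codes."""
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.0, 0.89]
q, scale = quantize(weights)                 # q holds small integers
approx = dequantize(q, scale)
# Reconstruction error is bounded by half the quantisation step.
max_err = max(abs(a - b) for a, b in zip(weights, approx))
```

Similar accuracy-for-energy trade-offs underlie pruning and data compression: less data moved and stored means less energy spent.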

<p>In this context, the GREEN.DAT.AI project<sup>11<\/sup> aims to channel the potential of AI towards Europe\u2019s sustainability goals by developing novel Energy-Efficient Large-Scale Data Analytics Services, ready to use in industrial AI-based systems, while reducing the environmental impact of data management processes.<\/p>\n

<p>Large-scale data analysis poses a number of challenges, including data transfer (bringing the data and computational resources together), controlling access to the data, managing the data, standardising data formats, and integrating data of multiple different types to accurately model industrial systems.<\/p>\n

<p>The project vision is to design smart AI-powered applications that allow computing to move from data centres to edge devices. Shifting computation from the cloud to devices reduces the flow and potential leakage of sensitive data, leads to faster inference with shorter reaction times, and drives innovation in applications where these parameters are critical.<\/p>\n
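One standard way to keep raw data on edge devices while still learning a shared model is federated averaging (FedAvg): each device trains locally and only model parameters travel to the aggregator. A minimal sketch of the aggregation step, offered as a generic illustration rather than the project's implementation:

```python
def federated_average(client_params, client_sizes):
    """FedAvg aggregation: average per-client parameter vectors,
    weighted by each client's local dataset size."""
    total = sum(client_sizes)
    avg = [0.0] * len(client_params[0])
    for params, n in zip(client_params, client_sizes):
        for i, p in enumerate(params):
            avg[i] += (n / total) * p
    return avg

# Two edge devices with unequal data volumes: the larger one
# contributes proportionally more to the global model.
clients = [[1.0, 2.0], [3.0, 4.0]]
sizes = [100, 300]
global_model = federated_average(clients, sizes)  # -> [2.5, 3.5]
```

Only the short parameter vectors cross the network, which is exactly the property that minimises data transfer and the exposure of sensitive data.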

<p>Core to the project\u2019s innovation are federated learning mechanisms, which infer information from distributed data using decentralised, collaborative modelling algorithms. The ambition is to enable data sharing while minimising data transfer. To achieve this, a novel Toolbox of reusable energy-efficient AI services is being developed, including:<\/p>\n