
How does AI play a part in a world striving for net-zero?


Artificial intelligence (AI) holds the potential to solve some of the thorniest problems facing humanity, including the challenges of climate change. But at the same time, the technology - in particular, generative AI - uses a vast amount of computational power, and consequently a huge amount of energy. This is a problem, and one which is only going to grow. 

The amount of computing power required for cutting-edge AI models is doubling every five or six months, and it’s reasonable to imagine that it will continue to increase as demand for the technology booms. Data centers already consume up to 1.5% of the world’s electricity supply, and energy consumption is responsible for around 75% of man-made greenhouse gas emissions in the EU.
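To get a feel for how quickly "doubling every five or six months" compounds, here is a minimal sketch in Python that projects that growth rate forward; the five-year horizon is an arbitrary choice for illustration, not a forecast from the research above.

```python
# Illustrative compounding of "compute demand doubles every five or six months".
# The doubling periods echo the claim above; the horizon is an arbitrary example.

HORIZON_MONTHS = 60  # project five years out, purely for illustration

for doubling_months in (5, 6):
    doublings = HORIZON_MONTHS / doubling_months
    growth = 2 ** doublings
    print(f"Doubling every {doubling_months} months -> "
          f"roughly {growth:,.0f}x the compute after {HORIZON_MONTHS} months")
```

Even at the slower six-month pace, that works out to around a thousandfold increase over five years, which is why efficiency gains elsewhere struggle to keep up.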

Recent research by Gartner predicts that “by 2030, AI could help reduce global GHG emissions by 5% to 10%”. However, Gartner also predicts that, by the same year, “AI could consume up to 3.5% of the world’s electricity”.

The tech industry is facing a clear challenge: to find solutions to curb the energy demands of AI, and thus unlock the technology’s full potential to help the human race.

How AI consumes power

AI consumes power in two main ways: when models are trained, and during inference, when live data is run through a trained model to solve tasks. Research published in the journal Joule suggests that inference can account for at least 60% of the energy consumption of generative AI, and that adding AI capabilities to web searches can multiply energy demands tenfold. Users also tend to submit more queries to a generative model than to a search engine, because of the back-and-forth dialogue involved in reaching the desired result.
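As a rough, back-of-envelope illustration of how those two figures combine, the sketch below applies the tenfold search multiplier and the 60% inference share to some placeholder numbers; the per-search energy and daily query volume are assumptions made up for the example, not measured values.

```python
# Back-of-envelope estimate of generative AI query energy (illustrative only).
# Values marked "assumed" are placeholders, not measured figures.

BASELINE_WH_PER_SEARCH = 0.3      # assumed energy for a conventional web search (Wh)
AI_SEARCH_MULTIPLIER = 10         # article: adding AI multiplies search energy ~10x
QUERIES_PER_DAY = 1_000_000_000   # assumed daily query volume
INFERENCE_SHARE = 0.60            # article: inference is at least 60% of total energy

wh_per_ai_query = BASELINE_WH_PER_SEARCH * AI_SEARCH_MULTIPLIER
inference_mwh_per_day = wh_per_ai_query * QUERIES_PER_DAY / 1e6   # Wh -> MWh
# If inference is at least 60% of the total, the total is at most inference / 0.6.
total_mwh_per_day = inference_mwh_per_day / INFERENCE_SHARE

print(f"Per AI-enabled query: {wh_per_ai_query:.1f} Wh")
print(f"Inference energy per day: {inference_mwh_per_day:,.0f} MWh")
print(f"Implied total (training + inference): up to {total_mwh_per_day:,.0f} MWh/day")
```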

As new use cases for generative AI emerge around text, images and video, there will also be an increase in large models being trained, retrained and fine-tuned on a daily basis. The recent class of generative AI models requires more than a 200-fold increase in computing power to train compared with previous generations. Every new generation of models requires more computing power for inference, and more energy to train. It’s a constant cycle that continually adds demand to the underlying infrastructure.

In terms of hardware, the graphics processing units (GPUs) used for AI can consume many times the energy of a traditional CPU system. Today’s GPUs can draw up to 700 watts, and a typical installation has eight GPUs per server. This means a single server could be consuming nearly six kilowatts, compared with one kilowatt for the traditional two-socket server that enterprises use for virtualization. So, the big question is: how can we make this more sustainable?
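The arithmetic behind that "nearly six kilowatts" is simple enough to show directly. In the sketch below, the wattage figures come from the paragraph above, while the full-load annual comparison is a hypothetical extra added only to translate power draw into energy.

```python
# Server-level arithmetic behind the figures above (illustrative only).

GPU_WATTS = 700            # article: today's GPUs can draw up to 700 W
GPUS_PER_SERVER = 8        # article: a typical installation is eight GPUs per server
CPU_SERVER_WATTS = 1000    # article: ~1 kW for a traditional two-socket server

gpu_server_watts = GPU_WATTS * GPUS_PER_SERVER          # 5,600 W, i.e. nearly 6 kW
ratio = gpu_server_watts / CPU_SERVER_WATTS

# Hypothetical: run both servers flat out for a year to compare energy, not just power.
HOURS_PER_YEAR = 24 * 365
gpu_mwh = gpu_server_watts * HOURS_PER_YEAR / 1e6
cpu_mwh = CPU_SERVER_WATTS * HOURS_PER_YEAR / 1e6

print(f"GPU server draw: {gpu_server_watts / 1000:.1f} kW ({ratio:.1f}x a CPU server)")
print(f"Annual energy at full load: {gpu_mwh:.1f} MWh vs {cpu_mwh:.1f} MWh")
```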

Finding answers

The first step is to understand that sustainability is a journey: there is no single action that can ‘fix’ it when it comes to AI. But small steps can make a big difference. The computing industry is being sent a loud, clear message to create better products that use fewer resources. This call is coming from consumers and investors, but also, increasingly, from governments. Being energy efficient will, in future, be a legal requirement for organizations in the AI space: recent amendments to the EU AI Act will mandate that operators adopt state-of-the-art methods to cut energy consumption and enhance the efficiency of their AI platforms.

This can be achieved in three specific technical ways: first in the chips used to generate the computational power, second in the computers built around those chips, and third in the data center. Sustainability is increasingly becoming a competitive differentiator for both chip makers and PC makers, and will become more so as companies work toward their ESG goals. In the coming decades, new advances such as analogue chips could offer an energy-efficient alternative well suited to neural networks, according to research in the journal Nature.

In the data center, older air-cooling technologies are already struggling to deal with the high energy demands of AI, and customers are turning to liquid cooling to minimize energy consumption. By efficiently transferring the heat generated by generative AI hardware into water, customers can cut their electricity use by 30-40%. Data centers driven by renewable energy sources will be key to reducing AI’s carbon footprint. ‘As a service’ approaches to AI technology can also help to minimize waste and ensure that organizations are using the newest, most sustainable hardware, without up-front capital outlay.
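To put a 30-40% electricity saving into concrete terms, here is a rough sketch applied to a hypothetical facility; the baseline load and electricity price are illustrative assumptions, not figures from the article.

```python
# Illustrative electricity savings from liquid cooling.
# Only the 30-40% range comes from the article; all other inputs are assumptions.

FACILITY_MW = 10            # assumed average facility electricity draw (MW)
PRICE_PER_MWH = 100         # assumed electricity price (USD per MWh)
HOURS_PER_YEAR = 24 * 365

annual_mwh = FACILITY_MW * HOURS_PER_YEAR

for saving in (0.30, 0.40):   # article: liquid cooling can cut electricity use 30-40%
    saved_mwh = annual_mwh * saving
    print(f"{saving:.0%} saving: {saved_mwh:,.0f} MWh "
          f"(~${saved_mwh * PRICE_PER_MWH:,.0f}) per year")
```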

AI for good

There is a trade-off around AI and its energy demands that needs to be discussed. Some are using AI for the benefit of humankind - improving medicine or tackling climate change, for example - while others are using it to generate entertainment. This raises the question of whether those different energy demands should be viewed differently.

AI certainly has enormous potential to do good, and it is already having an impact in many areas. There are dozens of examples of how AI could mitigate the impacts of climate change, with the UN pointing out that it is not only helping to better forecast and understand extreme weather, but also offering direct help to communities affected by it.

In addition, AI can offer new understanding of the world around us, which could in turn help to curb greenhouse gas emissions. In smart cities, it has the potential to cut emissions by saving minutes or hours of heating and air conditioning at city scale, learning people’s habits and gradually turning heating or air con down in the hour before they leave their homes. The technology can also regulate traffic across a city, so that vehicles drive efficiently and traffic jams are prevented. Norwegian start-up Oceanbox.io is harnessing predictive AI on its mission to understand the depths of the ocean, forecasting the movement of currents, which can help to combat the spread of pollution and help vessels reduce their fuel use.

AI’s contribution to a net-zero world

There is no question that AI uses a lot of power, but we can tackle this step by step - by replacing air cooling with warm water cooling, harnessing renewable energy sources to drive data centers, and innovating in chip and computer design.

In so many ways, AI can also offer positives for humanity and become a powerful force to drive the world toward the UN’s Sustainable Development Goals. It has the potential to help us better understand the causes of climate change and tackle it, reduce inequality, and preserve our oceans and forests. Used responsibly, AI can go hand in hand with sustainability objectives. As the world comes together to drive towards net-zero, AI will increasingly play an important part.


This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro


