When questioned whether it really has our best intentions at heart, ChatGPT is unnervingly diplomatic.
“As an AI language model, I don’t have intentions, desires, or the capability to harm or help in the real world, my purpose is to assist users,” the machine pens, milliseconds after the question is sent.
And assist it has – according to the latest available data, ChatGPT has around 180m users worldwide and ranks as the seventeenth most-visited site in the world, ahead of movie database IMDb, professional networking site LinkedIn and streaming services Twitch and Netflix.
But all this traffic is expensive – financially and environmentally.
Recent research from the University of Washington shows that the hundreds of millions of queries logged on OpenAI’s platform consume around one gigawatt-hour a day, the equivalent energy use of 33,000 U.S. households.
A single GPT query reportedly consumes 15 times more energy than a Google search.
The platform’s omnipresence has, rightly or wrongly, catapulted artificial intelligence firmly into the cultural zeitgeist of how we work, play and, well, live.
When ChatGPT launched in late 2022, there was nothing else like it, and its combination of impressive technology and cushy funding pipeline saw user numbers rocket.
But now major technology firms have properly caught up; Microsoft Bing, Google Bard, OpenAI Playground, Amazon CodeWhisperer and GitHub Copilot are all multibillion-dollar projects from some of the sector’s heaviest hitters.
The result of the proliferation is that AI is everywhere; a McKinsey study conducted across North America and Europe last year found that 79 per cent of the 1,648 respondents had some exposure to AI either at work or at home.
By 2027, worldwide AI-related electricity consumption could increase by 85 to 134 terawatt-hours annually, based on projections of AI server production.
That is the rough equivalent of what Argentina, the Netherlands and Sweden each use in a year, and is about 0.5 per cent of the world’s current electricity use.
With the AI boom in full effect and showing no signs of slowing, how do we reckon with its potentially planet-stopping demands on our energy resources?
The simple answer is we probably don’t.
Alex de Vries, a PhD candidate at Vrije Universiteit Amsterdam, wrote in a paper examining AI’s energy usage last year that as demand for AI services grows, the energy they consume will continue to rise, regardless of efforts to offset it through efficiency gains.
Daryl Elfield, a KPMG UK partner specialising in ESG tech and data, tells City A.M. that not all generative AI systems are created equal, and that understanding this is key to gauging the energy demands of each.
“It’s more about how we consume and who will consume it, and one set of research shows that our demand for generative AI platforms will exceed the suppliers’ ability to match it using sustainable kinds of electricity,” he said.
“But the other says our approach to how we use general AI is going to flatten that curve quite quickly.”
He argues that the most energy-intensive form of usage comes from engaging with a Large Language Model (LLM) such as ChatGPT through an application programming interface (API) – a mechanism that enables different pieces of software to talk to each other, as when a weather app on your phone contacts a weather database to display the forecast on your screen.
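For readers curious what such an exchange actually looks like, here is a minimal sketch of the weather-app example – the request, response and city name are all hypothetical, and no real service or endpoint is contacted:

```python
import json

# Hypothetical request a weather app might send to a remote weather API
request_body = json.dumps({"city": "London", "units": "celsius"})
print("Request:", request_body)

# Hypothetical response the remote weather database might return
response_body = '{"city": "London", "temp_c": 11, "conditions": "overcast"}'

# The app parses the response and displays the forecast on screen
data = json.loads(response_body)
print(f"{data['city']}: {data['temp_c']}C, {data['conditions']}")
```

Every such round trip involves a request leaving one system and being processed by another – the bridging step Elfield identifies as a source of extra energy cost.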
The trick to reducing AI’s energy use could be to rely more on platforms such as Microsoft Copilot or Google Bard, which incorporate the AI function into the platform itself, meaning that connection does not have to be bridged.
“The impact of this change on energy consumption is massive but which of those trends will be more accurate is difficult to tell at this point.”
He added that the way we view the energy usage of AI like ChatGPT is also flawed: up to 60 per cent of the programme’s total energy demand is consumed during the pre-processing, development and training phases, compared with 40 per cent deployed to respond to user queries.
Global consultancy firm KPMG’s CEO Outlook 2023 report illustrates the knowledge gap around artificial intelligence systems and their environmental impact.
Ethics and costs were both cited as challenges to generative AI adoption by 63 per cent of the chief executives surveyed, while one quarter said the complexity of decarbonising supply chains was the leading barrier to achieving net zero.
Supply chains are one of the proven use cases for generative AI, with huge multinationals like Unilever, Siemens and Maersk deploying the technology to streamline systems.
But if decision-makers don’t appreciate the energy demands of the generative AI being used in this way, it’s no wonder the technology continues to raise questions about its environmental footprint.