ChannelLife Australia - Industry insider news for technology resellers
Dave Wardrop

From storage to strategy: Building AI business cases that deliver enterprise-wide impact

Thu, 13th Nov 2025

Artificial intelligence is making organisations more productive, but that productivity comes at a cost. Organisations are saving money in one place while increasing their costs in others. As time goes on, these costs are becoming more apparent and coming out into the open. Innovative solutions will be required to confront them and ensure that what AI delivers is a net benefit.

The hidden costs of the AI era

There's growing consensus that artificial intelligence has "changed the calculus" when it comes to data volumes, data centre capacity and demand on the electricity grid. For years, we've heard about the ever-increasing volumes of data being generated worldwide. This continues, but the data demands of AI are blowing previous assumptions and estimates of future growth rates out of the water. A commonly cited statistic is that 90% of the world's data was generated in the past two years; the data-hungry demands of AI could make that figure conservative. It's hard to estimate data growth with any degree of certainty anymore, because the emergence of a new AI service or model can change the situation completely.

There's also been plenty of discussion recently about the amount of data centre capacity that is required to underpin AI. Since the arrival of the current AI era, we've seen substantial investment flow into data centre campus projects worldwide, only for some pullback in recent months as more efficient AI models emerge. Getting the amount of data centre capacity right will be crucial to meeting the accelerated growth opportunities presented by AI, but the rapid change in the space makes forecasting challenging. 

AI is, for the most part, far more energy-intensive than standard web-based workloads. While a broad range of figures exists, it's been suggested that a ChatGPT query uses up to 25 times more power than a standard Google search. Obviously, this isn't an apples-to-apples comparison - a ChatGPT query produces a much more specific and tailored response to the question being asked - but given ChatGPT is estimated to receive tens or hundreds of millions of queries a day, it illustrates the step change in the energy requirements of the AI era.
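To make the scale of that step change concrete, a back-of-envelope calculation helps. The figures below are illustrative assumptions, not measurements: a commonly cited estimate of roughly 0.3 Wh per standard search, the up-to-25x multiplier mentioned above, and an assumed 100 million queries a day.

```python
# Back-of-envelope illustration only - all figures are assumptions,
# not measured values for any real service.

GOOGLE_SEARCH_WH = 0.3          # assumed energy per standard search, watt-hours
AI_QUERY_MULTIPLIER = 25        # upper-end multiplier cited in the text
QUERIES_PER_DAY = 100_000_000   # illustrative volume: 100 million queries/day

ai_query_wh = GOOGLE_SEARCH_WH * AI_QUERY_MULTIPLIER
daily_mwh = ai_query_wh * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh

print(f"Energy per AI query: {ai_query_wh} Wh")
print(f"Daily energy at {QUERIES_PER_DAY:,} queries: {daily_mwh:,.0f} MWh")
```

Under these assumptions, a day's queries consume hundreds of megawatt-hours - the kind of load that, multiplied across many AI services, starts to register at grid scale.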

The energy intensity of AI has a flow-on effect. According to the International Energy Agency (IEA), "electricity demand from AI-optimised data centres is projected to more than quadruple by 2030", with power consumption by data centres in the United States alone "on course to account for almost half of the growth in electricity demand between now and 2030." Whereas in past years grid capacity has held up against the increased demands placed on it - owing to improved management efficiency combined with additional capacity from sources like renewables - there are signs the power-hungry demands of AI workloads are outpacing the grid's ability to keep up.

What's important for adopters is to recognise these challenges and factor them into a holistic business case for AI. It is no longer enough to prepare a one-dimensional business case based on productivity gains alone. The full environmental, social and governance (ESG) costs of an AI-driven future need to be understood and factored into the strategies and decisions being made.
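One way to think about such a holistic business case is as a simple net-benefit calculation: productivity gains minus compute, energy and carbon costs. The sketch below is a minimal illustration of that framing; every figure (energy price, emissions factor, internal carbon price) is a hypothetical placeholder an organisation would replace with its own numbers.

```python
# Minimal sketch of a "holistic" AI business case.
# All default figures are hypothetical placeholders, not benchmarks.

def net_annual_benefit(productivity_gain,      # $ saved per year
                       compute_cost,           # $ infrastructure per year
                       energy_kwh,             # kWh consumed per year
                       energy_price=0.30,      # $ per kWh (assumed)
                       carbon_kg_per_kwh=0.7,  # grid emissions factor (assumed)
                       carbon_price=0.05):     # $ per kg CO2 (assumed internal price)
    energy_cost = energy_kwh * energy_price
    carbon_cost = energy_kwh * carbon_kg_per_kwh * carbon_price
    return productivity_gain - compute_cost - energy_cost - carbon_cost

# Example: $500k in gains, $200k in compute, 400,000 kWh of energy
benefit = net_annual_benefit(500_000, 200_000, 400_000)
print(f"Net annual benefit: ${benefit:,.0f}")
```

The point of the exercise is less the arithmetic than the discipline: once energy and carbon appear as explicit line items, an AI initiative that looks profitable on productivity alone may need to clear a higher bar.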

The sustainable AI era

AI may be part of the problem, but importantly it's also part of the solution. At its core, AI brings intelligence to bear across a range of domains to increase efficiency. Where it can really create value is in its ability to optimise across a number of variables - across IT, data centre and energy infrastructure - working in synchronisation to enable other AI workloads to run efficiently.

Several AI-based innovations are being brought to bear in data storage infrastructure. One of these, Dynamic Carbon Reduction, offers an algorithmic way to reduce energy consumption by switching CPUs into eco-mode during periods of low activity. Another, a form of "always-on compression", allows the system to switch from inline data reduction to post-processing, which further reduces energy consumption and can lower the CO2 footprint by as much as 30-40%. Optimising the way data is stored also has a positive flow-on effect on data centre cooling and power requirements.
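The eco-mode idea can be sketched in a few lines: sample CPU utilisation, and apply a low-power profile whenever utilisation sits below a threshold. This is a simplified illustration of the concept described above, not the vendor's actual algorithm; the threshold and power figures are hypothetical.

```python
# Illustrative sketch of threshold-based eco-mode switching.
# Threshold and wattages are hypothetical, not vendor specifications.

ECO_THRESHOLD = 0.20   # switch to eco-mode below 20% utilisation
PERFORMANCE_W = 250    # assumed CPU power draw in performance mode
ECO_W = 150            # assumed CPU power draw in eco-mode

def power_draw(utilisation_samples):
    """Average power draw when eco-mode is applied per sample."""
    total = 0.0
    for u in utilisation_samples:
        total += ECO_W if u < ECO_THRESHOLD else PERFORMANCE_W
    return total / len(utilisation_samples)

# A quiet overnight period: mostly idle, with one busy burst
samples = [0.05, 0.08, 0.10, 0.65, 0.07, 0.04]
print(f"Average draw: {power_draw(samples):.1f} W")
```

Even this crude policy cuts average draw substantially during idle-heavy periods; production systems would presumably use predictive models and hysteresis rather than a single static threshold.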

Efficiency gains from data storage are just the start. Similar applications of AI to the servers underpinning AI workloads, and to the power systems that feed data centres, are also required - and all of these different AIs must work in tandem to create overall efficiencies. 

The optimisation of a single system is valuable, but when optimisation stretches across all the enabling systems of the current AI era - in the form of an integrated layer of software and services for IT equipment, the data centre and the power grid - it will transform the current situation and help ensure the AI era can continue sustainably for years to come.

This is still a work in progress but is expected to be an area of considerable innovation over the next three to five years. Engaging a services partner with expertise and experience across these domains now can substantially improve an organisation's ability to operate sustainably and cost-effectively in the AI era.
