One of the less discussed impacts of the AI boom is the sheer amount of energy required to power the infrastructure behind these massive systems.
According to reports, training OpenAI’s GPT-4 on approximately 25,000 NVIDIA A100 GPUs consumed around 62,000 megawatt-hours, roughly the energy that 1,000 U.S. homes use in more than five years.
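As a rough sanity check on that comparison, here is a back-of-the-envelope sketch. It assumes an average U.S. household consumption of about 10.7 MWh per year (an approximate figure; actual usage varies by region and year):

```python
# Back-of-the-envelope check of the GPT-4 energy comparison.
# Assumption: an average U.S. household uses roughly 10.7 MWh per year.

training_energy_mwh = 62_000        # reported GPT-4 training energy
household_mwh_per_year = 10.7       # assumed average U.S. household usage

household_years = training_energy_mwh / household_mwh_per_year
years_for_1000_homes = household_years / 1_000

print(f"Equivalent household-years: {household_years:,.0f}")
print(f"Years of power for 1,000 homes: {years_for_1000_homes:.1f}")
# ~5,800 household-years, i.e. nearly six years of power for 1,000 homes,
# consistent with the "more than five years" comparison above.
```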
And that’s just one project. Meta’s new AI supercluster comprises 350,000 NVIDIA H100 GPUs, and other companies, including X and Google, are building large-scale hardware projects of their own to power their models.
This is a huge resource burden, one that will require significant investment to support, and it will have an impact on the environment.
To put this in perspective, the team at Visual Capitalist has put together an overview of Microsoft’s growing power demands as it continues its collaboration with OpenAI.