One of the lesser-discussed impacts of the AI push is the sheer amount of power required to run the massive infrastructure behind these expansive systems.
According to reports, the training process for OpenAI’s GPT-4, which ran on around 25,000 NVIDIA A100 GPUs, consumed up to 62,000 megawatt-hours. That’s equivalent to the energy needs of 1,000 U.S. households for over five years.
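As a rough sanity check on that comparison (assuming an average U.S. household uses about 10,500 kWh, or 10.5 MWh, of electricity per year, in line with EIA estimates; that figure is not in the original report):

$$\frac{62{,}000\ \text{MWh}}{1{,}000 \times 10.5\ \text{MWh/yr}} \approx 5.9\ \text{years}$$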
And that’s just one project. Meta’s new AI supercluster will include 350,000 NVIDIA H100 GPUs, while X and Google, among various others, are also building massive hardware projects to power their own models.
It’s an enormous resource burden, one that will require significant investment to facilitate.
And it will also have an environmental impact.
To provide some perspective on this, the team at Visual Capitalist has put together an overview of Microsoft’s rising electricity needs as it continues to work with OpenAI on its projects.