AI’s insatiable need for power is driven by the complexity and scale of its computational requirements.
AI
models are often trained and deployed in data centres, which are
massive facilities housing thousands of servers. These servers consume a
substantial amount of energy, equivalent to the electricity used by some 30,000 homes.
AI
inference, the process of answering user queries, relies heavily on
Graphics Processing Units (“GPUs”). Each inference requires GPU
processing power, which uses energy. This demand is expected to increase
as more AI models are developed and deployed.
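To give a rough sense of where that energy goes, here is a back-of-envelope sketch in Python. The GPU power draw, the time spent per query and the daily query volume are all assumed placeholder figures for illustration, not measured values for any particular model or service.

```python
# Back-of-envelope sketch of per-query inference energy.
# All figures below are illustrative assumptions, not measured values.

GPU_POWER_WATTS = 700          # assumed draw of one high-end data-centre GPU under load
SECONDS_PER_QUERY = 2          # assumed GPU time spent answering one user query
QUERIES_PER_DAY = 100_000_000  # assumed daily query volume for a popular AI service

# Energy per query in watt-hours: power (W) multiplied by time (hours)
wh_per_query = GPU_POWER_WATTS * (SECONDS_PER_QUERY / 3600)

# Scale up to a daily total, converted to megawatt-hours
daily_mwh = wh_per_query * QUERIES_PER_DAY / 1_000_000

print(f"~{wh_per_query:.2f} Wh per query, ~{daily_mwh:.0f} MWh per day")
```

Even with these modest assumptions the daily total runs to tens of megawatt-hours for a single service, before counting training, cooling and the rest of the data centre.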
Larger AI models,
such as those used in language processing and computer vision, require
more computational resources and, consequently, more energy. These
models have billions of parameters and rely on massive data sets,
further straining energy demands. And as AI adoption grows, so does the
need for more powerful infrastructure to support it.
In an own goal for those eager to implement Agenda 2030 and its Sustainable Development Goals – such as those who signed the ‘Pact for the Future’, ‘Global Digital Compact’ and ‘Pact for Future Generations’ at the UN Summit of the Future on Sunday – the energy consumption of AI systems, so they say, contributes to greenhouse gas emissions and strains global grids.
We guess it is hard to match such large energy requirements 100% of the time with “renewable” energy, i.e. wind and solar, because they are intermittent and unreliable.
However, the solution is simple: label nuclear energy as “green” energy. Gura
posed the question: “Do they, the tech companies, believe that kind of
traditional green energy, do they think that green energy is going to be
enough to make up the difference that they need?”
Saul responded, “Well, everybody loves nuclear. I mean, nuclear has gotten so hot, like Joe Rogan talking about nuclear.”
Gura chipped in, “Bill Gates is talking about it now.”
“Everybody. Yeah, everyone’s very excited about nuclear,” Saul said.