Scientists Already Have a Solution For Future AI Energy Needs
- Editor OGN Daily
With everyone worried about the energy requirements for AI, it's good to know that a solution may already have been found - one that should spare us the perceived need to ramp up fossil fuel production.

Trump has signed an executive order to “turbocharge coal mining”, proclaiming that America will need to double its electricity output to secure AI supremacy. The good news is that we will likely avert this dystopian disaster after all. The magic lies in the unique properties of graphene, a single-atom-thick sheet of carbon first isolated at the University of Manchester in 2004 - a discovery that won the Nobel Prize in Physics in 2010.
Graphene is known for its exceptionally high tensile strength, electrical conductivity, and transparency, and for being the thinnest two-dimensional material in the world. It has been in a perpetual state of “just about ready to revolutionize the world” for years - and is now on the verge of actually doing so.
Scientists from three British universities are jointly developing an atom-thick graphene chip that could slash energy use for computing and AI data centres by more than 90 percent, radically changing the trajectory of global electricity demand over the next quarter century. “We’re very confident that we will be able to cut electricity use for computing by 90 percent and perhaps even by five times more than that,” said Sir Colin Humphreys, the project leader and professor of materials science at Queen Mary University of London. “We expect to have a prototype that works by 2029, and we should be manufacturing millions of working devices by 2032-2033,” he said.
There is obviously still an element of 'fingers crossed' here, but if graphene chips can deliver this leap before the end of the decade and be rolled out at scale in the early 2030s, Google, Microsoft, Meta, Amazon and the other giant hyperscalers will not need extra fleets of nuclear reactors and gas plants to run their AI data centres.