The economics of AI looks very different from what science fiction writers imagined. It has already disrupted labor markets, but it will not put humanity out of work on a permanent basis.
The real competition between AI and humans, however, involves scarce resources, not jobs. As the cost per unit of computation continues to decline, more resources will be mobilized, paradoxically increasing their price. One of the main inputs to AI training and prompting is energy. The growth of data centers is emerging as a new threat to the energy transition, and some investors are betting that a new energy distribution model will offer a sustainable value proposition.
“Cleantech is hard,” said Bill Brown, currently CEO of New Waters Capital and a serial entrepreneur in clean energy. “It's commodity in, process, commodity out, and you always are driven by the commodity markets, it always has to be the cheapest. And I never saw a value proposition. And then all of a sudden, the light bulb goes off and it's that, ah, compute is a value trade.”
Sustainable demand, unsustainable supply
AI threatens several aspects of the energy transition all at once. First, it is a source of rapidly scaling power demand, undercutting the possibility of using energy conservation to drive down carbon emissions. Second, the demand it creates is usually continuous, causing a mismatch with the intermittent nature of most renewable energy sources.
A third, more subtle point is that it also creates new requirements for the power grid. In countries that are not small, isolated islands, geographic diversification can be a useful tool for managing this intermittency, enabled by the ability to move electricity between areas with different weather patterns. If one area is cloudy, inhibiting solar power, another is likely to be sunny or windy.
The power grid is complex infrastructure which, for many reasons, is unsuited for market-based management. Instead, utilities are highly regulated and risk-averse, making them sleepy stocks for growth investors. This year, however, as AI concept securities surged, the markets started turning to utilities as a back door into the frenzy. After all, one solution to this green dilemma is to simply build more capacity than would be needed the vast majority of the time.
New Waters rejects this approach. According to Brown, transmission and distribution doubles the cost of electricity, owing both to the physics of power transmission and to the various administrative expenses involved in maintaining a grid. Instead, he sees co-location between data centers and power sources as the future.
Brown sees great untapped potential in AI that has not yet been priced into the market. In the financial industry, it has mostly been used in the back office rather than the front office, which is closer to customer needs. He compared it to a “ninth grader” who may make mistakes but is eager to help. The longer-term and broader trend, meanwhile, will be toward AI agents that can cooperate and critique each other’s work.
With the growth in data requirements over the past few years, tech giants have found themselves unable to meet their carbon neutrality commitments. Green supply chains will be a priority for years to come.
All energy has two uses
The most important use of futuristic storytelling is to anticipate the constraints that start to dominate as a valuable factor, in this case computing power, trends toward infinity. Physics provides one answer. The second law of thermodynamics dictates that no process will ever convert energy into work at 100% efficiency; only conversion into heat is lossless. As the spatial density of computation in data centers increases, dispersing waste heat is emerging as a critical issue for preserving the operational lives of servers.
In some cases, that waste heat may be a desired endpoint, constituting an investment opportunity. Heat is the world’s largest energy end use, accounting for almost 50% according to the International Energy Agency, larger than electricity and transport. Some countries in colder climates with denser construction use district heating, which could come almost for free near data centers. Even though heat production is already 100% efficient, economic efficiency gains can still be made by adding steps that do useful work prior to the final stage of the energy life cycle.
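A back-of-envelope comparison illustrates the point. The sketch below uses purely illustrative, assumed numbers (one MWh of electricity and a hypothetical 90% heat-recovery fraction) to contrast a resistive heater with a data center whose waste heat feeds a district heating network: both deliver roughly the same heat, but only one computes along the way.

```python
# Back-of-envelope sketch with assumed, illustrative numbers: the same
# electricity delivered either to a resistive heater or to a data center
# whose waste heat is captured for district heating.

electricity_in_mwh = 1.0        # electricity purchased (assumed)
heat_recovery_fraction = 0.9    # share of server heat recoverable (assumed)

# Resistive heating converts essentially all electricity into heat.
heater_heat_mwh = electricity_in_mwh * 1.0

# A data center also turns nearly all of its electricity into heat,
# but performs useful computation before that heat is exported.
datacenter_heat_mwh = electricity_in_mwh * heat_recovery_fraction

print(f"Resistive heater: {heater_heat_mwh:.2f} MWh of heat, no computation")
print(f"Data center:      {datacenter_heat_mwh:.2f} MWh of recoverable heat,"
      f" plus the computation done with {electricity_in_mwh:.2f} MWh")
```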
Even when heat is not the end use, it can sometimes be consumed in production. In a thermal power plant, part of the fuel’s energy goes toward heating intake air to combustion temperature rather than turning the turbines, which can significantly reduce the efficiency of electricity generation. Power plants often reuse their own waste heat to preheat this air, in a process loosely analogous to turbocharging in internal combustion engines. The waste heat created by data centers could similarly be injected into this process, assuming they are co-located with a thermal power source.
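As a rough illustration, under assumed figures (100 kg/s of combustion air, preheating from 20°C to 220°C, and natural gas at roughly 50 MJ/kg), recovered waste heat can displace a measurable amount of fuel:

```python
# Illustrative sketch with assumed numbers: fuel saved by preheating combustion
# air with recovered waste heat instead of burning extra fuel to do it.

air_flow_kg_s = 100.0                 # combustion air flow (assumed)
cp_air_kj_per_kg_k = 1.005            # specific heat of air
ambient_c, preheat_c = 20.0, 220.0    # inlet and preheated air temperatures (assumed)

# Heat that must be added to the intake air, in kW (kJ/s).
preheat_duty_kw = air_flow_kg_s * cp_air_kj_per_kg_k * (preheat_c - ambient_c)

fuel_lhv_mj_per_kg = 50.0             # lower heating value of natural gas (approx.)
fuel_saved_kg_s = preheat_duty_kw / (fuel_lhv_mj_per_kg * 1000)  # MJ -> kJ

print(f"Heat recovered into intake air: {preheat_duty_kw / 1000:.1f} MW")
print(f"Fuel no longer burned for preheating: {fuel_saved_kg_s * 3600:.0f} kg/h")
```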
Incidentally, the energy wastage of computation did not emerge solely with AI. Cryptocurrency has received less attention since the boom in LLMs, but it is likewise a value-added application of computational cycles at scale. The University of Cambridge estimated that by 2023, global cryptocurrency mining already consumed as much electricity as the entire nation of the Netherlands. It has even become possible to buy crypto-mining space heaters, demonstrating that the energy used for computation overlaps entirely with the energy used for winter heating.
Just as with AI, energy management has also become a point of competition in the blockchain space, spurring new power generation and distribution arrangements. “The blockchain people, what do they want to do? They want to co-locate. Co-location, behind the meter, they get it. The normal data center people do not.”
A new supply chain advantage
The sources of supply and demand for heat are both clear, but one problem remains: transporting it between the two. Liquids carry heat much more effectively than air, but an entirely new liquid-cooled “data center 2.0” architecture will be required, potentially stranding existing data center assets.
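The physical advantage is easy to quantify with textbook property values. The sketch below, assuming a 10 K coolant temperature rise, compares how much heat a cubic meter of water and a cubic meter of air can carry away:

```python
# Rough comparison using textbook property values: heat carried away per cubic
# meter of coolant for the same temperature rise (assumed here to be 10 K).

cp_water, rho_water = 4.18, 1000.0    # kJ/(kg*K), kg/m^3
cp_air, rho_air = 1.005, 1.2          # kJ/(kg*K), kg/m^3 near room temperature
delta_t_k = 10.0                      # coolant temperature rise (assumed)

heat_per_m3_water = cp_water * rho_water * delta_t_k   # kJ per m^3 of water
heat_per_m3_air = cp_air * rho_air * delta_t_k         # kJ per m^3 of air

print(f"Water: {heat_per_m3_water:,.0f} kJ/m^3   Air: {heat_per_m3_air:,.0f} kJ/m^3")
print(f"Ratio: roughly {heat_per_m3_water / heat_per_m3_air:,.0f}x per unit volume")
```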
Taiwan is in a good position to capitalize on the new technological requirements. Brown mentioned Delta Electronics, Formosa Heavy Industries, Chung-Hsin Electric & Machinery (CHEM), and Kaori as excellent Taiwanese manufacturers in this field. “I actually think that Taiwan could own a huge part of the US economy,” he said. “It could be your part of the economy.”
He contrasted Taiwan’s hardware manufacturing to Japan’s construction industry. “Back when I was building power plants, I spent a lot of time in Tokyo with Mitsubishi and with Toshiba and Toyo Engineering, so I know that they know how to build heavy equipment really, really well. But you guys here have totally different approaches to things, and I'm just learning to appreciate it. I hope to learn a lot more about it as time passes.”
When it comes to Taiwan’s own computation needs, however, he has different advice. “If I were Taiwan – and this will be probably very controversial – I would not import expensive energy and turn that expensive energy into expensive bits. I would leave the expensive molecules and build a factory or a data center on top of the cheapest molecules in the world, and import the bits for free.”
Energy storage alone is not a viable way to ensure a stable power supply. Battery capacity must be thought of not only in terms of the quantity of electricity stored, but also along a separate dimension of time; smoothing out yearly planetary cycles means keeping storage infrastructure effectively offline for long stretches. Without other sources of clean, non-intermittent power, it is more practical to move the data itself overseas for processing.
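A simple, assumed comparison shows why the time dimension matters: a battery that shifts solar power from day to night is used roughly 365 times a year, while one that shifts a summer surplus into winter demand completes about one cycle a year, spreading the same capital cost over far less delivered energy.

```python
# Illustrative sketch with assumed numbers: the same installed battery capacity
# delivers far less energy per year when cycled seasonally instead of daily.

capacity_mwh = 100.0                 # installed storage capacity (assumed)
daily_cycles_per_year = 365          # day-to-night shifting
seasonal_cycles_per_year = 1         # summer-to-winter shifting

daily_throughput_mwh = capacity_mwh * daily_cycles_per_year        # 36,500 MWh/year
seasonal_throughput_mwh = capacity_mwh * seasonal_cycles_per_year  # 100 MWh/year

print(f"Daily cycling:    {daily_throughput_mwh:,.0f} MWh delivered per year")
print(f"Seasonal cycling: {seasonal_throughput_mwh:,.0f} MWh delivered per year")
# The same capital cost is spread over roughly 365x less energy in the seasonal case.
```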