The Silicon Metabolism
A Report on the Surrender of the Earth
[Deep Dive Disclaimer: These deep dives are a mixture of original drafting, AI deep research and editing. I have been creating a rough draft of an op-ed style article, using that rough draft as a prompt for deep research, using that deep research to improve the draft, then using the improved draft as a deep research prompt - to create a more stylised and readable ‘deep dive’. I send out the improved op-eds through my Substack email list. These deep dives are the endpoint of the process. I was creating them for my own enjoyment but realised other people might enjoy them too. I’m not sure I can claim authorship of them - though I don’t think anybody else could have created them either…]
I. The Weight of the Cloud
We persist in describing artificial intelligence as an advanced form of software: cleverer, more resource-intensive perhaps, but still weightless at heart. The metaphors we use to house our digital lives, such as the “cloud”, the “stream”, and the “web”, conspire to maintain a specific illusion: that the digital realm is ethereal, a place where friction is a glitch rather than a physical law. Apps update silently in the night; code replicates without marginal cost; intelligence, we tell ourselves, should scale with similar grace. This fiction is comforting. It suggests that the digital economy belongs to a realm of pure logic, where abundance feels morally ordained and the constraints of the material world, such as gravity, heat, and scarcity, have been suspended.
Yet, as we pivot from the era of mobile computing to the age of generative intelligence, the material reality of AI is stubbornly refusing this script. Disclosures from the hardware frontier reveal something else entirely. We hear Nvidia’s CEO Jensen Huang warning of “trillion-dollar clusters”; we see Microsoft abruptly pausing critical data centre builds in Europe over grid constraints in mid-2025; we witness U.S. communities in Virginia and Arizona imposing moratoriums on new facilities amid soaring power bills and vanishing aquifers. We are not merely scaling software. We are witnessing the birth of a new silicon metabolism: a vast assemblage of energy systems, mineral extraction, and land use, all reorganised around the sustenance of machine cognition.
The error is categorical. AI does not float in the “cloud”. It is embodied in copper windings, concrete slabs, evaporative cooling towers, and silicon etched at atomic precision. Just as the agricultural revolution was not merely about “better seeds” but about a total reordering of human geography (the clearing of forests, the diverting of rivers, the enclosure of commons), the AI transition is about the planet’s ability to “metabolise” non-organic intelligence. The decisive events of the mid-2020s are not occurring in the context windows of large language models, but in the zoning hearings of rural counties, the union halls of Arizona construction sites, and the drought-stricken valleys of the Andes.
We are living through a “Strategic Concession”, where societies are quietly conceding land, water, and power to the machine in exchange for the promise of intelligence. It is a Faustian bargain made at the level of infrastructure, often invisible until the lights flicker or the taps run dry. This report investigates the anatomy of this new metabolism, tracing the flow of electrons, water, and rare earth metals that undergirds the recursive self-improvement of modern algorithms. It argues that the primary friction in the twenty-first century is no longer between rival ideologies, but between the exponential clock of silicon scaling and the geological clock of the biosphere.
II. The Thermodynamic Ledger
The most immediate manifestation of this new metabolism is the voracious appetite for electricity. For a decade, the efficiency gains of the cloud era masked the rising demand of digital services; data centres became more efficient even as they grew, decoupling data growth from energy growth. That era of decoupling is over. The International Energy Agency (IEA) reports that global electricity demand is now growing at double the rate of total energy demand, a divergence driven largely by the cooling and computational requirements of AI and the electrification of industry.
The End of the Ethereal
The physics of intelligence are unforgiving. Training a frontier model is not merely a manipulation of symbols; it is a thermodynamic event. The specialised hardware required to drive these systems, such as Nvidia’s H100 and the subsequent Blackwell architectures, operates at thermal design powers (TDP) that push the limits of air cooling. A single server rack, which in the cloud era might have drawn a manageable 5 to 10 kilowatts, now demands upwards of 100 kilowatts. This is a phase change in energy density.
Consider the case of Nvidia’s GB200 NVL72, which links 72 Blackwell GPUs into a single rack-scale accelerator, racks that hyperscalers then network by the thousand into one training cluster. This is not a computer in the sense of a desktop; it is a furnace of cognition. The energy required to flip the trillions of transistors involved in a training run or a complex inference task must be delivered instantly and reliably. There is no buffering the thought of a machine; if the power sags, the thought dies.
This energy density has profound geographic consequences. In Ireland, a country that successfully positioned itself as the digital gateway to Europe for two decades, data centres now consume 21 per cent of all metered electricity: a figure that exceeds the consumption of all urban households combined. The strain is so acute that EirGrid, the state transmission operator, enacted a de facto moratorium on new grid connections in Dublin extending to 2028. Microsoft, a company with a global investment roadmap running to hundreds of billions of dollars, was forced to abandon plans for a massive facility on land it already owned in Dublin, looking instead to less constrained, though perhaps less optimal, geographies like Northern England or Wales.
This is not an isolated European curiosity. It is the new global baseline. In Northern Virginia’s “Data Centre Alley”, the densest concentration of connectivity on the planet, the grid is creaking under the load. The Virginia Joint Legislative Audit and Review Commission (JLARC) released a report in late 2024 warning of spiralling transmission costs and the necessity of bringing old fossil fuel generators back online to meet the peak loads of the AI clusters. The promise of a green transition is colliding with the reality of the AI load; utilities are keeping coal and gas plants alive not for human warmth, but to ensure the weights and biases of large language models remain accessible. In China and India, the IEA notes that despite renewable growth, coal demand remains stubborn, partly to feed the stable baseload required by high-tech industries and data centres.
The Gigawatt Scale
The unit of analysis for AI infrastructure has shifted. We no longer speak of megawatts (MW) but of gigawatts (GW). A single “gigawatt-scale” campus, roughly the output of a standard nuclear reactor, is now the planning unit for the hyperscalers. The U.S. Department of Energy’s 2024 report estimates that domestic data centre energy use could triple by 2028, consuming up to 12% of the nation’s total electricity.
The implications of the “Gigawatt Scale” are staggering. To power a single campus of this size requires transmission lines that take a decade to permit and build. It requires a constant, non-intermittent supply of energy that wind and solar, without massive battery storage, struggle to provide. This has led to a bizarre historical inversion: the most futuristic industry on earth is resuscitating the most archaic forms of power generation. In Virginia, regulators are weighing the expanded use of polluting diesel generators just to keep the servers running during peak demand.
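For readers who want the arithmetic spelled out, a back-of-envelope sketch makes the “Gigawatt Scale” concrete (the household figure is my assumption, drawn from typical U.S. consumption estimates, not from the reports cited above):

```python
# Back-of-envelope: annual energy draw of a 1 GW data-centre campus,
# assuming it runs flat out all year. Illustrative figures only.
campus_power_gw = 1.0
hours_per_year = 8760
annual_twh = campus_power_gw * hours_per_year / 1000  # GW*h -> TWh
print(f"Annual consumption: {annual_twh:.2f} TWh")

# Compare with an average US household at roughly 10,600 kWh per year
# (an assumed figure in line with published EIA averages).
household_kwh = 10_600
households = annual_twh * 1e9 / household_kwh
print(f"Equivalent households: {households:,.0f}")
```

Roughly 8.76 TWh a year, or on the order of 800,000 households, for a single campus: the planning unit the hyperscalers now treat as routine.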
This crisis is a product of the “Silicon Clock”, which ticks in milliseconds and training runs: quasi-exponential and impatient. It demands that a new cluster be online now to train the next model (GPT-5, GPT-6) before a competitor does. But the “Infrastructure Clock” ticks in decades. It takes years to pour the concrete for a dam, to certify a small modular reactor, or to string high-voltage lines across a mountain range. When the speed of the technology outpaces the speed of the infrastructure supporting it, the result is not harmony but force. The faster clock bends the slower one. In 2025, this bending takes the form of emergency measures, the delay of coal phase-outs, and the cannibalisation of grid capacity meant for other sectors, such as the electrification of transport or heating.
III. The Thirst of the Machine: Hydro-Political Conflict
If electricity is the blood of the AI metabolism, water is its sweat. The conversion of electricity into computation generates waste heat, which must be removed to prevent the delicate silicon wafers from losing their atomic precision. While air cooling suffices for lower densities, the thermal intensity of modern AI chips forces a reliance on evaporative cooling, literally sweating the heat away into the atmosphere.
The Water Footprint of Cognition
The specific heat capacity of water makes it an unrivalled coolant, but this efficiency comes at a high environmental price. The metric of concern is Water Usage Effectiveness (WUE), measured in litres per kilowatt-hour. In 2023, the average U.S. data centre consumed 4.52 litres of water for every kilowatt-hour of energy used. To put this in perspective: training a single large language model like GPT-3 consumed roughly 700,000 litres of fresh water.
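The WUE figure turns abstract thirst into a simple multiplication. A sketch, using the 4.52 L/kWh average quoted above and a hypothetical 100 MW facility (real sites vary widely with climate and cooling design):

```python
# Water footprint from Water Usage Effectiveness (WUE).
# 4.52 L/kWh is the 2023 US average cited in the text;
# the 100 MW facility and full-year duty cycle are assumptions.
power_mw = 100
annual_kwh = power_mw * 1000 * 8760   # kWh per year at constant load
wue = 4.52                            # litres per kWh
annual_litres = annual_kwh * wue
print(f"{annual_litres / 1e9:.2f} billion litres per year")
```

Around four billion litres a year for one mid-sized facility, which is why the siting decisions matter far more than the global averages.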
This consumption is not evenly distributed. It is concentrated in the specific locales where data centres cluster, often placing them in direct competition with agricultural and municipal water users. The thermodynamics of the GPU dictate that water must be evaporated to reject heat efficiently, but the geography of cheap solar power (often used to greenwash the energy consumption) places these facilities in arid regions.
In the American Southwest, where the Colorado River basin is in a state of chronic crisis, the arrival of gigawatt-scale data centres is a flashpoint. In Tucson, Arizona, a project known as “Project Blue” faced fierce community opposition. The proposed facility would span 290 acres and consume millions of gallons of water annually in a desert city that has spent decades preaching conservation to its citizens. The conflict revealed the opacity of the industry: the project was shrouded in secrecy, its water demands only revealed through diligent local activism. For the residents of Tucson, the abstraction of “AI” collapsed into the very concrete reality of a neighbour who would drink from their dwindling aquifer while offering little in the way of local employment.
The Colonial Dynamics of Cooling
The conflict over water reveals the “colonial” dynamics of the AI stack. The benefits of the intelligence (the productivity gains, the stock market rallies, the medical breakthroughs) are global and often concentrated in the corporate headquarters of the Global North (San Francisco, Seattle, Redmond). The costs (the water depletion, the noise pollution, the grid instability) are intensely local.
In Latin America, this dynamic has sparked a wave of “data centre resistance”. In Cerrillos, Chile, a working-class district of Santiago, a proposed Google data centre became the site of a fierce legal and social battle. The facility was projected to consume 169 litres of water per second in a region suffering from a “mega-drought”. The local slogan, “No to the data centre! Data theft!”, conflated the extraction of personal information with the extraction of the water table, identifying both as forms of dispossession.
These protests are not Luddite rejections of technology; they are assertions of metabolic sovereignty. In Uruguay, similar protests erupted when it was revealed that a Google facility would draw directly from the drinking water reserves during a drought so severe that tap water had become brackish. The image of working-class families boiling salty water while a foreign data centre secured permits for fresh reserves is a potent symbol of the “Metabolic Rift”, the rupture between the logic of capital accumulation and the ecological cycles of the earth.
The industry is attempting to engineer its way out of this impasse. New “direct-to-chip” liquid cooling technologies promise to close the loop, recirculating fluid rather than evaporating it. Yet these retrofits are expensive and slow. For the immediate future, the expansion of AI capacity in water-stressed regions will remain a zero-sum game between the cooling towers of the cloud and the taps of the citizenry. As noted by researchers, “the water footprint of digitalisation could be larger and more problematic than its carbon footprint” because water is strictly local. You cannot offset a thirsty data centre in Santiago with a rain-soaked wetland in Scotland.
IV. The Lithographic Crust: Mining the Substrate
Beneath the energy and water lies the substrate itself: the metals. The physical instantiation of AI requires a complex mineralogy, from the copper in the transmission lines to the rare earth elements in the GPUs and the ultra-pure silicon of the wafers. The “dematerialisation” of the economy promised by digital tech was a myth. AI is re-materialising the economy, driving a super-cycle in commodities.
The Copper Squeeze
Copper is the nervous system of the grid. It connects the wind farm to the substation, the substation to the data centre, and the server to the rack. The AI boom, coinciding with the broader electrification of transport (EVs) and industry, has created a “copper squeeze”. Analysts warn that by 2035, only 70% of global copper demand may be met.
Data centre developers are famously price-inelastic regarding copper; the metal represents a fraction of the total capital expenditure of a $10 billion cluster, so they will pay whatever is necessary to secure it. This creates a bidding war that ripples through the global economy, potentially pricing out other critical infrastructure projects like affordable housing or municipal grid upgrades. Goldman Sachs forecasts copper prices to remain elevated, driven by this “structural demand” from the grid and AI sectors.
The supply side is rigid. Opening a new copper mine takes 15 to 20 years. The major deposits in Chile and Peru are ageing, with declining ore grades requiring more energy and water to process, a recursive loop of resource intensity. As the “silicon metabolism” expands, it must chew through more rock to find the conductive metal it needs, deepening the scars on the Atacama Desert and the Andes. This is the “strategic concession” writ large: tearing open the earth to facilitate the flow of information.
The Energy of Precision: The Lithographic Limit
At the heart of the machine is the silicon wafer, etched with features approaching the atomic scale. The transition to “High-NA EUV” (High Numerical Aperture Extreme Ultraviolet) lithography represents the pinnacle of this manufacturing prowess. Yet, this too is an energy story.
A single High-NA EUV machine, the size of a double-decker bus, consumes up to 1.4 megawatts of power. This is ten times the energy consumption of previous generation immersion lithography tools. To generate the extreme ultraviolet light (13.5 nanometres wavelength), the machine fires a high-power laser at droplets of molten tin, vaporising them into plasma 50,000 times a second. It is a process of brute force physics harnessed for microscopic delicacy.
By 2030, fabs equipped with these tools are projected to consume 54,000 gigawatt-hours annually: more than the consumption of nations like Singapore or Greece. The pursuit of “efficiency” at the transistor level (Moore’s Law) is paradoxically driving a massive increase in energy consumption at the manufacturing level. We are saving watts in the inference but burning megawatts in the fabrication. This highlights the Jevons Paradox in action: as the efficiency of computing increases, the total resource consumption of the computing sector expands rather than contracts.
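The lithography arithmetic alone is sobering. A sketch using the 1.4 MW tool figure cited above (the tool count and utilisation are my assumptions, and lithography is only one slice of a fab’s total draw, which also covers cleanrooms, cooling, and other process tools):

```python
# Rough lithography energy arithmetic for a hypothetical leading-edge fab.
# Tool power (1.4 MW) is from the text; tool count and duty cycle are
# illustrative assumptions, not vendor or fab data.
tool_power_mw = 1.4
tools_per_fab = 10      # assumed High-NA EUV tools in one fab
utilisation = 0.9       # assumed fraction of the year the tools run
hours_per_year = 8760
litho_gwh = tool_power_mw * tools_per_fab * utilisation * hours_per_year / 1000
print(f"Lithography alone: ~{litho_gwh:.0f} GWh/year")
```

On the order of 110 GWh a year just for the light source that prints the patterns, before a single wafer is cooled, cleaned, or tested.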
The Rare Earth Choke Point
The supply chain for the GPUs themselves, the H100s and GB200s, is a geopolitical minefield. The rare earth elements required for the magnets and capacitors are overwhelmingly processed in China. The “security dilemma” of the AI arms race has led to export controls and retaliatory bans. When the U.S. restricted high-end chip sales to China, Beijing responded with controls on gallium and germanium, critical for semiconductor production.
This creates a precarious interdependence. Western data centres are being built to run models that compete with China, yet they are built with materials that pass through Chinese supply chains. Nvidia’s Jensen Huang has highlighted this vulnerability, noting that while the U.S. has the design leadership, it lacks the “sovereign” manufacturing and material capacity of its rival. The “War of Clocks” applies here too: it takes weeks to design a chip, but years to open a mine for the dysprosium it requires.
V. The Autonomy of Demand: A Metabolic Imperative
Why is this consumption accelerating so aggressively? In traditional markets, demand saturates. A household only needs so much heat; a driver only needs so many miles. But AI introduces a novel economic phenomenon: endogenous demand.
For most of history, demand was exogenous, tethered to human rhythms. A loom waits for the weaver; a server idles until queried. Even the great machines of the industrial age were ultimately constrained by the physical limits of human flesh. AI breaks this tether. Once deployed, frontier systems, especially agentic and recursive ones, do not pause. They monitor, generate, critique, retrain, and spawn new tasks in self-reinforcing loops. The primary consumer of advanced AI is increasingly other AI: models training models, agents optimising agents.
The Jevons Paradox of Intelligence
The Jevons Paradox states that as technology increases the efficiency with which a resource is used, the total consumption of that resource increases rather than decreases. We are witnessing this in real-time with “compute”. As models become more efficient (e.g., DeepSeek’s optimisation or smaller, distilled models), the cost of intelligence drops. This does not lead to less compute usage; it leads to the embedding of intelligence into every possible process.
If the cost of inference drops to near zero, we will not stop at having an AI write our emails. We will have AI agents that spawn sub-agents to debate the optimal email, simulate the recipient’s reaction, and redraft it a thousand times before sending. The demand for compute becomes recursive. The IEA’s projections of data centre demand doubling by 2030 are likely conservative if this recursive demand takes hold.
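The recursion can be made concrete with a toy model: treat each model call as spawning some number of sub-calls (drafts, critiques, simulations), a fixed number of levels deep. The branching factor and depth are purely illustrative assumptions:

```python
# Toy model of recursive demand: each model call spawns `branching`
# sub-calls (critique, simulation, redraft), `depth` levels deep.
# Values are illustrative assumptions, not measured agent behaviour.
def total_calls(branching: int, depth: int) -> int:
    """Total model calls for one request, summed across all levels."""
    return sum(branching ** level for level in range(depth + 1))

# A single completion versus a modestly recursive agent pipeline:
print(total_calls(1, 0))   # one email, one call
print(total_calls(3, 4))   # three sub-calls per call, four levels deep
```

Even this modest recursion (three branches, four levels) turns one request into 121 model calls: a two-order-of-magnitude multiplier on compute demand from a change in usage pattern, not in user count.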
Recursive Self-Improvement (RSI)
The “intelligence explosion” hypothesis relies on the concept of Recursive Self-Improvement (RSI), where AI systems are used to design better AI systems. This closes the loop. In this scenario, demand is unchecked by human biology (the need for sleep, the limit of attention). An automated researcher can run millions of experiments in parallel, each requiring vast compute. This creates a vertical demand curve. As foreshadowed by economic models of RSI, if AI can automate the research process, the economy could theoretically hit a “singularity” of output, constrained only by physical inputs: energy and chips.
This explains the “irrational” exuberance of the capital expenditure (CapEx) we are seeing. Tech giants are not building for current human demand; they are building for the anticipated demand of a machine-to-machine economy. They are betting that the “silicon metabolism” will become the dominant driver of GDP, much like the railway mania of the 19th century. In that era, investment in rails exceeded 6% of US GDP; currently, AI investment is approaching comparable intensity relative to the tech sector’s size. The difference is that railways connected existing cities; AI infrastructure is building a new, synthetic world.
VI. The War of Clocks: Temporal Friction
The resulting tensions play out across mismatched temporalities: a war of clocks.
The Silicon Clock ticks in milliseconds and training runs, quasi-exponential and impatient; it is the clock of Moore’s Law. The Corporate Clock is quartered by earnings calls and investor horizons; it demands immediate ROI. The Political Clock crawls through elections and consultations. The Infrastructure Clock ticks in decades: the time required for dams, grids, reactors, and transmission corridors is measured in 10-, 15-, or 20-year cycles.
These rhythms do not harmonise; they collide. As of late 2025, the fallout is unmistakable. The clash is perhaps best illustrated by the labour disputes at TSMC’s fabrication plant in Arizona. TSMC, operating on the Silicon Clock (and the intense work culture of Taiwan), attempted to import workers to speed up construction, citing a lack of skilled local labour. This collided with the American labour clock: unionised, regulated, and culturally distinct. The result was a delay to 2025 (and potentially beyond) and a lawsuit alleging discrimination and unsafe working conditions.
The “Silicon Clock” assumes a friction-free world of interchangeable labour and resources; the real world is sticky, unionised, and regulated. When TSMC managers complained that American workers were “slow”, they were expressing the frustration of a metabolic system that cannot tolerate the biological and social limits of its host. The friction is not a matter of software updates; it is that this new metabolism reshapes the planet in its own image, and the planet (and its people) are pushing back.
VII. The Game Theory of Scaling: A Tragedy of the Grid
Civilisations are not mere piles of infrastructure; they are strategic equilibria. AI scaling obeys this same cold arithmetic.
Consider the current compute arms race as a security dilemma: a situation where one actor’s attempt to increase their own power automatically decreases the security of everyone else. In this environment, defection pays in the short term. If a single firm or nation chooses to ignore energy constraints or safety protocols to build a larger cluster, they gain an immediate strategic edge. Because no rival can afford to be left behind, mutual escalation becomes the only rational choice, even if it threatens to collapse the shared infrastructure they all rely on.
This creates a tragedy of the commons where shared electrical grids are depleted by unchecked training runs, leading to the blackouts and moratoriums we are seeing across the globe. The current Nash equilibrium, the point where no player can change their strategy without immediately worsening their position, tilts heavily toward centralisation. Those who can finance at planetary scale and absorb grid volatility consolidate power.
However, equilibria are not permanent. History shows competitive traps can be escaped through coordination: shared standards, deliberate restraint. But the window is closing. Once the concrete is poured, the transmission lines set, and the silicon metabolism baked into geography, the strategic bones become nearly impossible to rewrite. The “Game Theory of Scaling” suggests we are currently locked in a “defect-defect” cycle, where the rational choice for every individual actor is to consume more energy, faster, regardless of the collective cost to the grid or the climate.
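The “defect-defect” logic can be written down as a minimal payoff matrix. The numbers below are illustrative, but the structure is the classic prisoner’s dilemma the section describes: whatever the rival does, building bigger pays.

```python
# Minimal payoff matrix for the scaling race. Payoff numbers are
# illustrative assumptions; only their ordering matters.
# Strategies: "restrain" (share the grid) or "defect" (build bigger).
payoffs = {
    ("restrain", "restrain"): (3, 3),  # stable grid, shared gains
    ("restrain", "defect"):   (0, 5),  # the defector takes the strategic lead
    ("defect",   "restrain"): (5, 0),
    ("defect",   "defect"):   (1, 1),  # mutual escalation, strained grid
}

def best_reply(opponent: str) -> str:
    """Strategy maximising our own payoff, given the opponent's move."""
    return max(("restrain", "defect"), key=lambda s: payoffs[(s, opponent)][0])

# Defection dominates against either opponent strategy,
# so defect-defect is the Nash equilibrium despite being worse
# for both players than mutual restraint.
print(best_reply("restrain"), best_reply("defect"))
```

Both calls return “defect”: the equilibrium holds even though both actors would prefer the (3, 3) outcome of mutual restraint, which is exactly why escape requires coordination rather than individual virtue.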
VIII. The Sovereign Turn: Fortress AI
In response to these frictions, the geography of the internet is fracturing. The dream of a borderless global cloud is dead. In its place rises the fortress of “Sovereign AI”. Nations have realised that if AI is the substrate of future economic and military power, they cannot rely on foreign servers. We are seeing a rush to build “sovereign compute” capacity: infrastructure located within national borders, powered by national grids, and subject to national laws.
This race is driven by the realisation that “cloud sovereignty” is no longer enough; one needs “compute sovereignty”. It is not enough to have your data stored in Paris; the GPUs that process it must also be in Paris, powered by French electrons. French President Emmanuel Macron’s €109 billion AI investment roadmap is a dirigiste attempt to ensure that France does not become a client state of American hyperscalers. It is a recognition that in the AI era, computing power is as strategic as oil reserves or wheat harvests.
The resulting “balkanisation” of the internet is physical. We are moving toward a world of “sovereign clouds” separated by hard borders, export controls, and firewall moats. The “Security Dilemma” ensures that this fragmentation will continue: if the US restricts chip exports to China, China must develop indigenous supply chains, eventually creating a rival ecosystem that is harder to sanction and control. The “One World, One Internet” vision is being replaced by a “Multi-Polar Silicon Metabolism”.
IX. Theoretical Horizons: The Planetary Computer
To understand the scale of this transformation, we must turn to new theoretical frameworks. The old tools of political economy, labour vs. capital, are insufficient to describe a system where capital (compute) is becoming autonomous and geophysically dominant.
The Stack and the Accidental Megastructure
Benjamin Bratton’s concept of “The Stack” provides a useful map. He describes planetary computation not as a tool, but as an accidental megastructure: a layer of the planet that has its own geography and sovereignty. The AI data centre is the cathedral of this new layer. It dictates the flow of energy and the organisation of cities (the “City” layer).
Bratton argues that we must stop viewing AI as a “mind” that hates or loves us. It is a geophysical force. As Eliezer Yudkowsky put it, “The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else”. The “something else” today is cooling water and electricity. The Stack is reorganising the Earth’s resources to optimise its own throughput. The “autonomy of demand” we see in recursive AI is the Stack acting for itself, indifferent to the biological needs of the User.
Cosmotechnics and the Metabolic Rift
The philosopher Yuk Hui offers a counter-narrative through the concept of “Cosmotechnics”: the idea that technology is not universal, but rooted in specific cosmological and moral orders. The current AI explosion is a specific “monotechnological” vision, driven by a Western/Silicon Valley cosmology of infinite expansion and acceleration.
Hui warns that this singular vision threatens “technodiversity”. The “Metabolic Rift”, a concept from Marx revitalised by ecologists like Kohei Saito and Slavoj Žižek, describes how capitalism creates an irreparable break in the natural cycles of the earth. AI deepens this rift. It extracts order (data/intelligence) and exports entropy (heat/waste) at a rate that the biosphere cannot absorb.
The protests in Chile and the moratoriums in Ireland are not just NIMBYism; they are the immune response of the local “Cosmotechnics” against the homogenising, extractive logic of the universal machine. They are asserting that water has a value beyond its utility as a coolant for a chatbot. They are the friction that the machine tries to smooth over, but which constitutes the only real check on its expansion.
Conclusion: The Concrete Equilibrium
We flatter ourselves when we frame the AI revolution as a revolution of the mind, debating consciousness, creativity, and the oracle’s soul. This keeps the human intellect at centre stage, as if the drama were about thought exceeding thought. It is a comforting vanity.
But the real revolution is metabolic. Like the steam engine mistaken for an “iron horse”, we focus on the mimicry, the mind-like fluency, while the mines deepen, the rails spread, and geography reorders. The decisive events of 2025 are not breakthroughs in “consciousness”, but sovereign grid wars, zoning battles over insatiable neighbours, and a resource requirement measured in gigawatts and rare earths.
We call it a revolution of the mind to avoid admitting it is a surrender of the world: land, water, and power conceded to sustain the machine’s hunger. The “Strategic Concession” is already underway. We are pouring the concrete, digging the mines, and rewiring the grid to accommodate a new species of resident, one that eats gigawatts and drinks rivers.
As the “Silicon Clock” accelerates, tearing away from the “Infrastructure Clock”, the friction will only produce more heat, both thermodynamic and political. The question is not whether the AI will wake up and destroy us. The question is whether, in our rush to build its body, we will exhaust the world that sustains our own. The machine does not need to be conscious to consume the earth; it just needs to be hungry. And as the smoke rises from the coal plants of Virginia and the water levels drop in the reservoirs of Santiago, it is clear that its appetite is just beginning to wake.