
xAI’s rivals are all building similarly large data centers to develop their most powerful generative-AI models; a metropolis’s worth of electricity will surge through facilities that occupy a few city blocks. These companies have primarily made their chatbots “smarter” not by writing niftier code but by making them bigger: ramming more data through more powerful computer chips that use more electricity. OpenAI has announced plans for facilities requiring more than 30 gigawatts of power in total—more than the largest recorded demand for all of New England. Since ChatGPT’s launch, in November 2022, the capital expenditures of Amazon, Microsoft, Meta, and Google have exceeded $600 billion, and much of that spending has gone toward data centers—more, even after adjusting for inflation, than the government spent to build the entire interstate-highway system. “These are the largest single points of consumption of electricity in history,” Jesse Jenkins, a climate modeler at Princeton, told me.
Even conservative analyses forecast that the tech industry will drop the equivalent of roughly 40 Seattles onto America’s grid within a decade; aggressive scenarios predict more than 60 in half that time. According to Siddharth Singh, an energy-investment analyst at the International Energy Agency, by 2030, U.S. data centers will consume more electricity than all of the country’s heavy industries—more than the cement, steel, chemical, car, and other industrial facilities put together. Roughly half of that demand will come from data centers equipped for the particular needs of generative AI—programs, such as ChatGPT, that can produce text and images, solve complex math problems, and perhaps one day inform scientific discoveries…
The optimist’s case is that, by then, advanced nuclear reactors will have obviated many of the new fossil-fuel plants, and AI tools will have invented technologies that can solve the climate crisis. That may well happen. But today, “the market has converged on ‘Add gas now, and then add nuclear later,’” Jenkins said. In other words, if natural-gas turbines seem to offer the most expedient path to an AI-enhanced future, then clean air may have to wait…
Northern Virginia offers a glimpse into what the AI rush may bring to the rest of the nation. Loudoun is running out of space, but new data-center hubs are popping up in Phoenix, Atlanta, and Dallas. Amazon and Meta are building AI data centers in Indiana and Louisiana, respectively, that will each require more than two gigawatts of electricity, dozens of times more than standard facilities. OpenAI has proposed that the U.S. establish “AI Economic Zones”: little Loudouns everywhere.
Infrastructure keeps the modern world going but rarely gets much attention until something changes dramatically, for better or worse. We expect electricity, water, the Internet, roads, and more to simply be there and work. Yet each of these required large-scale efforts to build and requires substantial resources to maintain. AI may seem to arrive through the air, but it rests on a massive physical infrastructure of land, servers, water, and electricity. Assembling all of this is no easy task: planning, securing funding, obtaining building approval, and completing construction take years, if not longer.
Less noted in this article is that opposition to data centers has often focused on these infrastructure aspects. Who will pay for the water and electricity needed? What happens if there is air and noise pollution? Residents tend not to want to live near power plants, waste transfer facilities, and sewage treatment plants. How many people want to live near data centers?
Given the resources needed for data centers, how much are utility companies supporting these plans? Development, particularly intensive development, can generate new business. Growth machines, coalitions of local leaders and organizations, benefit from new development.








