
WASHINGTON — The future of artificial intelligence infrastructure may lie in smaller, interconnected data centers rather than massive, centralized facilities, according to industry leaders speaking at Data Center World.
Pete Sacco, founder and CEO of PTS Data Center Solutions, said the industry is shifting away from large-scale data centers as computing needs evolve from training AI models to running inference workloads — a transition that demands lower latency and closer proximity to users.
“Clusters of small data centers requiring only 5-20 MW of power are the future of the AI boom,” Sacco said during the event.
Sacco pointed to growing challenges with large data center developments, including long interconnection wait times — sometimes stretching up to five years — and increasing community resistance to massive facilities.
At the same time, the rise of inferencing, which involves responding to real-time user queries, is reshaping infrastructure needs. Unlike training large language models, which can occur in remote locations, inferencing requires near-instant responses.
“You can’t have 500-MW data centers sitting in the New Mexican desert and be able to deliver real-time inferencing within millisecond scale,” Sacco said.
He noted that while a 10-millisecond delay may be acceptable for training workloads, inferencing typically requires response times closer to one millisecond — a threshold that can only be achieved when data centers are located near population centers.
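The distance implied by those figures can be checked with back-of-the-envelope arithmetic: light in optical fiber travels at roughly two-thirds the speed of light in vacuum, or about 200 km per millisecond, so a 1-millisecond round trip caps the one-way distance at roughly 100 km even before routing, queuing and compute time are counted. The short sketch below works through that arithmetic; the fiber propagation figure is a standard physics assumption, not a number from the article.

```python
# Illustrative check of the latency-vs-distance claim above.
# The fiber speed is a standard approximation (~2/3 of c), not from the article.

FIBER_SPEED_KM_PER_MS = 200.0  # light in optical fiber covers roughly 200 km per ms

def max_one_way_distance_km(round_trip_budget_ms: float) -> float:
    """Farthest a data center could sit if the entire latency budget went to
    fiber propagation (real systems also spend time on routing and compute)."""
    one_way_ms = round_trip_budget_ms / 2
    return one_way_ms * FIBER_SPEED_KM_PER_MS

for budget_ms in (1, 10):
    print(f"{budget_ms} ms round trip -> at most ~{max_one_way_distance_km(budget_ms):.0f} km away")

# 1 ms round trip  -> at most ~100 km  (i.e., near population centers)
# 10 ms round trip -> at most ~1,000 km (remote sites remain workable for training)
```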
Sacco added that inferencing is expected to surpass training as the dominant use of data centers by next year, accounting for more than 55% of AI computing demand.
To support this shift, Sacco is advancing a decentralized model through a new venture, Gray Wolf Data Centers. The approach envisions networks of smaller facilities — potentially dozens within a region — that operate together as a distributed computing system.
“Instead of having a 1,200-MW data center — which there are no more places for — I can build 120 10-MW data centers in a region [and] glue them all together,” he said.
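The sketch below illustrates the general idea in that quote: many small sites aggregate to the same capacity as one mega-campus, while each request can be served from whichever site is closest to the user. The class names, routing rule and sample numbers (apart from the 120 x 10-MW arithmetic Sacco cites) are illustrative assumptions, not Gray Wolf's actual architecture.

```python
# Minimal sketch of the "many small sites glued together" concept.
# Names, routing logic and sample figures are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    capacity_mw: float
    distance_km: float  # distance from the user issuing the request
    load_mw: float = 0.0

    def headroom(self) -> float:
        return self.capacity_mw - self.load_mw

def route(sites: list[Site], demand_mw: float) -> Site:
    """Send a workload to the nearest site that still has spare capacity."""
    candidates = [s for s in sites if s.headroom() >= demand_mw]
    nearest = min(candidates, key=lambda s: s.distance_km)
    nearest.load_mw += demand_mw
    return nearest

# 120 sites of 10 MW each aggregate to the same 1,200 MW as one mega-campus.
region = [Site(f"site-{i:03d}", capacity_mw=10, distance_km=5 + i) for i in range(120)]
print(sum(s.capacity_mw for s in region))   # 1200.0 MW in aggregate
print(route(region, demand_mw=0.5).name)    # site-000, the closest site with headroom
```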
The model also incorporates elements of a decentralized autonomous organization, or DAO, in which ownership and operations are decentralized and coordinated through shared systems.
Energy strategy is another key component of the approach. Rather than relying solely on traditional utilities, each data center could be powered by localized energy solutions, including grid connections, microgrids, solar power and battery storage. Sacco also pointed to emerging technologies such as hydrogen energy, small modular nuclear reactors and, eventually, nuclear fusion as potential long-term power sources.
“Regardless of the electricity generation source, the days of the centralized utility are gone,” he said.
As a proof of concept, Gray Wolf is developing its first facility in Connecticut, a location chosen for its proximity to dense demand centers in the Northeast corridor despite the state's high electricity costs.
To address energy challenges, the company plans to generate power on-site by converting carbon-based waste — including tires, food and medical waste — into electricity. Sacco said this approach could produce power at costs below 10 cents per kilowatt-hour, creating opportunities to both power operations and sell excess energy.
The evolving model reflects broader shifts in the construction and technology sectors as developers seek more flexible, scalable solutions to support the rapid growth of AI-driven demand.
Originally reported by Robert Freedman, Lead Editor at Construction Dive.