AI at the Edge

For decades, data centers operated as the quiet, hidden backbone of the digital world. That era is definitively over. The data center industry is now at the center of a transformative era, driven by the relentless advancement of artificial intelligence. This is not an incremental change; it is a fundamental, market-breaking event.

The industry is currently navigating a perfect storm of three simultaneous, interconnected pressures:

  1. Exponential Compute Demand: The sheer processing power required by AI models represents a "massive acceleration" in demand.
  2. Power Scarcity: The inability of existing electrical grids to service this unprecedented demand.
  3. The Physics of Latency: The inability of centralized cloud data centers to meet the millisecond-level, real-time response needs of production-grade AI.

The scale of this demand is staggering. Global data center energy demand is projected to double in just five years, from approximately 50 gigawatts (GW) to 100 GW by 2028. In the United States alone, power demand from AI data centers is forecast to grow more than thirtyfold by 2035, surging from 4 GW in 2024 to 123 GW.
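As a rough check, these projections can be restated as compound annual growth rates. A minimal sketch, with the year spans inferred from the figures above (2023–2028 for the global doubling, 2024–2035 for the U.S. forecast):

```python
# Restate the projected capacity growth as compound annual growth rates.
def cagr(start_gw: float, end_gw: float, years: int) -> float:
    """Compound annual growth rate implied by a start/end capacity."""
    return (end_gw / start_gw) ** (1 / years) - 1

# Global: ~50 GW doubling to 100 GW over ~5 years
print(f"Global data centers: {cagr(50, 100, 5):.1%}/yr")
# U.S. AI data centers: 4 GW in 2024 to 123 GW by 2035 (11 years)
print(f"US AI data centers:  {cagr(4, 123, 11):.1%}/yr")
```

Even the "slower" global figure implies roughly 15% compounded annual growth, sustained for half a decade.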

This explosive demand has collided with a finite supply, creating what industry reports are calling a "data center availability crisis". According to a CBRE H1 2025 report, the North American data center vacancy rate has fallen to an all-time low of 1.6%. This is driven by "hyperscale and AI occupiers" racing to secure any available capacity. JLL's mid-year 2025 analysis confirms this, pegging vacancy at a record-low 2.3% and noting that market absorption levels have quadrupled since 2020.

The primary constraint is not land, labor, or capital; it is power. A mid-year 2025 JLL report captures the new industry mantra: "Power has become the new real estate". Power availability is the "defining constraint on growth", and existing grids in core markets like Northern Virginia, Tokyo, and London are tapped out.

This power bottleneck has fundamentally inverted the data center sales model. The low vacancy rates persist despite record-high construction levels. This apparent contradiction is resolved by the fact that new grid connections now have an average wait time of four years in the U.S. Operators can no longer build "speculatively." As JLL analysts note, the model has shifted from "build-it-and-they-will-come" to "commit-before-it's-built-or-you-won't-get-in". The proof is in the pre-leasing: 74.3% of all capacity currently under construction is already pre-leased. Cloud and AI providers are locking in infrastructure 18 to 24 months before their intended deployment, meaning the market is effectively sold out before it even exists.

This intense pressure is forcing the AI infrastructure market to bifurcate into two distinct models. The first, and more widely reported, is the "AI Training Campus": massive, centralized hubs, with some plans scaling to 5,000 acres and 5-gigawatt capacities, built in new, power-rich regions. The second, and arguably more complex, model is the "AI Inference Node": smaller, distributed facilities located near users to run those trained models in real time. The industry's strategy for this second challenge is the AI edge data center response.

Why Centralized AI Fails the Real-Time Test

While the massive, 5-gigawatt training campuses are necessary for building foundational AI models, they are insufficient for deploying them in the real world. The traditional model of centralizing AI inference in large, remote cloud data centers is failing the real-time test.

The new wave of AI is not confined to chatbots. It is being embedded directly into the physical world, powering applications like autonomous vehicles, real-time language translation, smart-city sensors, industrial robotics, augmented reality, and financial fraud detection.

These applications are, by definition, "latency-sensitive". They cannot tolerate the physical delay (latency) of sending a query thousands of miles to a centralized cloud and waiting for a response. Some production AI applications require millisecond response times to function properly. An autonomous vehicle, for example, needs to make split-second decisions and cannot wait for a 200-millisecond round-trip to a data center.

An extensive 2024 measurement study highlights this physical gap: 58% of end-users can reach a nearby edge server in less than 10 milliseconds, while only 29% of end-users can achieve a similar sub-10ms latency from a nearby cloud data center. This is the latency barrier, and for real-time AI, it is non-negotiable.
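The latency barrier is ultimately a speed-of-light problem. A minimal sketch, assuming signals propagate through optical fiber at roughly 200,000 km/s (about two-thirds of c) and using hypothetical distances, shows why a remote cloud region cannot deliver sub-10ms responses no matter how fast its servers are:

```python
# Back-of-the-envelope round-trip latency from distance alone, ignoring
# routing, queuing, and processing delays (illustrative only).

C_FIBER_KM_S = 200_000  # approx. signal speed in optical fiber (~2/3 of c)

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time over fiber, in milliseconds."""
    return (2 * distance_km / C_FIBER_KM_S) * 1000

# A regional edge node ~100 km away vs. a remote cloud region ~3,000 km away
for label, km in [("edge node, 100 km", 100), ("remote cloud, 3,000 km", 3000)]:
    print(f"{label}: >= {min_rtt_ms(km):.1f} ms before any processing")
```

At 3,000 km the round trip alone consumes 30 ms, already triple the sub-10ms budget, before a single inference cycle runs.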

This clarifies a common point of confusion. Some new generative AI models, such as those using chain-of-thought reasoning, can tolerate response times of "several seconds". This is acceptable for a user prompting a chatbot for a creative script. It is catastrophic for a smart-grid sensor or a self-driving car.

This distinction is the business case for AI at the Edge. AI training, the brute-force process of building a large language model (LLM), is a massive batch job. It is power-sensitive but not latency-sensitive, and as the Uptime Institute predicts, "Most AI models will be trained in the cloud".

However, AI inference, the act of using that model to make a real-time decision, is a constant, low-latency task. The true battleground for the next wave of digital infrastructure, and the core of the industry's AI edge data center response, is to build a new architecture that can capture this high-value inference workload.

Defining "AI at the Edge"

In response to the limitations of centralized clouds, the industry is strategically distributing compute power. JLL defines an edge data center as a facility that "brings computing power closer to where the data is generated or consumed". The Uptime Institute provides a more granular definition, describing it as "Distributing computing and storage capabilities to the very edge of the network... be it a factory floor... a cell tower or smart building".

This architectural shift is driven by three core value propositions:

  1. Solving for Latency: This is the primary driver. Placing compute at the edge enables the "real-time decision-making" and "millisecond-by-millisecond" analysis that modern AI applications demand.
  2. Solving for Bandwidth and Cost: The "massive amount of real-time, localized data" generated by AI and IoT devices is often "impractical to send to a central data center". Processing this data locally reduces traffic on core networks and "decreases data transfer costs".
  3. Solving for Privacy and Sovereignty: Many organizations, particularly in finance and healthcare, have "data-related concerns". They "prefer to keep their volumes of sensitive data on-premises" to comply with regulations like GDPR and ensure data security. Edge computing processes data locally, satisfying these sovereignty rules.

This new architecture is enabled by a new class of hardware and software. On the hardware side, this includes AI-optimized edge chips like the NVIDIA Jetson and Intel Movidius, as well as prefabricated, modular/micro-modular data centers that can be deployed rapidly. On the software side, technologies like Federated Learning allow AI models to be trained across decentralized devices without the sensitive raw data ever leaving its source.
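To make the Federated Learning idea concrete, here is a minimal Federated Averaging (FedAvg) sketch on a toy linear model. The three edge sites, their data, and all hyperparameters are illustrative assumptions, not any vendor's implementation:

```python
# Minimal Federated Averaging sketch: each edge site trains locally on its
# own private data and shares only model weights, never the raw data.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One site's local gradient-descent pass on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w

def federated_average(global_w, site_data):
    """Aggregate site updates, weighted by each site's sample count."""
    updates, sizes = [], []
    for X, y in site_data:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
sites = []
for n in (50, 80, 30):                      # three edge sites, different sizes
    X = rng.normal(size=(n, 2))
    sites.append((X, X @ true_w))           # noise-free labels for clarity

w = np.zeros(2)
for _ in range(20):                         # federated rounds
    w = federated_average(w, sites)
print(w)  # converges toward true_w without pooling the raw data
```

Only the weight vectors cross the network in `federated_average`; each site's `X` and `y` never leave its premises, which is precisely how this technique satisfies the sovereignty constraints described above.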

This AI edge data center response is also maturing beyond its initial hype. The early vision of "micro-edge disruptors," tiny data centers on every cell tower, has not fully materialized. The Uptime Institute notes that demand for small-scale edge sites (in the tens of kilowatts) "has not met the initially high expectations".

Instead, the real market growth is in larger, megawatt-scale facilities deployed in new geographical (edge) regions. This is confirmed by a 2025 CBRE report, which notes increasing interest in "Edge data centers with relatively easy-to-secure power capacity of up to 10MW". This redefines the "edge" not as a single tiny box, but as a new tier of 5-10 MW regional hubs, large enough to be economical and powerful but distributed enough to deliver low-latency performance.

Traditional vs. Edge Data Centers

The shift to an edge-based architecture represents a fundamental change in infrastructure design and purpose. The following comparison highlights the key differences between the traditional, centralized model and the new, distributed edge model.

Location
  Traditional (Hyperscale/Cloud): Centralized; remote, power-rich regions.
  Edge: Distributed; "at the edge of the network," closer to end-users and data sources.

Primary Use Case
  Traditional: Large-scale, non-latency-sensitive workloads: Big Data analytics, batch processing, AI training, archival storage.
  Edge: Real-time, latency-sensitive workloads: AI inference, IoT, autonomous vehicles, AR/VR, gaming.

Typical Latency
  Traditional: High (50ms+); data must travel long distances over a WAN.
  Edge: Ultra-low (sub-10ms); minimizes the physical distance data travels.

Scale & Footprint
  Traditional: Large-scale infrastructure; massive, multi-megawatt campuses.
  Edge: Compact, efficient, smaller-scale footprint; from single racks to 10MW regional hubs.

Power & Cooling
  Traditional: Intensive power and cooling systems; optimized for economies of scale.
  Edge: More energy-efficient systems designed for smaller, distributed deployments.

Deployment Model
  Traditional: Multi-year construction; requires land acquisition and massive capital outlay.
  Edge: Rapid deployment (weeks) using modular, "plug-and-play" designs; scales incrementally.

Re-Engineering the Data Center for High-Density AI

The AI revolution is not just changing where data centers are located; it is fundamentally "reimagining" and "re-engineering" their physical design from the ground up.

AI's Thermal Barrier

The single greatest driver of this internal redesign is heat. AI workloads, powered by dense clusters of GPUs, are pushing rack power densities to levels that were unfathomable just a few years ago. JLL reports that rack densities are now "exceeding 250 kilowatts".

New generations of AI chips are accelerating this trend. NVIDIA's next-generation GB200 chip is expected to reach rack densities of 130 kilowatts (kW). Its B200 GPU alone has a thermal design power (TDP) of 1200W.
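Some quick arithmetic connects the per-GPU and per-rack figures. The 1,200 W B200 TDP comes from the text; the GPU counts per rack and the 1.5x overhead factor for CPUs, memory, networking, fans, and power-conversion losses are illustrative assumptions:

```python
# Rough rack-power arithmetic from per-GPU thermal design power.
# GPU_TDP_W is the B200 figure cited in the text; OVERHEAD_FACTOR and the
# GPUs-per-rack counts below are hypothetical illustrative values.

GPU_TDP_W = 1200          # B200 thermal design power, watts
OVERHEAD_FACTOR = 1.5     # assumed CPUs, memory, NICs, fans, PSU losses

def rack_power_kw(gpus_per_rack: int) -> float:
    """Estimated total rack power in kilowatts."""
    return gpus_per_rack * GPU_TDP_W * OVERHEAD_FACTOR / 1000

for gpus in (8, 32, 72):
    print(f"{gpus:>2} GPUs/rack ≈ {rack_power_kw(gpus):.0f} kW")
```

Under these assumptions, a 72-GPU rack works out to roughly 130 kW, in line with the GB200-class rack densities cited above, and an order of magnitude beyond the 10-15 kW racks that air cooling was designed for.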

These thermal loads have created a physical barrier. Traditional air-based cooling systems, the standard for 30 years, "can no longer meet the demands". As JLL experts state, the industry has gone "beyond what's possible with the physics of air cooling".

The Liquid Cooling Revolution


To manage this intense heat, liquid cooling has emerged as the "inevitable development" and is "quickly becoming the new standard" for all AI workloads. Market penetration for liquid cooling is forecast to surpass 30% in 2025.

This technology generally takes two primary forms:

  1. Direct-to-Chip (DTC) Cooling: Also known as cold-plate cooling, this method uses small pipes to deliver a coolant (like water or a refrigerant) directly to a "cold plate" mounted on the hottest components, such as the CPU and GPU. The heat is transferred to the liquid and piped away. This is the method NVIDIA is officially guiding for its high-density servers.
  2. Immersion Cooling: This more radical approach involves submerging the entire server and all its components directly into non-conductive, dielectric fluid. The fluid absorbs the heat and is then circulated to a heat exchanger.

This shift is not theoretical; it is being driven by billions in active investment.

  • Equinix has announced plans to deploy liquid cooling in 100 of its data centers across 45 cities.
  • Digital Realty has already launched a high-density colocation offering capable of handling 70 kW per rack, powered by liquid cooling.
  • Aligned Data Centers is building a new, purpose-built, liquid-cooled campus for AI cloud provider Lambda, specifically to host the next-generation NVIDIA Blackwell GPUs.
  • Colovore recently secured a $925 million debt facility to expand its platform, which is purpose-built for liquid cooling at extreme densities of up to 200 kW per rack.

The implications of this shift are enormous. The entire global stock of data centers, built and optimized for air cooling, is now legacy infrastructure. Retrofitting a live facility for liquid cooling, which involves extensive new plumbing, is a "major undertaking" and a "high risk" operation. However, operators like Digital Realty are mastering "The art of the data center retrofit", using a modular "system of systems" approach to upgrade facilities one section at a time. This signals the beginning of a massive, multi-trillion-dollar capital expenditure cycle to upgrade the world's digital infrastructure to be "AI-ready."

Power Is the New Real Estate

Even a perfectly liquid-cooled data center is useless if it cannot be powered. As established, "power availability has become the defining constraint on growth". Existing grid infrastructure "will struggle to support" the massive expansion of digital infrastructure. The Uptime Institute confirms that AI is "intensifying" this already strong demand.

This power scarcity has triggered a "new gold rush", inverting the site-selection process. Operators are now "pushing development into new markets in search of capacity". This search is moving development into "secondary and tertiary markets" and "new hotspots" like Richmond, Virginia, based purely on power availability and grid connection timelines.

Traditional green energy strategies, such as signing off-site Power Purchase Agreements (PPAs) for solar or wind energy, are no longer sufficient. While important, these strategies are being called "insufficient on their own". The core problem is that solar and wind are intermittent, whereas data centers and AI workloads require 24/7/365, high-reliability firm power.

To solve this, the industry is moving "behind the meter" to generate its own power. This response involves two major strategies:

  1. Microgrids: Operators are increasingly building self-sufficient "nanogrid/microgrid power" systems. A microgrid integrates on-site power generation (like fuel cells or solar) with energy storage (like batteries), allowing the data center to operate independently ("island mode") during a grid failure, thus ensuring resilience.
  2. Next-Generation Firm Power: The most significant trend is the industry's "growing enthusiasm" for new, 24/7 clean power sources that can be deployed on-site or nearby.
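The microgrid control loop described in point 1 can be sketched as a toy dispatch policy. All capacities, thresholds, and the 15-minute interval below are hypothetical, chosen only to illustrate the grid/island/curtail decision:

```python
# Toy microgrid controller: each interval, decide whether the data center
# runs on the utility grid or "islands" onto on-site generation plus
# battery storage. All numbers are hypothetical illustrative values.
from dataclasses import dataclass

@dataclass
class Microgrid:
    onsite_kw: float      # firm on-site generation (e.g., fuel cells)
    battery_kwh: float    # remaining stored battery energy
    load_kw: float        # current IT + cooling load

    def dispatch(self, grid_available: bool, interval_h: float = 0.25) -> str:
        """Pick the power source for this interval, draining battery if needed."""
        if grid_available:
            return "grid"
        shortfall_kw = max(0.0, self.load_kw - self.onsite_kw)
        needed_kwh = shortfall_kw * interval_h
        if needed_kwh <= self.battery_kwh:
            self.battery_kwh -= needed_kwh
            return "island"           # on-site generation + battery cover load
        return "shed-load"            # stored energy exhausted: curtail

mg = Microgrid(onsite_kw=8000, battery_kwh=2000, load_kw=10000)
print(mg.dispatch(grid_available=True))    # "grid"
print(mg.dispatch(grid_available=False))   # "island": battery covers the 2 MW gap
print(round(mg.battery_kwh))               # 1500 kWh remaining
```

A production controller would add forecasting, battery recharge scheduling, and generator ramp limits, but the core resilience logic, falling back to island mode when the grid fails, is exactly this decision.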

This represents a fundamental identity shift. Data center operators were once passive customers of the utility. The failure of the grid to keep pace has forced them to become active producers of energy. By investing in on-site generation and exploring technologies like Small Modular Reactors (SMRs), enhanced geothermal, and hydrogen fuel cells, the data center operator of the AI era is evolving into a distributed power company.

The New Power Playbook


The new mandate for 24/7, clean, and reliable "firm power" has forced the industry to look beyond traditional renewables. The following table compares the leading on-site and firm power candidates that data center operators are now evaluating.

Solar & Wind (PPAs)
  Reliability (Power Type): Intermittent. Requires battery storage for 24/7 use, which adds cost and complexity.
  Land/Space Requirements: Very High. Requires "enormous amounts" of land; "potentially millions of acres" for grid-scale projects.
  Initial Capital Cost: Low (for PPAs) to Medium (for on-site). Cost-competitive.
  Scalability / Maturity: High (Mature). Well-understood technology; PPAs are a common procurement strategy.

Hydrogen Fuel Cells
  Reliability (Power Type): 24/7 Firm Power. Proven track record for reliable primary or backup power.
  Land/Space Requirements: Very Low. Extremely space-efficient; a 10 MW fuel cell installation can fit on ~1 acre.
  Initial Capital Cost: High. "High Initial Investment Cost" is a primary barrier to adoption.
  Scalability / Maturity: High (Mature). Proven and scalable; systems can be added modularly from kW to multi-MW.

Enhanced Geothermal (EGS)
  Reliability (Power Type): 24/7 Firm Power. Provides reliable, abundant, clean power.
  Land/Space Requirements: Low-to-Medium. Uses modular plant designs; footprint is far smaller than solar/wind.
  Initial Capital Cost: High. Requires significant upfront investment in exploration and drilling.
  Scalability / Maturity: Medium (Emerging). "Uniquely positioned" to scale to gigawatts before 2030. Cost per MW declines as project size increases.

Small Modular Reactors (SMRs)
  Reliability (Power Type): 24/7 Firm Power. "Abundant green energy".
  Land/Space Requirements: Very Low. Highly compact and modular, providing up to 300 MW from a small footprint.
  Initial Capital Cost: Very High. A "fraction of the traditional large-scale nuclear cost", but still a massive capital investment.
  Scalability / Maturity: Low (Pre-Commercial). "Still in the early stages of development." Commercial deployment not expected until 2030 at the earliest.

Case Studies of Industry Leaders Building the Distributed Future

This triple-threat response of distributed location, liquid-cooled designs, and on-site power is not just theory. It is the active, funded strategy of the industry's largest players. CRN's 2025 list of the "25 Hottest AI Companies for Data Center and Edge" shows an entire ecosystem pivot, including chipmakers (NVIDIA, AMD), infrastructure providers (HPE, Dell, Supermicro), networking (Cisco), and edge specialists (Scale Computing).

Three case studies exemplify this new, hybrid response:

Case Study 1: Equinix (The Ecosystem Integrator)

Equinix is leveraging its massive global footprint of over 270 data centers to be the connector for distributed AI. Their strategy is not to build the AI models, but to provide the "AI-optimized global network" that connects all the partners who do. Their Equinix Fabric® platform allows enterprises to securely connect to a "vendor-neutral AI ecosystem" of over 2,000 partners. To "de-risk" this complex new architecture for its customers, Equinix has launched a global "AI Solutions Lab" where businesses can test and validate their AI designs before deploying them at scale.

Case Study 2: Digital Realty (The Incumbent & Retrofit Specialist)

As one of the world's largest data center owners, Digital Realty's challenge is different: how to integrate new, 100-kW liquid-cooled AI racks with the "legacy IT infrastructure" of 15-kW air-cooled racks that comprise most of its portfolio. Their solution is a "system of systems" modular design. Instead of attempting a slow and costly "high-risk" retrofit of an entire building, they can upgrade one room or even one rack at a time. This modular approach is proven: it provided one European financial services client with a "6x faster time to deployment" for a new, high-density workload.

Case Study 3: DartPoints (The Pure-Play Edge Specialist)

DartPoints represents the pure-play AI edge data center response. Their business model is built on providing high-performance colocation specifically for "advanced workloads including artificial intelligence" in distributed edge markets. A recent partnership with Virtuous AI highlights this strategy perfectly. Virtuous AI runs its generative AI models from DartPoints' Greenville, South Carolina, data center. DartPoints provides the secure, compliant, and carrier-neutral physical infrastructure, allowing the AI company to focus on its algorithms and scale rapidly in a low-latency market.

These three companies represent the three essential layers of the industry's response. Equinix is the Ecosystem & Connectivity Layer, the "digital Switzerland" connecting all the new partners. Digital Realty is the Incumbent & Real Estate Layer, solving the trillion-dollar problem of making the last 20 years of infrastructure compatible with the next five years of AI. DartPoints is the New Market & Specialist Layer, providing the agile, specialized facilities where edge demand is emerging. Together, this multi-layered, hybrid, and comprehensive AI edge data center response is what will power the next wave of global innovation.

The Future is Distributed, Resilient, and Liquid-Cooled

For decades, the data center industry quietly powered the digital world. That era is over. AI has thrust the industry into a "transformative era", forcing it to navigate "rising costs, worsening power constraints," and persistent supply chain delays.

In response to this three-front crisis of power, latency, and demand, a clear strategic direction has emerged. This is not an "edge vs. cloud" debate. The future is an integrated, hybrid architecture that strikes a "balance between edge and cloud computing capacity", using massive, centralized campuses for training and a distributed network of edge nodes for inference.

The data center of the future is not a single monolithic building. It is a distributed, resilient, and intelligent network. Its foundation is built from high-density, liquid-cooled hardware, and it is powered by a new ecosystem of on-site, reliable, and increasingly independent energy sources. The AI revolution is not just a new workload for the data center industry; it is the catalyst that is fundamentally redesigning the industry itself.


Works cited and additional resources

  1. The AI-driven data center revolution: Why 2025 is a defining year - JLL Spark, https://spark.jllt.com/resources/blog/the-ai-driven-data-center-revolution-why-2025-is-a-defining-year/
  2. 2025 Global Data Center Outlook - JLL, https://www.jll.com/en-us/insights/market-outlook/data-center-outlook
  3. AI power: Expanding data center capacity to meet growing demand - McKinsey, https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/ai-power-expanding-data-center-capacity-to-meet-growing-demand
  4. The Evolving Landscape of Data Centers: Power, AI and Capital in 2025, https://www.datacenterfrontier.com/sponsored/article/55283194/the-evolving-landscape-of-data-centers-power-ai-and-capital-in-2025
  5. Growth of AI creates unprecedented demand for global data centers - JLL, https://www.jll.com/en-us/newsroom/growth-of-ai-creates-unprecedented-demand-for-global-data-centers
  6. The Rise of AI at the Edge | Micron Technology Inc., https://www.micron.com/about/blog/applications/ai/the-rise-of-ai-at-the-edge
  7. Inference Zones: How Data Centers Support Real-Time AI - CoreSite, https://www.coresite.com/blog/inference-zones-how-data-centers-support-real-time-ai
  8. Can US infrastructure keep up with the AI economy? - Deloitte, https://www.deloitte.com/us/en/insights/industry/power-and-utilities/data-center-infrastructure-artificial-intelligence.html
  9. Data center availability crisis deepens as vacancy hits historic low, https://www.jll.com/en-us/newsroom/data-center-availability-crisis-deepens-as-vacancy-hits-historic-low
  10. North America Data Center Trends H1 2025: AI & Hyperscaler ..., https://www.cbre.com/insights/briefs/north-america-data-center-trends-h1-2025-ai-and-hyperscaler-demand-lead-to-record-low-vacancy
  11. Record-low data center vacancy fuels modern-day “gold rush” - JLL, https://www.jll.com/en-us/newsroom/record-low-data-center-vacancy-fuels-modern-day-gold-rush
  12. North America Data Center Trends H1 2025 - CBRE, https://www.cbre.com/insights/reports/north-america-data-center-trends-h1-2025
  13. North America Data Center Report Midyear 2025 - JLL, https://www.jll.com/en-us/insights/market-dynamics/north-america-data-centers
  14. Agenda 2025 - Global - All in one - TechEx Events, https://techexevent.com/agenda-2025-global-all-in-one/
  15. Distributed AI Infrastructure: Accelerating Innovation at Scale ..., https://blog.equinix.com/blog/2025/09/25/distributed-ai-infrastructure-accelerating-innovation-at-scale/
  16. Data Centers - CBRE, https://www.cbre.com/insights/books/us-real-estate-market-outlook-2024/data-centers
  17. Breaking barriers to Data Center Growth | BCG, https://www.bcg.com/publications/2025/breaking-barriers-data-center-growth
  18. What Is an AI Data Center? - IBM, https://www.ibm.com/think/topics/ai-data-center
  19. Latency Comparison of Cloud Datacenters and Edge Servers - NSF Public Access Repository, https://par.nsf.gov/servlets/purl/10184999
  20. Five data center predictions for 2025 - Uptime Institute, https://uptimeinstitute.com/uptime_assets/5806c68ee3f5faba8774036e08866d6b27c5e72912d068620c31ecd4cddaa191-five-data-center-predictions-for-2025.pdf?utm_source=LinkedIn&utm_medium=Social&utm_campaign=Predictions%20Report&trk=test
  21. Global edge data center market to cross $300B by 2026: JLL, https://www.jll.com/en-us/newsroom/global-edge-data-center-market-to-cross-300-billion-dollar-by-2026
  22. Data Centers At The Edge - Uptime Institute, https://uptimeinstitute.com/data-centers-at-the-edge
  23. How AI at the Edge is Revolutionizing Real-Time Decision Making - DataBank, https://www.databank.com/resources/blogs/how-ai-at-the-edge-is-revolutionizing-real-time-decision-making/
  24. Edge AI: How AI Agents and Real-Time Data Shape Industries - Sand Technologies, https://www.sandtech.com/insight/edge-ai-how-ai-agents-and-real-time-data-shape-industries/
  25. Edge Data Centers 2025 - Azura Consultancy, https://www.azuraconsultancy.com/edge-data-centers/
  26. What are Edge Data Centers? | Detailed Guide - Blackridge Research & Consulting, https://www.blackridgeresearch.com/blog/what-is-an-edge-data-center
  27. Updates: Past TMT Predictions' greatest hits and (near) misses - Deloitte, https://www.deloitte.com/us/en/insights/industry/technology/technology-media-and-telecom-predictions/2025/past-tmt-predictions-hits-and-misses.html
  28. TMT Predictions 2025 | Deloitte Insights, https://www.deloitte.com/us/en/insights/industry/technology/technology-media-and-telecom-predictions.html
  29. AI Data Center Statistics & Trends, https://www.rsinc.com/ai-data-center-statistics-trends.php
  30. Unofficial Notes Data Center Dynamics (DCD) Connect Investment Forum - Summit Ridge Group, https://summitridgegroup.com/wp-content/uploads/DCD-Notes-14YM-2.pdf
  31. Resources - Research & Reports - Deployment Models at the Edge - Uptime Institute, https://uptimeinstitute.com/resources/research-and-reports/deployment-models-at-the-edge
  32. Global Data Center Trends 2025 | CBRE, https://www.cbre.com/insights/reports/global-data-center-trends-2025
  33. Designing Data Centers for AI: Infrastructure for High-Density Compute | Knowledge Hub, https://www.wesco.com/us/en/knowledge-hub/articles/designing-data-centers-for-ai-infrastructure-for-high-density-compute.html
  34. Thermal Management for Data Centers 2025-2035: Technologies, Markets, and Opportunities - IDTechEx, https://www.idtechex.com/en/research-report/thermal-management-for-data-centers/1036
  35. Liquid cooling enters the mainstream in data centers - JLL, https://www.jll.com/en-us/insights/liquid-cooling-enters-the-mainstream-in-data-centers
  36. How the AI Gold Rush Is Influencing Data Center Design Trends - Gensler, https://www.gensler.com/blog/ai-influencing-data-center-design-trends
  37. Liquid Cooling to Scale in AI Data Centers, Penetration to Surpass 30% in 2025, https://www.techpowerup.com/forums/threads/liquid-cooling-to-scale-in-ai-data-centers-penetration-to-surpass-30-in-2025.340164/
  38. Optimization Control Strategies and Evaluation Metrics of Cooling Systems in Data Centers: A Review - MDPI, https://www.mdpi.com/2071-1050/16/16/7222
  39. Liquid Cooling Comes to a Boil: Tracking Data Center Investment ..., https://www.datacenterfrontier.com/cooling/article/55292167/liquid-cooling-comes-to-a-boil-tracking-data-center-investment-innovation-and-infrastructure-at-the-2025-midpoint
  40. Data Center Design in the Age of AI: Integrating AI with Legacy Infrastructure - Digital Realty, https://www.digitalrealty.com/resources/articles/integrating-ai-with-legacy-infrastructure
  41. Data Centers in the age of AI | Digital Realty, https://www.digitalrealty.com/resources/articles/data-centers-in-the-age-of-ai
  42. Renewable energy for data centers - Uptime Institute, https://uptimeinstitute.com/renewable-energy-for-data-centers
  43. How data center operators can transition to renewable energy - Uptime Institute Blog, https://journal.uptimeinstitute.com/how-data-center-operators-can-transition-to-renewable-energy/
  44. (PDF) Data Centers and Green Energy: Paving the Way for a Sustainable Digital Future, https://www.researchgate.net/publication/376305080_Data_Centers_and_Green_Energy_Paving_the_Way_for_a_Sustainable_Digital_Future
  45. Renewable Energy Use in Data Centers: Green Revolution - Dgtl Infra, https://dgtlinfra.com/renewable-energy-data-centers/
  46. Smart Energy for the Data Center - Uptime Institute, https://uptimeinstitute.com/smart-energy-for-the-data-center
  47. Protecting data center availability with microgrids - AlphaStruxure, https://alphastruxure.com/protecting-data-center-availability-with-microgrids/
  48. Scaling 24/7 Power for the AI Era: The Enhanced Geothermal Data ..., https://fervoenergy.com/fervo-uipa-the-enhanced-geothermal-data-center-corridor-july-2025/
  49. How Fuel Cells Help Solve the Growing Data Center and AI ..., https://fchea.org/how-fuel-cells-help-solve-the-growing-data-center-and-ai-challenge/
  50. The 25 Hottest AI Companies For Data Center And Edge: The 2025 ..., https://www.crn.com/news/ai/2025/the-25-hottest-ai-companies-for-data-center-and-edge-the-2025-crn-ai-100
  51. Equinix unveils distributed AI infrastructure to help businesses accelerate the next wave of AI innovation | TelecomTV, https://www.telecomtv.com/content/digital-platforms-services/equinix-unveils-distributed-ai-infrastructure-to-help-businesses-accelerate-the-next-wave-of-ai-innovation-53923/
  52. Enabling Efficient AI Workloads in Digital Realty Data Centers: Advanced Engineering Group Spotlight, https://www.digitalrealty.com/resources/articles/enabling-efficient-ai-workloads-in-digital-realty-data-centers-advanced-engineering-group-spotlight
  53. DartPoints Puts New Capital to Work: Launches Multi-Market Expansion to Power HPC, AI and Large Enterprise Growth, https://dartpoints.com/dartpoints-puts-new-capital-to-work-launches-multi-market-expansion-to-power-hpc-ai-and-large-enterprise-growth/
  54. DartPoints Expands Reach in AI | DartPoints, https://dartpoints.com/dartpoints-expands-reach-in-ai/
  55. Uptime Institute Global Data Center Survey 2024, https://datacenter.uptimeinstitute.com/rs/711-RIA-145/images/2024.GlobalDataCenterSurvey.Report.pdf
  56. Uptime's 15th Annual Global Data Center Survey Results Shows Both Commitment and Hesitancy as Industry Plans for Wider AI Usage, Climate Change Reporting, and the NVIDIA Revolution to Come, https://uptimeinstitute.com/about-ui/press-releases/uptimes-15th-annual-global-data-center-survey-results-shows-both-commitment-and-hesitancy
  57. Scaling bigger, faster, cheaper data centers with smarter designs - McKinsey, https://www.mckinsey.com/industries/private-capital/our-insights/scaling-bigger-faster-cheaper-data-centers-with-smarter-designs
