Data Center News - CoreSite Connect[ED] Blog

The Backbone of AI: Performance, Proximity and Bandwidth

Written by The CoreSite Team | 08/15/2025

Artificial intelligence is reshaping industries faster than ever – but this wave of innovation rides on a complex and evolving infrastructure. AI has two primary requirements: training demands massive, centralized, high-density computing power, while inference needs to happen close to users and devices to deliver accurate, actionable and valuable results.

According to Zayo, data center network bandwidth purchasing surged by 330% in 2024, driven largely by AI workloads in hyperscale data centers.1 Meeting AI's data-transfer capacity needs means leveraging hybrid IT infrastructure that not only delivers low-latency performance but also balances power efficiency, geographic flexibility and seamless connectivity.

Training at Scale Starts in the Core

Figure 1. As AI continues to grow, so too does the demand for data center capacity.1

AI training uses large amounts of data to teach a machine learning model to perform specific tasks. As the model processes this data, it refines its parameters to improve accuracy and performance. This isn’t a one-time event – it’s an iterative, compute-heavy cycle that demands powerful hardware and optimized infrastructure.

Given the complexity and scale of AI training, it must run in core data centers equipped to handle the load. These hyperscale facilities provide the advanced computational power – via high-performance GPUs or AI accelerators – to run large-scale training, and they offer the dense storage solutions required to manage massive datasets. High-speed interconnects within the data center allow for seamless parallel processing and data movement between systems, reducing training time and improving model performance.

Looking ahead, the infrastructure demands for AI are expected to grow rapidly. A recent McKinsey report forecasts that by 2030, global demand for AI will require 2.5 times more data center capacity, with about 70% equipped to host advanced AI workloads.2 But it’s not just about adding more space; supporting AI at scale requires a shift in how data centers are being designed and located.

Bringing AI Inference Closer

Figure 2. Data center locations are shifting toward distributed areas, based on the data center's primary function in AI inference.2

Once models are trained, the real work begins. Inference is when models put their training to use – analyzing data, making decisions and delivering personalized results in real time. Unlike training, inference is typically less compute-intensive but far more sensitive to latency. It's about data-transfer speed, efficiency and consistency. This is why AI inference workloads are moving closer to endpoints.

To support this shift, colocation providers are enabling AI in “Inference Zones” – relatively compact regions, akin to cloud availability zones, typically located in the metro areas where AI is being implemented, often on IoT end devices. These zones deliver data transfer in real time, reduce backhaul traffic and help meet compliance requirements by processing data locally.
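The latency benefit of proximity can be estimated from first principles: light in optical fiber propagates at roughly 200,000 km/s (about 5 µs per km, one way), so distance alone sets a hard floor on round-trip time. A minimal sketch of that floor – the example distances are illustrative assumptions, not CoreSite figures:

```python
# Best-case round-trip latency from fiber propagation alone.
# Light in optical fiber travels at ~200,000 km/s (~5 us per km one way);
# real-world latency adds routing, queuing and serialization on top.

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s, expressed as km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay over fiber, in milliseconds (lower bound)."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# Hypothetical scenarios: a metro Inference Zone vs. a distant core site.
for label, km in [("same metro (50 km)", 50),
                  ("regional (500 km)", 500),
                  ("cross-country (4,000 km)", 4000)]:
    print(f"{label}: ~{round_trip_ms(km):.2f} ms round trip")
```

Even before any processing happens, a cross-country hop costs tens of milliseconds per round trip – which is why latency-sensitive inference gravitates toward metro-area zones.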

Performance is just one benefit. Local inference also improves data privacy, reduces costs from long-haul data transfers and enhances the overall efficacy of AI-driven applications.

But while inference demands proximity, AI training still hinges on abundant power. Supporting both ends of the AI lifecycle requires something more: seamless connectivity across increasingly distributed environments.

Power Drives AI. Fiber Connects It.

The rise in AI workloads is driving unprecedented energy demand in data centers. Average power density per rack has more than doubled in just two years – from 8 kW to 17 kW – and is expected to reach 30 kW by 2027.2 Overall, data center energy consumption could increase by 160% by 2030 to keep up with AI growth.3
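Those density figures translate directly into capacity-planning arithmetic: as per-rack draw climbs, a fixed power budget supports far fewer racks. A minimal sketch – the densities (8, 17 and 30 kW per rack) come from the article, but the 1 MW budget is a hypothetical example:

```python
# How many racks fit in a fixed IT power budget as per-rack density rises?
# Densities of 8, 17 and 30 kW/rack are cited in the article;
# the 1 MW budget is an illustrative assumption.

BUDGET_KW = 1000  # hypothetical 1 MW of deliverable IT power

def racks_per_budget(kw_per_rack: int, budget_kw: int = BUDGET_KW) -> int:
    """Whole racks supportable at a given per-rack power draw."""
    return budget_kw // kw_per_rack

for kw in (8, 17, 30):
    print(f"At {kw} kW/rack, 1 MW supports {racks_per_budget(kw)} racks")
```

The same megawatt that once fed 125 racks at 8 kW supports only about a third as many at projected 2027 densities – one reason power availability, not floor space, increasingly drives site selection.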

Because of these power demands, operators are shifting away from traditional tech hubs where power availability is constrained. Instead, they're expanding into regions with abundant, stable or renewable energy sources, and into locations with favorable utility incentives.

However, distributing compute across more diverse geographies introduces a critical need for reliable, high-speed connectivity. Dark fiber plays a key role here by offering private, high-capacity links that connect multiple data center sites over low-latency, secure networks. These links support rapid data replication, real-time coordination between core, cloud and colocation data centers, and consistent performance across distributed AI clusters.
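Link capacity matters because replication time scales inversely with bandwidth: moving a large dataset between sites can take a day or under an hour depending on the pipe. A minimal sketch of the transfer-time arithmetic – the 100 TB dataset and the link rates are hypothetical, and the math ignores protocol overhead and congestion:

```python
# Ideal time to replicate a dataset between sites at different link capacities.
# Dataset size (100 TB) and link rates (10/100/400 Gb/s) are illustrative assumptions.

def transfer_hours(dataset_tb: float, link_gbps: float) -> float:
    """Ideal transfer time in hours (decimal TB; no overhead or congestion)."""
    bits = dataset_tb * 8e12            # terabytes -> bits
    seconds = bits / (link_gbps * 1e9)  # bits / (bits per second)
    return seconds / 3600

for gbps in (10, 100, 400):
    print(f"100 TB over {gbps} Gb/s: ~{transfer_hours(100, gbps):.1f} hours")
```

At 10 Gb/s the same replication job takes roughly 40× longer than over a 400 Gb/s dark fiber link – the difference between overnight batch syncs and near-real-time coordination across sites.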

In this evolving landscape, power availability guides where AI infrastructure is built, while dark fiber connectivity ensures it can scale effectively and operate seamlessly.

Infrastructure Built for AI’s Next Leap

AI is reshaping how the world works, and the infrastructure behind it must evolve to keep pace. This means building hybrid, geographically diverse and interconnected data center ecosystems that bring compute and data closer to where they’re needed, whether in centralized training hubs or regional Inference Zones.

Organizations that embrace this new infrastructure mindset – one that accounts for the distribution of workloads across the entire data center continuum and balances power, location and seamless connectivity – won't just keep up with AI's demands. They'll lead the charge.

 

Know More

CoreSite helps power the next era of AI with high-performance colocation and interconnection solutions. Our data centers support hybrid architectures with access to scalable power, low-latency inter-site connectivity and direct cloud onramps. Explore how CoreSite’s ecosystem of network, cloud and service providers can help you build smarter, more connected infrastructure that's ready for the demands of AI.

Ready to talk about how CoreSite can help you bring AI into your infrastructure? 

Contact us to start the conversation.

In the meantime, learn more about what our clients are doing with AI and download our Tech Brief to get insight on actualizing AI's potential for your organization.

References

1. Data center bandwidth soars 330%, driven by AI demand (Data Center Knowledge)
2. AI power: Expanding data center capacity to meet growing demand (McKinsey)
3. AI is poised to drive 160% increase in data center power demand (Goldman Sachs)