
A Complete Newbie’s Primer on the Data Center Ecosystem

The first rule of data centers: Not all are created equal. When we’re talking about different types of data centers and how they’re used, perhaps the most important distinction is between those that are purpose-built and those housed in facilities that have been retrofitted to include data centers.

Editor's note: This blog was originally posted in the Forbes Technology Council forum, an invitation-only community for world-class CIOs, CTOs and technology executives. It has been modified slightly to conform to CoreSite style guidelines.

For instance, when we’re talking about “on-prem” – that is, on the premises – the data center could start out as a telco closet or a single room within a company building, then mushroom into something bigger as technology changes. With few exceptions, on-prem data centers are not purpose-built; they’re built into an existing structure and adapted to data needs. They are common in sectors such as fintech and pharma, and among Fortune 500 companies – organizations that want their own data centers so they can keep their data and run their own applications close at hand, often due to strict privacy concerns, other times as a matter of proximity and convenience.

Enterprises today have the ability to align their choice and mix of data centers to support distributed computing strategies that enable scalability, resilience and better control of operational costs.

On-prem (or corporate) data centers stand in direct contrast with what we call “hyperscale.” It sounds futuristic, but really, the “hyper” just denotes the size of the building – we’re talking about buildings that tend to be gigantic and support 100 megawatts of capacity. Essentially, hyperscale data centers are like server warehouses – purpose-built, with a ton of infrastructure.

Hyperscale data centers require access to a lot of power and a lot of physical space. They typically have their own substations and provide power and cooling to a single tenant. (They can also be a place for other businesses to host servers, but these arrangements usually are very standardized.) The cloud and AI require thousands upon thousands of servers to run AI models and handle the compute, and these hyperscale facilities are in effect where “the brains” of public clouds and AI reside.
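
To put that scale in perspective, here’s a rough back-of-the-envelope sketch in Python. The 100-megawatt figure comes from above; the overhead factor, per-rack draw and server density are illustrative assumptions, not figures from CoreSite or any particular operator.

```python
# Rough sizing of a hyperscale facility from its power capacity.
# All inputs besides the 100 MW capacity are illustrative assumptions.

facility_capacity_mw = 100   # capacity cited above
pue = 1.3                    # assumed power usage effectiveness (cooling/overhead)
kw_per_rack = 10             # assumed average IT power draw per rack
servers_per_rack = 40        # assumed server density

it_load_kw = facility_capacity_mw * 1_000 / pue  # power left for IT equipment
racks = it_load_kw / kw_per_rack
servers = racks * servers_per_rack

print(f"~{racks:,.0f} racks, ~{servers:,.0f} servers")
# ~7,692 racks, ~307,692 servers -- "thousands upon thousands" indeed
```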

The third type of data center, colocation (or “colo”), is centered on interconnection to an ecosystem of digital communities. These are not as large as the massive, single-tenant hyperscale facilities that I alluded to earlier, but they’re also purpose-built. These buildings, in effect, serve a dual purpose: Besides providing the power and infrastructure to support workloads on customer-owned servers, they enable the different constituents to interconnect within their digital ecosystems in “meet-me rooms,” where content providers can connect with broadband networks and enterprises can connect to private and public clouds by leveraging “exchanges” or “digital hubs.”

A Shift Toward Outsourcing

All organizations that use any kind of data infrastructure are trying to achieve a common goal: interconnectivity. For some organizations, an on-prem center is adequate. The problem with that model, however, is that the technology changes quite quickly, but the infrastructure that supports it doesn’t. Consequently, if running a data center isn’t a core part of your business – if, say, your business is a bank or a hospital – then adapting to technological changes can be inefficient, consume a lot of capital and still result in difficulties with interconnection.

Over the past decade, we have observed more and more enterprises outsourcing their data center needs, migrating workloads from on-prem facilities to third-party data centers. They do this principally because they need their workloads to be interconnected with the other parts of their digital ecosystem or digital supply chain. Say you run a bank: You have to connect to networks that carry your traffic or authenticate transactions, and you need to connect to the cloud, where you store the records of the banking your clients are doing now and have done in the past. You might also want connections to another cloud for your algorithmic quant trading arm.

So, if you work with a large and disparate portfolio of companies, it makes sense to go to a third-party data center, where you can have both the compute you own and operate plus access to the platforms you interconnect with. These facilities can still draw megawatts of power, which is more than an on-prem data center could reliably support, but that pales in comparison to the amount required by the hyperscale facilities built to host the underlying cloud infrastructure.

Owned and Shared Infrastructure

Data center components include redundant power systems such as backup generators; HVAC and advanced cooling systems, including liquid cooling; high-speed networking equipment; and layered physical security systems managed by trained personnel.

For many people, it can be hard to grasp the idea that, in a colo setup, the company owns the servers that sit in an off-site data center. They erroneously assume that the physical assets are owned by the data center provider. In reality, while the data center provider is responsible for providing space, consumable utilities (power, water), maintenance, operations and connectivity, the racks and servers are owned by the individual tenant companies.

Another thing to keep in mind is that workloads (and by extension architecture) are becoming ever more distributed. What’s driving that distribution is the increasing amount of data generated by businesses and end users, and the proliferation of AI. Generative AI tools require unprecedented amounts of power, to the point where the largest data center markets are power-constrained. Ultimately, for enterprises and individuals to continue operating these workloads, digital interconnection hubs will have to pop up in more locations. Companies can then colocate data in close proximity to where the AI models are being run to enhance performance and reduce cost.

In a sense, we all started from Pangea, when there was one blob – Ashburn marking the East Coast, Silicon Valley the West Coast – and that was it. But this is no longer sustainable; it’s impossible to concentrate that much power in one location, and the need for broader interconnectivity, with low latency, necessitates a more complex and diverse data center ecosystem.
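
To see why geography matters, consider a quick latency sketch. Light in optical fiber covers roughly 200,000 kilometers per second, so distance alone sets a floor on response time; the route length below is an assumed, illustrative figure, and real fiber paths and switching equipment only add to it.

```python
# A physics floor on latency: light in fiber travels ~200,000 km/s,
# about two-thirds of its speed in a vacuum.
# The route length is an assumed, illustrative figure.

fiber_speed_km_per_ms = 200_000 / 1_000   # ~200 km per millisecond

route_km = 4_500   # assumed Ashburn <-> Silicon Valley fiber route
one_way_ms = route_km / fiber_speed_km_per_ms
round_trip_ms = 2 * one_way_ms

print(f"one-way ~{one_way_ms:.1f} ms, round trip ~{round_trip_ms:.0f} ms")
# one-way ~22.5 ms, round trip ~45 ms -- before any equipment or queuing delay
```

For interactive or chatty workloads, tens of milliseconds per round trip is exactly why compute, data and interconnection increasingly need to sit closer to where they are used.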

An Imperative for Self-Awareness

Enterprises are no longer all brick-and-mortar. However, even platforms whose business is entirely digital may still want to own part of their data center footprint. And in today’s world, every enterprise uses the cloud. To some extent, the cloud sits physically in these hyperscale clusters – and whatever a business is doing, it’s going to have to integrate with the cloud one way or another.

For some companies, it’s okay to have their own data center and interconnect with the cloud via the internet; that’s perfectly fine if you’re a small user with no significant security concerns or if batch processing isn’t a necessity. But bluntly, the days when an enterprise could exist in an end state where all IT functions are performed on-prem, in isolation, are over.

Fundamentally, enterprise leaders must know who they are – that is, develop a certain degree of digital self-awareness. They must have a keen understanding of their processes, workflows and requirements in order to determine how best to architect their IT infrastructure. To attain the optimal balance of cost and performance, they must look at every piece of the puzzle and think about what they are today and what they expect to become over time.

Know More

Visit CoreSite’s Knowledge Base to learn more about how data centers are meeting clients’ constantly increasing power and infrastructure requirements.

The Knowledge Base includes informative videos, infographics, articles and more. This digital content hub highlights the pivotal role data centers play in transmitting, processing and storing vast amounts of data across both wireless and wireline networks – acting as the invisible engine that helps keep the modern world running smoothly.