Everyone is talking about the Internet of Things (IoT), and much of what is being said is steeped in massive assumptions. Gartner forecasts 20.8 billion connected things by 2020; Cisco puts the number closer to 50 billion connected devices by 2020. Those are staggering figures, and that degree of device proliferation will forever change technology and the way we use it. “Smart” initiatives (smart cities, grids, factories, connected cars) will fundamentally change where and how interconnection works and where sensor data will reside. I can see a dramatic rewriting of network peering nodes and data center hotspots. However, when I look at the composition of the data generated by connected devices, I see a larger issue: what to do with that data.
The forecast I find most stunning comes from Cisco: “Mobile data traffic will grow at a CAGR of 53 percent between 2015 and 2020, reaching 30.6 EB per month by 2020.” Sensors will be responsible for the lion’s share of that growth, but they also pose two challenges: where should that data be hosted and analyzed, and how do you determine which packets are high value and which are garbage? Let’s tackle each of these concerns individually:
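To put that forecast in perspective, here is a quick back-of-the-envelope check using the standard compound annual growth rate (CAGR) formula. The 53 percent rate and the 30.6 EB/month 2020 figure come from the Cisco quote above; the implied 2015 baseline is derived, not quoted.

```python
# Back-projecting the forecast: future = present * (1 + CAGR)^years,
# so the 2015 baseline is the 2020 figure divided by (1 + CAGR)^5.

CAGR = 0.53            # 53% compound annual growth rate (2015-2020), per Cisco
TRAFFIC_2020 = 30.6    # exabytes per month by 2020, per Cisco
YEARS = 5              # 2015 -> 2020

traffic_2015 = TRAFFIC_2020 / (1 + CAGR) ** YEARS
print(f"Implied 2015 baseline: {traffic_2015:.1f} EB/month")

# Year-by-year traffic under a constant 53% growth rate
for year in range(2015, 2021):
    print(year, round(traffic_2015 * (1 + CAGR) ** (year - 2015), 1))
```

In other words, the forecast implies global mobile traffic growing more than eightfold in five years, which is the scale problem the rest of this piece is concerned with.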
This proliferation of internet-connected devices and sensors will happen everywhere, and the density of those devices will depend on the application. This highly distributed, decentralized architecture is not new in IT, but the degree of dispersion will be unprecedented. No longer will the core of global backbones be the most valuable link in the chain; the edges of the network will be the most logical locations for compute, storage, and analysis of packets.
I see some disconnects in building IoT-centric network and data center architecture. The art of planning, building, and commissioning data center capacity relies on balancing the supply and demand of electrical and mechanical resources. This will require the backing of large-scale colocation providers that can deliver maximum efficiency of resources and high-quality, low-cost capacity, yet still serve customers closer to the edge. Meeting customers where they reside will continue to be a core tenet of architectural flexibility. This model is evident in data center “campuses,” where data center providers create enterprise-grade facilities and connect them to carrier hotels in metro locations.
Everyone agrees that the amount of sensor-driven mobile data will grow at an absurd rate. Many also see a benefit to locating compute and storage closer to the sensors to avoid network congestion. These are all good things to keep top of mind, but the heavy lifting happens when one needs to sift through that mountain of data and make decisions based on it. Running analytics on unstructured datasets is not ideal and will rarely produce conclusions, especially not timely ones. Data centers that are targets of IoT-driven event data need a fabric or platform that allows users to sift through the garbage and identify the useful data. Fabric elements should include:
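The triage step described above, separating high-value packets from garbage before they ever leave the edge, can be sketched in a few lines. Everything here is hypothetical for illustration: the `Reading` schema, the sensor names, and the change threshold are assumptions, not part of any real fabric; the point is simply that most raw readings are discardable noise.

```python
# Sketch: filter sensor events at the edge so only meaningful changes
# are forwarded to the central data center. Schema and thresholds are
# hypothetical illustrations, not a real platform API.

from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

def is_high_value(reading: Reading, last_seen: dict, delta: float = 0.5) -> bool:
    """Keep a reading only if it changed meaningfully since the last one."""
    prev = last_seen.get(reading.sensor_id)
    last_seen[reading.sensor_id] = reading.value
    return prev is None or abs(reading.value - prev) >= delta

readings = [
    Reading("temp-01", 21.0),   # first reading: always forwarded
    Reading("temp-01", 21.1),   # noise: change below the delta, dropped
    Reading("temp-01", 23.0),   # meaningful change: forwarded
    Reading("temp-02", 18.4),   # first reading from a new sensor: forwarded
]

state: dict = {}
forwarded = [r for r in readings if is_high_value(r, state)]
print(f"kept {len(forwarded)} of {len(readings)} readings")
```

Even this toy filter drops a quarter of the traffic; at 30.6 EB per month, deciding what not to ship upstream is where the real savings live.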
IoT represents a huge market opportunity, but it comes with a correspondingly huge responsibility. Taking on the task of aggregating and analyzing IoT event-driven data requires a new data center and network architecture, one that provides a secure environment and supports interoperability.
Ted is an industry professional with over 20 years of experience in critical thinking, client centricity, market analysis, corporate development, and networking sales, with an emphasis on the Cloud, enterprise networking, carrier services, and hosted infrastructure services.