26 bn connected devices by 2020 to create challenges for data centers
The Internet of Things (IoT) has a potentially transformational effect on the data center market, its customers, technology providers, technologies, and sales and marketing models, according to Gartner. Gartner estimates that the IoT will comprise 26 billion installed units by 2020, and that by then IoT product and service suppliers will generate incremental revenue exceeding USD 300 billion, mostly in services.
“IoT deployments will generate large quantities of data that need to be processed and analyzed in real time,” said Fabrizio Biscotti, Research Director at Gartner. “Processing large quantities of IoT data in real time will increase as a proportion of data center workloads, leaving providers facing new security, capacity and analytics challenges.”
The magnitude of network connections and data associated with the IoT will accelerate a distributed data center management approach that calls for providers to offer efficient system management platforms. “IoT threatens to generate massive amounts of input data from sources that are globally distributed. Transferring the entirety of that data to a single location for processing will not be technically or economically viable,” said Joe Skorupa, VP and Distinguished Analyst at Gartner. “The recent trend to centralize applications is incompatible with the IoT. Organizations will be forced to aggregate data in multiple mini data centers where initial processing can occur. Relevant data will then be forwarded to a central site for additional processing.”
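The filter-and-forward pattern Skorupa describes can be sketched as a small aggregation step running at a mini data center. The function name, field names, and deviation threshold below are illustrative assumptions for this sketch, not part of Gartner's analysis:

```python
from statistics import mean

# Hypothetical edge aggregator: process raw sensor readings locally,
# then forward only a compact summary plus strong outliers upstream.
def aggregate_and_forward(readings, threshold=5.0):
    """readings: list of floats from local sensors (illustrative schema)."""
    avg = mean(readings)
    # Keep only readings that deviate strongly from the local mean;
    # everything else is represented by the summary alone.
    outliers = [r for r in readings if abs(r - avg) > threshold]
    return {
        "count": len(readings),
        "mean": avg,
        "outliers": outliers,  # the "relevant data" sent to the central site
    }

payload = aggregate_and_forward([20.1, 19.8, 20.3, 35.0, 20.0])
print(payload)  # one outlier travels upstream instead of five raw values
```

The point of the sketch is the shape of the architecture: raw volume stays at the edge, and only the summary and the anomalous readings cross the network to the central site.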
This new architecture will present operations staff with significant challenges: they will need to manage the entire environment as a homogeneous entity while still being able to monitor and control individual locations. Furthermore, backing up this volume of data raises governance and logistical problems that may prove insoluble: network and remote storage bandwidth are limited, and the capacity needed to back up all raw data is likely to be unaffordable. Consequently, organizations will have to automate selective backup of the data they believe will be valuable. This sifting and sorting will itself generate additional Big Data processing loads, consuming further processing, storage and network resources.
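One way to read "automate selective backup" is as a policy filter applied to incoming records before they reach backup storage. The record kinds and policy table below are a hypothetical illustration, not a Gartner recommendation:

```python
# Hypothetical selective-backup policy: persist only data classes
# deemed valuable, letting routine raw telemetry expire locally.
BACKUP_POLICY = {
    "transaction": True,   # business records: always back up
    "alert": True,         # anomalies worth keeping
    "telemetry": False,    # routine readings: summarize, do not back up
}

def select_for_backup(records):
    """records: iterable of dicts with a 'kind' key (illustrative schema)."""
    return [r for r in records if BACKUP_POLICY.get(r["kind"], False)]

batch = [
    {"kind": "telemetry", "value": 20.1},
    {"kind": "alert", "value": 35.0},
    {"kind": "transaction", "id": 42},
]
to_backup = select_for_backup(batch)
print(len(to_backup))  # 2 of 3 records selected, trimming backup volume
```

Note that the policy lookup itself is the "sifting and sorting" workload the article mentions: it runs on every record, so at IoT scale the filter consumes processing, storage and network resources of its own.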
“Data center operators and providers will need to deploy more forward-looking capacity management platforms, including a data center infrastructure management (DCIM) approach that aligns IT and operational technology (OT) standards and communications protocols. Already in the data center planning phase, throughput models derived from statistical capacity management platforms or infrastructure capacity toolkits will incorporate business applications and their associated data streams,” said Biscotti. “These comprehensive scenarios will drive design and architecture changes, moving toward virtualization and cloud services. This will reduce complexity and boost on-demand capacity, delivering reliability and business continuity.”