26 bn connected devices by 2020 to create challenges for data centers

InformationWeek

The Internet of Things (IoT) has a potentially transformational effect on the data center market, its customers, technology providers, technologies, and sales and marketing models, according to Gartner. Gartner estimates that the IoT will include 26 billion units installed by 2020, and by that time IoT product and service suppliers will generate incremental revenue exceeding USD 300 billion, mostly in services.

“IoT deployments will generate large quantities of data that need to be processed and analyzed in real time,” said Fabrizio Biscotti, Research Director at Gartner. “Processing large quantities of IoT data in real time will increase as a proportion of workloads of data centers, leaving providers facing new security, capacity and analytics challenges.”

The magnitude of network connections and data associated with the IoT will accelerate a distributed data center management approach that calls for providers to offer efficient system management platforms. “IoT threatens to generate massive amounts of input data from sources that are globally distributed. Transferring the entirety of that data to a single location for processing will not be technically and economically viable,” said Joe Skorupa, VP and Distinguished Analyst at Gartner. “The recent trend to centralize applications is incompatible with the IoT. Organizations will be forced to aggregate data in multiple mini data centers where initial processing can occur. Relevant data will then be forwarded to a central site for additional processing.”
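
To make the pattern Skorupa describes concrete, the sketch below shows one way a mini data center might perform initial processing locally and forward only relevant data to a central site. It is a minimal illustration in Python; the function names, the anomaly threshold and the print-based transport are assumptions for the example, not part of Gartner's guidance.

```python
# Illustrative edge-aggregation sketch: each mini data center summarizes its
# own raw readings and forwards only the aggregate plus anomalous records to
# the central site. Names and the threshold below are hypothetical.

from statistics import mean


def process_locally(readings, threshold=75.0):
    """Summarize a batch of sensor readings at the edge and flag anomalies."""
    summary = {
        "count": len(readings),
        "mean": mean(readings) if readings else None,
    }
    anomalies = [r for r in readings if r > threshold]
    return summary, anomalies


def forward_to_central(summary, anomalies, send):
    """Forward only the aggregate and the anomalous records, not the raw stream."""
    if summary["count"]:
        send({"summary": summary, "anomalies": anomalies})


if __name__ == "__main__":
    batch = [68.2, 70.1, 91.5, 69.8]            # raw readings stay at the edge
    summary, anomalies = process_locally(batch)
    forward_to_central(summary, anomalies, send=print)
```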

This new architecture will present operations staffs with significant challenges, as they will need to manage the entire environment as a homogeneous entity while being able to monitor and control individual locations. Furthermore, backing up this volume of data will present potentially insoluble governance issues: network bandwidth and remote storage bandwidth will be strained, and the capacity to back up all raw data is likely to be unaffordable. Consequently, organizations will have to automate selective backup of the data they believe will be valuable. This sifting and sorting will itself generate additional Big Data processing loads that consume further processing, storage and network resources.
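
As a rough illustration of the automated selective backup Gartner anticipates, the sketch below keeps only records matching simple value rules before writing them to remote storage. The rules, the record fields and the JSON-lines backup target are illustrative assumptions, not a prescribed method.

```python
# Illustrative selective-backup sketch: instead of backing up all raw IoT data,
# only records matching "value" rules are copied to the backup target.

import json
import time

# Hypothetical rules for what counts as valuable enough to retain.
VALUE_RULES = (
    lambda rec: rec.get("anomaly", False),        # keep flagged anomalies
    lambda rec: rec.get("source") == "billing",   # keep revenue-relevant data
)


def is_valuable(record):
    return any(rule(record) for rule in VALUE_RULES)


def selective_backup(records, backup_path="backup.jsonl"):
    """Append only the valuable subset of records to the backup file."""
    kept = 0
    with open(backup_path, "a", encoding="utf-8") as fh:
        for rec in records:
            if is_valuable(rec):
                fh.write(json.dumps({"ts": time.time(), **rec}) + "\n")
                kept += 1
    return kept


if __name__ == "__main__":
    stream = [
        {"source": "sensor-12", "value": 70.1},
        {"source": "sensor-12", "value": 95.3, "anomaly": True},
        {"source": "billing", "value": 120.0},
    ]
    print(f"backed up {selective_backup(stream)} of {len(stream)} records")
```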

“Data center operations teams and providers will need to deploy more forward-looking capacity management platforms that can include a data center infrastructure management (DCIM) system approach to aligning IT and operational technology (OT) standards and communications protocols. Already in the data center planning phase, throughput models derived from statistical capacity management platforms or infrastructure capacity toolkits will include business applications and their associated data streams,” said Biscotti. “Those comprehensive scenarios will drive design and architecture changes by moving toward virtualization and cloud services. This will reduce complexity and boost on-demand capacity to deliver reliability and business continuity.”
