Evolution of the traditional data centre
IT budgets for data centres are shrinking, and Gartner predicts that by 2025, 80% of enterprises will have shut down their traditional data centres. With projections pointing to a massive shift away from the traditional model, Network Middle East examines what the future of the data centre looks like
The amount of data generated by 2025 is set to rise to 175 zettabytes and as such data centres will continue to play an essential role in the storage, computation and management of information. Major tech companies are continuing to invest in new data centres by buying land near power sources for future sites. Across a wide range of industries — from healthcare to finance to manufacturing — companies rely on data centres to support the growing creation and consumption of data.
However, analysts report that IT budget spend on data centres has been falling. With Gartner predicting a massive shift away from traditional data centres, what does their future look like?
SMBs have evolved from renting physical servers, to virtual machines, to SaaS applications, and finally to Function as a Service (FaaS), or serverless computing platforms. Rental terms have shrunk in both scope and time, with contract cycles coming down from months to minutes. This trend will continue to gain popularity, as SaaS applications are more cost-effective and reach the market faster.
Shailesh Davey, vice president at ManageEngine and a co-founder of Zoho Corp, explains that data residency regulations have made it mandatory for larger enterprises to retain data in enterprise data centres (EDCs) to stay in compliance.
But even larger enterprises are moving to the public cloud or SaaS applications to deliver a better customer experience and leverage big data analytics, such as analysing customers’ social media data. The data within EDCs is controlled within a private network and connects to the data in the public cloud through secure tunnels. This hybrid cloud setup lets enterprises access data stored both publicly and privately while remaining compliant.
According to Fadi Kanafani, managing director and general manager – Middle East, NetApp, this shift in the data centre industry was anticipated a few years ago. Hence, NetApp is advising its customers to stop building data centres and to invest in building a Data Fabric instead. “Although the traditional data centre as we know it will see a downward spiral, it will not disappear. It will go through an evolution driven by business outcomes. The future data centre is essentially software defined, borderless and dynamic, and makes up a critical part of the Data Fabric circle of data management.”
As organisations start moving workloads back from public to hybrid clouds, it is important to look at the architecture itself. Is it open enough to allow seamless data mobility? Is data gravity being addressed? What are the cost implications? Are data efficiencies available in the hybrid cloud? What security posture needs to be in place? Has data classification been done? Is data tiering being reviewed as part of the data management life cycle? And lastly, does the design lend itself to a multi-cloud architecture? Ultimately, the success of a hybrid cloud strategy will tie directly into the design of the Data Fabric architecture and services that answer all of these questions – and many more.
Khwaja Saifuddin, senior sales director, Middle East at Western Digital, points out that hybrid cloud isn’t necessarily about separating workloads between private cloud and public cloud. “A hybrid cloud works best in application use cases where you can take advantage of each platform. For example, an application that requires a petabyte or more of storage capacity is not a good candidate for cloud hosting because of the storage cost. Uploads to the cloud, however, are generally exempt from bandwidth charges, which means you can upload your data, process it, retrieve the result, then delete it to avoid the storage fees,” adds Saifuddin.
By hosting the data locally, you can migrate your compute resources from one hosting provider to another without worrying about migrating the data. The result is an extremely agile multi-cloud environment in which you have access to the widest possible array of cloud services, while cutting costs and protecting yourself from vendor lock-in at the same time.
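The economics behind the upload-process-delete pattern Saifuddin describes can be sketched with a back-of-the-envelope calculation. The prices below are purely illustrative assumptions, not any provider’s actual rates; the point is the asymmetry between paying ongoing storage on a full data set and paying only egress on a small result.

```python
# Back-of-the-envelope cloud cost comparison. All prices are hypothetical,
# chosen only to illustrate the upload-process-delete pattern.

PETABYTE_GB = 1_000_000            # 1 PB expressed in gigabytes
STORAGE_PRICE_PER_GB_MONTH = 0.02  # assumed $/GB/month for object storage
EGRESS_PRICE_PER_GB = 0.09         # assumed $/GB for data transferred out
INGRESS_PRICE_PER_GB = 0.0         # uploads are typically free of charge

def monthly_storage_cost(size_gb: float) -> float:
    """Cost of simply keeping the data set in the cloud for one month."""
    return size_gb * STORAGE_PRICE_PER_GB_MONTH

def upload_process_delete_cost(size_gb: float, result_gb: float) -> float:
    """Upload-process-delete: pay egress only on the (usually small)
    result, and nothing for long-term storage of the full data set."""
    return size_gb * INGRESS_PRICE_PER_GB + result_gb * EGRESS_PRICE_PER_GB

print(f"Storing 1 PB for one month: ${monthly_storage_cost(PETABYTE_GB):,.0f}")
print(f"Upload 1 PB, retrieve a 10 GB result: "
      f"${upload_process_delete_cost(PETABYTE_GB, 10):,.2f}")
```

At these assumed rates, a month of petabyte-scale storage costs thousands of dollars, while the transient pattern costs under a dollar; real pricing differs by provider and tier, but the shape of the comparison is the same.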
Davey further highlights the key considerations for a successful hybrid cloud implementation: availability of low-latency connectivity to multiple public cloud networks, identity and access management, consistent DevOps procedures across both the public and private cloud environments, and in-house expertise to run virtual routers.
Scope for AI in data centres
AI is now being applied in real-world use cases across manufacturing, security, farming, automotive, healthcare, education and business. The ability of devices and objects connected through the Internet of Things to generate ever more data allows for greater reasoning and reaction.
AI is playing an increasing role through machine learning, adds Saifuddin. “Those in automotive, EDA and research employ real-time data analytics and machine learning workloads every day. At the core of these emerging workloads is the increased adoption of in-memory and clustered databases, which benefit most immediately from low-latency storage.”
Cloud applications, along with network, storage and HVAC equipment, generate a great deal of telemetry data, which needs to be monitored and analysed for the data centre to function properly. Wherever data is generated in huge quantities, there is a good opportunity to use AI/ML technologies to detect anomalies and provide predictive insights and what-if analysis. Alert generation, error-recovery procedures and preventive actions can be automated based on past data. AI/ML technologies can also work in tandem with on-call technicians to recover from unforeseen situations.
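The kind of anomaly detection described above can be as simple as flagging telemetry readings that deviate sharply from recent history. The following is a minimal sketch using a rolling z-score; the temperature readings, window size and threshold are all hypothetical, and production systems would use far more sophisticated models.

```python
# Minimal rolling z-score anomaly detector for data-centre telemetry.
# Readings, window size and threshold are illustrative assumptions.
from statistics import mean, stdev

def find_anomalies(readings, window=10, threshold=3.0):
    """Return indices of readings that deviate more than `threshold`
    standard deviations from the mean of the preceding `window` values."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Example: server inlet temperatures (degrees C) with one sudden spike.
temps = [22.1, 22.0, 22.3, 21.9, 22.2, 22.1, 22.0, 22.2, 21.8, 22.1,
         22.0, 22.2, 29.5, 22.1, 22.0]
print(find_anomalies(temps))  # the spike at index 12 is flagged
```

A detector like this would feed the automated alert generation the article mentions; the same past-data history that drives the z-score could also drive predictive maintenance thresholds.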
AI initiatives tend to be highly resource-consuming because of the computational power required and the preparation of data needed for machine and deep learning. NetApp has teamed up with Nvidia, the leader in GPU computing, to help accelerate business outcomes through a validated design in a converged architecture.
“This approach can help data scientists become productive in a matter of days instead of weeks or months, while administratively the solution becomes fully containerised to ensure simplicity and agility. Although these architectures are mainly on-premise, the platform is open to integrate with the cloud to augment resource requirements whenever needed,” adds Kanafani.