Report: The Future Is Hyperscale
Cloud megavendors like AWS and Microsoft are leading the hyperscale datacenter boom. But legacy workloads mean uncertainty for more average organizations.
- By John K. Waters
Virtually every analyst and industry watcher with an eye on the hyperscale datacenter market is expecting the tsunami of Big Data and the demands of new cloud-based, data-driven applications to fuel significant growth over the next five years or so.
But the real fuel behind that growth is the set of real-world benefits of hyperscale already demonstrated by companies like Amazon Web Services (AWS), Google, Netflix and Microsoft.
That's one of the conclusions of a new report from BCC Research, which predicts a growth spurt from $39 billion in 2017 to almost $100 billion by 2022, a roughly 20 percent annual growth rate.
"Large-scale cloud providers are leading by example, demonstrating the cost savings and high scalability of hyperscale datacenters and hyperscale computing," said Michael Sullivan, senior editor at BCC Research and the report's author, in an e-mail.
The other conclusion: Those benefits won't be easy for the average IT organization to achieve.
"The complexity of re-architecting legacy workloads to run in a hyperscale environment is a daunting challenge for enterprise IT organizations," Sullivan told Redmond. "So, the movement to hyperscale, while attractive, is slow as companies determine where they can best leverage the technology."
For example, early adopters have found it difficult to determine the converged infrastructure requirements of mission-critical applications after virtualizing their infrastructure, according to Sullivan. Virtualization can also introduce bottlenecks that slow application performance. And designing and implementing a converged infrastructure that relies on resources shared among clustered servers can raise complications of its own.
"Transforming from the IT environments for different functions, architectures and vendor control systems to streamlined virtualized and centralized computing, storage and network resources is a tremendous paradigm shift," he wrote.
Definitions of "hyperscale datacenter" vary slightly, but essentially, the term refers to facilities built with stripped-down, commercial off-the-shelf computing equipment supporting millions of virtual servers, called "nodes," that provide storage, software components and networking.
A hyperscale datacenter solution is designed to provide a single, massively scalable compute architecture developed from individual servers. The hyperscale datacenter can accommodate increased computing demands in a smaller physical space, and with less cooling and electrical power.
According to BCC Research, the current list of major players in this market includes AT&T Inc., Cisco Systems Inc., DataCore Software, Equinix Inc., Fogo Data Centers, Global Switch, Hewlett-Packard Enterprise, IBM, Juniper Networks, Lenovo Group Ltd., The NEC Group, Oracle Corp., Pivot3, Quantum Corp., Radware, SanDisk, Tintri Inc., Violin Memory and Western Digital Corp., among others.
"BCC Research believes the market for hyperscale datacenter technologies represents a long-term trend that will transform the way datacenter technologies are deployed, while also reducing the cost to IT organizations," Sullivan wrote in the report. "Hyperscale datacenters will also enable the continuing scaling of computer storage and networks required by today's environment."
John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at [email protected].