This article was originally published on CIO.com.
In the era of AI, and despite the software-driven everything trend, storage — and hardware more generally — is finding renewed strategic relevance in the enterprise executive suite.
When industry veteran Charlie Giancarlo decided a little over a year ago that it was time to step back into a tech company role, he had his pick of opportunities.
Coming out of an eight-year stint as a partner at private equity firm Silver Lake Partners, he had an inside track on the future of the entire tech industry. He knew which sectors were attracting investment and which technologies were poised for breakout growth.
You might expect, therefore, that he would have chosen to lead an AI or cloud company. Instead, he made what may seem a surprising choice: storage.
“[I’m] always looking at trends,” he explained to me. “Not so much what’s ‘hot,’ but what are the fundamental technology trends that are growing along an exponential path, why [they are doing so] and what it means.”
Most of all, he shared, he’s always looking to identify which of these trends have the longest growth path.
An old family friend and owner of several successful businesses once shared with me his secret to success. “Identify the path of progress,” he told me. “And get in its way.”
When Giancarlo chose to join Pure Storage as its CEO, it seems it was because he saw the path of progress barreling his way.
His bet is instructive, however, beyond just what it means for the storage sector. It speaks to a broader and renewed strategic relevance of hardware as data and artificial intelligence (AI) become drivers and enablers of competitive value.
AI challenges the commodity hardware ethos
It was only a short time ago that we were having a very different conversation.
As the cloud rose to prominence, one of its foundational tenets was the idea that you could take commodity hardware and cluster it together with specialized software to create highly scalable and resilient architectures.
This approach made it easy and inexpensive both to scale environments up or down and to replace failed components. Most importantly, this software-driven approach enabled organizations to move off expensive, proprietary systems.
The web-scale companies adopted this ethos with gusto, and enterprise organizations soon began to follow suit. The march towards a commodity-hardware-dominated, software-driven world seemed inexorable.
And then AI happened.
Considering how long AI has been part of the public consciousness, it’s almost funny that it snuck up on the entire tech industry. While industry leaders have been working on AI technologies for decades, until recently it played no meaningful role in enterprise strategy, nor was it a significant element of tech companies’ go-to-market motions.
And then, AI was everywhere.
Because of its sudden rise as a top-of-mind issue, enterprise leaders were largely unprepared to deal with AI — and most critically, were ill-equipped to deal with the impact these new AI workloads would have on their newly cloudified architectures.
As enterprises have begun working with AI, machine learning, advanced analytics, and other data- and resource-intensive workloads, they have found that commodity-based architectures built for traditional applications buckle under the demands of these far more intense workloads.
This gap has caused enterprise leaders, and the entire technology industry, to once again revisit the role of hardware in an AI-powered, data-driven world.
Re-envisioning data-centric architectures for the AI era
It was this trajectory that Giancarlo saw as he surveyed the emerging technology landscape.
“Applications used to be heavyweight,” he explained. “You had to build all the components…and then customize it for the application. There was no [easy] way to move it to another stack…but the data was, in relative terms, very light.”
“Now, it’s the opposite,” he continued. “Today, it’s easy to move the workloads. But data gravity is now the problem.”
At the same time that data was growing more complex and voluminous, it was also becoming more critical to these emerging AI workloads — causing the infrastructure layers supporting it, including storage, to become a problem.
“We discovered that storage, which we had not seen as the thing we needed to work on first [as we dealt with AI workloads], suddenly wasn’t able to keep up,” explained Jeremy Barnes, Chief Architect with Element AI, an AI solutions provider. “It wasn’t able to feed all of our GPUs to the extent [we needed].”
As this gap became apparent, the industry began to recognize that hardware would once again play an essential role in dealing with these new types of workloads.
IBM, for instance, has introduced integrated systems, based on its Power9 chip, that are purpose-built and optimized for AI workloads. Likewise, a startup called SambaNova Systems recently received a Series A investment of more than $50 million to deliver on its plans to create an entirely new integrated hardware/software platform optimized for AI workloads.
These developments represent a broad industry trend re-emphasizing the role of hardware. More importantly, however, they point to the need to drive workload optimization by shifting towards an integrated, data-centric architecture.
Moving past the hardware vs. software debate
I believe it was this confluence of forces that created the path of progress that Giancarlo spotted when he chose to join Pure. The growing volume and importance of data, the intensity of AI workloads, and the need to optimize across both the hardware and software layers have suddenly put storage, as shocking as it may seem, in the critical path of progress for enterprises.
To a certain degree, we should have expected this return swing of the pendulum. The history of the technology industry has been a bit like a yo-yo bouncing between two extremes. As soon as Marc Andreessen wrote the words, “software is eating the world,” I was confident that we would soon see a resurgence of hardware.
Still, this is different from the typical “everything that was once old is new again” industry syndrome. In this case, it’s less about the pendulum swinging away from software and back to hardware, and more about a recognition that the distinction is now irrelevant.
The point is that it’s not really about software eating the world or hardware being strategic again. Instead, it’s about understanding the unique requirements of each workload, how those requirements align with delivering competitive business value, and the need to architect systems holistically to meet them.
“We’re software-defined,” explained Giancarlo. “But we operate on a single hardware platform that we happen to produce. We built our own hardware because industry standard hardware couldn’t deliver the performance that was necessary [for the workloads we support].”
Pure, of course, is not alone in recognizing this trend. Long-time storage vendors, along with a rash of start-ups, are also rushing to close the gap, taking various approaches to help enterprises deal with their new and emerging storage challenges.
It’s too soon to tell how the storage sector will shake out as AI and other data-centric, resource-intensive workloads become ever more prominent. But what is certain is that, against all odds, storage has not only become strategic, it has become sexy.
[Disclosure: IBM is a past client of Intellyx. Pure Storage is a current client of Intellyx.]