Three clicks and you’re done: With Private Cloud AI, HPE wants to remove the complexity of AI and free it from the public cloud. A groundbreaking vision or more of the same?
It was the highlight of HPE Discover in Las Vegas in June: HPE CEO Antonio Neri and Nvidia CEO Jensen Huang announced Private Cloud AI to a packed Sphere like two seasoned rock stars. Huang is absent in Barcelona, but AI is discussed no less. “We see ourselves as administrators of AI,” Neri declares in his keynote.
With Private Cloud AI, HPE believes it has cracked the AI code. The company aims to provide an easy-to-deploy private alternative to GenAI solutions, which currently reside primarily in the public cloud. HPE is not alone in this view. Has HPE discovered the magic formula or reinvented the wheel?
AI is hybrid
For many companies, applications like ChatGPT are the most obvious way to experiment with AI. Whether you prefer ChatGPT from OpenAI, Gemini from Google or Anthropic’s Claude, the models have one thing in common: they are available via the public cloud. According to HPE, it doesn’t have to be that way. Discover is the ideal opportunity to convince the world that AI belongs in private infrastructure.
HPE describes generative AI as “the ultimate hybrid workload.” Neil McDonald, Vice President of HPC and AI, backs up that statement. “Classic AI was based on coding. GenAI is about data. If models are not embedded in data, they do not produce results. Most of the data is local.”
Mohan Rajagopalan, who leads Ezmeral, makes several arguments for running AI locally. “The first reason is economic. AI in production can be expensive. On private infrastructure you have more control over your costs than if you place everything with hyperscalers.” Hyperscalers will gladly counter that the “entry ticket” is lower with them.
That is why HPE also leans heavily on sovereignty, an issue that is particularly sensitive in Europe, says Rajagopalan. “In certain situations, data needs to stay within strictly defined boundaries and companies don’t want to move their data to the cloud at all. Data is the basis for AI. When you invest in your own infrastructure, you have the guarantee that it will be available at all times. This also applies to your data.”
Lots of goodwill, fewer results
Several studies show that while there is no lack of goodwill on the part of companies, implementation is less smooth. Rajagopalan does not turn a blind eye to the stumbling blocks that businesses face. “I’ve had a lot of conversations about this. In the past, you could run models in the cloud and see what came out. The use cases for AI are now much more diverse. Success stories start with the use case.”
“Only one in ten AI proofs of concept actually makes it into production,” says CTO Fidelma Russo, putting her finger on the sore spot. “The life cycle is complex. Infrastructure optimization is a long process that companies don’t want to invest time in. If the learning curve of a technology is too high, people quickly lose interest.”
“Companies are learning to embrace AI the hard way,” says McDonald, echoing his colleagues. “The three big obstacles are time to value, data management and compliance. There has not yet been a solution that addresses these problems simultaneously. The underlying infrastructure should not be a burden. An integrated solution eliminates infrastructure headaches, so companies only need to worry about how they want to use AI.”
If the AI learning curve is too high, people will quickly lose interest.
Fidelma Russo, CTO and VP Hybrid Cloud HPE
AI in three clicks
From this perspective, Private Cloud AI emerged. HPE markets the platform as a complete turnkey solution that companies can easily run on their own infrastructure. “We don’t want to just give customers servers and leave them to figure it out. This is a three-click turnkey solution available in various sizes and configurations. Private Cloud AI allows you to process data where it is. We bring AI to your data,” Russo continues.
Private Cloud AI is the mix you would get if you threw the HPE and Nvidia line-ups into a blender. The two companies come together not only on hardware, but also on software. During Discover, the tandem is expanded into a tricycle with the addition of Deloitte, which helps companies with the practical side.
Rajagopalan elaborates: “AI is a very technical workload. You are better off working with experts in the field who communicate in a language you understand. That way you create a win-win situation for all parties. For us, it’s not just about the technology, but also about how we deliver it. That is where you make the difference. With Private Cloud AI you buy a very fast car with the full support package.”
By making AI as accessible as possible, HPE also wants to include SMEs in the AI process, a segment that is lagging behind. “SMEs are limited by their available skills and budgets. You don’t want to invest in capabilities, but in proven use cases. All the necessary functions, from infrastructure to software to models, are integrated into the platform, so you don’t have to do any programming yourself.” The prerequisite, however, is purchasing at least one ProLiant server with Nvidia GPUs: the entry ticket will easily cost you several thousand euros.
Recognizable recipe with a unique ingredient
With Private Cloud AI, HPE is positioning itself directly against the hyperscalers. Microsoft, Google, AWS and co. lure you into their data centers to give you access to the most advanced AI models and capabilities, while HPE looks at how models can be adapted to the possibilities of the customer’s own environment. But HPE isn’t alone in taking a hybrid approach to AI.
Lenovo and Dell, HPE’s two major competitors in the server market, are also leaning heavily into the local component of AI. The vision sounds similar; the execution differs slightly. All three build on their historical background in hardware. Lenovo and Dell look not only at servers, but also at the PC as a vehicle for running AI locally. One big constant: Nvidia is never far away. Then there are parties like Nutanix that, like HPE, want to make a name for themselves with “bite-sized” AI solutions for hybrid IT environments.
HPE isn’t reinventing the wheel with Private Cloud AI, but no two wheels are the same. HPE’s recipe contains another ingredient that the competition doesn’t have. HPE, with its subsidiary Aruba, is a major player in the networking industry and is happy to play this card. It is therefore not surprising that HPE is digging deep into its pockets to acquire Juniper Networks. We have analyzed the takeover in detail.
“Hyperscalers are very strong in computing, but HPE stands out in networking. AI needs a powerful network. If your Ferrari can only go twenty kilometers per hour, such a fast car is of no use to you,” concludes Rajagopalan.