Hewlett Packard Enterprise (NYSE: HPE) split from HP Inc. (HPQ) in 2015, with HP focusing on personal computers and printers and Hewlett Packard Enterprise focusing on enterprise servers, storage, networking, cloud, software, and services. HPE benefits from many of the same trends in enterprise computing that pushed Dell Technologies’ (DELL) stock up over 100% earlier this year. In my Dell article, I discussed how investors saw clearer evidence in its last earnings report, released on May 30, 2024, that demand for artificial intelligence (“AI”) enterprise servers could be a huge growth driver for the company moving forward.
The market recently became more excited about HPE’s prospects after the company released its second quarter fiscal year (“FY”) 2024 earnings report on June 4, showing AI systems revenue more than doubled from the first quarter. Investors hope that, now that HPE’s AI initiatives are picking up, its stock can deliver a performance similar to Dell’s. The stock rose around 11% the day after the company released the report.
The problem is that although the market may give HPE credit for potentially benefiting from AI adoption, its strategy carries several risks, including AI servers being less profitable than the traditional server business. Investors should proceed cautiously and not buy this stock purely on its AI opportunity. Additionally, although several analysts raised their price targets post-earnings, the stock price has already exceeded the consensus price target of $20.12.
This article will discuss HPE’s latest AI initiatives and review its latest earnings report. I will also examine the company’s valuation and some risks. Lastly, I will explain why I believe HPE is a hold.
The company may be an AI beneficiary
Although some people focus on HPE’s opportunity to sell AI servers to its customers, it’s important to remember that the company is a complete AI solutions provider, offering services well beyond hardware. HPE also provides software tools, security solutions, Machine Learning as a Service, consulting, and its GreenLake hybrid cloud platform. Hybrid computing is critical for companies using AI. Chief Executive Officer (“CEO”) Antonio Neri commented on the hybrid cloud’s importance at HPE Discover 2024:
AI will require hybrid cloud. But AI is not one single thing or a monolithic workload. To deploy AI, you must orchestrate hundreds of microservices, multiple AI models, specific accelerators, and connect many different data sources, all of which are highly distributed across your hybrid IT state. That is why you need a hybrid strategy. At the same time, you must maintain data governance, regulatory compliance, and security end-to-end, making on-premise private clouds essential to your hybrid mix.
As generative AI proliferates, HPE’s hybrid computing solutions, software, and other peripheral services should become more valuable over the long term. Collaborations with partners like Microsoft are also meaningful. When analysts asked Chief Financial Officer (“CFO”) Marie Myers at the BofA Securities 2024 Global Technology Conference about the partnership with Microsoft, she said, “HP is helping Microsoft extend the Azure AI platform to customers like OpenAI.”
Management also announced at its recent Investor Relations Summit @ HPE Discover 2024 that it was collaborating with NVIDIA (NVDA) on a new venture named NVIDIA AI Computing by HPE. This venture addresses three critical components needed to successfully utilize generative AI and large language models. NVIDIA’s CEO said at the conference that the three necessary components are a “model stack, a data stack, and a computing stack.”
NVIDIA will contribute a product named NIM as the model stack: prebuilt AI models that customers can customize for different tasks. NVIDIA defines NIM as “inference microservices that provide models as optimized containers — to deploy on clouds, data centers or workstations, giving them the ability to easily build generative AI applications for copilots, chatbots and more, in minutes rather than weeks.”
NVIDIA AI Enterprise will contribute the data stack or the operating system for AI, which provides data processing, vectorization (improves the processing speed of AI algorithms), semantic embedding (captures the meaning of data), and more.
HPE will contribute the computing stack, which includes the servers, storage, and networking required to run power-intensive AI models. If successful, this collaboration could drive significant revenue for HPE. HPE also plans to bring along its sales channels and partnerships, including Deloitte, HCL, Infosys (INFY), TCS, and Wipro (WIT).
Still, you might wonder why NVIDIA teamed up with HPE when it already has a collaboration with Dell (which I discussed in my article on Dell). Besides NVIDIA wanting as many outlets to sell AI chips as possible, HPE is one of the premier enterprise server companies and has built some of the world’s largest computers with NVIDIA. During HPE Discover, CEO Antonio Neri highlighted several high-profile projects the two companies have collaborated on:
NVIDIA has been our visionary partner who share our purpose and commitment to innovation. Over the years, HPE has partnered with NVIDIA to achieve amazing success with our customers in delivering best-in-class AI and supercomputing solutions. One great example of that is the newest Venado supercomputer at the Los Alamos National Laboratory, which is the first U.S. supercomputer to feature the NVIDIA Grace Hopper GPUs. Another example is our recently signed $200 million deal to build a new supercomputer for one of Japan’s largest research institutions, AIST, which we will build in collaboration with NVIDIA using our HPE Cray XD systems, featuring NVIDIA H200 Tensor Core GPUs.
The second and third points in the above image mention green computing and liquid cooling. In my first article on Super Micro Computer (SMCI), I discussed how that company took market share from Dell and HPE. Supermicro actively promotes “Green Computing” on its website, and part of that focus is its liquid cooling solutions. In that article, I quoted its CEO Charles Liang, who said (emphasis added), “Our high-power efficiency systems, free-air and liquid-cooling expertise has become one of our key differentiators of success.” However, Supermicro’s edge over its competitors may lie more in being the first to actively promote liquid cooling for servers than in the technology itself, because HPE has had liquid cooling technology for a while now.
In the past, liquid cooling was less widespread in servers because it can be expensive to install and often requires a significant upfront investment. With the lower-power chips used in traditional servers, air cooling was sufficient. However, with the increasingly power-hungry chips used to run generative AI workloads, liquid cooling has become a requirement for AI servers in data centers. Now that HPE emphasizes selling AI servers, management is talking much more about its liquid cooling technology. For instance, CEO Neri said on the Second Quarter 2024 Earnings Conference Call:
As accelerated computing silicon innovation advances, higher power density demands direct liquid cooling technologies. Building direct liquid cooling AI systems is complex and requires manufacturing expertise and infrastructure, including power, cooling, and water. With more than 300 HPE patents in direct liquid cooling, proven expertise and significant manufacturing capacity for this kind of systems, HPE is well-positioned to help customers meet the power demands for current and future accelerated compute silicon designs.
CFO Marie Myers also commented on the company’s liquid cooling prowess at the BofA Securities 2024 Global Technology Conference:
Liquid cooling is a skill that HPE has had for decades. And it goes back to the original sort of acquisitions that we did in some of the heavy compute space. And those decades of experience are now going to be very, very beneficial to customers that we talked about like Microsoft. But moreover, as the chips come on board with [NVIDIA AI chip] Blackwell, maybe it’s not quite understood that they’re going to be completely in need of liquid cooling. And I think that’s where HPE has more than 300 patents and decades of experience.
HPE management likely made those statements to emphasize to investors and potential customers that Supermicro is not the only one with liquid cooling technology. Selling AI solutions could provide a nice tailwind for HPE’s revenue and free cash flow growth prospects over the next several years. Let’s examine the company’s second-quarter results and discover how much of an impact AI has had so far.
HPE Second Quarter FY 2024 Earnings Results
HPE’s second-quarter results showed a significant improvement over the first quarter, when the company showed poor conversion of AI server orders into sales. CEO Neri said the following about the weak first-quarter performance on the company’s first quarter FY 2024 earnings call:
AI server demand remains very strong, evidenced by our growing cumulative order book. However, GPU [AI chip] availability remains tight, and our delivery timing has also been affected by the increasing length of time customers require to set up the data center space, power and cooling requirements needed to run these systems.
In other words, HPE couldn’t obtain enough AI chips from suppliers (likely NVIDIA), and customers were not ready to receive the servers because their data centers had not finished setting up the auxiliary equipment needed to run them.
The company delivered much better results in the second quarter. Cumulative AI systems orders reached $4.6 billion as of this quarter, showing strong demand for its AI systems. The big difference is that the company converted more of those orders into sales, with AI systems revenue growing from around $400 million to over $900 million sequentially. Management attributed the improvement to better supply of AI chips.
CEO Neri said on the second quarter call (emphasis added):
Our lead time to deliver NVIDIA H100 solutions is now between six and 12 weeks, depending on order size and complexity. We expect this will provide a lift to our revenues in the second half of the year.
In Dell’s last reported quarter, the company indicated the traditional enterprise server market was strong, boosted by high demand for AI servers. Dell’s Chief Operating Officer Jeff Clarke said on the company’s first quarter FY 2025 earnings call, “We’ve seen an expansion in the number of enterprise customers buying AI solutions, which remains a significant opportunity for us given we are in the early stages of AI adoption.” Dell’s first fiscal quarter ended on May 3. HPE CEO Neri made a similar point on its second-quarter FY 2024 earnings call (HPE’s fiscal second quarter ended April 30):
Enterprise customer interest in AI is rapidly growing, and our sellers are seeing a higher level of engagement. Enterprise orders now comprise more than 15% of our cumulative AI systems orders, with the number of Enterprise AI customers nearly tripling year-over-year. As these engagements continue to progress from exploration and discovery phase, we anticipate additional acceleration in enterprise AI systems orders through the end of the fiscal year.
The image below shows that the Server business is the only area with positive double-digit revenue growth, at 18% year-over-year. HPEFS in the image stands for HPE Financial Services, the company’s financing arm. Two pieces of good news stand out in the segment results. First, the company’s largest segment, Server, is also its fastest growing. Second, the Intelligent Edge and Hybrid Cloud segments are in cyclical downturns that appear to be bottoming; once those areas return to growth, they should further boost the company’s overall results. I will discuss HPE’s other segments in future articles.
The company’s second quarter FY 2024 net revenue of $7.2 billion exceeded the midpoint of its revenue guidance and beat the analysts’ consensus estimate by $372.61 million.
Margins are a potential issue as we move down the income statement. In the second quarter, the non-GAAP (generally accepted accounting principles) gross margin shrank by 310 basis points to 33.1%. The non-GAAP operating margin was 9.5%, down 200 basis points sequentially and year-over-year. CFO Marie Myers said on the earnings call that the gross margin decline was “driven by a mix shift from our higher margin Intelligent Edge revenue to Server revenue, plus an unfavorable mix within hybrid cloud.”
Since the Server segment is the fastest-growing segment, investors should watch its operating margins in future quarters. The CFO noted during the earnings call that “pricing remains aggressive in the server market, particularly in AI systems,” which suggests that although demand for AI is heavy, a price war over AI servers between Dell, Supermicro, HPE, and other manufacturers is possible.
Cash flow from operations to sales is up to 19.51%. This metric shows how efficiently a company converts sales into cash flow. HPE is significantly better than its peers at that conversion, and the metric is trending up, a potential positive sign for free cash flow (“FCF”) growth.
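For readers who want to reproduce this kind of ratio, here is a minimal sketch of the calculation. The dollar inputs are placeholders for illustration only, not HPE’s reported figures, and the 19.51% above is presumably measured over a longer trailing period than a single quarter.

```python
def cfo_to_sales(operating_cash_flow: float, revenue: float) -> float:
    """Operating cash flow as a percentage of revenue."""
    return operating_cash_flow / revenue * 100

# Hypothetical trailing figures in $ billions (placeholders, not HPE's reported numbers)
print(f"CFO-to-sales: {cfo_to_sales(5.0, 28.0):.2f}%")  # prints 17.86% for these placeholder inputs
```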
CFO Marie Myers said the following about second quarter FCF on the earnings call:
We generated $1.1 billion in cash flow from operations and $610 million in free cash flow this quarter. HPE typically consumes significant amount of cash in the first half of the year and then generates cash in the second half. We are ahead of traditional free cash flow patterns thus far in fiscal 2024, given higher than expected net income in Q2, prepayments for AI systems, and timing of working capital payments.
HPE ended the second quarter with $2.67 billion in cash and equivalents against $7.49 billion of long-term debt. The company has a debt-to-equity ratio of 0.52, meaning its debt is roughly half its equity. The company has net debt of $9.252 billion and trailing 12-month EBITDA (Earnings Before Interest, Taxes, Depreciation, and Amortization) of $4.893 billion, so its net debt-to-EBITDA ratio in the second quarter was 1.89. Generally, a company with a net debt-to-EBITDA ratio below three has a lower risk of defaulting on its debt. HPE’s interest coverage ratio of 3.4 means its EBIT covers its interest payments by an adequate margin.
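As a quick sanity check on the leverage math, the snippet below recomputes the net debt-to-EBITDA ratio from the figures quoted above. It is only a sketch of the arithmetic; the EBIT and interest expense behind the 3.4x coverage figure are not broken out in this article, so that ratio is simply restated in a comment.

```python
# Figures quoted above, in $ billions
net_debt = 9.252
ttm_ebitda = 4.893

print(f"Net debt-to-EBITDA: {net_debt / ttm_ebitda:.2f}")  # ~1.89, under the rule-of-thumb ceiling of 3
# Interest coverage = EBIT / interest expense; the article cites roughly 3.4x,
# but the underlying EBIT and interest figures are not shown here.
```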
In its updated guidance, the company raised its full-year FY 2024 revenue growth outlook from 0-2% to 1-3%. CFO Marie Myers said on the second quarter earnings call, “We expect a materially stronger second half led by AI systems, traditional servers, and storage, networking and HPE GreenLake.” However, the company lowered its GAAP operating profit growth guidance from 7-11% to 2-6%. Ouch.
This earnings report displayed mixed results. On the favorable side, the proliferation of AI should boost revenue growth. On the unfavorable side, the first-quarter and second-quarter earnings reports brought up several risks.
Risks
Although sales of AI systems should boost revenue growth, they could hurt margins. Some analysts worry that sales of AI servers might be less profitable in the long term than initially thought for companies like HPE, Dell, and Supermicro, as AI chip suppliers like NVIDIA charge higher prices for their chips and squeeze server manufacturers’ margins. HPE may ultimately only produce meaningful margins on the software and other services it provides outside the AI servers.
Additionally, the server industry is commodity-like with little to no moat, meaning no company can differentiate itself for a meaningful period and competition is often price-based. Dell is also an early leader in AI servers: HPE’s AI systems backlog of $3.1 billion trails Dell’s last reported backlog of $3.8 billion, and there is some uncertainty about HPE’s strategy to catch up. Since Supermicro and Dell got out of the gate earlier in providing AI servers, HPE may eventually decide to compete on price, worsening margins.
In the past, Dell emphasized using NVIDIA chips almost exclusively, while HPE has used many chips that compete with NVIDIA’s. Some analysts therefore think NVIDIA favors Dell over HPE, which could explain why Dell had enough NVIDIA chips to satisfy demand in the same quarter (HPE’s first quarter FY 2024) that HPE had difficulty obtaining enough AI chip supply to meet its end demand. Some fear that each time NVIDIA brings a new chip to market, it might reward Dell’s loyalty by supplying the new chip to Dell before HPE. If that scenario occurs, HPE may have a harder time competing, as server customers factor in that they can obtain the latest NVIDIA chips faster from Dell.
Valuation
HPE has a price-to-FCF ratio of 9.30, below its three- and five-year medians, suggesting an undervalued stock.
HPE sells at a price-to-earnings (P/E) ratio of 15.25, well below Dell’s 27.81. Still, the valuation gap may be justified on a P/E basis because Dell grew its earnings per share (“EPS”) 67% year-over-year while HPE’s declined 25%.
The following image shows HPE’s forward P/E and year-over-year annual EPS growth estimates from FY 2024 to FY 2026. By FY 2025, the Intelligent Edge and Hybrid Cloud businesses should be out of their cyclical downturns, so let’s compare the company’s FY 2025 EPS growth estimate of 9.86% to its FY 2025 forward P/E of 10.04. Since the forward EPS growth rate and forward P/E nearly match, HPE trades at roughly fair value at the June 26, 2024, closing price of $20.92.
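One way to frame that “growth roughly equals the multiple” argument is a simple PEG-style ratio, sketched below using the forward figures cited above. This is a rough heuristic, not a full valuation model.

```python
# Forward figures cited above
forward_pe = 10.04          # FY 2025 forward P/E
forward_eps_growth = 9.86   # FY 2025 consensus EPS growth estimate, in %

peg_style_ratio = forward_pe / forward_eps_growth
print(f"Forward P/E divided by forward EPS growth: {peg_style_ratio:.2f}")  # ~1.02
# A ratio near 1.0 is commonly read as roughly fair value under this heuristic.
```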
The following chart shows HPE’s shareholder yield, which is the company’s capital return via dividends, net share buybacks, and debt reduction. Generally, the market overvalues a stock when the shareholder yield is low and undervalues it when it is high. HPE’s shareholder yield of 5.77% is closer to the overvalued end of the range than the undervalued end.
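For context, shareholder yield is simply the sum of those three capital-return components divided by market capitalization. The sketch below shows the calculation with hypothetical inputs for illustration; they are placeholders, not HPE’s reported capital returns.

```python
def shareholder_yield(dividends: float, net_buybacks: float, debt_paydown: float,
                      market_cap: float) -> float:
    """Capital returned via dividends, net buybacks, and debt reduction as a % of market cap."""
    return (dividends + net_buybacks + debt_paydown) / market_cap * 100

# Hypothetical inputs in $ billions (placeholders, not HPE's actual figures)
print(f"Shareholder yield: {shareholder_yield(0.7, 0.4, 0.5, 27.0):.2f}%")  # 5.93% for these inputs
```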
Seeking Alpha’s Quant rates the stock’s valuation a C.
HPE is a Hold
Although HPE will likely benefit from AI adoption, the market has likely already priced in that potential upside, which may be why the stock barely budged after the company announced its collaboration with NVIDIA at HPE Discover 2024. Considering the company’s risks and the fact that other technology investments may benefit more from increasing generative AI adoption, investors may want to avoid this stock at its current price. Put it on a watch list and consider buying if the shareholder yield rises above 10%. I rate HPE a Hold.