Quick Quantum Quips: Quantum commercialization is on our doorstep

Welcome to TBR’s monthly newsletter on the quantum computing market: Quick Quantum Quips (Q3). This market changes rapidly, and the hype can often distract from the realities of the actual technological developments. This newsletter keeps the community up to date on recent announcements while stripping away the hype around developments.

For more details, reach out to Stephanie Long or Geoff Woollacott to set up a time to chat.

September 2020 Developments

Recent developments in the quantum computing industry make one thing certain: The commercialization of quantum systems will occur during this decade. The vision of what quantum commercialization will look like varies from something very similar to classical systems and consumed in the cloud to something as compact as a desktop form factor. Regardless, quantum systems will have computational capabilities for commercial and academic use. TBR expects early production-grade systems to be used in a hybrid configuration with high-performance computing (HPC). As with many other elements of the economy being disrupted by technological innovation, the challenge will be in finding skilled labor to harness the power of quantum systems for economic advantage.

  1. IBM unveiled its quantum road map in September. Included in the road map are the release of a 433-qubit system named Osprey in 2022 and a 1,121-qubit system named Condor in 2023, the latter of which IBM positions as a milestone on the path to scalable systems. IBM also introduced a super-fridge, named Goldeneye, which is 10 feet tall and 6 feet wide. This development will support the eventual creation of a 1 million-qubit quantum system, which is expected to be released by 2030. The road map makes clear that IBM expects commercialization of quantum computing within the decade, and therefore, the time has arrived for companies to explore becoming quantum-ready at scale.
  2. Zapata Computing unveiled a Scientific Advisory Board (SAB) to help better align its research agenda around quantum computing with the business needs of global companies interested in pursuing quantum computing within their road maps. Zapata seeks to expand scientific innovation more rapidly than it could do on its own while using SAB-initiated collaboration to pursue advancements that are targeted at customer demand. Expanding within academia remains a goal even though the SAB targets enterprise-level collaboration.
  3. D-Wave, in partnership with the Universities Space Research Association and Standard Chartered Bank, announced a quantum research competition with the goal of bringing quantum computing to nonprofits and universities. The competition aims to advance developments around quantum computing and AI, and the prize for the winner is free time to access the D-Wave 2000Q system.
  4. D-Wave appointed Daniel Ley as SVP of sales and Allison Schwartz as Global Government Relations and Public Affairs leader in September. These appointments highlight that D-Wave is targeting the public sector for sales of its quantum systems, and rightfully so as many governments have allocated budget dollars for quantum investments.
  5. Q-CTRL partnered with Quantum Machines to integrate Q-CTRL’s quantum firmware with Quantum Machines’ orchestration hardware and software offering. The quantum computing market is becoming crowded as startups emerge and more established firms devote some resources to quantum computing innovation. As such, smaller firms like Q-CTRL and Quantum Machines partnering to augment individual capabilities will become more commonplace the closer we get to commercialization at the end of the decade.
  6. Microsoft, in partnership with the University of Copenhagen, has discovered a new material combination that can simplify topological quantum computing. Presently, large magnetic fields are necessary for computation to take place. The research combined aluminum, europium sulfide and indium arsenide to create a quantum wire device that could serve as a necessary component of topological quantum systems. Ridding the system of the need for magnetic fields is a major breakthrough because a strong magnetic field, while advantageous for the system itself, could have unintended negative impacts on other components or systems located in close proximity to the quantum system.

If you would like more detailed information about the quantum computing market, please inquire about TBR’s Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our upcoming version, which will publish in December, will focus on the software layer of quantum systems. You can also sign up for our webinar on the topic, which will be held on Dec. 16 at 1 p.m. EST.

HPE buys Cray: Is this the definition of insanity?

We know Moore’s law drives consolidation in the industry. What we do not know, however, is if any two hardware-centric vendors can come together and build a business accretive to the top line. Michael Blumenthal tried this strategy by combining Burroughs and Sperry to create Unisys, and that certainly did not work. More recently Dell acquired EMC, and while the jury remains out on that consolidation play, early indications have been positive.

HPE hardware acquisition history

Hewlett Packard Enterprise (HPE) has deployed this strategy multiple times over the years. Today HPE announced it will acquire Cray for $1.3 billion, which equates to $35 a share, or a $5.19 premium over yesterday’s closing price of $29.81. Similar hardware-centric deals HPE has conducted over the years include:

  • Acquiring Apollo after its first-mover advantage in engineering workstations was eclipsed by Sun Microsystems
  • Acquiring Compaq after it had acquired Tandem and Digital Equipment Corporation (DEC), which had likewise struggled as much in business model integration as with technology integration
  • Acquiring SGI, which was hemorrhaging cash but was a strategic HPE OEM partner that HPE could not afford to let fail or be acquired by a rival
  • And now Cray, the last of the venerable high-end niche vendors to double down on higher-margin high-performance computing (HPC)

HPC becomes mainstream as accelerators keep pace with big data compute demands

HPC certainly has growing appeal. That appeal stems from several economic drivers:

  • As always, Moore’s law theory gets borne out in reality as cost and form factors decrease to the point where distributed computing (a fundamental tenet of Ken Olsen’s original business plan for DEC in the early 1960s) can be done at the board level if not the chip level. Graphics processing units (GPUs), tensor processing units (TPUs) and field-programmable gate arrays (FPGAs) can keep pace with increasing demands coming from big data analytics.
  • Supply chain excellence and software tuning of these commodity components can allow for custom-designed systems, purpose-built to the compute demands of the HPC customers.
  • IBM certainly keeps innovating in HPC, especially with its RISC-based Power chips suited for analytics.
  • Lenovo has taken a huge bite out of HPE’s share of the HPC space through its design engineering and supply chain flexibility, manufacturing commodity Intel boards at scale through its global manufacturing footprint. Per Lenovo, it went from having none of the top 500 HPC installations in the world in 2014 to having 140 of them in 17 countries in 2018. Much of this success came at HPE’s expense.

Will the acquisition go against type and be viewed as a sane move?

A definition of insanity is to engage in the same activity over and over again while expecting a different outcome. HPE’s history has been to acquire struggling firms in niche hardware areas in hopes of growing share. With fewer and fewer silicon-centric vendors left standing, however, the odds of success may well increase over time.

The Cray acquisition may well aid HPE in stalling Lenovo’s recent successes in the HPC space, but Lenovo’s operating best practices are well suited to commoditizing markets. Supply chain excellence honed to attack the hyperscale market brings decided cost advantage to the HPC space. Talent recruited from Intel and other firms likewise gives Lenovo the software tuning competencies necessary to extract fit-for-purpose performance from commodity chipsets.

Quantum also looms large on the horizon as the next chapter in meeting the high-end compute requirements needed to solve the world’s intractable problems. Seven-nanometer process technology may not be the end of the line for silicon innovation, but the end is certainly getting close. This acquisition seems poised to satisfy the immediate here and now while once again being eclipsed by niche innovation elsewhere, with that elsewhere being the quantum domain in three to five years.

Recent articles have come out suggesting HPE is cutting back on quantum research, intending instead to extract more life out of the traditional computing space with processors for deep learning and analytics. HPE has certainly acquired a company that has been admired for decades as being the “tip of the spear” in silicon innovation. HPC innovations certainly can work for today, but that tip of the spear will be blunted by the inexorable laws of physics, making further silicon innovation increasingly more challenging. Future offerings in what has been Cray’s core market will come from quantum innovators. Once quantum reaches economic advantage over high-end classical computing, the industry will see yet another round of business exits for those vendors lacking transformation fearlessness. Like many of HPE’s other hardware-centric acquisitions, this move appears to have reasonable short-term impact and limited long-term upside.

Commoditization economics and emerging workloads disrupt the data center landscape

Commoditization mitigation strategies require business model shifts and an ever-watchful eye on exascale cloud entrants

Volume or value?

Toward the end of 2018 in the data center market, two distinct vendor strategies emerged: Vendors began either increasing sales volume or selling lower-volume but higher-value solutions. TBR believes that in 1H19, now that vendors have determined their camps, they will begin to craft competitive strategies directly targeting specific peers. For example, Dell EMC has publicly stated its intent to increase its market share in both servers and storage, and we believe the vendor will target key competitors to gain this share. Similarly, Lenovo’s large-scale data center investments imply significant competitive goals.

In February Lenovo unveiled TruScale Infrastructure Services. This directly competes with Hewlett Packard Enterprise’s (HPE) GreenLake and Dell EMC’s Cloud Flex. It also addresses customer demand for private cloud infrastructure that is financed like a public cloud offering. TruScale is available for Lenovo’s entire stack of data center infrastructure solutions. In April Lenovo unveiled a server portfolio refresh, which likely reinforces its TruScale solutions and increases its competitive edge against Dell EMC and HPE.

TBR believes that during the next few months, Dell EMC and HPE will fight back against Lenovo’s marketing tactics to preserve market share. HPE has an advantage in that it is pursuing value-centric data center sales, so it is likely willing to concede less-profitable sales to Lenovo or Dell EMC. Dell EMC’s stated objective to increase market share in servers and storage will increase competition between the company and Lenovo as both aim to scoop up HPE’s lower-margin customers.

ODM participation heats up as commoditization drives provisioning simplicity

As data center hardware becomes increasingly commoditized and software capabilities grow more advanced, we believe data center vendors will increasingly find themselves competing against ODMs, especially for larger deals. Smaller customers will still show a preference for OEMs, as they need the additional software and services provided with OEM data center solutions. Lenovo’s manufacturing capabilities give the company an advantage in the hyperscale space, where its past financials illustrate some successes, and enable the vendor to differentiate from its OEM peers.

On the hyperscale front, ODMs are rising to dominance, but OEMs such as Lenovo remain a force to be reckoned with in the space. As cloud becomes an increasingly central piece of IT environments, public cloud providers seek ways to expand their environments as cost-effectively as possible to preserve profits. TBR believes very large enterprises are also likely to explore leveraging hyperscale vendors for their on-premises environments if it is cost-effective.

Consumption-based pricing models tie to the commoditization march

TBR’s Hyperconverged Platforms Customer Research continues to highlight the correlation between private cloud installations and hyperconverged infrastructure (HCI). The most recent findings indicated that 80% of respondents leveraged their HCI purchase for a private or hybrid cloud environment. Since customers are already turning to HCI for cloud, it is a logical next step for vendors to price HCI like a public cloud solution, intensifying competition in the space.

With their channel partners also engaged, Dell EMC, HPE and Lenovo are the three main players in the consumption-based pricing space. Their offerings are not limited to HCI, but HCI is among the solutions that can be purchased this way. The key value proposition of consumption-based pricing for data center vendors is the ability to bundle software and services into hardware consumption deals, which is likely to boost margins on the solutions. Further, it tends to guarantee larger deals, as these arrangements often lock customers in for a predetermined term with early termination penalties.