Collaboration is key to the advancement of quantum computing technology

Key Insights

Commercialization is both a challenge and a solution in the quantum space. Commercializing systems will help alleviate some of the anticipated investment shortages, as investors want a faster ROI than the quantum road map can currently promise.

Training can help end customers better understand the practical uses for quantum computing, as well as the capabilities of current systems and the approximate timeline of when systems will be powerful enough to meet unique business needs.

TBR’s Quantum Computing Market Landscape, which is global in scope, takes a deep dive into the quantum computing-related initiatives of key players in the space. It lays out the vendor landscape, details current leaders and laggards, and discusses the differing strategies of vendors in the market. The report discusses alliances as well as the tie-ins between quantum computing vendors and their nonquantum computing counterparts. It also explores predictions around which use cases and workloads will benefit first from quantum computing, as well as current customer sentiment around the technology.

Quick Quantum Quips: Investments in componentry accelerate

Welcome to TBR’s monthly newsletter on the quantum computing market: Quick Quantum Quips (Q3). This market changes rapidly, and the hype can often distract from the realities of the actual technological developments. This newsletter keeps the community up to date on recent announcements while stripping away the hype around developments.

For more details, reach out to Stephanie Long or Geoff Woollacott to set up a time to chat.

December 2020 Developments

Quantum componentry is of rising importance in the quantum computing ecosystem as vendors seek not only to achieve economic advantage but also to create quantum systems that can be scaled for commercial use. Recent investments in the quantum space suggest a preference for homegrown quantum solutions in the European Union (EU), prompting vendors not based in the EU to increase their local presence to capitalize on this regional trend. Startups continue to emerge to capture a piece of what promises to be a lucrative market, while vertical-specific nonquantum vendors innovate around quantum to improve their own offerings for end customers.

  1. Intel launched Horse Ridge II, the second generation of its cryogenic quantum control chip. Horse Ridge II improves on its predecessor’s scalability and flexibility. Scalability is currently a key challenge for most vendors in the quantum computing market, which struggle both to scale qubit volume enough to achieve economic advantage and to scale the volume of systems they can produce. These systems are very nascent, the componentry is complex, and it takes a team of scientists and engineers to create a single system. This is not a scalable model, but vendors such as Intel continue to work to simplify components and, in some cases, even outsource component manufacturing to make scaling possible.
  2. Phasecraft was founded in 2018 and is an emerging quantum computing player in the market. The vendor recently received £3.7 million in seed funding led by venture capital firm LocalGlobe, bringing its total funding since its inception to £5.5 million. Phasecraft has existing partnerships with Google and Rigetti and aims to leverage its expertise to bridge the gap between hardware and applications. Phasecraft currently employs 10 people.
  3. ColdQuanta was one of six finalists out of 80 participants in the NASA Entrepreneurs Challenge, winning $100,000. The purpose of the competition was to promote the development of and experimentation with new technologies and devices to advance NASA’s science exploration goals. ColdQuanta’s proposal was to leverage its cold atom technology to create a compact quantum gravity sensing device that could be deployed on a satellite or shuttle to monitor and map the Earth’s resources and assist in the assessment of natural disaster damage. While the challenge was not specific to quantum computing, the development of quantum devices further increases the visibility of quantum technology’s capabilities beyond just computation.
  4. Microsoft is increasing its EU presence through initiatives in Denmark, which TBR believes will prove advantageous as the EU has shown a desire to fund locally sourced quantum technologies. While Microsoft’s investments in the region go far beyond quantum computing and the company’s most recent announcement is tied specifically to Denmark’s sustainability initiatives, we believe there will be benefits to Microsoft’s quantum computing arm. Microsoft currently has over 1,000 employees in Denmark and has invested in a quantum computing research lab in collaboration with the University of Copenhagen and the Technical University of Denmark.
  5. AT&T, in conjunction with Purdue University, has developed a testbed for 5G research that leverages 5G millimeter wave technology and quantum cryptography to improve network security. Vertical-specific innovation such as AT&T’s work on quantum-safe network security is increasing as industries seek ways to coexist with emerging disruptive technologies such as quantum computing. While the primary goal of AT&T’s quantum research investments is to use the resulting technology internally, TBR believes it is likely that in the long term the vendor will also turn its discoveries into a marketable solution.

If you would like more detailed information around the quantum computing market, please inquire about TBR’s Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our latest version, published in December, focuses on the software layer of quantum systems.

Quick Quantum Quips: Quantum systems become increasingly accessible

Welcome to TBR’s monthly newsletter on the quantum computing market: Quick Quantum Quips (Q3). This market changes rapidly, and the hype can often distract from the realities of the actual technological developments. This newsletter keeps the community up to date on recent announcements while stripping away the hype around developments.

November 2020 Developments

Access to quantum systems is expanding, and vertical-specific use cases are beginning to emerge in commercially available forms. While quantum computing has yet to achieve economic advantage, these developments are necessary next steps toward this goal.

  1. IQM Quantum Computing (IQM), a quantum hardware startup based in Finland, was selected to produce Finland’s first quantum system. The company committed to delivering a 50-qubit system by 2024. IQM has a geographical advantage in the quantum computing market because it is located in Europe, where few vendors are investing in quantum hardware. IQM’s partnership with Atos on quantum provides IQM with increased visibility in the European Union.
  2. Zapata Computing closed its latest round of funding, a series B round that raised $38 million. Comcast’s and Honeywell’s venture capital arms both invested in this round of funding, with Honeywell as an existing investor and Comcast as a new addition. The investments in quantum computing from vendors working in adjacent fields demonstrate the value quantum computing can provide. TBR believes Zapata’s software capabilities are some of the most mature in the industry, making it a valuable long-term partner to Honeywell in the quantum computing market.
  3. Duke University has begun expanding its existing quantum computing facility at its Chesterfield location in Durham, N.C., adding 10,000 square feet. The expansion will be completed by March 2021, and the facility is one of five in the U.S. gaining support from a $115 million grant by the U.S. Department of Energy. Duke University’s quantum computing efforts focus on trapped-ion quantum systems. The systems in development at Duke will be purpose-built to solve specific problems.
  4. AlgoDynamix unveiled a behavior-forecasting use case for financial services customers underpinned by D-Wave quantum annealing technology. This offering is consumed as a cloud service and is significant in the quantum computing market for two reasons, according to TBR. First, it is a very specific vertical use case that leverages quantum computing technology. Second, it demonstrates that a quantum-specific vendor partnering with a vertical-specific vendor can create very practical applications in the greater quantum ecosystem. The analytics behind the use case are SaaS-based and do not require customer-specific data, making it relatively simple to onboard new customers (a minimal illustration of this annealing-as-a-service pattern follows this list).
  5. Honeywell unveiled its latest-generation, 10-qubit quantum system, named System H1. The computer leverages Honeywell’s quantum charge-coupled device (QCCD) trapped-ion technology, a differentiator in that the QCCD architecture makes it easier to upgrade the system throughout its lifetime. This enables existing customers to take advantage of system advancements as they are developed. System H1 can be accessed as a cloud service either directly through a cloud API or through partners including Microsoft Azure Quantum, Zapata or Cambridge Quantum Computing. All access to System H1 is billed as a subscription service.
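
To make the cloud-consumption model in items 4 and 5 concrete, the sketch below shows how a toy optimization problem is typically submitted to a D-Wave annealer over the cloud using the open-source Ocean SDK. It is purely illustrative: this is not AlgoDynamix’s proprietary offering, the two-variable problem is hypothetical, and API credentials are assumed to already be configured locally.

```python
# Illustrative only: a toy two-variable QUBO submitted to a D-Wave annealer
# via the open-source Ocean SDK. This is not AlgoDynamix's service; it simply
# shows the cloud-consumption pattern for quantum annealing.
from dimod import BinaryQuadraticModel
from dwave.system import DWaveSampler, EmbeddingComposite

# Minimize: -x - y + 2xy  (the optimum selects exactly one of x, y)
bqm = BinaryQuadraticModel({"x": -1.0, "y": -1.0}, {("x", "y"): 2.0}, 0.0, "BINARY")

# DWaveSampler() reads API credentials from the local Ocean configuration;
# EmbeddingComposite maps the logical problem onto the physical qubit graph.
sampler = EmbeddingComposite(DWaveSampler())
sampleset = sampler.sample(bqm, num_reads=100)

print(sampleset.first.sample, sampleset.first.energy)
```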

If you would like more detailed information around the quantum computing market, please inquire about TBR’s Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our upcoming version, publishing in December, will focus on the software layer of quantum systems. You can also sign up for our webinar on the topic, which will be held on Dec. 16 at 1 p.m. EST.

Quick Quantum Quips: New firms add their names to the quantum landscape

Welcome to TBR’s monthly newsletter on the quantum computing market: Quick Quantum Quips (Q3). This market changes rapidly, and the hype can often distract from the realities of the actual technological developments. This newsletter keeps the community up to date on recent announcements while stripping away the hype around developments.

For more details, reach out to Stephanie Long or Geoff Woollacott to set up a time to chat.

October 2020 Developments

Niche entities within the quantum ecosystem are starting to gain recognition as big-name scientists place bets on smaller firms with big ideas. At the same time, big brands not typically associated with quantum computing are beginning to throw their hats in the ring as monetization opportunities become nearer term and quantum’s potential upside grows. Meanwhile, firms are reallocating funding as the accessibility and functionality of quantum systems increase.

  1. Silicon Quantum Computing (SQC) gained new talent as John Martinis left his post as Google’s top quantum scientist to join the Australia-based startup. SQC, one of the few quantum computing organizations with a female lead, was founded in 2017 by Professor Michelle Simmons. Martinis said he joined the organization because he believes its unique approach to silicon-based fabrication at the atomic level could be a differentiator in the space. His contract with SQC will run for at least the next six months.
  2. Cambridge Quantum Computing (CQC) unveiled the latest updates to its quantum software development kit, t|ket>. The recent updates increase the number of supported quantum devices and improve circuit optimization and noise mitigation. CQC’s t|ket> is supported on Amazon Braket and IonQ systems and is also supported specifically for application development on Windows operating systems (a minimal sketch of working with t|ket> follows this list).
  3. Toshiba unveiled plans to develop commercial-grade quantum key distribution (QKD). The vendor has a deal inked with the National Institute of Information and Communications Technology (NICT) in Japan to install its QKD at multiple points on NICT’s network. The system is expected to be rolled out in 4Q20 and deployed in 2Q21. Toshiba intends to capitalize on this niche within the larger quantum ecosystem and currently is not planning to expand beyond the QKD space. TBR believes this demonstrates that classical computing vendors are preparing to update security protocols ahead of key advancements in quantum technology.
  4. IBM, in partnership with The Coding School, has committed to providing free quantum education to 5,000 students globally. This investment is aimed specifically at high school students with the goal of increasing overall accessibility and diversity among those studying quantum computing. There is currently a well-known skills shortage in the quantum computing space, and as the technology becomes more mainstream, the gap will widen. IBM is one of the leading vendors proactively investing in education at both the university and high school levels to help bridge this gap.
  5. D-Wave made headlines this month for an undesirable reason as its valuation was slashed nearly in half. This development came after a refinancing effort on the part of the annealing quantum computing company. Specifically, D-Wave’s initial $450 million valuation was cut to about $170 million during a restructuring that raised $40 million in funds, of which NEC Corp. contributed $10 million. D-Wave has undergone executive leadership changes recently, including the promotion of Alan Baratz to CEO to replace Vern Brownell, who retired. TBR believes D-Wave’s valuation slump may have to do with advancements elsewhere in quantum computing. We believe annealing is a valuable tool in the quantum ecosystem, but as gate-model quantum computers become more capable, they could replace quantum annealing in some applications.
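
For readers unfamiliar with what a quantum SDK such as t|ket> provides, the following is a minimal sketch using pytket, the Python interface to t|ket>. It builds a small two-qubit circuit and applies one of the kit’s optimization passes; the specific pass shown is just one illustrative example of the circuit-optimization capabilities noted in item 2, and no hardware backend submission is included.

```python
# A minimal pytket (t|ket>) sketch: build a two-qubit circuit and apply one
# of the SDK's circuit-optimization passes. Submission to a backend (e.g.,
# Amazon Braket or an IonQ device) is deliberately omitted.
from pytket import Circuit
from pytket.passes import FullPeepholeOptimise

circ = Circuit(2, 2)   # 2 qubits, 2 classical bits
circ.H(0)              # Hadamard on qubit 0
circ.CX(0, 1)          # entangle qubits 0 and 1

FullPeepholeOptimise().apply(circ)   # in-place circuit optimization pass
circ.measure_all()                   # add terminal measurements

print(circ.get_commands())           # inspect the resulting gate sequence
```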

If you would like more detailed information around the quantum computing market, please inquire about TBR’s Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our December 2020 iteration will focus on the software layer of quantum systems. Additionally, register for our Dec. 16 webinar on the topic.

With Project Apex, Dell aims to surround the public cloud and tame it

At the virtual Dell Technologies World on Oct. 21 and 22, the company painted a picture of the future, a picture it calls Project Apex. “Apex” can refer to a summit, but it is also the term used to describe the top predator in an ecosystem. Dell Technologies spokespeople did not clarify which definition they intended in naming the project, but it is likely that the predator definition is used widely within the organization. The company aims to use Project Apex to conquer not only public cloud providers, its biggest threat, but also competitors that have similar offerings, such as Hewlett Packard Enterprise (HPE) and Lenovo.

Project Apex is a combination of Dell Technologies Cloud and the company’s goal of offering everything “as a Service.” Dell Technologies Cloud is a multicloud system that includes public and private clouds as well as all of an organization’s assets. Dell Technologies is prepared to manage these assets, both on premises and in the company’s data centers. This system combines the benefits of public cloud — demand-based pricing, simplified operation and outsourced management — with those of on-premises resources — greater control and flexibility, and more efficient use of edge devices. Dell Technologies intends to surround and engulf public clouds.

Project Apex is similar to HPE’s GreenLake initiative, which has the tagline, “The Cloud That Comes to You.” It is not surprising that the two largest data center hardware companies have similar strategies. In fact, while Lenovo’s multicloud and consumption-based pricing strategies are promoted less than those of Dell Technologies and HPE, Lenovo is moving in the same direction. These common strategies are a response to a common threat: the public cloud. Public cloud providers are meeting an increasing share of organizations’ computing and storage requirements, reducing hardware providers’ revenue and profits. All data center vendors have cloud service providers (CSPs) as customers, but CSPs’ scale and ability to provide their own services drive down hardware companies’ margins. The public cloud is a threat, and these combinations of multicloud offerings and consumption-based pricing are the hardware companies’ countermeasure.

Dell Technologies paints a rosy picture of the future, with free movement of data and workloads from the edge to the cloud and everywhere in between. This kind of fluidity would make it much easier for companies to implement and refine large numbers of diverse applications, enabling responsive and flexible digital transformation. The future, of course, is never as bright as pictured in the brochures. But the technology world is making progress in that direction, and Dell Technologies, as the self-defined provider of “essential infrastructure,” is well positioned to deliver it, albeit incrementally.

Project Apex includes the major technologies and techniques that fuel digital transformation. Dell Technologies Chairman and CEO Michael Dell listed six: hybrid cloud, 5G, AI, data management, security and edge. Every large IT system will include these components as well as others. 5G is especially interesting because, apart from critical hardware components for data transmission, it is a software-defined system, giving networking the flexibility that underpins Project Apex.

Project Apex is more a direction than a goal, and Dell Technologies and other tech companies have been moving in that direction since virtualization and its inevitable offspring, the cloud, became important. With the increasing importance of edge devices and edge-generated data, the Project Apex vision, where the public cloud is part of the picture but is no longer dominant, becomes more plausible.

Right now, however, the public cloud is growing rapidly at the expense of traditional on-premises data centers, and hybrid multiclouds are mostly just a vision. There is progress in “as a Service.” Dell Technologies on Demand, the company’s “as a Service” portfolio, now has a $1.3 billion annual run rate, reflecting 30% year-to-year growth. Annual recurring revenue, which includes traditional financing and services, is $23 billion. Dell Technologies and the other hardware vendors cannot really see the light at the end of the tunnel, but they can describe it.

Quick Quantum Quips: Quantum commercialization is on our doorstep

Welcome to TBR’s monthly newsletter on the quantum computing market: Quick Quantum Quips (Q3). This market changes rapidly, and the hype can often distract from the realities of the actual technological developments. This newsletter keeps the community up to date on recent announcements while stripping away the hype around developments.

For more details, reach out to Stephanie Long or Geoff Woollacott to set up a time to chat.

September 2020 Developments

Recent developments in the quantum computing industry make one thing certain: The commercialization of quantum systems will occur during this decade. Visions of what commercialization will look like vary, from systems that closely resemble classical machines and are consumed in the cloud to something as small as a desktop form factor. Regardless of form, quantum systems will offer computational capabilities for commercial and academic use. TBR expects early production-grade systems to be used in a hybrid configuration with high-performance computing (HPC). As with many other parts of the economy being disrupted by technological innovation, the challenge will be finding skilled labor to harness the power of quantum systems for economic advantage.

  1. IBM unveiled its quantum road map in September. Included in the road map are the release of a 433-qubit system named Osprey in 2022 and a 1,121-qubit system named Condor in 2023, the latter of which IBM expects to pave the way for scalability. IBM also introduced a super-fridge, named Goldeneye, which is 10 feet tall and 6 feet wide. This development will support the eventual creation of a 1 million-qubit quantum system, which is expected to be released by 2030. This road map makes it clear that IBM expects commercialization of quantum computing within the decade, and therefore, the time has arrived for companies to explore becoming quantum-ready at scale.
  2. Zapata Computing unveiled a Scientific Advisory Board (SAB) to help better align its research agenda around quantum computing with the business needs of global companies interested in pursuing quantum computing within their road maps. Zapata seeks to expand scientific innovation more rapidly than it could do on its own while using SAB-initiated collaboration to pursue advancements that are targeted at customer demand. Expanding within academia remains a goal even though the SAB targets enterprise-level collaboration.
  3. D-Wave, in partnership with the Universities Space Research Association and Standard Chartered Bank, announced a quantum research competition with the goal of bringing quantum computing to nonprofits and universities. The competition aims to advance developments around quantum computing and AI, and the prize for the winner is free time to access the D-Wave 2000Q system.
  4. D-Wave appointed Daniel Ley as SVP of sales and Allison Schwartz as Global Government Relations and Public Affairs leader in September. These appointments highlight that D-Wave is targeting the public sector for sales of its quantum systems, and rightfully so as many governments have allocated budget dollars for quantum investments.
  5. Q-CTRL partnered with Quantum Machines to integrate Q-CTRL’s quantum firmware with Quantum Machines’ orchestration hardware and software offering. The quantum computing market is becoming crowded as startups emerge and more established firms devote some resources to quantum computing innovation. As such, smaller firms like Q-CTRL and Quantum Machines partnering to augment individual capabilities will become more commonplace the closer we get to commercialization at the end of the decade.
  6. Microsoft, in partnership with the University of Copenhagen, has discovered a new material that can be used to simplify topological quantum computing. Presently, large magnetic fields are necessary for computation to take place. The research combined aluminum, europium sulfide and indium arsenide to create a quantum wire device that would become an additional, necessary component of topological quantum systems. Ridding the system of the need for external magnetic fields is a major breakthrough because a strong magnetic field, while advantageous for the system itself, could have unintended negative impacts on other components or systems in close proximity to the quantum system.

If you would like more detailed information around the quantum computing market, please inquire about TBR’s Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our upcoming version, which will publish in December, will focus on the software layer of quantum systems. You can also sign up for our webinar on the topic, which will be held on Dec. 16 at 1 p.m. EST.

Cloud supports enterprise needs related to COVID-19, facilitating public cloud revenue growth

With Amazon Web Services (AWS), Microsoft, Google and Alibaba established as the IaaS cloud market leaders, Technology Business Research, Inc. (TBR) has noted an increase in partner ecosystem activity, particularly among IT services vendors, such as Accenture, Infosys and Cognizant, that are vying for a share of cloud services like migration and implementation.

Consolidation will accelerate as leaders embrace coopetition, as evidenced by Microsoft and Oracle allying to target AWS’ dominance in the IaaS space. This trend will further separate the leaders from the rest of the pack while creating an adjacent opportunity as customers deploy multivendor and hybrid cloud environments, which bodes well for infrastructure specialists such as IBM’s Red Hat and VMware, particularly as the latter maintains its emphasis on being vendor agnostic. Further, TBR expects rising enterprise appetite for technologies like containerized applications to facilitate PaaS market momentum in the near term as customers develop and test application frameworks internally before making them live on their hybrid architectures.

Public cloud remains the largest and fastest growing segment of the cloud market. The outbreak of COVID-19 has forced enterprise customers to increase their usage of cloud infrastructure and solutions, a trend that will benefit leading cloud providers and lead to further consolidation in areas such as IaaS and PaaS through the current forecast period. The Public Cloud Market Forecast details how hybrid deployments, new use cases for enterprise apps, and trends in emerging technology will make public cloud even more relevant in the future.

NVIDIA acquires ARM: Creating a next-generation AI platform

NVIDIA announced Sept. 14 an agreement to acquire ARM Holdings from SoftBank for $40 billion, subject to regulatory approval in the U.S., the U.K., the European Union and China. The acquisition had been rumored for several weeks, but the announcement generated negative comments from ARM customers. The two companies’ IP portfolios complement each other, especially in the context of rapidly growing AI workloads. TBR believes the combined company can successfully create new integrated AI hardware platforms while growing profitably in each former company’s primary business: graphics processors for NVIDIA and mobile CPUs for ARM.

Complementary IP and different business models

ARM is in the CPU business. NVIDIA is in the graphics processing unit (GPU) business, and NVIDIA GPUs are increasingly used in non-graphics AI processing applications. Both companies rely on microprocessor design to deliver value and grow their businesses, but the way each company monetizes its IP is very different. NVIDIA is a traditional product-based business; it makes processors and boards that it sells to equipment manufacturers and to cloud service providers. ARM follows a licensing model; it sells the rights to use its designs and instruction sets to equipment manufacturers that often modify the ARM designs to meet their needs.

One concern of current ARM customers is that NVIDIA will eventually move ARM to a product model, in which only NVIDIA would make hardware that incorporates ARM designs, shutting off customers’ ability to customize ARM-based chips. This would be a disaster for the major mobile OEMs, including industry behemoths Apple and Samsung. ARM chips power virtually all smartphones and tablets, and mobile vendors rely on derivative ARM designs for differentiated products. Apple makes its own modifications and recently announced that its PCs will be migrated from Intel to ARM processors, allowing the company to have a uniform hardware platform for all its major products. Samsung designs its own ARM processors but relies on third-party ARM designer Qualcomm for many of its products. To make matters more confusing, Samsung manufactures both Qualcomm and Apple processors.

NVIDIA announced that it would continue the current ARM licensing business model and, in fact, would license some of its GPU IP in the same manner. Nevertheless, ARM customers are concerned because strategically vital licensed IP would now be owned by a hardware vendor. TBR believes the ARM licensing model will continue for ARM designs and the same model will greatly benefit NVIDIA’s GPU business as well.

NVIDIA is transitioning from graphics to AI

NVIDIA is the dominant vendor in GPUs, and for that reason, if its processors were used only for graphics, its growth would be limited to the growth of graphics applications. GPUs, however, are also well-suited for AI deep learning applications because both graphics and deep learning rely on massively parallel processing.

2Q20 was a crossover quarter. For the first time, NVIDIA’s data center revenue, which is almost all AI-related, was greater than its revenue from graphics applications in PCs. NVIDIA data center revenue grew 167% year-to-year, and the company’s revenue will soon be dominated by AI applications in data centers. There is competition in AI processors from Google’s tensor processing unit (TPU) and from field-programmable gate arrays (FPGAs), as well as from several new AI processing entrants, including two from Intel. Nevertheless, NVIDIA enjoys an enormous lead in a very rapidly growing business.
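
The shared computational pattern behind graphics and deep learning is easy to make concrete. The NumPy sketch below (CPU-only, purely illustrative, with arbitrary shapes) shows that both a vertex transform and a neural-network layer reduce to large matrix multiplications over many independent elements, which is exactly the kind of work a GPU spreads across thousands of cores.

```python
# Illustrative only: the common structure of graphics and deep learning work.
# NumPy on a CPU is used here purely to show the shape of the computation;
# on a GPU the same matrix products are dispatched across thousands of cores.
import numpy as np

rng = np.random.default_rng(0)

# Graphics: transform 1 million 3D vertices with a single 4x4 matrix.
vertices = rng.standard_normal((1_000_000, 4))   # homogeneous coordinates
transform = rng.standard_normal((4, 4))          # model-view-projection matrix
projected = vertices @ transform.T               # every vertex is independent

# Deep learning: one dense layer applied to a batch of 1,024 inputs.
batch = rng.standard_normal((1024, 4096))        # input activations
weights = rng.standard_normal((4096, 4096))      # layer weights
activations = np.maximum(batch @ weights, 0.0)   # matmul + ReLU, element-wise independent

print(projected.shape, activations.shape)
```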

GPUs and CPUs working together

GPUs and CPUs coexist. Every device that uses GPUs for AI needs CPUs for all the other required processing. In data centers, the CPU is now almost always an Intel product. While ARM designs are increasingly powerful, as illustrated by Apple’s decision to use them for PCs, they are not yet used widely for data center devices. Where the GPU is doing most of the work, however, ARM-NVIDIA designs could be quite viable. ARM-NVIDIA designs would also work well in edge devices. This synergy positions NVIDIA well in a world where deep learning is becoming increasingly important.

Applications for deep learning are becoming more diverse, creating a variety of settings and requirements for CPU-GPU platforms. This proliferation of design requirements is a challenge for a product-based company like NVIDIA. The ARM licensing business model fits this diversifying market very well. TBR believes NVIDIA will first experiment with the licensing of older GPU designs, but then move rapidly to licensing GPU IP for all AI applications, greatly accelerating adoption of NVIDIA designs for AI and inhibiting growth of competing AI chip designs.

The ARM acquisition will accelerate AI

While NVIDIA and ARM are not competitors, which reduces antitrust concerns, many parties have expressed reservations about this acquisition. Both companies are very important, with NVIDIA dominating AI processors and ARM monopolizing mobile CPUs. There are also concerns about a single U.S. company controlling both of these critical component technologies. In the U.K., there is concern about the loss of jobs. TBR, however, believes this union will prove beneficial, certainly to the combined company, but also to other companies basing their business on the growth of AI.

Quick Quantum Quips: Quantum algorithms and infrastructure reach milestones

Welcome to TBR’s monthly newsletter on the quantum computing market: Quick Quantum Quips (Q3). This market changes rapidly, and the hype can often distract from the realities of the actual technological developments. This newsletter will keep the community up to date on recent announcements while stripping away the hype around developments.

For more details, reach out to Stephanie Long or Geoff Woollacott to set up a time to chat.

August 2020 Developments

Just as it did with its Selectric typewriters in the 1960s, IBM is successfully weaving its quantum computing thread through myriad aspects of the greater quantum ecosystem, underpinned by strategic sponsorships and the inclusion of partners in the IBM Quantum Experience. Amazon Web Services (AWS) is pushing back on this approach by offering a vendor-agnostic view of quantum cloud computing. Academia has also thrown its hat into the ring with ongoing innovation and advancements in quantum computing. The competitive landscape of quantum computing has begun to take on the look and feel of the early classical computing world; however, the modern industry has learned from the mistakes made with classical computing, and therefore progress can be more formulaic and swift. August 2020 developments are starting to tie pieces of these investments together, offering a glimpse of when the post-quantum world may arrive, and as advancements continue, that future state appears closer on the horizon than previously thought.

  1. AWS swiftly increased its presence in the quantum computing space by making its quantum computing cloud service, Braket, generally available. Underpinned by hardware from IonQ, D-Wave and Rigetti, Braket has been in testing mode for about eight months, during which time academic institutions and hand-picked customers, including Fidelity, were able to access and test the system. AWS intentionally selected the hardware vendors it partnered with because they all are underpinned by different quantum technology. AWS Braket comes to market to take on IBM and Microsoft, both of which have invested in quantum cloud services. However, a key difference is that IBM and Microsoft are also investing in their own quantum computing hardware while AWS has no current plans to do so.
  2. IBM continues to reach its targeted quantum computing goals, including doubling last year’s quantum volume of 32 to 64 in August. A 27-qubit system achieved this quantum volume milestone, with a combination of hardware enhancements and software improvements driving the result (a short note on how quantum volume is defined follows this list). While on its own this is not a particularly notable achievement in terms of commercial applicability, IBM’s ability to double quantum volume annually makes it clear that commercial applications are just around the corner in the quantum space, especially if the applications are leveraged in conjunction with high-performance computing. In total, IBM now has 28 quantum computers available through the IBM Quantum Experience.
  3. MIT led a weeklong summer camp on quantum computing for high school students called Qubit by Qubit. This is significant in the quantum computing realm because the pool of qualified personnel in quantum computing is so limited, and the technology is still such a long game that many high school students are unaware of the career opportunities in the space. However, the quantum space needs to develop a pipeline of students who eventually major in a quantum-related field for the technology to succeed long term; it cannot scale commercially with just a few thousand qualified personnel in the world to work with it. While COVID-19 has wreaked havoc on many aspects of everyday life, access to information has never been easier as experts offer lectures and other activities online, providing eager learners with far more opportunities to gain knowledge. The summer camp was delivered online with live instruction sessions and was paired with a yearlong course for students who chose to pursue it; both programs were created in partnership with The Coding School. TBR believes the camp focused on superconducting quantum computing because Amir Karamlou, an MIT alumnus and graduate research fellow, is focused on the topic and because IBM, one of the program’s technology sponsors, conducts its own research on superconducting quantum computing.
  4. The University of Sydney is working on developing an algorithm that can predict the noise impacting qubits in a given environment. While the project is still in the developmental phase, the researchers were able to map qubit noise in an experiment and believe the technique will be scalable, enabling users of quantum systems to adapt their algorithms to overcome the impact of that noise. The test was done on a 14-qubit IBM system accessed through the IBM Quantum Experience.
  5. Rigetti raised $79 million in a Series C funding round in August. The round was led by Bessemer Venture Partners, which added members of its team to Rigetti’s board of directors as a result. TBR notes that Rigetti faces an uphill battle, as hardware innovation is the most expensive aspect of quantum innovation and the majority of its quantum hardware competitors are larger, better-capitalized corporations with divisions devoted to quantum hardware. Rigetti continues to rely on successive funding rounds, which increases the risk that investors will become anxious to see ROI and either forgo further investment or seek faster repayment.
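
For context on the quantum volume figure in item 2: quantum volume is IBM’s benchmark for usable system quality, defined roughly as below. This is a simplified restatement of the metric introduced by Cross et al. in 2019, not IBM’s exact benchmarking procedure.

```latex
% Simplified statement of the quantum volume metric (Cross et al., 2019).
% d(n) is the largest depth at which width-n random "model circuits" still
% pass the heavy-output test (success probability > 2/3).
\[
  \log_2 V_Q \;=\; \max_{n}\ \min\bigl(n,\, d(n)\bigr)
\]
% Example: V_Q = 64 gives log2(64) = 6, meaning the 27-qubit system reliably
% ran model circuits 6 qubits wide and 6 layers deep; quantum volume measures
% usable circuit quality, not just raw qubit count.
```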

If you would like more detailed information around the quantum computing market, please inquire about TBR’s Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our most recent version, which focused on services, was released in June. Look for our next iteration in December, focused on middleware.

COVID-19 causes analytics services market to pause, allowing vendors to prove the true value of analytics and better train their AI models

All vendors tracked in TBR’s Digital Transformation: Analytics Professional Services Benchmark except Oracle expanded their analytics services revenue in 1Q20, albeit at a slower pace than the previous year, highlighting that optimizing IT operations through the use of analytics is becoming table stakes for buyers.

Accenture took over the No. 1 spot from IBM Services in revenue size in 1Q20, something TBR saw coming a couple of years ago. In TBR’s 1Q18 A&I Professional Services Benchmark, we wrote, “In 1Q14, when TBR launched the inaugural edition of this benchmark, Accenture’s quarterly A&I services revenue was just over half the volume of IBM’s. In 1Q18 Accenture was nearly 85% of IBM’s size in overall A&I services revenue, surpassing Big Blue in three service lines and one region. Though IBM made significant strides to reshape its services organization over the last four years, those efforts came too late to protect its market share.”

TBR’s Digital Transformation: Analytics Professional Services Benchmark addresses changes in leading digital transformation vendors’ strategies and performances, as well as their investments and go-to-market positions, as they relate to the ever-evolving analytics services market. The report includes use cases and analysis of how IT services vendors and consultancies manage technology partnerships, and it highlights region-specific market trends while benchmarking key service line, regional and operational data across 20 leading analytics services vendors.