Posts

Can Gelsinger restore the rule of Moore’s Law?

On Jan. 13 news broke that Intel CEO Bob Swan will retire and be replaced by Intel alum and current VMware CEO Pat Gelsinger, effective Feb. 15. Although Intel is facing challenges and has suffered setbacks in the last few years, it would be an exaggeration to say the company is in trouble. It dominates market share in PC and server CPUs and extracts far greater profits from those devices than the PC and server manufacturers themselves do.

Despite share gains by AMD, Intel’s hegemony over the x86 instruction set and its related silicon will assure continued profits for many years, as any transition in such critical components is slow and careful. Swan came to the corner office when Intel was struggling with issues that have persisted and, in some cases, intensified. Some of these challenges have been self-inflicted from an inside-out perspective, while others have been outside-in threats that internal deficiencies have compounded. Even so, a growing number of activist investors have been pressuring Intel’s board for a leadership change.

The inside-out challenge is to marry manufacturing scale and agility

Intel has enforced the rule of Moore’s Law on the industry for years. Faster, better, cheaper form factors have almost single-handedly underpinned the digitization of business and the consumerization of IT throughout much of our daily lives. The inexorable march has seen the rapid rise and fall of business entities as proprietary minicomputer architectures gave way to the Microsoft Windows/Intel CPU, or Wintel, juggernaut, which enjoyed a virtual lock on the market. Intel built its share dominance on two core best practices: chip design and manufacturing.

Each new generation of CPUs requires both a thorough redesign and a massive, technically challenging improvement in the chip manufacturing process. For decades, Intel has relied not only on its technical skills but also on its massive revenue to stay ahead of competitors. Other chip vendors rely on third-party chip factories, called foundries. Over the past three years, the main independent foundry, Taiwan Semiconductor Manufacturing Company (TSMC), has outperformed Intel, as has Samsung. The technology race in chip manufacturing centers on the process node, the size of the smallest features that can be fabricated on the chip. Intel has not yet produced its promised 7nm chip, and its current road map states it will not produce 5nm chips until 2023, whereas TSMC and Samsung produced 5nm chips in sample quantities in 2019.

Because of the delay in manufacturing technology, Intel has not been able to meet demand, resulting in PC vendor backlogs. These backlogs have been beneficial for PC vendors, reducing price competition and increasing their margins. Intel’s own margins are down, but not severely. The constrained supply made it easier for Intel’s main competitor in PC CPUs, AMD, to gain market share, but because of buyer conservatism and the long lead time necessary to design new PCs, the erosion has been small.

On the server side, Intel has to embrace a more agile manufacturing philosophy and a willingness to essentially become a contract manufacturer of third-party designs as the consolidated Wintel form factor gives way to multiple designs in what is commonly called accelerated computing. At the same time, market uniformity and scale are also giving way. Powerful, small, low-cost form factors are going to proliferate as digitization continues. Edge compute and various smart things will contribute to this shift, and the ability to run smaller manufacturing runs will become paramount.

Revamping Intel’s development process and pivoting to more agile manufacturing will be two core internal challenges confronting Gelsinger, but not the only ones. The outside-in pressures will mount as well.

Quick Quantum Quips: Integrations and abstractions quicken quantum readiness

Welcome to TBR’s monthly newsletter on the quantum computing market: Quick Quantum Quips (Q3). The overall activity around quantum computing has been accelerating as more pieces of the quantum ecosystem inch closer to commercial readiness.

For more details, reach out to Geoff Woollacott or Stephanie Long to set up a time to chat.

May 2020 Developments

Selecting five key announcements becomes more challenging as quantum activity begins gaining market momentum. The highlighted excerpts illustrate the growing visibility quantum has among governments, venture capitalists, and a variety of researchers seeking to gain familiarity with the technology for application discovery. That said, TBR recently conducted interviews for next month’s Quantum Computing Market Landscape, which will focus on professional services. Respondents from services firms are less optimistic that economic advantage will be achieved within the next year, but they believe large enterprises need to be experimenting with quantum tool sets now for application discovery so they can capitalize on the technology when it does achieve economic advantage. Similarly, governments are backing different entities to ensure their nation-states have competencies in quantum technology as a way to protect their economies and increase national security.

  1. Rep. Morgan Griffith, R-Va., introduced a bill in Congress, the Advancing Quantum Computing Act, that would mandate that the Department of Commerce and other relevant agencies conduct four studies on quantum computing to assess the technology’s impact on U.S. commerce and society. Within two years of the bill’s passage, the agencies are to conduct studies that 1) rank industries and the extent to which they are leaning into quantum, 2) calibrate all federal interagency activities related to quantum, including agencies with regulatory oversight of private sector quantum activity, 3) rank the activities of at least 10 and no more than 15 other countries, including how those countries’ activities compare to U.S. quantum market maturity, and 4) assess the current, emerging and long-term risks in the quantum ecosystem supply chain. The Department of Commerce would then have six months to share its findings with the House Energy and Commerce and Senate Commerce, Science and Transportation committees. While the bill is well intentioned, TBR questions the lengthy timeline for these studies, believing the quantum market will evolve rapidly and that six-month-old market information could already be dated by the time the relevant agencies act on it.
  2. Australia aims to grow its domestic quantum computing capabilities through a working partnership between the Pawsey Supercomputing Centre and Quantum Brilliance, intended to begin implementing the recommendations of the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Australia’s national science agency. Quantum Brilliance uses diamond to develop quantum computers that operate at room temperature without the need for cryogenics or complex infrastructure. Quantum Brilliance CEO Andrew Horsley makes the bold claim: “Diamond means that instead of filling up a room with cryogenics, you can hold a quantum computer in your hand.” Quantum Brilliance machines will be on-site at the Pawsey Supercomputing Centre and will be available to “anyone, anywhere” interested in working on quantum engineering.
  3. Entropica, a Singapore-based quantum computing software startup, raised S$2.6 million in seed funding led by tech investor Elev8. U.S.-based Rigetti Computing also participated in the round. Entropica’s co-founders Tommaso Demarie and Ewan Munro previously worked at Singapore’s Centre for Quantum Technologies before starting Entropica in May 2018. Entropica is another entrant aiming to build the integration layer between classical and quantum computing that is vital to broader commercial adoption of the technology. In addition to Rigetti, Entropica cites collaborations with quantum systems manufacturers IBM and Microsoft.
  4. QuTech and Intel released a paper on their work to speed up the testing and validation of quantum materials and devices. The approach reduces the complexity of cryogenic components by allowing them to be readily integrated into any type of cryostat. The researchers claim the approach increases the number of wires that can operate at cryogenic temperatures in the cryostat by an order of magnitude, mitigating an input/output bottleneck created by the limited number of wires that can currently operate at those temperatures. The cryogenic multiplexer platform is based on common CMOS componentry and has the potential to reduce the form factor of existing superconducting systems. QuTech is a collaboration between TU Delft and TNO.
  5. IBM announced enhancements to its Qiskit Quantum Algorithms and Applications (QA&A) toolset. First, IBM announced a circuit library consisting of feature-rich circuits suited to practical applications or to the study of complex theoretical properties; second, IBM introduced the Optimization module, a library that lets researchers and beginners develop and experiment with quantum combinatorial optimization; and third, IBM rebuilt Qiskit’s core algorithmic tools to optimize them for research and prototyping. The Qiskit enhancements have also been fashioned to address users with eight different backgrounds, from beginners to experienced quantum information theorists, as well as software engineers, quantum chemists, optimization researchers, finance researchers, physics researchers, and machine learning researchers. A brief illustrative sketch of the circuit library and the Optimization module appears after this list.
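To make the circuit library and Optimization module a bit more concrete, the sketch below is a minimal, hypothetical example rather than anything drawn from IBM’s announcement: it pulls a pre-built parameterized circuit (EfficientSU2) from the circuit library and expresses a toy two-variable problem with the optimization tooling. Module paths follow the current standalone qiskit and qiskit-optimization packages; the 2020-era module layout differed, and the problem itself is purely illustrative.

```python
# Minimal sketch, assuming the standalone qiskit and qiskit-optimization
# packages are installed; the toy problem and circuit choice are illustrative only.
from qiskit.circuit.library import EfficientSU2
from qiskit_optimization import QuadraticProgram

# A feature-rich, pre-built parameterized circuit from the Qiskit circuit library.
ansatz = EfficientSU2(num_qubits=3, reps=1)
print(ansatz.decompose().draw())

# A toy combinatorial (QUBO-style) problem expressed with the optimization tooling.
problem = QuadraticProgram("toy_problem")
problem.binary_var("x")
problem.binary_var("y")
problem.minimize(linear={"x": -1, "y": -2}, quadratic={("x", "y"): 3})
print(problem.export_as_lp_string())
```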

If you would like more detailed information around the quantum computing market, please inquire about TBR’s Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our latest version was published in December. The next iteration, scheduled to publish in June, will focus on the quantum-related professional services being deployed to increase business awareness and build the technical skills that will be in short supply once quantum’s economic advantage becomes reality.

And, lastly, on behalf of the entire TBR team, we hope you stay healthy and safe in these unique times.

Not your father’s partner programs: How vendors and partners are evolving cloud ecosystems

Chicken or egg first? For partner programs, that makes a big difference

As Intel (Nasdaq: INTC), Microsoft (Nasdaq: MSFT) and Cisco (Nasdaq: CSCO) created the modern computing era in the 1990s, partner programs were at the forefront. The success of these companies, and of the distributed computing era in general, was largely built on the backs of technology and distribution partners. In fact, these companies still rely on partners to drive a majority of their revenue today. The same cannot be said for the cloud era of IT, which was led by the direct sales strategies of top vendors such as Salesforce (NYSE: CRM) and Amazon Web Services (AWS; Nasdaq: AMZN). These two vendors became leaders in their respective cloud markets by selling directly to customers, bypassing distribution partners altogether. Partners are certainly playing a larger role now, but that timing does affect their position in the cloud value chain. Without a well-defined value-add in the self-service, transactional and passive sales strategies for cloud, partners are forced to create or carve out activities that are unaddressed by the cloud provider yet hold value for the end customer. Whereas traditional IT vendors relied on partners to drive their business, in cloud those partners are largely on their own to identify and develop their own value-add. Being creative, developing intellectual property and focusing on the gaps between multivendor solutions are much more important activities for partners in cloud programs than in traditional ones.

Partners may look the same, but are in fact quite different

“What does a cloud partner look like?” was a common question as these new cloud-centric programs came to be. It was unclear whether a new startup class of born-on-the-cloud partners would come into existence, or whether the existing stock of VAR, distributor, MSP, systems integration (SI) and hosting partners would eventually transform their businesses to align with the new cloud business opportunities. As shown in Figure 1, the types of partners participating in new cloud programs are just the first category of changes these programs are undergoing. As the answer to what types of partners are needed for these programs comes into view, it is looking like a little bit of the former and a lot of the latter. Cloud-native partners that are focused on consulting, managed services, intellectual property development and cloud solution integration hold a small but important space in the market. The difficulty for vendors is that there are not very many of these newly formed partners, and to make matters worse, many are being acquired. It is also difficult to spur their creation or fit them into a traditional partner program. While traditional partners are cattle that can be controlled and herded in a consistent direction, cloud-native partners are wilder animals that create, forge and follow their own path. Existing partners changing to focus on cloud solutions is also a difficult proposition. The truth is that many traditional VAR-type partners, focused on reselling and implementation activities, may not survive the transition to cloud solutions. Part of this is generationally driven, as many of the baby boomer-owned partner businesses lack the incentive to adapt their business model with retirement looming. Many of these partners will ride the slow decline of traditional IT opportunity until eventually closing their doors. Those traditional partners that do make the transition to a more cloud-focused business model will compose the largest segment of cloud partners. While they may keep the same name, these partners will operate in a fundamentally different manner compared with traditional partner models.

Figure 1: Emerging Trends in Partner Program Attributes

AI chips: Explosive growth of deep learning is leading to rapid evolution of diverse, dedicated processors

Artificial intelligence (AI) utilization has been accelerating rapidly for more than 10 years, as decreases in the cost of memory, storage and computation have made an increasing number of applications cost-effective. Deep learning has emerged as the most useful technique. Large public websites such as Facebook (Nasdaq: FB) and Amazon (Nasdaq: AMZN), with enormous stores of data on user behavior and a clear benefit from influencing user behavior, were among the earliest adopters and continue to expand their use of such techniques. Publicly visible applications include speech recognition, natural language processing and image recognition. Other high-value applications include network threat detection, credit fraud detection and pharmaceutical research.

Deep learning techniques are based on neural networks, which are inspired by animal brain structure. Neural networks perform successive computations on large amounts of data; each layer of computation operates on the results of the prior one, which is why the process is called “deep.” Deep learning therefore relies on large amounts of computation. The techniques themselves are well known; the recent growth is driven by the decreasing costs of data acquisition, data transmission, data storage and computation. The new processors all aim to lower the cost of computation.
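As a concrete illustration of those successive computations, the following sketch, which assumes NumPy and uses hypothetical layer sizes and random weights, pushes a batch of inputs through a stack of matrix multiplications with a simple nonlinearity between them; each layer consumes the previous layer’s output, which is the “depth” in deep learning.

```python
import numpy as np

# Hypothetical network: 128 input features, three hidden layers, 10 outputs.
rng = np.random.default_rng(0)
layer_sizes = [128, 256, 256, 64, 10]
weights = [rng.standard_normal((m, n)) * 0.01
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x, weights):
    """Successive tensor operations: each layer consumes the prior layer's output."""
    activation = x
    for w in weights[:-1]:
        activation = np.maximum(activation @ w, 0.0)  # matrix multiply + ReLU
    return activation @ weights[-1]                   # final linear layer

batch = rng.standard_normal((32, layer_sizes[0]))     # a batch of 32 example inputs
scores = forward(batch, weights)
print(scores.shape)  # (32, 10): one output vector per input in the batch
```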

The new chips are less costly than CPUs for running deep learning workloads

Each individual computation is simple and tends to require relatively low precision, needing fewer bits than typical CPU operations use. Deep learning computations are mostly tensor operations, predominantly matrix multiplication, and parallel tensor processing is the heart of many specialized AI chips. Traditional CPUs are relatively inefficient at this kind of processing: they cannot execute enough operations in parallel, and they supply precision and capacity for complex computations that deep learning does not need.
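To illustrate why reduced precision is tolerable, the sketch below, again assuming NumPy and arbitrary matrix sizes, runs the same matrix multiplication at full (float32) and reduced (float16) precision and prints the memory saved per operand along with the resulting numerical deviation. NumPy only emulates the narrow storage format here; dedicated AI chips implement such formats, and the parallelism around them, directly in hardware.

```python
import numpy as np

# Arbitrary sizes; the point is the precision/memory trade-off, not the shapes.
rng = np.random.default_rng(1)
a = rng.standard_normal((512, 512)).astype(np.float32)
b = rng.standard_normal((512, 512)).astype(np.float32)

full = a @ b                                        # 32-bit multiply
half = a.astype(np.float16) @ b.astype(np.float16)  # 16-bit multiply
deviation = np.abs(full - half.astype(np.float32)).max() / np.abs(full).max()

print(f"float16 operand memory vs. float32: {a.astype(np.float16).nbytes / a.nbytes:.0%}")
print(f"largest relative deviation from the float32 result: {deviation:.4%}")
```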

Nvidia (Nasdaq: NVDA) GPUs led the wave of new processors. In 2012, Google announced that its Google Brain deep learning project to recognize images of cats was powered by Nvidia GPUs, resulting in a hundredfold improvement in performance over conventional CPUs. With this kind of endorsement and with the widespread acceptance of the importance of deep learning, many companies, large and small, are following the money and investing in new types of processors. It is not certain that the GPU will be a long-term winner; successful applications of FPGAs and TPUs are plentiful.

Intel: Optimizing its scale advantage for Business of One flexibility

TBR perspective

Generally sound business execution of world-class engineering, coupled with world-class monolithic manufacturing, has made Intel a dominant force around which technology businesses have orbited for decades. Intel’s dominance has been baked into the PC and server form factors, even as ever smaller price points and form factors have shifted end-customer purchase criteria from computational performance specifications to business outcomes and user experiences.

Intel’s success has broadly expanded IT to address business problems and reshape our personal lives. Intel’s revenue growth prospects have diminished as its innovation has continued to increase the capacity and shrink the form factors and unit cost of its products. Intel delivers mature components that are embedded in mature products. Nevertheless, Intel thrives. The company has made mistakes, though, such as failing to address the mobile market. Intel’s capital- and engineering-intensive business requires it to place large bets on its vision of the future. Now, facing waves of innovation in artificial intelligence (AI), Internet of Things (IoT) and processor design, Intel is, in effect, rearchitecting the company to reduce its dependence on the CPU and thereby expand its market.

The key to Intel’s new architecture is companywide integration. Intel has always had products and technologies beyond CPUs, including video, networking, storage and memory silicon. As silicon becomes more diversified and is embedded in an increasing number of devices, Intel aims to create, along with customers, a far larger variety of solutions, often at a much smaller scale than the company’s monolithic products. To capitalize on its enormous intellectual property, Intel must break down silos within the company. Doing so will often break down silos in silicon as well, yielding products that integrate computation, storage and communications.

The cultural challenge Intel will face is orchestrating and timing its various development teams so that their innovation cycles come together in world-class packages of tightly coupled compute, storage and networking form factors, powering everything from the smallest edge compute instances to the largest high-performance computing (HPC) instances. The necessary work of rearchitecting the sales and marketing organizations remains for the next CEO, who has not yet been named, but that task is far less daunting than coordinating development and manufacturing.

The thread that will stitch these instances together in the multicloud, always-on world of compute will be software: software made interoperable through a “pruning,” as Intel Chief Engineering Officer and Technology, Systems Architecture & Client Group President Murthy Renduchintala described it, of the existing assets and frameworks into a cogent set of frameworks and tool sets. Those tool sets will power innovation and optimize these scaled designs for specific workloads, workloads increasingly powered by AI and fed by voice and video as much as by the keyboard-driven human interaction of the past.

 

Intel Analyst Summit: Intel (Nasdaq: INTC) hosted an analyst event for the first time in four years to outline its technology road maps through 2021 and to articulate the business and cultural changes it believes are necessary for it to capitalize on the growing business opportunity Moore’s Law economics has unleashed. The senior leadership team gave about 50 analysts very detailed and frank briefings under a nondisclosure agreement (NDA), with ample time for follow-up conversations throughout the event.