Fujitsu’s Strategic Evolution: Transforming for a Future with Uvance at the Core

On Oct. 1, TBR attended Fujitsu’s Executive Analyst Day in Santa Clara, Calif., and engaged with Fujitsu leaders, including Tim White, chief strategy officer; Ted Okada, SVP and Head of Technology; Ted Nakahara, SVP and Head of Strategic Alliances; Fleur Copping, VP of Strategic Alliances in Regions; and Asif Poonja, EVP and CEO of Fujitsu Americas. The following reflects main stage presentations, breakout sessions and one-on-one discussions, as well as TBR’s ongoing analysis of Fujitsu’s business model, strategy and performance.

Fujitsu in Transition, with Clear Direction and Intent, Playing to Strengths

Three things about Fujitsu stand out in a crowded IT services and consulting market. First, the company is in the middle of an organizational evolution, changing its business model to fit emerging client demands and orienting its go-to-market strategy around Uvance. Second, Fujitsu’s commitment to change in the Americas has completely remade the company around IT services and consulting, with aspirations to become a technology consulting leader. And third, Fujitsu’s alliances strategy, while still dependent on labor-intensive relationships and persistent account-level management, includes all the best practices TBR has seen from larger competitors, with at least one unique twist. In short, Fujitsu’s evolution will likely make the company a highly capable contender as the IT services and consulting market changes.

 

At the start of the analyst event, Tim White, chief strategy officer, explained that Fujitsu’s transition has been underway for a few years and has included allowing the Americas business to shed everything except services. As part of the overall transition, Fujitsu committed to expanding consulting while continuing to deliver on core IT services and modernizations. White noted that Fujitsu is roughly halfway through a three-year plan to grow services, and the Americas region has already surpassed its targets for 2024. For example, Uvance accounts for 37% of Fujitsu Americas’ business, above the 30% goal.

 

Critically, according to White, Fujitsu has not lost a step on technology advances or quality of services delivered, so clients and alliance partners continue to be well served. The change — the evolution — is primarily in how Fujitsu sees itself and its future. And that future is Uvance.
 

In TBR’s view, understanding Fujitsu’s existing and evolving business model, strategy and performance requires, perhaps surprisingly, a certain separation from the typical analysis, if only because of Fujitsu’s current transition.
 
While there is perhaps some uncertainty among analysts around Fujitsu’s brand, specific offerings and organizational structure, TBR sees no evidence that Fujitsu’s clients and technology alliance partners lack the clarity required to make decisions about Fujitsu’s capabilities, scale and skills.
 
Undoubtedly, Fujitsu’s brand in the Americas could use a significant boost — without which a ceiling could remain for the company’s growth — but the importance of marketwide brand recognition pales in comparison to a successful track record of delivering IT services and consulting, providing innovative solutions, and leveraging the latest technologies to solve clients’ problems.

 

Uvance Is “the Future State of Fujitsu’s Portfolio”

Fujitsu’s leaders stressed the centrality of Uvance in the company’s strategy and vision for IT services, consulting and technology. White described Uvance as “the future state of Fujitsu’s portfolio.” Asif Poonja, CEO of Americas, said, “Uvance is the center of our strategy.” At the center of Uvance is consulting. Fujitsu announced a goal to hire 10,000 consultants, but White and others explained that Fujitsu’s focus is not the number but the portfolio shift toward consulting while still serving clients who need core IT services and modernization.
 
Poonja noted that Fujitsu will focus on technology consulting, rather than McKinsey-style business consulting, playing to Fujitsu’s legacy technology strengths. In TBR’s view, technology-led consulting reflects the current demand among enterprise consulting buyers to infuse every consulting engagement with technology, a trend well underway before the hype began around generative AI (GenAI). Fujitsu’s leaders added that Uvance Wayfinders — essentially business and technology consultants — are able to pull together all of Fujitsu’s capabilities and offerings.

 

In TBR’s view, Uvance is the framework around the company’s “SaaS-like” business model, with the leaders using the term “SaaS-like” but recognizing the phrasing may need further refinement and/or explanation. Fujitsu will use platform-enabled services to drive higher-value conversations and engagements, led by the consultants the company is planning to hire and/or acquire. Fujitsu will sell IP when needed and drive managed services through its delivery capabilities. The shift in the Americas toward becoming an asset-light organization is the first step, and the second step is expanding consulting capabilities and scale. The third step is organizing delivery under a globally run P&L (which Fujitsu may have already begun).
 
Meanwhile, modernization services — moving clients from mainframe to cloud — remain the engine that keeps Fujitsu running. The company still has its own data centers outside the U.S. and also still has plenty of clients running on mainframes, especially in its core verticals, such as public services. For TBR, Uvance’s success may depend on broader adoption of the asset-light Americas strategy, albeit at a pace that does not compromise quality or lose clients in core markets. Again, Uvance is the future state of Fujitsu’s portfolio.

Fujitsu Americas: “Leveraging Global Pillars to Grow”

As described by Poonja and White, Fujitsu in the Americas has persistently pared down its offerings to focus only on IT services and technology consulting, playing to Fujitsu’s strengths and concentrating on industries in which the company has proven capabilities, well-established relationships with clients and differentiated offerings.

 

Poonja added that, although Fujitsu Americas earns a small percentage of Fujitsu’s overall revenues, corporate leadership in Japan recognizes the importance of the Americas market and understands the challenges of building a more widely known brand. Poonja stressed that Fujitsu Americas would continue “leveraging global pillars to grow” while staying focused on regional strengths, specifically in government, manufacturing and AI.

 

In TBR’s view, Fujitsu Americas’ current state and trajectory align well with Fujitsu’s overall corporate strategy. The business aspires to be a top technology consulting company and appreciates the difference between being skilled at technologies and being able to make the business case for Fujitsu’s solutions. As an integral part of its strategy, Fujitsu Americas consistently pulls in the global company’s broader strengths and capabilities.

 

The use cases that Fujitsu’s leaders shared during the event highlighted the company’s technology, such as 5G and AI, and its deployable, offshore scale. Overall, Fujitsu Americas’ leadership presented a compelling story of evolution, strategic focus, early positive results and appreciation for current weaknesses. In contrast to analyst events dominated by marketing messages, Fujitsu maintained a substantive and clear-eyed atmosphere, with discussions centered on realistic expectations for Fujitsu Americas’ changing position in the IT services and consulting market.

Fujitsu’s Alliances: Doing the Hard Work While Taking Customer Zero to Another Level

In both the formal presentations and the informal discussions, Fujitsu’s leaders impressed TBR with the fullness and maturity of the company’s alliances strategy. The ecosystem has changed substantially in recent years, forcing companies to rethink their partnering strategies and more closely examine the best practices of peers, competitors and alliance partners. This shift has been an ongoing focus of TBR’s research, which has increasingly been used by alliance leaders at global technology companies as they undergo this transformation.

 

As part of this research, TBR has analyzed a wide range of alliance strategies and activities, from inadequate and underfunded to strategically thoughtful and exceptionally well managed. Fujitsu Americas, in TBR’s assessment, lands solidly in the latter category, based on the full range of investments and activities that Fujitsu’s leaders described with respect to their five strategic partners: Amazon Web Services (AWS), SAP, Microsoft, Salesforce and ServiceNow. (Note: See TBR’s ecosystem reports for more information.)

 

According to Fujitsu’s leaders, the next strategic partner will be determined by Uvance’s business strategy and continued evolution in the technology space, particularly AI. Keeping perspective on the challenges of managing technology partners, Fleur Copping, VP of Strategic Alliances in Regions, noted that every alliance relationship requires constant attention and, often, engagement-by-engagement reinforcement around Fujitsu’s offerings, capabilities and value proposition. Copping further acknowledged that Fujitsu needs to strengthen partner cosell activities. In other words, even when executing on all the best practices, alliance management remains a hard slog.

 

During the event, TBR noted two additional points on alliances — areas that are perhaps unique to Fujitsu. First, TBR has consistently heard that the customer zero approach to new technologies and offerings resonates with clients by bringing credibility and assurance. IT services companies, consultancies and their technology partners have also told TBR that the customer zero approach helps solidify alliances and can lead to innovations and new solutions. Fujitsu appears to be taking customer zero to the next level. For example, Copping described how Fujitsu brought its internal human resource management professionals to a client meeting about a joint Fujitsu-ServiceNow opportunity. The Fujitsu professionals told the client about their own experiences using the ServiceNow solution. This more personal touch resonated with the client and demonstrated the fullness of Fujitsu’s capabilities to alliance partner ServiceNow.

 

Second, Copping noted that because many of Fujitsu’s customers “don’t have as much of a voice” with the cloud vendors and software giants as the largest enterprises, Fujitsu can be an advocate for these small and midsize enterprises, amplifying their concerns and needs to the likes of Microsoft and SAP. TBR has not heard Fujitsu’s peers explicitly state this marketing message. As a matter of positioning, particularly with technology partners, Fujitsu’s message could be another way of gaining mindshare and differentiating from IT services and consulting competitors.

Consulting Is Harder Than It Looks; Fujitsu Has a Good Plan

White “unabashedly” characterized Fujitsu as a technology company, but emphasized using technology as a means to deliver services rather than making technology a commodity play. In the Americas in particular, Fujitsu would not “move away from our heritage as a technology company” but would more fully embrace consulting and the future portfolio of Uvance.

 

In TBR’s view, keeping Fujitsu’s heralded research, innovation and technology capabilities as foundational strengths makes strategic sense while leaving open questions around consulting. For example, one Fujitsu leader outlined the company’s AI sales approach in four basic steps:

  1. Get the client interested in Fujitsu’s technology
  2. Do a proof of concept with Fujitsu’s AI platform
  3. Allow the client to use a precommercial instance of the platform
  4. Bring in Uvance to develop a full solution, highly customized to the client

 

The fourth step, at a minimum, requires consulting skills, business knowledge and industry expertise, although many of Fujitsu’s peers include those elements throughout the sales and delivery process. Recruiting (or acquiring), retaining and managing consulting talent could affect Fujitsu’s corporate culture and undoubtedly will challenge Fujitsu’s leadership.

 

A further obstacle, and perhaps the most significant one for Fujitsu in the Americas, will be gaining permission from clients to deliver consulting. By narrowing its scope to technology consulting — not the broad swath of strategy and operations consulting — Fujitsu plays to its own strengths, lessens the marketing load, and likely does not give up market share, as the company is unlikely to displace firms like McKinsey & Co. or Boston Consulting Group (BCG).

 

Part of gaining permission, in TBR’s view, will be positioning Fujitsu differently with its current clients, particularly with respect to the key personas interacting with Fujitsu professionals. During the event, one Fujitsu leader described current clients’ struggles to adopt GenAI as a combination of an inability to do the basic work of making their data usable, the uncertainty around return on investment, and a fear of running afoul of the law as new regulations come into effect.

 

Yes, Fujitsu can address all of these concerns, but these hurdles impact and reflect the responsibilities of three different personas within an enterprise. Fujitsu’s challenge will be to become the preferred technology consulting provider for all three personas. In short, consulting is harder than it looks, and TBR believes Fujitsu has the right vision, strategy and approach. We will continue to monitor the company’s ability to execute.

 

TBR’s ongoing coverage of Fujitsu includes dedicated quarterly reports and inclusion in appropriate benchmarks, market landscapes and ecosystem reports. Log in to TBR Insight Center to view all current research.

6G Will Not Be Like the Other G’s

TBR Perspective on 6G

6G is unlikely to look like the other G’s in terms of cycle length and the scope and level of investment, as the beleaguered telecom industry continues to struggle with implementing and realizing ROI from 5G. The telecom industry must also contend with supporting new use cases and determining how to embed AI, ML and sustainability into the fabric of the network while covering security gaps and preparing for a post-quantum cryptography world. Though there is tremendous brainpower (spanning the public and private sectors as well as academia) assembled to tackle these issues, growth prospects for the telecom industry continue to look challenging.
 
6G is shaping up to be an addendum to LTE and 5G, providing a new antenna overlay that supports net-new frequency bands, as well as enhanced spectral efficiency features and capabilities that provide further network performance and operational improvement. The missing link in the value equation remains how the telecom industry will monetize these new technologies beyond traditional mobile broadband (MBB) and fixed wireless access (FWA) services, and this lack of clear monetization threatens to relegate 6G to a continuation of what was observed during the LTE and 5G eras.
 
TBR continues to see no fundamental change or catalyst on the horizon that will bring CSPs more revenue. The primary incentive for CSPs to invest in 6G, therefore, will remain reduction in the cost per bit to support growing data traffic. This means the ROI for 5G still does not exist, which will likely limit the appetite and scope of investment in 6G. As such, TBR expects CSP capex investment for 6G will be subdued compared with previous G’s and deployment of the technology will be tactical in nature, which is a marked deviation from the multihundred-billion-dollar investments in spectrum and infrastructure associated with the nationwide deployments during each of the prior cellular eras.
 
Additionally, the 6G cycle may be significantly longer in duration than prior cellular generations due to the exponential increase in complexity inherent in these systems and the pace of data traffic growth, which has been slowing.
 
Against this backdrop, private cellular networks represent a real, significant threat to CSPs, as enterprises can extract most, if not all, of what they need from networks without requiring CSPs in the value chain. CSPs’ edge assets continue to be considered a key vector for CSPs to reassert themselves in the market, but this view overlooks the alternative paths that enterprises and hyperscalers can take to bypass CSPs and get what they need (e.g., real estate, access to power and fiber) at the edge layer.
 
The 5G cycle is now 5 years old, and the telecom industry is still struggling to adopt and deploy virtualization, open RAN and network slicing, much less a 5G standalone (SA) network architecture. This reality implies expectations for 6G will need to be tempered further. TBR believes 6G (at least the first phase of 6G, which will be represented in the 3rd Generation Partnership Project’s [3GPP] Release 21 standards) will only bring spectral and cost-per-bit efficiency improvements and potentially some net-new enterprise-specific features and capabilities. 6G is unlikely to bring any more significant or profound outcomes than 5G, at least not from CSPs.
 
TBR believes hyperscalers, government entities (especially the defense sector) and large enterprises are likely to reap the most benefit from 6G. For CSPs, 6G is likely to primarily be an infill solution to address complex environments and enhance network capacity and speed for existing MBB and FWA offerings.
 
Taken together, 6G will ultimately happen, and commercial deployment of 6G-branded networks will likely begin in the late 2020s, but it remains to be seen whether 6G will be a brand only or a legitimate set of truly differentiated features and capabilities that bring broad and significant value to the global economy. Either way, the scope of CSPs’ challenges is growing, while new value continues to be created outside their purview or goes over the top of their pipes.
 

Watch On Demand: TBR Principal Analyst Chris Antlitz discusses the Looming Business Disruption Among Operators and Vendors as They Strive to Change from Telco to “Tech-co” in the Coming Years

Impacts and Opportunities in 6G

Upper-midband Spectrum Is in Play for 6G

After an initial belief several years ago that 6G would leverage millimeter wave and terahertz spectrum, the wireless technology ecosystem has settled on the upper midbands, specifically in the 7GHz-24GHz range (also known as the Frequency Range 3 [FR3] tranche). Within FR3, 7GHz-15GHz is considered to be the golden range for 6G as it has the best balance between coverage and capacity and there is approximately 1600MHz of total bandwidth that could be made available in the U.S.
 
However, one of the biggest issues with these “golden bands” is the need for CSPs to coexist with incumbent users, such as government entities and satellite operators, which utilize some of these channels for various purposes; that spectrum would need to be cleared, refarmed or shared with CSPs for use in cellular communications. The telecom industry already has some experience with shared spectrum through CBRS, which operates in the 3.5GHz band, so there is a pre-existing framework and mechanism in place (i.e., the Spectrum Access System) from which to begin establishing a spectrum sharing system for these new bands.
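To make the sharing mechanism concrete, the sketch below models, in highly simplified form, the three-tier priority scheme used in CBRS, in which a Spectrum Access System grants a channel to Priority Access License (PAL) or General Authorized Access (GAA) users only when no higher-tier user holds it and preempts lower-tier grants when an incumbent appears. The class names, channel labels and grant flow are illustrative assumptions (real incumbents are detected by the SAS rather than submitting requests), not a description of any production SAS.

```python
from enum import IntEnum
from dataclasses import dataclass, field

class Tier(IntEnum):
    # Lower value = higher priority, mirroring CBRS' three access tiers
    INCUMBENT = 1   # e.g., federal radar, fixed satellite service
    PAL = 2         # Priority Access License holders
    GAA = 3         # General Authorized Access (opportunistic) users

@dataclass
class SimpleSAS:
    """Toy Spectrum Access System: one grant per channel, highest tier wins."""
    channels: list
    grants: dict = field(default_factory=dict)  # channel -> (user, tier)

    def request(self, user: str, tier: Tier, channel: str) -> bool:
        current = self.grants.get(channel)
        if current is None or tier < current[1]:
            # Channel is free, or the requester outranks the current holder:
            # record the grant, implicitly suspending the lower-tier user.
            self.grants[channel] = (user, tier)
            return True
        return False  # blocked by an equal- or higher-tier grant

sas = SimpleSAS(channels=["chan_1", "chan_2"])
print(sas.request("carrier_A", Tier.GAA, "chan_1"))        # True: channel free
print(sas.request("carrier_B", Tier.PAL, "chan_1"))        # True: PAL preempts GAA
print(sas.request("navy_radar", Tier.INCUMBENT, "chan_1")) # True: incumbent preempts all
print(sas.request("carrier_A", Tier.GAA, "chan_1"))        # False: incumbent holds it
```

A production SAS also handles interference coordination, geography and multiple concurrent grants per channel; the point here is only the tiered preemption logic that any sharing framework for the FR3 bands would inherit.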
 
Ultimately, TBR believes that 6G will end up leveraging a mix of spectrum tranches, with midband, upper midband and mmWave frequencies all in play. Carrier aggregation and other frequency-combination technologies, as well as advancements in beamforming and endpoint devices, make these spectrum bands perform better when working together. Additionally, FR3 spectrum is not good at penetrating walls. Given around 80% of wireless traffic is generated indoors — a statistic that is unlikely to change materially in the 6G era — FR3 bands would need to be complemented with lower bands to penetrate walls and provide optimal coverage and capacity.
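As a rough illustration of why upper-midband spectrum needs lower-band complements, the short calculation below compares free-space path loss at 3.5GHz and at 10GHz using the standard FSPL formula. It deliberately ignores wall-penetration loss, which is material-dependent and penalizes higher frequencies further, so the indoor gap is wider than these free-space numbers suggest; the distances and frequencies are illustrative.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

for freq_mhz in (3500, 10000):  # today's midband (3.5GHz) vs. an FR3 band (10GHz)
    loss = fspl_db(0.5, freq_mhz)  # 500 meters from the base station
    print(f"{freq_mhz / 1000:.1f} GHz at 500 m: {loss:.1f} dB free-space loss")

# The gap between two frequencies is distance-independent:
delta = 20 * math.log10(10000 / 3500)
print(f"10 GHz suffers ~{delta:.1f} dB more loss than 3.5 GHz at any given distance")
```

Roughly 9 dB of extra free-space loss, before walls are even considered, is why carrier aggregation with lower bands, rather than stand-alone FR3 deployment, is the likely architecture.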

Nonterrestrial Networks (NTN), aka Satellite Connectivity, Enters the Mainstream

The NTN domain is flourishing, and satellite connectivity will be a mainstream technology for both businesses and consumers by the end of this decade. Satellite-provided connectivity will cover most of the Earth (and nearly the entire human population) with at least basic text messaging services, though some NTN providers will also provide high-speed broadband services as well as a range of other communications services, such as voice, just like a traditional CSP.
 
The most disruptive impact of NTN will be closing the cellular coverage gap and reducing the digital divide. Approximately 10% of Earth’s surface and 5% of the global human population, or around 800 million people, still lack cellular network coverage, and satellites can close this gap relatively quickly and at a significantly lower price compared to building out terrestrial macro base station sites in rural and remote areas. The ability to provide truly global network coverage has created a new paradigm in the telecom industry, shaping end-user expectations and pushing CSPs to align with (and increasingly compete against) NTN providers.

FWA Is Not Getting the Attention It Deserves

The mobile industry continues to largely view FWA as an ancillary offering, and the use case is not receiving the level of attention and innovation that it should given FWA’s resounding success in the market. Some attendees noted that current standards do not adequately factor in and focus enough on FWA and that networks are not architected to optimally support this use case. Spectral efficiency technologies tailored to optimize FWA traffic could free up significant capacity on existing networks that could be utilized for other purposes.
 
There are also energy-efficiency considerations for FWA. Mobile network operators (MNOs) have a vested interest in pushing standards bodies and network vendors to innovate on FWA because margins are low and there is room to alleviate some of this margin impact by applying technological innovations. In addition, MNOs want standards bodies and vendors to focus on architecting cellular standards to support unlicensed spectrum bands so that network coverage and capacity can be enhanced with minimal investment by aggregating licensed spectrum with unlicensed spectrum. The 6GHz band is especially pertinent to these considerations.

The Energy Problem Has No Easy Fix

Though the wireless technology ecosystem will continue to eke out gains on energy efficiency and performance, an as-yet-undetermined paradigm shift will be required to fundamentally break the linear relationship between network performance and energy usage. Additionally, AI is unlikely to help address this issue when factoring in the net energy impact because AI workloads are inherently power hungry.
 
Given this rising demand for energy, in addition to driving further reduction in the cost per bit, the broader economy and public sector should focus more on innovations in energy production and distribution, such as more deeply exploring small modular [nuclear] reactors (SMR) and cold fusion, to produce and widely distribute high-output, sustainable, low carbon-footprint energy. Said differently, it will become increasingly difficult to squeeze energy efficiency out of network infrastructure, so focusing on creating cleaner energy at greater scale is a sounder long-term strategy than emphasizing a lower net utilization of energy to achieve sustainability goals.

AI and ML Will Initially Be Leveraged for Network Optimization

AI and ML will come into the network domain slowly. Network optimization-related use cases will likely be the initial focus areas, as AI and ML can provide significant outcomes by running complex simulations, such as ray tracing, propagation modeling and channel management (e.g., spectrum access sharing and dynamic spectrum sharing) at scale.
 
Though AI and ML promise a higher degree of automation to accomplish optimization-related tasks, there is concern that the amount and cost of energy required to run these simulations will outweigh the benefits. There is some validity to this concern, but attendees were confident there will be pockets of use cases or workarounds that will mitigate energy consumption and make networks more resilient and higher performing by leveraging AI and ML.

Western Governments Need to Be More Proactive to Keep Their Countries at the Forefront of Innovation

Evidence suggests the West is falling behind China in key technologies, most notably in 5G SA, 6G, quantum computing, SMR and other key areas, despite Western governments allocating unprecedented sums of fiscal and monetary support for the technology sector and broader economy during and immediately after the COVID-19 pandemic. Governments, therefore, will need to take a more assertive approach rather than setting big-picture guidelines and relying on the private sector to figure things out. Since the current model is not yielding the desired results, a change will be needed to alter the trajectory. Greater reliance on hyperscalers will likely factor into the equation for a solution.
 
The most glaring deficiency in the Western world is the lack of regulatory clarity and a coherent policy agenda. For example, the U.S. Federal Communications Commission has been restrained and restricted from advancing important spectrum policies, and special interests have been creating encumbrances that slow down or prevent the wireless technology ecosystem from optimally moving forward (e.g., inconsistent policies around private spectrum and the use of shared bands like 6GHz create harmonization challenges and disincentivize attaining critical mass in the broader industry).

Scope of Government Support for the Telecom Industry Will Likely Increase

The persistent lack of ROI to justify private sector investment in 6G (and cellular networks more broadly) will ultimately push governments deeper into the telecom industry, prompting governments to increase the scope of their involvement in the wireless technology ecosystem as well as make these support structures more embedded in nature. During the first half of the 5G cycle, governments from various countries around the world pumped many hundreds of billions of dollars in aggregate into their respective domestic technology sectors via various stimulus programs, which provided direct or indirect, low- or zero-interest loans, subsidies and other means of market support.
 
Additional government backing will be required to enable the full benefits of 6G to come to fruition. Governments have a vested interest in supporting the telecom industry and the broader technology sector as it provides innovations of societal and national security importance and serves as foundational infrastructure to support long-term economic development. TBR expects governments in technology-forward countries (especially the U.S., China, Japan and South Korea) and regional blocs (e.g., the European Union) to continue underpinning R&D programs, subsidizing and/or directly paying for infrastructure deployment, and backstopping industry players that relate to national security concerns.
 
This model of industry stimulation was witnessed at unprecedented scale during the COVID-19 pandemic and now serves as a model for further government involvement. Workforce development has also emerged as a top-of-mind initiative for some governments as a means of preparing domestic workforces to handle new technologies and to offset the negative economic externalities that emerge from the impact of these new technologies (e.g., labor displacement from AI and how this can be mitigated).

Conclusion

6G will happen one way or another, with pioneering CSPs likely to commence commercial deployments and 6G-branded services by 2030 (as originally expected within the confines of 10-year cellular generation cycles), but the wireless technology ecosystem seems to be absorbing much more than it can handle.
 
In addition to addressing the evolution of 3GPP standards for 6G, the ecosystem must also incorporate AI, ML, quantum and other nascent technologies as well as meet societal objectives, such as carbon zero, to align with theoretical expectations for the new G and the new use cases the technology is expected to enable.
 
The requirements for 6G are causing complexity to increase and are likely to make the ecosystem fall short on delivering these outcomes. Greater investment, collaboration and alignment across the public and private sectors, as well as with academia, will be required to address these challenges and set the telecom industry on a better path.

Infosys Collaborates with Clients and Partners to Navigate What’s Next in Their AI Transformation Programs

Strong Services Execution, Enabled Through Infosys Cobalt and Focused on Outcomes, Provides Foundation Upon Which Infosys Can Build AI Strategy

The steady performance of Infosys’ cloud business highlights the company’s pragmatic approach to its portfolio and go-to-market efforts, largely enabled by Infosys Cobalt.

 

Building on Infosys Cobalt’s success, the company now has an opportunity to steer client conversations toward AI and is positioning Infosys Topaz as the suite of services and solutions that can bring it all together. Agentic AI (i.e., autonomous AI) is the newest set of capabilities dominating client and partner conversations. Scaling AI adoption comes with implications and responsibilities, which Infosys is trying to address one use case at a time. For example, earlier in 2024, Infosys launched the Responsible AI Suite, which includes accelerators across three main areas: Scan (identifying AI risk), Shield (building technical guardrails) and Steer (providing AI governance consulting). These capabilities will help Infosys strengthen ecosystem trust via the Responsible AI Coalition. Infosys also claimed it was the first IT services company globally to achieve the ISO 42001:2023 certification for ethical and responsible use of AI.
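To ground what the Shield pillar’s “technical guardrails” can look like in practice, the toy sketch below shows one common pattern: screening model output for sensitive strings before it reaches the user. It is a generic illustration of the guardrail concept, not a depiction of Infosys’ Responsible AI Suite; the patterns and function name are illustrative.

```python
import re

BLOCKED_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),  # US Social Security number format
    re.compile(r"\b\d{16}\b"),             # bare 16-digit card-like numbers
]

def shield_output(model_output: str) -> str:
    """Toy output guardrail: redact obviously sensitive patterns before returning text."""
    redacted = model_output
    for pattern in BLOCKED_PATTERNS:
        redacted = pattern.sub("[REDACTED]", redacted)
    return redacted

print(shield_output("Customer SSN is 123-45-6789 and the order shipped Tuesday."))
# -> "Customer SSN is [REDACTED] and the order shipped Tuesday."
```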
 
Regardless of the client’s cloud and AI adoption maturity, everyone TBR spoke with and those who presented at 2024 Infosys Americas Confluence agreed that the need for data strategy and architecture comes first. Two separate customers perfectly summarized the state of AI adoption: “You can’t get to AI without reliable data across the supply chain,” and “GenAI is not a magical talisman. Companies need to build true AI policy and handle GenAI primitives before scaling adoption, with the shift in mindset among developers and users a key component.”

 

Infosys recognizes that AI adoption will come in waves. The first wave, which started in November 2022 and continued over the last 18 to 24 months, was dominated by pilot projects focused on productivity and software development. In the current second wave, clients are starting to pivot conversations toward improving IT operations, business processes, marketing and sales. The real business value will come from the third wave, which will focus on improving processes and experiences and capitalizing on opportunities around design and implementation. Infosys believes the third wave will start in the next six to 12 months. While this might work for cloud- and data-mature clients, only a small percentage of enterprises are AI ready across all components, including data, governance, strategy, technology and talent. Thus, it might take a bit longer for AI adoption to scale.

 

But as Infosys continues to execute its pragmatic strategy, the company relies on customer success stories that will help it build momentum. As another customer positioned it, “Infosys knows the data and processes. They know what they are talking about. In [the] 11 years since we have worked with them, they have not missed a single release with their … team delivering the outcomes.”

 

We believe Infosys’ position within the ecosystem will also play a role in how fast and successful the company is when it comes to scaling AI with clients. Infosys’ AI-related messaging includes 23 AI playbooks, which focus on value realization spanning technical and business components, such as Foundry and Factory models, as well as change management.

 

Of course, AI and GenAI will also disrupt Infosys’ business model and service delivery. And while many of its peers are still debating internally how to best position themselves with clients and pitch the value of GenAI without exposing their business to too much risk in the long run, Infosys’ thoughtful, analytics-enabled approach to commercial and pricing model management has positioned the company favorably with price-conscious clients that have predominantly been focused on digital stack optimization over the past 18 months.
 
Infosys’ success with large deals is a testament to the effectiveness of the company’s strategy. In FY4Q24 Infosys had $4.5 billion in large deals, the highest quarterly large deal value for the company. In addition, investing in and transforming right-skilled talent who can support this model are critical components of the company’s success. While Infosys has trained 270,000 of its employees on AI, we believe the composition and depth of these skills vary across service lines and clients, especially as outcome-based pricing models now represent half of the contracts in some service lines.

Infosys’ Investments in Engineering and Marketing Strengthen Company’s Position as a Solutions Broker

Navigating the hype of GenAI requires Infosys to also recognize and place bets on other areas that are tangential and have a more immediate impact on its value proposition and overall financial performance.

Infosys Tries to Bring CIOs and CMOs Together Through Infosys Aster

Building off the success of Infosys Cobalt and Infosys Topaz, the company launched Infosys Aster, a set of AI-amplified marketing services, solutions and platforms. While Infosys Cobalt and Infosys Topaz have horizontal applications, the domain-specific nature of Infosys Aster provides a glimpse into what we might expect to see from Infosys in the near future, given the permeation of GenAI across organizational processes. Additionally, the marketing orientation of Infosys Aster is not surprising since most GenAI use cases are geared toward improving customer experience.

 

Built around three pillars — experience, efficiency and effectiveness — Infosys Aster will test Infosys’ ability to capitalize on a new wave of application services opportunities and create solutions unique to clients’ first-party data rather than providing off-the-shelf solutions just to ramp up implementation sales.
 
With DMS continuing to act as a conduit for broader digital transformation opportunities for Infosys, we expect the company to use Infosys Aster to position its marketing services portfolio in a more holistic manner, creating a bridge between CMOs and CIOs and also bringing parts of Infosys’ Business Process Management subsidiary into the mix to position the company to capture marketing operations opportunities. Infosys Aster provides a comprehensive set of marketing services across the value chain of strategy, brand and creative services, digital experience, digital commerce, marketing technology (martech), performance marketing and marketing operations.
 
Although this is an area of opportunity for Infosys, rivals such as Accenture have an advantage in the marketing operations domain. We do believe the greater opening for Infosys comes from focusing more on driving conversations around the custom application layer and steering client discussions toward achieving profitable growth through the use of Infosys Aster. Client wins such as with Formula E and ongoing work with the Grand Slam tennis tournaments also allow Infosys to demonstrate its innovation capabilities beyond traditional IT services. Part marketing and part branding, wins such as these elevate Infosys’ capabilities. Executing against its messaging is key for Infosys.

Infosys Engineering Services Will Close Portfolio and Skills Gaps Between IT and OT Departments

Infosys Engineering Services remains among the fastest-growing units within the company as Infosys strives to get closer to product development and minimize GenAI disruption on its content distribution and support position. Since the 2020 purchase of Kaleidoscope, which provided a much-needed boost for the company to infuse new skills and the IP needed to appeal to the OT buyer, Infosys has further enhanced its value proposition to also meet GenAI-infused demand.

 

Infosys recently announced the acquisition of the India-based, 900-person semiconductor design services vendor InSemi, which presents a use case where the company applied a measured risk approach to enhance its chip-to-cloud strategy as it tries to balance its portfolio of partner-ready solutions, such as through NVIDIA, with a sound GenAI-first cloud-supported story. Shortly after, Infosys also acquired Germany-headquartered engineering R&D services firm in-tech. The purchase will bolster Infosys’ Engineering Services R&D capabilities and add over 2,200 trained resources to regional operations across Germany, Austria, China, the U.K., and nearshore locations in the Czech Republic, Romania, Spain and India, supporting Infosys’ opportunities within the automotive industry. The purchase of in-tech certainly accelerates these opportunities, bringing in strong relationships with OEM providers, which is a necessary steppingstone as Infosys tries to bridge IT and OT relationships.

 

We do not expect Infosys’ cloud business Infosys Cobalt to slow down anytime soon given the company’s market position for infrastructure migration and managed services as well as its well-run partner strategy with hyperscalers. Adding semiconductor design services bolsters that value proposition as buyers consider whether to use price-attractive CPUs or premium-priced GPU data centers. The latter currently dominates the marketplace, and we expect that trend will not change for at least the next 18 to 24 months. But having semiconductor engineers on its bench can help Infosys start supporting CPU-run models, further appealing to more price-sensitive clients. Meanwhile, Infosys is planning to train 50,000 of its employees on NVIDIA technologies. Lastly, the close collaboration between Infosys Engineering Services and Infosys Living Labs further extends the company’s opportunities to drive conversations with new buyers and demonstrates its ability to build, integrate and manage tangible products.

Infosys’ Reliance on Partners Provides a Strong Use Case of Trust and the Future of Ecosystems

The mutual appreciation between Infosys and partners was amplified throughout 2024 Infosys Americas Confluence. From a dedicated Partner Day to partner-run demos and various sponsorship levels to main-stage presentations, the experience reminded TBR of an event that a technology vendor would typically set up (think: Adobe Summit, AWS re:Invent, Dreamforce, Oracle OpenWorld, to name a few).
 
Infosys’ decision to feature some of its key alliance partners in much the same way the technology companies do suggests strong alignment between the parties, starting with top-down executive support and extending through mutual investments in portfolio and training resources and, most importantly, knowledge management between the parties. In conversations throughout the event with partners, it was evident that Infosys’ strategy is consistent regardless of the length of the relationship, from decades-long relationships such as with SAP to emerging but fast-growing alliances such as with Snowflake. All partners agreed Infosys’ humble approach to managing relationships has put them at ease in working with Infosys and delivering value to joint clients.

 

After attending Infosys’ U.S. Analyst and Advisor Meeting in Texas in March, TBR wrote about Infosys’ relationship with Oracle, highlighting the level of trust and transparency Infosys typically deploys with partners. In TBR’s Summer 2024 Voice of the Partner Ecosystem Report we wrote: “Services vendors most frequently rely on their direct sales efforts and permission to demonstrate value with customers to drive revenue. Using demos and proof-of-concept discussions as a frequent tactic to engage with clients also highlights many of the profiled vendors’ consulting heritage.”

 

The technical expertise came through very vividly and aligned with Infosys’ strengths in playing within its own swim lane. In a main-stage discussion, Infosys and Hewlett Packard Enterprise (HPE) discussed at length the role each plays in pursuing opportunities in areas such as GenAI and the need for greater interaction through a multiparty model, including, for example, the value NVIDIA brings to the table. While one could argue that Infosys’ alliance partner strategy mirrors that of many of its competitors as it seeks to secure foundational revenue opportunities while pursuing innovation through a measured risk approach, the company strives to differentiate by acknowledging its strengths and sticking to them rather than branching too far into partners’ territory, which enterprise buyers strongly appreciate.

Land-and-execute Approach: Expansion Will Follow Naturally

Close to a decade ago, TBR analyzed what Infosys’ five-year strategy should look like. While the company went through leadership and strategy changes during this period to such an extent that one could cite concerns about consistency, those days are over. Infosys now has a well-grounded strategy with executives executing on a clear vision rooted in a land-and-execute approach rather than the typical land-and-expand framework many of its peers aspire to. This puts greater pressure on the company’s quality and talent-retention strategies. While no one is immune to macroeconomic headwinds, the internal growth and training opportunities the company provides for its employees across all levels provide a strong backbone for a culture of learning and trust.

 

TBR will continue to cover Infosys within the IT services, ecosystems, cloud and digital transformation spaces, including publishing quarterly reports with assessments of Infosys’ financial model, go-to-market, and alliances and acquisitions strategies. Access reports as soon as they’re available with TBR Insight Center™ access.

IT Services Vendors Embrace Digital Transformation to Revolutionize the Sports and Entertainment Industry

IT Services Vendors Pursue Opportunities in the Sports and Entertainment Industry

Like every other industry, sports has undergone digital transformation in recent years, greatly improving operations within the industry and fundamentally changing the fan experience. Every major sporting event is enhanced by analytics, both at an operational level and for the fans, and other elements core to IT services, such as cybersecurity and automation, have become fundamental to running a sports operation.

 

Not surprisingly, IT services companies and consultancies have jumped on the bandwagon, increasingly associating their brands with major sport events and leagues, not simply as sponsors but also now as digital transformation, AI and analytics partners.

 

The sports and entertainment industry segment typically contributes a small share of revenue for the 31 vendors covered in TBR’s IT Services Vendor Benchmark compared to established industries such as financial services, public sector and manufacturing. However, an increasing number of IT services providers are building specialized expertise to address the needs of clients in sports and entertainment and to diversify revenue streams. Applying capabilities in areas such as digital design, secure infrastructure and data, and customer experience enables vendors to increase value and capture growth opportunities.

Specialized Expertise and History of Working with Clients in Sports and Entertainment Help Vendors Establish Credibility and Attract New Clients

IBM, Atos, Accenture and Infosys have well-established industry expertise and a history of working with clients in the sports and entertainment sector. In addition to those companies, other IT services providers are developing capabilities and building client relationships to capture opportunities in the sector.

 

Utilizing their solutions, expertise and reputation gained by working with clients in other sectors and applying that knowledge to the sports and entertainment industry enable vendors to expand their client reach. Vendors increasingly utilize digital design capabilities to add value. For example, IBM iX, the experience design business of IBM Consulting, developed a new AI commentary feature for the Wimbledon Championships utilizing watsonx to train the AI in the language of tennis, and then implemented the solution to create engaging commentary for event video clips.

IBM

Utilizes IBM watsonx to Improve Fan Engagement

IBM has a 30-year partnership with the All England Lawn Tennis Club. To help more than 19 million fans globally follow the Wimbledon Championships more closely, IBM has been improving the digital experience of the tournament’s official app and website.

 

In June IBM announced a new feature for the app and website that provides personalized player stories as players advance through the tournament, utilizing data and generative AI (GenAI) from IBM’s watsonx platform. In addition to the Wimbledon Championships, IBM Consulting has been providing insights and improving experiences over the past several years for events such as the Masters Tournament, the U.S. Open and the Grammy Awards; improving user engagement and integrating AI, such as with the ESPN Fantasy Football app; and addressing storage and security needs, such as for the Mercedes-Benz Stadium in Atlanta.

 

For example, IBM has been working with the Masters Tournament for more than 30 years to digitally transform the event by designing solutions and user interfaces and transforming back-end systems to deliver insights through golf data. IBM is utilizing GenAI to convert Masters data into AI-powered narration and insights about players and games.

 

In April IBM announced new fan features for the Masters app and Masters.com to improve the digital experience of the tournament that was held April 11-14. IBM Consulting collaborated with the Masters’ digital team to provide fans with shot-by-shot insights based on data-based projections and analysis for each hole, thanks to GenAI capabilities from IBM watsonx.

 

IBM has also been working with the U.S. Tennis Association (USTA) for more than 30 years. In August IBM announced several fan features for the digital platforms of the 2024 U.S. Open that are powered by IBM watsonx to improve fan engagement and tournament coverage. IBM delivered AI-generated Match Report summaries for singles matches minutes after they were completed utilizing IBM’s Granite 13B large language model (LLM) and the USTA’s data and editorial guidelines. IBM also provided AI commentary with automated English-language audio and subtitles for singles match summaries. Fans also utilized the redesigned IBM SlamTracker experience, which provides pre-match, live and post-match insights.
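The pattern IBM describes, combining structured match data with editorial guidelines to prompt an LLM for a post-match summary, can be sketched generically. The snippet below is a minimal, hypothetical illustration of the prompt-assembly step; call_llm stands in for whatever hosted model endpoint is used (IBM cites its Granite 13B model) and is not a real watsonx API call, and the statistics shown are invented.

```python
import json

def build_match_report_prompt(match_stats: dict, editorial_guidelines: str) -> str:
    """Assemble a grounded prompt: structured stats plus style rules for the model."""
    return (
        "You are a tennis editor. Using ONLY the statistics below, write a short "
        "match report that follows the editorial guidelines.\n\n"
        f"Editorial guidelines:\n{editorial_guidelines}\n\n"
        f"Match statistics (JSON):\n{json.dumps(match_stats, indent=2)}\n"
    )

def call_llm(prompt: str) -> str:
    # Hypothetical placeholder for a hosted LLM endpoint (e.g., a Granite-class model).
    raise NotImplementedError("Wire this to your model provider of choice.")

match_stats = {
    "winner": "Player A", "loser": "Player B", "score": "6-4, 3-6, 7-5",
    "aces": {"Player A": 11, "Player B": 7}, "duration_minutes": 142,
}
guidelines = "Neutral tone. Two paragraphs. Lead with the result, then key statistics."

prompt = build_match_report_prompt(match_stats, guidelines)
# summary = call_llm(prompt)  # returns the AI-generated match report text
```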

 

In September IBM and ESPN announced enhancements to the ESPN Fantasy app, which is powered by GenAI technologies from IBM watsonx. The new Top Contributing Factors feature within the Waiver Grade and Trade Grade features of the app provides analysis around the grades. The grades that are assigned to players are created by AI models built with IBM watsonx, and the information is generated by IBM’s Granite LLM.

Atos

Every Olympics Must Run Flawlessly; There Are No Second Chances

Atos used its well-established expertise in the sports and entertainment industry to provide infrastructure services for the 2024 Paris Olympics and Paralympics and enable a secure and digital Games experience for end users globally. The company has been providing services for the Olympic Movement since 1989. Atos established its relationship with the International Olympic Committee (IOC) as a Worldwide IT Partner in 2001 and provided IT services in that role for the first time at the 2002 Winter Olympics in Salt Lake City. Ensuring that the IT systems behind the Olympics run flawlessly every two years requires dedication and strict execution of processes and timelines.

 

Atos has been expanding its client roster in the sports and entertainment industry, applying its vast experience gained from the Olympics. In December 2022 Atos signed an eight-year deal with the Union of European Football Associations (UEFA) to be the official technology partner for men’s national team competitions. Atos is assisting UEFA in managing, improving and optimizing its technology landscape and operations. It is also managing and securing the hybrid cloud environment and infrastructure that hosts UEFA’s services, applications and data. Atos is the official IT partner of UEFA National Team Football until 2030.

 

In March Atos announced plans to open a Sports Technology Center of Excellence (CoE) in its new Middle East and North Africa headquarters in Riyadh, Saudi Arabia, in 2Q24. The CoE will develop technology applications for athletes, fans and sports organizations in Saudi Arabia. The new CoE provides a way for Atos to capture opportunities in the local sports industry as Saudi Arabia works on its Vision 2030 plan to position itself as a host for leading international sporting events. The center will enable clients to explore solutions around digital transformation, cloud services, cybersecurity, decarbonization, application modernization, DevSecOps and edge computing. Atos provided cybersecurity and infrastructure services for the 2024 Paris Olympic and Paralympic Games utilizing its Technology Operations Centre.

Accenture

Accenture Helps NFL Make Data-driven Decisions and Works with ESPN to Transform Sports Fan Experience

In May Accenture announced a five-year partnership with the NFL in which Accenture will be the Official Business and Technology Consulting Partner. Accenture will help the NFL make data-driven decisions in three business areas: football, financial operations and human resources. Accenture will also support the NFL across multiple areas such as transforming the league’s human capital systems, ERP and analytics, and driving efficiencies and automation across the NFL’s back-office functions.

 

In 2021 ESPN partnered with Accenture, Microsoft and Verizon with the goal of exploring ways to improve the fan experience in sports through technologies such as 5G, augmented reality and mobile edge computing. Accenture and ESPN launched the ESPN Edge Innovation Center to utilize technologies and jointly imagine, explore, conceive and prototype sports entertainment experiences and production capabilities. The combination of design and innovation capabilities with technology and industry expertise enabled Accenture to become ESPN’s Innovation and Founding Consulting Partner. Accenture and ESPN collaborate to enhance live sports broadcasting, develop consumer-facing products and improve the sports fan experience.

Infosys

Client Wins Such as Formula E and Ongoing Work with Grand Slam Tennis Tournaments Allow Infosys to Demonstrate Innovation Capabilities Beyond Traditional IT Services

Since 2015 Infosys has been the Digital Innovation Partner for the Australian Open, Roland-Garros, the Association of Tennis Professionals (ATP) Tour and the International Tennis Hall of Fame, transforming tennis through data, insights and digital experiences. For example, Infosys has been partnering with the ATP to develop digital assets. Infosys’ design capabilities and technical prowess continue to help it attract business in experience design and AI-powered services with sports and entertainment companies, particularly around tennis tournaments.

 

In March Infosys extended its digital innovation relationship with the ATP by three years, until 2026. The ATP will continue to benefit from Infosys’ capabilities in AI, data analytics and cloud. Since the beginning of the partnership in 2015, Infosys has deployed digital assets for ATP Tour, such as reinventing the ATP PlayerZone intranet portal; launched the ATP fan app; and developed systems integration (SI)-driven features powered by Infosys Topaz in the Infosys ATP Stats Center. Infosys and the ATP are also collaborating on the ATP Carbon Tracker, which monitors and helps offset the carbon footprint of players, supporting the ATP’s goal of achieving net-zero emissions by 2040.

 

Outside of tennis, Infosys is the Official Digital Innovation Partner of Madison Square Garden and the New York Knicks and New York Rangers. In May Infosys announced that it will be the official Digital Innovation Partner for the ABB FIA Formula E World Championship, the global motorsport championship for electric cars, for the next three years. Infosys will deliver in-race analytics, improve fan engagement experiences and enhance sustainability reporting and tracking for Formula E.

 

Additionally, Infosys will develop a new AI-powered Fan Customer Data platform to engage 500 million fans by 2030; provide in-race insights utilizing GenAI capabilities through Infosys Topaz; and implement a sustainability data management tool based on AI to help Formula E reduce carbon emissions by 45% by 2030.

Oracle’s Path to $100B+: Unlocking Growth with Multicloud Strategy

Oracle Is Charting a Path for Unprecedented Growth with Its ‘Infrastructure Anywhere’ Vision

Oracle has one of the most complete, full-stack cloud portfolios, from infrastructure to database to applications. While Oracle Cloud World 2024 covered a sizable landscape, one theme stood out during the four-day event: deployment flexibility. This theme reflects how much Oracle has changed compared to 2016, when Gen2 OCI (Oracle Cloud Infrastructure) launched.

 

With multitenant OCI, Dedicated Regions, Cloud@Customer and Oracle Alloy, a specialized service where customers white label OCI services inside their own data centers, Oracle has quickly emerged as one of the most flexible, delivery-agnostic IaaS vendors on the market. Of course, the other big component of Oracle’s “infrastructure anywhere” vision is multicloud, in which customers can run Oracle databases as native services hosted in the data centers of Oracle’s biggest hyperscaler competitors.

 

Not only does this move reflect a major maturity leap for Oracle, in which Oracle cozies up to its rivals to better address the needs of the customer, but it is also critical to the company’s financial strategy. In addition to giving Oracle the flexibility to allocate more capex dollars toward strategic compute and storage resources as opposed to land and buildings, this strategy will help Oracle get its on-premises database support base to the cloud faster. In doing so, Oracle may forfeit lucrative support and license contracts, but the company reports that for every $1 in lost license and support gross profit it could realize as much as $5 in gross profit in the cloud, which is a testament to how quickly the cloud business is growing.

 

The multicloud strategy is also one of the reasons Oracle awed financial analysts not only by raising its FY26 revenue targets by $1 billion, to $66 billion, but also by setting a FY29 goal of $104 billion. This target, backed by Oracle’s $99 billion RPO (remaining performance obligation) balance, implies an average corporate revenue growth rate of roughly 16% over the next five years. This kind of growth was once unheard of for Oracle, but with cloud now overtaking support as the biggest business, Oracle is a different company, and the OCI growth trajectory instills a degree of optimism in Oracle’s ability to disrupt a highly saturated market in the years to come.
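The growth math is straightforward to sanity-check with a compound annual growth rate calculation. In the sketch below, the $66 billion FY26 target and $104 billion FY29 goal come from the event; the roughly $53 billion FY24 baseline used in the five-year reading is our assumption (Oracle’s reported FY24 revenue), not a figure cited by Oracle.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate implied by growing start_value to end_value."""
    return (end_value / start_value) ** (1 / years) - 1

# Figures cited at the event: a $66B FY26 target and a $104B FY29 goal.
print(f"FY26 -> FY29: {cagr(66, 104, 3):.1%} per year")   # ~16.4%

# Alternative five-year reading from an assumed FY24 baseline of ~$53B
# (Oracle's reported FY24 revenue; the baseline is our assumption).
print(f"FY24 -> FY29: {cagr(53, 104, 5):.1%} per year")   # ~14.4%
```

Either reading lands in the mid-teens, well above the single-digit growth Oracle posted for most of the past decade.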

Announcing Oracle Database@AWS

Based on interactions at Cloud World, it is clear the Oracle Database@AWS announcement was the most noteworthy. In our view, given Oracle already launched Oracle Database@Azure, and more recently Oracle Database@Google Cloud, which is now live in four regions, it was only a question of when, not if, Amazon Web Services (AWS) would partner with Oracle.

 

With this announcement, Oracle officially saved the biggest hyperscaler for last, onboarding all the critical partners it needs to migrate legacy database customers and accelerate cloud revenue growth. In terms of how this alliance will work, it is no different from the approach Oracle takes with Microsoft Azure and Google Cloud; Oracle will deliver the hardware and networking inside AWS data centers so customers can provision Oracle database services natively from the AWS console and have the system run in AWS, just as it would if it were hosted in OCI.

 

The approach of physically embedding OCI within other clouds, as opposed to just bolting Oracle Database onto other infrastructure through a standard interconnection, is important: it will not only give customers the native AWS, Azure and Google Cloud Platform (GCP) experiences they are used to but also limit latency, as the Oracle Exadata hardware is physically colocated with the appropriate hyperscaler.

 

One could argue Oracle is taking a lot of risk with this strategy, as it is essentially bringing customers and their data closer to AWS, Azure and GCP. But in the age of mounting competition, not to mention generative AI (GenAI), it is a risk worth taking. As one customer at a major financial services firm recently told us, “The GenAI decision makers will not be the old world relational database experts,” and these alliances could help ensure Oracle stays relevant in cloud GenAI discussions by making it easier for customers to use the data within Oracle Database for RAG (retrieval augmented generation), to fine-tune foundation models and build new applications using tools many customers are likely already using, like Amazon SageMaker.
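
To make the RAG pattern concrete, below is a minimal, self-contained Python sketch of the general flow: embed the user’s question, retrieve the most relevant records (stand-ins for rows or documents pulled from Oracle Database), and ground the model’s answer in that context. Every function here is a hypothetical stub for illustration; it is not Oracle’s, AWS’ or SageMaker’s actual API.

```python
import math

# --- Hypothetical stubs (stand-ins for a real embedding model, database query and LLM) ---

def embed(text: str) -> list[float]:
    """Toy embedding: character-frequency vector. A real system would call an embedding model."""
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Pretend these rows were fetched from a policy-document table in the customer's database.
DOCUMENTS = [
    "Customers on the premium plan receive 24/7 support and a dedicated account manager.",
    "Standard plan invoices are generated on the first business day of each month.",
    "Data residency for EU customers is guaranteed in the Frankfurt region.",
]

def generate(prompt: str) -> str:
    """Stand-in for a foundation model call (e.g., a model reachable from SageMaker or OCI)."""
    return f"[model response grounded in a prompt of {len(prompt)} characters]"

# --- The RAG flow itself ---

def answer(question: str, top_k: int = 2) -> str:
    q_vec = embed(question)
    ranked = sorted(DOCUMENTS, key=lambda d: cosine(q_vec, embed(d)), reverse=True)
    context = "\n".join(ranked[:top_k])
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)

if __name__ == "__main__":
    print(answer("When are standard plan invoices issued?"))
```

The design point is simply that the retrieval step, not the model, is where the enterprise’s proprietary data enters the conversation, which is why keeping Oracle Database reachable from every major cloud matters.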

 

We should also point out the concept of data gravity. Customers leveraging these multicloud services will still be established Oracle Database customers with some Oracle SaaS presence, and therefore the bulk of their business data gravity will naturally reside within OCI. Those customers may still be inclined to keep their databases within OCI and not extend to other clouds, but with this strategy, Oracle is at least giving them the option to do so. We expect that these multicloud offerings will gain a lot of traction among Oracle Database customers that have big application footprints on other clouds.

Oracle Analytics Is the Glue Between IaaS and SaaS

Analytics, and the ability to turn data into business insight, is the ultimate objective for nearly every organization. With popular tools like Power BI and Tableau as well as neutral data platforms like Snowflake on the market, customers have a lot of choices when crafting the analytics stack.

 

But customers have also made it clear they want to limit the integration burden, and one of the compelling things about Oracle’s approach to analytics is how it can store customers’ operational data from Fusion applications in the Autonomous Data Warehouse (ADW) for analytics as part of a single SKU. This approach, productized as Fusion Data Intelligence (FDI), reinforces the value of Oracle playing in both the SaaS and IaaS markets and its ability to deliver a unified solution.

Evolving the Data Lake Strategy and Competing as a Unified Solution

Access to operational data in the Fusion suite will remain the hallmark differentiator for FDI, but it is on the infrastructure side where Oracle took a big leap forward with the launch of Intelligent Data Lake. Oracle has been elevating its data lake strategy and positioning for some time, but this announcement puts Oracle more squarely into the space.

 

At its core, Intelligent Data Lake is a reworking of existing OCI capabilities, such as cataloging and integration, to create a single abstraction layer that, in true data lake fashion, allows customers to query data on object storage, such as Amazon S3 or Microsoft OneLake, with support for the popular Apache Iceberg and Delta Lake frameworks.
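
As a general illustration of the lakehouse pattern described above, and not Oracle’s specific Intelligent Data Lake interface, the PySpark sketch below queries an Apache Iceberg table sitting in object storage in place. The bucket, catalog and table names are hypothetical, and the Iceberg Spark runtime package is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession

# Hypothetical Iceberg catalog pointing at an object storage warehouse (e.g., an S3 bucket).
spark = (
    SparkSession.builder
    .appName("lakehouse-query-sketch")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

# Query the Iceberg table where it lives -- no copy into a proprietary store required.
orders = spark.sql("""
    SELECT region, SUM(order_total) AS revenue
    FROM lake.sales.orders
    WHERE order_date >= DATE '2024-01-01'
    GROUP BY region
    ORDER BY revenue DESC
""")
orders.show()
```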

 

To be fair, with Fabric and BigLake, respectively, Microsoft and Google Cloud have been making similar advancements with the data lake architecture to better address analytics workloads. However, Oracle is not only adding the simplicity and performance benefits of the data lake but also delivering the architecture in a way that lets customers run the entire data pipeline and keep all the analytics components in a single SKU.

 

With Oracle’s launch of a native Salesforce integration with FDI, which allows customers to combine their CRM and Fusion data within the lakehouse architecture, Oracle’s vision of embedded clouds at the database layer is extending to analytics.

 

Though FDI’s draw will still be primarily with existing Oracle customers, the company is clearly taking steps to help combine Fusion with non-Fusion data and make its platform more relevant within the cloud ecosystem. While FDI may not rip and replace the analytics footprint within any particular account, we could see scenarios where FDI displaces some components of the stack, such as Snowflake at the infrastructure layer or, on the analytics side, Power BI in Microsoft Fabric.

New Applications Are Being Built on the Analytics Stack

In general, scaling the existing platform components of Oracle Analytics is a top priority for the company, but there is another emerging piece of the analytics vision: Intelligent Applications. Coming soon, Oracle will offer applications — People Leader Workbench for HCM and Supply Chain Command Center for SCM — that sit on top of the Fusion system of record, within the FDI platform.

 

This approach should allow Oracle to target a broader set of personas. For example, People Leader Workbench is not necessarily aimed only at the C-Suite but rather at anyone who manages people and can benefit from data-driven insights on their teams and, most notably, can act on that insight by connecting back to the Fusion HCM system of record.

What About GenAI?

GenAI has officially exited the hype cycle and is being widely deployed within the enterprise, but when it comes to analytics, capabilities like dashboarding, semantic models and visualization still take precedence. It is still early, but customer feedback suggests that if data is properly configured and there are guardrails in place, GenAI in analytics has a lot of potential.
 


One of the key announcements at the event was the general availability of Analytics Cloud AI Assistant in Oracle Analytics Cloud (OAC), which is based on a large language model (LLM) so customers can ask questions about their data. Staying in line with the rest of the Oracle strategy, where GenAI is fully embedded into the portfolio and available to customers at no added cost, the analytics assistant will be available to OAC customers for free as part of their existing instances.
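
TBR has not reviewed the assistant’s internals, but the broad pattern behind features that let users ask questions of their data is well established: have a language model draft read-only SQL against an approved schema, then validate the statement before it runs. The Python sketch below shows that guardrail pattern under stated assumptions; ask_llm() is a hypothetical stand-in for whatever model endpoint a given product uses.

```python
import re

ALLOWED_TABLES = {"sales_orders", "customers"}
SCHEMA_HINT = "sales_orders(order_id, customer_id, order_total, order_date); customers(customer_id, region)"

def ask_llm(question: str, schema: str) -> str:
    """Hypothetical model call; returns a canned SQL draft for illustration."""
    return (
        "SELECT c.region, SUM(o.order_total) AS revenue "
        "FROM sales_orders o JOIN customers c ON o.customer_id = c.customer_id "
        "GROUP BY c.region"
    )

def is_safe(sql: str) -> bool:
    """Guardrails: a single read-only statement that touches only approved tables."""
    statement = sql.strip().rstrip(";")
    if ";" in statement or not statement.lower().startswith("select"):
        return False
    referenced = set(re.findall(r"\b(?:from|join)\s+([a-z_][a-z0-9_]*)", statement, re.IGNORECASE))
    return referenced.issubset(ALLOWED_TABLES)

def answer(question: str) -> str:
    sql = ask_llm(question, SCHEMA_HINT)
    if not is_safe(sql):
        return "Query rejected by guardrails."
    # In a real deployment, the validated SQL would now run against the warehouse.
    return f"Would execute: {sql}"

if __name__ == "__main__":
    print(answer("Which region generated the most revenue?"))
```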

Speaking of SaaS and IaaS

From database alliances to the data lake architecture, Oracle has made many calculated moves at the PaaS layer to better compete for strategic workloads. But there are other innovations and key developments in the upper and lower rungs of Oracle’s cloud portfolio.

Oracle Targets Complete End-to-end Process Automation with AI Agents in Fusion Suite

Since it first entered the GenAI game in late 2023, Oracle has stood out in the SaaS market for not upcharging customers for GenAI in their SaaS applications. This reflects Oracle’s play at the IaaS layer with the OCI GenAI Service, which is native to the same infrastructure where all of Oracle’s SaaS applications live.

 

Logically, this approach means that as Oracle’s LLM partners, which host in OCI, push the boundaries of their models, Fusion customers stand to benefit, not just by using GenAI for basic assisted authoring and summarization use cases (e.g., writing a job description in Fusion HCM or summarizing customer calls in CX) but also by actually contextualizing data. In the long term, this could mean reasoning over that data to manage more complex workflows and deliver business recommendations.

 

At this time last year, Oracle announced 50 GenAI use cases in the SaaS suite. This year, the applications team announced the number of use cases has grown to over 100, while there are now more than 50 AI agents within the Fusion suite. This announcement marks a progression in how Oracle is moving from more generic prompt-and-response use cases in Fusion to actual contextualization use cases, by applying LLM-based RAG agents to address specific goals and roles within a particular business function. In Fusion HCM, this could include a benefits analyst agent, offering users the ability to ask questions, such as which health plan features are available, based on the enrollment data contained in Fusion HCM and the health plan document specific to the company.

 

But the most commonly cited example throughout the event was the Document IO agent in Fusion ERP, which can take a picture of a quote in a particular currency, convert the amount into U.S. dollars and automatically create and load a purchase order (PO) within the system. With these AI agents, we see Oracle taking the next big step toward more complete process automation and productivity enhancements within its SaaS portfolio, and ultimately a shift in mindset where the goal is less about delivering an ERP or HCM system and more about completing an end-to-end business process and experience.
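
As an illustration of the kind of pipeline such an agent implies (TBR’s sketch, not Oracle’s implementation), the flow is roughly: extract structured fields from the quote image, convert the amount into U.S. dollars, then create the purchase order in the ERP system of record. Each step below is a hypothetical, runnable stub.

```python
# Illustrative Document IO-style pipeline; every function is a stand-in, not an Oracle API.

FX_TO_USD = {"EUR": 1.09, "GBP": 1.31, "JPY": 0.0069}  # example rates, not live data

def extract_quote(image_path: str) -> dict:
    """Stand-in for a vision/LLM extraction step that reads supplier, amount and currency."""
    return {"supplier": "Example GmbH", "amount": 12500.00, "currency": "EUR", "items": ["valve assembly"]}

def convert_to_usd(amount: float, currency: str) -> float:
    if currency == "USD":
        return amount
    return round(amount * FX_TO_USD[currency], 2)

def create_purchase_order(supplier: str, usd_amount: float, items: list[str]) -> str:
    """Stand-in for posting the PO into the ERP system; returns a fake PO number."""
    return f"PO-{abs(hash((supplier, usd_amount))) % 100000:05d}"

if __name__ == "__main__":
    quote = extract_quote("supplier_quote.jpg")
    usd_total = convert_to_usd(quote["amount"], quote["currency"])
    po_number = create_purchase_order(quote["supplier"], usd_total, quote["items"])
    print(f"Created {po_number} for ${usd_total:,.2f} ({quote['supplier']})")
```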

OCI Strategy Centers on Growing Within the Large Enterprise and Attracting Cloud-natives

Oracle’s ability to offer some of the most flexible cloud delivery methods in the market is the focus of the OCI strategy and road map, led by high-profile partnerships with AWS and others. But Oracle’s strategy is about being agnostic not only to where customers run OCI but also to how they run OCI.

 

For example, at Cloud World Oracle announced Dedicated Region 25, a longtime investment and feat of engineering that consolidates a standard Oracle Cloud region into just three racks, which we saw firsthand on the keynote stage. This configuration extends the value proposition of Dedicated Region, where customers can get the scale and economics of the public cloud inside their own data centers.

 

Dedicated Region 25 could also play a big role in helping Oracle reach new customers. Oracle’s multicloud alliances will undoubtedly be appealing to the large enterprise customer base, but offerings like Dedicated Region 25 could help Oracle attract cloud-native and AI companies looking for a more compact footprint that can still scale to support critical workloads.

Conclusion

Led by its partnership with AWS, Oracle Cloud World 2024 told a story of a maturing business that is turning competitors into partners to better address the needs of the customer. By keeping the lifeblood of the cloud stack, the database, relevant in customers’ cloud transformations, Oracle also ensures it remains competitive in GenAI scenarios, which aligns with the GenAI investments the company is making in other areas of the stack, from analytics to Fusion applications.

 

As the company continues to navigate as a full-stack vendor catering to the existing Oracle base, while simultaneously gaining relevance in the broader cloud ecosystem, there is a lot of potential ahead, and Oracle is well on its way to becoming a $100-plus billion company.

Diversification Into Other Verticals Is Critical to Amdocs Sustaining Long-term Growth

TBR Perspective: Amdocs Must Accelerate Push into Non-Telecom Verticals for Growth and Diversification

Amdocs has made substantial progress on its reinvention, diversifying its customer base, portfolio and business mix while shifting the market perception of the company from a traditional OSS/BSS provider to more of an ICT software transformation specialist. However, most of Amdocs’ transformation thus far pertains to the telecom industry; Amdocs still needs to transition from being a telecom-centric vendor to a multifaceted provider that supports a diversified mix of verticals. The pressure to move in this direction will intensify as the telecom industry’s challenges persist and Amdocs’ organic growth from the industry continues to slow.
 
Amdocs’ current situation is reminiscent of Tech Mahindra’s before it merged with Mahindra Satyam in 2013. Pre-merger, Tech Mahindra was largely viewed as a telecom-only shop and had minimal exposure to other verticals (the company’s revenue split was around 90% telecom and 10% other verticals pre-merger). This specialization helped Tech Mahindra differentiate and compete for business in the telecom vertical but kept it from benefiting from diversification and greater scale.
 
After the Mahindra Satyam merger was completed, Tech Mahindra became a multifaceted ICT services provider, with robust diversification across many verticals. Though TBR is not suggesting Amdocs should or will take a similar approach, Amdocs has already made several acquisitions that bring exposure to nontelecom verticals. However, these acquisitions are relatively small and have not brought transformational changes to the company’s business mix.
 
Amdocs has been involved in nontelecom verticals for at least a couple of decades, and TBR estimates nontelecom revenue currently accounts for approximately 10% of the company’s total revenue. While Amdocs has yet to formalize its foray into nontelecom verticals, TBR notes this is beginning to change as the company makes a stronger push into the financial services vertical, as evidenced by acquisitions (especially Astadia, Projekt202 and Sourced Group) and an increase in dedicated resources to support that vertical.
 
Amdocs is also supporting a variety of brand-forward customers from other verticals, primarily via its Stellar Elements business unit, and is focused on opportunities to help companies in the utilities and media & entertainment verticals with IT and digital transformation.

Impact and Opportunities

Astadia Exposes Amdocs to Mainframe Migration Opportunities

One of Amdocs’ newest acquisitions, Astadia, plays into the nontelecom vertical theme and could serve as a key beachhead to winning more deals with nontelecom customers. Astadia is focused on helping mainframe users migrate to the cloud and has carved out a strong niche in the financial services industry, which is one of the verticals outside of telecom that Amdocs is focusing on. Helping companies migrate off mainframes plays well into Amdocs’ mission-critical transformation value proposition. Amdocs estimates there are 40,000 mainframe computers still in use worldwide by a range of companies and government entities, representing a significant opportunity for net-new business.

Competitor List for Products and Services Broadens for Amdocs

Amdocs’ string of acquisitions and new strategic initiatives, such as the partnership with Microsoft, broadens the scope of companies Amdocs now competes with, from both a products and services standpoint. Historically, Netcracker was Amdocs’ most formidable competitor in terms of portfolio overlap, but that list now includes companies like Salesforce, ServiceNow and Oracle. Meanwhile, on the services side, Amdocs is increasingly crossing paths with traditional C&SI companies, such as Accenture, Tata Consultancy Services and Tech Mahindra.

Amdocs Can Compete (and Win) Against C&SIs like Accenture, Just at Smaller Scale

Amdocs possesses all the capabilities required to drive customer IT and digital transformation, both for and beyond the telecom industry. Though the vendor is less than a tenth of the size of Accenture (which is arguably the benchmark vendor to emulate in the C&SI domain) in metrics such as revenue and headcount, Amdocs can still compete against Accenture and other C&SI firms and win business.
 
Amdocs needs to focus on its specialization in delivering migration and transformation for mission-critical software environments, a skill that is broadly applicable across verticals, as well as its leading KPIs for project completion rates.

There Is More Juice to Squeeze Out of CSPs but Not Much

Amdocs boasts over 400 communication service provider (CSP) logos globally, including most of the top 50 CSPs, and in many of these accounts Amdocs is already the dominant provider in terms of the products it sells. Therefore, squeezing more revenue out of these customers (and/or taking more market share from competitors) will be increasingly challenging as telecom operators chronically struggle amid market maturity and anemic growth prospects, and resort to cost containment and M&A for additional economies of scale.
 
Amdocs is also proactively trying to move further down market, targeting smaller CSPs such as MVNOs and Tier 3 operators to sustain growth. However, this approach is unlikely to move the revenue needle significantly, given the largest CSPs globally account for well over 80% of the total telecom market opportunity.
 
GenAI Remains Exploratory; Automated, Scaled Usage of GenAI in Commercial Environments Is at Least a Year Away

Amdocs is actively exploring how generative AI (GenAI) can be incorporated across domains, both within its own company and for its customers. Thus far, the company is primarily utilizing GenAI internally for code development and focusing on contact center transformation for its customers. Amdocs’ strategic partnership with Microsoft broadly covers AI coinnovation and go-to-market efforts, and the current focus is a joint solution for marketing and sales process automation.
 
Amdocs is also embedding Microsoft Copilot across its broader product portfolio. TBR notes that the GenAI-enabled “virtual agent” and process automation technology Amdocs showcased at the event were compelling and demonstrate a clear path to business value for CSPs.

Learnings From Partnerships with Hyperscalers Provide a Strong Beachhead Into Other Verticals

Amdocs has been learning a lot from its partnerships with Microsoft, Amazon Web Services and Google Cloud, especially as it pertains to implementing cloud migrations of ICT workloads and digital transformation. Specifically, Amdocs has obtained certifications, status and organizational alignment with hyperscalers. The skills and capabilities Amdocs has developed from the telecom ecosystem can be leveraged across other verticals. Solution cocreation also opens new doors for Amdocs, both within telecom and in other verticals.

Amdocs Makes Waves in CRM for Telecom Leveraging Microsoft Partnership

Amdocs has integrated Microsoft Dynamics (CRM) with Amdocs’ Customer Engagement Platform to offer marketing and sales automation solutions to its customers (TBR notes the joint solution, including the GenAI large language model it uses, is customized specifically for the telecom industry via amAIz, Amdocs’ TelcoGPT framework).
 
Microsoft Dynamics is integrated with Microsoft’s other key business productivity applications, such as Outlook, O365 and Teams, and the company’s Copilot is embedded across the stack, bringing customers improved outcomes. The joint Amdocs-Microsoft solution will enable the two companies to compete with incumbent CRM providers, especially Salesforce, Oracle and ServiceNow. TBR notes that the joint CRM solution is differentiated by the power of its GenAI platform, an aspect where incumbent CRM providers are lagging, and could displace incumbent CRM providers from CSP accounts. Deals would draw in Amdocs’ systems integration capabilities as well as other services, yielding larger deal sizes.

Conclusion

Amdocs has been navigating the increasingly challenged telecom market well, but with organic growth slowing, the company will need to seek out and accelerate into other areas for more sustainable, long-term growth. Amdocs’ incremental steps into other verticals, mostly via acquisitions, are moving the company in the right direction, but a larger magnitude shift is required.
 
This aspect of Amdocs’ reinvention would involve formalizing strategic, organizational and portfolio changes that gear the company toward addressing multiple verticals. Doing so would enable Amdocs to expand its total addressable market, diversify its business mix and hedge against downturns in the telecom industry.
 
As a first step toward formalizing its strategy in other verticals, TBR encourages Amdocs to start providing more information about its initiatives outside of telecom, which are known to be significant but remain unquantified and minimally discussed, as this activity will become more important to Amdocs’ business results and growth profile over time.

Ericsson Aims to Accelerate Network API Market Development via New Venture with Leading Global Telcos

TBR Perspective

Ericsson’s Enterprise Wireless Solutions unit is exhibiting strong revenue growth and serves as a bright spot amid the company’s broader challenges. Ericsson has a compelling 5G-related portfolio that addresses the unique needs of enterprises ranging from SMBs to large industrial entities. Ericsson’s focus on enhancing its enterprise portfolio in areas including private cellular networks (PCNs), neutral host networks, fixed wireless access (FWA) and IoT will generate new revenue that will help to partially offset declining consolidated revenue, which is being negatively impacted by most Tier 1 operators decreasing network capex as they enter the later stages of 5G deployments.

 

Ericsson’s Enterprise segment has experienced challenges, however, namely declining revenue within its Global Communications Platform division, which includes Vonage. Ericsson appeared to overpay ($6.2 billion) for Vonage, which fits awkwardly within Ericsson’s historical core business and, when the deal closed in 2022, was primarily considered a down payment on developing a network API business with an unproven business model. Ericsson has essentially confirmed that notion.

 

In October 2023 the company booked an SEK 32 billion ($3 billion) impairment charge on Vonage’s goodwill, writing off half of the acquisition price. The company took a further SEK 11.2 billion ($1.1 billion) noncash charge on the Vonage acquisition in July 2024. TBR believes Ericsson is correcting course, however, by more deeply collaborating with industry partners through its new network API joint venture, which will reduce fragmentation in the market and make it easier for developers to innovate and create new apps and use cases.

 

The joint venture will also give Ericsson a lower-risk approach to tackling the network API opportunity by pooling funding and resources from the partners, as the long-term market size for network APIs is uncertain. Ericsson will need to split proceeds from the joint venture with its partners, however, which will limit long-term revenue potential.

Ericsson Realizes the Need to Collaborate with Industry Partners to Accelerate Network API Development

Under the structure of Ericsson’s new network API joint venture, which currently does not have a formal name and is expected to close in early 2025 pending regulatory approval, Ericsson will hold 50% equity in the venture, with the following telecom operators holding the remaining 50%: America Móvil, AT&T, Bharti Airtel, Deutsche Telekom, Orange, Reliance Jio, Singtel, Telefonica, Telstra, T-Mobile, Verizon and Vodafone.

 

Vonage and Google Cloud will serve as channel partners for the joint venture, providing access to their ecosystems of millions of developers as well as their partners, and additional communication service providers (CSPs) and channel partners will be invited to join the entity in the future (Ericsson would maintain its 50% share in the venture if additional CSPs join). The goal of the joint venture is to create a platform that will provide network APIs to an ecosystem of developers, including hyperscalers, Communications Platform as a Service (CPaaS) providers, systems integrators and independent software vendors. The joint venture will be in alignment with existing industry network API initiatives, including the GSMA’s Open Gateway and the Linux Foundation’s CAMARA Project.

 

TBR believes the main benefit of the joint venture will be incentivizing developers to focus on the network API market by providing them with a simpler way to create apps at scale. For instance, developers currently need to engage with CSPs on a one-on-one basis to procure network APIs, which can be a slow and complex process. The joint venture aims to accelerate market development by providing combined common APIs that can work from any location or network. Reduced fragmentation will also speed market development as developers will be able to more fully concentrate on new use cases and applications rather than spending time modifying existing applications to make them compatible with networks on an operator-by-operator basis.

 

Industry projections for the network API market are wide ranging, with Ericsson citing McKinsey & Co.’s projections that the market will generate around $100 billion to $300 billion in incremental connectivity and edge computing-related revenue for operators by 2030 and that an additional $10 billion to $30 billion in revenue will be generated from the APIs themselves.

 

TBR believes the market size of the segment will mainly hinge on network APIs being able to provide developers with differentiated and compelling capabilities that are distinct from existing 5G capabilities that are available independent of network API access. Enhanced capabilities enabled by network APIs include differentiated connectivity, device-based location, security (e.g., authentication) and network insights.

 

Current primary use cases for network APIs include simplified secure login for devices and advanced network authentication to strengthen fraud prevention. Other main use cases include enabling enhanced location verification and more reliable connectivity to support point-of-sale platforms, as well as optimizing the user experience for entertainment services such as video streaming and gaming applications.
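
To ground what consuming such an API looks like for a developer, the Python sketch below issues a CAMARA-style device location verification request of the kind a point-of-sale or fraud prevention workflow might use. The endpoint, payload fields and authentication are illustrative assumptions; the joint venture has not published its actual interface.

```python
import requests

# Hypothetical aggregator endpoint and OAuth token; placeholders only.
API_BASE = "https://api.example-networkapi-jv.com/location-verification/v1"
ACCESS_TOKEN = "replace-with-oauth-token"

def verify_device_location(phone_number: str, lat: float, lon: float, radius_m: int) -> bool:
    """Ask the network whether a device is within a circle around (lat, lon)."""
    payload = {
        "device": {"phoneNumber": phone_number},
        "area": {
            "areaType": "CIRCLE",
            "center": {"latitude": lat, "longitude": lon},
            "radius": radius_m,
        },
        "maxAge": 60,  # accept a network location fix up to 60 seconds old
    }
    resp = requests.post(
        f"{API_BASE}/verify",
        json=payload,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("verificationResult") == "TRUE"

if __name__ == "__main__":
    # Example: confirm a point-of-sale login attempt is coming from near the store.
    print(verify_device_location("+14155550100", 40.7128, -74.0060, radius_m=2000))
```

The point of the joint venture is that a call like this would work the same way regardless of which founding operator serves the device, rather than requiring a separate integration per CSP.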

 

Ericsson’s joint venture will create competitive pressures for Nokia, which is providing network API solutions via its Network as Code platform. Nokia has at least 14 Network as Code CSP partners as of June and aims to have more than 30 partners by the end of 2024. Nokia may be challenged in meeting this goal, however, due to potential CSP partners possibly being swayed by the ecosystem and benefits provided by Ericsson’s joint venture. Ericsson’s CSP partners are not tied exclusively to the joint venture, however, and have the option to join Nokia’s ecosystem as well.

For IT Services Companies and Consultancies, the New Joint Venture Could Be a Promising Change Agent in the Broader Ecosystem

From the perspective of global IT services companies and consultancies, such as Accenture, Infosys and Deloitte, Ericsson’s event theme, “Capture the value of enterprise 5G,” remained focused on Ericsson’s opportunities with and through telco operators while providing a modest opening for increased go-to-market and alliance activity.

 

Based on the event presentations, sidebar discussions with Ericsson leaders, and TBR’s analysis of Ericsson over the last two decades, we see two opportunities for Ericsson to enhance its ecosystem plays with IT services companies and consultancies that align well with Ericsson’s overall strategy.

 

First, TBR’s recent Voice of the Partner research shows that cloud and software vendors, OEMs, and IT services companies see 5G as a promising source of near-term growth, nearly on par with generative AI. To address their enterprise clients’ growing 5G needs, IT services companies and consultancies will need closer alliances with incumbent telcos and OEMs, including Ericsson. IT services companies and consultancies will not try to sell their own connectivity solutions but will readily partner to bring those solutions to their enterprise clients if informed, aligned and incented, particularly if the five-to-eight-times revenue multiplier applies to services attached to Ericsson’s hardware.

 

Second, TBR’s ecosystem reports, which cover a dozen leading global IT services companies’ relationships with Amazon Web Services (AWS), Google Cloud, Microsoft Azure, Adobe and Salesforce, confirm that scale remains a key differentiating characteristic, both for alliances managers across the ecosystem and for enterprise clients looking for multiparty, well-orchestrated technology solutions. Ericsson’s joint venture with Google and the 12 operators could be a highly appealing alliance partner, bringing IT services companies and consultancies into contact with new personas within their enterprise clients and creating an expanded playing field for professional and managed services companies. In short, Ericsson’s new joint venture could be an ecosystem catalyst, provided it finds a go-to-market focus and well-led partnerships with the right IT services companies and consultancies.

Ericsson Launches Private 5G and Neutral Host Network Solutions Under its Ericsson Enterprise 5G Segment

At Ericsson Enterprise Industry Analyst Day in September, Ericsson reintroduced its Ericsson Enterprise 5G portfolio, which includes three solutions:

 

  • Ericsson Private 5G: A converged LTE/5G PCN solution with industry and licensed spectrum support
  • Ericsson Private 5G Compact: A U.S. CBRS-based solution designed for enterprises requiring connectivity that is more reliable than Wi-Fi. The solution was previously branded as Cradlepoint NetCloud Private Networks.
  • Ericsson Enterprise 5G Coverage: A turnkey neutral host solution that features certification from all Tier 1 U.S. operators. The solution can support up to three carriers per radio.

 

The relaunch of the Ericsson Enterprise 5G portfolio, in addition to the legacy Cradlepoint business now branded under this segment, will help Ericsson strengthen its messaging within the PCN market and better compete against Nokia, which TBR estimates is the second-largest PCN vendor by revenue globally (behind Huawei) and the largest when excluding China.

 

Ericsson Enterprise 5G Coverage is certified by AT&T, T-Mobile and Verizon, which will be a significant benefit as Ericsson aims to gain headway within the neutral host networks market. Neutral host networks are gradually gaining traction as they are easier to deploy compared to legacy distributed antenna systems (DAS) and can provide significant cost savings as they enable a single neutral host network to support customers from multiple operators without requiring each operator to deploy its own separate infrastructure.

 

Industrial sites, schools and hospitals are the primary locations where neutral host networks are initially being deployed, and Ericsson’s early customers for the solution include Toyota Forklifts in Indiana and engine manufacturer Cummins in New York.

Conclusion

TBR believes Ericsson is effectively positioning to capitalize on 5G-based solutions within the telecom enterprise space, including network APIs, PCNs and neutral host networks. Ericsson is aware that industry collaboration is essential for these segments to reach their peak potential, evidenced by the vendor’s initiatives including the formation of the network API joint venture and gaining certification from AT&T, T-Mobile and Verizon for its neutral host network solution.

 

Ericsson’s success in areas including network APIs, PCN and multi-access edge computing will be impacted by coopetition from hyperscalers within these segments. Though Ericsson has established partnerships with AWS, Google Cloud and Microsoft Azure within multiple portfolio segments, the company’s revenue opportunities will be limited as hyperscalers take a portion of revenue from enterprise deployments.

Nokia’s Fixed Networks Unit Poised for Long-term Growth Despite Market Challenges

TBR Perspective: Nokia’s Fixed Networks Business Unit

Nokia is the largest vendor of fixed network access infrastructure by revenue in the Western economic bloc, a position of strength that exposes the vendor to a range of opportunities that arise in the market. While Nokia remains focused on its fiber-based platform, the vendor is also supporting fixed-wireless access (FWA), which is a rapidly growing service offering in the telecom industry.
Though revenue in Nokia’s Fixed Networks business unit has been uneven over the past few years (primarily due to the disruptions caused by the COVID-19 pandemic), the unit is poised to be one of the biggest beneficiaries of government-supported broadband programs and ongoing internet service provider (ISP) investment in high-speed broadband access technologies, driving a positive revenue trend over at least the next three to five years.
 
Nokia is focused on expanding access to broadband (through fiber and/or FWA) and introducing a future-proof platform for ISPs to build upon. The company is trying to be everything to everyone in this domain by providing a near complete portfolio (only DOCSIS is missing).
 
Despite Nokia’s favorable market position and government-induced tailwinds for the broadband infrastructure domain, TBR notes that the supply-and-demand dynamics as well as the timing of investments are prone to be disjointed, lengthening the time required to meet infrastructure deployment objectives compared to what was originally expected by the government and the telecom industry.
 
Additionally, TBR remains steadfast in its belief that building fiber out to every household is not economically feasible (despite what the government and stakeholders in the market say they want) and that alternative broadband access technologies (such as FWA and satellite) will account for a growing share of the global mix to connect the world’s unconnected and underserved populations.

Impact and Opportunities for Nokia

BEAD Program Will Likely Stretch to the Mid-2030s Due to Challenges and Delays

Broadband Equity, Access, and Deployment (BEAD) Program-supported projects are now slated to begin deployments in 2025, more than a year later than originally planned. There is a long list of reasons (most of which are related to mapping integrity and political processes) why the program has been delayed thus far, and there is a growing list of reasons that suggest it will take longer for the program to fully ramp up and complete its objective (i.e., spend all of the $42.5 billion allocated to the program).
Among the biggest challenges that lie ahead for the BEAD Program are shortages of skilled labor (e.g., fiber splicers and trenching machine operators) and of industrial equipment, such as boring machines, that will be required to deploy fiber to an estimated 5.5 million households across the U.S. Shortages of products that meet the Build America Buy America (BABA) requirements associated with the BEAD Program could also cause a timing and supply issue.
 
Taken together, TBR now believes the deployments tied to the BEAD Program will begin next year and that it could take until the mid-2030s for all of the program’s funding to be disbursed, more than five years longer than the government and market ecosystem originally anticipated. Nokia is doing as much as it can to mitigate and alleviate these potential challenges in the market.
 
For example, Nokia is proactively educating stakeholders in the ecosystem and working with its partners to better match supply with demand for products and resources. This orchestration of the ecosystem will help align stakeholders and enable the industry to put its best foot forward in carrying out this infrastructure build-out program as well as position Nokia to maintain and grow its leading share in the broadband infrastructure market.

Do Not Forget About Non-BEAD Government Programs for Broadband

Though the telecom industry likes to focus on the BEAD Program (likely because it is the largest program by dollar amount in the broadband ecosystem in the U.S. market), there are a variety of other government-supported programs that also deal with broadband, including the American Rescue Plan Act (ARPA), the Rural Digital Opportunity Fund (RDOF), the U.S. Department of the Treasury’s Capital Projects Fund, the Tribal Broadband Connectivity Program, and the U.S. Department of Agriculture’s ReConnect Loan and Grant Program.
 
In aggregate, TBR estimates there is more than $80 billion in direct and indirect government stimulus allocated for broadband-related projects in the U.S. market alone, all of which is slated to be spent by the mid-2030s. There are also a few hundred billion dollars in aggregate in similar broadband-implicated programs in other regions, most notably in China, the European Union, the U.K. and Australia.

Fiber Access Technology Capabilities Exceed Usability, Creating a Conundrum for Vendors

Technological innovations pertaining to fiber access have become so advanced and the bandwidth available through fiber access so massive that the capabilities of the technology far exceed what most end customers could possibly need or use. This disconnect creates a conundrum for vendors such as Nokia that supply the broadband infrastructure market.
 
Though fiber broadband infrastructure is, and will remain, in high demand, most ISPs will be loath to adopt the most cutting-edge technologies because they far exceed what customers would need and put unnecessary additional cost burden on the operator.
 
There are exceptions, such as what Google Fiber and Frontier Communications are deploying (50G and 100G connections, respectively), but TBR believes most ISPs will focus on 10G or lower connections, which provide more than enough bandwidth for the vast majority of households and businesses and are likely to remain future-proof for many years to come.

Overbuilding and One-upmanship Risks New Price War for High-speed Internet Service

The government funding boost, coupled with technological advancements and new entrants into the ISP domain, is creating a situation that is ripe for a price war for broadband services. Specifically, many more markets across the U.S. are likely to have three or more (in some cases up to seven) providers of high-speed broadband service in a given area, including xDSL, FTTx, HFC (via DOCSIS) as well as FWA and satellite (mostly delivered via low Earth orbit [LEO] satellites).
 
Given that a provider typically needs to have more than 30% market share in a given area to achieve profitability in the broadband services market, an increasing number of options puts more power into the hands of end users, which historically suggests the pricing environment will be extremely competitive.
 
In response to the hotter competitive environment, providers that are multiservice-oriented are trying to attract and lock in market share by offering converged (aka bundled) solutions, usually giving end users a discount as an incentive to sign up and stay.
 
Additionally, TBR notes that ISPs are increasingly engaging in one-upmanship (which is also a symptom of the existence of too many options in a given market), meaning ISPs are marketing ever higher broadband speeds to customers to position their offerings as better than the competition while attempting to incrementally increase average revenue per user.
 
Though this strategy has been effective in years past, it is likely to lose efficacy after speeds surpass the level at which the benefits of faster speeds become imperceptible to end users. Therefore, in aggregate, TBR expects the pricing environment in the U.S. for broadband service to be increasingly competitive through at least the remainder of this decade.

Private Equity Comes into the Fixed Broadband Market

Private equity firms are entering the telecom infrastructure market in a big way, gobbling up assets and forging joint ventures with telcos that want to (or need to) raise capital and hedge their risks. Some private equity-sponsored entities are also now building out their own greenfield fiber-based networks (such as Brookfield Infrastructure Partners’ Intrepid Fiber Networks) and are even moving the market toward wholesale, shared and other forms of open-access models.
 
Though the entrance of private equity into the broadband infrastructure domain is bringing large pools of fresh capital into the market, this trend also risks fueling overinvestment, price compression and disruption of incumbent ISPs’ business models. Regardless, TBR expects private equity to remain attracted to assets that offer consistent cash flow over a long duration, and its inclusion in the telecom ecosystem is likely a net positive for overall market development and evolution.

Existing Government Stimulus May Still Not Be Enough for FTTP; Alternatives Will Likely Be Called on at Scale to Fill in the Gaps

Though governments (and most of the stakeholders in the telecom ecosystem) across the world want full fiber to every premises, this is still not economically feasible. For example, it is not uncommon for some locations in the U.S. to cost upward of $1 million per premises to connect with fiber, a price that will be politically difficult to justify and that is not supported by normal market conditions. In these extreme situations, it is highly likely that governments will allow and embrace alternatives, such as FWA and satellite-based connectivity.
 
TBR notes that FWA and LEO constellations can easily deliver sustained speeds in excess of 100Mbps at a fraction of what it would cost to deploy fiber to the premises (FTTP). With that said, of the estimated 5.5 million households the government has identified as needing a broadband connection in the U.S., TBR would not be surprised if up to 25% are ultimately connected via FWA or satellite (enhancements to DOCSIS and xDSL are also potential options to close the underserved gap). In other countries, that percentage could be even higher.

New Business Models Hold Promise to Connect Low-income Households in Emerging Markets

Upstart ISPs, such as fibertime and Vulacoin in South Africa, have established innovative solutions to cost-effectively provide high-speed broadband services to low-income areas. The network architecture emphasizes FWA and Wi-Fi with a relatively low amount of fiber, and the business model is focused on selling units of time (in minutes), which is more affordable for lower-income end users.
 
TBR notes this model requires scale and a high time of use to achieve profitability, meaning it is best suited for dense areas, especially impoverished neighborhoods. TBR also notes that access to high-speed internet is a key avenue by which areas can strengthen their local economies and help reduce levels of poverty.
 
In addition to South Africa, Brazil is also exploring the use of this model. This approach is also likely to be leveraged in other parts of Africa as well as in parts of India and Southeast Asia.

Conclusion

Government and private equity involvement in the broadband market may prove to be a mixed blessing. Though there are concerning indicators suggesting there are too many broadband providers in some key markets (especially the U.S.) and that broadband access businesses are becoming overvalued, these market dynamics actually represent tailwinds for Nokia, which is best positioned to garner a disproportionate amount of the investment slated for broadband infrastructure in the Western economic bloc, encompassing North America, Europe, developed APAC and select developing markets such as India.
 
Nokia’s outsized and unique position in the broadband infrastructure ecosystem enables the company to play a key role in orchestrating partners and customers to achieve their objectives in the most optimal way possible. Fiber will remain the coveted access medium for high-speed broadband, but the world will also employ other broadband access mediums to a large extent.
 
New ISP and hyperscaler business models, coupled with sustained investments by incumbent ISPs and supported by government stimulus, create an environment ripe for moving the world closer to full broadband coverage for all people.

Atos Powers 2024 Paris Olympics and Paralympics with Cutting-edge IT and AI Solutions

Atos, the worldwide IT partner for the Summer and Winter Olympic and Paralympic Games, invited a group of industry analysts to the 2024 Paris Olympics. The goal of the event was to show Atos in action during the Games with a tour of the Technology Operations Center in Paris, one of the three locations responsible for delivering IT services and keeping the Games running. The analysts also attended a swimming competition at Paris La Defense Arena to see firsthand the secure, digital experience Atos and its partners provide in running the IT systems behind the Games.

The Olympics Must Run Flawlessly; There Are No Second Chances

Atos utilized its well-established expertise in the sports and entertainment industry to provide IT services for the 2024 Paris Olympics and Paralympics and enable a secure, digital experience for end users, who typically number approximately 4 billion viewers globally. Atos has been providing services for the Olympic Movement since 1989. Atos established its relationship with the International Olympic Committee (IOC) as a Worldwide IT Partner in 2001 and provided IT services for its first Winter Olympics under that partnership in Salt Lake City in 2002. Providing uninterrupted running of the IT systems behind the Olympics every two years requires dedication and strict execution of processes and timelines.

 

According to Angels Martin, general manager of Olympics at Atos, “Olympics challenges are similar to other projects; the difference is visibility [of the Games]. No one will postpone the opening ceremony because Atos is not ready.” Martin also explained that cybersecurity management is a vital activity Atos provides, as the Games are one of the most targeted events in terms of cyberattacks, which could threaten the smooth functioning of the Olympics. She also stated that the Games are complex to manage, with multiple parties, such as the IOC, sports federations, broadcasters and journalists, requiring services and access to information 24/7 from anywhere on any device. Martin also noted that demand for information has changed significantly since the first engagement 30 years ago, and today Atos is applying AI-driven solutions to enable processes for the Games. For example, for the 2024 Paris Olympics Atos used AI to support the Organising Committees for the Olympic Games in providing scenarios for matching volunteers with job positions based on skills and abilities, and at the 2020 Tokyo Olympics Atos provided an AI-based facial recognition solution for venue access tied to accreditation.

Atos Integrates Critical IT Systems and Manages Partners to Run the Games

Atos is responsible for integrating critical IT systems, managing programs with IT vendors that deliver services for the Organising Committees for the Olympic Games, supporting critical applications for the Games and providing security services to enable smooth and uninterrupted running of the Games. For example, for the 2024 Paris Olympics and Paralympics Atos operated the Olympic Management System, which included a volunteer portal, a workforce management system, athlete voting applications, sport entries and qualifications, competition schedule and accreditation. Atos was responsible for the Olympic Diffusion System, which contained Olympic data feed, web results, mobile apps for results, a Commentator Information System, an information system for journalists called MyInfo, and a print distribution system. Atos was also responsible for cloud orchestration between private cloud, public cloud services and data centers at venues.

 

Additionally, Atos applied its expertise around working with a diverse group of technology partners to help run the Games and provided systems integration of applications with other IT providers and partners. Atos integrated partners, such as technology providers, media, the IOC, Organising Committees for the Olympic Games, and security providers, to ensure efficient delivery, operations, timelines and venue management activities. Atos also helped coordinate responses on daily activities and addressed critical events when they occurred. For example, Atos worked with Omega, the timing and scoring sponsor of the 2024 Paris Olympics, to relay results and data to spectators globally in real time. Omega captured raw data around timing and scoring, fed the results into scoreboards and videoboards at venues jointly with Panasonic, and provided data to Atos to feed into the Commentator Information System.

Atos’ Olympics and Paralympics Achievements

Achievements from the 2020 Tokyo Olympics and the 2024 Paris Olympics show the magnitude of work Atos provides. There are approximately 900 events that Atos has to manage to be able to transmit results instantly from competition and noncompetition venues. The company utilized the volunteer portal to process 200,000 volunteer applications prior to the 2020 Tokyo Olympics, and the number of volunteer applications swelled to 300,000 for the 2024 Paris Olympics. According to Atos, one of the most complex activities around managing people for the Olympic and Paralympic Games is assigning volunteers to the large number of necessary positions. For the 2024 Paris Olympics and Paralympics, Atos improved the volunteer assignment process by implementing an optimized pre-assignment scenario model and an AI-based solution that utilized constraint logic programming to improve position matchups. At the 2020 Tokyo Olympics Atos issued 535,000 accreditations through the system and established 350 accreditation checkpoints with facial recognition in all competition and noncompetition venues. Additionally, cloud usage enabled Atos to reduce the number of physical servers at the 2020 Tokyo Olympics by 50% and improve sustainability.
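
Atos has not published the model it used, but the underlying technique, matching volunteers to positions subject to hard constraints, can be sketched with an off-the-shelf constraint solver. The toy example below uses Google OR-Tools’ CP-SAT solver purely for illustration; the names, skills and positions are invented.

```python
from ortools.sat.python import cp_model

# Toy data: volunteer skills and the skill each position requires (illustrative only).
volunteers = {"Ana": {"languages", "first_aid"}, "Bo": {"driving"}, "Chen": {"languages"}}
positions = {"info_desk": "languages", "shuttle": "driving", "medical_tent": "first_aid"}

model = cp_model.CpModel()
assign = {(v, p): model.NewBoolVar(f"{v}_{p}") for v in volunteers for p in positions}

# Each position gets exactly one volunteer; each volunteer takes at most one position.
for p in positions:
    model.AddExactlyOne(assign[v, p] for v in volunteers)
for v in volunteers:
    model.AddAtMostOne(assign[v, p] for p in positions)

# Hard constraint: a volunteer may only fill a position whose required skill they have.
for (v, p), var in assign.items():
    if positions[p] not in volunteers[v]:
        model.Add(var == 0)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for (v, p), var in assign.items():
        if solver.Value(var):
            print(f"{v} -> {p}")
```

A production model would add preferences, shift schedules and fairness objectives, but the same formulation scales to the hundreds of thousands of applications Atos describes.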

Every Two Years Atos Organizes Upcoming Games

Typically, pre-project activities for each Olympic Games begin six years prior to the event. For example, pre-project activities for the 2024 Paris Olympics and Paralympics began in 2018, and planning began in 2020 with the development of a master plan and strategy and related responsibilities matrix. In November 2020 Atos appointed the first core team for the 2024 Paris Olympics and Paralympics. In 2021 Atos began designing business requirements and systems infrastructure and established a test lab, and in 2022 the company initiated the building of systems and expanded the testing facility. In June 2023 Atos launched testing activities such as integration tests, acceptance tests, systems tests, events tests and multisport tests to prepare for operating the Games in 2024. During the first several months of 2024, Atos worked on venue deployment, disaster recovery and technical rehearsals.

 

For example, between May 13 and May 17 Atos completed the final technology rehearsal for the 2024 Paris Olympics and Paralympics. The rehearsals, which took place across different locations in Paris and other sites of the Olympic and Paralympic Games, were designed to test IT policies and procedures and how well IT teams can collaborate and handle real-time situations that may impact the Games. Atos is the IT integration leader and coordinates with the Organising Committee for the Olympic Games and with experts and technology partners. The technology rehearsals were conducted in 39 venues, including Atos’ Central Technology Operations Center in Barcelona, Spain, and venues specific to the Games, such as Atos’ Technology Operations Center in Paris, the Main Press Center, The Stade de France and competition venues.

 

The Olympic Games resemble a large-scale international corporation, mobilizing approximately 300,000 people for the duration of the Games. Atos provides IT services with teams located in the host city and in Atos’ facilities in Poland, Morocco and Spain, and serves competition results to more than 4 billion viewers globally. While Atos must set up a new organization for each Summer and Winter Games every two years, the company has a well-established process and experience with starting over again. Every two years Atos establishes a Technology Operations Center (TOC) in the host city of the Summer and Winter Games. The TOC is the technology command and control center that houses teams from Atos, the IOC, the Organising Committees for the Olympic Games and other technology partners. The TOC consists of approximately 300 people who are coordinated by Atos and available 24/7 while the Olympics and Paralympics are running. Atos also has a Central Technology Operations Center (CTOC) in Barcelona, which is organized in a similar manner as the TOC in the host city. The CTOC, which consists of approximately 80 people who provide services around operations, architecture, security, infrastructure and data management, delivers remote support during competitions and critical events, such as the volunteer campaigns, and orchestrates applications for the Games. Atos also has an Integration Testing Lab in Madrid that manages system testing for the Games.

 

Atos Adds New Clients in the Sports and Entertainment Industry

Atos’ engagement with the IOC ends with the 2024 Paris Olympics and Paralympics. However, Atos has been expanding its client roster in the sports and entertainment industry, applying its vast experience gained from the Olympics. In December 2022 Atos signed an eight-year deal with the Union of European Football Associations (UEFA) to be the official technology partner for men’s national team competitions. Atos is assisting UEFA in managing, improving and optimizing its technology landscape and operations. Atos is also managing and securing the hybrid cloud environment and infrastructure that hosts UEFA’s services, applications and data. In July Atos announced that it had successfully delivered key IT services and applications supporting the UEFA EURO 2024 from June 14 to July 14. Atos supported UEFA systems such as accreditation, access control solutions and competition solutions. Atos managed core IT systems through its football service platform and stored and distributed UEFA football data to stakeholders. Atos is the official IT partner of UEFA National Team Football until 2030.

 

Conclusion

Atos has a well-established position and history of operating in the sports and entertainment industry. Expanding its client roster with organizations such as UEFA will help the company maintain its reputation as a reliable IT services provider and innovation partner for major events. Enabling the running of complex events such as the Summer and Winter Olympic Games and the UEFA EURO 2024 championship provides global visibility of Atos’ capabilities and brand and enables the company to augment its client base in the industry.

Investing Big in GenAI Today: The Key to Unlocking Massive Long-term Returns

GenAI requires massive investment now for a chance at massive long-term returns

For most new technologies and trends in the IT space, actual business momentum and revenue generation typically take years to develop. In fact, in many cases, particularly with new technologies available to consumers, monetization may never develop, as the expectation of free trials or advertising-led revenue streams never leads to sustainable business models.

 

The history around monetizing new technologies is what makes the rise of generative AI (GenAI) over the past 18 months so notable. In such a short period of time, we have tangible evidence from some of the largest IT vendors that billions of dollars in revenue have already been generated in the space, with the expectation that even more opportunity will develop in the coming years.

 

AI and GenAI revenue streams have not come without investment, however, as the infrastructure required to enable the new technology has been significant. The three major hyperscale cloud providers have borne the brunt of this required investment, outlaying billions of dollars to build out data centers, upgrade networking and install high-performance GPU-based servers. Amazon Web Services (AWS), Microsoft, Google and other cloud platform providers were already spending tens of billions annually to maintain and expand their cloud service offerings, and GenAI adds significantly to that investment burden.

 

The early revenue growth resulting from GenAI offerings has been promising, but put in the context of the increased investment required, it becomes clear that the business impacts of the technology will play out over an extended time period. Most public companies execute quarterly, plan annually and, as a stretch, project their expectations out over three to five years.
 
The impact of GenAI extends even further, as Microsoft CFO Amy Hood stated on the company’s fiscal 4Q24 earnings call: “Cloud and AI-related spend represents nearly all of our total capital expenditures. Within that, roughly half is for infrastructure needs where we continue to build and lease data centers that will support monetization over the next 15 years and beyond.” That means not only that Microsoft spent $19 billion on capital expenditures during a single quarter to support cloud and AI but also that the time horizon for the returns on that investment stretches beyond a decade.

 

Microsoft is, in this way, representative of its cloud platform peers: investing enormous amounts of capital now to realize modest new revenue streams in the short term while anticipating significant revenue opportunity over the next two decades.

AI & GenAI versus Capital Expenditures (Amazon Web Services, Microsoft, Google and Oracle)

AI-related revenue is already considerable, with growth expected to persist

TBR estimates the four leading cloud platform vendors generated more than $12 billion in revenue from AI and GenAI services in 2023, which is in and of itself a sizable market. On top of that, we expect revenue from those four vendors to increase by 71% during 2024.
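As a quick back-of-the-envelope illustration (a sketch based on TBR's estimates above, not a formal forecast), that growth rate implies roughly $20 billion or more in AI and GenAI revenue from those four vendors in 2024:

```python
# Sketch: implied 2024 AI and GenAI revenue for the four leading cloud platform vendors,
# based on TBR's 2023 estimate and the expected 2024 growth rate cited above.

revenue_2023 = 12.0   # $B; TBR's combined estimate is "more than $12B," so treat this as a floor
growth_2024 = 0.71    # 71% expected growth during 2024

revenue_2024 = revenue_2023 * (1 + growth_2024)
print(f"Implied 2024 AI/GenAI revenue: ~${revenue_2024:.1f}B")  # roughly $20.5B or more
```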

 

A market of that scale and growth trajectory is notable in an IT environment where much more modest growth is the norm. While we expect growth to gradually slow and normalize over the coming years, the AI and GenAI markets remain attractive nonetheless. Below are examples from some of the largest monetizers of GenAI so far, with estimates of the current size of their respective businesses and insights into how they are monetizing.

 

Microsoft (estimated $10 billion in GenAI revenue annually): While Microsoft did not quite meet Wall Street’s lofty expectations for AI-related revenue growth, the company posted a solid quarter in 2Q24. In TBR’s opinion, Microsoft’s GenAI strategy is on the right track, and its financial results align closely with our expectations. In 2Q24 Azure AI services contributed 8 percentage points of Azure’s 29% year-to-year growth, while Copilot was cited as a growth driver for Office 365.
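For perspective, the short sketch below decomposes Azure's reported growth using the disclosed figures; it assumes the 8 points are percentage points of the 29% year-to-year growth rate, consistent with how Microsoft reports the metric:

```python
# Sketch: share of Azure's 2Q24 growth attributable to AI services.
# Assumes the disclosed "8 points" are percentage points of the 29% year-to-year growth rate.

azure_growth_pts = 29.0    # Azure year-to-year growth, in percentage points
ai_contribution_pts = 8.0  # portion of that growth attributed to Azure AI services

ai_share_of_growth = ai_contribution_pts / azure_growth_pts
non_ai_growth_pts = azure_growth_pts - ai_contribution_pts

print(f"AI services drove ~{ai_share_of_growth:.0%} of Azure's growth")       # ~28%
print(f"Non-AI Azure growth: ~{non_ai_growth_pts:.0f} points year-to-year")   # ~21 points
```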

 

Nevertheless, with Office 365 revenue growth decelerating compared to past quarters, it is clear the monetization of GenAI will take time to materialize. Still, Microsoft’s current capex spend and capex forecast show the company is committed to its AI strategy. Management stated that nearly all of the quarter’s $19 billion in capital expenditures was focused on the cloud business, with roughly half going toward data center construction and the other half used to procure infrastructure components like GPUs.

 

This hefty commitment indicates that GenAI will remain at the forefront of Microsoft’s product development, go-to-market and partner strategies for years to come as the company looks to turn an early lead into an established position atop the AI and GenAI market.

 

AWS (estimated $2.5 billion in GenAI revenue annually): During AWS’ New York City Summit event in July, Matt Wood, the company’s VP of AI Products, noted that GenAI had already become a multibillion-dollar business for the company. Amazon CEO Andy Jassy has also spoken confidently about the future of AI, publicly proclaiming the company’s belief that GenAI would grow to generate tens of billions in revenue in the coming years.

 

Notably, AWS was active in AI infrastructure well before the GenAI hype cycle, with custom chip lines for both training and inference (Trainium and Inferentia, respectively). While customers are unlikely to undertake the daunting task of migrating existing workloads off industry-standard hardware, these custom offerings can be a more cost-effective home for net-new workloads, which is one of the reasons they hold so much GenAI potential.

 

AWS’ custom offerings, coupled with tools customers use to build and fine-tune models, such as Bedrock and SageMaker, will continue to spin the EC2 meter. AWS also has more direct GenAI monetization plans, with a two-tiered pricing model for Amazon Q Business and Q Developer. However, it is still early days for these offerings, and with Microsoft Copilot in the mix, at least from the line-of-business (LOB) perspective, AWS clearly faces an uphill battle.
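To illustrate how a two-tiered, per-seat subscription model of this kind translates into customer spend, the sketch below uses hypothetical tier names and per-user prices; these are placeholders only, not AWS list prices, and current Amazon Q pricing should be checked directly:

```python
# Illustrative sketch of how a two-tiered, per-user subscription model translates into
# monthly spend. Tier names and prices below are placeholders for illustration only,
# not AWS list prices; consult current Amazon Q pricing before drawing conclusions.

ASSUMED_PRICES = {            # $ per user per month (hypothetical)
    "q_business_lite": 3,
    "q_business_pro": 20,
    "q_developer_pro": 19,
}

def monthly_cost(seats: dict) -> float:
    """Total monthly subscription cost for a mix of tiers (seats keyed by tier name)."""
    return sum(ASSUMED_PRICES[tier] * count for tier, count in seats.items())

# Example: a midsize enterprise mixing lightweight business users and power users
example_seats = {"q_business_lite": 2000, "q_business_pro": 500, "q_developer_pro": 300}
print(f"Estimated monthly spend: ${monthly_cost(example_seats):,.0f}")  # $21,700 under these assumptions
```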

 

Google Cloud (estimated $2 billion in GenAI revenue annually): Unlike some of its peers in the industry, Alphabet has not clearly quantified the impact that GenAI is having on Google Cloud’s top line. However, on Alphabet’s recent earnings call, executives said that GenAI solutions have generated billions of dollars year to date and are used by “the majority” of Google Cloud’s top 100 customers.

 

These results, coupled with a 40-basis-point acceleration in Google Cloud’s 2Q24 revenue growth rate, to 28.8% from roughly 28.4% the prior quarter, signal that while GenAI is having an impact on Google Cloud Platform (GCP) revenue growth, it is still very early days. The steps Google Cloud is taking to boost developer mindshare, with over 2 million developers using its GenAI solutions, and to align with global systems integrator (GSI) partners to unlock new use cases leave us confident Google Cloud can more aggressively vie for GenAI spend through 2025.

 

ServiceNow (less than $100 million in GenAI revenue annually): With Now Assist net-new annual contract value (NNACV) doubling from last quarter, ServiceNow’s steady momentum selling GenAI to the enterprise continues. Now Assist was included in 11 deals over $1 million in annual contract value (ACV) in 2Q24, showing positive early signs that the strategy of packaging premium digital workflow products based on domain-specific large language models (LLMs) is resonating.

 

At 45%, ServiceNow’s Pro SKU penetration rate, which represents the percentage of customer accounts on Pro or Enterprise editions of IT Service Management (ITSM), HR Service Delivery (HRSD) and Customer Service Management (CSM) products, is already very strong. Upgrading these already premium customers to Pro Plus SKUs with GenAI, for which ServiceNow has realized a 30% price uplift, could represent an opportunity valued at well over $1 billion. Naturally, a big focus is expanding the availability of Pro Plus beyond the core workflow products.
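A back-of-the-envelope sketch helps ground the $1 billion-plus figure: given the 30% realized uplift, it shows how much premium ACV would need to move to Pro Plus at various upgrade rates. The upgrade rates and the framing are illustrative assumptions, not ServiceNow disclosures:

```python
# Back-of-the-envelope sketch: how large a premium (Pro/Enterprise) ACV base would need
# to upgrade to Pro Plus for the GenAI upsell to exceed $1B, given the 30% price uplift
# cited by ServiceNow. The upgrade rates are hypothetical assumptions for illustration.

price_uplift = 0.30            # realized Pro Plus price uplift (from the text)
target_opportunity = 1.0e9     # $1B opportunity threshold cited in TBR's analysis

for upgrade_rate in (0.25, 0.50, 0.75):   # hypothetical share of premium ACV moving to Pro Plus
    required_base_acv = target_opportunity / (upgrade_rate * price_uplift)
    print(f"At a {upgrade_rate:.0%} upgrade rate, ~${required_base_acv/1e9:.1f}B of premium ACV "
          f"must upgrade to clear ${target_opportunity/1e9:.0f}B in incremental ACV")
# Output: ~$13.3B, ~$6.7B and ~$4.4B of upgrading premium ACV, respectively.
```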

 

IBM (less than $2 billion in GenAI revenue annually): Approximately 75% of IBM’s reported $2 billion GenAI book of business to date stems from services signings, and IBM lands nearly all watsonx deals through Consulting. Companies need help getting started with GenAI in the cloud, and IBM’s ability to lead with Consulting and go to market as both a technology and a consulting organization will continue to prove unique in the GenAI wave.

 

On the software side, the legacy of the Watson brand and decisions about how directly to compete with peers have been obstacles, but IBM is now strategically pivoting around the middleware layer, hoping to act as a GenAI orchestrator that helps customers build and run AI models in a hybrid fashion. This pivot has resulted in a series of close-to-the-box investments, including Red Hat’s InstructLab project, which allows customers to fine-tune and customize Granite models, and IBM Concert for application management.

 

According to IBM, these types of GenAI assets have contributed roughly $0.5 billion to IBM’s AI book of business. By adopting a strategy to embed its AI infrastructure software into the cloud ecosystem of GenAI tools and copilots already widely accepted by customers, IBM ensures it stays relevant with these cutting-edge workloads.

 

Oracle (less than $100 million in GenAI revenue annually): With the Oracle Cloud Infrastructure (OCI) GenAI Service hitting general availability in January and a code assist tool only recently launched into preview, Oracle has been late to the GenAI game. But the company has highlighted several multibillion-dollar contracts for AI training on OCI, which speaks to its tight relationship with NVIDIA and ample supply of GPUs.

 

As an API-based service providing out-of-the-box access to LLMs for generic use cases, the OCI GenAI Service on its own does not differ much from what other hyperscalers offer. What does stand out is that Oracle also owns an entire SaaS suite: because all Fusion SaaS instances are hosted on OCI, where the GenAI service was built, Oracle can deliver GenAI capabilities to SaaS customers at no added cost.

 

This means Oracle’s GenAI monetization will come purely from the infrastructure side. GPU supply and the cost efficiency of OCI will help Oracle bring new workloads into the pipeline, and we expect a bigger impact on growth in 2025. For context, Oracle’s remaining performance obligations balance (though some of it reflects Cerner) is $98 billion.
 


Beyond revenue generation, cost savings are part of the value proposition for cloud vendors and customers alike

Many of the leading IT vendors’ GenAI strategies have centered on investing in solutions for customers. However, vendors have also been serving as customer zero for the technology by implementing it internally. The results of these early implementations look much like end-customer use cases, which focus on cost savings and efficiency as the easiest benefits to realize. While many IT vendors have seen operating expenses and headcount level off over the past couple of quarters, implying that AI has had some impact on company efficiency, IBM and SAP have both explicitly described AI’s impact on their operating models.

 

IBM was one of the earliest vocal proponents of the labor-saving benefits AI could bring to its business. In mid-2023 CEO Arvind Krishna announced a hiring freeze and shared an expectation that AI would replace 8,000 jobs. IBM remains focused on driving productivity gains, largely by lowering the internal cost of IT and rebalancing the global workforce, including using AI to automate back-office functions. These efforts have IBM on track to deliver a minimum of $3 billion in annual run-rate savings by the end of 2024.

 

Meanwhile, SAP’s decision to increase its planned FTE reallocation from a previous target of 8,000 to a new range of between 9,000 and 10,000 FTEs shows the company is committed to improving operating efficiency. While the bulk of the restructuring will consist of reallocating FTEs into lower-cost geographies and strategically important business units, taking a customer-zero approach with GenAI is also a component. SAP is leveraging business AI tools focused on areas like finance & accounting and human resources to reduce the labor intensity within the respective business units.

Just like end customers, vendors are investing significantly now in hopes of generating long-term GenAI returns

As seen in TBR’s Cloud Customer Research streams, customers have been investing in GenAI solutions with some haste, forgoing clear ROI measurements or typical budgeting procedures. Customers, as well as the major vendors we cover, have a sense of urgency around GenAI and share the belief that failing to embrace these new solutions now could place them at a long-term competitive disadvantage. If customers are not making full use of GenAI capabilities, their competitors will be more efficient and productive and will capture more growth opportunities. For vendors, the ability to not only deliver GenAI capabilities but also do so at scale will be a competitive necessity for decades to come.

 

In this regard, customers and vendors find themselves in a similar situation, investing in GenAI now for the possibility of a future advantage, but the scale of investment required is quite different. Customers have the good fortune of leveraging scalable, subscription-based services for many of these GenAI technologies; they are still stretching their IT budgets and paying more to incorporate GenAI, but they do not carry large fixed costs or long-term commitments at this point.

 

Vendors, on the other hand, need to make significant investments, beyond the already huge outlays supporting cloud services, to capitalize on the GenAI opportunity. The scale of investment cannot be overstated for the largest cloud platform providers, such as AWS, Microsoft, Google and Oracle, all of which were already investing tens of billions of dollars annually in data center and infrastructure build-outs.

 

The unique data center and infrastructure requirements of delivering GenAI solutions, including GPU-based systems, are driving double-digit to triple-digit percentage increases in capex spending for leading vendors. Not only is the level of spending notable, but the time horizons for the returns are also lengthy. In communicating those increased expenses to investors and Wall Street analysts, vendors like Microsoft have messaged that the returns from these investments will play out over the next 15 years, a time horizon seldom mentioned previously.