Nokia’s Fixed Networks Unit Poised for Long-term Growth Despite Market Challenges

TBR Perspective: Nokia’s Fixed Networks Business Unit

Nokia is the largest vendor of fixed network access infrastructure by revenue in the Western economic bloc, a position of strength that gives the vendor access to a broad range of market opportunities. While Nokia remains focused on its fiber-based platform, the vendor is also supporting fixed-wireless access (FWA), which is a rapidly growing service offering in the telecom industry.
Though revenue in Nokia’s Fixed Networks business unit has been uneven over the past few years (primarily due to the disruptions caused by the COVID-19 pandemic), the unit is poised to be one of the biggest beneficiaries of government-supported broadband programs and ongoing internet service provider (ISP) investment in high-speed broadband access technologies, driving a positive revenue trend over at least the next three to five years.
 
Nokia is focused on expanding access to broadband (through fiber and/or FWA) and introducing a future-proof platform for ISPs to build upon. The company is trying to be everything to everyone in this domain by providing a near-complete portfolio (only DOCSIS is missing).
 
Despite Nokia’s favorable market position and government-induced tailwinds for the broadband infrastructure domain, TBR notes that the supply-and-demand dynamics as well as the timing of investments are prone to be disjointed, lengthening the time required to meet infrastructure deployment objectives compared to what was originally expected by the government and the telecom industry.
 
Additionally, TBR remains steadfast in its belief that building fiber out to every household is not economically feasible (despite what the government and stakeholders in the market say they want) and that alternative broadband access technologies (such as FWA and satellite) will account for a growing share of the global mix to connect the unconnected and underserved populations of the world.

Impact and Opportunities for Nokia

BEAD Program Will Likely Stretch to the Mid-2030s due to Challenges and Delays

Broadband Equity, Access, and Deployment (BEAD) Program-supported projects are now slated to begin deployments in 2025, more than a year later than originally planned. There is a long list of reasons (most of which are related to mapping integrity and political processes) why the program has been delayed thus far, and there is a growing list of reasons that suggest it will take longer for the program to fully ramp up and complete its objective (i.e., spend all of the $42.5 billion allocated to the program).
Among the biggest challenges that lie ahead for the BEAD Program are shortages of skilled labor (e.g., fiber splicers and trenching machine operators) and of industrial equipment, such as boring machines, that will be required to deploy fiber to an estimated 5.5 million households across the U.S. Shortages of products that meet the Build America Buy America (BABA) requirements associated with the BEAD Program could also cause a timing and supply issue.
 
Taken together, TBR now believes the deployments tied to the BEAD Program will begin next year and it could take as long as the mid-2030s for all the program’s funding to be disbursed, more than five years longer than the government and market ecosystem originally anticipated. Nokia is doing as much as it can to mitigate these potential challenges in the market.
 
For example, Nokia is proactively educating stakeholders in the ecosystem and working with its partners to better match supply with demand for products and resources. This orchestration of the ecosystem will help align stakeholders and enable the industry to put its best foot forward in carrying out this infrastructure build-out program as well as position Nokia to maintain and grow its leading share in the broadband infrastructure market.

Do Not Forget About Non-BEAD Government Programs for Broadband

Though the telecom industry likes to focus on the BEAD Program (likely because it is the largest program by dollar amount in the broadband ecosystem in the U.S. market), there are a variety of other government-supported programs that also deal with broadband, including the American Rescue Plan Act (ARPA), the Rural Digital Opportunity Fund (RDOF), the U.S. Department of the Treasury’s Capital Projects Fund, the Tribal Broadband Connectivity Program, and the U.S. Department of Agriculture’s ReConnect Loan and Grant Program.
 
In aggregate, TBR estimates there is more than $80 billion in direct and indirect government stimulus allocated for broadband-related projects in the U.S. market alone, all of which is slated to be spent by the mid-2030s. There are also a few hundred billion dollars in aggregate in similar broadband-implicated programs in other regions, most notably in China, the European Union, the U.K. and Australia.

Fiber Access Technology Capabilities Exceed Usability, Creating a Conundrum for Vendors

Technological innovations pertaining to fiber access have become so advanced and the bandwidth available through fiber access so massive that the capabilities of the technology far exceed what most end customers could possibly need or use. This disconnect creates a conundrum for vendors such as Nokia that supply the broadband infrastructure market.
 
Though fiber broadband infrastructure is, and will remain, in high demand, most ISPs will be loath to adopt the most cutting-edge technologies because they far exceed what customers would need and put unnecessary additional cost burden on the operator.
 
There are exceptions, such as what Google Fiber and Frontier Communications are deploying (specifically 50G and 100G connections, respectively), but TBR believes most ISPs will focus on 10G or lower connections, which provide more than enough bandwidth for the vast majority of households and businesses and are likely to be future-proof for many years to come.

Overbuilding and One-upmanship Risks New Price War for High-speed Internet Service

The government funding boost, coupled with technological advancements and new entrants into the ISP domain, is creating a situation that is ripe for a price war for broadband services. Specifically, many more markets across the U.S. are likely to have three or more (in some cases up to seven) providers of high-speed broadband service in a given area, including xDSL, FTTx, HFC (via DOCSIS) as well as FWA and satellite (mostly delivered via low Earth orbit [LEO] satellites).
 
Given that a provider typically needs to have more than 30% market share in a given area to achieve profitability in the broadband services market, an increasing number of options puts more power into the hands of end users, which historically suggests the pricing environment will be extremely competitive.
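The competitive math here can be sketched under a simplifying even-split assumption (ours, not TBR’s): once four or more providers divide a market evenly, no one clears the roughly 30% share threshold cited for profitability.

```python
# Illustrative only: assumes broadband market share splits evenly among
# providers, which is our simplifying assumption, not TBR's model.
PROFITABILITY_THRESHOLD = 0.30  # ~30% share cited as needed for profitability

def average_share(num_providers: int) -> float:
    """Average market share per provider under an even split."""
    return 1.0 / num_providers

for n in range(3, 8):
    share = average_share(n)
    status = "viable" if share > PROFITABILITY_THRESHOLD else "below threshold"
    print(f"{n} providers: {share:.0%} each -> {status}")
```

Under this toy model, only a three-provider market leaves the average player above the threshold, which is consistent with the expectation of intense price competition as provider counts rise.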
 
In response to the hotter competitive environment, providers that are multiservice-oriented are trying to attract and lock in market share by offering converged (aka bundled) solutions, usually giving end users a discount as an incentive to sign up and stay.
 
Additionally, TBR notes that ISPs are increasingly engaging in one-upmanship (which is also a symptom of the existence of too many options in a given market), meaning ISPs are marketing ever higher broadband speeds to customers to position their offerings as better than the competition while attempting to incrementally increase average revenue per user.
 
Though this strategy has been effective in years past, it is likely to lose efficacy after speeds surpass the level at which the benefits of faster speeds become imperceptible to end users. Therefore, in aggregate, TBR expects the pricing environment in the U.S. for broadband service to be increasingly competitive through at least the remainder of this decade.

Private Equity Comes into the Fixed Broadband Market

Private equity firms are entering the telecom infrastructure market in a big way, gobbling up assets and forging joint ventures with telcos that want to (or need to) raise capital and hedge their risks. Some private equity-sponsored entities are also now building out their own greenfield fiber-based networks (such as Brookfield Infrastructure Partners’ Intrepid Fiber Networks) and are even moving the market toward wholesale, shared and other forms of open-access models.
 
Though the entry of private equity into the broadband infrastructure domain is bringing large pools of fresh capital into the market, this trend also risks fueling overinvestment, price compression and disruption of incumbent ISPs’ business models. Regardless, expect private equity to remain attracted to assets that offer consistent cash flow over a long duration; its participation in the telecom ecosystem is likely a net positive for overall market development and evolution.

Existing Government Stimulus May Still Not be Enough for FTTP; Alternatives Will Likely be Called on at Scale to Fill in the Gaps

Though governments (and most of the stakeholders in the telecom ecosystem) across the world want full fiber to every premises, this is still not economically feasible. For example, it is not uncommon for some locations in the U.S. to cost upward of $1 million per premises to connect with fiber, a price that will be politically difficult to justify and that is not supported by normal market conditions. In these extreme situations, it is highly likely that governments will allow and embrace alternatives, such as FWA and satellite-based connectivity.
 
TBR notes that FWA and LEO constellations can easily deliver sustained speeds in excess of 100Mbps at a fraction of what it would cost to deploy fiber to the premises (FTTP). With that said, of the estimated 5.5 million households that the government has identified as needing a broadband connection in the U.S., TBR would not be surprised if up to 25% of those households are ultimately connected via FWA or satellite (enhancements to DOCSIS and xDSL are also potential options to close the underserved gap). In other countries, that percentage could be even higher.
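The household math implied by this scenario is straightforward; a quick back-of-envelope sketch using the figures above:

```python
# Back-of-envelope check on TBR's scenario (figures from the text above).
bead_households = 5_500_000    # U.S. households identified as needing broadband
fwa_satellite_share = 0.25     # up to 25% served by FWA or satellite

fwa_satellite_households = bead_households * fwa_satellite_share
fiber_households = bead_households - fwa_satellite_households

print(f"Up to {fwa_satellite_households:,.0f} households via FWA/satellite")
print(f"Leaving about {fiber_households:,.0f} households for fiber builds")
```

That is nearly 1.4 million U.S. households potentially served by non-fiber access, a material share of the total deployment objective.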

New Business Models Hold Promise to Connect Low-income Households in Emerging Markets

Upstart ISPs, such as fibertime in South Africa, which sells access using its vulacoin pay-as-you-go currency, have established innovative solutions to cost-effectively provide high-speed broadband services to low-income areas. The network architecture emphasizes FWA and Wi-Fi with a relatively small amount of fiber, and the business model focuses on selling units of time (in minutes), which is more affordable for lower-income end users.
 
TBR notes this model requires scale and high utilization to achieve profitability, meaning it is best suited for dense areas, especially impoverished neighborhoods. TBR also notes that access to high-speed internet is a key avenue through which areas can strengthen their local economies and reduce levels of poverty.
 
In addition to South Africa, Brazil is also exploring the use of this model. This approach is also likely to be leveraged in other parts of Africa as well as in parts of India and Southeast Asia.

Conclusion

Government and private equity involvement in the broadband market may prove to be a mixed blessing. Though there are concerning indicators that some key markets (especially the U.S.) have too many broadband providers and that broadband access businesses are becoming overvalued, these market dynamics actually represent tailwinds for Nokia, which is best positioned to garner a disproportionate share of the investment slated for broadband infrastructure in the Western economic bloc (North America, Europe, developed APAC and select developing markets such as India).
 
Nokia’s outsized and unique position in the broadband infrastructure ecosystem enables the company to play a key role in orchestrating partners and customers to achieve their objectives in the most optimal way possible. Fiber will remain the coveted access medium for high-speed broadband, but the world will also employ other broadband access mediums to a large extent.
 
New ISP and hyperscaler business models, coupled with sustained investments by incumbent ISPs and supported by government stimulus, create an environment ripe for moving the world closer to full broadband coverage for all people.

Atos Powers 2024 Paris Olympics and Paralympics with Cutting-edge IT and AI Solutions

Atos, the worldwide IT partner for the Summer and Winter Olympic and Paralympic Games, invited a group of industry analysts to the 2024 Paris Olympics. The goal of the event was to show Atos in action during the Games with a tour of the Technology Operations Center in Paris, which is one of the three locations responsible for delivering IT services and keeping the Games running. The analysts also attended a swimming competition event at Paris La Defense Arena to see firsthand the secure, digital experience Atos and its partners provide in running the IT systems behind the Games.

The Olympics Must Run Flawlessly; There Are No Second Chances

Atos utilized its well-established expertise in the sports and entertainment industry to provide IT services for the 2024 Paris Olympics and Paralympics and enable a secure and digital experience for end users, reaching an audience of approximately 4 billion viewers globally. Atos has been providing services for the Olympic Movement since 1989. Atos established its relationship with the International Olympic Committee (IOC) as a Worldwide IT Partner in 2001 and provided IT services for the 2002 Winter Olympics in Salt Lake City, its first Games under that partnership. Providing uninterrupted running of the IT systems behind the Olympics every two years requires dedication and strict execution of processes and timelines.

 

According to Angels Martin, general manager, Olympics, at Atos, “Olympics challenges are similar to other projects; the difference is visibility [of the Games]. No one will postpone the opening ceremony because Atos is not ready.” Martin also explained that cybersecurity management is a vital activity for Atos, as the Games are among the most targeted events for cyberattacks, which could threaten the smooth functioning of the Olympics. She also stated that the Games are complex to manage, with multiple parties, such as the IOC, sports federations, broadcasters and journalists, requiring services and access to information 24/7 from anywhere on any device. Martin also noted that demand for information has changed significantly since Atos’ first engagement decades ago, and today Atos is applying AI-driven solutions to enable processes for the Games. For example, Atos used AI solutions for the 2024 Paris Olympics to support the Organising Committees for the Olympic Games in providing scenarios for matching volunteers with job positions based on skills and abilities. For the 2020 Tokyo Olympics, Atos provided an AI solution for facial recognition for venue access using accreditation.

Atos Integrates Critical IT Systems and Manages Partners to Run the Games

Atos is responsible for integrating critical IT systems, managing programs with IT vendors that deliver services for the Organising Committees for the Olympic Games, supporting critical applications for the Games and providing security services to enable smooth and uninterrupted running of the Games. For example, for the 2024 Paris Olympics and Paralympics Atos operated the Olympic Management System, which included a volunteer portal, a workforce management system, athlete voting applications, sport entries and qualifications, competition schedule and accreditation. Atos was responsible for the Olympic Diffusion System, which contained Olympic data feed, web results, mobile apps for results, a Commentator Information System, an information system for journalists called MyInfo, and a print distribution system. Atos was also responsible for cloud orchestration between private cloud, public cloud services and data centers at venues.

 

Additionally, Atos applied its expertise around working with a diverse group of technology partners to help run the Games and provided systems integration of applications with other IT providers and partners. Atos integrated partners, such as technology providers, media, the IOC, Organising Committees for the Olympic Games, and security providers, to ensure efficient delivery, operations, timelines and venue management activities. Atos also helped coordinate responses on daily activities and addressed critical events when they occurred. For example, Atos worked with Omega, the timing and scoring sponsor of the 2024 Paris Olympics, to relay results and data to spectators globally in real time. Omega captured raw data around timing and scoring, fed the results into scoreboards and videoboards at venues jointly with Panasonic, and provided data to Atos to feed into the Commentator Information System.

Atos’ Olympics and Paralympics Achievements

Achievements from the 2020 Tokyo Olympics and the 2024 Paris Olympics show the magnitude of work Atos provides. Atos must manage approximately 900 events to transmit results instantly from competition and noncompetition venues. The company utilized the volunteer portal to process 200,000 volunteer applications prior to the 2020 Tokyo Olympics, and the number of volunteer applications swelled to 300,000 for the 2024 Paris Olympics. According to Atos, one of the most complex activities around managing people for the Olympic and Paralympic Games is assigning volunteers to the large number of necessary positions. For the 2024 Paris Olympics and Paralympics, Atos modernized the volunteers’ assignment process by implementing an optimized pre-assignment scenario model and an AI-based solution that utilized constraint logic programming to improve position matchups. At the 2020 Tokyo Olympics Atos issued 535,000 accreditations through the system and established 350 accreditation checkpoints with facial recognition in all competition and noncompetition venues. Additionally, cloud usage at the 2020 Tokyo Olympics enabled Atos to reduce the number of physical servers by 50% and improve sustainability.
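The constraint-based matching Atos describes can be illustrated in miniature. The sketch below is a toy backtracking assignment; the volunteer names, skills and positions are invented, and Atos’ production system (constraint logic programming at Olympic scale) is far more sophisticated, but it shows the core idea of matching volunteers to positions subject to skill constraints:

```python
# Toy constraint-based volunteer-to-position matching. All names and
# requirements are invented for illustration only.
volunteers = {
    "ana":   {"languages", "first_aid"},
    "bruno": {"driving"},
    "chen":  {"languages", "it_support"},
}
positions = {
    "info_desk":    {"languages"},
    "shuttle":      {"driving"},
    "results_room": {"it_support"},
}

def assign(pending, free):
    """Backtracking search: give each pending position a free, qualified volunteer."""
    if not pending:
        return {}
    pos, *rest = pending
    for name in free:
        if positions[pos] <= volunteers[name]:  # volunteer's skills cover requirements
            tail = assign(rest, free - {name})
            if tail is not None:
                return {pos: name, **tail}
    return None  # no feasible assignment from this state

plan = assign(list(positions), set(volunteers))
print(plan)
```

Real constraint logic programming systems add availability windows, venue locations, preferences and optimization objectives on top of this basic feasibility search.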

Every Two Years Atos Organizes Upcoming Games

Typically, pre-project activities for each Olympic Games begin six years prior to the event. For example, pre-project activities for the 2024 Paris Olympics and Paralympics began in 2018, and planning began in 2020 with the development of a master plan and strategy and related responsibilities matrix. In November 2020 Atos appointed the first core team for the 2024 Paris Olympics and Paralympics. In 2021 Atos began designing business requirements and systems infrastructure and established a test lab, and in 2022 the company initiated the building of systems and expanded the testing facility. In June 2023 Atos launched testing activities such as integration tests, acceptance tests, systems tests, events tests and multisport tests to prepare for operating the Games in 2024. During the first several months of 2024, Atos worked on venue deployment, disaster recovery and technical rehearsals.

 

For example, between May 13 and May 17 Atos completed the final technology rehearsal for the 2024 Paris Olympics and Paralympics. The rehearsals, which took place across different locations in Paris and other sites of the Olympic and Paralympic Games, were designed to test IT policies and procedures and how well IT teams can collaborate and handle real-time situations that may impact the Games. Atos is the IT integration leader and coordinates with the Organising Committee for the Olympic Games and with experts and technology partners. The technology rehearsals were conducted in 39 venues, including Atos’ Central Technology Operations Center in Barcelona, Spain, and venues specific to the Games, such as Atos’ Technology Operations Center in Paris, the Main Press Center, The Stade de France and competition venues.

 

The Olympic Games resemble a large-scale international corporation mobilizing approximately 300,000 people for the duration of the Games. Atos provides IT services with teams located in the host city and in Atos’ facilities in Poland, Morocco and Spain, and delivers competition results to more than 4 billion viewers globally. While every two years Atos must set up a new organization for each Summer and Winter Games, the company has a well-established process and deep experience with starting over. Every two years Atos establishes a Technology Operations Center (TOC) in the host city of the Summer and Winter Games. The TOC is the technology command and control center that houses teams from Atos, the IOC, the Organising Committees for the Olympic Games and other technology partners. The TOC consists of approximately 300 people who are coordinated by Atos and available 24/7 while the Olympics and Paralympics are running. Atos also has a Central Technology Operations Center (CTOC) in Barcelona, which is organized in a similar manner as the TOC in the host city. The CTOC delivers remote support during competitions and critical events, such as the volunteer campaigns, and orchestrates applications for the Games, and consists of approximately 80 people who provide services around operations, architecture, security, infrastructure and data management. Atos also has an Integration Testing Lab in Madrid that manages system testing for the Games.

 

Atos Adds New Clients in the Sports and Entertainment Industry

Atos’ engagement with the IOC ends with the 2024 Paris Olympics and Paralympics. However, Atos has been expanding its client roster in the sports and entertainment industry, applying its vast experience gained from the Olympics. In December 2022 Atos signed an eight-year deal with the Union of European Football Associations (UEFA) to be the official technology partner for men’s national team competitions. Atos is assisting UEFA in managing, improving and optimizing its technology landscape and operations. Atos is also managing and securing the hybrid cloud environment and infrastructure that hosts UEFA’s services, applications and data. In July Atos announced that it had successfully delivered key IT services and applications supporting the UEFA EURO 2024 from June 14 to July 14. Atos supported UEFA systems such as accreditation, access control solutions and competition solutions. Atos managed core IT systems through its football service platform and stored and distributed UEFA football data to stakeholders. Atos is the official IT partner of UEFA National Team Football until 2030.

 

Conclusion

Atos has a well-established position and history of operating in the sports and entertainment industry. Expanding its client roster with organizations such as UEFA will help the company maintain its reputation as a reliable IT services provider and innovation partner for major events. Enabling the running of complex events such as the Summer and Winter Olympic Games and the UEFA EURO 2024 championship provides global visibility of Atos’ capabilities and brand and enables the company to augment its client base in the industry.

Investing Big in GenAI Today: The Key to Unlocking Massive Long-term Returns

GenAI requires massive investment now for a chance at massive long-term returns

For most new technologies and trends in the IT space, actual business momentum and revenue generation typically take years to develop. In fact, in many cases, particularly with new technologies available to consumers, monetization may never develop, as the expectation of free trials or advertising-led revenue streams never leads to sustainable business models.

 

The history around monetizing new technologies is what makes the rise of generative AI (GenAI) over the past 18 months so notable. In such a short period of time, we have tangible evidence from some of the largest IT vendors that billions of dollars in revenue have already been generated in the space, with the expectation that even more opportunity will develop in the coming years.

 

AI and GenAI revenue streams have not come without investment, however, as the infrastructure required to enable the new technology has been significant. The three major hyperscale cloud providers have borne the brunt of this required investment, outlaying billions of dollars to build out data centers, upgrade networking and install high-performance GPU-based servers. Amazon Web Services (AWS), Microsoft, Google and other cloud platform providers were already spending tens of billions annually to maintain and expand their cloud service offerings, and GenAI adds significantly to that investment burden.

 

The early revenue growth resulting from GenAI offerings has been promising, but put in the context of the increased investment required, it becomes clear that the business impacts of the technology will play out over an extended time period. Most public companies execute quarterly, plan annually and, as a stretch, project their expectations out over three to five years.
 
The impact of GenAI extends even further, as Microsoft CFO Amy Hood stated on the company’s fiscal 4Q24 earnings call: “Cloud and AI-related spend represents nearly all of our total capital expenditures. Within that, roughly half is for infrastructure needs where we continue to build and lease data centers that will support monetization over the next 15 years and beyond.” That means not only that Microsoft spent $19 billion on capital expenditures during a single quarter to support cloud and AI but also that the time horizon for the returns on that investment stretches beyond a decade.

 

Microsoft is, in this way, representative of its cloud platform peers, investing huge sums of capital expenditures now to realize modest new streams of revenue in the short term and anticipating significant revenue opportunity over the next 20 years.

AI & GenAI versus Capital Expenditures (Amazon Web Services, Microsoft, Google and Oracle)

AI-related revenue is already considerable, with growth expected to persist

TBR estimates the four leading cloud platform vendors generated more than $12 billion in revenue from AI and GenAI services in 2023, which is in and of itself a sizable market. On top of that, we expect revenue from those four vendors to increase by 71% during 2024.
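Taken at face value, these estimates imply a 2024 market size of roughly $20 billion across the four vendors; a quick back-of-envelope check (figures from the text, the compounding is ours):

```python
# Implied 2024 market size from TBR's estimates in the text above.
revenue_2023 = 12e9    # >$12B across the four leading cloud platform vendors in 2023
growth_2024 = 0.71     # expected 71% year-to-year increase during 2024

revenue_2024 = revenue_2023 * (1 + growth_2024)
print(f"Implied 2024 AI/GenAI revenue: ${revenue_2024 / 1e9:.1f}B+")
```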

 

A market of that scale and growth trajectory is notable in an IT environment where much more modest growth is the norm. While we expect growth to gradually slow and normalize over the coming years, the AI and GenAI markets remain attractive nonetheless. Below are examples from some of the largest monetizers of GenAI so far, with estimates of the current size of their respective businesses and insights into how they are monetizing.

 

Microsoft (estimated $10 billion in GenAI revenue annually): While Microsoft did not quite meet Wall Street’s lofty expectations for AI-related revenue growth, the company posted a solid quarter in 2Q24. In TBR’s opinion, Microsoft’s GenAI strategy is on the right track, and its financial results align closely with our expectations. In 2Q24 Azure AI services contributed 8 percentage points of Azure’s 29% year-to-year growth, while Copilot was cited as a growth driver for Office 365.
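Microsoft’s disclosure can be read as a growth decomposition: a rough calculation (our arithmetic, using the figures above) suggests Azure AI accounted for a bit over a quarter of Azure’s growth in the quarter.

```python
# Rough decomposition of Azure's 2Q24 growth (our arithmetic, figures from the text).
azure_growth = 0.29      # Azure year-to-year growth in 2Q24
ai_contribution = 0.08   # percentage points of that growth from Azure AI services

ai_share_of_growth = ai_contribution / azure_growth
print(f"Azure AI drove roughly {ai_share_of_growth:.0%} of Azure's growth")
```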

 

Nevertheless, with Office 365 revenue growth decelerating compared to past quarters, it is clear the monetization of GenAI will take time to materialize. Still, given Microsoft’s current capex spend and capex forecast, the company is committed to its AI strategy. Management stated nearly all $19 billion of capital expenditures this quarter was focused on the cloud business, with roughly half going toward data center construction and the other half used to procure infrastructure components like GPUs.

 

This hefty commitment indicates that GenAI will remain at the forefront of Microsoft’s product development, go-to-market and partner strategies for years to come as the company looks to turn an early lead into an established position atop the AI and GenAI market.

 

AWS (estimated $2.5 billion in GenAI revenue annually): During AWS’ New York City Summit event in July, Matt Wood, the company’s VP of AI Products, noted that GenAI had already become a multibillion-dollar business for the company. Amazon CEO Andy Jassy has also spoken confidently about the future of AI, publicly proclaiming the company’s belief that GenAI would grow to generate tens of billions in revenue in the coming years.

 

The fact that AWS has been playing in AI infrastructure, with custom chip lines for both training and inference, since well before the GenAI hype cycle is notable. Existing customers are unlikely to undertake the daunting task of migrating off industry-standard hardware, but these custom offerings can be a more cost-effective option for net-new workloads, which is one reason they hold significant potential for GenAI.

 

AWS’ custom offerings, coupled with tools that customers use to build and fine-tune models, such as Bedrock and SageMaker, will continue to spin the EC2 meter. AWS does have other GenAI monetization plans with a two-tiered pricing model for Amazon Q Business and Q Developer. However, it is still early days for these offerings, and Microsoft Copilot entering the mix, at least from the line-of-business (LOB) perspective, clearly indicates AWS faces an uphill battle.

 

Google Cloud (estimated $2 billion in GenAI revenue annually): Unlike some of its peers in the industry, Alphabet has not clearly quantified the impact that GenAI is having on Google Cloud’s top line. However, on Alphabet’s recent earnings call, executives said that GenAI solutions have generated billions of dollars year to date and are used by “the majority” of Google Cloud’s top 100 customers.

 

These results, coupled with a 40-basis-point acceleration in Google Cloud’s 2Q24 revenue growth rate, to 28.8%, signal that while GenAI is having an impact on Google Cloud Platform (GCP) revenue growth, it is very early days. Steps Google Cloud is taking to boost developer mindshare — with over 2 million developers using its GenAI solutions — and align with global systems integrator (GSI) partners to unlock new use cases, leave us confident Google Cloud can more aggressively vie for GenAI spend through 2025.

 

ServiceNow (less than $100 million in GenAI revenue annually): With Now Assist net-new annual contract value (NNACV) doubling from last quarter, ServiceNow’s steady momentum selling GenAI to the enterprise continues. Now Assist was included in 11 deals over $1 million in annual contract value (ACV) in 2Q24, showing positive early signs that the strategy of packaging premium digital workflow products based on domain-specific large language models (LLMs) is resonating.

 

At 45%, ServiceNow’s Pro SKU penetration rate, which represents the percentage of customer accounts on Pro or Enterprise editions of IT Service Management (ITSM), HR Service Delivery (HRSD) and CSM products, is already very strong. Upgrading these already premium customers to Pro Plus SKUs with GenAI, for which ServiceNow has already realized a 30% price uplift, could signify an opportunity for ServiceNow valued at well over $1 billion. Naturally, a big focus is expanding the availability of Pro Plus outside the core workflow products.

 

IBM (less than $2 billion in GenAI revenue annually): Approximately 75% of IBM’s reported $2 billion in GenAI book of business to date stems from services signings, and IBM lands nearly all watsonx deals through Consulting. Companies need help getting started with GenAI in the cloud, and IBM’s ability to lead with Consulting and go to market as both a technology and consulting organization will continue to prove unique in the GenAI wave.

 

On the software side, overcoming challenges with the Watson brand and deciding how much it wants to compete with peers have been obstacles, but IBM is now strategically pivoting around the middleware layer, hoping to act as a GenAI orchestrator that helps customers build and run AI models in a hybrid fashion. This pivot has resulted in a series of close-to-the-box investments, including Red Hat’s InstructLab project, which allows customers to fine-tune and customize Granite models, and IBM Concert for application management.

 

According to IBM, these types of GenAI assets have contributed roughly $0.5 billion to IBM’s AI book of business. By adopting a strategy to embed its AI infrastructure software into the cloud ecosystem of GenAI tools and copilots already widely accepted by customers, IBM ensures it stays relevant with these cutting-edge workloads.

 

Oracle (less than $100 million in GenAI revenue annually): With the Oracle Cloud Infrastructure (OCI) GenAI Service hitting general availability in January and a code assist tool only recently launched into preview, Oracle has been late to the GenAI game. But the company has highlighted several multibillion-dollar contracts for AI training on OCI, which speaks to its tight relationship with NVIDIA and ample supply of GPUs.

 

As an API-based service providing out-of-the-box access to LLMs for generic use cases, the OCI GenAI Service on its own does not necessarily differ from what other hyperscalers are doing. What does stand out is that Oracle offers the entire SaaS suite. Given that all Fusion SaaS instances are hosted on OCI, where the GenAI service was built, Oracle can deliver GenAI capabilities to SaaS customers at no added cost.

 

This means Oracle’s GenAI monetization will come purely from an infrastructure perspective. GPU supply and the cost efficacy of OCI will help Oracle bring new workloads into the pipeline, and we will see a bigger impact on growth in 2025. For context, Oracle’s remaining performance obligations balance (though a portion of it relates to Cerner) is $98 billion.
 


Beyond revenue generation, cost savings is part of the value proposition for cloud vendors and customers alike

Many of the leading IT vendors’ GenAI strategies have centered on investing in solutions for customers. However, vendors have also been serving as customer zero for the technology by implementing it internally. The results from their early implementations seem very much like end-customer use cases, which focus on cost savings and efficiency as the easiest benefits to realize. While many IT vendors have seen operating expenses and headcount level off over the past couple of quarters, implying that AI has had some impact on company efficiency, IBM and SAP have both explicitly stated AI’s impact on their operating models.

 

IBM was one of the earliest vocal proponents of the labor-saving benefits AI could bring to its business. In mid-2023 CEO Arvind Krishna announced a hiring freeze and shared an expectation that AI would replace 8,000 jobs. IBM remains focused on driving productivity gains, largely by lowering the internal cost of IT and rebalancing the global workforce, including using AI to automate back-office functions. Such efforts have IBM on track to deliver a minimum of $3 billion in annual run-rate savings by the end of 2024.

 

Meanwhile, SAP’s decision to increase its planned FTE reallocation from a previous target of 8,000 to a new range of between 9,000 and 10,000 FTEs shows the company is committed to improving operating efficiency. While the bulk of the restructuring will consist of reallocating FTEs into lower-cost geographies and strategically important business units, taking a customer-zero approach with GenAI is also a component. SAP is leveraging business AI tools focused on areas like finance & accounting and human resources to reduce the labor intensity within the respective business units.

Just like end customers, vendors are investing significantly now in hopes of generating long-term GenAI returns

As seen in TBR’s Cloud Customer Research streams, customers have been investing in GenAI solutions with some haste, forgoing clear ROI measurements or typical budgeting procedures. Customers, as well as the major vendors we cover, have a sense of urgency around GenAI and share the feeling that if they do not embrace these new solutions now, it could place them at a long-term competitive disadvantage. If customers are not making full use of GenAI capabilities, their competitors will be more efficient and productive and capture more growth opportunities. For vendors, the ability to not only deliver GenAI capabilities but also do so at scale will be a competitive necessity for decades to come.

 

In this regard, customers and vendors find themselves in a similar situation, investing in GenAI now for the possibility of a future advantage, but the scale of investment required is quite different. Customers have the good fortune of leveraging scalable, subscription-based services for many of these GenAI technologies. Customers are still extending their IT budgets and paying more to incorporate GenAI, but they do not have large fixed costs and long-term commitments at this point.

 

Vendors, on the other hand, need to make significant investments, even beyond the already huge levels of investment to support cloud services, to capitalize on the GenAI opportunity. The scale of investment cannot be overstated for the largest cloud platform providers like AWS, Microsoft, Google and Oracle. All of these vendors were already investing tens of billions of dollars annually to support data center and infrastructure build-outs.

 

The unique data center and infrastructure requirements for delivering GenAI solutions, including GPU-based systems, are driving double-digit to triple-digit increases in capex spending for leading vendors. Not only is the level of spending notable, but the time frames for returns are also lengthy. In communicating these increased expenses to investors and Wall Street analysts, vendors like Microsoft have described the returns on these investments as playing out over the next 15 years, a time horizon seldom mentioned previously.

Hybrid, Proximity and Ecosystems Are Elevating the Importance of Colocation

A Multitude of Secular Trends Are Reinforcing Colocation’s Value Proposition

Market trends over the past few years have made several things clear about the IT strategy of most enterprise customers, all of which reinforce the value proposition offered by colocation providers:

  • Hybrid environments will persist — Whether due to existing legacy investments, divisional or regional nuances, or acquisition and divestiture activity, heterogeneity will remain in most IT environments. At one point, the benefits of public cloud made organizations consider a homogeneous, fully cloud-based IT delivery strategy, but those visions have faded for most. The challenge — and goal — is to embrace the hybrid heterogeneous approach and find the best way to integrate, manage and optimize services across these diverse sets of delivery methods and assets. Colocation data centers play a critical role for customers, offering a hybrid approach to facilities and in the interconnection of cloud and on-premises services.
  • Location and proximity matter — The importance of delivery locations is driven by not only the hybrid nature of the environment but also the latency requirements of many workloads. Edge workloads are the clearest example, but there are other cases where latency is critical or where regulations determine the location of data.
  • Investment costs and opportunity costs are important — While organizations are still looking to control and minimize IT costs where possible, there has been a shift toward selective investment. This started when IT became one of the few levers companies could control as they shifted their business models to adapt to changes wrought by the COVID-19 pandemic. Most recently, the onset of generative AI (GenAI) convinced organizations that IT could be a competitive advantage as well, prompting investment in new solutions and technologies to keep pace with the market and key competitors. In this way, organizations are still closely controlling investment costs in new solutions but also can be swayed to spend due to the fear of lost opportunities. Colocation provides an emerging value proposition with GenAI and AI workloads, offering prebuilt facilities and interconnection services without requiring large retrofits or new capital expenditures.
  • Ecosystems equal innovation — Though hyperscalers have become the center of the IT ecosystem over the past decade, the network of infrastructure providers, ISVs, systems integrators (SIs), colocation providers, consultancies and managed services providers remains intact. With the hybrid approach that most customers are embracing, combined with the digital transformations being deployed and then amplified by the onset of new AI and GenAI technology, numerous vendors are part of most enterprise IT solutions. The orchestration of those multiple vendors is critical and most often handled by a trusted SI partner.

Colocation Is a Relied-upon Option for the Vast Majority of Enterprises

According to TBR’s 2Q24 Infrastructure Strategy Customer Research, a significant portion of enterprises report colocation as some part of their overall IT delivery strategy. Most have less than 25% of their workloads deployed in colocation facilities, which is a reflection of the two predominant delivery strategies: their own centralized data centers and cloud-based environments. Colocation is even more of a consideration for new workloads, however, as 72% of surveyed respondents expect to deploy between 25% and 50% of net-new workloads using colocation providers.
 
We believe this trend is due to two factors. First, enterprises are reluctant to build their own data center facilities for workloads that perform best outside the cloud or that have location and latency requirements. Most organizations want to reduce their data center capacity at this point, not add to it. Second, many new workloads have data center requirements that are more challenging to meet. With the need for increased density, greater power and unique GPU-based AI services, a modern data center is required. The challenges of technology, facilities and staffing highlight the value of a ready-to-use colocation facility.

Digital Realty and Equinix Stand Out in a Tightly Packed Colocation Market

Recent trends around hybrid deployment models, latency-sensitive workloads, data residency and AI-reliant solutions have highlighted the sometimes-overlooked benefits of colocation providers. Especially for large enterprise customers, the scale of colocation facilities, strength of alliances and ability to invest in supporting new technologies make a difference in the value of their services. TBR research shows Digital Realty and Equinix are head and shoulders above their peers in terms of the ability to meet enterprise requirements. From a purely data center location perspective, Digital Realty is the market leader worldwide, with 309 data centers, including those from unconsolidated joint ventures, as of 1Q24.

 

Current revenue is only one component of colocation spending decisions; enterprises also want to know their solution providers will be able to scale as demand grows. Especially after the supply constraints of the last couple of years and the ongoing shortage of key components for next-gen workloads like GenAI, customers are not always confident in their ability to access resources on a timely basis. While the supply of colocation capacity remains tight, investing now to guarantee expanded capacity is another differentiator. Here again there are advantages to scale, as Digital Realty outpaced all covered vendors in capital expenditures in 2023. This commitment to ongoing investment signals to customers that they can continue to grow with Digital Realty moving forward.

Digital Realty Is Well Positioned to Address Hyperscaler Demand, Both Financially and in its Go-to-market Approach

Though adamant about vying for mindshare among both enterprises and hyperscalers, Digital Realty has always been better known for its play in wholesale colocation. Over the past several quarters, Digital Realty has employed an aggressive joint venture strategy, allying with private equity firms to build multibillion-dollar hyperscale data centers in both Tier 1 and Tier 2 markets. As such, much of Digital Realty’s financial makeup and recent performance have stemmed from this customer base, with over 80% of annualized base rent from new leases in the >1MW category (effective 1Q24). The retail colocation market will undoubtedly continue to grow, led by robust demand for hybrid deployments and cloud adjacency for reasons highlighted earlier in this piece.
 
But many sources continue to suggest a rampant demand surge in the wholesale market as hyperscalers rush to satisfy their own customers’ AI and GenAI deployments. There are several ways Digital Realty is addressing this demand. Some are financial, including the ability to recycle capital by selling off nonscalable, single-tenant facilities to reinvest in strategic markets and maintaining a conservative capital structure; for context, Digital Realty owns nearly all of its facilities, in stark contrast to competitor Equinix, which is still leasing roughly 40% of its data centers. But the other aspect is Digital Realty’s go-to-market approach and how the vendor is nurturing relationships with the hyperscalers and their own partner ecosystems.

Digital Realty and Oracle Have a Strong Customer-partner Relationship: Other Hyperscalers Should Take Note

Digital Realty has always had a strong relationship with Oracle, which is now Digital Realty’s third-largest customer, deploying in 38 locations and representing $170 million in annualized recurring revenue (ARR). It is hard to dispute Oracle’s success pivoting from a traditional software incumbent and SaaS provider to an IaaS challenger with OCI, which is on track to become Oracle’s largest cloud business in the coming years. Digital Realty astutely took notice of OCI’s role in the market, not to mention Oracle’s tight relationship with NVIDIA, which supplied Oracle with GPUs early in the AI wave.
 
Recent developments like connecting to Oracle’s EU Sovereign Cloud and offering NVIDIA-based OCI instances in its high-traffic Northern Virginia data center only reinforce Digital Realty’s role in powering OCI’s expansion. This is one of the reasons Oracle can not only boast more rapid footprint expansion than peers but also deliver on the “distributed cloud” message that nearly all hyperscalers are eager to convey. For perspective, Oracle holds only a single-digit share of the IaaS market, but its ability to leverage Digital Realty to expand ahead of peers is notable and something that other hyperscalers that are adamant about building their own data centers should recognize as they fight to capture net-new AI workloads.

SIs and Consultancies Pull It All Together at Scale for Enterprises

For IT services companies and consultancies, two needs mentioned above — orchestration and scale — illustrate how the colocation piece of the enterprise IT ecosystem can provide competitive opportunities.

Orchestration Is Critical and Most Often Handled by a Trusted SI Partner

Companies like Deloitte, Accenture and Infosys have the most complete view of an enterprise’s IT environment, positioning them best to coordinate vendors that provide disparate technologies. Most consultancies and SIs stay in well-defined swim lanes, delivering their added value while facilitating cloud, software and even hardware solutions from an enterprise’s suppliers. In TBR’s research, the market-leading consultancies and SIs use their industry knowledge, influence and reach within a client as the basis for orchestrating a full spectrum of technology providers, calibrated specifically to an enterprise’s IT needs.
 
As described above, colocation continues to be a pressing need, creating an opening for IT services companies and consultancies that have traditionally shied away from alliances that are too far removed from their core competencies. Just as alliances have formed around cloud computing and enterprise software, with IT services companies and consultancies delivering value through innovation, cost containment and managed services, partnerships with colocation specialists could add a compelling component to an IT services company’s orchestration value proposition. Consultancies’ and SIs’ business models depend on retaining clients and expanding footprint within clients. If colocation can become another differentiating factor and improve enterprise clients’ overall IT environments, SIs and consultancies will willingly seek partnerships.

Enterprises Want to Know That Their Solution Providers Can Scale as Demands Grow

If client retention remains critical to SIs’ and consultancies’ business models, scale increasingly marks the difference between average performance and market-leading results. No SI or consultancy can out-scale the largest players, but TBR’s research shows that companies that manage their alliances well can leverage their ecosystem for scale unattainable on their own. In short, no other company can be Accenture, but an SI or consultancy can replicate Accenture’s reach with the combined forces of a well-oiled tech and services ecosystem.
 
Colocation providers already play within the technology ecosystem but have not traditionally been considered a means for consultancies and SIs to increase scale. As AI and GenAI increase compute power demands and enterprises turn to their consultancies and ask, “How can I take advantage of all this new technology without exploding my IT budget?” and “How can I take this GenAI-enabled solution from pilot to my entire enterprise?” colocation can become a critical component.

The SI and Consulting Tech Evolution: From ERP to Cloud to GenAI to Colocation

In TBR’s view, SIs and consultancies will never become adept at selling those components of the technology ecosystem that are the furthest from their core competencies, but the market leaders have become exceptional at managing and expanding their ecosystems. TBR’s research around the relationships between the largest cloud providers and the largest SIs demonstrates how much revenue can be driven through alliances. As SIs and consultancies mature their partnering practices, colocation will become another element orchestrated by the likes of Deloitte, Accenture and Infosys. Quite possibly some smaller SIs and consultancies will use colocation as a critical component to scaling themselves into competitive positions against those larger players. As GenAI drives new demands — on compute power, budgets and expertise — TBR will closely watch these relationships between SIs and colocation providers.

Databricks Pivots Around Data Intelligence to Address GenAI Use Cases

Just like it did with the data lakehouse five years ago, Databricks is establishing another paradigm with data intelligence, which has the data lakehouse architecture at its core but is infused with generative AI (GenAI). Data intelligence was a key theme throughout Databricks Data & AI Summit and signals Databricks’ intentions to further democratize AI and ultimately help every company become an AI company.

A Brief Databricks Backstory

Founded by the creators of Apache Spark, Databricks is known as a trailblazer for launching new concepts in the world of data, such as Delta Lake, the open table format with over 1 billion yearly downloads, and the “lakehouse” architecture, which reflects Databricks’ effort to combine the best of what the data lake and data warehouse offer. Launched in 2020, the lakehouse architecture can handle both structured and unstructured data, and addresses the data engineer and business analyst personas in a single platform.

 

Delta Lake and Unity Catalog, which governs the data stored in Delta tables, serve as the basis for the lakehouse architecture and are part of Databricks’ longtime strategy of simplifying the data estate and, by extension, AI. But with the advent of GenAI, which is causing the amount of unstructured data to proliferate, Databricks has spearheaded yet another market paradigm, pushing the company beyond its core areas of data ingestion and governance into data intelligence.

 

At the heart of data intelligence is the lakehouse architecture and also Mosaic AI, the rebranded result of last year’s MosaicML acquisition that equipped Databricks with the tools to help customers train, build and fine-tune large language models (LLMs). These also happen to be the same technologies Databricks used to build its own open-source LLM ― DBRX ― sending a compelling message to customers that they, too, can build their own models and use the Mosaic AI capabilities to contextualize that data and tailor it to their business, thus achieving true data intelligence.

What Is Data Intelligence?

Databricks’ executives and product managers largely communicated the definition of data intelligence through demonstrations. One of the more compelling demos showed how Mosaic AI can be used to create an agent that will build a social media campaign, including an image and caption for that campaign, to boost sales.

 

The demo depicted how a user can employ transaction data as a tool to supplement a base model, such as Meta’s Llama 3. It also highlighted one of Databricks’ product announcements, the Shutterstock ImageAI model, which is built on Databricks in partnership with Shutterstock and marks Databricks’ foray into the multimodal model space.

 

The exercise created an image for the fictional social media campaign that included a company’s bestselling product — chosen through transaction data — and a catchy slogan. But to convey the contrast between data intelligence and general intelligence, the demonstrator removed the “intelligence” ― all the data-enabled tools that exist in Unity Catalog ― and generated the image again. This time, the image did not include the bestselling product and was accompanied by a much more generic slogan.

 

This demo reinforced the importance of contextualized data in GenAI and the role of Unity Catalog, which helps govern the data being used, and Mosaic AI, which allows developers to use enterprise data as tools for creating agents (e.g., customer support bots).

 

Data intelligence is about not only the context behind the data but also making that context a reality for the enterprise. For instance, in the above scenario, the demonstrator was able to put the image and slogan into Slack and share it with the marketing team through a single prompt. In this example, it is clear how a customer with Databricks skills could use GenAI in their business.
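The contrast the demo drew can be mimicked in a few lines of plain Python. This is a toy mock, not Databricks’ Mosaic AI API: the model call is a stub, and every function name and data value here is invented for illustration.

```python
from collections import Counter

# Toy transaction data standing in for a governed table in Unity Catalog.
transactions = [
    {"product": "trail shoes"}, {"product": "trail shoes"},
    {"product": "water bottle"}, {"product": "trail shoes"},
]

def bestselling_product(rows):
    """A 'tool' the agent can call: derive the bestseller from enterprise data."""
    return Counter(r["product"] for r in rows).most_common(1)[0][0]

def generate_campaign(prompt):
    """Stand-in for a base LLM call (e.g., a Llama 3 endpoint); a stub here."""
    return f"Campaign brief: {prompt}"

# With 'intelligence': the agent grounds the prompt in the transaction data.
grounded = generate_campaign(f"promote our bestseller, {bestselling_product(transactions)}")

# Without it: the same base model produces a generic, ungrounded campaign.
generic = generate_campaign("promote our products")
```

The grounded prompt names the bestseller ("trail shoes" in this mock), while the generic one cannot, which is the gap between data intelligence and general intelligence that the demo illustrated.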

Databricks’ Acquisition of Tabular Is a Blow to Snowflake and a Surefire Way to Stay Relevant in the Microsoft Ecosystem

As a company born on the values of openness and reducing lock-in, Databricks pioneered Delta Lake to ensure any engine can access the data sitting in a data lake. Delta Lake remains the most widely adopted lakehouse format today, handling over 90% of the data processed in Databricks, and is supported by other companies, as 66% of contributions to the open-source software come from outside Databricks.

 

But over the past few years, we have seen Apache Iceberg gain traction as a notable alternative, garnering significant investment from data cloud platforms, including Snowflake. When Databricks announced its acquisition of Tabular ― created by the founders of Apache Iceberg ― days before the Data & AI Summit, it signified a strategic shift that will help Databricks target a new set of prospects who are all in on Iceberg, including many digital natives.

 

The general availability of Databricks’ Delta Universal Format (UniForm), which helps unify tables from different formats, indicates the company’s intention to make Delta and Iceberg more interoperable and, over time, potentially reduce the nuances between both formats, though this may be a longer-term vision.
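For readers curious what enabling UniForm looks like, the sketch below follows the table-property pattern in Databricks’ UniForm documentation; the table name and schema are illustrative, and a live Databricks or Spark session is assumed for execution.

```python
# Illustrative DDL only; assumes a Databricks/Spark session to execute it.
# The property names follow Databricks' UniForm documentation.
uniform_ddl = """
CREATE TABLE sales_uniform (order_id BIGINT, amount DOUBLE)
USING DELTA
TBLPROPERTIES (
  'delta.enableIcebergCompatV2' = 'true',
  'delta.universalFormat.enabledFormats' = 'iceberg'
)
"""
# spark.sql(uniform_ddl)  # data is written as Delta while Iceberg metadata
#                         # is generated, so Iceberg clients can read the table
```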

 

The Tabular acquisition in some ways also marginalizes Snowflake’s steps to become more relevant as a Microsoft Fabric partner. Databricks, which is available through Azure as a first-party native service, has always had a unique relationship with Microsoft, and Delta serves as the basis for Microsoft Fabric. But Microsoft’s recent announcement that it will support Iceberg tables with Snowflake in a push for more interoperability was notable, and now with Tabular, Databricks can ensure it remains competitive in the Microsoft Fabric ecosystem.

It Is All About Governance

First announced three years ago, Unity Catalog has emerged as one of Databricks’ more popular products, allowing customers to govern not just their tables but also their AI models, an increasingly important component in GenAI.

 

At the event, Databricks announced it will open source Unity Catalog, which we watched happen during the Day 2 keynote, when Unity Catalog was uploaded to GitHub. Even with Unity Catalog’s mounting success, this announcement is not surprising, as it only reinforces the company’s commitment to fostering the most open and interoperable data estate.

 

It is very early days, but open sourcing Unity Catalog could help drive adoption, especially as governance of GenAI technologies remains among the top adoption barriers.

Databricks SQL Is Gaining Momentum

It is no secret that Databricks and Snowflake have been moving into one another’s territories. Databricks, with its expertise in AI and machine learning (ML), has been progressing down the stack, trying to capture data warehouse workloads. Snowflake, with its expertise in data warehousing, is looking to get in on the AI opportunity and address the core Databricks audience of data scientists and engineers.

 

Snowflake’s early lead in the data warehouse and strong relationship with Amazon Web Services (AWS) could be making it more difficult for Databricks to attract workloads. Combined with the enormity of the market, there may never be a scenario in which Databricks becomes a “standard” in enterprise accounts for data warehousing. But Databricks’ messaging of “the best data warehouse is a lakehouse” certainly seems to be working.

 

Traditionally, customers have come to Databricks for jobs like Spark processing and ETL (Extract, Transform, Load), but customers are increasingly looking to Databricks for their data warehouse. These customers fall into two groups. In the first group, customers on legacy systems, such as Oracle, are fed up with the licensing and are looking to modernize. In the second group, existing cloud customers are looking for a self-contained environment with less lock-in, compared to vendors like Snowflake, or are seeking to avoid challenges with system management and scale after having worked with hyperscalers.

 

As highlighted by Databricks Co-founder and Chief Architect Reynold Xin, Databricks SQL is the company’s fastest-growing product, with over 7,000 customers, or roughly 60% of Databricks’ total customer base. During his keynote, Xin touted that startup time for Databricks SQL Serverless has improved to five seconds and that automatic optimizations have made BI workloads four times faster than they were two years ago. Provided Databricks can continue to enhance performance while pushing the boundaries on ease of use to better compete with Snowflake and other vendors in attracting less technical business personas, we expect this momentum will continue and will challenge competitors to raise the bar for their own systems.

Databricks Is Bringing an Added Layer of Value to the BI Stack

Databricks AI/BI is a new service available to all Databricks SQL customers that allows them to ask questions using natural language (Genie) and perform analytics (Dashboards). In a demo, we saw the two user interfaces (UIs) in action: BI offers common features like no-code drag and drop and cross-filtering, and AI includes the conversational experience where customers can ask questions about their data.

 

Databricks AI/BI may lack some of the complex features of incumbent BI tools, but ultimately those features are not the goals of the offering. The true value is in the agents that can understand the question the business analyst is asking and hoping to visualize. Databricks’ approach exposes the challenges of bolting generic LLMs onto a BI tool. But the company is not interested in keeping this value confined to its own BI capabilities. Staying true to its culture of openness, Databricks announced at the event that it will open up its API to partners, ensuring Power BI, Tableau and Google Looker customers can take advantage of data intelligence in these BI environments.

Conclusion

With its lakehouse architecture, which was founded on the principles of open-source software and reduced lock-in, Databricks is well positioned to help customers achieve data intelligence and deploy GenAI. The core lakehouse architecture will remain Databricks’ secret sauce, but acquisitions, including those of MosaicML and Tabular, are allowing Databricks to broaden the scope of its platform to tap into new customer bases and serve new use cases.

 

If Databricks can continue to lower the skills barrier for its technology and sell the partner ecosystem around its platform, the company will no doubt strengthen its hold on the data cloud market and make competitors, including the hyperscalers in certain instances, increasingly nervous.

Blending Industry Expertise with Cybersecurity Credibility: Insights From PwC’s EMEA Financial Services Team

Compelling Cybersecurity Needs Meet PwC’s Capabilities

A July 1, 2024, briefing by PwC’s EMEA Financial Services (FS) team provided TBR with a closer look at PwC’s largest industry practice by revenue and the ways the firm has blended industry expertise with cybersecurity managed services experience and credibility. Julian Wakeham, UK EMEA Consulting Financial Services leader; Moritz Anders, Digital Identity lead, Cyber Security & Privacy, Germany; and Joshua Khosa, Service lead, Cyber Managed Services, Germany, steered the discussion for PwC.

 

Anders said FS clients’ three compelling cybersecurity needs — compliance, cost optimization and talent — shaped PwC’s approach to cybersecurity managed services and, in TBR’s view, will be consistent revenue drivers for PwC as those needs will be perpetual. The challenges around recruiting and retaining highly specialized cybersecurity experts, for example, remain outside the core functions of most enterprises, yet the cybersecurity risks continue evolving, necessitating that consultancies step into that role. A significant part of PwC’s value, therefore, comes from assembling and deploying experts in both cybersecurity and the underpinning enterprise technologies.

 

Critically, according to Anders, PwC has approached cybersecurity managed services not as an IT play, where it can simply throw technology and people at the problems, but as an ongoing business challenge best tackled through a highly automated architecture and a sustained focus on business outcomes.

 

Echoing the three compelling cybersecurity needs highlighted above, Anders and Khosa provided details about a use case with a Europe-based bank that delivered three clear business outcomes: “compliance and audit readiness, operational efficiency, and enhanced security,” with the last relying, in part, on PwC using its alliance partners to keep emerging technologies and updates flowing to the client.

 

In TBR’s view, the compliance and audit-readiness components reflect PwC’s legacy strengths and brand around governance, risk, and compliance, and the operational efficiency outcomes build on the firm’s decades-old emphasis on and experience with operations consulting. In short, PwC continues playing to its strengths.

 

At the end of the briefing, the PwC team was asked why this particular Europe-based bank chose PwC for a complicated, multiyear cybersecurity managed services engagement. Anders said PwC remained direct and humble throughout the selection process, informing the client, without marketing spin, what PwC could and could not do well.

 

Among the strengths PwC brought to the table, according to Anders, was Europe-based talent at scale, in contrast to competitors, which relied on offshore resources. Wakeham noted PwC’s flexibility, focus on business problems (and not just selling technology solutions), PwC’s Industry Edge+ as a key enabler for business model reinvention, and the “deep trust” PwC’s clients have in the firm. 


Business Model Reinvention, Ecosystem Strategy and Expansive Capabilities

Reflecting on the EMEA FS briefing and previous discussions with PwC across topics and capabilities as diverse as people advisory services, IoT and generative AI, TBR made a few observations. First, PwC’s focus on “business model reinvention” was mentioned at the beginning and end of the discussion, with Wakeham acknowledging that the firm did not create that term or idea but explaining that PwC’s own market research indicated the importance to CEOs of that strategic focus. TBR reported earlier this year on PwC’s ideas around business model reinvention and notes that while previous strategic shifts have taken time to gain traction across the PwC member firms, business model reinvention appears to have considerable momentum and heft.

 

Second, PwC’s alliances strategy appears to be evolving as both the competitive and ecosystem landscapes change, with increased expectations that technology partners will bring business to PwC. In contrast to the usual equivocation and lack of details around how ecosystem partners can play a role in PwC’s go-to-market strategy, the EMEA FS team provided both a direct answer to TBR’s question about whether partners bring PwC into client projects and an explanation for the underlying reasons why software vendors would introduce PwC into an engagement.

 

PwC maintains a vast array of technology partnerships across cybersecurity, enterprise platforms, cloud, IoT and more, necessitating a well-managed ecosystem effort and providing extensive opportunities to gain new clients and expanded opportunities within existing accounts. Continually refining the ecosystem playbook will be vital to PwC’s continued success.

 

Lastly, PwC’s EMEA FS team provided another example of the breadth of the firm’s capabilities, an element of PwC’s value proposition that can sometimes be forgotten when focusing too intently on one piece of the overall firm. For example, in cybersecurity managed services, PwC brings expertise and capabilities in cyber incident response, smart cyber defense, cybersecurity upskilling, identity and access management, and OT & IT security, to name a few.

 

TBR believes the extent of PwC’s capabilities and offerings, while not unique, can sometimes be lost on clients and ecosystem partners that are focused on the immediate services the firm is bringing to their engagement. If PwC remains focused on business model reinvention and continues evolving its ecosystem strategy, the breadth of the firm’s capabilities will become the underlying strength that sustains PwC’s success.

SoftwareOne Strategy Brings Speed, Ease, Flexibility and Low Cost to SAP Clients in DACH Region

SoftwareOne’s Strategy for SAP Clients in DACH

In a June 2024 discussion with SoftwareOne’s DACH (Germany, Austria and Switzerland) leadership, TBR came away with three observations on what might make SoftwareOne a potentially unique player in the SAP ecosystem. At a minimum, SoftwareOne in DACH appears to be taking a different approach to the market opportunities created by the confusion around RISE with SAP, GROW with SAP, migrations to S/4HANA, and the 2027 deadline for the end of ECC support. From the presentation by and discussion with SoftwareOne’s Stephan Timme, president DACH; Vincenzo Boesch, sales leader for SAP Services; and Oliver Berchtold, service director DACH, TBR noted that:

 

  • SoftwareOne remains focused on customers’ current SAP environments and helping those customers get to S/4HANA and to the cloud. SoftwareOne is decidedly not focused on business processes. TBR believes that distinction, while subtle, matters because almost every other IT services company and consultancy in the SAP ecosystem of SoftwareOne’s scale and larger starts with identifying business problems and processes that need to be fixed and then declaring, “Hey, look at that, SAP RISE is a perfect solution to fixing these problems!” SoftwareOne does not dance around the business issues but instead gets straight to what it excels at: solving the SAP challenges of licensing, migrating and maintaining.
  • SoftwareOne recognizes SAP customers have been struggling with the confusion and costs around S/4HANA and RISE, along with all the other changes wrought by the end of support for ECC coming at the same time as the rise of generative AI (GenAI). During the SoftwareOne analyst event that TBR attended in Austin, Texas, in April, PF Grillet, SoftwareOne’s global SAP leader, told us there has “never been a more complex time for decision-making around SAP.” SoftwareOne’s answer to the confusion is a value proposition rooted in four promises: fast, easy, agile/flexible and cost-effective. TBR would summarize that value proposition and SoftwareOne DACH’s overall approach to clients in one word: pragmatic. And TBR suspects DACH clients value pragmatism.
  • When TBR noted the absence in SoftwareOne’s presentation of the SAP catchphrase “clean core,” Vincenzo Boesch explained that for him and his colleagues, SoftwareOne’s readiness services provided a comprehensive approach to what SAP calls clean core, emphasizing the need to prepare for migration, migrate only what is needed, and then maintain the benefits of a clean core throughout the transition and ongoing functioning of SAP within the client’s environment. SoftwareOne aims to remove complexity and streamline a customer’s environment, modernize and prepare customers for innovation, and then adopt an iterative transformation approach, more focused and agile, to continue the move to full clean core over the time frame best suited to the customer’s needs and capabilities. SoftwareOne recognizes few customers will have the means and the appetite to do it all in one go. In short, SoftwareOne is honest about the challenges, does only what is necessary and sets the client up for sustained success.

 

On the last point, TBR believes this approach to clean core likely fits the needs of small and midsize businesses with less complex IT environments. In follow-on discussions with TBR, SoftwareOne explained that the company is not targeting all customers but rather focusing on those customers that have limited need for business transformation, already have efficient business processes, and are driven to modernize their ERP and remove the mainly technological limitations of ECC. With enterprise clients migrating complex and highly customized SAP instances, SoftwareOne’s purely SAP-focused approach risks discounting business process change management costs and pushing complex, problematic processes into the future, setting clients up for aggravation, not success. Outright failure seems unlikely given SoftwareOne’s overall track record of success with SAP; a more likely outcome is that SoftwareOne sells additional consulting services related to complex migrations.

Stepping Back to Look at the Bigger SAP Picture

Just as there has “never been a more complex time for decision-making around SAP,” as Grillet said in April, there also has never been a better time for a services firm to take a pragmatic approach to helping small to midsize businesses migrate to S/4HANA. As of 4Q23, GROW with SAP had amassed only 700 customers, which pales in comparison to the estimated 24,000 customers on ECC, 14,600 on S/4HANA and 8,800 on S/4HANA Cloud. Furthermore, GROW’s install base skews toward new logos, suggesting many legacy SMB customers have been comparatively slow to adopt the offering.
 
SAP will look to change this trend, primarily through its ecosystem, which will present opportunities for partners focusing on the SMB segment. The partners that succeed will recognize the oversized impact that cost and complexity have on SMBs, which are more likely to be resource-constrained in technical staff relative to larger enterprises. These customers are more apt to pursue the simplest path to the cloud possible, aligning closely with how SoftwareOne is positioning its services.
 
The contrast between SoftwareOne’s pragmatic approach and the strategies of much of the rest of SAP’s services ecosystem is becoming more stark. While RISE and GROW still hold S/4HANA migration at their core, SAP has become vocal about the offerings’ ability to drive multiproduct sales motions. In 1Q24, SAP released new add-on packages for RISE that bundle line-of-business (LOB) suites in finance and supply chain, and the company will look to partners to lead the charge in their adoption.
 
Yet, cross-selling initiatives risk adding to the complexity of an already challenging migration and implementation process. In the coming years, efforts to encourage business AI adoption will add to the noise for customers simply looking to bring their ERP deployments to the cloud. Many customers, especially SMBs, will appreciate a service provider whose value proposition is a fast, easy, agile/flexible and cost-effective migration. This will be a key differentiator for SoftwareOne as the vendor positions as a services provider capable of cutting through the noise and guiding customers toward the path of least resistance.

Minding the Minefields Before Stepping in Them

During the early days of robotic process automation, some IT services companies and consultancies advised clients to automate their processes before evaluating the efficacies and benefits, betting that automating even less-than-optimal processes would generate cost savings. Predictably, that bet did not always pay off.
 
Two considerations for SoftwareOne: First, the company should ensure that its message and what it delivers reinforce the pragmatism of speed, flexibility and value. In other words, the company must promote that message well enough that its approach and value proposition stand out in a large and noisy market. Second, in TBR’s view, while Boesch and his colleagues presented a compelling value proposition for SoftwareOne’s DACH clients, this approach will not necessarily apply to the customers that need a broader business process transformation.
 
For more than a decade, many consultancies and IT services companies have been stressing a business-first, technology-second mindset, but TBR believes SoftwareOne’s approach will appeal to clients – perhaps especially in DACH but certainly also beyond it – that are similarly focused on pragmatic, technology-first outcomes.

SoftwareOne: Gritty, Determined, Local

SoftwareOne’s leadership hosted about 20 analysts for presentations, client briefings and breakouts dedicated to specific SoftwareOne solutions and services. In addition to attending the formal program, TBR analysts met one-on-one with select SoftwareOne leads. The following reflects both summit presentations and TBR’s ongoing research of and discussions with SoftwareOne.

Simplify access to technology

In the year since TBR attended SoftwareOne’s inaugural analyst event in Milwaukee, the company has refined its value proposition and further established its place in the market, in large part by expanding its IT services capabilities and revenues.

 

During his event-opening presentation, Duffy outlined five client priorities that SoftwareOne intends to tackle, all of which echoed the company’s value proposition:

  1. Simplify access to technology
  2. Maximize ROI on technology spend
  3. Enhance workforce productivity
  4. Accelerate cloud adoption
  5. Fast-track results in the AI and generative AI (GenAI) era

 

The first two priorities align with TBR’s Voice of the Customer research, which shows that IT services and consulting buyers want to get more from their IT investments and expect new technologies, in particular GenAI, will be complementary, compatible and immediately additive in relation to existing technologies.

 

Overall, in TBR’s view, SoftwareOne has captured the current market vibe and positioned itself well to continue its transformation from a VAR, primarily involved in cloud and software resale, to a truly full-service IT company.

SoftwareOne: “We can predict when they are going to do what”

During the opening presentation, Duffy said that SoftwareOne understands when clients bought their various IT components, how much they paid, how happy they are with IT performance, and when their licenses are up for renewal. In short: “What do clients own and how do they build things?”

 

That insight, combined with SoftwareOne’s overall approach, allows the company to look at the market through a “customer’s lens” and not through SoftwareOne’s own delineations of its offerings. In TBR’s view, this distinction — which starts with a recognition of how customers think — often becomes clouded by organizational constraints and sales demands within many IT services companies and consultancies. Starting with the client’s business challenges in mind is relatively easy; gaining intimate knowledge of their IT environment and spend is not.

 

Schlotter, in his presentation, noted that SoftwareOne can “predict when [clients] are going to do what,” because the company has tracked patterns — including software renewals, adoptions, and wholesale changes — across its vast client base. Thomson deepened that point by saying that for SoftwareOne, “IT portfolio management is our value proposition.” Importantly, SoftwareOne sees self-funding innovation as the flywheel that takes the company’s understanding of a customer’s IT environment, including opportunities to optimize that environment, and turns it into new value.

 

Through initial software cost takeout, ongoing IT asset management (ITAM) services and a focus on licensing expenses, SoftwareOne helps CIOs free up capital to reinvest in technology modernization. As Thomson noted, “Licensing costs were last year’s problem,” so SoftwareOne leads with the value that will come through enhanced technology but always makes explicit that funding will come through savings around licensing. In TBR’s view, this is SoftwareOne’s defining strategic advantage. Full stop.

 

One last point on SoftwareOne’s thinking about the market and the company’s place in it: During her breakout presentation, Burke emphasized the continuing need to recognize regional differences among clients and to manage, sell and deliver to all clients in a local manner. The technology, by its nature, may be region-agnostic, but clients are, by their nature, local. SoftwareOne, therefore, embraces a be-local, stay-local culture. One can see the echo of intimate knowledge of a client’s IT environment in being enmeshed in a local environment.

Sustained growth is an alliance play

Berry, a recent addition to SoftwareOne, explained the company’s new approach to running partnerships “like a business.” While the phrase risks sounding overly transactional, Berry framed SoftwareOne’s approach as being “proactive with a set of prioritized [strategic partners]” whereby SoftwareOne would invest in joint solutions and joint go-to-market support while “continuing to support clients gaining access to many software vendors.”

 

Alliance partner managers would evangelize within SoftwareOne, serving as the face of their assigned partner internally while also educating that partner externally on SoftwareOne’s value proposition and capabilities. According to Berry, the alliance partner managers would maintain strategic relationships while also operating “as a business” and supporting and influencing sales efforts.

 

Recognizing that SoftwareOne’s client base differs from those of the giant IT services companies and global consultancies, Schlotter said that while the majority of customers have already made a commitment to one cloud or another, the corporate (not enterprise) market has less mature cloud environments and remains open to “re-migration” advice — i.e., move to another cloud — from SoftwareOne.

 

In TBR’s view, having the capability to serve those chance encounters with a willing cloud-hopper should not distract from the overwhelming reality that most corporate and enterprise clients have already made some kind of commitment to a cloud provider.

 

On specific alliances, Berry and Schlotter noted Microsoft’s importance to SoftwareOne’s revenues and long-term strategy, including the short-term opportunities around Copilot. In Schlotter’s view, “Copilot is a door-opener because now the CIO needs to talk to us.” From the hyperscalers’ perspective, according to Schlotter, among SoftwareOne’s values is the company’s ability to make hyperscalers’ value even “stickier” at a client.

 

In TBR’s view, no single nonfinancial metric has been more passionately sought by consultancies, IT services companies and hyperscalers than client retention.

 

During informal discussions and a breakout session with Grillet, two points struck TBR as particularly relevant to SoftwareOne’s partners in the SAP space. First, out of SoftwareOne’s 400 SAP specialists, only 32 reside in the Americas, providing an opening for SoftwareOne’s partners to bring opportunities and greater scale. Second, Grillet noted that around half of SoftwareOne’s new SAP leads come from the company’s own ITAM business, demonstrating — to SAP — a clear differentiation for SoftwareOne. One final note: TBR has reported previously on SoftwareOne’s SAP practice and continues to view the company’s strategy as exceptionally well-suited to SoftwareOne’s capabilities and market position.

Not every superhero wears a cape

TBR’s event perspectives typically include extensive recaps of companies’ client stories, product demonstrations and performance metrics, but we left those elements out of this report. During the event, SoftwareOne provided numerous client stories, extensive details about its capabilities and numbers to support its growth, but we wanted to emphasize that SoftwareOne represents a different kind of competitive threat and partnering opportunity, independent of the usual evaluation metrics. Intimate knowledge of a client’s IT environment, including licensing challenges and opportunities as well as usage and costs, provides a superpower potentially significant to technology partners and threatening to competitors, especially as we move into a better-analyzed and more transparent GenAI era. Forewarned is forearmed, as they say.

Informatica Unveils ‘ChatGPT for Enterprise Data’ Amid GenAI Boom

Disruptive technology is anything but stagnant. From the hyperscalers pushing the parameters of their latest large language models (LLMs) to other firms trying to provide their own models in an arguable race to the bottom, the lines of the generative AI (GenAI) stack are blurring.

 

In many ways, this activity is good, elevating innovation, competition and the role of the partner ecosystem, which will increasingly define how cloud vendors go to market in years to come. But at the same time, it is fueling the proliferation of unstructured data and adding to the already complex and redundant IT estate with yet another LLM, another vector database, and more third-party AI orchestration and integration tools.

 

Informatica is on a mission to break through this added complexity, honing the metadata system of record that defines Informatica Data Management Cloud (IDMC) to account for the latest models, databases and integrations that must play friendly with customers’ existing applications.

 

Of course, the challenge remains getting customers to think about data ahead of GenAI amid all the hype. At TBR, we often talk about how GenAI models are only as good as the amount and quality of data that are fed into them. Garbage-in, garbage-out should be self-explanatory when it comes to data’s role in LLMs and GenAI, and yet effective data management is still often overlooked or deployed too late within the enterprise.

 

Corporate culture, executive pushback and integrating point solutions remain among the top obstacles, and according to TBR’s 2H23 Cloud Infrastructure & Platforms Customer Research, 35% of respondents are taking a best-of-breed approach to data management, resulting in fractured data silos. That is why the Informatica World 2024 opening keynote theme — “Everybody’s ready for AI except your data” — rang true and speaks to an opportunity for Informatica, and increasingly its consulting partners, to address as customers try to navigate GenAI’s complexities and ultimately take advantage of GenAI’s opportunities.
 


“CLAIRE GPT is the ChatGPT for Enterprise Data”

IDMC is the culmination of a five-year, $1 billion R&D investment, and executives clearly articulated that the pace of innovation will only accelerate, with a meticulous focus on improving the native functionality in each of IDMC’s seven core modules.

 

At the event, product leaders for each of these components (e.g., Data Governance, Integration PaaS [iPaaS], MDM [Master Data Management] & 360 Apps) got on stage to showcase their innovations over the past year. They discussed not only core innovation, which could be anything from improving the user experience to adding new prebuilt connectors and extensions, but also a set of innovations specific to CLAIRE — for example, an automated assessment of metadata in MDM.

 

All these newly added capabilities highlight Informatica’s goal of infusing GenAI into every component of the IDMC platform. Perhaps the best example of this strategy, though, is CLAIRE GPT, which was first announced at last year’s Informatica World conference and is now generally available. The year 2023 was largely one of execution, as Informatica worked to put the AI engine, trained on the vast amounts of intelligent metadata already sitting in CLAIRE, through preview, solicit customer feedback, and prepare for its general release with a fresh message: “CLAIRE GPT is the ChatGPT for enterprise data.”

 

To see this message in action, we were given a hypothetical scenario in which a marketing operations analyst tasked with retaining disengaged customers uses natural language prompts in CLAIRE to find the right data sets and ask questions about that data. For example, the analyst can ask CLAIRE GPT to list data sets for marketing campaign analysis, and CLAIRE will pull the appropriate data, including those sitting in multiple systems like data warehouses and BI tools.

 

But delivering capabilities like CLAIRE GPT within IDMC for customers’ GenAI projects is only one piece of the puzzle. Informatica is also using GenAI to improve the underlying platform capabilities and deliver a unified natural language experience in IDMC. In a move that is now table stakes among SaaS, PaaS and IaaS vendors, Informatica is creating its own copilot — CLAIRE Copilot — to auto-classify and enrich master data.

IDMC Modernization Initiatives Well Under Way

Aside from new IDMC capabilities and the release of CLAIRE GPT, 2023 was a major year for modernization, a big theme at this year’s event. With license revenue down 68% year-to-year in 2023 and cloud subscription revenue up roughly 40% over the same period, Informatica is executing on its strategy of ceasing new license sales and enhancing programs to help customers move to IDMC in the cloud. This includes strengthening the existing modernization packages for legacy PowerCenter, MDM and Governance workloads with fresh automated tooling, including CLAIRE Copilot for PowerCenter (available in 2H24); increasing partner certifications; and growing these programs within larger accounts.

 

One of the more compelling customer modernization success stories from the event was Takeda Pharmaceuticals, which in the span of 18 months was able to consolidate the 12 data centers still running PowerCenter down to three.

 

Chief Data Officer Barbara Latulippe told a story of how Takeda wanted to become GenAI-ready but recognized that modernization was the initial step and that migrating the low-risk, simple applications first was the recipe for success. The second phase for Takeda was consolidating six different MDM environments into one, which ultimately paved the way for the company to adopt other IDMC capabilities, including the Informatica Cloud Data Marketplace, on which the company is live today. As was the case with Takeda, customers that go through the modernization process will ultimately ask how GenAI can be used to create value.

 

We expect data marketplaces to play an increasingly important role here; if a customer needs to automate a certain business process and thus locate the right LLM for that process, as well as the right data to contextualize said LLM, one-click experiences will go a long way, further democratizing GenAI and making it fairly seamless for customers to turn a regular application or process into one that is GenAI-enabled.

Informatica’s Relationships With All Hyperscalers Are Maturing, but Work with Microsoft Fabric Is a Big Win for Both Companies

For any cloud platform company, integrating with all the major hyperscalers is no longer a unique strategy but a prerequisite. As we continue to follow Informatica, however, there are some strategic ecosystem moves the company is making in both its product and go-to-market strategies to reach new customers and protect its positioning as the most neutral data management provider.

 

One key example is Informatica’s fast-growing relationship with Oracle, which Informatica formally recognized as a hyperscaler partner two years ago. While other platform companies are closely aligned with Amazon Web Services (AWS), Azure and Google Cloud Platform (GCP), many have yet to account for OCI (Oracle Cloud Infrastructure), which could be due to OCI’s relative lack of ecosystem maturity, including how Oracle cosells with ISVs in the cloud.

 

But as a key member of the legacy Oracle Database ecosystem for some time, Informatica has been able to quickly elevate its relationship with Oracle around OCI and now serves thousands of Oracle customers in North America alone. Informatica recently made Data Governance & Catalog available natively on OCI — ahead of GCP, which will support this capability later this year — and is expanding to more OCI regions.

 

But the most transformative ecosystem development highlighted at the event was Informatica’s work with Microsoft, specifically how it is filling whitespace within Fabric, Microsoft’s own answer to data management in its broader pursuit of GenAI dominance. Specifically, Informatica is coming in with its Cloud Data Quality solution native to Fabric, which means that as customers ingest data into Microsoft Fabric’s OneLake repository, Informatica ensures the data can be automatically profiled and assessed in real time.

 

Unlike many other partners that are joining Microsoft’s partner program as ISVs building and selling applications on top of Fabric, Informatica has signed on as a design partner, an exclusive invite-only track that allows partners to create entirely new workloads consumed natively within the Fabric solution. This distinction is important, as it essentially embeds Informatica’s Data Quality tooling directly within the customers’ analytics workflow and ensures Informatica can offer Microsoft’s roughly 11,000 Fabric customers access to IDMC in the back end and increase the likelihood they explore other aspects of Informatica’s platform.

 

Though Fabric is touted as an end-to-end SaaS solution serving the entire data life cycle, Microsoft understands its gaps and recognizes the trust customers place in Informatica when it comes to data integration, quality and governance. Letting Informatica fill whitespace around Fabric, an integrated platform that is underpinned by a data lake architecture, is a strategic move for Microsoft and should be noted by Azure competitors that similarly partner with Informatica but in some ways lack Microsoft’s platform mentality; to us, the Fabric-Informatica integrations only reinforce how aligned Azure and Informatica are as two platform-centric companies understanding the symbiotic relationship between data and GenAI.

Final Take

In many ways 2023 was an execution year for Informatica. Since last year’s event, when Informatica showcased CLAIRE GPT, the company has been adamantly focused on bringing the GenAI engine into general availability, all the while delivering a broader and deeper set of AI-rich features natively within IDMC.

 

Meanwhile, with momentum for PowerCenter migrations, which in 1Q24 accounted for 80% of all Informatica’s modernization deals, and the early success of the MDM program, modernization has been a key initiative for the company, driving cross-selling opportunities and pushing Informatica closer to the $1 billion cloud annualized recurring revenue mark.

 

Rallying a cohesive ecosystem — made up of hyperscaler and global systems integration partners that are expanding their certifications as part of these modernization programs — around IDMC will remain key to Informatica’s success, ultimately allowing the company to cross-sell more services and ensure IDMC becomes the de facto way customers get their data ready for GenAI.

PwC Touts India as Strategic Growth Hub, Investing in the Country’s Tech and Talent for Long-term Gains

Vibrant, Multinational and Driving Change: India and PwC India

Consensus among presenters and attendees that India will be a massive growth market for PwC over the next few years underscored every aspect of the early May event, tempered only, perhaps, by the sentiment expressed by a number of PwC leaders and clients that India is a great growth market right now.

 

A shared assumption among the analysts and PwC leaders and professionals, based on the presentations and numerous sidebar conversations, was that today’s investments in India-based talent and technologies will be considered, in 10 years, to have been foundational and strategic for PwC. In short, everyone presenting was bullish on India as a market in its own right. Coincidentally, The Economist published a special report on India the week before the analyst event and noted, “Since 2012, India has been the world’s fastest-growing large economy” and “the number of new business registrations in India has tripled since 2015.”

 

Martin Scholich, PwC’s deputy global advisory leader, described India as an “open, fast-growing, technology-driven market.” Arnab Basu, advisory leader for PwC India, added that the Indian economy has spurred the development of more “vibrant, large, multinational Indian companies,” clearly a plus when it comes to expanding PwC’s India client base.

 

Ritu Rekha, a partner and leader of finance transformation and a co-leader of business transformation for PwC India, noted the recent expansion of Global Capability Centers (GCCs) in India and explained that the value proposition of GCCs was evolving from cost savings to capabilities expansion to innovation, with PwC facilitating those changes and enabling growth. Basu added that the changing nature of the work completed at these centers provides a greater opportunity for PwC to bring consulting value to clients.

 

PwC India Chair Sanjeev Krishan noted the Indian government’s desire to see GCCs expand beyond traditional IT hubs and call centers into facilities focused on engineering services and innovation. Krishan added that PwC India is aligning with India’s road map and the priorities outlined by the Indian government, to include a sharpened focus on certain industries, and supporting the domestic market (India businesses built to serve India clients).


Krishan rounded out his talk by asserting that disruption around the world could be a net positive, as India is ahead of the curve in not only adapting to technology and business shifts but also mitigating other countries’ supply chain challenges. Excitement and growth, combined with opportunities rooted in chaos elsewhere but grounded in increasing internal stability, make a good recipe for a country as well as for a professional services firm.


GenAI for the Many Indias

When talking about generative AI (GenAI) — whether in a breakout session dedicated specifically to the technology, during an individual client case study or as part of a larger discussion about the firm — PwC’s leaders repeated a consistent refrain around skills: The technology itself is not a challenge, but people can be.


PwC needs professionals skilled enough to not only develop the GenAI-enabled solutions that clients want but also bring in the right expertise to meet a client’s broader-than-technology goals. In short, GenAI has further accelerated training imperatives across the full spectrum of everything PwC delivers. GenAI is not replacing people but challenging the firm to make everyone more valuable.


During a panel session, Scott Likens, PwC’s global AI and innovation technology leader, explained that the firm adopted a “people-first strategy” with respect to all AI, with an emphasis on safety and responsibility. He further noted that within PwC at the partner level, training around GenAI has been greater quantitatively than any other nonmandated training, a sentiment echoed by other PwC partners in attendance.


Turning from internal to external support, Likens said that because “everyone can use GenAI,” the firm “must reskill and upskill,” adding that PwC’s clients have looked to the firm for help training their own professionals. Likens also noted that while GenAI tools can be global in nature and application, they — and the professionals using them — must adhere to local standards and regulations, echoing a recurring theme throughout the event: global solutions and capabilities need to be tailored to the local audience.


To that point, Rajnil Mallik, partner and leader of GenAI at PwC India, reminded analysts, “There’s not one India; there are multiple Indias,” with companies of all sizes across every sector operating in vastly different regions and markets. Mallik went on to explain that PwC’s clients have been seeking advice primarily around GenAI use cases, data management, risk and necessary organizational changes, and that the firm has been recommending that more mature clients create an internal GenAI lab where they can focus on all AI and analytics efforts. Mallik cautioned that “for GenAI to reach [its] potential, [there] must [be] some regulation” and made a comparison to the regulations and oversight of the oil and gas industry.


In a presentation focused on PwC India’s work with the government of India on a federal GenAI strategy, Santosh Misra, a PwC India partner, said the government’s priorities around GenAI started with compute power and data. The government of India understood the urgent need to address future skills through current investments, intended to fund multiple AI innovation centers, and vowed to work with the GenAI startup community.


Along with these priorities, PwC was advising the government on fully embracing Trusted AI, a sentiment circling back to Likens’ “people-first strategy.” In TBR’s view, the GenAI discussions during the event provided a continuation of PwC’s strategy and story from the last few years, including the firm’s emphasis on people (skills) and client needs (not just related to the technology). The new elements in May 2024 were PwC’s close alignment with the government of India’s GenAI strategy and the firm’s responsiveness to the realities of the Indian market.

A Hunger for Partnership: Why Clients Want to Work with PwC

Notably reviving one successful component from PwC India’s 2019 analyst event, the firm once again included eight separate and diverse client use cases, including manufacturing tires, providing financial services, and enhancing government operations. While each client story provided unique perspectives on PwC’s capabilities and strengths, three common threads stood out to TBR.


First, the clients who presented were almost all techies: CTOs, heads of IT, chief information security officers (CISOs), and one exceptional client who was both the CFO and head of IT at her company (an almost perfect persona for PwC). While every use case included some traditional consulting services, the underlying current remained centered on the technological capabilities and partnerships that PwC could bring to its clients. In TBR’s view, who PwC now serves demonstrates the long-gestating shift across the entire firm, from technology-agnostic professional services to technology-imbued consulting.


Second, the use cases themselves demonstrated the breadth of PwC’s consulting capabilities, from cybersecurity managed services to cloud migration to RISE with SAP to blockchain platform building. Again, PwC and its clients mentioned traditional consulting services, such as assessment and road mapping, but the variety of use cases highlighted the diversity in PwC’s capabilities and offerings.


Lastly, every client — including the greenfield RISE with SAP example — described a long-standing relationship with PwC and included cultural fit and shared values in their reasons for selecting the firm for their specific use case. In TBR’s view, those longitudinal relationships explain the kind of answers clients provided to the question, “Why did you work with PwC?” which included the following:

  • PwC displayed a “hunger for partnership.”
  • “Cultural match, speed and technology capabilities.”
  • Experience, capability, availability and consistency.


TBR recognizes that PwC does not have a monopoly on the multidisciplinary professional services model and that any collection of eight client use cases from a PwC peer could be as diverse. Not being unique does not detract from the impressiveness of a wide-ranging set of capabilities and offerings and the execution needed to bring those services to clients at scale.

Seamless Execution, or the Satisfaction of a Job Well Done

In contrast to consultancies that mostly sell road maps, continually add on services to extend an engagement, or sell the A team and then send in the C team, PwC is believed by its clients and its own staff to be the firm that gets the job done.


PwC leaders in India, both the PwC India partners and the global partners, repeatedly echoed each other in their commitment to delivering tangible results, not just planning and talking. As Krishan expressed during his keynote address, “How can I make [clients’ and PwC professionals’] execution seamless?”


As the consulting business model continues its slow and uncertain evolution, with PwC peers like EY and KPMG also searching — globally — for the best strategies and clients looking for more results and reduced spending on tech, PwC’s adherence to getting things done should resonate across the ecosystem and allow the firm to both retain its most important clients and expand, particularly in fast-growing markets such as India.