Comcast Business Advances its Enterprise Strategy Through AI-driven Innovation and Ecosystem Expansion

2026 Comcast Business Analyst Conference, Philadelphia, April 15-16, 2026 — A select group of industry analysts gathered at the Comcast Center in Philadelphia to hear from Comcast Business leaders about the progress and success of the unit’s sales and go-to-market strategies. The event again centered on the theme introduced at last year’s conference, “Everything, Everywhere, All at Once,” reflecting the increasingly complex operating environment customers face and Comcast Business’ role in helping them navigate change through integrated solutions. Building on this theme, Comcast Business emphasized the accelerating pace of innovation over the past year, underscoring advancements in AI and network capabilities as it aims to deliver solutions that keep pace with the speed of business transformation. The event was hosted by NBC News Business and Data Correspondent Brian Cheung and included a State of the Business session with Comcast Business President Edward Zimmermann, a Strategy & Vision session with Comcast Business Chief Product Officer Bob Victor, and an update on Comcast’s network from Chief Network Officer Elad Nafshi. The agenda also featured panel discussions with senior leadership, speaker sessions with Comcast Business customers, and fireside chats with high-profile thought leaders on AI development and trends.

TBR perspective

Since 2025, Comcast Business has accelerated its transition from a connectivity-led provider to a solutions- and platform-oriented partner for enterprise customers. The 2026 analyst conference highlighted the company’s focus on expanding share among global enterprises through continued investment in AI-enabled networking, cybersecurity and edge compute capabilities. This evolution reflects both opportunity and necessity. Enterprise growth is increasingly driving overall performance, while the SMB segment faces intensifying pricing pressure from fixed wireless access (FWA) and converged offerings.
 
At the same time, rapid advancements in AI are reshaping customer requirements, placing greater emphasis on low-latency connectivity, integrated security and real-time data processing. Comcast Business is positioning itself to capitalize on these trends by leveraging its network scale, partner ecosystem and managed services portfolio to deliver differentiated outcomes. However, success will depend on the company’s ability to execute, particularly whether it can monetize AI-driven capabilities and scale its global platform.

Impact and opportunities

Comcast Business drives revenue growth via enterprise expansion, while its SMB segment faces increasing headwinds

Comcast Business’ revenue performance remains relatively strong: the unit generated over $10.2 billion in 2025, exceeding its long-term goal of reaching $10 billion in annual revenue. Growth is increasingly driven by the enterprise segment, which expanded 13.1% in 2025, supported by the integration of acquisitions such as Masergy and Nitel. Additionally, the company now serves approximately 90% of Fortune 500 companies in some way. Comcast Business is also expanding its focus on multinational enterprises, leveraging partnerships with global operators across more than 130 countries.
 
Despite this momentum, the SMB segment — the company’s largest revenue contributor — is becoming increasingly challenging. Competition from FWA providers and converged offerings in the U.S. market is intensifying pricing pressure as small businesses gravitate toward lower-cost “good enough” connectivity solutions. These dynamics contributed to a net loss of 48,000 business customer relationships in 2025, compared to a net loss of 16,000 in 2024 and net additions of 17,000 in 2023. TBR believes the majority of these losses occurred within the SMB segment.
 
To offset customer losses, Comcast Business is increasing its focus on cross-selling value-added services in areas such as mobility, SD-WAN, security and unified communications. For instance, Comcast Business reported that its enterprise customers now spend three times as much on value-added services, relative to core connectivity services, as they did in 2023. Comcast Business also expects to increase wireless revenue from larger businesses in 2026 through its new MVNO agreement with T-Mobile. The agreement covers up to 1,000 lines per account, which will enable Comcast to begin targeting the midmarket with wireless offerings, whereas its existing B2B MVNO agreement with Verizon is limited to 20 lines per account.

Comcast Business scales AI across its portfolio, network and operations

Comcast is expanding its use of AI from targeted, efficiency-driven applications to a more pervasive, embedded role across its network, solutions portfolio and customer engagement model. AI is now integrated across key areas, including network optimization, cybersecurity, sales enablement and customer experience, and is improving operational efficiency through internal use cases such as automated RFP development, deep research and meeting summarization. AI integration is enabling Comcast to automate over 99.7% of software changes across its network, supporting self-healing capabilities that can quickly resolve outages and, over time, help improve customer retention.
 
Comcast expects AI to not only enhance network and operational efficiencies but also create meaningful revenue-generation opportunities, though the company remains in the early stages of developing monetization strategies. For example, Comcast’s edge computing capabilities support ultra-low latency of less than 1 millisecond for many customers, positioning the company to enable advanced AI-driven applications such as AR/VR, which are more dependent on low latency than text-based use cases. Comcast Business is also exploring customer-facing AI use cases, including small-business concierge agents designed to manage front-desk functions such as greeting customers, scheduling appointments and handling routine inquiries. These efforts highlight the potential to extend AI-driven value beyond internal operations into customer-facing revenue opportunities.

The launch of Comcast Business Innovation Labs will accelerate the development of enterprise solutions

The company is advancing its enterprise strategy through the formal launch of Comcast Business Innovation Labs, an initiative designed to codevelop and rapidly scale first-to-market solutions for midmarket and enterprise customers. The lab brings together Comcast Business, customers and a broad ecosystem of technology partners to address specific business challenges, reflecting a more demand-driven approach to innovation. A key focus for Comcast Business Innovation Labs is supporting edge and AI-driven use cases by leveraging Comcast’s network capabilities and partner ecosystem.
 
Initial programs launched under Comcast Business Innovation Labs include a partnership with Dell Technologies to deliver managed edge compute for AI and real-time applications and a partnership with Digital Realty to enable seamless hybrid and multicloud connectivity through data center fabric services. Comcast Business is also collaborating with Expedient to support three core capabilities: AI operations at scale via Expedient’s Secure AI CTRL services, private cloud as a cost-efficient environment for workloads, and managed disaster recovery to support mission-critical applications.
 
TBR believes Comcast Business Innovation Labs strengthens the company’s ability to differentiate through ecosystem-driven innovation and faster solution development cycles, particularly as enterprise customers seek more tailored, outcome-based offerings. However, the long-term impact of the initiative will depend on Comcast Business’ ability to scale these solutions beyond pilot environments and integrate them effectively across its broader portfolio and go-to-market strategy.

Conclusion

The 2026 Comcast Business Analyst Conference highlighted the company’s evolution from a connectivity-focused provider to a solutions-oriented partner for enterprise customers. Comcast Business’ ability to surpass $10 billion in annual revenue and sustain double-digit enterprise growth underscores the effectiveness of its upmarket strategy, supported by acquisitions, global partnerships and an expanding portfolio of value-added services.
 
However, SMB, which accounts for the majority of Comcast Business’ revenue, is becoming increasingly challenging as FWA competition and macroeconomic pressures drive greater pricing sensitivity. These headwinds will require Comcast Business to further strengthen its value proposition to retain and grow its SMB base and combat competitive pressures in the market.

From Ecosystem to Execution, NVIDIA Shapes How AI Is Built and Run

NVIDIA’s increasing emphasis on physical AI signals that the company’s ambitions extend well beyond digital workloads. By linking its agent software stack with simulation, robotics and autonomous systems, NVIDIA is positioning itself as the foundational platform for both virtual and real-world AI applications. GTC 2025 established the importance of inference; GTC 2026 clarified that the next phase of AI will be defined by agents, and NVIDIA is building the infrastructure to power them from end to end.

Who Will Win the AI Services Race in the Next Wave of AI?

This quarter, TBR FourCast looks at Accenture, Capgemini, HCLTech and IBM Consulting, comparing how their underlying data strategies, especially related to engineering and integration, prepare them for advanced AI adoption.

Anthropic, OpenAI and Palantir: Who Gains and Who Loses in the Federal Fallout

With the largest global IT buyer’s biggest priority — AI — on the line, the stakes could not be higher

The U.S. federal government is the largest single buyer of IT services in the world, making it a critical customer target for leading providers in the space. For the current federal fiscal year (FFY), U.S. federal IT spending is estimated to approach $130 billion. Within that umbrella, the Department of Defense (DOD) is not only the largest driver of spending but is also expected to see the most significant growth, at an estimated 5% year-to-year. Cloud-delivered options have been increasingly important to the DOD, most notably with the $9 billion Joint Warfighting Cloud Capability (JWCC) contract in 2022 and the newest iteration of the vehicle, dubbed JWCC Next.
 
The shift to cloud continues, but AI has become the clear priority for the DOD’s large and increasing IT investments over the past six months. As outlined in TBR’s 3Q25 Federal IT Services Benchmark: “TBR believes federal agencies increasingly view AI as an essential technology for enhancing mission workflows rather than as a niche, specialized tool or tool set. As such, we anticipate broadly accelerating implementation of comprehensive, agencywide AI platforms in FFY26 and FFY27. FSIs [federal systems integrators] will be tapped to not only integrate AI into IT infrastructures but also develop secure and ethically sound foundations for AI adoption.”
 
All of this context underscores just how important the recent developments between Anthropic, OpenAI and Palantir are: they carry implications for the largest agency (the DOD) within the single largest buyer of IT in the world (the U.S. federal government) and for that buyer’s single largest technology priority (AI).

The downside of Anthropic’s position may have broad financial impacts

Anthropic took a firm stance that the DOD could not use the company’s Claude technology for mass civilian surveillance or in fully autonomous weapons. This position caused Anthropic to lose the contract and receive a designation as a national security risk, threatening its partnerships with other providers. For Anthropic, the loss is not just the $200 million DOD agreement ceiling it won in July 2025.
 
On Feb. 12, Anthropic executives said the company’s run-rate revenue was $14 billion, and it raised $30 billion at a $380 billion valuation that same month. A few weeks later, Anthropic executives told a court the Pentagon blacklist could reduce 2026 revenue by multiple billions of dollars. Company leadership has also argued that the formal legal scope is narrower than the political rhetoric and that it should apply only to Claude’s use in direct DOD contract work, not all business with contractors. Reuters reported the Pentagon left room for exemptions in “rare and extraordinary circumstances.”
 
That means the real financial risk is probably not one canceled award but rather a pipeline contamination: Contractors derisking away from Claude, slower federal conversions, and reputational drag in defense-adjacent enterprise sales. Put differently, the $200 million ceiling is only about 1.4% of Anthropic’s disclosed $14 billion run rate, so the “multiple billions” warning has to be about second-order effects, not just the contract itself.

OpenAI gains short-term incremental revenue opportunity but should benefit even more significantly long-term

For OpenAI, the near-term revenue uplift is real but probably less dramatic than the strategic win. OpenAI’s federal posture was already building before Anthropic’s rupture: the company launched and scaled usage of ChatGPT Gov; announced a $200 million-ceiling pilot with the DOD’s Chief Digital and Artificial Intelligence Office; struck a General Services Administration (GSA)-wide deal offering ChatGPT Enterprise to agencies for $1 per agency for a year; brought ChatGPT onto GenAI.mil, a platform used by 3 million civilian and military personnel; and most recently added an Amazon Web Services (AWS) route to sell models to U.S. defense and government agencies for classified and unclassified work.
 
Although the U.S. government activity is notable, it still represents a small portion of OpenAI’s overall revenue, which was rumored to have surpassed a $25 billion run rate as of early 2026. Put in this context, a $200 million government award represents only 0.8% of that run rate. However, the much bigger financial effect is strategic: Anthropic’s loss makes OpenAI the default frontier-model substitute for defense buyers, which should raise public-sector lifetime value, accelerate follow-on pipeline conversion, and strengthen valuation support.
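The contract-to-run-rate shares cited in the two preceding paragraphs can be verified directly. A minimal sketch, using only the figures disclosed in the article (the $200 million award ceiling and the $14 billion and ~$25 billion run rates):

```python
# Express the $200M award ceiling as a share of each company's
# disclosed annual run-rate revenue, per the figures cited above.
award_ceiling = 200e6  # $200M DOD agreement ceiling

def share_of_run_rate(run_rate: float) -> float:
    """Award ceiling as a percentage of annual run-rate revenue."""
    return round(award_ceiling / run_rate * 100, 1)

anthropic_share = share_of_run_rate(14e9)  # $14B run rate  -> ~1.4%
openai_share = share_of_run_rate(25e9)     # ~$25B run rate -> 0.8%
print(anthropic_share, openai_share)
```

Either way, the arithmetic confirms the article’s point: at roughly 1% of run rate, the direct contract value is small, so the material risk or upside lies in the second-order pipeline effects rather than the award itself.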

Palantir’s position as the government AI control plane is reinforced

For Palantir, the DOD’s shift in AI provider from Anthropic to OpenAI brings a very modest short-term financial upside and, more importantly, reinforces Palantir’s position underpinning U.S. government AI technologies. Palantir clearly leans into DOD engagement and lacks any qualms about the use of its technology by the military and controversial domestic agencies like Immigration and Customs Enforcement. Palantir’s revenue is also much more dependent on the government sector; in 2025, $2.4 billion of Palantir’s total revenue, or roughly 53.7%, was generated by government contracts.
 
Palantir’s FedStart program is an on-ramp that absorbs the federal compliance burden for other vendors on a usage basis, with Palantir handling ATO (Authority to Operate) conversations, compliance artifacts, continuous monitoring and control assessments. Anthropic joined FedStart in 2025, and Palantir integrates multimodal AI and partners with Microsoft to operationalize Azure OpenAI in classified government environments.

Explore deeper data and analysis

With TBR Insight Center’s interactive data visualization tool, your team can quickly adapt the thousands of data points within the AI & GenAI Model Provider Market Landscape, Cloud Data & Analytics Market Landscape and U.S. Federal Cloud Ecosystem Report for tailored competitive analysis, go-to-market strategy and executive briefings. The tool enables you to curate relevant quantitative insights by company, business unit and/or market segment, creating a report specific to your needs and ensuring consistent frameworks across projects.
 
Click here to explore Insight Center’s data visualization tool, or start your free trial today to access this one-of-a-kind digital-first intelligence platform.
 
If you believe you have access to the full research via your employer’s enterprise license or would like to learn how to access the full research, click here.
 

Supply Chain Threatens the Rise of AI PC in 2026

Massive-scale AI infrastructure deployments are driving skyrocketing PC prices

Memory manufacturers continue to shift their capex investments toward expanding high-bandwidth memory (HBM) production capacity in support of rampant AI server demand. At the same time, demand for more commoditized dynamic random access memory (DRAM), such as DDR5 and LPDDR5X, is also growing but at a slower rate relative to HBM demand. The combination is driving a supply-and-demand imbalance in the DRAM market, leading to higher prices.
 
In the TBR Insights Live session below, Principal Analyst Angela Lambert and Senior Analyst Ben Carbonneau share insights into how rising memory prices and Windows PC ecosystem investments will impact PC refresh and the adoption of AI PCs in 2026 and beyond.
 

 
This TBR Insights Live session is available on demand on our YouTube channel. Visit this link to download the presentation’s slide deck.
 
If you’d like to further explore the data mentioned in this TBR Insights Live session, sign up for a free trial of TBR Insight Center™ today.
 
TBR Insights Live sessions are typically held on Thursdays at 1 p.m. ET and include a 15-minute Q&A session following the main presentation. Previous sessions can be viewed anytime on TBR’s Webinar Portal.

Salesforce Highlights Strengths in Innovation and Relationships at Agentforce World Tour

On Feb. 25, 2026, TBR attended Salesforce’s Agentforce World Tour event in Sydney along with 10,000 Salesforce professionals, clients, alliance partners and analysts. The following reflects TBR’s observations and discussions during the event as well as our ongoing assessment of Salesforce and its ecosystem partners. TBR’s Salesforce analysis can be found in its quarterly vendor reports, the Cloud and Software Applications Benchmark, and the Adobe and Salesforce Ecosystem Report.

‘Everyone is looking to agency [agentic AI] to drive their companies forward’

During a panel discussion with Salesforce clients, Australian business leaders and Salesforce executives discussed best practices for enterprisewide agentic AI adoption and for scaling pilots. Panelists mentioned common ideas such as ensuring clear ownership of projects and agents, defining desired outcomes at the start of any engagement, and co-locating technology teams with business teams (this blog and others from TBR dive into best practices for IT services companies, consultancies, technology vendors and enterprises with respect to agentic AI adoption).
 
One Salesforce leader noted that clients have expressed frustration that AI has simply allowed them to write better emails. Salesforce, he added, is working to show ROI at scale and “get more production value out of these products.” In TBR’s view, the AI adoption sentiments expressed in keynotes, panel discussions and show-floor conversations with Agentforce attendees reflect common themes: well-understood best practices, concerns and fears about enterprisewide adoption, and confidence that 2026 will deliver clear, measurable and significant ROI from agentic AI investments. This last point may reflect the setting and vibe of the event, although many of the specific use cases described by Salesforce professionals and Australian clients reinforced that overall sense of confidence.

‘As a leader, if you think AI is going to replace people, you have more problems than [adopting] AI’

At another point during the panel discussion, the CEO of an Australian student accommodation business described the leadership challenges inherent in adopting AI at scale, both within her company and in her experience speaking with fellow CEOs in Australia. She commented that the most significant hurdles were rooted in business processes and people, not in the technology, and that leaders who failed to consider enterprise resilience from a business perspective would likely fail to gain significant benefits from adopting AI-enabled solutions.
 
This CEO’s comments echoed the sentiment expressed by Sanjna Parulekar, SVP of product marketing at Salesforce, who said “context is king” (in adopting agentic AI) and that companies should focus on business workflows, particularly as large language models increasingly operate across them. Parulekar also noted that understanding AI-driven change management, including changes in roles and responsibilities, could help companies break down silos and more rapidly transform their business models.
 
In TBR’s view, the quote in this section’s subhead perfectly captures this CEO’s dilemma at present: AI promises a productivity boost when bots replace people, but successful adoption at scale seems to require more people with different skills. Digital full-time employees (FTEs) are not yet cheaper than human FTEs, but slow-rolling adoption seems untenable. What to do? For Salesforce, and the company’s consulting partners in attendance at Agentforce, the answers are clear: more software, more platforms and more AI, all aimed at solving business problems, not just adding technology for technology’s sake.

Salesforce in the public sector

In a special breakout session, Salesforce’s local and global public sector leaders made three critical points about the company’s overall public sector strategy and recent performance:

  1. Licensing and permitting have been taking off as a use case, frequently tied to efforts to accelerate economic development.
  2. Governments across all levels have been looking for consolidation, from point solutions to a platform, especially in the U.S.
  3. As part of Salesforce’s public sector push in the U.S., the company has been providing partner-like training to employees at government agencies, disrupting traditional systems integrators. Notably, according to Salesforce, U.S. federal government agencies are increasingly looking to Salesforce to be the prime contractor on technology-centric engagements.

 
With the recent hype around the “death of SaaS” and other pressures on the business models of technology companies, Salesforce’s growing presence, success, and apparent disruption of competitors and alliance partners alike underscore Salesforce’s strengths in creating stickier client relationships and continually innovating, two qualities essential in the agentic AI age.
 
TBR’s overall takeaway from a day with Salesforce in Australia: Software is not dead. SaaS is not dead. Different wrappers, innovative use cases and deeply embedded relationships, both personal and technological, underscore Salesforce’s strength.

Telecom Edge Compute Market Forecast

TBR Spotlight Reports represent an excerpt of TBR’s full subscription research. Full reports and the complete data sets that underpin benchmarks, market forecasts and ecosystem reports are available as part of TBR’s subscription service. Click here to receive all new Spotlight Reports in your inbox.

Telco, cableco and hyperscaler spend on edge compute infrastructure will grow at a TBR-projected CAGR of 13.4% from 2024 to 2029 to reach $52.5B

TBR estimates telecom edge compute infrastructure investment will reach $52.5 billion in 2029, driven primarily by telcos’ network transformation (especially vRAN deployments) and by telco and hyperscaler deployments aimed at extracting economic value from AI and other distributed computing use cases.
 
The edge computing market is developing more slowly than originally expected due to several factors, particularly the lack of proven revenue-generating use cases. Future ROI on edge compute investments is uncertain as opportunities such as monetizing AI inferencing at the edge remain unproven. vRAN, the primary telco use case for edge compute, provides some cost efficiencies but offers limited net-new revenue-generation opportunities.
 
Hyperscaler spend growth will more than double the combined telco and cableco growth during the forecast period, as this cohort will pivot spend from central to edge build-outs to achieve the latency and quality of service that new network use cases will require, as well as to handle AI inferencing workloads.
 

Hyperscalers have deprioritized edge cloud build-outs as they double down on AI training, which drives central data center investment; AI inferencing will leverage edge computing

Though hyperscalers are still increasing investment in edge data centers, their top priority is building out central data centers to train and support their AI models, as central data centers are best suited to provide the substantial power and space that GPU servers require for AI model training; data storage also requires significant space. Further, hyperscalers require more data center capacity to support their cloud services businesses and new use cases enabled by technologies such as AI.
 
To mitigate some of the economic and technical challenges associated with building out edge computing infrastructure at scale, hyperscalers intend to leverage key technological innovations in central cloud. For example, hyperscalers are conducting R&D and investing in Arm-based chips (which are more energy-efficient than x86 chips, the predominant chip used in data centers today) as well as in new cooling technologies such as liquid immersion. Hyperscalers are also focused on developing new renewable energy resources and exploring promising technologies such as SMR (small modular reactors) and geothermal to provide a steady stream of energy to power their data centers. TBR expects hyperscalers will leverage most of these innovations in central data centers before incorporating them into edge data centers.
 

If you believe you have access to the full research via your employer’s enterprise license or would like to learn how to access the full research, click the Access Research button.

Access Research

Hyperscalers have been focused on AI training in central data centers, but emphasis will shift to building out edge infrastructure to handle AI inferencing

Hyperscalers are building out their distributed computing and intelligent connectivity infrastructure platforms to capitalize on key use cases such as AR/VR and, more broadly, the digital ecosystem transcendence across key aspects of people’s lives (see TBR’s 2H25 Hyperscaler Digital Ecosystem Market Landscape for more information). Hyperscalers’ end goal is to enable ambient computing. TBR estimates trillions of dollars in economic value will be created by the intersection of distributed computing and intelligent connectivity through this decade, and we believe hyperscalers are positioning to capture an outsized portion of this market opportunity.
 

 
TBR believes hyperscaler distributed computing platforms, which encompass central and edge data centers as well as on-device computing, will be leveraged to run the intelligence layer of their respective global networks. For example, cloud-native, virtualized network functions, such as the mobile core, will reside in this distributed computing platform, supporting the transport and access layers of the network.

vRAN supports edge compute spend by CSPs such as Verizon and, until recently, DISH; Telus is the leading CSP in Canada in vRAN investment

Canada-based operators are taking a wait-and-see approach before making significant investments in edge infrastructure. These operators have, however, begun to invest in vRAN — and, more broadly, network transformation via virtualization and cloudification — with Telus making a strong commitment to the technology through an agreement with Samsung reached in February 2024, and Canada-based operators taking part in sovereign cloud efforts via offerings such as Bell AI Fabric.
 
AT&T, Verizon, Amazon, Microsoft and Google will spend the most on edge infrastructure in North America through the forecast period. 

vRAN is the largest edge compute use case for telcos, and Verizon is among the leaders in this space, having announced it had deployed over 22,900 vRAN sites across the country as of early 2025. Samsung and Ericsson are rolling out vRAN solutions in Verizon’s C-Band-based 5G network.
 
Other U.S.-based companies, especially Apple, Meta Platforms, Comcast, Lumen and T-Mobile, are also investing in edge infrastructure but at a lower scale compared to the five aforementioned companies.
 
DISH has extensively deployed vRAN, though the company is dismantling its mobile access network.
 
Comcast has spent the most among cablecos on edge infrastructure to date as the operator pushes forward with its network transformation. Other cablecos, such as Charter, Cox, Altice and Liberty Global, are following in Comcast’s footsteps. However, in general, cablecos will lag hyperscalers and telcos in the edge compute opportunity. As of October, Comcast had rolled out over 50,000 edge compute servers and 1,300 physical vCMTS points of deployment.

vRAN deployments largely drive local CSP edge compute spend; hyperscaler investment is materializing more slowly than previously anticipated due to a shift in their priorities

vRAN underpins local edge compute spend volume, with support to date provided by rollout initiatives at key operators such as China’s CSPs, Rakuten, Verizon and Japan’s Tier 1 CSPs. The proliferation of vRAN across more CSPs will drive growth during the forecast period.
 

 
Tower sites are especially pertinent to edge computing due to their proximity to population areas, availability of power, backhaul and last-mile access, and physical security.
 
The master lease agreements that hyperscalers were expected to sign with strategic shared infrastructure owner (SIO) partners, which would have located hyperscalers’ edge stacks at the base of cell sites, have not materialized at scale for towercos, as hyperscalers pursue other avenues, such as satellites.
 
SIOs (such as tower companies, data center colocation providers and neutral host providers) are becoming increasingly important aggregation points for AI-related traffic and connectivity demand. Their facilities often serve as hubs where hyperscalers, neoclouds, enterprises and AI service providers interconnect, driving demand for high-capacity fiber, wavelength services, dark fiber and low-latency metro and long-haul transport. As AI workloads scale and become more distributed, SIOs are influencing where network investment is required and where new interconnection-rich locations emerge within metropolitan and regional markets.
 
Telcos have largely ceded this opportunity to SIOs because they have been divesting their tower and land assets to generate cash for network equipment, pay down debt, buy more spectrum and fund capex. As such, telcos will increasingly rely on SIOs for locating a portion of their edge sites.

U.S. Wireless Market Outlook for 2026

M&A, pricing pressures and leadership shifts reshape the competitive landscape

Acquisitions, expansion of mobile broadband bundling strategies and continued network investment remain priorities for U.S. wireless operators as they navigate an increasingly competitive market.
 
Competitive pricing — particularly from cable MVNOs — is intensifying, while Verizon is revamping its go-to-market strategy to reaccelerate subscriber growth. At the same time, differentiated connectivity services, including direct-to-device (D2D) satellite offerings and emerging network slicing capabilities, are becoming key competitive levers. Additionally, leadership changes at Verizon and T-Mobile are expected to accelerate cost-cutting efforts, AI implementation and operational restructuring initiatives.
 
In the TBR Insights Live session below, TBR Senior Analyst Steve Vachon shares key insights into the evolving U.S. wireless market and what market shifts mean for U.S. operators in 2026 and beyond.
 

Watch and learn:

  • How M&A activity and convergence strategies are reshaping the competitive landscape of the U.S. wireless market
  • How pricing strategies and service offerings are shifting in the U.S., and the impact on subscriber growth for U.S. operators
  • How recent CEO appointments and leadership changes will impact capital allocation, cost efficiencies and AI initiatives for U.S. operators
  • How these trends will impact wireless revenue growth, profitability and capex in 2026

This TBR Insights Live session is available on demand on our YouTube channel. Visit this link to download the presentation’s slide deck.
 
If you’d like to further explore the data mentioned in this TBR Insights Live session, sign up for a free trial of TBR Insight Center™ today.
 
TBR Insights Live sessions are typically held on Thursdays at 1 p.m. ET and include a 15-minute Q&A session following the main presentation. Previous sessions can be viewed anytime on TBR’s Webinar Portal.

KPMG Collaborates with Microsoft to Develop Governed Agent Operating Model at Scale

KPMG, Microsoft and the next phase of partner-led AI transformation

KPMG is repositioning itself from a Microsoft Dynamics-centric systems integrator to a broader AI-led transformation partner with extensive experience across Microsoft technologies. As KPMG’s Microsoft alliance moves toward an AI-era operating model measured by sustained adoption and governed outcomes, the strategic question for partners and competitors is whether KPMG’s governance-led, platform-enabled approach becomes a repeatable bet in regulated, board-visible programs and, if so, how quickly peers can counter with comparable operational frameworks and field-ready narratives.
 
Recently, TBR had a chance to hear directly from KPMG’s Microsoft alliance leaders, including Cherie Gartner, Global Lead Partner for Microsoft, KPMG LLP; Marco Amoedo, Global Chief Technology Officer, Microsoft, KPMG International; and Sven Rohl, Global Microsoft AI Business Solutions Lead, KPMG International, about how KPMG’s 20-plus-year alliance with Microsoft has evolved into a 360-degree relationship framed by four interrelated fields of play: Reimagining the Enterprise, Platforms with Purpose, Modernization at Scale, and Secure by Design Enterprise. The following analysis reflects on this discussion and TBR’s ongoing research on KPMG and the Big Four, including our semiannual Management Consulting Benchmark and ecosystem intelligence reports.

Trust beyond scale

Describing an alliance as a global “360-degree relationship” is familiar language in alliance management, but KPMG uses the phrase differently: not merely as a relationship descriptor but as a structure that enables shared accountability. Building on a decade-plus collaboration around Microsoft Dynamics applications, KPMG has now made Azure consumption the center of gravity; executives described the metric as “the underpinning layer.” By aligning its internal success criteria with how Microsoft measures platform expansion, KPMG is implicitly shifting what it asks clients to do: not just approve a program and go live, but adopt, consume, expand and operate.
 
In the AI era, that distinction can become decisive as all parties look to move pilots into production. With KPMG firms managing a pool of over 40,000 Microsoft-trained consultants, including over 6,800 Dynamics experts, and holding more than 14,000 Microsoft certifications, the KPMG global organization reinforced the scale of, and commitment to, the relationship by reemphasizing a multiyear, deliberate diversification push to grow Azure-led work. Certifications and specializations are not only a proof point but also a metric within KPMG’s multibillion-dollar investment with Microsoft, where Azure consumption remains a KPI for measuring success.
 
We see this as a subtle but important narrative blend where KPMG is not disowning the legacy but rather using it as proof of proximity to business systems while moving the growth story upstream into cloud, data, security and AI. We believe if KPMG is successful with its messaging, the result will be a partner story that is designed to meet the next procurement threshold: Show me you can run this safely and prove it is worth it.
 
Further, rather than organizing around discrete Microsoft products (e.g., Azure, Dynamics, M365), KPMG is also designing integrated offerings that map to the go-to-market motions of Microsoft’s solution areas: Cloud & AI platforms, AI business solutions, and Security. We view these offerings as an opportunity for KPMG to deploy its Powered Enterprise framework, which structures transformation discussions to begin with business priorities and end with one or more technology solutions, rather than leading with technology.
 
Although this approach is not unique to KPMG, it allows the firm to lean on its value proposition, something partners appreciate as they look to avoid coopetition. With knowledge management also testing the trust and alignment among partners, solution architects within KPMG and Microsoft are helping the companies build a robust foundation. Growth acceleration will come from ensuring field sellers within both organizations are equally able to tell each partner’s story as well as their own.
 

Platform-enabled integrated offerings bring the partners closer together

As the AI market moves from proof of concept to a portfolio of agents embedded into core workflows, buyers’ concerns are changing. They are now seeking vendors that are less focused on deploying solutions and more focused on helping them address concerns around governance and data residency control without disrupting the architecture, cost calculations or operating processes across a multivendor environment, and without turning the program into a fragile dependency chain. This is where most AI transformation narratives collapse.
 
Vendors typically over-index on capability demonstrations and under-invest in the operating model. That weakness is precisely where KPMG firms are investing. The firm’s broader operating model strategy — its efforts to standardize, consolidate and reduce fragmentation across its global organization of member firms — starts to matter even more in this context, especially as AI at scale punishes decentralization. KPMG’s direction is designed to make “One KPMG” operationally real, as the firm recognizes that this is the only way to make agentic delivery repeatable while it seeks to act more like a platform-led business.

The core bet: productizing transformation and productizing trust

KPMG’s alliance partnership with Microsoft is built around two assets that function as the firm’s packaging layer for the AI era: KPMG Velocity and KPMG Workbench. KPMG Velocity provides AI-enabled products and services through a platform ecosystem that integrates KPMG’s insights, methods, expertise, capabilities and data with advanced technology. KPMG Workbench is the firm’s global AI platform, designed to scale global adoption and integration of AI and underpin KPMG client delivery solutions.
 
For KPMG, the key strategic point is not the existence of Velocity, as peers similarly package methods in platforms, but rather what Velocity is positioned to do: make Microsoft’s technology consumable in terms of measurable outcomes. The most important aspect of KPMG Workbench is not that it uses AI but rather the kinds of capabilities the platform offers that signal production readiness, especially those targeting skeptical buyers, as Workbench is designed to address key questions around trust and compliance, economics and sovereignty.
 
In summary, if KPMG Velocity is the packaging layer that makes transformation repeatable, KPMG Workbench is the operating layer that makes transformation defensible. KPMG now has the opportunity to use these platforms consistently across member firms, both with Microsoft and with other key partners.

The alliance motion that matters: Azure-central, multivendor, and designed for reality

Another strategic message KPMG’s Microsoft alliance leaders discussed is that they do not expect buyers to lean on a single vendor, even if Microsoft is the anchor. They repeatedly emphasized a multiparty ecosystem go-to-market strategy with Microsoft alongside SAP, ServiceNow and others. The pattern is consistent with the emergence of the multiparty alliance construct that TBR has observed within the past 18 to 24 months and has discussed at length within our Ecosystem Intelligence research stream.
 
Overall, KPMG treats Azure as the compute and platform foundation but acknowledges that the enterprise estate is triangulated by design. This is an important positioning, especially as the market has shifted from platform selection to platform negotiation where large enterprises will not rip and replace existing solutions but will look for vendors that can orchestrate their tech stack. Any vendor that assumes it can win by forcing a monoculture could lose relevance.
 
KPMG’s approach is therefore less Microsoft-only than it is Azure-central. It is a strategy built for deal reality and positions the firm to create outcomes without demanding architectural purity. This strategy could strengthen KPMG’s reputation with procurement departments, especially in regulated environments, where buyers must balance modernization with risk management and existing vendor commitments.

Reconciling outcome-based consulting with consumption-based cloud

During the briefing, KPMG highlighted two dynamics in reconciling its aspiration to drive outcome-based pricing at scale, where every solution must deliver a measurable return, with Microsoft’s consumption-based economics, where Azure consumption is the core growth metric. The first is a defensive dynamic: clients increasingly expect AI-enabled efficiencies to translate into lower costs and fewer people, forcing KPMG to justify its pricing by clearly demonstrating the value of its platforms and IP. The second is an offensive dynamic: the firm invests heavily in managed services and Consulting as a Service models that bundle AI-driven solutions with ongoing delivery, aligning more naturally with Microsoft’s consumption-led view.
 
These two shifts are reflected in KPMG’s expansion of subscription-based offerings, such as KPMG Clara, KPMG Digital Gateway for Tax and KPMG Digital Gateway for Law, and sector-specific AI managed services (e.g., trade surveillance and fraud monitoring for banks) built on KPMG platforms and Azure. These efforts are supported by new go-to-market capabilities, such as partner and staff sales academies focused on solution and subscription selling and the introduction of dedicated technology sellers in some member firms — changes that mirror hyperscaler and ISV selling motions. For KPMG to succeed at scale, it will likely need to continue evolving culturally and operationally beyond its traditional alliance partnership model, a shift the firm has already begun to address.

Long-term structural changes can help KPMG stay relevant with Microsoft as a copilot

Over the next 12 to 24 months, we expect professional services vendors’ partner strategies and enterprise buying to reorganize around roles and operating standards. First, partner segmentation will harden. Enterprises will increasingly select multiple partners for distinct roles: scale operators to industrialize, governance lighthouses to de-risk production and specialists to provide domain depth. The “one partner does everything” narrative will weaken.
 
Second, consumption will be scrutinized through a value lens. Azure consumption will become less about quota and more about the quality of the investment. Workbench-style telemetry and metering are early indicators of where buyer expectations are heading: cost attribution and measurable value per agent, per workflow and per portfolio.
 
Third, commercial models will shift faster toward managed operations and subscription-like constructs. As value and usage become measurable and continually optimized, time-and-materials commercial constructs will become harder to defend as the default model.
 
Beyond 24 months, the most consequential shift is that “assurance-grade” operations may become a prerequisite for transformation itself. As agents operate inside financial processes, HR, security and supply chains, buyers will demand auditability and governance in ways that resemble assurance disciplines. The linkage of KPMG Workbench across advisory and audit-adjacent contexts is not incidental and hints at how KPMG believes it can compete.
 
At the same time, sovereignty-by-design becomes procurement leverage. Data residency routing and region-specific controls are not just technical features; they will become deal accelerants, particularly in Europe and in regulated industries. Finally, the source of lock-in shifts. The sticky asset will not be the implementation project but rather the operating standard: governance models, telemetry dashboards, value measurement frameworks and agent life cycle processes. Whoever defines those standards will own the long-term relationship, even if the underlying technology is theoretically interchangeable.
 
KPMG is positioning its Microsoft alliance for the next phase of enterprise AI adoption, in which production readiness, governance and demonstrable value are expected to differentiate winners from laggards. KPMG Velocity is framed as the mechanism for making transformation repeatable, while KPMG Workbench is positioned as the means to make transformation governable and auditable. Within this narrative, Azure consumption functions as the key performance indicator aligning KPMG’s incentives with Microsoft’s platform economics, and a multivendor triangulation strategy aligns KPMG’s go-to-market approach with enterprise buyer realities.
 
If KPMG can convert these elements into field-ready plays and consistently repeatable client outcomes, it may emerge as a uniquely valuable Microsoft partner for regulated, board-visible programs, shifting competitive pressure onto peers to differentiate through operational trust rather than delivery scale. Overall, KPMG appears to be attempting to move the competitive battleground from implementation speed to governed operations. If the market evolves in this direction, partners and competitors will need to clarify their roles and demonstrate credibility in an environment where agentic transformation is treated less as a discrete project and more as a continuously governed system.

What 2026 Holds for Consulting & IT Services

From AI hype to revenue reality

Disruption from AI adoption, shifting commercial models and macroeconomic pressure will shape the overall market for professional services, including consulting, systems integration and managed services, in 2026. TBR clients have been asking: How fast and how significantly will profitable revenue come from AI-related services and adjacent professional services? How will commercial models change? How much longer will time-and-materials and pyramid staffing models last? And, based on our decades of data on these markets, what is the timing expectation for a return to double-digit growth?
 
TBR’s focus has always been on individual companies. Understanding, and even predicting, market trends is only the starting point. Our analysis goes deeper into what trends, disruptions and opportunities mean for the 50-plus companies in TBR’s Professional Services coverage. Predictions help set the framework for thinking about the future, while company-specific analysis provides the context for every player in the consulting, IT services and technology ecosystem.
 
In the TBR Insights Live session below, Principal Analyst & Practice Manager Patrick Heffernan and TBR’s Professional Services team discuss which macro trends will shape demand for consulting and IT services over the next five years, how companies have positioned themselves for accelerated AI adoption, and which companies will outpace peers and why.
 

This TBR Insights Live session is available on demand on our YouTube channel. Visit this link to download the presentation’s slide deck.
 
If you’d like to further explore the data mentioned in this TBR Insights Live session, sign up for a free trial of TBR Insight Center™ today.
 
TBR Insights Live sessions are typically held on Thursdays at 1 p.m. ET and include a 15-minute Q&A session following the main presentation. Previous sessions can be viewed anytime on TBR’s Webinar Portal.