With Broadcom at the Helm, Profitability Will Be at the Center of VMware’s Next Chapter

On Nov. 22, 2023, Broadcom officially closed its acquisition of VMware, concluding an 18-month saga that called on the company to navigate several regulatory roadblocks. While these hurdles may have delayed the deal’s closing, TBR suspects most industry watchers have anticipated this outcome for quite some time.

VMware Acquisition Approved by Global Regulators

Global regulators’ early concerns about anticompetitive effects overlooked the strategic importance to Broadcom of keeping VMware’s platforms accessible across all hardware options, a dependence that makes it unlikely Broadcom will restrict these platforms to its own hardware.
 
Chinese regulators were certainly a tail risk given recent geopolitically motivated actions against other U.S.-oriented M&A, yet they ultimately approved the deal, too, perhaps due to Broadcom’s historical ties to the country and the software-centric focus of the acquisition.
 
Now, with the deal done, VMware’s next chapter has begun. It has been a long road for the company, yet much has remained the same. Although VMware is pushing into new cloud-native platforms, the company’s virtualization platform is still its bread and butter, and much of VMware’s total revenue is tied to this business; the share is likely even larger on an operating profit basis. As Broadcom takes the reins, VMware’s strategy will revolve around maximizing the value of these profit centers, likely to the detriment of emerging businesses.
 



 

Broadcom Is in Charge and Will Be Guided by Profitability

Broadcom has stated that profitability through cost cutting is the top priority, communicating to investors a goal of reaching $8.5 billion in adjusted EBITDA within the next three years, compared to $3.2 billion in GAAP EBITDA for the last 12 months ended CY2Q23. While far from a perfect comparison, the targeted uplift is clearly sizable and will rely heavily on reducing costs.
 
TBR expects general & administrative costs to see the greatest relative decline as Broadcom executes its synergy plan, which will involve slashing redundant headcount in administrative roles. TBR expects Broadcom to be particularly successful in this area, as leadership has extensive experience folding acquired businesses into existing functions in departments like legal, finance and human resources. This skill will be put to work quickly, likely resulting in multiple rounds of layoffs across these departments.
 
Sales & marketing teams are expected to see impacts as well as Broadcom makes use of its existing sales teams and channel distribution partners to sell into existing strategic accounts.
 
Headcount reductions began just days after the deal closed. The total impact of the layoffs so far is unclear, yet there are reports that reductions have affected software development and cloud engineering roles as well as administrative roles. While VMware’s R&D budget will undoubtedly shrink, it is unknown by how much. The fact that R&D-related headcount is being cut early does not paint a favorable picture of Broadcom’s commitment to innovation, yet TBR’s estimates indicate that drastic cuts may not be necessary. This aligns with commentary from Broadcom management, which has promised to maintain VMware’s previous development strategy. Still, TBR remains skeptical about future R&D efforts.

Profitability Goals May Negatively Impact License Products and Emerging Solutions Over the Long Term

Along with many industry watchers, TBR has been concerned about Broadcom’s intention to invest in innovation since the initial announcement of the VMware acquisition, given Broadcom’s history with CA Technologies and Symantec. In both instances, the company slashed funding for support and R&D after the acquisition, opting to extract free cash flow from their sticky install bases instead of pursuing organic growth. VMware offers a similar opportunity.
 
Cost concerns are prompting many enterprise customers to preserve past investments, including their virtualization platforms. Moreover, since VMware has built highly integrated solutions with all the Tier 1 hyperscalers, enterprises are better equipped to migrate their virtualization platforms to the cloud, where they are able to set up broader cloud migrations without fully committing to the transition to cloud native.
 
This means VMware commands a large, sticky install base, which would be ideal for Broadcom’s previous strategy. Recognizing this, many partners and customers are rightfully worried about the outcome of this deal, expecting higher licensing prices and diminishing support.

Profit Centers Will See Little Impact from Broadcom Ownership

In addition to promoting margin expansion, raising license prices will encourage more customers to transition to subscription offerings, which highlights an important consideration within this business transformation. While Broadcom will deprioritize certain segments, large portions of VMware will be deemed strategic by Broadcom and will continue to see the same level of investment.
 
For instance, many customers and partners collaborating around cloud-based virtualization platforms like VMware Cloud will see minimal differences because of the change in ownership. For the last 12 months ended CY2Q23, over 34% of VMware’s revenue was generated in the Subscription & SaaS segment, and TBR suspects Broadcom will prioritize many of the offerings within this segment.
 
In May, Broadcom CEO Hock Tan pledged to invest an incremental $2 billion per year, with half slated for R&D to support the Cross-Cloud portfolio. Considering that an incremental $1 billion investment would increase R&D spend by around 30% over CY2022 levels, Broadcom’s ownership may actually benefit large swaths of VMware’s Cross-Cloud portfolio by adding resources and accelerating development timelines.

Long-term, Profitability Will Be King

TBR is skeptical about how far into the future Broadcom’s commitment will extend, and it is not clear how Broadcom’s investment will be spread across VMware’s different offerings. Many solutions within the Cross-Cloud portfolio are still underdeveloped and represent opportunities for VMware to achieve sustainable long-term top-line performance.
 
Tanzu is a prime example. The container management platform sits at the heart of the company’s multicloud strategy, which VMware has pushed heavily over the past 18 months, yet TBR suspects Tanzu contributes only a small percentage of total revenue and certainly cannot be considered a profit center.
 
If Broadcom is to achieve its stated profitability goals, VMware will need to scale this offering rapidly. If it does not, TBR expects there will be a limit to Broadcom’s patience and a spinoff may be in the cards over the long term. To TBR, the $2 billion commitment indicates a willingness to only support these emerging businesses over the short term.

Conclusion

Regardless of how much Broadcom messages around maintaining VMware’s current investment strategies, it is very difficult to reconcile that marketing with the company’s stated profitability goals. Thus, TBR suspects large changes have already begun to arrive for the virtualization leader.
 
The most immediate impacts will be the significant layoffs that have reportedly removed redundant administrative headcount, along with likely price increases on license products. While there is good reason to expect that many of VMware’s emerging products will be supported over the next couple years, the long-term view is much more opaque.
 
TBR will be watching for signs of traction and strong execution around many of the emerging solutions included in the Cross-Cloud portfolio, but if they fail to materialize, TBR expects Broadcom’s management to make decisions that benefit profitability.

 

Microsoft Expands PaaS Portfolio on Path to AI Incumbency

A platform company at its core, Microsoft is less concerned with migrating monolithic applications and instead is focused on building a complete data integration and management layer to capture value-add workloads that tie into said applications, all while maximizing clients’ underlying Azure infrastructure usage. To replicate this approach for the AI era, Microsoft has spent years integrating its various data services, from Synapse to Power BI, to automate customers’ entire data pipelines and prepare them for AI adoption. The result is Microsoft Fabric, a new end-to-end SaaS-like data platform that could help Microsoft reach new audiences and spur Azure growth in the continued race for cloud and AI dominance.  

Microsoft Is Investing in Data Cloud to Support Its GenAI Strategy

What Is Microsoft Fabric?

Simply put, Microsoft Fabric is a unified data platform comprising seven core Azure data services: Data Factory, Synapse Data Engineering, Synapse Data Science, Synapse Data Warehouse, Synapse Real-Time Analytics, Power BI and Data Activator. While Microsoft Fabric makes it easier to connect different personas within an organization, from data engineers to business analysts, the hallmark of the new service is its simplified pricing model, which charges customers for a shared pool of capacity consumed across the platform rather than metering the compute and storage of each individual Azure data service separately.
 
When we interview enterprise buyers, we continue to find that consolidating point solutions in favor of complete, integrated platforms is a common trend, and Fabric is bound to resonate with customers trying to control runaway cloud costs in a still widely uncertain economy.
 
The other key defining attribute of Microsoft Fabric is OneLake, the underlying architecture it is built on. Microsoft Fabric sits on a single repository that allows customers to query data not just in relational databases but also in object storage, as is customary in the data lake architecture.
 
With OneLake, we see Microsoft moving squarely into the data lake space. Given the symbiotic relationship between data lakes, which are designed for unstructured data, and generative AI (GenAI), OneLake is Microsoft’s under-the-hood way of ensuring that customers can easily load data from multiple sources, put it through the Fabric platform for data management and visualization, and build GenAI applications.
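To make this concrete, below is a minimal sketch of what querying OneLake data from a Fabric Spark notebook might look like, assuming the notebook environment supplies authentication; the workspace, lakehouse and table names are invented for illustration.

```python
# Hypothetical example: reading a Delta table stored in OneLake from a
# Fabric Spark notebook. The OneLake URI below is illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# OneLake exposes lakehouse data through an ADLS-style abfss:// path.
events = spark.read.format("delta").load(
    "abfss://SalesWorkspace@onelake.dfs.fabric.microsoft.com/"
    "SalesLakehouse.Lakehouse/Tables/events"
)

# Structured and semistructured sources landed in the lake can then be
# queried together with ordinary Spark SQL before feeding GenAI pipelines.
events.createOrReplaceTempView("events")
spark.sql("SELECT region, COUNT(*) AS n FROM events GROUP BY region").show()
```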
 
Altogether, the unification of Microsoft OneLake and Fabric is the right step for Microsoft and exemplifies how far the company has been willing to go to execute its AI-based growth strategy.
 



 

Fabric Will Help Microsoft Change the PaaS Landscape but Not Without Infringing on Partners

As highlighted in TBR’s 3Q23 Cloud Data Services Market Landscape, Amazon Web Services (AWS) is the clear leader in the cloud data warehouse market, with Microsoft falling squarely in second place and not significantly ahead of Google Cloud and Snowflake. Azure Synapse has not gained the same level of interest and traction in the market as AWS and Google Cloud’s BigQuery. As a result, Microsoft partnered with Databricks in 2017, developing and delivering the first-party Azure Databricks service.
 
Partnering with Databricks to ensure customers have an effective data analytics platform natively available on Azure rather than Synapse was a strategic move. With Fabric, however, we now see Microsoft essentially re-delivering Synapse as part of a more complete product that gets to the heart of what customers want: an end-to-end set of capabilities that automate entire data pipelines from data collection and ingestion up to analytics and visualization.
 
This approach should bring Synapse into more client conversations while helping Microsoft expand its reach outside the analytics department. This, of course, raises the question: What becomes of Microsoft’s partnership with Databricks? As part of OneLake, the architecture underpinning Fabric, Microsoft is leveraging Delta Lake — Databricks’ protocol for storing data in an open table format — and this move could persuade Databricks customers to adopt Fabric.
 
Even so, Microsoft OneLake adopts the data lakehouse architecture pioneered by Databricks, and with Fabric’s feature-rich set of upper-stack capabilities, customers may be more inclined to go all in with Microsoft Fabric and its comprehensive pricing model, which would bring a new layer of competition to the Microsoft-Databricks relationship.
 
This trend is indicative of what we are seeing across the cloud landscape. The hyperscalers, even those perceived as more partner friendly, are expanding into new areas of the cloud stack, posing potential risks to their partners, especially as customers continue to indicate their interest in consolidating point solutions.
 
That said, coopetition is nothing new in the cloud landscape, and vendors are getting more adept at navigating competitive differences to deliver outcome-specific solutions to their joint customers.
 
Perhaps the best example is the relationship between AWS and Snowflake, which are both spending millions of dollars to move legacy data warehouse customers to Snowflake’s platform on AWS. While AWS would naturally prefer customers adopt its own data warehouse service — Redshift — over Snowflake, AWS has accepted the trade-off of forfeiting some Redshift business to Snowflake as long as those customers are running on AWS infrastructure.
 
Microsoft Fabric is much broader than the data warehouse, but if AWS and Snowflake are a barometer of a successful partnership, Microsoft and Databricks will similarly learn to overcome these obstacles.
 
With Fabric, we expect Microsoft will slowly chip away at AWS’ share and potentially Snowflake’s and Databricks’ in the coming years. However, it is important to note we do not see Fabric as any kind of direct threat to pure play data cloud platforms, particularly Snowflake, which has the established presence and reputation in the data warehouse space specifically, not to mention easy inroads into AWS’ customer base.
 
In our talks with enterprise buyers, we often find customers value Snowflake because it allows them to run separate workloads as part of a shared data layer that is not tied to any specific cloud infrastructure. Despite the multicloud capabilities in OneLake, nothing changes the fact that the core data warehousing capabilities within Synapse are still built specifically on Azure infrastructure to integrate seamlessly with other Azure services.
 
We have no doubt Fabric will be attractive to Microsoft-centric shops, but attracting customers invested with other cloud providers may be a more difficult feat, solidifying Snowflake’s and Databricks’ unique value propositions.

Data Lakes and GenAI Go Hand in Hand, and Microsoft Wants to Be the First Hyperscaler Strongly Associated with the Architecture

One other interesting consideration with Fabric is Microsoft’s choice of open table format. Considering its partnership with Databricks, Microsoft has opted for Delta Lake, although it plans to add external support for two other popular frameworks: Apache Iceberg and Hudi.
 
In general, for customers that want to build a data lake, Delta Lake is the preferred format while Apache Iceberg is more aligned with data warehouses. Defaulting to Delta Lake reflects Microsoft’s intent to remain relevant with Databricks customers, while allowing customers to query data on object storage (Amazon S3 and eventually Google Cloud Storage) reflects Microsoft’s commitment to the data lake architecture.
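To ground the point about table formats, the sketch below uses the open-source deltalake Python package (the delta-rs bindings, not a Microsoft API), with an invented path and schema. The takeaway is that a Delta table is just Parquet files plus a transaction log, so any engine that speaks the protocol, whether Fabric, Databricks or open-source Spark, can read the same data.

```python
# Hedged illustration of the Delta Lake open table format using delta-rs
# (pip install deltalake pandas); the path and columns are invented.
import pandas as pd
from deltalake import DeltaTable, write_deltalake

# First commit creates the table: Parquet data files plus a _delta_log/.
write_deltalake("/tmp/lake/events",
                pd.DataFrame({"id": [1, 2], "kind": ["doc", "image"]}))

# A second commit appends; the log tracks both versions for time travel.
write_deltalake("/tmp/lake/events",
                pd.DataFrame({"id": [3], "kind": ["video"]}),
                mode="append")

table = DeltaTable("/tmp/lake/events")
print(table.version())    # 1 -> two commits (0 and 1) recorded in the log
print(table.to_pandas())  # unified view across both writes
```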
 
Due to data lakes’ ability to combine both structured and unstructured data for prescriptive analytics use cases, they are becoming increasingly popular and, in some scenarios, offer customers a way to bypass data warehouse operations altogether. GenAI, which relies on unstructured data sources, such as documents or images, will fuel customers’ desire to consolidate data warehouses into data lakes, leading us to believe that Databricks is in a strong position despite Microsoft’s Fabric announcement.
 
This is also one of the reasons why Snowflake is trying to add more features that support unstructured and semistructured data in hopes of changing its perception in the market from a data warehouse company to a data lake company.
 
The hyperscalers, however, have arguably been behind in their data lake services and messaging, and with OneLake, Microsoft wants to make sure it is the hyperscaler most strongly associated with data lakes and, by extension, GenAI.

GenAI Enablement Sits at the Heart of Microsoft’s PaaS Strategy

Considering Microsoft has arguably made the biggest splash in generative AI, the company’s latest PaaS developments come as no surprise. As TBR discussed in our 2Q23 Cloud Ecosystems Market Landscape, a large language model (LLM) is only as good as the data that goes into it, which means the ability to establish a centralized, single source of truth is very important for an enterprise pursuing a serious generative AI strategy.
 
OneLake’s ability to provide an enterprisewide repository and a no-code API to manage data will help the company address this need, and the GenAI tools embedded within Fabric will help accelerate the transition to unified data pipelines.
 
Three Copilot solutions, most still in preview today, are embedded within Fabric: Copilot for Data Science and Data Engineering, Copilot for Data Factory, and Copilot for Power BI. Broadly, the Copilot solutions in Microsoft Fabric enable code generation capable of automating routine tasks and expediting the transformation of raw data into the structured form LLMs hunger for.
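As a purely illustrative example of the kind of routine raw-to-structured transformation such assistants generate, the pandas sketch below flattens nested JSON records into a tidy table; the record layout is invented and is not a Fabric API.

```python
# Hypothetical raw events, e.g., as landed in a lake before processing.
import pandas as pd

raw_events = [
    {"id": 1, "user": {"name": "a.chen", "region": "EMEA"}, "amount": 120.0},
    {"id": 2, "user": {"name": "b.osei", "region": "APAC"}, "amount": 87.5},
]

# Flatten the nested fields into columns, the shape downstream analytics
# and LLM grounding pipelines expect.
df = pd.json_normalize(raw_events).rename(
    columns={"user.name": "user_name", "user.region": "region"}
)
print(df)
```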
 
The integrations built over the years between Microsoft’s platform assets and its application portfolios ensure there is plenty of raw data entering Fabric, which, as it becomes structured, presents an ideal environment for enterprises to pursue custom GenAI development. This is where the Azure OpenAI Service enters the conversation.
 
While the Copilot solutions offered by Microsoft provide quick-and-easy access to GenAI capabilities, true transformational value will be unlocked as enterprises build their own GenAI applications around their proprietary data and business processes, presenting a large opportunity for Microsoft.
 
The Azure OpenAI service has been enabling customers to train LLMs on their proprietary data since it became generally available in January, and, at Ignite 2023, Microsoft took another step forward with the public preview launch of Azure AI Studio. A new addition to the Azure OpenAI service, Azure AI Studio brings together developer tools like Azure AI SDK with the company’s growing catalog of foundation models to enable customers to build their own copilots and other generative AI applications.
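For readers unfamiliar with the service, the sketch below shows roughly what calling a deployed model through the Azure OpenAI service looks like with the openai Python SDK (v1+); the endpoint, deployment name and API version are placeholders rather than values from this report.

```python
# Hedged sketch: a chat completion against a hypothetical Azure OpenAI
# deployment. Credentials are read from environment variables.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="my-gpt-4-deployment",  # the Azure deployment name, not the model family
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": "Summarize last quarter's sales by region."},
    ],
)
print(response.choices[0].message.content)
```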
 
As more enterprises pursue custom GenAI development, the unified approach to data management offered by Microsoft Fabric and OneLake will become more valuable, drawing interest from enterprises with large Microsoft footprints, yet coopetition at the data layer will remain the standard.
 
Ultimately, Microsoft’s priority is ensuring all data can be easily fed into its foundation model service, so integrations that connect the Azure OpenAI Service with third-party data leaders like Snowflake and Databricks will prove to be popular alternatives to Microsoft’s end-to-end approach.

Microsoft Is Not Just after the Data Layer: The Race for Hybrid Cloud Control Plane Continues as Azure Arc Reaches 21,000 Customers

Throughout this report, we have touched on Microsoft’s pursuit of the data layer, but it is important to note that Microsoft’s PaaS capabilities are much broader and extend closer to the box. Owing to Windows Server, Microsoft has captured a significant portion of the enterprise OS layer, allowing the company to effectively move into the multicloud control plane, which Microsoft calls Azure Arc.
 
Best thought of as an abstraction layer that stitches together infrastructure assets for capabilities like monitoring, provisioning and observability, all while securing the OS instance, Azure Arc has amassed 21,000 customers in the span of four years.
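As a small illustration of what that control plane looks like to an administrator, the hedged Python sketch below lists Arc-enabled machines with Azure’s management SDK (azure-identity plus azure-mgmt-hybridcompute); the subscription ID and resource group are placeholders, and onboarding itself is performed by the Arc agent on each machine, which is not shown.

```python
# Hypothetical inventory of Arc-enabled servers projected into Azure
# Resource Manager, where they can be governed like native resources.
from azure.identity import DefaultAzureCredential
from azure.mgmt.hybridcompute import HybridComputeManagementClient

client = HybridComputeManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

# Each Machine resource represents a server running outside Azure that
# Arc makes visible for monitoring, provisioning and policy.
for machine in client.machines.list_by_resource_group("arc-demo-rg"):
    print(machine.name, machine.location)
```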
 
In recent quarters we have seen Microsoft become increasingly transparent in its customer reporting. For instance, Microsoft reported that Azure Arc’s customer count grew 150% year-to-year to 18,000 in 2Q23 and 140% year-to-year to 21,000 in 3Q23, implying a count of just 7,200 in 2Q22. Nearly tripling in five quarters indicates vast interest from Microsoft’s install base of customers trying to bridge the gap between the cloud and the legacy data center.
 
Another factor driving the platform’s success is Microsoft’s early support for both virtual machines (VMs) and Kubernetes. This approach contrasts with Google Cloud, whose primary goal is getting customers to move away from VMs and use containers. In other words, Google Cloud wants customers to use GKE (Google Kubernetes Engine) on premises to containerize VM-based workloads and keep them there, but it also wants customers to build net-new, cloud-native apps in containers.
 
Google Cloud did launch Anthos for VMs in 2021, which we viewed as a direct counterattack to Azure Arc, albeit not a very effective one: Anthos’ comparatively low customer count suggests the company has not been as adept at tapping into the VMware customer base and attracting enterprises that are not ready to migrate VMs.
 
We will continue to monitor Azure Arc’s growing customer count in the coming quarters, and it will be interesting to see whether Microsoft begins to leverage Fabric to support other managed data services outside Azure SQL via Arc, turning the hybrid platform into a more complete, centralized management layer.

IT Ecosystem Trust Paves the Way for GenAI-enabled Growth in 2024

2024 Predictions is a series of special reports examining market trends and business changes TBR’s analysts expect in the coming year. In the digital transformation edition, our team looks at expectations for GenAI’s impact, the rise of superpowers and the three things enabling new market growth.

Top 3 Predictions for Digital Transformation in 2024

  1. GenAI hype meets reality
  2. Ecosystems fuel disruption and lead to the rise of the superpowers
  3. Cyber, data and regulations — the three-legged stool enabling new digital transformation growth

 

Request Your Free Copy of 2024 Digital Transformation Predictions

Challenges and Opportunities in the Era of GenAI and Enterprise Digital Transformation

While cloud remains the backbone of buyers’ digital transformation (DT) programs, generative AI (GenAI) has thrown vendors and their technology partners into a frenzy, especially as enterprise buyers have started paying closer attention to their IT spend in response to macroeconomic headwinds.
 
This new dynamic creates a plethora of challenges and opportunities for technology and services vendors that guide and manage enterprise DT programs. From vendor consolidation to technology stack simplification, buyers continue to look for ways to optimize their digital assets, making it hard for vendors to introduce new technology without the appropriate use cases. Delivering value in a challenging market requires vendors to act more as strategic partners and collaborate rather than simply transact with enterprises.
 
GenAI is here to stay. There are certainly more unknowns than knowns today, despite everyone across the ecosystem trying to convince others they have found the silver bullet that will enable the creation of the next-gen enterprise business model. As with most new technologies, establishing the right frameworks as well as commercial and pricing models is a necessary first step before adoption can scale. Developing and deploying pricing mechanisms that incorporate pro bono and/or risk-sharing services and using templated offerings to standardize delivery can help vendors maintain their incumbent positions, especially as GenAI will level the skills playing field.
 
 
Expectations around differentiation are also changing, increasing the need for vendors to add specialization and often spurring them to expand their partner ecosystem. The advent of a new technology stack (e.g., next-gen GPU-run data centers that enable GenAI to reveal its full potential) will compel vendors to re-evaluate and expand their relationships with chip manufacturers — something many software and services vendors have not done for a while.
 
Additionally, the implications for cyber, data, regulations, ethics, and model governance will continue to dominate headlines and vendor-buyer conversations. And while vendors are in the business of making money, we believe the winning formula is to strike the right balance between constantly selling and consistently developing relationships with buyers and partners.
 
To read the entire 2024 Digital Transformation Predictions special report, request your free copy today!

GenAI: A Growth Catalyst for Cloud Evolution in 2024 and Beyond

2024 Predictions is a series of special reports examining market trends and business changes TBR’s analysts expect in the coming year. In the cloud edition, our team looks at expectations for the strategies that will determine growth leaders in cloud in 2024.

Top 3 Predictions for Cloud in 2024

  1. Simply providing cloud services at scale is no longer enough for vendors to gain cloud market share
  2. IaaS will become more tailored to workload and regulation
  3. SaaS vendors promote multiproduct sales with generative AI

 

Request Your Free Copy of 2024 Cloud Predictions

GenAI’s Rise Amid Cloud Challenges: Navigating 2024’s Landscape and Shaping the Future

For all the challenges that cloud vendors faced in 2023, there was a promising sprout of opportunity that developed quite rapidly with generative AI (GenAI) technologies. The pace with which GenAI gained not only awareness but also real investment and usage in the market was notable, and we expect end customers’ real investments in the solutions to continue to grow and develop in 2024.
 
However, GenAI solutions on their own will not overcome the headwinds that worked against the market throughout 2023. Many of the forces that caused revenue growth rates to slow precipitously for nearly every major cloud vendor remain in place heading into 2024.
 
 
The general macroeconomic conditions remain uncertain, wars continue to threaten global stability, IT buyers remain cautious about spending, and cloud has reached a saturation point in many IT organizations. So, while we do not expect GenAI technology to return the market and leading vendors to their pre-2023 pace of revenue expansion, it will serve as a small yet rapidly growing segment in 2024 and should become a significant market in 2025 and beyond.
 
We also expect the intensity of AI-focused strategies during 2024 to reflect the importance of the technology to long-term growth. AI could reset the cloud leaderboard for the next decade, so incumbents like Amazon Web Services (AWS) and Salesforce will be keen to protect their large customer bases against mounting AI competition from the likes of Google, Microsoft and SAP.
 
To read the entire 2024 Cloud Predictions special report, request your free copy today!

IT Services and Consulting in 2024: Traversing GenAI Pressures, Talent Challenges, and Regulatory Waves

2024 Predictions is a series of special reports examining market trends and business changes TBR’s analysts expect in the coming year. In the professional services edition, our team looks at expectations for the predictable uncertainties of 2024, including the impacts of GenAI hype and outcomes-based strategies for IT services vendors and consultancies.

Top 3 Predictions for Professional Services in 2024

  1. The 2023 focus on reskilling and training will pay off in accelerated revenues in 2024
  2. Generative AI will create a pivot to outcomes-based pricing
  3. Regulations will become a major pain point for all

 

Request Your Free Copy of 2024 Professional Services Predictions

Embracing Change, GenAI Hype and the Imperative of Outcome-Based Strategies for IT Services and Consultancies

As they say, nothing in life is certain except for death and taxes. And change. And data overload. And hype about technology and disruption. Predictions provide a perfect platform for big leaps and wild guesses, but at TBR, we are seeing more of the same for 2024, including taxes, data overload, and technology (read: generative AI [GenAI]) hype.
 
IT services and consulting stubbornly remain a people-centric business, despite advances in automation, analytics and AI, and the vendors most adept at attracting and retaining good people continue to outperform peers. Keeping good people when the hype around GenAI suggests that many task-oriented jobs will disappear requires that vendors offer training in new skills and develop new career paths.
 
Concurrent with these pressures on talent, GenAI will pressure contracts — with greater transparency comes greater opportunity to pay for exactly what you got. IT services vendors and consultancies that embrace outcome-based pricing models will increasingly find their clients, particularly those enamored with GenAI (although, who isn’t?), open to creative pricing and reluctant to continue business as usual once GenAI has pushed the client’s procurement office out the door.
 
 
Additionally, governments continue to lean into regulation to mitigate societal risks and to tame or unleash (depending on your political views) commercial activities. After the last three years of dealing with the pandemic, war, and the emergence of robot overlords (read: again, GenAI), we can reasonably expect governments will increasingly seek the security blanket of tighter regulations.
 
Add a splintering of global approaches to trade, finance and geopolitics, and companies face not just more regulations but also overlapping and potentially conflicting compliance obligations, varying wildly by jurisdiction. Death and taxes, indeed.
 
For IT services vendors and consultancies, 2024 looks a little boring. Reskill and train your people so you’ve got the right folks ready to deploy at scale to address your clients’ toughest problems. Let someone else handle the easy problems until they get replaced by GenAI. Start baking outcomes-based pricing into every engagement, underpinned by AI and analytics that demonstrate unquestionably what value you are bringing your clients. And lean hard into governance, risk and compliance (GRC); if you do not have those skills already, find a partner.
 
To read the entire 2024 Professional Services Predictions special report, request your free copy today!

Telecom Industry Retrenches in Response to Macroeconomic Pressures

2024 Predictions is a series of special reports examining market trends and business changes TBR’s analysts expect in the coming year. In the telecom edition, our team looks at expectations for communication service providers and their vendor partners as the macroeconomic and telecom industry-specific challenges of 2023 continue in 2024.

Top 3 Predictions for Telecom in 2024

  1. New round of M&A and bolder combinations are likely to be allowed by regulators
  2. Cash flow management becomes priority due to increase in cost of capital and other headwinds
  3. Open RAN will not be ready for mainstream adoption in 2024

 

Request Your Free Copy of 2024 Telecom Predictions

CSPs and Telecom-centric Vendors Will Have to Adjust to Headwinds in Their Industry and the Wider Economy

Macroeconomic and telecom industry-specific challenges that manifested in 2023 — for example, rising interest rates, inflation, lack of 5G ROI, technological complexity, and the end of key stimulus programs and various other economic support mechanisms instituted by governments during the COVID-19 pandemic — are expected to persist through 2024, prompting a response from communication service providers (CSPs) and their vendor partners.
 
The most impactful and pervasive issue confronting the telecom industry is the rising cost of capital, which has been increasing due to central bankers’ shift from quantitative easing (QE) to quantitative tightening (QT) in an attempt to tamp down inflation. The result thus far is companies are now paying on average two to three (or more) times the interest rates they had grown accustomed to since the Great Recession, when central banks began holding interest rates at close to zero. This relatively abrupt change in monetary and fiscal policy has created a concerning situation for entities that are heavily levered with debt, which encompasses nearly all CSPs and many telecom vendors.
 
CSPs with the weakest financial positions began changing their behavior in 2023, primarily in response to the rising cost of capital, evidenced by fiber build targets being scaled back, assets being revalued and written down, and overall capex budgets being reduced. Some CSPs have also had to layer on more onerous covenants, such as pledging assets as collateral, to secure new debt issuances and partially offset the rise in interest rates.
 
 
TBR expects many CSPs with relatively stronger financial positions to also change their behavior in 2024. Changed behavior typically occurs after a reassessment of capital structure and capital allocation, which can lead to a variety of outcomes ranging from dividend cuts to capex reductions to M&A events. Said differently, CSP CFOs worldwide will be under an unusual amount of pressure to meet their objectives in 2024, and they are highly likely to place greater emphasis on cost optimization and cash flow management.
 
TBR maintains its belief that the telecom industry will look very different by the end of this decade as current events and entrenched challenges push the industry through an evolution.
 
To read the entire 2024 Telecom Predictions special report, request your free copy today!

HCLTech Solves ‘Know Your Customer’ for European Bank

During HCLTech’s Financial Services Advisor and Analyst Day in New York City this past August, the vendor described an engagement with a European bank in which HCLTech provided a comprehensive Know Your Customer (KYC) solution. TBR requested further details and met with HCLTech leaders responsible for the solution and the vendor’s European Financial Services practice in September in London.

 
HCLTech has a long history of working with banks and has developed an appreciation for the associated challenges, technology environments, and regulatory and compliance demands, in addition to the full stack of ecosystem partners. Additionally, HCLTech understands financial institutions’ KYC risks and has applied the company’s own investment and IP to address clients’ concerns around data and processes.

 

Over the last couple of years, HCLTech created a comprehensive approach to KYC for a European bank, solving a number of the bank’s operational challenges, including collating siloed processes and gathering related and dependent data and analytics into a single stream. This allowed the CIO to see and understand the technology challenges and gave the chief compliance and risk officer greater confidence in controls and reporting.

 

Having engaged this and other financial services clients, HCLTech leaders described the company as the “glue” between IT and risk and compliance. Critically, HCLTech’s leaders said their professionals on the highlighted engagement spoke extensively with the people handling the day-to-day work of analyzing KYC cases.
 

According to Santosh Kumar, Vice President and Head of Financial Services Solutions, EMEA and APAC, no other IT services vendor has received permission from — or even pressed for permission from — bank CIOs to interview and work with the banks’ KYC analysts to design a technology solution that meets those analysts’ needs; Santosh stated these professionals are frequently considered necessary cost centers within a bank’s operations.

 

Diving further into the use case, Abhishek Mishra, Senior Solutions Director, Financial Crime and KYC, HCLTech, first detailed the pain points and trends HCLTech sees across the financial services space, including false positive rates, compliance costs, cloud migration and enhanced data analytics.

 

Against these conditions, according to Mishra, HCLTech positions itself as “the beacon of trust and innovation in the ever-evolving landscape of financial crime prevention” and a vendor capable of empowering “organizations worldwide to safeguard their integrity and financial stability.” Against that aspiration, HCLTech highlighted the company’s resilience and experience, enhanced by technologies, particularly smart automation.

 

Further discussing HCLTech’s decision to engage directly with the KYC analysts, Mishra described how he sees a range of KYC issues that are not always apparent at the CIO or chief compliance officer level, including fragmented IT systems and frequent manual interventions into processes that should be standardized and automated.

 

Using the platform codeveloped with the European bank client, HCLTech helped the bank reduce its operations team by 60% and lower the incidence rate of false positives by 30%. As Mishra noted, the industry standard incidence rate for false positives is around 90%, making any improvement a substantial savings in operations costs. Overall, the breadth and proven depth of HCLTech’s capabilities across the KYC and broader financial services space struck TBR as potentially significant differentiators as IT services vendors face increased pressures on their margins, talent management strategies and business models.

The “glue” between IT and risk and compliance

It is a bold claim, and plenty of consultancies, many IT services vendors and even some technology vendors would describe themselves as the essential connection between functional groups within an enterprise. HCLTech’s claim, in this particular case about KYC, holds greater weight based on the thorough — and fully operational — nature of its KYC platform.

 

In a wide-ranging discussion with TBR on this use case, HCLTech’s Mishra and Santosh did not shy away from answering several challenging questions, including why banking clients would trust HCLTech with as material a requirement as KYC as well as how HCLTech interacts with the regulators.

 

HCLTech executives showed a refreshingly honest assessment of the company’s place in the ecosystem, acknowledging that the Big Four firms continue to play an essential role in providing assurance to both clients and regulators that HCLTech’s KYC approach — and other banking process technologies — meet all criteria for reliability and compliance. Mishra and Santosh also readily acknowledged that the company’s role within the ecosystem depended on niche technology vendors, rejecting the idea that HCLTech was “end-to-end” while embracing the need to be a capable and easy-to-work-with ecosystem partner.

 

In addition to recognizing challenges within the ecosystem, HCLTech acknowledged that not every client would or could adopt the company’s KYC solution, given the complexities of banks’ legacy technology environments, ingrained cultural dispositions toward caution around all operational aspects governed by compliance obligations, and the myriad technology and IT services vendors that are scrambling for a chance to sell the next special tech-infused solution to a bank (hearing thunderous echoes here of generative AI).

 

Rather than pressing forward on a one-size-fits-all solution, HCLTech has created sandboxes for banks to experiment with a test solution, including the KYC platform, in safe — but realistic — environments. HCLTech’s well-established credibility among financial services clients unquestionably provides the company with entry to advise on new approaches to solving persistent problems.

Financial crimes will not disappear, but HCLTech could ease banks’ costs

In TBR’s view, offering new ways to solve persistent problems is precisely how HCLTech has tackled KYC. Banks budget a surprisingly substantial amount of their operational costs toward KYC, including funds dedicated specifically to paying fines if (read: when) they are found to be out of compliance.

 

In its European bank use case, HCLTech helped the client reduce its number of FTEs dedicated to KYC by 60% and also delivered an auditable, comprehensive and technology-enabled platform that the bank owns, operates and depends on. KYC challenges will never disappear: Money launderers — perhaps Venice’s second-oldest creation — will always be more creative than governments, banks and technology companies. But if HCLTech can substantially reduce banks’ KYC costs without compromising compliance, it is going to unlock considerable value for banks to invest in additional services, technologies and transformations.

 

TBR believes the keys to HCLTech’s success in KYC include:

  • Continuing to focus on being the “glue” between risk and compliance and IT: HCLTech has established credibility with the latter and has grown its credibility with the former, but both will remain essential to KYC transformation. Sticking to its comfort zone with technologists will limit HCLTech’s ability to scale a KYC solution.
  • Staying within its swim lane: Although it may seem to contradict the point above, HCLTech should focus on delivering comprehensive and highly functional solutions, in sync with the company’s engineering DNA. HCLTech executives’ willingness during the conversation to cede consulting territory to the Big Four firms (notably EY) struck TBR as exceptionally self-aware in assessing HCLTech’s role in the broader banking ecosystem.
  • Remaining patient: TBR has been briefed on countless transformational solutions that are ready to address burdensome costs with bullet-proof technology. And TBR has heard specific transformational use cases cited … but often three, four or five years later, raising the question: “If that solution was so great, why didn’t it scale?”

HCLTech may have something great here. With patience, discipline around partnering, and a focus on collaboration with the right clients in the right setting at the right time, HCLTech’s KYC solution could become a materially significant step forward in reducing banks’ operational costs and a genuine good for society when it comes to combating fraud and financial crimes.

 

Operators Target Emerging 5G Use Cases, but Monetization Will Remain a Challenge

Approximately 100 industry analysts in addition to representatives from well-known telecom operators and vendors convened at the 2023 5G Americas Analyst Forum to discuss the state of the 5G market in North America and Latin America. The event featured keynotes from Ulf Ewaldsson, president of Technology at T-Mobile, and Scott Blake Harris, senior spectrum advisor at the National Telecommunications and Information Administration’s Office of the Assistant Secretary. The event also featured a series of roundtable discussions focused on key topics in areas including 5G network infrastructure and technologies, private cellular networks, multi-access edge computing, IoT, regulatory considerations, and enterprise and consumer 5G use cases.

TBR Perspective

The 2023 5G Americas Analyst Forum highlighted that 5G development in the U.S. is in its middle stages as operators are on track to complete the bulk of their midband 5G spectrum deployments in 2024. The return on investment for 5G remains unclear, especially for Verizon (NYSE: VZ) and AT&T (NYSE: T) due to their heavy investment to acquire C-Band spectrum licenses.
 
Operators remain challenged in monetizing 5G because use cases, with the exception of fixed wireless access (FWA), are still limited, especially within the consumer market, as LTE remains sufficient to support current smartphone apps in most instances. Conversely, revenue generation for enterprise 5G use cases in areas including private cellular networks (PCNs) and multi-access edge computing (MEC) is taking longer than anticipated as many clients are postponing implementing these solutions until business cases and benefits become more certain.
 

Despite current challenges in monetizing 5G, investments in the technology remain necessary for U.S. operators to remain competitive with each other, to add network capacity to support rapidly growing data traffic, and to gain network efficiencies and cost savings as 5G is significantly better at handling network traffic compared to LTE. Additionally, new technology standards, including 3rd Generation Partnership Project (3GPP) Releases 16 and 17, are helping to unlock the potential of 5G solutions in areas including MEC, network slicing, industrial IoT and V2X (vehicle-to-everything) while the upcoming 3GPP Release 18 will debut 5G Advanced technology. Though the availability of these technologies will create 5G monetization opportunities, TBR expects hyperscalers, application developers, OEMs and other players within the technology industry to capture the majority of new revenue from 5G-related solutions, while operators will serve mainly as connectivity pipes to support these solutions.


Impacts and Opportunities

5G Adoption Is Accelerating in North America, but Revenue Generation Remains Minimal

Though North America leads other regions in the adoption of 5G-compatible devices and enrollment in service plans, direct revenue generation for operators from smartphone customers is limited due to minimal use cases besides providing faster data speeds. TBR believes operators are monetizing 5G in indirect ways, however, including by helping to ensure strong quality of mobile broadband service to minimize churn and by leveraging enhanced network capacity to support features exclusive to higher-tier service plans, such as larger high-speed mobile hotspot data allotments before speeds are throttled. Certain operators, most notably Verizon, are also limiting access to midband 5G services to customers enrolled in premium service plans.
 

FWA currently provides the most significant 5G revenue opportunity for operators, as evidenced by T-Mobile’s (Nasdaq: TMUS) and Verizon’s FWA services outperforming cablecos and other broadband providers in broadband subscriber growth in recent quarters. Government initiatives will also help to further FWA customer adoption and service availability, including via broadband funding programs as well as through financial assistance programs, such as Metro by T-Mobile offering discounted FWA pricing via the government’s Affordable Connectivity Program. However, TBR believes FWA will hinder revenue generation long-term when considering the entirety of the broadband industry due to FWA’s lower price points and the fact that most FWA customer additions stem from share shifting from other broadband providers. FWA will also encourage “race to the bottom” pricing, as cablecos and other broadband providers will likely become more aggressive on price in the long term to attract and retain customers.

A National Spectrum Strategy Is Vital to Support 5G Long-term While Creating a Foundation for 6G

Scott Blake Harris discussed the National Spectrum Strategy, an initiative headed by the U.S. Department of Commerce, NTIA and other federal agencies, including the Federal Communications Commission (FCC), to address the long-term spectrum requirements within both the public and the private sectors. The National Spectrum Strategy is expected to be finalized by the end of 2023 and is focused on creating a pipeline to enable the U.S. to maintain its leadership in spectrum-based technologies, ensure long-term spectrum planning in the U.S., and foster unprecedented spectrum access and management through technology development. A key priority of the National Spectrum Strategy is to improve communications between government agencies and the private sector and to identify and evaluate 1500MHz of spectrum in the U.S. that could be repurposed based on the requirements of both sectors over the next decade.
 

The clearance of additional spectrum will be essential for U.S. operators to support rising 5G traffic long-term while helping the U.S. to compete at the forefront of 5G development against other leading countries such as China. TBR believes the National Spectrum Strategy may be facing resistance, however, from federal entities hesitant to clear certain spectrum to the private sector as the CTIA reports the U.S. government controls 600% more midband spectrum than the commercial U.S. wireless industry. For instance, the Department of Defense has expressed reservations about clearing certain spectrum, such as within the 3.1GHz-3.45GHz range, due to national security concerns as the spectrum currently helps to support military infrastructure including defense systems.

Revenue Generation from Enterprise 5G Use Cases Will Be Limited for Operators as Other Players Within the Technology Industry Position to Capitalize on These Solutions

Keynotes and roundtables throughout the event discussed the benefits 3GPP Releases 16-18 will provide to support 5G-related network capabilities and use cases. The technology advancements provided by these releases will help to advance the development of 5G enterprise use cases in areas including MEC, PCN and IoT. However, hyperscalers, OEMs and other players in the telecom ecosystem are also making headway in these areas, which is causing operators to share revenue from these solutions in many cases and to be circumvented altogether in other instances.
 

For instance, AT&T’s, T-Mobile’s and Verizon’s go-to-market strategies for MEC have centered on leveraging hyperscalers’ partnerships to accommodate client demand for Amazon Web Services (AWS) (Nasdaq: AMZN), Google Cloud (Nasdaq: GOOGL) and Microsoft Azure Nasdaq: MSFT) solutions. In many cases, clients are opting to work directly with hyperscalers and OEMs in PCN, circumventing operators altogether.
 

Network slicing is another emerging 5G use case discussed throughout the event that is beginning to gain traction. T-Mobile is positioning to be an early leader in network slicing due to its time-to-market advantage in deploying 5G standalone nationwide. The operator recently launched its 5G network slicing beta program nationwide, which initially targets developers seeking to leverage the technology to enhance video calling applications, and T-Mobile will expand the platform to support additional applications and use cases in the future.
 

Initial companies exploring the platform include Dialpad Ai, Google, Cisco and Zoom. TBR expects operators will monetize network slices by providing specialized pricing tiers to optimize coverage and service quality for certain use cases and applications, though in most instances developers and other players will be the entities that will generate the lion’s share of new revenue from these use cases. TBR expects the scenario will be similar to the LTE era, in which operators served mainly as the connectivity pipes for new applications in areas such as ride-hailing and video streaming but other players captured nearly all of the new revenue.
 

Leveraging satellite connectivity to support mobile customers was another emerging use case discussed at the event. Satellite connectivity is making headway through new 3GPP standards releases and recent partnerships such as T-Mobile teaming with SpaceX, Verizon partnering with Amazon’s Project Kuiper, and Apple (Nasdaq: AAPL) collaborating with Globalstar. Satellite connectivity is initially being leveraged by operators to support emergency SOS texting services in remote areas without cellular coverage, though satellites will be leveraged to support more advanced voice and data capabilities in the future. Though partnerships between operators and satellite providers are promoted as mutually beneficial, opportunity exists for significant market disruption in the long term if satellite providers decide to offer nationwide satellite-based smartphone service directly to consumers once technology capabilities advance and a sufficient number of satellites have been deployed.

Conclusion

The 2023 5G Americas Analyst Forum highlighted the progress operators have made in deploying their 5G networks, especially regarding deploying midband 5G services. This progress, coupled with advancing 3GPP technology standards, provides operators with a foundation to target emerging use cases, especially within the enterprise space. Operators will be challenged, however, in sufficiently monetizing these use cases to generate a viable return on investment that offsets heavy 5G spectrum acquisition and infrastructure deployment costs.

One Lenovo: Creating a Cohesive Global Technology Solutions Company Begins with Unification

Lenovo Global Industry Analyst Conference (GIAC) 2023 was the first cross-business unit analyst event Lenovo has held since the start of the pandemic. The conference aimed to give analysts a view of the full breadth of Lenovo’s portfolio, the corporate identity weaving through the company’s various line-of-business (LOB) strategies, and the executives running the show from behind the scenes. Clear goals of the event were to drive market awareness of Lenovo’s capabilities, particularly around its Infrastructure Solutions Group (ISG) and Solutions and Services Group (SSG), and to contribute to the company’s multiyear effort to reshape its brand image and be known as a holistic technology solutions company.

Creating intersegment coherence with One Lenovo: CEO Yang Yuanqing

Chief among the challenges Lenovo has encountered as a global business is maintaining operational consistency while minimizing the increase in organizational complexity as the company scales. As Lenovo grew beyond selling PCs with the acquisition of IBM’s (NYSE: IBM) x86 server business in 2014, the business units did not necessarily create broad synergies beyond operations optimization such as component sourcing and manufacturing. More specifically, the sales motion became and remains quite fragmented due to the differences in use cases and end-user personas of PC and server purchasers. To address this, Lenovo is undergoing a transformation to become a more cohesive company instead of a siloed one. This transformation effort has been dubbed “One Lenovo” and represents both an internal process and philosophy shift as well as an external interface shift to unify and simplify the company’s go-to-market approach for its customers and partners.

 

Lenovo CEO Yang Yuanqing’s background and affinity for hardware underpinned his sweeping message, which was a simple and respectably grounded one: Lenovo will continue to have the DNA of a hardware company. In spite of the changes in the company, including a long-term diversification of revenue, Lenovo will continue to sell a massive amount of hardware, and the portfolio changes regarding the company’s vision around solutions and services will be additive in nature, not alternative, to provide customers with end-to-end solutions in a diverse set of commercial scenarios.

Lenovo is bringing AI to the data: CTO & SVP Yong Rui

Predictably, another main focus of the event was to showcase Lenovo’s capabilities and strategy in AI across the portfolio. This began with Yong Rui, Lenovo’s CTO and SVP, laying out the context of Lenovo technologies in eight areas: cloud and edge computing, advanced computing, wireless technologies, vehicle computing, device innovation, next-gen interaction, the metaverse and AI. Rui clarified that Lenovo’s play in foundational models would not be in creating such models but rather leveraging them in its future vision of AI ownership and accessibility.

The company believes that three primary buckets of AI models will emerge: public models (foundational models) accessible to all, private models accessible to a group (such as enterprises), and personal models accessible to a single individual. On top of this, Lenovo suggests these models will differ in size, location (and underlying hardware), and personalization. In essence, Lenovo contends that it will be bringing AI to data rather than bringing the data to AI, as the company envisions a future where each device Lenovo sells will have AI embedded in it.

 

However, large gaps remain in both Lenovo’s current capabilities and the overall AI landscape that must be bridged to realize such a vision. For example, the multimodal framework will need interoperability so that public, private and personal models can interact, which creates underlying challenges in governance and data privacy protocols. Additionally, the battle among foundational AI models in the market remains ongoing, meaning no one truly knows which models will survive and continue to be developed, making it difficult to future-proof innovations.

AI initiatives

Lenovo’s AI strategy is currently a broad one. It starts with the company’s core competency in hardware, positioning Lenovo as an AI-capable infrastructure provider through its data center server portfolio, which includes NVIDIA GPUs. On top of that, Lenovo has an edge server portfolio spanning form factors from data center servers to client devices, featuring a variety of silicon options, including Intel Atom, NVIDIA Jetson and AMD EPYC processors. To drive adoption of its edge servers, Lenovo has committed to investing $100 million in its AI Innovators program, which has begun introducing use-case-specific offerings with the goal of creating seamless, verticalized, outcome-based solution deployments. In storage, Lenovo has been targeting the entry-level market while partnering with WEKA to enable high-performance computing (HPC) and AI workloads through the combination of Lenovo’s software-defined storage platform and WEKA Data Platform software.

 

In Lenovo’s services division, SSG, AI activity is relatively nascent but developing quickly. The company is building capabilities and solutions that leverage its operational and customer data to train foundational models, initiatives designed to improve both customer support and internal operational efficiency. Lenovo also previewed two new consumption-based TruScale offerings for AI — Developing AI at Scale as-a-Service and Applying AI at Scale as-a-Service — which it first announced during its Lenovo Tech World event.

 

In its Intelligent Devices Group (IDG), Lenovo aims to compete in the AI PC space by leveraging its core PC portfolio. Lenovo envisions a world where PC users leverage AI to achieve hyperpersonalized experiences, and the company hopes the AI PC concept will accelerate PC refresh cycles and end the past year’s market slump.

 

In summary, Lenovo’s AI strategy spans all three business units, with the most mature and tangible offerings coming out of ISG while SSG and IDG continue to develop. From an overall organizational standpoint, the company is in the middle of the first wave of AI portfolio expansion, defined by a broad pursuit of applications: Lenovo remains in the stage of discovering where AI is a sensible fit and where it is not. What will follow is an eventual consolidation and clarification stage, in which the company separates scattered efforts from the initiatives that fit its vision and drives a focused expansion from there. It will be exciting to see how the strategy unfolds.

 

Dell Security Services: Steady, Smart and Positioned to Accelerate on Zero Trust Solutions

In a wide-ranging discussion with TBR, Dell Technologies’ Adam Miller, a marketing leader focused on cybersecurity, explained his company’s strategy in the security services space, including how Dell Technologies expects to stand out over the coming few years. The following analysis reflects both that discussion and TBR’s ongoing coverage of Dell Technologies.

20 years of experience and 1,000-plus customers

Dell Technologies (NYSE: DELL) is well known for its secure devices and infrastructure but is quickly catching up to peers in terms of name recognition around security services (see below for a description of the company’s security services portfolio). While brand recognition can be improved through marketing, acquisitions, and sustained, successful partnerships that deliver security services value to clients as part of multiparty engagements, Miller believes Dell Technologies will get a substantial boost from its expanding Services portfolio and its impactful Zero Trust security partnership with the U.S. Department of Defense (DOD).

 

As part of the initiative, named Project Fort Zero, Dell Technologies, in concert with 30-plus other technology partners, will deliver an “advanced maturity Zero Trust solution” — validated by the DOD — within the next 12 months. U.S. defense and intelligence agencies have long been viewed as leading-edge organizations with respect to cybersecurity, and vendors have often cited their experience providing security solutions to the U.S. federal government as a testament to their expertise and reliability. Dell Technologies should see a substantial increase in its brand value around security services as the company expands the Project Fort Zero initiative across the wider U.S. federal government and enterprise organizations.

 

In addition to giving clients a sense of its authority and dependability, Dell Technologies should also benefit from leading a loose consortium of security-related technology solution providers. The cybersecurity space is far too vast for any single player to truly be “end to end,” so partnering well across the ecosystem frequently separates leaders from middle performers and laggards. Dell Technologies should be able to leverage Project Fort Zero to solidify its partnering capabilities and demonstrate leadership.

Proven and low-cost strategy: Going to market as part of larger Dell Technologies

Like peers’ offerings, Dell Security Services are sold as an add-on to technology and services engagements, almost always with existing clients. Miller did not anticipate any change in that approach, and TBR believes executing well on a proven, low-cost strategy trumps aggressive sales campaigns and marketing blitzes.

 

TBR recognizes that the add-on approach could limit the growth of Dell Technologies’ security services, but the company can lean on its reputation as a trusted technology provider and its strong client relationships to ensure security services, at a minimum, keep pace with the larger company and are positioned to accelerate when market conditions permit.

 

As noted above, should Project Fort Zero significantly boost Dell Technologies’ security brand, the company may be able to use security services as a leading offering in its go-to-market strategies.

Dell Technologies streamlined its security portfolio following several divestments in recent years

Since closing the massive $65 billion acquisition of EMC in 2016, Dell Technologies has been a net seller in the M&A market, slowly refining its portfolio while paying down debt. This has involved divesting parts of its security portfolio, such as SonicWall and RSA. However, Dell Technologies very much remains a player in the cybersecurity arena, and these divestments have enabled it to sharpen the focus of its overall security strategy. Several of the company’s security offerings are now tucked under the APEX umbrella, such as APEX Backup, APEX Cyber Recovery and APEX Data Storage.

Additionally, Dell Technologies has organized its security services strategy around three fundamental aims: helping customers reduce their attack surface, protect their assets and data, and manage security proactively, all in service of building cybersecurity maturity. These families of offerings and services fit well with the company’s portfolio and overall strategy, aligning with its existing hardware products and creating opportunities for attached sales to larger IT infrastructure engagements.

Steady, smart and sane, with a boost coming from Project Fort Zero

Miller made clear to TBR that Managed Detection and Response (MDR) remains a target area for Dell Technologies’ investments, while noting that the company understands many peers in the security services space view MDR as a core offering. That understanding reflects exceptional self-awareness on Dell Technologies’ part about where the company fits within the broader cybersecurity ecosystem. Dell Technologies has strengths it can play to and believes the security services market has long-term potential.

 

In TBR’s view, Dell Technologies has not been trying to differentiate where differentiation is impossible and is not betting the farm on security services growth. Instead, the company is taking an approach that is steady, smart and sane. Add to that strategic approach a potentially highly beneficial solution validation from the U.S. Department of Defense, and Dell Technologies has positioned itself well for steady, and possibly accelerated, security services growth.

 

TBR’s coverage of Dell Technologies includes individual vendor coverage by the IT Infrastructure, Devices and Professional Services teams, along with various benchmark and market forecast coverage across TBR.