Atos’ sustainability play relies on ecosystems, science and leading by example


In mid-May, TBR met with senior leaders from Atos to discuss sustainability and Atos’ role as an ecosystem orchestrator, a services vendor and a role model for decarbonization. Jason Warren, VP head of Atos’ NetZero Transformation portfolio, and Miriam Hanckmann, head of Atos’ Digital Net Zero Portfolio, walked through a detailed presentation, including alliance partnerships, case studies and Atos’ overall strategy around sustainability. The following reflects both that discussion and TBR’s ongoing analysis of Atos.

Three characteristics of Atos’ approach may not be unique, but the combination is

Atos’ Warren and Hanckmann highlighted characteristics of their company’s approach to sustainability, which collectively may separate Atos from peers, even if other IT services vendors can claim one or two similar characteristics.



 

First, Atos knows it must work within an ecosystem and cannot provide services or advance sustainability goals alone, an approach that reflects the company’s overall ethos of addressing climate change. Second, even with that understanding, Atos has become practiced at being customer zero, demonstrating the value of the company’s sustainability efforts as blueprints for others. Lastly, and perhaps uniquely among peers, Atos relies on and boasts of a science-based approach that resonates with CIOs and other enterprise decision makers responsible for technology and sustainability. Atos leads with science, not just good ideas.

 

While these characteristics potentially separate Atos, some of the company’s offerings could be replicated by firms better positioned to deliver value around governance, risk and compliance, particularly the Big Four. In addition, Atos’ sustainability engagements to date have been heavily weighted toward Europe, which may limit the company’s global appeal and ability to deliver to clients worldwide. Even with these cautions, TBR’s overall assessment remains that Atos has built substantial credibility, experience and expertise around sustainability that should keep it among the leaders in the space, even in the event of a company split.

Building an ecosystem sometimes requires establishing a star for the system to revolve around

Within the consulting and IT services space, many vendors claim to possess end-to-end capabilities, a description only applicable when the vendor defines the ends of the spectrum. In reality, every client engagement includes ecosystem partners as every consulting or IT services business problem cuts across more technologies and business challenges than any single vendor can handle.

 

Sustainability takes that reality and stretches it beyond imagination. Among an enterprise’s strategies, the operations and responsibilities potentially implicated in meeting sustainability goals include — at the bare minimum — fleet and real estate management, procurement, supply chain, on-premises and cloud computing, emissions, and employee business travel. If no single IT services vendor or consultancy can address an enterprise’s full range of sustainability needs, partnering across a well-curated and constantly tended ecosystem becomes essential. On this, Atos may stand apart from peers.

 

During the discussion with Atos’ leaders, Hanckmann described the 330-plus-person EcoAct consultancy, acquired by Atos in 2020, as being staffed primarily with “climate PhDs.” She added that Atos’ technology capabilities, combined with EcoAct’s consultants, would help the combined companies’ clients use “digital to accelerate advisory … measuring emissions, gathering data, accelerating consultancy solutions.” Hanckmann also noted that the Atos-EcoAct combination creates a “green network for cultural alignment, learning from each other what digital and climate mean and how they interact, so clients get the full spectrum and maturity of conversations.”

 

In TBR’s view, Atos could potentially leverage its independence from traditional governance, risk and compliance work as added value to enterprises looking to ensure financial performance metrics are not unduly influencing sustainability metrics.

 

The last 24 months have been marked by an uptick in large consultancies and IT services vendors acquiring sustainability or decarbonization boutiques, so what makes EcoAct special is the ecosystem Atos folded it into, one that includes Atos’ long-standing relationships with Siemens, which is also one of Atos’ key accounts, and Johnson Controls, as well as EcoAct’s role in the Atos Climate Innovation and Knowledge Center (CLICK).

 

These special client relationships give Atos deep insight into sustainability challenges facing manufacturing companies, including, as described by Warren, “how to integrate products and provide real-time data.” Warren added, “Tech partners like Johnson Controls develop and deploy building management systems,” which Atos then aligns with hyperscalers and industry consortia to create a full package of data, analytics and decision making around decarbonization. Atos orchestrates others’ efforts to bring clearly defined value to shared clients. Within the CLICK, EcoAct consultants work with “academic, institutional, public, and private partners” to “develop methodologies, analytics tools, tap key areas of expertise, and support customers … driving standardization and normalization in reports and publications,” according to Warren.

 

While Atos’ ecosystem strategy may not be unique among IT services vendors, the company’s emphasis on partnering across a wide spectrum of sustainability stakeholders and actors, rather than touting stand-alone capabilities and offerings, demonstrates mature thinking about decarbonization and reflects Atos’ relatively long-standing commitment to climate change efforts.


‘Show me what you did, don’t tell me what you know’

The customer-zero approach — selling to clients based on strategies, initiatives and technology-based solutions deployed by the vendor within its own operations — has been widely adopted in recent years, becoming a resonant use case when applicable. In TBR’s experience, Atos has rarely presented itself as customer zero, but with sustainability the vendor has been showing clients its own standards and the internal lessons it has learned on adoption, measurement and change management. Clients, according to Warren, have been interested in not only the solutions but also what Atos did with its e-car fleet, remanufactured laptops and even its branding around carbon reduction.

 

Additionally, Atos built a carbon data platform internally, which the vendor now offers to clients as a tool, branded MyCO2Compass, either within a sustainability engagement or as a stand-alone offering on a subscription basis. Perhaps the best summation of Atos’ approach to selling its own record as a means of selling its services comes from the vendor’s list of company credentials, which begins with: “Net zero is key to Atos’ raison d’être.”

 

According to TBR’s March 2022 Digital Transformation: Cross-Vendor Analysis:

 

“As most enterprises consider sustainability to be part of broader DT programs rather than stand-alone initiatives, it is not surprising that buyers rank working knowledge of sustainability services-related compliance risk and privacy issues as the most critical attribute for vendor selection as they want to minimize business disruption.

 

“Vendors’ market awareness backed by ongoing industry knowledge and investments in their own sustainability programs also rank among the most critical attributes, as vendors that can demonstrate business outcomes through industry-aligned use cases typically alleviate buyer concerns around new investments.

 

“Becoming customer zero is a well-known approach, particularly around sustainability, as it can accelerate vendors’ opportunities in the space and supports them in two ways: first, as a PR vehicle and second, in building the necessary use case for conducting workshops. Price was among the important, but least critical, attributes for vendor selection, suggesting buyers are mostly in the exploratory stages and competition on the vendors’ side has yet to intensify.”




One last point on customer zero: As noted above, Atos’ sustainability engagements have mostly been in Europe, which could potentially be played as a strength in two ways. First, Atos’ efforts to meet European standards and regulations should resonate well with a predominantly European client base. Second, if Europe turns out to be a test bed for environmental regulation, then as countries in other regions begin adopting similar legislation and monitoring and reporting structures, Atos will be able to demonstrate its own successful adaptation to clients in those non-European jurisdictions.

Appealing to technologists through science

Finally, Atos leads with science, in a manner perhaps unique in the IT services space. Of course, all IT services vendors lead with and lean on technology, but Atos’ approach comes across as more foundationally rooted in a scientific approach to solving business, technology, operational and — in this case — global problems.

 

The Atos Scientific and Expert communities provide research and promote Atos’ approach to solving technology-related problems. Throughout the sustainability discussion, and in countless briefings over the last dozen years, Atos has consistently placed the science-based core of its offerings at the forefront. Warren and Hanckmann mentioned science-based net-zero target setting, cooperation with universities around climate research, and 160 patents relating to decarbonization and energy efficiency.

 

Warren noted, “Decarbonization is embedded in everything Atos does, and we’re working with R&D teams to devise reliable and tangible actions to deliver on carbon commitments in deals,” further cementing the scientific emphasis. In TBR’s view, Atos would benefit from leaning even more heavily into a science-based brand. Atos’ capabilities are rooted in science, not just ideas. While every IT services buyer prioritizes differently — and perhaps European buyers, especially around sustainability, are a bit more technology-oriented in contrast to U.S. buyers looking for consulting — most CIOs and CTOs, Atos’ core buyer personas, appreciate the emphasis on science and data.

 

According to the same TBR digital transformation research as cited above, “Regional differences in vendor selection criteria underscore the importance of employing a localized go-to-market approach. For example, the top vendor attribute in North America was price, while in Europe respondents ranked vendors’ sustainability services scope as No. 1. In APAC, working knowledge of sustainability services-related compliance risk and privacy issues was the top factor. We see European buyers as the most mature in both road mapping and executing around their sustainability initiatives as regional legislation requires compliance in certain areas, thus creating broader opportunities for vendors with comprehensive portfolio offerings that can also rely on and manage partner ecosystems.”

Can Atos’ approach maintain sustainability?

In TBR’s view, Atos’ current leadership around sustainability stems in part from the full combination of three characteristics: willingness to play across a wide ecosystem, dedication to implementing internally first and then rolling out solutions to clients, and leading with a science-based approach.

 

Atos’ continuing leadership role in the sustainability space may also depend on these characteristics, as a potential global recession, continued high inflation, and an overall weariness around climate change challenges diminish buyers’ willingness to spend on decarbonization efforts. By tapping a wide range of solutions through partners and continually demonstrating decarbonization and financial success, all underpinned by relentless science, Atos may help maintain the pressure necessary to sustain corporate interest in sustainability. A tall order, but no doubt a welcome challenge for a company that has made net zero integral to its overall mission.

 

Atos’ sustainability capabilities extend beyond what we have described above, and in the coming months, TBR will examine Atos’ and other IT services vendors’ and consultancies’ decarbonization efforts and offerings in greater detail, in both the individual vendor reports and the upcoming Decarbonization Market Landscape.

EY Managed Services protect clients from the bleeding edge of regulatory change

EY views managed services as a ‘no regrets business’

A discussion of EY’s managed services strategy in the context of the firm’s overall operations kicked off the EY Managed Services Analyst Summit. EY Global Vice Chair – Markets Jay Nibbe touched on the rumors around the firm’s operating model with the cryptic statement that regardless of how EY looks from a financial reporting standpoint, managed services will continue to be a strategic aspect of the EY business or businesses.

 

In Nibbe’s view, managed services are strategic to the pivot EY and its peers are making in the market. Nibbe described this shift as going from an advisory and compliance model to a report-advise-operate model. Data-driven insights are provided to clients, EY advises and assists with transformation and change management, and then EY operates the critical services through its ongoing managed services capabilities.

 

A $750 million investment underpins EY’s commitment to growing out its managed services portfolio, with more money to follow. Nibbe described managed services as a “no regrets business,” as in no regrets to continue investing in the space.

 


 

‘Managed services’ is an improper label for its portfolio, according to EY

Global Vice Chair – Managed Services Paul Clark reinforced Nibbe’s commitment, saying managed services currently account for 18% of EY’s total revenue, with a $360 billion total available market estimate. Clark also called managed services “a broad church” and stated EY is not after traditional ITO or BPO engagements.

 

Therein lies EY’s branding challenge. Many view managed services as a new label for the labor arbitrage outsourcing services that rose to popularity at the turn of the century. These BPO services were colloquially described as “handling the mess for less” and do not, EY believes, accurately depict the value proposition of the new suite of services infused with AI and resting on top of a standard data platform such as EY Fabric. For example, EY likened legacy BPO offerings to bookkeeping, whereas its own service is akin to accounting.

 

Most EY managed services engagements start with an advisory engagement. Pandemic pressures held a mirror up to customers’ operations as they struggled to continue their legacy practices amid remote working. Further, the increasing volume of regulatory change across the globe makes it hard for multinational enterprises to keep pace, increasing their risk of being out of compliance.

 

Regulatory change and associated business risk result in greater boardroom attention to operations, with a focus on making sure the business processes work. Handling the mess for less, EY asserts, does not resonate with the board when current operations leave the company open to risk and regulatory fines for noncompliance. Savvy managed services buyers want to know the process will better monitor outside-in changes to their business environment and will provide advice on the impact to internal operations.

A global data platform, EY Fabric is EY’s distinctive accelerator for managed services

EY said little about infrastructure technology, and yet the value propositions discussed throughout the day repeatedly referenced EY Fabric. A cloud-based data lake infused with AI and machine learning, EY Fabric is distinctive in that it supports one global operating model. A single global operating model requires a standard set of business rules and inordinate amounts of data wrangling before any analytics can be applied against the data for business insights.

 

For years, technology vendor events have brought forth clients to share horror stories of trying to get a standard data model right. That EY, a global partnership, was able to settle on one global data model internally and then drive it out to market is a testament to the EY operating culture and a boon to its managed services practice.

 

EY Fabric automates data wrangling for EY clients: it extracts data from client systems, normalizes the data in EY’s data lake and runs proprietary algorithms against it. Finally, EY Fabric reports fact-based insights and change management recommendations to the client. From those advisory engagements flow the managed services agreements, in which EY “lands” by addressing the topmost set of operational pain points and then “expands” through that proof of value into adjacent service modules.
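That extract-normalize-analyze-report flow can be sketched in miniature. The snippet below is purely illustrative: every field name, mapping rule and threshold is invented for the example and does not reflect EY Fabric’s actual implementation.

```python
# Illustrative sketch of a managed services data pipeline:
# extract -> normalize -> analyze -> report.
# All names, mappings and thresholds below are hypothetical.

def extract(client_records):
    """Pull raw records from a client system (simulated here as a list of dicts)."""
    return list(client_records)

def normalize(records):
    """Map heterogeneous client field names onto one standard data model."""
    field_map = {"amt": "amount", "amount_usd": "amount", "cur": "currency"}
    return [{field_map.get(k, k): v for k, v in rec.items()} for rec in records]

def analyze(records, threshold=10_000):
    """Run a simple compliance-style rule: flag transactions above a threshold."""
    return [r for r in records if r["amount"] > threshold]

def report(flags):
    """Summarize flagged items as a client-facing insight."""
    return f"{len(flags)} transaction(s) exceed the review threshold"

raw = [{"amt": 5_000, "cur": "USD"}, {"amount_usd": 25_000, "cur": "EUR"}]
insight = report(analyze(normalize(extract(raw))))
print(insight)  # 1 transaction(s) exceed the review threshold
```

The point of the sketch is the ordering: normalization onto a single data model happens before any analytics run, which is the property the one-global-model claim depends on.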

Critical alliances link EY Fabric to customer instances and orchestrate the services

EY is quick to say it orchestrates and deploys popular commercial software applications rather than builds software. Partners SAP, Microsoft and ServiceNow joined EY on stage at the managed services event. SAP represents the legacy application layer housing most EY client data that must be extracted and run against EY algorithms for business insights. Microsoft underpins the cloud-first EY Fabric and co-innovates with EY on the hooks into customer data. ServiceNow provides the base workflow shell for many of the EY managed services workflows.

 

Other partners exist to provide necessary information feeds, but these three underpin the platform. In emerging technology, EY uses its overarching theme of “making sure it works” to explain why it is reluctant to embed software from smaller companies into its services. It stated it will integrate and orchestrate such offerings on behalf of clients, but it does not intend to be on the technology bleeding edge. Its focus, in TBR’s view, is to protect its clients from the bleeding edge of regulatory change.

EY organizes its managed services into five broad categories

EY visually represents its managed services offerings as five suites that all revolve around data and AI, with the EY Fabric platform at the core. Some operational themes cut across suites, but how the portfolio is arranged is immaterial to the way in which EY pursues managed services.

 

For EY, the pursuit starts with determining the top pain points customers seek to address, then conducting a business assessment and presenting recommendations on how the EY managed services components can improve operational flows and reduce business risk in the process.

 

Each module has been written under the one global EY architecture in a cloud-first, containerized fashion running on Microsoft Azure. As such, the mixing and matching of services integral to the “expand” element of a land-and-expand process becomes a matter of activating new services once proof of value has been demonstrated.

 

The core modular groupings of EY managed services are:

  • Finance and Tax is by far the largest segment of EY’s managed services portfolio, as expected from an advisory firm with tax and audit lineage. EY Fabric brings the potential of moving to a continuous audit function based on ongoing AI monitoring of the regulatory environment, which is then mapped against the client’s business parameters to create a custom set of action items for the client. EY Virtual Internal Audit is at the core of this disruptive capability. These capabilities augment internal audit functions, enabling internal teams to shift focus in real time based on the automated advisory notices EY algorithms generate from reviewing regulatory notices.
  • Risk and Cyber grows in client importance with each passing data privacy law and well-publicized security breach. Here EY relies on partnerships for threat monitoring, ingesting the feeds into its AI engines to proactively push alerts and recommendations to its client base. EY claims its cyber practice is growing 30% year-to-year and sees further upside as cyber engagements continue converting to managed services clients.
  • Talent is an area EY expects to grow rapidly. Accelerated by the pandemic, these EY services are aimed as much at managing the regulatory environment for payroll across multiple countries as they are at improving user experience. From its global platform, EY provides a set list of standard forms employees need for various work verification requirements for home loans, among other things. Additionally, EY talent customers can offer EY tax preparation services to their employees as well as access to EY education modules called EY Badges for ongoing professional development.
  • Legal is a domain EY bolstered with the acquisitions of Pangea3 Legal Managed Services (part of Thomson Reuters) and Riverview Law. EY’s internal relationships with certain clients, including those more acquisitive in nature, allow the firm to position itself for new engagements in an event-driven business. Leaning on its existing relationships and strengths around contract management and compliance, EY will create repeatable processes that help clients execute on legal managed services contracts.
  • Sustainability is a hot topic industrywide, and it is where the notion of evaluating risk for competitive advantage comes to the fore. By anticipating regulatory change, EY clients can evaluate the upside and the risk of existing businesses against the anticipated backdrop of global sustainability regulations. Additionally, increased scrutiny around the measurement and reporting of environmental, social and governance (ESG) metrics aligns with EY’s auditing resources to secure data management and sharing, validate data, and prepare reports from a standardized perspective.

Like many managed services, EY’s services have evolving commercial constructs

Multienterprise business offers are in a state of commercial flux as legacy license models give way to “as a Service” models. Most commercial contracts are between EY, as the solution orchestrator, and the client. Its strategic partnerships with Microsoft, SAP and ServiceNow also mean it can negotiate terms with greater flexibility and cooperation than most partners of those firms.

 

With customers, the EY value proposition revolves around outcomes and cost avoidance. The value proposition is not about labor arbitrage but rather about real-time access to insights from EY’s domain experts, baked into the offer through AI automation and expert pattern recognition. In this sense, EY’s value proposition is strategic staff augmentation rather than data entry staff augmentation.

 

The explanations and use case examples for this strategic staff augmentation were clear at the event. The regulatory environment is moving too fast for individual firms to hire the requisite number of domain experts to reduce risk. It is better to rely on the outside experts ahead of the audits to reduce risk exposure and better inform the accelerating strategy cycles.

 

Examples offered by EY of this point included:

  • Over 57,000 regulatory alerts in 2021 and the $6 trillion cost of breaches; Virtual Internal Auditor handles those alerts in real time
  • $537 billion cost to enterprise for data integration/data wrangling, which EY Fabric does, with two-thirds of the enterprises surveyed intending to spend more in the future
  • A Talent entry point is to assume payroll responsibilities in the second-tier countries where enterprises operate, given EY has domain expertise in 160 countries under one global data model. Further, EY has a real-time chatbot that can answer strategic staffing queries for engaged leaders. EY claims this is the first of its kind in the talent management space.

 

One of the clients speaking on the panel exports products to 140 countries and has a localized presence in 50, with only three tax specialists. The company’s Group Tax, Customs and Insurance head summarized its contract outlook this way: “We are not after a discount; we want to get it right.”

 

According to this client, the EY value proposition was that EY would return twice the cost of the managed service back to them in savings and cost avoidance. They stated that, to date, EY has returned close to $900 million in savings to the company.

Market implications

Tax and audit firms are extremely well positioned in the IT industry writ large: they are the final translation layer between business processes and automated data flows, translating business rules into the bits and bytes orchestrated by traditional technology vendors. EY has a distinct advantage and a value proposition that focuses more on business risk and strategic planning than on cost savings.

 

Lone enterprises cannot afford to keep staff who are also knowledgeable of the rapidly shifting regulatory environment. With this in mind, EY aims to become the real-time advisor to these internal operations through automated delivery of curated EY IP based on its domain expertise.

 

While EY managed services might cost more than the in-house labor, the managed service will reduce the liability of noncompliance and likewise boost the strategic planning scenarios ahead of expected regulatory changes.

 

The implications to EY competitors are broad.

  • India-centric vendors whose value proposition rests on labor arbitrage must show greater value in risk mitigation and domain expertise. Some are wisely partnering with EY in specific use cases where EY’s service provides an additional value layer for the India-centric vendor’s client.
  • Traditional ITO and BPO vendors face a similar threat. Can these vendors, through alliances or staff hires, provide the domain IP EY is capable of curating and rapidly scaling on the EY Fabric platform?
  • Emerging technology vendors will be well served by entering discussions with EY on how they can integrate into the EY Fabric. While EY is selective, gaining the EY seal of approval would go a long way toward validating the long-term viability so critical to global enterprise decision makers.

 

Clark says the only thing holding EY back is more orchestrators. These orchestrators include project heads with the domain expertise needed to curate client processes for ingestion into EY Fabric, as well as orchestrator AI chatbots run against the increasing volume of regulatory changes flowing from the public sector, as governments seek to keep pace with the business change that technology unleashes on the economy and the environment.

 

Relative to peers, EY is better positioned to meet the challenge given the sound fundamentals it deployed in building out a single, global data platform to scale its managed services offerings.

PwC, the SEC, and sustainability

From annual and manual to automated, measured and sustainable reporting

In early May, TBR met with PwC’s Casey Herman, the firm’s US ESG Leader and a longtime PwC Partner, to discuss PwC’s views regarding developments around sustainability and decarbonization. Herman explained that PwC’s clients increasingly understand the need to apply enterprisewide accountability to sustainability efforts, including bringing investments and the measurement and reporting of standardized metrics up to par with financial disclosures and responsible corporate governance.

 

Unlike traditional accounting systems and IT processes, which have benefited from decades of development around ERP systems and recent advances in automation, sustainability reporting often remains “annual and manual,” in Herman’s colorful turn of phrase. For enterprises, he added, quantifying the impact of moving to sustainable operations, either through self-imposed changes or legal and regulatory compliance, will be key to change and success. And just as IT and financial decision-making benefits from consistent, reliable and frequent metrics, sustainability will also need to move to a more frequent cadence, with more standardized inputs and outcomes.

What clients need and what the SEC wants

Helping to accelerate that change — potentially — the Securities and Exchange Commission’s (SEC) March 21, 2022, release of proposed rules around climate change disclosures gave U.S. companies and consultancies, like PwC, a clear and defined rallying point for understanding near-term climate change strategies and goals. Road maps, data orchestration, change management and, of course, governance, risk and compliance can now all circle back to these proposed rules, expected changes and likely timelines. When TBR asked Herman what, other than the weight of the SEC, might compel companies to fully embrace these changes, he suggested that compensation metrics tied specifically to hitting sustainability goals will likely be the most compelling force. He noted that the market’s two greatest needs at present — automation and quantifiable results from shifting to sustainable operations — remained unmet, but the SEC’s proposed requirements could accelerate progress on both.

 

According to Herman, the proposed SEC guidance asks companies to do three things:

  1. Disclose climate-related risks that may have a material impact on assets and the business.
  2. Disclose, and subject to third-party assurance, Scope 1 and Scope 2 emissions, and disclose Scope 3 emissions if they are material or if the company’s sustainability goals include Scope 3. Herman added that Scope 3 is “an estimate, not a measurement,” which may be why the SEC has not added an attest requirement for Scope 3.
  3. Include a footnote in financial statements describing any historical costs and investments directly related to the impact and remediation of severe weather events and mitigating risks related to climate change — essentially, what has the company already spent on these efforts. Notably, because the SEC proposes this requirement be in a footnote, it too would be audited.

Herman speculated the timeline for adopting these requirements could be pushed beyond the presently proposed 2023 start date (reported in 2024) and that the phasing of the audit requirements may evolve through the public comment period and subsequent SEC revisions.

What PwC can do for clients: enable measurement, plan for success and implement change

After detailing PwC’s views on the SEC’s proposed rules, Herman circled back to how PwC helps its clients, outlining four essential services:

  1. Assisting clients with their reporting, valuation, and measurement of key metrics and KPI aspects related to sustainability — Based on the firm’s heritage and current capabilities, Herman noted, “we do this quite well.”
  2. Technology enablement of reporting, valuation, and measurement — Herman explained that most clients use non-enterprise-grade technology for their valuation and measurement (the “annual and manual”), which lacks automation, AI and dynamic decision-making tools. PwC, in TBR’s view, has invested heavily in recent years to accelerate and amplify the firm’s technology capabilities, including around automation, low-code, and AI platforms, positioning it well for the next technology evolution in sustainability.
  3. Net zero strategies and sustainable business strategies — Similar to valuation and measurement, strategic planning and governance are firmly within PwC’s wheelhouse.
  4. Implementation (of all the above) — Including organizational culture and change management, tax strategy consulting, and other related ESG services and solutions associated with sustainability and decarbonization.

In TBR’s view, PwC’s range of services reflects the firm’s evolution toward a technology-forward company still rooted in its core competencies and legacy values.

Regulatory pressures and consulting capabilities sustain sustainability

Sustainability has trended before, and already the signs of a global recession, lingering supply chain challenges and an ongoing war in Europe threaten to push sustainability and decarbonization back to the back burner. TBR pressed Herman on what might compel change this time: Why will companies invest in new technologies and adopt new reporting requirements, other than to do the minimum to meet regulations? Herman suggested TBR follow the money. When metrics around decarbonization drive investor, lender and customer decisions, as well as potentially compensation, particularly within the C-suite, enterprises will adjust accordingly and put meaningful investments into measuring and sustaining their sustainability goals.

 

In TBR’s view, two other intertwined forces may act as accelerants to adoption: political pressure to meaningfully enact and then enforce the SEC’s proposed rules, combined with consultancies and technology vendors leveraging that pressure to move their clients to act. Sarbanes-Oxley and Dodd-Frank come to mind when considering how regulatory pressures can create a favorable climate for consulting services around sustainability.

 

We believe that if the SEC’s rules reach adoption and receive credible, consistent enforcement, PwC may increasingly become a necessary sustainability collaborator for its clients. Even uncertainty around the regulations’ timeline, scope and enforcement plays to PwC’s strengths, positioning the firm to provide clients with essential advice on staying on the right side of climate change while securing growth and reducing risk.

 

In July TBR will publish a Decarbonization Market Landscape examining the commitments, investments and actions to date around decarbonization by a select group of IT services vendors and consultancies. We will also detail the offerings those vendors bring to their clients to help with reaching decarbonization outcomes. Access this new report as soon as it publishes with a 60-day free trial of TBR Insight Center™.

Lenovo ISG is building the road map to become a market-leading infrastructure provider

Lenovo is executing on ambitious growth objectives

At Lenovo’s ISG Analyst Summit, Infrastructure Solutions Group (ISG) Executive Vice President Kirk Skaugen said Lenovo’s ultimate goal is to become the world’s largest IT infrastructure provider, and the group is executing on strategies across all of its segments to meet this objective.

 

Lenovo’s ISG growth over the past two years has been strong: The group closed its fiscal 2022 with over $7 billion in revenue, up from $5.5 billion in 2020, and achieved a profitable operating income. Lenovo ISG is still relatively small compared to Dell Technologies ISG and Hewlett Packard Enterprise (HPE), which reported $34 billion and $28 billion in revenue, respectively, in 2021. But Lenovo’s rapid growth and road map position it to overtake smaller vendors in server and storage market share over the next five years.

 


 

Lenovo’s growth strategy has two main facets: building ISG brand awareness and honing its business strategy across the core ISG segments. Lenovo ISG generates about 10% of corporate revenue, compared to roughly 84% for the devices business, making building awareness critical to gaining share in infrastructure segments such as server, storage, hyperconverged infrastructure and high-performance computing.

 

Lenovo is expanding awareness of its ISG portfolio through corporate advertising and sponsorship initiatives as well as tactically through the company’s One Lenovo strategy. One Lenovo will more closely integrate the devices and infrastructure go-to-market motions, particularly around sales and channel compensation, to ensure customers are aware of the full Lenovo portfolio and to incentivize business referrals across the two groups.

 

Honing the ISG business strategy to deliver growth across all business segments is a more detailed and complex endeavor that consists of several operational, product and go-to-market initiatives. Core initiatives include:

  • Become a trusted partner
  • Capture vertical-specific edge compute demand
  • Accelerate growth in CSP business
  • Differentiate in Enterprise and SMB (ESMB) with services and cloud capability
  • Build nuanced portfolios that address geo-specific needs
  • Leverage in-house design and manufacturing for competitive advantages

Become a trusted partner

Among the many themes emphasized during the summit, one stood out from multiple Lenovo executives: Lenovo strives to earn and retain trust with its partners and, especially, its customers.

 

Skaugen made this point time and again with external evidence and KPIs Lenovo keeps at the forefront of its organization, such as being named partner of the year by Nutanix, VMware and Microsoft. The company also boasted a 93.6% on-time delivery rate, which measures how accurately its delivery timeline estimates match actual execution.

 

Despite the uncertainties created by global supply chain disruptions, Lenovo wants partners to know they can rely on the company’s word, even if that means quoting conservative delivery lead times and leaving deals on the table that could dilute customer trust in Lenovo.

Capture vertical-specific edge compute demand

Edge compute has no dominant players yet, but countless vendors are vying for share in the emerging market. Lenovo believes its portfolio, spanning the data center to the edge to individual pockets via mobile devices, differentiates it from competitors.

 

Beyond its expansive vision, Lenovo’s head of edge infrastructure, Charles Ferland, made it clear the company’s edge compute technology is itself differentiated, with practical, customer-centric design baked into the innovation process. Lenovo believes its design strengths include physical features ranging from noise minimization and a broad range of form factor sizes to security features such as encryption and tamper protection, as well as connectivity and automation features.

 

Lenovo’s portfolios range from a “backpack fitting” form factor to high-performance, compute-dense form factors that enable on-site AI inferencing and data-intensive applications. Lenovo will take a vertical-centric approach to building edge solutions with the help of its Project and Solution Services group, which has a long-standing history of developing vertical-centric solutions, such as retail and quick-service restaurant capabilities. To date, Lenovo’s edge customers range from small nonprofits to large retailers with thousands of locations worldwide.

Accelerate growth in CSP business

Over the past five years, Lenovo has grown its cloud service provider (CSP) business, which designs and manufactures IT infrastructure for cloud service providers, to about $3 billion in annual revenue. The ODM business model is generally considered less profitable than the OEM model, but Lenovo is well positioned to generate profitability, having brought its entire ODM process in-house.

 

In Lenovo’s “ODM+” model, the company provides in-house design and engineering services, builds the infrastructure in its own manufacturing facilities and provides global deployment services, giving Lenovo an opportunity to cut out costs from other vendors while providing customers with end-to-end service capability.

Differentiate in ESMB with services and cloud capabilities

Lenovo’s ESMB segment accounts for roughly half of ISG revenue and faces stiff competition from fellow OEMs, as vendors fight for limited share in a market where growth is constrained by cloud erosion. Lenovo’s approach to adding value in the ESMB segment is similar to that of competitors, focused on providing a portfolio of end-to-end, “as a Service” solutions and developing private cloud and hybrid cloud capabilities.

 

Lenovo is rapidly expanding its portfolio of TruScale subscription offerings, which currently includes storage, hybrid cloud, multicloud, virtual desktop infrastructure and SAP solutions, in addition to device-specific offerings. Beyond adding use cases, Lenovo will expand the capabilities of the TruScale platform with more options for metering, AI-based insights, and an enhanced user interface for management and automation.

 

Like peers, Lenovo has built a portfolio of private and hybrid cloud solutions leveraging alliances such as Microsoft Azure Stack and VMware-based cloud offerings. Lenovo’s strategy diverges from peers in the storage space, where Lenovo offers cloud services through its partnership with NetApp, including the ONTAP operating system and NetApp Cloud Volumes services. Other vendors, such as Dell Technologies and Pure Storage, are rewriting their storage operating systems to run on major public clouds or acquiring software companies that specialize in cloud management.

 

Regardless of vendor alliances used in building solutions, Lenovo intends to be an end-to-end cloud solutions provider and the accountable point of contact for customers.

Build nuanced portfolios that address geo-specific needs

Lenovo believes its competitive advantage is its positioning as a global vendor that can deliver on localization. With its focus on bringing design services and manufacturing completely in-house, Lenovo can provide local customization for its products that meet the specific needs of the region, including power specifications and component configurations. Lenovo’s presence with headquarters in both the U.S. and China gives it the ability to separate its businesses for the two countries, including software design and hardware manufacturing, to navigate geopolitical pressures seen across the market.

 

Partner strategy is also key to tailoring the IT infrastructure portfolio to specific markets, particularly China, where local vendors dominate market share, unlike other markets where global ISVs and CSPs hold a commanding presence. Serving China therefore requires a deep and specialized partner base.

 

Following the successful implementation of a joint venture with NetApp in China, which began in 2018, Lenovo also announced it is expanding its partner investments with a new APAC-based technology solutions business, PCCW Lenovo Technology Solutions Limited (PLTS). Lenovo will own 84% of PLTS, including 80% direct interest in its partner PCCW. PCCW Solutions is an IT services provider with over 4,000 employees specializing in systems integration, application development and operations, which will be complemented by Lenovo’s infrastructure portfolio and close-to-the-box services.

Leverage in-house design and manufacturing for competitive advantages

Throughout the event, Lenovo made it clear that despite challenges in 1Q22, the company’s supply chain and vertical integration are among its most valuable assets. As part of its ODM+ model, Lenovo has in-house design and manufacturing capabilities, which allow it not only to supply hyperscalers but also to optimize cost in ways traditional OEMs cannot.

 

Lenovo utilizes the Root of the Tree model, which uses adaptable base designs that act as the “root” and can be modified to bifurcate into distinct products that fill various performance and use-case niches. Significant advantages of this model — and key tenets of the philosophy — are the reuse of designs and the commonality between products, which unlocks otherwise inaccessible component procurement scale and simplifies assembly, ultimately lowering cost. Notably, this reuse of designs and components can be leveraged across portfolios — across ISG products as well as between ISG and CSG products — through shared XPUs, motherboards, cabling and even certain chassis.

According to Lenovo, high-performance computing clients are taking full advantage of its robust, in-house design capabilities. For next-generation exascale supercomputing, Lenovo emphasized two primary factors that differentiate its approach from competitors: the modularity of its systems and its energy efficiency.

 

Competitive exascale offerings mainly consist of large rack-sized units, many larger than standard server racks and weighing thousands of pounds. Customers can purchase Lenovo’s exascale systems down to the individual server, enabling smaller customers without large capex budgets to access the same technology as large enterprise customers, at lower scale.

The other differentiating factor is Lenovo Neptune, a direct warm-water cooling system that eliminates the need for refrigeration, thus reducing power consumption. By decreasing energy consumption, Lenovo Neptune also reduces energy costs, aligning with customers’ environmental objectives to lower emissions.

Conclusion

Lenovo’s ISG division remains small relative to market share leaders, but the company is invested in building the business into a formidable competitor. With a multifaceted strategy that starts at the corporate level with One Lenovo initiatives and extends through product-centric development strategies and robust design, manufacturing and supply chain capabilities, Lenovo is well positioned to maintain, and even accelerate, its revenue and profitability growth.

PwC’s Industry Cloud strategy delivers on 3 major cloud trends

PwC’s ambitious Industry Cloud strategy aims directly at heart of current trends

After spending an afternoon at PwC’s Boston Seaport offices in late March, TBR came away with a clearer picture of how the firm is centralizing its capabilities into solutions to be utilized in client engagements. It is a strategy that has been developed cautiously and thoughtfully over time, mirroring the firm’s overall evolution of the last few years, which has been both methodical and ambitious. The new Industry Cloud strategy is firmly in line with the firm’s DNA and also aligned with the most current trends in the cloud market: services, collaboration with partners, and industry-specific alliances and preconfigured ecosystems.

The importance of services in cloud adoption and utilization has only increased over the last two years. The migration of mission-critical workloads and skills shortages have stoked demand for third-party firms to help implement and manage cloud solutions. PwC is tightly integrating services with all the cloud assets being deployed for the firm’s customers, which is an evolution of the long-standing Integrated Solutions program, incorporating the best of PwC’s consulting business across all platforms.

PwC’s Alliance strategy is integral to the Industry Cloud strategy: Through these collaborations, PwC selects widely accepted and widely adopted cloud technologies to include in the firm’s recommended cloud solution frameworks, then fills the gaps between those individual technologies. The key is not recreating the wheel with technology that already exists but using alliances to bring the leading solutions together across multiple vendors. This ties into broader PwC strategies to use automation, scale and commonalities to reduce deployment times, by as much as half in some cases.

A key tenet of PwC’s strategy is also to build common cloud services that bring industry- and sector-specific practices and prebuilt configurations to accelerate adoption timelines and reduce custom work. For a variety of reasons, customers are looking for diversity in their IT and cloud vendor landscapes, and PwC’s open solution frameworks cater to that desire. Lastly, industry specificity is an emergent trend in cloud. PwC is addressing the industry specialization void in the market by bringing together industry-leading technologies, tying them together with an integration fabric, and filling any gaps with its own services and innovation based on PwC’s deep experience and investments. These solutions can then enable customer business transformation spanning the front, middle and back office.

Industry customization ties the solutions together, as it reduces the need for custom services and is done in tight collaboration with cloud vendors’ technology. In this special report we detail these trends and PwC’s cloud strategy. In short, we see PwC’s strategy as well developed and aligned not only with the firm’s core DNA but also with some of the most current trends and developments in the market.

Free webinar: The vendor impact of cloud ERP picking up speed



Industry cloud is moving from a nice-to-have to a must-have

Enterprise maturity around horizontal cloud capabilities has resulted in a growing appetite for solution customization built around highly nuanced, industry-centric needs. This rising need will be addressed by both cloud vendors and services firms like PwC. Vendors have traditionally leveraged partnerships to add vertical functionality and go-to-market support to their solution sets, but that strategy has become even more aggressive recently, with multiple acquisitions being announced.

Oracle’s (NYSE: ORCL) intended acquisition of Cerner, Microsoft’s (Nasdaq: MSFT) purchase of Nuance, and Salesforce’s (NYSE: CRM) strong alliance with Veeva (NYSE: VEEV) are all examples of how vendors are investing to offer more industry functionality to customers. Cloud vendors are also supporting industry-based go-to-market ambitions by augmenting their approach with an increased reliance on ecosystem partners across the IT continuum.

While tech partnerships have accelerated industry-based solution design and development, evidenced by Microsoft’s partnerships with both Rockwell (NYSE: ROK) and Honeywell (Nasdaq: HON), engagement with IT services entities will be just as critical to facilitating adoption among customers with industry-fluent advisory, road-mapping and implementation support services.

Specifically, in verticals like industrial manufacturing, client DNA is rooted in hardware-legacy organizational models and waterfall innovation, and many clients lack not only the knowledge to support software-driven business models but also an understanding of the outcomes emerging technology — be it cloud, IoT or AI — can bring to their operations. This knowledge gap plays to the strengths of the professional services side of the IT spectrum, where innovation centers pair educational resources with business cases to show prospective clients what their own digital transformation (DT) could look like.

Not only has vendor activity around industry cloud picked up, so too have financial results, as end customers increase adoption of these solutions. Customers see industry cloud capabilities as value-add elements of their cloud technologies: According to TBR’s 2H21 Cloud Applications Customer Research, the most compelling benefits are the ability of industry cloud offerings to meet regulatory requirements and to match the unique business and IT workflows within certain industries, while the ability to free up resources was the least cited benefit.

TBR’s perspective on PwC’s alignment with industry cloud trends: ‘Micro alliance activation’

      • PwC is not “boiling the ocean” with its approach to Industry Cloud, instead focusing on heavily regulated industries as the firm looks for ways to not only meet regulatory requirements but also leverage investments to competitively differentiate itself with enhanced time to market and ongoing operational excellence. While many vendors on the technology side have taken an even more focused approach to industries, we believe PwC’s strategy is appropriate for the firm, given its partner-driven engagement focus and existing presence within these industries.
      • PwC’s approach aligns to the third most selected benefit of industry cloud: “We are in the early phase of cloud adoption and are pursuing industry cloud services as a preliminary step in the process.” Many companies are still in an early stage of their cloud adoption. Regulations are more stringent in these industries, creating real and perceived barriers to adoption. In many ways, industry cloud is the ramp these customers need to get started using cloud in more significant ways as part of their IT strategy.

Cloud partnerships are moving from important to critical

The shift to partner-led growth is not a new trend but is being further legitimized in 2022. Growth from indirect, partner-led revenue streams has been outpacing direct go-to-market efforts for several years, but indirect revenue is reaching a new level of scale and significance in the market.

TBR estimates indirect cloud revenue is approaching 25% of the total cloud market opportunity, which is a significant milestone. For reference, in traditional IT and software, indirect revenue represents somewhere between 30% and 40% of revenue streams. We expect the indirect portion of the cloud segment to surpass that level within five years, approaching half of the market opportunity within the next decade. For all cloud vendors, the combination of short-term growth and long-term scale makes partnerships an increasingly critical element of their business strategy.

Partner ecosystems have been a core part of the IT business model for decades, but the developments around cloud will be different for various reasons, primarily because the labor-based, logistical tasks of traditional IT are largely unnecessary in the cloud model.

For cloud vendors and their partners to succeed in growing the cloud market, both need to focus on enabling business value for the end customer. Traditional custom development becomes cloud solution integration. Outsourcing and hosting are less valuable, while managed services are far more valuable for cloud solutions. To capture this growing and sizable opportunity in 2022, we expect companies to adapt their partner business models and vendor program structures to align with vibrant cloud ecosystems.

TBR’s perspective on PwC’s partner strategy

      • PwC is being proactive in how it leverages alliances, recognizing that winners in industry cloud rely on alliances and that the industry data model is only as good as the ISV solutions that run on top. Within PwC, these relationships are supported by joint business relationships and alliance groups with front-office, middle-office and back-office players, as well as the cloud service providers (CSPs) that go to market with PwC as part of the Journeys model. PwC is being selective about the vendors and technologies it recommends, focusing on leading providers like Amazon Web Services (AWS) (Nasdaq: AMZN) and Microsoft to both offer the most widely used solutions and simplify its alliances.
      • By combining the IaaS and SaaS capabilities of alliances with its own products and accelerators, PwC enables integration points in a platform-like approach. While not a PaaS offering in itself, PwC’s Common Cloud Services Platform, which targets custom Journeys for a specific industry in an end-to-end fashion, should create a high degree of stickiness.
      • PwC is emulating some best practices of its alliances, including the leading CSPs and ERP vendors. Further, some of ServiceNow’s success stems from selective innovation and deciding early on where it wishes to develop versus leveraging partners. PwC takes a similar approach, focusing custom development investments on whitespace markets while layering the capabilities of its partners on top of new solutions.
      • One of the most notable obstacles facing PwC is a degree of competitive overlap between PwC and cloud vendors it has collaborated with that are similarly working with industry consortiums to stitch together end-to-end systems. Where PwC stands to benefit in this regard is through its roots as a services firm; unlike some of the product-first competitors overlapping with the Industry Cloud strategy, PwC is going to market first with tech-enabled services that can then get clients exposed to products.

Traditional designations are morphing as value moves to IP development and managed services

In the traditional IT partner model, the business models of partners — such as reseller, systems integrator and ISV — were used to segment partner programs. Cloud has disrupted the traditional model, with born-in-the-cloud partners competing in various activities to optimize their revenue streams and traditional partners expanding their business models to sustain their financials.

As a result, resellers can develop their own solutions and IP, while systems integrators sell and resell their own software solutions and ISVs offer their own managed services. It is common for partners to have multiple business models, making the traditional designations too restrictive.

The other area of strong demand from customers, driving enhanced focus from cloud vendors, is in managed services. Increased cloud adoption has led to higher cloud complexity for many customers, leading to more challenging tasks to provide ongoing administration, integration and operations of the environment. This increasing complexity coincides with a historic shortage of personnel with cloud expertise, driving demand for managed service offerings from third-party providers to fill the gap. As a result, we expect managed services to be the fastest-growing segment of the cloud professional services market, reaching $75 billion by 2026.

Cloud vendors like AWS, Google (Nasdaq: GOOGL) and Microsoft have a vested interest in nurturing their managed service ecosystems to facilitate new investments from their cloud customers. Considering these trends and the likely erosion of legacy services lines by software and managed services, it is critical for consulting-led firms to diversify with serviceable assets that go beyond the underlying modules. While some of its Big Four competitors are similarly recognizing this trend, PwC appears to have caught on to the fact that software and services require vastly different sales models and dedicated teams for successful execution.

With Industry Cloud, PwC serves as consultant, ISV and managed service provider

Using the term Journeys is an apt description of how PwC intends to engage with customers around these solutions. It is not just a cloud technology implementation; there is upfront design and consulting, implementation of both off-the-shelf cloud technology and custom PwC IP to align solutions to industry, and finally provision of managed services to simplify ongoing operations. That is a lot of activity, but it reflects what customers need and want from these types of implementations. It is taking PwC beyond traditional services and value propositions with clients, but it aligns with where customers and the market are heading.

While the framework for Industry Cloud is compelling, it will no doubt be a challenge to execute on the vision. Expanding beyond traditional consulting business roles and activities and maintaining cohesiveness can be challenging, but as we have seen in recent years, PwC has been quite adept at reinventing itself, so we expect the firm to overcome these challenges. Alliance management, cloud service development, packaging and pricing are all competencies being developed within PwC to execute on more Industry Cloud opportunities.

How Informatica uses the cloud to empower a data-driven enterprise

Overview

Setting the stage for what ended up being the primary theme at Informatica World 2022 — Data is your platform — Informatica CEO Amit Walia walked attendees through two emerging trends: the importance of scalable infrastructure through cloud computing, and how AI and machine learning (ML) are no longer just about automating processes but also about enabling predictive intelligence. These trends, while well recognized in theory, are more challenging for businesses to put into practice, particularly due to the proliferation of data and the number of users looking to access said data, including both technical and nontechnical personas.

Informatica’s solution to data complexity is rooted in one of the company’s core values — platform centricity — and the move to essentially replace the Intelligent Data Platform with the Intelligent Data Management Cloud (IDMC), after years of innovation and a slight disruption from COVID-19, is now taking Informatica’s approach to data management and integration to new heights. With IDMC in the cloud, Informatica is better positioned to help clients translate data into valuable insights at a level that cannot be realized on premises.

In addition to being cloud-native, IDMC is infused with AI, addressing the other emerging trend called out by Walia — the need for AI-powered intelligence. All Informatica capabilities are built on CLAIRE, an automated metadata engine that processes 32 trillion transactions per month, and tie back into IDMC. While the ROI for AI technology is still hard to justify for many businesses, another key factor in the low adoption of the technology is that many businesses are working with complex, siloed data, which means AI models could fall short and lead to inaccuracies.

CLAIRE is designed to address a range of operational, runtime and automation use cases — from auto-scaling to anomaly detection — and acts as a wrapper around IDMC to enable fully automated data management and governance processes. By bringing the power of cloud and AI into one integrated platform, Informatica uses IDMC to help customers focus on the only thing they truly own in the cloud: their data. The result of a $1 billion, six-year investment, IDMC consists of seven core modules, with its value proposition largely stemming from its modularity and the ability to allow customers to pick and choose capabilities and services based on their industry, business and use case.

Informatica expands platform capabilities, driving additional value for its comprehensive, cloud-native solution

New innovations emphasize uniting IT and business functions to improve efficiency

With IDMC, Informatica has solidified its platform approach, but as cited by various customers, the company’s ability to continually offer new capabilities is what drives additional value, by addressing more horizontal and vertical use cases in the data life cycle. Perhaps the most notable announcement at Informatica World 2022, which seemed to garner particular excitement from product leaders and customers, was the general availability of Informatica Data Loader. Jitesh Ghai, Informatica’s chief product officer, led a demo of Data Loader, which is a free, self-service tool that ingests data from over 30 out-of-the-box systems into Google Cloud’s popular data warehouse solution, BigQuery.

As part of the demo, we saw a scenario in which a marketing analyst needs access to more data to effectively run a campaign. The hypothetical analyst accesses the Integration module within IDMC to pull data from Marketo, then uses a drop-down tool to select BigQuery as the destination, loading the data in only a few steps. This integration could act as a time-saver for large organizations and speaks to the innovative ways Informatica is getting data into the hands of line-of-business teams.

At the event, Informatica also announced INFACore, which targets more technical users, such as data scientists and engineers, allowing them to clean and manage data in a single function. As a low-code plug-in for popular frameworks, such as Jupyter notebooks, INFACore is designed to improve the productivity of the technical user, but naturally this productivity trickles up to business functions. For instance, after using INFACore to cleanse data through a single function, the data scientist can publish a clean data set to the Informatica Marketplace, where other teams within an organization can access it.

Another key innovation called out in opening talks with Ghai was ModelServe, which allows users to upload, monitor and manage ML models within their Informatica data pipelines. There are many ML models in production, but businesses are still looking for ways to scale them from an operational perspective. In talks with more than one customer at the event, the common interface within IDMC came up as a value-add when attempting to scale a data team, suggesting customers are awaiting ModelServe’s general availability as it will allow users to register and manage ML models directly within IDMC.

Informatica strengthens SaaS portfolio, building in intelligence from the data model up

While Informatica’s platform capabilities get much of the market’s attention, the company also has a broad portfolio of IDMC-enabled SaaS offerings, which play a key role in the data management journey, complementing warehousing, integration and automation. As a native service within Informatica’s Master Data Management (MDM) solution, 360 applications act as a gateway for transforming customer experience in the cloud, something we saw in action through the product demo of Supplier 360 SaaS.

Through IDMC, CLAIRE recognized a defective part from a supplier of a hypothetical company, and teams were able to use Supplier 360 SaaS to identify which customers were impacted by the faulty part and automatically notify customer service so they can launch a refund program to keep customers satisfied. Informatica also released various industry and domain extensions for its 360 applications and will continue to offer new packaged offerings available in a SaaS model, providing customers more ways to onboard and manage data.

Joining the industry cloud bandwagon, Informatica verticalizes IDMC

It is no secret that industry specialization is re-emerging as a leading trend in the cloud space, as a maturing enterprise customer base demands solutions that suit its unique IT and business processes. During the event, Informatica unveiled new IDMC customizations for financial services, healthcare and life sciences. These three offerings join IDMC for Retail in Informatica’s industry cloud portfolio to further address demand for purpose-built solutions that limit the need for customization.

Findings from TBR’s Cloud Infrastructure & Platforms Customer Research continue to indicate that some enterprises are wary of industry cloud solutions, dismissing them as marketing ploys. Other enterprises, however, find them worth evaluating. For instance, in talks with a representative from a hedge fund, we found that the company initially chose a competing MDM solution because it specialized in asset management with its own specific data dictionary but was torn as it viewed Informatica’s MDM as ahead of the competition in terms of capabilities. We can expect Informatica to expand in other industries, including specific subverticals, with additional data models, custom user interfaces and data quality rules to appeal to these customers.

Continued integrations and go-to-market synergies with hyperscalers help Informatica maintain data neutrality

For a company that markets itself as the “Switzerland of data,” Informatica’s ability to make its offerings accessible across leading cloud platforms is critical. Partnering across the cloud landscape is no longer a differentiator; it is a necessity and something customers clearly value as they gravitate toward multicloud environments. During the event, Walia welcomed several partner executives both in person and virtually to discuss new joint offerings and go-to-market synergies the company is forming with cloud service providers to deliver more choice and flexibility for joint clients.

      • The ubiquity of Microsoft’s cloud portfolio allows Informatica to provide clients a unified data architecture. Informatica and Microsoft (Nasdaq: MSFT) have a well-established relationship, which at its core is focused on migrating data warehouses to the cloud but is evolving and making Informatica relevant across the Microsoft Cloud stack, including Azure, Power Platform and 365 applications. For example, Informatica is typically well known for its integration with Azure Synapse, but the company also integrates with the Dynamics 365 SaaS data model to enable Customer 360 analytics. Expanding its presence throughout the Microsoft Cloud stack, Informatica announced MDM on Azure. With this announcement, customers can deploy MDM as a SaaS offering on Azure via the Azure Marketplace, which could appeal to the large number of Microsoft shops looking to enhance their Azure Data Lakes with a feature-rich MDM solution. Both companies also launched Informatica Data Governance with Power BI, which, as highlighted by Scott Guthrie, EVP of Cloud and AI at Microsoft, brings Informatica’s data catalog scanners to Power BI, allowing customers to have a single view of their data processes from ingestion to consumption. This offering could serve as a more strategic way for customers to modernize their analytics workloads through Azure.
      • Given their respective strengths in data analytics and data management, Google Cloud and Informatica are complementary partners. The Google Cloud-Informatica relationship took a major step forward with the launch of Informatica Data Loader, which could expand client usage of BigQuery and help Google Cloud (Nasdaq: GOOGL) address a wider set of customer needs, including those outside the IT department. In TBR’s own discussions with enterprise buyers, BigQuery is often cited as a leading solution due to its ability to handle petabytes of data at a favorable price point. Walia reaffirmed this notion in discussions with two customers, ADT and Telus, both of which are migrating legacy data warehouses and/or front-end ETL (extract, transform, load) capabilities into their BigQuery instances and using IDMC for cloud-based data management.
      • Oracle awards Informatica preferred partner status for data integration. Informatica and Oracle (NYSE: ORCL) struck a new partnership agreement that offers IDMC on Oracle Cloud Infrastructure (OCI). Addressing the large number of customers running legacy Oracle databases, and potentially those also deploying on-premises Informatica products, IDMC on OCI provides customers an integrated gateway to the cloud by enabling back-end connections with Oracle Autonomous Database, Exadata Database Service and OCI Object Storage. For example, with IDMC on OCI, customers can import data from legacy Oracle E-Business Suite applications into Autonomous Database and connect to other data sources, such as Azure SQL or Amazon Redshift, through IDMC. As a preferred Oracle partner, Informatica will recommend customers use IDMC with Oracle’s cloud services. Oracle’s EVP of database server technologies, Andy Mendelsohn, walked through numerous incentives to assist customers’ cloud migrations, such as Bring Your Own License, Informatica Migration Factory and Oracle Cloud Lift Services.

Informatica also has close relationships with Amazon Web Services (AWS) (Nasdaq: AMZN), Snowflake (NYSE: SNOW) and Databricks, all of which are expanding their commitments to Informatica to help customers look beyond ETL and handle data in an end-to-end fashion. Given Informatica offers analytics, integration, automation, governance and management capabilities across leading clouds, naturally the company runs up against a high degree of competitive overlap with its partners, which offer similar native tooling as part of a customer’s environment.

However, in talks with customers, the general perception seems to be that the hyperscalers’ capabilities are still relatively immature and that there is also significant value in deploying a vendor-neutral platform like IDMC to avoid vendor lock-in and address the training and skill challenges typically associated with a multicloud environment. While we can expect the hyperscalers to enhance their capabilities, at the end of the day, the primary goal for AWS, Microsoft and Google Cloud is to win compute, so the benefits of partnering with Informatica to capture legacy platform-layer workloads outweigh the downsides of coopetition.

Conclusion

With IDMC, Informatica has built a value proposition centered on three core areas: platform-centricity, connecting IT and business ecosystems, and infrastructure agnosticism. The numerous announcements made at Informatica World 2022 show the data management company is building on these strategic pillars by better aligning with cutting-edge trends in the cloud industry, such as industry customization, out-of-the-box integrations and data democratization. With these enhancements in place, along with close partnerships across the IaaS ecosystem, Informatica is positioning itself favorably to assist clients with the large number of on-premises workloads ready to be migrated and modernized in the cloud while enabling the cloud-native enterprise to transition from digital to data-driven.

Drawing on its partner network and Red Hat’s open posture, IBM enables full-stack transformation

TBR attended IBM Think in a virtual format for the third consecutive year, and this time around we sensed a new IBM. No longer beholden to its low-margin managed infrastructure services business, IBM is emerging as a more agile, streamlined and focused organization, especially as it looks to lead the digital revolution through two overarching areas: getting customers to embrace a hybrid architecture and helping them unlock data-driven insights through AI.

This strategic pivot was driven home not only by high-level executives, including CEO Arvind Krishna himself in an exclusive Q&A session with the analyst community, but also through the various partnership announcements, service launches and upskilling programs unveiled over the course of the interactive, two-day event.

Through Red Hat, Software and Consulting, IBM has created an end-to-end approach to unlocking hybrid cloud’s value

Closing in on the three-year anniversary of its acquisition of Red Hat, IBM (NYSE: IBM) continues to execute on its hybrid cloud vision, offering the services and software needed to integrate and orchestrate enterprise workloads across multiple environments. With the exception of some mono cloud and data center-only customers, enterprises are largely heterogeneous in how they consume IT, drawing on multiple architectures, vendors and environments.

Considering IBM’s large legacy software install base and ties to the mainframe, this trend bodes well for the company as it can leverage Red Hat OpenShift — which now has roughly four times the number of customers it had prior to the acquisition — to unlock siloed data and extend it to any public cloud. The challenge, however, as articulated by Roger Premo, general manager, corporate development and strategy, is that getting greenfield applications to the cloud is only Step 1 in achieving a scalable hybrid cloud framework, yet the amount of time, level of skills needed and executive-level pushback are some of the factors that keep enterprises from expanding on their lift-and-shift investments.

 

Hoping to advance customers through the containerization, operational change and replatforming phases of hybrid cloud adoption, IBM is revamping its go-to-market model, closely aligning the Software and Consulting business units to address customer needs end to end. For instance, IBM Consulting is invested in the technology behind IBM’s hybrid cloud and AI vision, providing clients the tools needed to provision their own hybrid environments, which, as phases of adoption become more complicated, will naturally pull in more automation, observability and AI assets, as well as additional advisory assistance to help determine which clouds are best suited to which workloads.

Specifically, Premo highlighted the data fabric, which has grown synonymous with IBM Cloud Pak for Data, as one of the technology pieces underpinning IBM Consulting’s value proposition for building and modernizing applications in a hybrid cloud environment. While IBM is still committed to supporting legacy data warehouses and on-premises databases, the company is likely encouraging customers to adopt the data fabric for integrated capabilities that help simplify data management, such as cataloging and automated governance. Essentially an ecosystem of data powered by active metadata, IBM’s data fabric allows various AI offerings, from decision intelligence to machine learning, to run in any environment, while maintaining a common, governed framework.

IBM’s partner strategy continues to evolve post-Red Hat

IBM has always prided itself on having a broad partner ecosystem but appears to be taking a page out of Red Hat’s playbook by creating a more open position in how it goes to market. For instance, as a full-stack vendor specializing in infrastructure, platform software and professional services, IBM naturally runs up against competition in many areas but appears more willing to risk coopetition to do what is in the best interests of the customer.

TBR notes this is a stark contrast to the SoftLayer days, when IBM seemed more concerned with protecting its direct business interests. Today, Big Blue is absorbing more of Red Hat’s operational best practices and is investing in dedicated teams across the ecosystem, including niche ISVs, hyperscalers, global systems integrators (GSIs), advisory firms and monolithic SaaS companies. At the same time, preserving Red Hat’s independence remains equally important, and as Premo indicates, the relationship between IBM and Red Hat is asymmetrical in that IBM is biased toward Red Hat but Red Hat is not biased toward IBM.

 

IBM inks strategic partner agreement with AWS to scale ‘as a Service’ software

In one of the more newsworthy announcements at IBM Think Digital 2022, IBM unveiled it is working with Amazon Web Services (AWS) (Nasdaq: AMZN) as part of a multiyear agreement that brings the IBM Software portfolio, delivered “as a Service,” to AWS’ cloud infrastructure. Customers can now take advantage of the popular click-to-buy experience on the AWS Marketplace to run IBM data and automation assets, including Db2, API Connect and Watson Orchestrate, among others, in an AWS environment. This partnership announcement is a testament to the major strategy shift IBM made three years ago when it acquired Red Hat and standardized on the OpenShift platform, which, being based on Linux and containers, makes the platform and subsequent IBM software applicable on any infrastructure, including AWS.

This platform approach is also providing IBM the flexibility to adapt alongside changing customer buying habits, including a shift toward cloud managed services, which is the fastest-growing usage of OpenShift and prompted the launch of Red Hat OpenShift on AWS (ROSA) at last year’s Red Hat Summit. Customers looking to offload operations to site reliability engineers (SREs) will be able to deploy IBM SaaS offerings integrated with ROSA as a managed service, although IBM is continuing to support customers looking to protect their capex investments as there are over 30 IBM licensed software offerings available on the AWS Marketplace. Expanding service availability is only one part of the partner agreement as IBM indicates it will work with AWS in other areas, including co-selling and co-marketing initiatives that could engage AWS sales teams and help IBM further tap into AWS’ expansive customer base.

 

Strategically, IBM is staying the course, leveraging Red Hat’s neutral status and integrations with hyperscalers to sell more software and attached services. Offering IBM SaaS on AWS is a strategic move as it will allow IBM to address customers that have years of experience running IBM software but want the scale of AWS’ cloud infrastructure, which TBR interprets as IBM prioritizing partner clouds at the expense of its own so it can focus solely on OpenShift and Software. Further, as IBM looks to grow its software business, particularly through the monetization of “as a Service” models built on OpenShift, leveraging partner marketplaces will be key, especially considering IBM lacks marketplace capabilities at scale and IT procurement continues to rally around the digital catalogs of AWS, Microsoft (Nasdaq: MSFT) and Google Cloud (Nasdaq: GOOGL).

 

Use of RISE with SAP internally aligns with IBM’s vision to bring legacy ERP to the hybrid cloud

IBM joined the roster of 1,000-plus RISE with SAP customers, announcing it is migrating to SAP Business Suite 4 HANA (S/4HANA) to streamline business operations across its Software, Infrastructure and Consulting units. This announcement comes just months after IBM unveiled a new supplier option via the BREAKTHROUGH with IBM for RISE with SAP program, which enables customers to bundle professional services with IBM IaaS offerings as part of a unified contract and set of service-level agreements (SLAs).

IBM’s new migration project will leverage the premium supplier option and bring over 375 terabytes of on-premises data to IBM Power on Red Hat Enterprise Linux (RHEL) on IBM Cloud. While IBM is partnering with GSIs in many areas, SAP (NYSE: SAP) implementations are likely one of the areas where competition is fiercest between IBM and its peers, especially as the end-of-life deadline for legacy SAP R/3 approaches. However, the premium supplier option paired with IBM’s over 38,000 trained SAP consultants could help the company better tap into SAP’s base of over 30,000 on-premises ERP customers and challenge the likes of Accenture (NYSE: ACN) and Deloitte.

One of tech’s largest acquisitions will place VMware as strategic and financial centerpiece of Broadcom Software

Broadcom will position VMware at forefront of its software strategy

On May 26, Broadcom (Nasdaq: AVGO) agreed to purchase VMware (NYSE: VMW) at an enterprise value of $69 billion, making it one of the largest tech acquisitions in history. While Broadcom is no stranger to software acquisitions, this transaction will be its most transformative as VMware becomes both the brand and growth driver behind Broadcom Software. If the transaction closes, the new Broadcom will find itself evenly balanced between its semiconductor and infrastructure software businesses. After market close on the day of the announcement, investors on each side of the transaction viewed the proposed deal favorably, signaling shareholders’ confidence in management’s ability to use past experiences to generate free cash flow through the integration of the two companies, bolstered by VMware’s cost structure and pervasive role in enterprise IT.

Should the deal close, VMware will be led by Broadcom Software Group’s current president, Tom Krause, who has a financial background and will report to Broadcom CEO Hock Tan. As with past acquisitions, Broadcom’s primary goal will be to improve profitability through cost synergies, mostly related to redundant headcount. While margins will certainly benefit, VMware’s innovative agenda, spearheaded by Pat Gelsinger and since adopted by current CEO Raghu Raghuram, hangs in the balance, with the outcome dependent upon Broadcom’s desire to drive synergies with VMware in both R&D and go to market. If Broadcom’s acquisitions of CA Technologies and Symantec are any indication, VMware’s future in the cloud and at the edge may be muted. But it is still early days, and commentary from Broadcom management suggests a different course of action relative to past acquisitions with a strong intent to invest in VMware’s core software-defined data center (SDDC) stack.

A deal could bring VMware back to its data center roots

Since the 2016 launch of VMware Cloud Foundation (VCF), VMware has insisted on making its trusted virtualization software relevant beyond data center walls by delivering native, turnkey solutions with all major cloud service providers (CSPs). The rise of cloud-native development through containers and Kubernetes has presented VMware customers with an alternate route to the public cloud, but the 2019 acquisition of Pivotal and resulting Tanzu portfolio — while still built and delivered via ESXi — allowed VMware to position as a complement to containers, rather than a competitive threat.

Often still defined as the company that pioneered enterprise virtualization, VMware has proven its ability to adapt over the past two decades alongside market trends, including cloud computing and containerization, both of which have accelerated VMware’s transition to a Subscription & SaaS company, with related revenue comprising 29% of total business in 1Q22. Broadcom plans to upsell Subscription & SaaS alternatives to legacy customers, including those demanding “as a Service” software inside the data center.

However, given that the growth in Broadcom’s software business stems from mainframe customers, we cannot help but wonder if VMware’s push to the cloud will be stalled should the deal close. From a cost perspective, customers may be less incentivized to move their VMware workloads to the cloud, and instead could containerize applications to avoid incurring the cost of VMware or could simply keep their VMware applications on premises, which would erode some cross-selling opportunities for Broadcom. Further, given Broadcom’s focus on revenue-rich products, we can expect reduced focus on the Tanzu initiative, which could bring VMware further back to its data center roots and, in a worst-case scenario, put it back at war with the hyperscalers, as was seen in the early days of EMC.

With VMware’s success hinging on partners, Broadcom cannot afford to decelerate partner investment

Historically, Broadcom’s corporate sales model has been largely direct, but considering the scale of VMware’s partner network, the pivot toward indirect sales motions is inevitable, especially as Broadcom looks to build out a $20 billion software enterprise. Management indicated it will sell directly into 1,500 core accounts while likely providing hands-on professional and support services to these customers, which Broadcom chalks up to a simplification of its overall business model. This suggests, however, that there will be over 300,000 vSphere adopters still left in the hands of partners — and given Broadcom’s lack of comparative experience navigating channel relationships, the company will be most successful if it lets VMware go to market independently while preserving its relationships with strategic resellers, especially Dell Technologies, which is responsible for roughly one-third of VMware’s revenue.

Further, despite a thin R&D budget, Broadcom will still deliver new product integrations with VMware, which could present opportunities for distributors, VARs and potentially ISVs looking to integrate and package their solutions with VMware and Broadcom. However, management has been unclear regarding acquisition synergies, suggesting opportunities could be minimal, and except for some OEMs potentially hoping Broadcom will help level the playing field, partners are likely concerned.

This is particularly true as, prior to the announcement, VMware was in the middle of overhauling its partner program, pledging to improve co-selling motions between direct sales teams and VARs, in addition to investments in digital and automation technologies designed to lower implementation costs and improve partner profitability. With Broadcom’s cost structure in place, investments in VMware resources and training programs for partners could decrease, which, when combined with the higher prices we can expect for VMware products, will present a challenge for partners across the spectrum.

For Broadcom, it is all about profitability

The proposed acquisition can be viewed as another one of Broadcom’s attempts to diversify its hardware portfolio through high-margin software, and with VMware, Broadcom will use redundant costs and license prices as levers for margin expansion. Profit growth will have to come in the form of cost consolidation as VMware’s top line will decelerate, especially as profitable software maintenance revenue streams erode as customers transition from licenses to subscriptions. For context, in 2021 VMware’s SG&A costs accounted for 40% of revenue, a high percentage relative to peers, leaving room for Broadcom to offload redundant resources, particularly in back-office positions.

Meanwhile, as Broadcom prioritizes margins at the expense of top-line growth, at least in the near term, we can expect the sales and marketing line to be impacted, with Broadcom making use of its existing sales teams and channel distribution partners to sell into existing strategic accounts. R&D is perhaps the biggest question mark weighing on the pro forma company, which we expect will require a minimum 15% reduction in spend to meet EBITDA targets, when applying the S&M and G&A estimates shown in Figure 1. The R&D budget will undoubtedly be cut, but the degree depends on the level of “central engineering” synergies Broadcom is willing to form with VMware to deliver new products, with at least basic CI/CD (continuous integration/continuous delivery) procedures in place.

Leveraging VMware’s relationships with the cloud providers, specifically Amazon Web Services (AWS) (Nasdaq: AMZN), it is possible new product synergies could be formed without significant R&D investment. However, doing so will still require a commitment from Broadcom to invest in the VMware portfolio beyond SDDC, which does not appear to be on the company’s radar. This stance could also impact existing offerings like SASE and Project Monterey, even though the latter aligns with Broadcom’s gradual shift away from x86 architectures. This is especially true as Broadcom figures out where its existing software portfolio, which already has plays in security, infrastructure management and FC SAN (Fibre Channel storage area network), overlaps with VMware’s.

Broadcom Software acquires VMware

At the end of the day, cost actions will run through the income statement over the next three years in a way that gets Broadcom to $8.5 billion in pro forma adjusted EBITDA. Currently estimated at $4.7 billion for FY22, Broadcom would need to grow adjusted EBITDA by a 22% CAGR to achieve this goal, resulting in a drastic operational change for VMware and potentially a loss of momentum outside vSphere, vSAN, NSX and the vRealize suite, which may not have an impact on near-term results but certainly risks VMware’s long-term attractiveness.
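The 22% figure follows directly from the two EBITDA data points cited; a quick check of the arithmetic:

```python
# Implied CAGR from the figures cited: $4.7B FY22 adjusted EBITDA
# growing to the $8.5B pro forma target over a three-year horizon.
start, target, years = 4.7, 8.5, 3
cagr = (target / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 22% per year
```

Compounding 4.7 by about 22% for three years lands at roughly 8.5, confirming how aggressive the target is relative to VMware's historical mid-single-digit growth.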

Rival bid seems unlikely despite go-shop provision

While the premium pledged by Broadcom in its bid for VMware is likely to ward off most, if not all, potential rival bids, the current agreement contains a 40-day go-shop provision that allows VMware to explore other buyers. Ultimately, any potential bidder would need to have a significant amount of capital ready to be utilized and be willing to push VMware’s valuation further. Given their respective sizes, a hyperscaler is the most likely candidate, with AWS top of mind considering its strategic reseller and product alliance with VMware.

However, TBR believes this is still unlikely, and if any of the cloud providers were to buy VMware, it would be widely perceived as an attempt to buy IaaS revenue. Further, we believe that the cloud providers, while some are more prone to locking in customers than others, generally respect VMware’s neutral position in the market and are cognizant of the fact that owning VMware could create a host of challenges for customers. It is also plausible some of the hardware vendors would like to get in on the deal, but OEMs could be skeptical following last year’s spinoff by Dell Technologies.

TBR takeaway

Considering Broadcom’s aggressive profit targets and previous history running software businesses, customers, partners and employees appear to share mutual concern regarding what will become of Broadcom Software should the deal close. With cost reductions bound to occur across business functions, including R&D, lack of investment raises questions as to how VMware will remain competitive in markets beyond traditional virtualization.

However, Broadcom management has also indicated that VMware will not operate like Symantec and CA Technologies, given its unique market position — and if R&D investment materializes to drive new product synergies, VMware could at a minimum maintain its trajectory of mid-single-digit growth. VMware’s well-established relationships with channel partners will also help Broadcom establish a large software empire, but this will be contingent on the company’s willingness to invest in less profitable yet emerging business units, with the final decision coming down to whether management believes the initiative will be accretive to free cash flow.


Past is prologue for EY and the blockchain ecosystem

Gathering in person again for the first time since 2019, EY hosted around 200 blockchain enthusiasts for a full day of presentations, panel discussions and deep dives into the technologies, business use cases and ongoing challenges around the entire blockchain ecosystem, from cryptocurrencies to decentralized autonomous organizations (DAOs) to smart contracts. TBR attended EY Blockchain Summit both in person and virtually and spoke with EY leaders, EY clients, and entrepreneurs using the event to better understand blockchain. Following the in-person event, EY held virtual sessions for three additional days, tailored to practitioners and focused on specific use cases and technologies.

Evolving public blockchain for the masses to enterprise-ready solutions positions EY among the key ecosystem enablers

At every EY Blockchain Summit, TBR has been bowled over by the vision, clarity and passion EY brings and the diverse perspectives and commercial opportunities discussed both freely and critically. No good idea goes unspoken, and no questionable idea passes unscathed. In all these aspects, the May 17 summit in New York City — a welcome return to in-person gatherings — echoed previous summits, including an opening presentation by Paul Brody, EY’s unique blockchain proselytizer (and the firm’s global blockchain leader).

The overarching theme, in contrast to past events, centered on unlocking enterprise use cases, with EY facilitating adoption and adequately addressing privacy on the public blockchain. While last year’s summit featured extensive examinations of cryptocurrencies, central bank digital currencies and decentralized finance (DeFi), Brody and EY kept this year’s focus on getting to scaled adoption of blockchain such that blockchains do for business ecosystems what ERP did for the enterprise.

Numerous presenters and panelists took the discussion far afield, into questions such as the future of the dollar and the value of decentralized autonomous organizations, but Brody and his EY colleagues consistently presented a firm with the right strategy, investments, tool sets, alliances and leadership to act as a good shepherd for blockchain, advising clients on adoption and helping to shape a sustained push to Ethereum as the dominant ecosystem platform.

In TBR’s view, unrestrained passion for blockchain, bolstered by R&D investments (see below) and combined with a Big Four mentality around risk, compliance and consulting for large-scale enterprises, will continue to differentiate EY from peers, a separation that will become financially significant should Brody’s optimistic projections for blockchain’s revenue potential play out.

EY plus Polygon Nightfall makes Ethereum enterprise ready

Brody’s opening monologue covered the vast blockchain space, including three “killer apps,” cryptocurrencies, DeFi and DAOs, and predicted exponential growth for blockchain over the next 15 years. He hammered home the dominance of the Ethereum platform, which he described as “demonstrating all the process maturity you would expect from essential infrastructure.” And he described non-fungible tokens (NFTs) as one of the “most mature use cases” and heading for “mainstream adoption.” In this constantly changing space, Brody centered EY’s value on helping enterprises build, run and manage secure business processes on the Ethereum blockchain. To explain EY’s case, Brody helpfully provided his firm’s “secret plan for world domination” and its four component parts — essentially, advise, build, enable, and manage (tax included).

Circling back to a theme that has surfaced repeatedly at these blockchain summits, Brody said that EY understands enterprises will move to public blockchains when they are assured of privacy — not anonymity — and that the firm has worked to make that privacy possible through a partnership with Polygon Nightfall, a “privacy-centric Layer 2 network built on technology developed by EY teams and placed in the public domain.” TBR cannot assess the technological aspects of Polygon Nightfall, but two critical elements stand out from Brody’s presentation of it: First, EY dedicated people and money toward developing the technology, likely included as part of the firm’s planned $200 million in blockchain R&D spend, double the $100 million spent the year prior. Second, the firm released the technology into the public domain, demonstrably committing to public blockchains and EY’s role as a positive force in the ecosystem. Critically, Polygon Nightfall neatly complements EY’s existing blockchain solutions EY OpsChain and EY Blockchain Analyzer, which Brody explained the firm had expanded in the last year.

    • EY OpsChain, which notarizes documents, tokenizes assets, mints NFTs, traces raw materials and manages procurement, had a full production launch for traceability and a beta launch for application programming interface (API) services and inventory management. The latter two are critical to connecting networks and enabling the shift to smart supply chains, tying back to Brody’s suggestion that blockchain will be the ERP equivalent change agent for business networks.
    • EY Blockchain Analyzer, previously only available to EY audit clients, has been opened to non-audit clients, broadening the reach of EY blockchain software with an eye toward the 35x investment yield Brody stressed occurs as emerging technologies move into early and late majority adoption over a 15-year period. The product, which reconciles transactions, tests smart contracts and calculates capital gains, has added functionality for reviewing smart contracts and more options for testing them (see below). Users can now create, save and share custom tests.


Brody netted out the two suites as covering the essentials of every asset, business process and industry, with every transaction consisting, in his words, of “money, stuff, swap, subject to agreement.”
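Brody’s four-part framing can be read as a minimal transaction schema. As a purely illustrative sketch — the field names below are our own, not an EY data model — every transaction might be captured as:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    money: float       # the payment: amount exchanged
    stuff: str         # the asset: identifier for goods, tokens or services
    swap: str          # the exchange: what is traded for what
    agreement: str     # the governing terms the transaction is subject to

# A procurement example: cash paid for an order of parts under a contract
tx = Transaction(money=12_500.0,
                 stuff="SKU-889 (machined parts, qty 500)",
                 swap="goods-for-cash",
                 agreement="PO-2022-0417")
```

However simple, the point of the framing is that the same four slots recur across every asset, business process and industry.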

Vision, execution, results: EY’s track record in blockchain has yet to be challenged

Brody’s opening tour d’horizon highlighted the biggest blockchain trends and EY’s latest developments while also, in TBR’s view, subtly understating EY’s core value to its blockchain clients and the blockchain ecosystem. The firm’s investments include R&D and people — not just the techies capable of developing solutions like Blockchain Analyzer and the rest but also the consultants who can explain the business value and the tax, audit and risk experts who can help clients understand the effects of blockchain on their enterprise. The tool sets, which may be the most underrated but critical aspect of EY’s approach, demonstrate that EY goes beyond just hyping, advising and implementing others’ technologies and into developing its own solutions and putting the EY brand — trusted, humans at the center — behind those solutions. A yearslong effort, these tools, along with the people, institutional knowledge and stress-tested capabilities, cannot be easily replicated by competitors. In essence, EY brings consulting and trusted technology into a space littered with hype and opportunities.

We cannot help but repeat what we said one year ago: “But trust, along with translating government intentions to trackable compliance checks, will remain the last bastion of business value in an otherwise commoditized state of the technology industry as we will come to know it as more legacy players fall victim to creative destruction and Moore’s Law Economics. EY, and more specifically, Brody, has a more clear line of sight on how public blockchain networks will evolve on par with the way the public internet evolved than anyone in the technology industry today. It would be foolish to bet against them and wise to partner with them.”

TBR did note the seeming absence of at least one of EY’s traditional blockchain partners, indicating the firm’s maturity in this space may be outpacing previously strategic partners, a development TBR will watch closely over the remainder of 2022. After Brody’s opening, the next round of presentations and panels dove deeper into specific themes and challenges in the blockchain space. Everyone — academics, bitcoin bros, bankers and solarpunks — buys into Brody’s assertion that $1 in blockchain revenue today will be $36 to $40 in blockchain revenue in 15 years.

Smart contracts are proven use cases, helping EY scale up its blockchain portfolio

In addition to the morning plenary, TBR attended an afternoon session on testing the functionality of smart contracts on EY’s Blockchain Analyzer. The presentation and demonstration, led by Sam Davies, EY global blockchain platform lead and engineering manager, and Karin Flieswasser, product owner of EY Blockchain Analyzer: Smart Contract & Token Review, helped participants understand EY’s tools, beginning with the strategies and philosophies behind specific capabilities, restrictions and attributes. (Note: TBR has listened to countless product demonstrations and has rarely heard a description of the mindset going into improving a product and the very basic “why” a solution could and should be changed. This was a welcome change from the assumption everyone would know the thinking behind the technology.)

Over the course of the hour, Davies and Flieswasser demonstrated various permutations of a use case that undoubtedly resonates with administrators of smart contracts wondering, “How can I be sure this thing will work the right way?” Davies began the discussion by detailing a few smart contracts gone wrong, and Flieswasser then described how EY’s Blockchain Analyzer Smart Contract Testing and Review system could have forestalled those issues.

In recent years, blockchain clients (and potential adopters) have consistently told TBR that reluctance to adopt smart contracts begins with uncertainty about the human element, not the technology. With that in mind, two elements of Davies and Flieswasser’s presentation stood out for TBR. First, the tool itself appeared to be intuitive and user-friendly, with every option, drop-down, task and function self-explanatory — a welcome respite from the usual hyper-tech talk around blockchain. Considering people tasked with administering smart contracts are more likely to reside in procurement, supply chain management or even human resources, keeping the tech simple to use will likely accelerate adoption. Second, all of the testing and review perfectly mimic on-chain realities without actually using, compromising or changing any on-chain data.

While that should be an obvious characteristic, Flieswasser repeatedly emphasized the point — and took clarifying questions on it — leading TBR to believe this feature figures prominently in the risk management concerns of enterprise smart contract administrators. Lastly, the two presenters themselves, hailing from the U.K. and Israel, reinforced the global nature of EY’s blockchain practice, and during a post-session discussion, Flieswasser noted the Blockchain Analyzer team is relatively small and geographically diverse. In TBR’s view, smart contracts can be a readily understood blockchain use case and may be one of the quiet catalysts for enterprise ecosystems’ blockchain adoption. Making smart contracts less risky by deploying easy-to-use test and review systems will likely be a critical element to accelerating adoption.

Crypto’s hope and hype are dashed by the history of money, bolstering EY’s role as the community shepherd

If the past is also the prologue for EY innovation, then EY’s foray into smart money tied to smart contracts will likely start in the consumer space. Just as EY’s first scaled blockchain use case was assisting Microsoft with tracking developer royalty payments, this concept has test cases starting with loyalty rewards programs and consumer gaming. In this manner, smart money use cases with small-dollar impacts will not roil capital markets. If the technology works, then it can be applied to higher-value situations in both wholesale and retail financial settings.

In his talk, Brody drew a clear distinction between privacy and anonymity. One blockchain camp stresses anonymity; Brody and EY are in the privacy camp. To audit and attest business transactions to regulatory agencies, there cannot be anonymity. Privacy, however, protects information on a need-to-know basis, leaving competitors unable to garner valuable business information regarding private matters such as unit pricing and discount structures.
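The privacy-not-anonymity position can be illustrated with a simple hash commitment, the basic building block beneath more sophisticated constructions such as the zero-knowledge techniques in Nightfall. This is a deliberately simplified sketch of the concept, not EY’s or Polygon’s actual cryptography:

```python
import hashlib
import secrets

def commit(value: str, salt: bytes) -> str:
    """Hash commitment: binds the committer to `value` without revealing it."""
    return hashlib.sha256(salt + value.encode()).hexdigest()

# A supplier publishes a commitment to a unit price on a public ledger.
salt = secrets.token_bytes(16)
public_commitment = commit("unit_price=4.75", salt)

# Need-to-know disclosure: the supplier shares value + salt with an auditor,
# who can verify the claim against the public commitment.
assert commit("unit_price=4.75", salt) == public_commitment

# A competitor seeing only the commitment cannot recover the price, and a
# false claimed price fails verification.
assert commit("unit_price=3.99", salt) != public_commitment
```

The parties to the transaction remain identifiable (no anonymity), but the commercially sensitive value stays hidden from everyone without the salt (privacy).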

When it comes to the overall merit of and need for cryptocurrencies, University of Southern California (USC) professor and former U.S. Federal Reserve executive Rodney Ramcharan’s keynote provided a history lesson on the U.S. dollar, offering ample evidence of lessons learned from not having a reserve bank to backstop against runs on a currency. In this regard, fiat currencies and stablecoins tied to fiat currencies rather than to algorithms appear to provide the kind of risk mitigation that will be necessary for commerce. Crypto as a wealth store on par with gold is a different application area where risk is unquestionably higher.

In the past two iterations of TBR’s Digital Transformation Blockchain Market Landscape, we have provided some initial analysis on central bank digital currencies (CBDCs) and DeFi with a few developments worth noting, including the recently published paper by the Federal Reserve Board focused on CBDCs, in particular the digital dollar; the U.S. Securities and Exchange Commission’s approval of a Boston-based exchange — BOX Exchange — that will use blockchain for faster settlements and potentially enable the exchange of tokenized securities; and lastly President Joe Biden’s executive order ensuring the responsible development of digital assets, including CBDCs.

The U.S. government’s awareness of and initial interest in CBDCs are steps in the right direction toward recognizing the implications of digital assets for the economy and everyday consumers. However, given the complexity, particularly around reaching consensus among community participants on the governance side, we believe it will be a while before a digital U.S. dollar will be deployed at scale for everyday merchant transactions and trade. Wholesale and retail CBDCs also differ sharply in their risks, scale, speed and rewards. Connecting Main Street and Wall Street economies through blockchain is a necessary step that we believe will have a bigger, broader impact on enterprise buyers’ digital transformation (DT) initiatives. One might see such a framework as a bit of a long shot, but historically, financial services institutions have paved the way in new tech adoption.

Below is a direct quote from a CTO and blockchain executive we recently spoke with that perfectly summarizes the implications around CBDCs.

“First, you have to differentiate between wholesale and retail. So if I’m talking about wholesale, then I’m probably talking about cross-border transactions between central banks or Tier 1 banks, for example. And so those are low transaction volume but high-value transactions. So that’s very important to get that right, more than anything else. And I can’t afford to have that hack because we’re talking billions of dollars. So, again, the experiments have proven that it can be done cross-protocol. I know I’ve seen some standards proposed in this space, mostly by some folks at Bank for International Settlements.

“So they’ve done a lot of CBDC work. There’s a gentleman in Singapore who has proposed that, if you peel back the covers, he’s basically proposing everything should be on Quorum, everything should be on JPM Coin, which I don’t think that’s going to happen. But nice try, buddy. But you could maybe argue, OK, somebody like SWIFT could say, ‘OK, for international banking, at the wholesale level delivery versus payment kinds of scenarios or end-of-day netting between multiple banks, we can help you come up with a standard between the banks.’ Again, the technology will have to evolve to meet that because if you’re doing integration between the two different protocols, that’s a weak spot. That’s an attack vector for a hack right off the bat. So if I’m a hacker, I’d be looking at that kind of cross-border protocol switching, or integration play.

“Now at the retail level, let’s say we’re talking about replacing U.S. dollars, for example, with digital dollars, whatever. First of all, I’ll believe it when I see it, because the technology has to scale up to those, that level of transaction. But same thing, it could be, ‘OK, I’ve got my digital dollar, I’ve got an app on my iPhone, now I traveled to Japan, should there be an app, or should there be some bridge between the digital yen and the digital dollar?’ I think that’s decades off. If I’m a central bank in Japan, I’m going to be really, really careful about letting people plug into my letting travelers, for example, plug into my network or do conversions of a digital dollar to a digital yen, just again, for fear of the hacks, the fear of attacks. That loss of control, perhaps over the circulation of that digital yen, the only place where that might work. And now we’re really getting political here.

“But you could probably argue that the whole reason that China’s doing its digital yuan, for example, is really about social control. So they have the social scoring in China, where, OK, if [someone] talks negatively about the Communist Party, then he gets points added to this, or points deducted from a social score, however it works. But it prevents you from getting credit, for example, prevents you from getting a plane ticket, things like that.

“So they’re really trying to control behavior, social behavior with this point scoring system. And forcing everybody to use digital money really plays into that, because OK, now that [someone] has a negative score, I can block his account, I can prevent him from spending money, I can deduct money from his account, that sort of thing. To me, it seems like the digital currency in China really is just an extension of control of the population. And so maybe in that sense, like, if I go visit China, they really would want me to convert to their digital currency, because they could control it. They could see what I do, they could see where I spend it. And they could block me from accessing it if they want to. So yeah, that’s the negative side of that integration that you were talking about. OK. They would let me use their digital currency because they have ulterior motives for doing so.”

Conclusion

In-person events provide opportunities to gather insights and information not shared on a screen or on the plenary stage. Perhaps the two-year absence of live events in New York City made participants more eager to talk. From conversations with blockchain entrepreneurs, crypto-enthusiasts and EY professionals, TBR heard two common themes.

First, the skepticism around cryptocurrencies has not gone far enough given what is out there and what is coming. The current split on crypto falls along the lines of regulation versus total anonymity, with regulated, stable currencies having greater potential than the unregulated coins that have roiled capital markets of late. Further, bad actors, present in any ecosystem, would be shaken out if governments regulate the new instruments (history as prologue), provided total anonymity does not win out.

Second, enterprises and the blockchain providers servicing them increasingly see smart contracts as the use case most likely to scale and accelerate blockchain adoption across the enterprise ecosystem. A final nugget specific to EY made the (persuasive) argument that EY’s most successful blockchain-related engagements to date reside in the firm’s Tax and Risk practices. In TBR’s view, the fact that EY is doubling its R&D spend in blockchain yet earning the most blockchain-related revenue in its legacy practices may be the most compelling evidence of the firm’s all-in bet on blockchain.


Instantaneous interconnectivity: Inside the Department of Defense’s ambitious plan for JADC2

What is Joint All-Domain Command and Control?

Joint All-Domain Command and Control (JADC2) is an evolving Department of Defense (DOD) vision to revamp the Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) programs currently in use across all U.S. military branches. The infrastructures in place at individual branches, such as the U.S. Army, are largely unable to interoperate seamlessly with the networks of other branches, such as the U.S. Space Force. Additionally, these infrastructures do not meet the DOD’s requirements to handle rapidly evolving and highly complex new-age battlefield situations that require urgent, coordinated responses from U.S. armed forces.


JADC2 is an effort to rectify these dilemmas by creating a cloudlike environment that enables the rapid receipt and transmission of intelligence, surveillance and reconnaissance (ISR) data across interconnected networks. A unified network that lets sensors on Internet of Military Things (IoMT) devices instantly pass mission-critical information to leaders makes more informed and coordinated decision making possible across the U.S. military’s branches. Decision makers can act faster and establish more cohesive battlefield tactics, factoring in land, sea and air threats with additional support from one another’s assets, as this common operating picture (COP) is immediately relayed to the relevant parties with machine learning (ML) and AI support.


Vendors covered in TBR’s series of Public Sector and Mission Systems reports have been increasingly involved in JADC2, which provides a sizable opportunity for vendors with expertise in these areas.

What will be needed to enable JADC2?

In March, the Pentagon published its official JADC2 strategy, which included five “lines of effort” that the JADC2 Cross-Functional Team (CFT) will work on to bring the DOD’s vision closer to reality. The first goal is to set up a uniform “data enterprise,” which includes creating guidelines for baseline metadata tagging. Next, the JADC2 CFT will leverage digital tools like AI to support decision makers and engage in efforts to advance integral technology. The Space Development Agency (SDA) will then establish a network that enables communication across branches and weave nuclear command, control and communications (NC3) systems into the overarching JADC2 program. Lastly, the DOD will strive to better connect mission partners by streamlining the exchange of data.
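The “data enterprise” goal of baseline metadata tagging amounts to wrapping every piece of sensor data in a common, discoverable envelope that any branch’s systems can parse. A minimal sketch follows; the field names and schema here are illustrative assumptions, not the DOD’s actual tagging standard:

```python
import json
from datetime import datetime, timezone

def tag_sensor_reading(payload: dict, source: str, domain: str,
                       classification: str) -> str:
    """Wrap a raw sensor payload in a baseline metadata envelope so
    consumers in any service branch can discover and interpret it."""
    record = {
        "metadata": {
            "source": source,
            "domain": domain,              # e.g., land, sea, air, space, cyber
            "classification": classification,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "schema_version": "1.0",
        },
        "payload": payload,
    }
    return json.dumps(record)

# An air-domain track report, tagged for cross-branch consumption
msg = tag_sensor_reading({"track_id": "T-042", "lat": 36.1, "lon": -115.2},
                         source="uav-7", domain="air",
                         classification="UNCLASSIFIED")
parsed = json.loads(msg)
```

The value of uniform tagging is that a consumer can filter, route and fuse readings by metadata alone, without knowing each sensor’s payload format in advance.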


This lofty goal of rapidly parsing relevant data from battlefield situations and enabling decision makers to be more agile will require a lot of support. For example, DevSecOps teams will build out customizable capabilities for JADC2 based on a department’s needs. The electromagnetic battle management system (EMBM), a core piece of the DOD’s vision, will be developed through DevSecOps and will aid branches of the U.S. military, such as the U.S. Air Force, with tasks like identifying and connecting electromagnetic spectrum data. Advancing AI technology will also be critical to JADC2’s success and will require contractors to increasingly expand their capabilities.

For example, Booz Allen Hamilton (NYSE: BAH) has been positioning itself to capitalize on AI and analytics demand since 2018 with a series of inorganic and organic investments. TBR anticipates Booz Allen Hamilton will play a key role in helping to produce new tactical support systems leveraging AI and familiarize warfighters with newer technologies like directed energy weapons. Additionally, Peraton Labs has been building out its Operational Spectrum Comprehension, Analytics and Response (OSCAR) solution, which will bolster the DOD’s efforts to bring interoperability across the nation’s military branches by leveraging AI as well as 5G technologies.


JADC2 will also require an anti-fragile cloud environment underpinned by 5G technology, which is where military contractors like Lockheed Martin (NYSE: LMT) and Northrop Grumman (NYSE: NOC) have been looking to capitalize. In November 2021 Lockheed Martin formed an alliance with Verizon (NYSE: VZ) to enable interoperability among legacy networks and devices already in use as part of the contractor’s efforts to provide 5G connectivity through its 5G.MIL unified infrastructure. Lockheed Martin has since expanded its partner network to include Keysight Technologies (Nasdaq: KEYS), Microsoft (Nasdaq: MSFT), Intel (Nasdaq: INTC) and Omnispace to assist with 5G.MIL, streamlining network communications for both IP and non-IP users.

Meanwhile Northrop Grumman formed an alliance with AT&T (NYSE: T) in April to analyze digital battle networks and integrate Northrop Grumman’s systems with 5G commercial capabilities and AT&T’s 5G private networks to establish a scalable open architecture for the DOD. To do this at the scale the DOD wants, Lockheed Martin and Northrop Grumman will need to build out their partner networks among startups and fringe players while continuing to build out relationships with major names like Verizon and Microsoft.


The military/DOD will increasingly require IT assistance to underpin the JADC2 initiative. While the military’s outsourcing efforts will certainly play a part in bringing JADC2 closer to fruition, the branches are expected to bring on more IT workers of their own and invest in systems integration as well as methods to educate these employees and retain them to help build, maintain and troubleshoot applications.


Currently, the military branches are working on their own programs compatible with the DOD’s JADC2 vision. For example, the U.S. Air Force is developing its Advanced Battle Management System (ABMS), which has undergone periodic testing in public since December 2019. Recent efforts indicate the U.S. Air Force is trying to fit KC-46 Pegasus tanker aircraft with pods linking F-22 aircraft and other solutions on the ABMS network, which would allow more information to be exchanged. Meanwhile, the U.S. Navy has been working on Project Overwatch while the U.S. Army has been expanding Project Convergence to include additional features that will contribute to its success. For example, the Army’s FIRESTORM system leverages AI that scans relevant points with sensors, maps out a digital battleground, tags hostiles and selects the optimal weapon for the circumstances.

What are the fears surrounding JADC2?

While JADC2 has a lot of potential, there are several concerns with the DOD’s vision, beyond just getting these systems to communicate through one language.

Security

Fears about JADC2’s adaptability and resiliency are prevalent, particularly because China and other countries have invested in disruptive technologies like an anti-access/area denial (A2/AD) conflict deterrence system that could impede JADC2 and other communication networks’ functions. There has been very little discussion about how JADC2 would combat these disruptions or function in these contested environments outside of test settings when facing the brunt of foreign adversaries’ disruptive technologies. The DOD will need to ensure it can generate as much relevant information as possible from a limited number of sensors while maintaining undetectable networks capable of surviving enemies’ efforts to degrade or disrupt the relaying of information.

Design

Accenture (NYSE: ACN) Federal Services Managing Director Bill Marion emphasized that human-centered design will be necessary throughout JADC2’s framework to ensure that warfighters and decision makers can easily navigate these interconnected networks and learn about all of their capabilities to maximize their use.

Affordability

Targeted internal investments are necessary to implement JADC2. Companies like Raytheon Intelligence & Space of Raytheon Technologies (NYSE: RTX) will need to develop and connect new IT infrastructure and update legacy systems to ensure they are compatible with JADC2 using a cost-effective approach. Simultaneously, affordable and functioning multilevel cybersecurity solutions that can support the DOD’s desired instantaneous relaying of data and commands will be needed. Currently, there are concerns about enemies being able to hack into the MIL-STD-1553 serial data buses found in IoMT weapon systems. External parties might be able to breach the 1553 data bus and either shut down or actively use these connected armaments against U.S. personnel.

Contractors will need to find ways to protect the 1553 data bus from these threats, and Peraton Labs is already collaborating with military branches to establish Bus Defender capabilities. With the DOD looking to interconnect IT systems across all military branches, TBR anticipates that General Dynamics (NYSE: GD) Technologies is aiming to be the DOD’s preferred IT vendor by utilizing Agile methods to expedite the construction of tailored prototypes after first consulting with clients and showcasing the contractor’s base zero-trust solutions.

Ultimately, the journey to JADC2’s implementation will be long and complex. The DOD’s ambitious project will face an ever-shifting road, as there is no true endpoint: Key components like hardware will need to be updated, policies will be amended, and the scope of JADC2 will grow, especially as the U.S. eyes getting allies involved in the future to establish a more unified cloudlike environment capable of streamlining the transference of data among all participating nations. If all goes well, the U.S. will be able to truly integrate its military branches, allowing them to overwhelm adversaries by using mission-critical data to make better, more informed and coordinated tactical decisions. The U.S. will aim to control the next-generation battlefield by gaining the upper hand on intelligence and rapid communication.