Capgemini’s CornerShop: Redefining physical retail shopping in the post-pandemic environment

Evolving the customer experience

In February Capgemini, in partnership with connected experience platform SharpEnd and global media platform The Drum, announced the CornerShop. Physically located in London, the CornerShop is a retail innovation store designed to transform shopping and customer engagement post-pandemic. The CornerShop works as a live testing environment for brands, retailers and shoppers to reimagine the shopping experience through new technologies.

Utilizing the management consulting and digital design capabilities of Capgemini Invent and insights from research and data analysis, Capgemini will enable retailers and brands to evolve the customer experience, improve in-store operations and help customers rediscover in-person shopping with new engagement options in physical stores. TBR sees CornerShop as closely aligned with Capgemini’s Inventive Shopping offering, which addresses customer-related challenges such as new ways to engage, shop and build loyalty.

Partnering for post-pandemic retail

Overall, Capgemini utilizes advisory and digital design capabilities — some of which came through acquisitions, such as those of Idean and Lyons Consulting Group in 2017 and LiquidHub in 2018 — and Salesforce Commerce Cloud capabilities to win deals.

Last November Capgemini announced the completion of a personalized and data-driven e-commerce initiative for the U.S. website of Italy-based sportswear manufacturer Fila. Capgemini provided digital marketing, e-commerce, experience design, application integration and support services, improving the client’s ability to support online sales after the onset of the pandemic and the closure of physical stores. The website, built on Salesforce Commerce Cloud, also uses Salesforce Service Cloud and MuleSoft to integrate the digital ecosystem. In July 2020 Capgemini completed the implementation of a new business-to-business face mask program for Gap Inc., creating a website and a Salesforce Commerce Cloud-enabled solution in less than five weeks to help Gap quickly add nonmedical cloth face masks to its online store offerings.

Expanding its industry-specific solutions by selecting priority industries for its portfolio development, such as consumer goods and retail, improves Capgemini’s ability to address pandemic-driven challenges specific to each business and industry segment. Consumer goods and retail, which accounted for 12% of revenue in 2020 and declined 0.1% year-to-year in constant currency, is on a path to recovery from the negative effects of the pandemic. Revenue in the segment declined 5.5% year-to-year in 2Q20 and 2.4% year-to-year in constant currency in 3Q20 but increased 4.3% year-to-year in 4Q20, indicating the challenges tied to store closures, supply chain interruptions and social distancing are easing. Transforming the shopping experience through the CornerShop will help Capgemini increase activity with retail clients as they gradually ramp up sales and reopen physical locations while the pandemic abates.

TBR’s coverage of Capgemini includes quarterly detailed reports on the company’s financial performance and strategies related to go-to-market, resources, alliances and acquisitions. Capgemini is also covered in TBR’s quarterly IT Services Vendor Benchmark and semiannual Management Consulting Benchmark, and the company is featured frequently in our monthly Digital Transformation reports. 

The state of crowdsourcing

In February TBR virtually attended the Global Technology & Business Services Council’s Global Series: Open Talent conference and heard from leaders across the technology and crowdsourcing industries about emerging themes and trends. While it is not a new phenomenon, crowdsourcing is becoming a compelling delivery model in the IT services space as enterprises increasingly embrace remote services during the pandemic and seek out new ways to fill skills gaps, drive cost savings and accelerate engagement turnaround time. Platform-based crowdsourcing companies such as Topcoder and Freelancer.com, which TBR heard from during the conference, are rapidly expanding their communities of technology-oriented freelancers and driving new use cases with large enterprises that would have traditionally gone to IT service vendors. In TBR’s view, vendors that are not embracing this shift stand to lose the most. At the same time, we question whether those that bring this model to market are really positioned to gain much.

Existential threat or just a piece of the puzzle?

Now more than ever, crowdsourcing and open talent models are proving to be significant disruptors in how services are delivered, and technology appears to be an area that can benefit the most from this trend. Prior to the Global Technology & Business Services Council (GT&BSC) event, our understanding was that organizations engaged with the aforementioned crowdsourcing platforms mostly to tap large pools of talent faster and more cheaply, and largely for low-value, task-oriented services. To our surprise, during the event TBR learned about use cases in which Fortune 500 companies and renowned research institutions turned to “the crowd” for sophisticated software and data services, which led to significant improvements in speed, cost and even quality. For instance, a coalition of several enterprises and academic institutions, including Harvard and the Massachusetts Institute of Technology (MIT), opened up a project to the Topcoder community around the optimization of a DNA sequencing algorithm, with the goal of surpassing what had been regarded as an “impossible” threshold. Dozens of submissions crossed the threshold within 24 hours. At first glance, the use of these platforms for high-value services poses a threat to IT services vendors, but TBR notes some caveats.

White-labeling labor

It is necessary to point out that the majority of IT services vendors’ activities in the crowdsourcing space happened pre-pandemic. For example, Wipro (NYSE: WIT) made a splash in the industry when it acquired Topcoder in 2016, and in the years since, other vendors such as DXC Technology (NYSE: DXC) and Deloitte have pursued partnerships that enable them to launch crowdsourcing services by tapping into Topcoder’s labor ecosystem. First, this history complicates the image of crowdsourcing platforms as pure competitors and instead reflects a more symbiotic relationship. Second, while Wipro might be able to take a small commission on engagements done through Topcoder, it lacks a significant competitive advantage over peers that partner with Topcoder or similar platforms.

Security and trust

Soliciting bids from unknown global technologists presents obvious risks. This model is not suited for workloads involving sensitive data and therefore is not gaining the same traction in industries such as financial services or healthcare, where data security and privacy are top concerns. IT services vendors that cater to this clientele will be much more capable of steering clients away from crowdsourcing services and commanding profitable revenues. Similarly, many firms’ value propositions revolve around the reputation of the company and its quality of services, helping them garner more trust-based relationships. Clients seeking this level of service will largely be uninterested in crowdsourcing. This concern will also put pressure on services providers that partner with crowdsourcing platforms, as well as the platforms themselves, to establish guardrails against potential leaks or security breaches, but it remains no easy task to vet millions of global freelancers.   

Bottom line

These platforms are intended to optimize costs and speed. While IT services vendors likely do not want to miss out on any opportunities to engage with potential clients, partnering with a crowdsourcing provider and delivering the cheapest possible services will limit margin growth. Instead, we see opportunities for vendors to embed this open talent model into their organizations to improve utilization and more efficiently deploy staff. We learned during the GT&BSC event, for example, that Deloitte partnered with Freelancer.com in 2019 to develop an internal marketplace for Deloitte’s employees to join open projects within the firm and integrated the solution into the Freelancer.com ecosystem so Deloitte could extend its pool of external resources if needed.

What the future holds

If anything has stood out during the pandemic, it is that incumbents in every industry must be prepared to quickly pivot and adjust to new and nontraditional ways of doing business. In the IT services space, TBR believes disruptions such as the gig economy and crowdsourcing pose the biggest threat to management consulting firms, where generational and technology shifts are creating instances in which enterprises may opt to collect third-party opinions and advice through these types of platforms instead of via expensive consultancies, which can be as enigmatic as unidentified, crowdsourced respondents. In general, IT services vendors that have established pathways into this type of model will benefit from bringing new logos into account ecosystems, which will provide opportunities downstream to upsell higher-value services. We anticipate crowdsourcing will continue to play a supplementary, but necessary, role in IT services as a way for companies to easily scale services at the expense of security and margins. But much like organizations’ hesitation to fully embrace the cloud for their IT ecosystems, taking a hybrid approach to “the crowd” will likely remain the preferred method for most enterprises to minimize risk while still reaping the benefits of scalability and speed that the crowdsourcing model offers.

TBR’s Professional Services practice will continue to monitor the trends outlined above and provide analysis across our syndicated vendor reports and benchmarks, notably the IT Services Vendor Benchmark and Global Delivery Benchmark. The next iterations of these two products, which synthesize TBR’s in-depth analysis and data across covered vendors, are set to publish in April.

New NTT Global Data Centers facilities in Chicago and Oregon solidify infrastructure footprint and position the vendor for continued growth

As part of parent company NTT’s July 2019 restructuring effort, a separate company called NTT Ltd. was formed, unifying 31 global brands to create a 40,000-person, $11 billion company dedicated to offering IT, cloud and colocation services to large enterprises. At the center of NTT Ltd.’s strategy is NTT Global Data Centers, a separate division that offers a portfolio of global data center assets including RagingWire (Americas), NTT Communications (APAC), e-shelter and Gyron (EMEA), and Netmagic (India). With over 160 facilities, NTT Global Data Centers is now the third-largest global data center provider.

Americas expansion to support NTT Ltd.’s growth in 2021

On Feb. 25, NTT Global Data Centers held a virtual event to unveil its two new data centers, in Hillsboro, Ore. (HI1), and Chicago (CH1). Both CH1 and HI1 are currently 36 megawatts (MW) but are expected to expand to 76MW and 126MW, respectively, to support increasingly complex IT workloads for both hyperscale and enterprise customers. NTT’s roots in telecommunications allow it to provide a broad portfolio of carrier-neutral connectivity options within each data center. Meanwhile, the company’s IT services arm is also strong, with offerings such as Remote Hands, which removes the need for on-site service and maintenance and has been in high demand during COVID-19.

The establishment of HI1 and CH1 marks the beginning of NTT Global Data Centers’ Americas expansion efforts for 2021. The company plans to open a campus in Silicon Valley, break ground in Phoenix, and expand its campus in Ashburn, Va., while the attach of various connectivity products and managed services will continue to support growth throughout the year. In a company press release, Doug Adams, CEO of NTT Global Data Centers Americas, highlighted the openings in the context of plans for the year, which he stated “will be a year like no other for our division, and opening these two new data centers is just the beginning [of] efforts that underline our commitment to put our clients at the center and bring data center services to key data center markets across the Americas.”

New data centers are strategically placed to address varied client needs

As the colocation market in the U.S. becomes increasingly crowded, NTT Global Data Centers is expanding in strategic markets to support its retail and wholesale colocation strategies and address the needs of clients regardless of size. This includes pursuing markets with access to affordable energy, low risk of natural disaster and the ability to support connections to emerging markets, among other factors.

NTT Global Data Centers supports 100% renewable energy

As technology sustainability remains a top-of-mind concern for CTOs, NTT Global Data Centers continues to operate on a message of clean energy and efficiency. Specifically, the new HI1 facility is an appealing option for customers looking to consume renewable energy, while the campus has earned a Level 3 certification from the Cleaner Air Oregon program, setting NTT Global Data Centers apart from competitors as it is the only data center in the region to receive the certification to date.

Local cost-saving opportunities support lower TCO for customers

One of the key attractions of NTT Global Data Centers’ CH1 facility is access to state and local tax incentives on equipment. Additionally, CH1 is powered by local energy company Commonwealth Edison (ComEd), which offers electricity at rates ComEd states are 18% lower than the national average. These initiatives are designed to support lower total cost of ownership (TCO) for customers through cheaper electricity, sales tax exemptions and lower cooling requirements.
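
To make the TCO levers concrete, the back-of-the-envelope sketch below shows how a lower power rate and an equipment sales tax exemption flow into a simple colocation cost estimate. All inputs (rack count, power draw, rates and tax treatment) are hypothetical placeholders, not actual NTT Global Data Centers or ComEd pricing.

```python
# Hypothetical colocation TCO sketch: how cheaper power and a sales tax
# exemption reduce a customer's costs. All inputs are illustrative
# placeholders, not actual NTT Global Data Centers or ComEd figures.

RACKS = 50                      # leased racks
KW_PER_RACK = 8                 # average IT load per rack (kW)
PUE = 1.4                       # power usage effectiveness (cooling overhead)
HOURS_PER_YEAR = 8760

NATIONAL_RATE = 0.11            # $/kWh, hypothetical national average
LOCAL_DISCOUNT = 0.18           # 18% lower local rate, per the ComEd claim
LOCAL_RATE = NATIONAL_RATE * (1 - LOCAL_DISCOUNT)

EQUIPMENT_SPEND = 2_000_000     # one-time hardware purchase ($)
SALES_TAX = 0.0625              # hypothetical state sales tax rate

def annual_power_cost(rate_per_kwh: float) -> float:
    """Annual facility power cost, including cooling overhead via PUE."""
    it_load_kw = RACKS * KW_PER_RACK
    return it_load_kw * PUE * HOURS_PER_YEAR * rate_per_kwh

power_savings = annual_power_cost(NATIONAL_RATE) - annual_power_cost(LOCAL_RATE)
tax_savings = EQUIPMENT_SPEND * SALES_TAX  # avoided via the equipment tax incentive

print(f"Annual power savings: ${power_savings:,.0f}")
print(f"One-time tax savings: ${tax_savings:,.0f}")
```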

Subsea cabling

NTT Global Data Centers has targeted the Pacific Northwest with HI1 due to its access to subsea cables that can connect across regions. With HI1, locally housed customers have an opportunity to access strategic markets in Japan, as NTT Communications acquired Pacific Crossing for its subsea cable in 2009, supporting data communication between the U.S. and Japan. This connection allows HI1 customers to contract with NTT Global Data Centers for connectivity on a single cable, eliminating the need for multiple contracts and underscoring NTT Global Data Centers’ approach of leveraging parent company NTT to support clients. TBR believes NTT Communications’ strong foothold in Japan will boost NTT Global Data Centers’ ability to provide customers with low-latency connections between the two markets, serving as a growth driver and offering differentiation from other colocation peers.

Eyeing the future: Accenture’s fundamentals drive human-centric technology change at scale

‘Leaders Wanted — Masters of Change at a Moment of Truth’

Accenture’s (NYSE: ACN) recent virtual event to introduce its Accenture Technology Vision 2021 kicked off with a quick recap of the socioeconomic headwinds of 2020. These headwinds include four new concerns facing people personally and professionally: an increasing global population driving a need for new ways of interacting; the evolution of “Every business is a tech business” as technology’s role changes with the changing environment; the workforce of the future; and sustainability. Accenture Group Chief Executive – Technology and Chief Technology Officer Paul Daugherty then outlined in detail the five major trends of its 2021 vision.

Delivered under the slogan “Leaders Wanted — Masters of Change at a Moment of Truth,” the vision highlights five key areas, which we expect to drive investments not just from Accenture but also from peers and enterprises, given the company’s market-making status in multiple domains.

  1. Stack strategically: While this trend at its core applies to architecting and redesigning organizations’ technology stacks to support the enterprise of the future, which includes attributes from the customer experience to the security layer, it also maps to Accenture’s core value proposition of joining consultants, designers, researchers, solution architects and delivery personnel, all through the umbrella of Accenture Innovation Architecture.
  2. Mirrored world: The digital twin is resurgent and moving beyond experimental phases. In the era of COVID-19, which has led to social distancing and reduced access to physical plants, large enterprises see an opportunity to invest in IoT techniques that enable remote monitoring and control (see the sketch following this list). Accenture’s ongoing investments in mobility and IoT service offerings over the past five years, along with the recent push into product engineering offerings, largely enabled through acquisitions, will enable the company to address demand and increase client stickiness.
  3. I, technologist: The democratization of technology, which has enabled workforces to do more with less and orient their productivity to higher-value tasks largely enabled by automation, while not a new trend, has certainly reached a pivotal point, given the changes over the past 12 months in how employees perform their work. Accenture’s rigorous approach to and ongoing investments in training — including spending $1 billion per year on reskilling and upskilling personnel, with efforts most recently focused on building cloud consulting, architecting and delivery skills — enable it to drive internal change at scale, and then sell its capabilities “as a Service” to clients.
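
As a minimal illustration of the mirrored-world concept referenced in the second trend above, the sketch below models a digital twin as an in-memory mirror of a physical asset that is updated from telemetry and flags out-of-range conditions for remote operators. The asset, fields and thresholds are hypothetical; a production deployment would run on an IoT platform rather than a single script.

```python
# Minimal digital twin sketch: a software mirror of a physical asset that is
# kept current from telemetry and supports remote-monitoring logic.
# Device names, fields and thresholds are hypothetical examples.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class PumpTwin:
    asset_id: str
    temperature_c: float = 0.0
    vibration_mm_s: float = 0.0
    last_update: Optional[datetime] = None
    alerts: List[str] = field(default_factory=list)

    def apply_telemetry(self, reading: dict) -> None:
        """Update the mirrored state from one telemetry message."""
        self.temperature_c = reading.get("temperature_c", self.temperature_c)
        self.vibration_mm_s = reading.get("vibration_mm_s", self.vibration_mm_s)
        self.last_update = datetime.now(timezone.utc)
        self._check_thresholds()

    def _check_thresholds(self) -> None:
        """Simple remote-monitoring rule: flag conditions an operator should see."""
        if self.temperature_c > 85.0:
            self.alerts.append(f"{self.asset_id}: temperature high ({self.temperature_c} C)")
        if self.vibration_mm_s > 7.1:
            self.alerts.append(f"{self.asset_id}: vibration above alarm threshold")

# Usage: feed messages as they arrive from the plant (e.g., via an IoT hub).
twin = PumpTwin(asset_id="pump-007")
twin.apply_telemetry({"temperature_c": 92.3, "vibration_mm_s": 3.2})
print(twin.alerts)  # operator-facing alerts derived from the mirrored state
```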

On Feb. 17, 2021, Accenture held a one-hour virtual session introducing its Accenture Technology Vision 2021. While the format was different from previous years, the 21st iteration of the summit had a similar goal: to portray Accenture’s technology prowess and appetite for innovation and scale. Hosted by Accenture Group Chief Executive – Technology and Chief Technology Officer Paul Daugherty; Accenture Senior Managing Director and Lead – Technology Innovation and Accenture Labs Marc Carrel-Billiard; and Managing Director – Accenture Technology Vision Michael Blitz, the virtual delivery of the content was both a sign of the times and a demonstration of Accenture’s ability to coordinate, deliver and manage virtual events in collaboration with ecosystem partners — in this case, Touchcast.

Who’s there?: The rise of multienterprise business networks

Not everything about business is technology, but every business has to leverage technology everywhere

Over the last few years, executives discussed redesigning their businesses for the safe, secure and accurate flow of actionable data with as little human involvement and oversight as possible, a change Google describes as removing the “human toil” from economic activity. Business leaders called this process optimization, a process often resisted by employees, which in turn slows an organization’s digital efforts. Organizations big and small have been forced to embrace a cloud- and digital-first posture to maintain business continuity and participate in everyday economic activity. In short, these efforts are being made to maintain relevance. As a result, nontechnology-savvy executives and employees will exit the workforce at an accelerating rate over the next five years.

In this transformative period, future managers are training now in new entry-level IT jobs, even as IT services vendors and other players in the technology ecosystem complain about a shortage of STEM talent in the hiring markets. The talent that does come on board, in new roles spread across a digitally savvy enterprise, understands application interfaces, which align human interaction with technology and data platforms. By entering the business in this capacity, the incoming talent gains experience across the various elements of the business operation that executive managers require while also ensuring they are fully digitally versed for the Business of One.

Adding further complexity has been the disaggregation of business functions or value among different business entities. In technology we see this as the IP-centric elements of a business being split away from the labor, or task-centric functions. Looking at semiconductors, for example, some on Wall Street are calling for Intel to be split between the IP-laden aspect of chip design and the capital-intensive aspect of fabrication plants capable of manufacturing those designs reliably at scale. They are two businesses with entirely different rhythms and economic drivers, yet neither can thrive without the other.

The work-around to this business disaggregation taking place is to establish a network of businesses with complementary value propositions. This network is increasingly being called the multienterprise business network (MEBN). Many technology-centric firms describe this as their platform. But a platform is a stage on which something is performed, and that performance is the outcome enabled by multiple different parties. As such, viewing MEBNs from a solely technology-centric view can miss the point entirely.

As the Business of One evolves, legacy technology vendors selling on technical merits, or speeds and feeds, and selling just to IT face tremendous market pressures to pivot to selling business outcomes. Today’s reality requires understanding customers’ business objectives and speaking directly to business decision makers.

For Technology Business Research’s (TBR) Digital Practice, this necessitates taking our core value proposition of vendor-centric business analysis of technology companies across a standard technology business value chain and combining it with additional considerations about industries and the operating best practices of business ecosystems that tie back to the specific use case and the personas integral to that use case. After establishing those core frameworks, the analysis then ties back to time horizon and MEBN participant. In short, the question becomes what is in it for each MEBN participant at each stage (commonly referred to as Horizon 1, 2 and 3 in today’s frameworks) of the MEBN product road map.

To illustrate the intent here, consider the creation of an MEBN for the utilization, storage and maintenance of autonomous vehicles. Having autonomous vehicles moving about a defined geography would clearly be the Horizon 3 aspiration, which is nowhere near commercial reality today.

Horizon 1 would be delivering an immediate level of business value creation to entice the participants necessary for that Horizon 3 aspiration. For example, gas stations, mechanics and parking garages, at a minimum, will need to be recruited into the MEBN for autonomous vehicles. Later, additional services for the auto owner, such as online ordering with brick-and-mortar pickup across various nontech-centric small businesses providing localized services, could be added. Creating a buyer network in Horizon 1 for today’s cars and owners has to provide sufficient business value to enroll participants.

The capital investment in the technology infrastructure likely must come from the Horizon 3 business benefactor and be viewed as a long-term investment to facilitate the recruitment of the necessary member participants. In the end, those autonomous vehicles will need the fueling, maintenance and parking services to function and the adjacent human services of pickup and delivery to increase their utilization rates beyond a source of human transport. Yes, it requires a technology value chain as its backbone, but nontechnology participants are just as necessary to flesh it out into a thriving MEBN of buyers and sellers who may not even concern themselves with the technology underpinnings at all.
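
One way to make the horizon framework more tangible is to model it as data: each MEBN participant is recorded alongside the value it realizes at each horizon, so an analyst can ask who is already served in Horizon 1 versus who only benefits at Horizon 3. The sketch below is a hypothetical illustration loosely based on the autonomous-vehicle example above, not a TBR tool.

```python
# Sketch of an MEBN participant model: who is in the network and what value
# each participant realizes at Horizon 1, 2 and 3. Entries are hypothetical,
# loosely based on the autonomous-vehicle example above.
from dataclasses import dataclass
from enum import Enum
from typing import Dict, List

class Horizon(Enum):
    H1 = "near-term"
    H2 = "mid-term"
    H3 = "long-term"

@dataclass
class Participant:
    name: str
    role: str                              # e.g., infrastructure funder, service provider
    value_by_horizon: Dict[Horizon, str]   # what this member gets at each stage

network: List[Participant] = [
    Participant(
        name="Parking garage operator",
        role="localized service provider",
        value_by_horizon={
            Horizon.H1: "buyer network for today's cars and owners",
            Horizon.H3: "storage and charging demand from autonomous fleets",
        },
    ),
    Participant(
        name="Platform benefactor",
        role="technology infrastructure funder",
        value_by_horizon={
            Horizon.H1: "member recruitment and transaction data",
            Horizon.H3: "autonomous-vehicle utilization revenue",
        },
    ),
]

# A simple query an analyst might run: who already realizes value in Horizon 1?
h1_members = [p.name for p in network if Horizon.H1 in p.value_by_horizon]
print(h1_members)
```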

More colloquially, few singer-songwriters would have the capital necessary to build the technology assets for downloading music over the internet. But once Apple took a long view of its investment posture, it was able to build out a robust MEBN that profited many artists, disrupted traditional nontechnology businesses, and delivered value to many customers in the form of the iTunes platform, which itself has since been disrupted by streaming services such as Spotify and Pandora.

TBR’s Digital Practice remit is to take its core value proposition of discrete company business model analysis and apply it to the MEBNs by isolating the different components through a series of frameworks. In doing this, we will then be able to assess the financial impact for the different member participants across the near-term, mid-term and long-term horizons.

Industries have different automation leverage points, enabling different personas; inexpensive tech makes a myriad of use cases possible

Compute ubiquity has been well documented. The multimillion-dollar supercomputer performance of yesteryear is now contained in smartphones. The first IBM PC chip, the 8088, is now matched by CPUs the size of a grain of sand that cost $0.10 to produce. Historically, the heavily regulated industries of financial services and healthcare were early technology adopters, given the risk exposure of noncompliance with government regulation. As the cost of compute was brought down to incredibly inexpensive price points, compute expanded from those back-office functions into front offices. Today, we are at a point where, as one EY executive summed it up in an analyst interaction when peppered with multiple questions: “We can do whatever you want; you just have to make up your minds.”

Making up our minds translates into the codification of standard business results to digitize activity in a consistent way, and this sits at the heart of multiple game-changing technologies, including AI, machine learning and blockchain. These are horizontal technological capabilities that cascade through a variety of industries. Retail, long cost-conscious, was one of the later industries to adopt technology. Amazon, as we know, has disrupted this sector to the detriment of many high-profile brick-and-mortar brands of yesteryear.

TBR will use this construct to incubate standard coverage of markets, facilitating a way to bring analysis of each market to a vendor-centric view. TBR’s Digital Transformation research portfolio will serve as the vehicle to introduce these frameworks. The inaugural Digital Transformation Blockchain Market Landscape is set to publish in April 2021, and the Digital Transformation IIoT Market Landscape will be published in June 2021. These reports will follow a semiannual publication cadence.

Atos Named a Leader in the TBR Quantum Market Landscape

“Atos announced today that it has been named a Leader in Technology Business Research Inc.’s (TBR) Market Landscape for Quantum Computing. Atos was identified as a Leader for its ability to advance the exploration and development of quantum algorithms, reflecting its commitment to deliver early and concrete benefits of quantum computing by bringing forth new use cases.” — HPC Wire

Quick Quantum Quips: Vendors seek ways to increase quantum accessibility

Welcome to TBR’s monthly newsletter on the quantum computing market: Quick Quantum Quips (Q3). This market changes rapidly, and the hype can often distract from the realities of the actual technological developments. This newsletter keeps the community up to date on recent announcements while stripping away the hype around developments.

For more details, reach out to Stephanie Long or Geoff Woollacott to set up a time to chat.

February 2021 Developments

As the power of quantum computing becomes more widely understood, accelerating access to quantum technology and quantum-like capabilities has become a key focus of vendors in the industry worldwide. The COVID-19 vaccine effort has highlighted the value quantum computing can deliver in accelerating drug discovery, creation, manufacture and distribution once the technology can be fully harnessed. Additionally, direct applications of quantum computing exist in climate change, a top global concern, and sustainability, a focus of major corporations.

  1. Quantum Computing Inc. (QCI) unveiled Qatalyst, a quantum application accelerator. The aim of this software-centric offering is to enable classically trained computer scientists to leverage quantum principles and harness the power of quantum technologies for complex optimization problems, such as supply chain and delivery route optimization, by bypassing QPUs and leveraging APIs in their place. While Qatalyst is likely to accelerate near-term access to quantum computing capabilities, TBR believes advancements in other quantum computing technologies will surpass it in the long term. Qatalyst and related QPU and CPU resources are all available via the cloud and do not require on-premises resources to access.
  2. Cambridge Quantum Computing (CQC) partnered with CrownBio and JSR Life Sciences on cancer treatment research. The companies will leverage CQC’s quantum capabilities and CrownBio and JSR’s years of cancer-related research and data to identify multigene biomarkers for cancer treatment drug discovery. It is generally accepted throughout the quantum community that drug discovery will be one of the initial use cases for quantum systems that will be able to achieve economic advantage due to the costly and laborious techniques currently employed in drug discovery. Quantum computing could accelerate this process and reduce the amount of wet-lab research necessary to bring new drugs to market.
  3. IonQ is in early talks to merge with dMY Technology Group Inc. III, a publicly traded special purpose acquisition company (SPAC) created for the purpose of acquiring an existing company. The merger would enable IonQ to become a public company without going through a lengthy initial public offering. If the deal comes to fruition, IonQ would be the first U.S.-based pure play quantum computing vendor to go public through a SPAC. IonQ is also one of the leading vendors in the trapped ion quantum architecture space, and going public would provide the vendor with access to additional capital, which could accelerate its innovation efforts.
  4. D-Wave expanded the availability of its Leap quantum cloud service to Singapore, providing users in the country with real-time access to D-Wave’s Advantage quantum computer, hybrid quantum/classical solvers (illustrated in the sketch following this list), and the Quantum Application Environment (QAE).
  5. Microsoft has acknowledged the potential positive impact quantum computing could have on energy, including reducing emissions and power consumption. Further, the research enabled by quantum technologies could lead to discoveries around cleaner energy sources and more efficient electrical power systems. TBR notes there has been an industrywide increase in focus on sustainability, so while these acknowledgments by Microsoft of the environmental benefits of quantum computing are not unique, they mesh well with industrywide marketing efforts.
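
For readers curious what real-time access to Leap’s hybrid solvers looks like in practice (item 4 above), the sketch below submits a toy two-variable problem through D-Wave’s Ocean SDK. It assumes the dwave-ocean-sdk package is installed and a Leap API token is configured locally; the problem itself is a trivial placeholder, not a representative workload.

```python
# Toy example of submitting a problem to D-Wave's Leap hybrid solver via the
# Ocean SDK (assumes `pip install dwave-ocean-sdk` and a configured Leap API token).
import dimod
from dwave.system import LeapHybridSampler

# Trivial QUBO: prefer x and y not both on (placeholder problem, not a real use case).
qubo = {("x", "x"): -1.0, ("y", "y"): -1.0, ("x", "y"): 2.0}
bqm = dimod.BinaryQuadraticModel.from_qubo(qubo)

sampler = LeapHybridSampler()          # routes the problem to Leap's hybrid solver
sampleset = sampler.sample(bqm)        # executes remotely; returns classical samples

best = sampleset.first
print(best.sample, best.energy)        # lowest-energy assignment found
```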

If you would like more detailed information around the quantum computing market, please inquire about TBR’s Quantum Computing Market Landscape, a semiannual deep dive into the quantum computing market. Our latest version, published in December, focuses on the software layer of quantum systems.

Complexity and trust: EY’s evolving approach to risk

Internal risk professionals may have the best internal intelligence

Setting the stage for changes at EY and in the broader market, Frank Leenders, EY’s Digital & Innovation lead based in the Netherlands, explained that the firm helps clients “reframe the future” and focus on “trusted transformation,” which comes through six different lenses: Investor Trust, Organizational Trust, Third-party Trust, Customer Trust, Technology Trust and Regulatory Trust. Leenders added that the COVID-19 pandemic helped expose in greater detail how clients think about risk and trust and how different lines of defense can become sources of organizational intelligence.

Risk-oriented functions within clients’ organizations brought forward insights using data analytics and provided timely analysis on strengths and weaknesses, as revealed by internal responses to operational challenges created by the pandemic (and echoed by EY’s own Megatrends pandemic-related survey findings). While many clients’ digital agendas had been accelerating over the past four years, 2020 became an inflection point in understanding how using data and technology for timely insights related to risk could show not only what could go wrong but also how clients could improve their operations and enhance overall risk management. In short, internal audit and risk professionals likely have the best intelligence and insight into their own organizations — skills that are critical to running the business and optimizing opportunities during a prolonged crisis.

After walking through details on EY Resilience Edge — an AI-powered emerging risk modeling and scenario planner developed with IBM Watson and IBM Research — the EY partners described the EY VIA (Virtual Internal Audit) platform, a tool for end-to-end digitalization of the internal audit process and activities, including continuously ingesting data and developing analytics on clients’ ERP environments. EY uses the platform, which includes risk monitoring and what EY has named its Flexible Audit Response Model, not only as a tool for delivering on its internal audit engagements but also as a stand-alone Software as a Service offering. In addition to the technology tools and bespoke configurations, EY has the opportunity to provide change management consulting as clients adopt new tools and processes.

Regulatory Trust as the gateway to trusted complexity

Shifting to Regulatory Trust, which EY defines as managing “the regulatory burden with innovative frameworks that make compliance an enabler, allowing organizations to pursue sustainable pathways,” Federico Guerreri, EY’s Global Financial Services Risk leader, noted that stakeholders and customers face increased pressure to understand and evaluate an enterprise’s full ecosystem, including suppliers, particularly as the end of the COVID-19 pandemic is in sight. For EY, “compliance and conduct” have become “the most important offerings” as clients in highly regulated sectors, including financial services and energy and utilities, recognize new risks associated with ecosystem partners’ behaviors and the regulators’ view of those risks.

For EY, this leads to “working from the future back to transform compliance” and infusing technology to create “continuous, dynamic monitoring.” Guerreri specifically pointed out that EY’s clients see the potential risk impacts of new regulations as a board-level issue, further raising the profile of risk professionals as well as the need for EY’s services and solutions centered on compliance.

Building on that point, Amy Gennarini, EY’s Americas FSO Risk Technology leader, said the organizations most successfully addressing risk have explicitly tied together regulatory obligations and business attributes. By integrating and making complex linkages across an entire organization, a business can enable faster and more comprehensive transformation. For TBR, this insight stands out as critical to understanding how EY sees the future of risk, trust and digital transformation: Complex linkages help identify risks and facilitate transformations. Complexity, usually a byword for making things too complicated, can be hugely beneficial for enterprises, if managed properly.

In late January, TBR spoke with leaders in EY’s risk consulting services practice about recent portfolio developments and expectations for 2021. Three critical elements stood out for TBR. First, the maturation of EY’s risk consulting services practice (which sits in the firm’s Business Consulting domain) provides the firm with a solid foundation to build new offerings and help clients with the transformational opportunities connected to risk, not simply the obligatory or compliance-related aspects of risk management. Second, the firm remains committed to making technology an enabler, through innovation and at scale, while keeping the fundamental consulting business model intact. Third, and most critically for understanding EY’s overall thinking on risk, the firm fully embraces the complexities that arise when applying technologies at scale to every component of a client’s organization and utilizes these complexities to build trust while addressing risk. In EY’s approach, complex linkages between data, technology platforms and internal business groups help identify risk and thus help clients’ transformations. In short, complexity can be good if handled well.

Peraton’s purchase of Perspecta: The latest move in the quest for scale in federal IT

Scale is king

Peraton’s purchase of Northrop Grumman’s (NYSE: NOC) IT services business and pending acquisition of Perspecta (NYSE: PRSP) are clearly aimed at obtaining the scale necessary to compete for large enterprise and digital transformation deals, which have become common in the public sector IT services market.

Peraton is hardly the first in this space to make such transformative purchases. SAIC (NYSE: SAIC) made two large acquisitions in two years with Engility and Unisys Federal in 2019 and 2020, respectively; General Dynamics IT (NYSE: GD) purchased CSRA in 2018; and Leidos (NYSE: LDOS) perhaps started the trend with its purchase of Lockheed Martin (NYSE: LMT) Information Systems & Global Solutions (IS&GS) in 2016. As federal agencies seek to modernize and transform their operations to take advantage of emerging technologies such as cloud, 5G, AI, machine learning, and AR and VR, large monolithic deals, such as the Next Generation Enterprise Networks Recompete (NGEN-R), Defense Enterprise Office Solution (DEOS), Global Solutions Management – Operations II (GSM-O II) and Joint Enterprise Defense Infrastructure (JEDI), among others, illustrate the importance of being able to deliver these technologies and surrounding services at scale.

Companies such as Leidos, General Dynamics Technologies (GDT) and Booz Allen Hamilton (NYSE: BAH) have come out as the clear winners on the vast majority of multibillion-dollar deals like the ones mentioned above, thanks largely to their ability to deliver digital transformation at scale and proven past performance. TBR believes this trend is only going to become more pervasive in 2021 as the federal government pursues continued IT modernization across defense, intelligence and civilian agencies. Alternatively, if the federal government begins to move toward smaller contracts in terms of total value and/or duration, Peraton’s newly acquired scale would no longer be an asset. However, this is likely only a long-term concern, as the federal government shows no signs of ramping down contract sizes or duration for the foreseeable future.  

Why Perspecta had to die

Perhaps nothing illustrates the importance of scale more than the death of Perspecta. When the company was formed from the merger of DXC Technology’s (NYSE: DXC) public sector business with Vencore and KeyPoint Government Solutions in 2018, the clear intention was to create a federally focused contractor of scale that could compete for the large transformative deals that have become commonplace. Most important among these was the NGEN-R contract, whose predecessor, the NGEN contract, was held by Perspecta and represented nearly 20% of the company’s total revenue.

Despite this, Perspecta was unable to win the $7.7 billion NGEN-R contract, which was awarded to Leidos and will begin to ramp up in 2H21, leaving Perspecta facing the loss of roughly 19% of its total revenue, a gap that cannot be filled quickly enough to avoid steep year-to-year declines.

Losing the NGEN-R bid put Perspecta in a very difficult place, beyond the obvious financial burden. The company’s leadership has fielded tough questions from Wall Street about where the company is headed without NGEN-R. Perspecta has been unable to win any comparable deals, such as DEOS or GSM-O II, on which it has bid in the last year or two. Additionally, the company does not have as strong a portfolio in emerging technologies as many of its competitors, and it is highly unlikely Perspecta on its own could have returned to growth quickly enough to appease its stakeholders. In this context, it is clear that Perspecta needed to die. With its pending sale, Perspecta has the opportunity to reemerge as part of Peraton as a more formidable competitor in the federal IT services market, free from the burdens associated with its past failures.

On Jan. 27, Perspecta announced its purchase by Peraton, a Veritas Capital portfolio company, for an all-cash price of $7.1 billion. This acquisition comes on the heels of Peraton’s purchase of Northrop Grumman’s IT services business, which closed Feb. 1 (outlined in TBR’s special report End game for Northrop Grumman’s IT services business). The resulting company, which will retain the Peraton name, will be a $7.6 billion to $7.9 billion business on a pro forma basis with approximately 24,300 employees, in TBR’s estimates.

Partnership with Palantir further unlocks IBM’s AI value

Since Arvind Krishna took the helm as CEO in April, IBM has engaged in a series of acquisitions and partnerships to support its transformative shift to fully embrace an open hybrid cloud strategy. The company is further solidifying the strategy with the announcement that IBM and Palantir are coming together in a partnership that combines AI, hybrid cloud, operational intelligence and data processing into an enterprise offering. The partnership will leverage Palantir Foundry, a data integration and analysis platform that enables users to easily manage and visualize complex data sets, to create a new solution called Palantir for IBM Cloud Pak for Data. The new offering, which will be available in March, will leverage AI capabilities to help enterprises further automate data analysis across a wide variety of industries and reduce inherent silos in the process.

Combining IBM Cloud Pak for Data with Palantir Foundry supports IBM’s vision of connecting hybrid cloud and AI

A core benefit that customers will derive from the collaboration between IBM (NYSE: IBM) and Palantir (NYSE: PLTR) is the easing of the pain points associated with adopting a hybrid cloud model, including integration across multiple data sources and the lack of visibility into the complexities of cloud-native development. By partnering with Palantir, IBM will be able to make its AI software more user-friendly, especially for customers who are not technical by nature or trade. Palantir’s software requires minimal, if any, coding and enhances the accessibility of IBM’s cloud and AI business.

According to Rob Thomas, IBM’s senior vice president of software, cloud and data, the new offering will help to boost the percentage of IBM’s customers using AI from 20% to 80% and will be sold to “180 countries and thousands of customers,” which is “a pretty fundamental change for us.” Palantir for IBM Cloud Pak for Data will extend the capabilities of IBM Cloud Pak for Data and IBM Cloud Pak for Automation, and according to a recent IBM press release, the new solution is expected to “simplify how businesses build and deploy AI-infused applications with IBM Watson and help users access, analyze and take action on the vast amounts of data that is scattered across hybrid cloud environments, without the need for deep technical skills.”

By drawing on the no-code and low-code capabilities of Palantir’s software as well as the automated data governance capabilities embedded into the latest update of IBM Cloud Pak for Data, IBM is looking to drive AI adoption across its businesses, which, if successful, can serve as a ramp to access more hybrid cloud workloads. IBM perhaps summed it up best during its 2020 Think conference, with the comment: “AI is only as good as the ecosystem that supports it.” While many software companies are looking to democratize AI, Red Hat’s open hybrid cloud approach, underpinned by Linux and Kubernetes, positions IBM to bring AI to chapter 2 of the cloud.

For historical context, it is important to remember that the acquisition of Red Hat marked the beginning of IBM’s dramatic transformation into a company that places the values of flexibility, openness, automation and choice at the core of its strategic agenda. IBM Cloud Paks, which are modular AI-powered solutions that enable customers to efficiently and securely move workloads to the cloud, have been a central component of IBM’s evolving identity.

After more than a year of messaging to the market the critical role Red Hat OpenShift plays in IBM’s hybrid cloud strategy, Big Blue is now tasked with delivering on top of the foundational layer with the AI capabilities it has been tied to since the inception of Watson. By leveraging the openness and flexibility of OpenShift, IBM continues to emphasize its Cloud Pak portfolio, which serves as the middleware layer, allowing clients to run IBM software as close to or as far away from the data as they desire. This architectural approach supports IBM’s cognitive applications, such as Watson AIOps and Watson Analytics, while new integrations, such as those with Palantir Foundry, will support the data integration process for customers’ SaaS offerings.

The partnership will provide IBM and Palantir with symbiotic benefits in scale, customer reach and capability

The partnership with IBM is a landmark relationship for Palantir that provides access to a broad network of internal sales and research teams as well as IBM’s expansive global customer base. To start, Palantir will now have access to the reach and influence of IBM’s Cloud Paks sales force, a notable expansion from its current team of 30. The company primarily sells to companies that have over $500 million in revenue, many of which already have relationships with IBM. By partnering with IBM, Palantir will not only be able to deepen its reach into its existing customer base but also have access to a much broader customer base across multiple industries. The partnership additionally provides Palantir with access to the IBM Data Science and AI Elite Team, which helps organizations across industries address data science use cases as well as the challenges inherent in AI adoption.

Partners such as Palantir support IBM, including by helping the company scale Red Hat software and double down on industry cloud efforts

As a rebrand of its partner program, IBM unveiled the Public Cloud Ecosystem program nearly one year ago, onboarding key global systems integrators, such as inaugural partner Infosys, to push out IBM Cloud Pak solutions to customers on a global scale. As IBM increasingly looks up the technology stack, where enterprise value is ultimately generated, the company is emphasizing IBM Cloud Pak for Data, evidenced by the November launch of version 3.5 of the solution, which offers support for new services.

In addition, IBM refreshed the IBM Cloud Pak for Automation while integrating robotic process automation technology from the acquisition of WDG Automation. Alongside the product update, IBM announced there are over 50 ISV partners that offer services integrated with IBM Cloud Pak for Data, which is also now available on the Red Hat Marketplace. IBM’s ability to leverage technology and services partners to draw awareness to its Red Hat portfolio has become critical and has helped accelerate the vendor’s efforts in industry cloud following the launch of the financial services-ready public cloud and the more recent telecommunications cloud. New Cloud Pak updates such as these highlight IBM’s commitment to OpenShift as well as its growing ecosystem of partners focused on AI-driven solutions.

Palantir’s software, which serves over 100 clients in 150 countries, is diversified across various industries, and the new partner solution will support IBM’s industry cloud strategy by targeting AI use cases. Palantir for IBM Cloud Pak for Data was created to mitigate the challenges faced by multiple industries, including retail, financial services, healthcare and telecommunications — in other words, “some of the most complex, fast-changing industries in the world,” according to Thomas. For instance, many financial services organizations have been involved in extensive M&A activity, which results in a fragmented and dispersed environment involving multiple pools of data.

Palantir for IBM Cloud Pak for Data will remediate associated challenges with rapid data integration, cleansing and organization. According to IBM’s press release, Guy Chiarello, chief administrative officer and head of technology at Fiserv (Nasdaq: FISV), an enterprise focused on supporting financial services institutions, reacted positively to the announcement, stating, “This partnership between two of the world’s technology leaders will help companies in the financial services industry provide business-ready data and scale AI with confidence.”