mimik pioneers a unique hybrid edge cloud solution that empowers the localized autonomy of devices

The journey to capitalize on the edge is rooted in deep telco experience, coupled with a passion for breaking boundaries

A brief history lesson helps explain how mimik came to be. It was during her tenure as CEO of Vodafone xone that mimik CEO and founder Fay Arjomandi realized the growing importance of decentralizing data analytics and processing to the edge. While testing network capacity improvements and traffic utilization, Arjomandi noted the inherent delay that occurs when traffic hits a data center, causing issues such as bottlenecks as data struggles to reach the back end of an application. This was all occurring in the context of the rapid evolution of devices themselves, which were increasing not only in sheer volume but also in sophistication.

Arjomandi came to the realization that the existing architecture of the time was not equipped to support the ongoing shift to a hyperconnected digital world where almost every object can be smart. The future is not about vertically integrated devices that communicate in a linear fashion to the cloud or on-premises data center environments, but rather will be rooted in horizontal platforms where data can be processed and exchanged across diverse networks, platforms and systems. Created in the context of IoT but viewed with new eyes as the Internet of Systems versus “things,” mimik pioneered a new architecture in the form of a hybrid edge cloud solution that enables any computing device to act as a cloud server, with the ability to communicate autonomously and locally and to make decisions across and within networks.

Empowering local systems to make autonomous decisions is mimik’s core value 

By virtue of placing enterprise applications closer to where data is created and where insights are actionable, edge devices have always maintained some degree of autonomy. That said, there has also been an underlying perception that the cloud has an umbilical-cord-like function in that it ultimately serves as the main governing force and point at which most of the data is processed, analyzed and housed. mimik has cut the cord, recognizing that as IT becomes increasingly decentralized, localized servers and sensors are evolving beyond mere endpoints and becoming part of powerful systems that can function independently of the cloud. mimik’s Hybrid edgeCloud application development platform was born out of the realization that applications can interact locally with the power to function as clusters of communities that communicate, inform and analyze data at the source.

The edge has traditionally been viewed as a localized extension of the cloud, providing a 1+1=3 opportunity to capitalize on the inherent benefits of the cloud with localized data processing and reduced latency. In the context of an increasingly hyperconnected world, the devices and sensors that interact at the edge are taking a central role, driving more and more use cases, rather than acting just as add-ons to amplify the value of the cloud. By focusing on what devices can accomplish as part of interconnected systems at the edge, mimik, a Canada-based technology firm, has emerged with an advanced out-of-the-box solution, Hybrid edgeCloud, which enables any computing device to act as a cloud server. The multiple positive implications include lowered latency, reduced constraints on network bandwidth, heightened security and decreased cost of cloud hosting — all due to the reduction of traffic traveling to and from the cloud and the enhanced connectivity within and between systems of devices.
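To make the architectural idea concrete, consider a minimal sketch of a device that processes and serves its own data at the source rather than round-tripping to a central cloud. This is a generic, hypothetical Python illustration, not mimik's actual SDK or APIs; the service name, data and endpoint are invented for the example.

```python
# Hypothetical sketch: an edge device acting as a tiny local "cloud server,"
# analyzing data where it is created instead of shipping it to a data center.
# Generic illustration only -- not mimik's Hybrid edgeCloud API.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


class EdgeNode(BaseHTTPRequestHandler):
    # Data produced locally on the device (e.g., sensor readings in Celsius)
    readings = [21.5, 22.0, 21.8]

    def do_GET(self):
        # Process the data locally and serve the insight directly,
        # avoiding the latency of a trip to a back-end data center.
        avg = sum(self.readings) / len(self.readings)
        body = json.dumps({"avg_temp_c": round(avg, 2)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging for the example
        pass


def serve(port=0):
    """Start the local edge service on an ephemeral port; return the server."""
    server = HTTPServer(("127.0.0.1", port), EdgeNode)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server


if __name__ == "__main__":
    import urllib.request

    srv = serve()
    url = f"http://127.0.0.1:{srv.server_port}/"
    print(urllib.request.urlopen(url).read().decode())  # local, no cloud hop
    srv.shutdown()
```

In a real deployment, many such nodes would discover one another and exchange results as a local cluster, falling back to the cloud only when cross-network coordination is required.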

PwC unleashed: A professional services firm adopts Netflix-like business models

From Products to Digital on Demand and ProEdge

We reported this time last year that, with PwC Products, the firm completely shifted from being an old-school, white-shoe, tax- and audit-focused professional services firm from the age of the Big Eight to being a business solutions provider, with those “solutions” including SaaS, managed services and platforms. Now the firm has taken another large leap forward, adopting elements of business models most notably deployed by Netflix to bring its software and solutions into clients’ environments in a completely new way, while simultaneously reorienting the firm’s professionals around the skills and capabilities needed to serve clients in a new world. We understand that assessment sounds over the top in a market already swamped by exaggerated claims around digital transformation.

Sustained investment and committed leadership — it is that simple

PwC launched PwC Products in early 2020, as covered in our special report, in which we noted: “PwC is a business solution provider, and some of those solutions include products — tangible, defined assets that allow the firm to be, as the PwC leaders noted, ‘better, faster, and cheaper for clients.’ Some of those assets will remain within the firm, scalable but deployed only to increase speed or efficiency in certain engagements. Some assets will remain with the client, paid for in full, through licensing or by subscription. For all of the solutions, PwC’s approach will start with a business problem in mind, rather than employing a systems integrator mindset of plugging technology into a business.”

Building on PwC Products, perhaps on a timeline accelerated by the remote-working realities of the pandemic, PwC rolled out Digital on Demand and ProEdge in late 2020, bringing to clients two distinct offerings made possible by years of sustained investment in digital capabilities, including software and the firm’s own IP, as well as a leadership commitment to adjusting the firm’s business model to fully accommodate subscription-based pricing and software-centric engagement models. In TBR’s view, the first element — investing in technology — does not differentiate PwC from peers, except perhaps in the firm’s early start in some areas and sustained commitment to an organizing framework. The second element — leadership and adjusting the business model — marks a critical difference for PwC. Even though peers have made some similar changes, PwC has aggressively gone all-in and adopted multiple changes to its business models.

Digital on Demand: All the apps you want for one low monthly price (Netflix model 1)

In essence, Digital on Demand is PwC’s version of Apple’s (Nasdaq: AAPL) App Store, but with a client experience more akin to Netflix (Nasdaq: NFLX), where every option is available immediately without separate pricing or technical concerns. Similar to how everyone can watch their Netflix shows on their own device, PwC’s Digital on Demand solutions can be downloaded into the client environment, where they can be configured.

Led by PwC Labs Partner Michelle Wilkes, from the firm’s Consulting practice, and US Automation Leader Jeff Lower, from Tax, Digital on Demand belongs within the larger PwC Labs practice and carries through a relatively basic premise: Take the automation PwC incorporated internally, curate the solutions and refine the automations, and then make them available for PwC’s clients to deploy into their own environments. According to Wilkes, PwC built a foundation of 6,500 automations across its own back office and client engagements, saving 8.6 million hours of staff time across the firm.

Starting with finance functions, where PwC has legacy strengths and strong brand permission, the firm has partnered with Microsoft (Nasdaq: MSFT), UiPath, Alteryx (NYSE: AYX) and others to provide clients a menu of downloadable automations, including access to cloud-based AI models for information extraction using natural language processing and machine learning. PwC’s Wilkes deemed these “proven and relevant” because the automations were designed by people who are deeply familiar with finance functions and have experience in the finance environment. In short, Digital on Demand is readily deployable software built by finance process people for finance process people. Wilkes said the firm has 393 downloadable automations today, with plans to reach 500 by May 1.

On Feb. 18 and 19, 2021, TBR spoke with several PwC leaders: Michelle Wilkes, partner, PwC Labs; Jeff Lower, US Automation leader; Suneet Dua, chief product officer, PwC US; Darren Lee, partner, PwC Consulting; Mike Mendola, senior associate, PwC Labs; and Maria D’Alessandro, strategy director, PwC Products. This special report includes information and analysis drawn from these discussions and looks at how much the firm has changed and where the future of consulting lies for PwC and its peers. 

The state of crowdsourcing

In February TBR virtually attended the Global Technology & Business Services Council’s Global Series: Open Talent conference and heard from leaders across the technology and crowdsourcing industries about emerging themes and trends. While it is not a new phenomenon, crowdsourcing is becoming a compelling delivery model in the IT services space as enterprises increasingly embrace remote services during the pandemic and seek out new ways to fill skills gaps, drive cost savings and accelerate engagement turnaround time. Platform-based crowdsourcing companies such as Topcoder and Freelancer.com, which TBR heard from during the conference, are rapidly expanding their communities of technology-oriented freelancers and driving new use cases with large enterprises that would have traditionally gone to IT service vendors. In TBR’s view, vendors that are not embracing this shift stand to lose the most. At the same time, we question whether those that bring this model to market are really positioned to gain much.

Existential threat or just a piece of the puzzle?

Now more than ever, crowdsourcing and open talent models are proving to be significant disruptors in how services are delivered, and technology appears to be an area that can benefit the most from this trend. Prior to the Global Technology & Business Services Council (GT&BSC) event, our understanding was that organizations engaged with the aforementioned crowdsourcing platforms mostly around the ability to tap large pools of talent much faster and cheaper, but largely for low-value, task-oriented services. To our surprise, during the event TBR learned about use cases in which Fortune 500 companies and renowned research institutions turned to “the crowd” for sophisticated software and data services, which led to significant improvements in speed, cost and even quality. For instance, a coalition between several enterprises and academic institutions such as Harvard and the Massachusetts Institute of Technology (MIT) opened up a project to the Topcoder community around the optimization of a DNA sequencing algorithm, with the goal of surpassing what had been regarded as an “impossible” threshold. Dozens of submissions crossed the threshold within 24 hours. At first glance, the use of these platforms for high-value services poses a threat to IT services vendors, but TBR notes some caveats.

White-labeling labor

It is worth pointing out that the majority of IT services vendors’ activities in the crowdsourcing space happened pre-pandemic. For example, Wipro (NYSE: WIT) made a splash in the industry when it acquired Topcoder in 2016, and in the years that followed, other vendors such as DXC Technology (NYSE: DXC) and Deloitte pursued partnerships that enabled them to launch crowdsourcing services by tapping into Topcoder’s labor ecosystem. First, this history undercuts the image of crowdsourcing platforms as competition and instead reflects a more symbiotic relationship. Second, while Wipro might be able to take a small commission on engagements done through Topcoder, it lacks a significant competitive advantage over peers that partner with Topcoder or similar platforms.

Security and trust

Soliciting bids from unknown global technologists presents obvious risks. This model is not suited for workloads involving sensitive data and therefore is not gaining the same traction in industries such as financial services or healthcare, where data security and privacy are top concerns. IT services vendors that cater to this clientele will be much more capable of steering clients away from crowdsourcing services and commanding profitable revenues. Similarly, many firms’ value propositions revolve around the reputation of the company and its quality of services, helping them garner more trust-based relationships. Clients seeking this level of service will largely be uninterested in crowdsourcing. This concern will also put pressure on services providers that partner with crowdsourcing platforms, as well as the platforms themselves, to establish guardrails against potential leaks or security breaches, but it remains no easy task to vet millions of global freelancers.   

Bottom line

These platforms are intended to optimize costs and speed. While IT services vendors likely do not want to miss out on any opportunities to engage with potential clients, partnering with a crowdsourcing provider and delivering the cheapest possible services will limit margin growth. Instead, we see opportunities for vendors to embed this open talent model into their organizations to improve utilization and more efficiently deploy staff. We learned during the GT&BSC event, for example, that Deloitte partnered with Freelancer.com in 2019 to develop an internal marketplace for Deloitte’s employees to join open projects within the firm and integrated the solution into the Freelancer.com ecosystem so Deloitte could extend its pool of external resources if needed.

What the future holds

If anything has stood out during the pandemic, it is that incumbents in every industry must be prepared to quickly pivot and adjust to new and nontraditional ways of doing business. In the IT services space, TBR believes disruptions such as the gig economy and crowdsourcing pose the biggest threat to management consulting firms, where generational and technology shifts are creating instances in which enterprises may opt to collect third-party opinions and advice through these types of platforms instead of via expensive consultancies, which can be as enigmatic as unidentified, crowdsourced respondents. In general, IT services vendors that have established pathways into this type of model will benefit from bringing new logos into account ecosystems, which will provide opportunities downstream to upsell higher-value services. We anticipate crowdsourcing will continue to play a supplementary, but necessary, role in IT services as a way for companies to easily scale services at the expense of security and margins. But much like organizations’ hesitation to fully embrace the cloud for their IT ecosystems, taking a hybrid approach to “the crowd” will likely remain the preferred method for most enterprises to minimize risk while still reaping the benefits of scalability and speed that the crowdsourcing model offers.

TBR’s Professional Services practice will continue to monitor the trends outlined above and provide analysis across our syndicated vendor reports and benchmarks, notably the IT Services Vendor Benchmark and Global Delivery Benchmark. The next iterations of these two products, which synthesize TBR’s in-depth analysis and data across covered vendors, are set to publish in April.

Eyeing the future: Accenture’s fundamentals drive human-centric technology change at scale

‘Leaders Wanted — Masters of Change at a Moment of Truth’

Accenture’s (NYSE: ACN) recent virtual event to introduce its Accenture Technology Vision 2021 kicked off with a quick recap of the socioeconomic headwinds of 2020. These headwinds include four new concerns facing people personally and professionally: an increasing global population driving a need for new ways of interacting; the evolution of “Every business is a tech business” as technology’s role changes with the changing environment; the workforce of the future; and sustainability. Accenture Group Chief Executive – Technology and Chief Technology Officer Paul Daugherty then outlined in detail the five major trends of Accenture’s 2021 vision.

Delivered under the slogan “Leaders Wanted — Masters of Change at a Moment of Truth,” the vision highlights five key areas, which we expect to drive investments not just from Accenture but also peers and enterprises, given the company’s market-making status in multiple domains.

  1. Stack strategically: While this trend at its core applies to architecting and redesigning organizations’ technology stacks to support the enterprise of the future, which includes attributes from the customer experience to the security layer, it also maps to Accenture’s core value proposition of joining consultants, designers, researchers, solution architects and delivery personnel, all through the umbrella of Accenture Innovation Architecture.
  2. Mirrored world: The resurgence of the digital twin is moving beyond experimental phases, and large enterprises are seeing an opportunity to invest in an area that, in the era of COVID-19, which has led to social distancing and reduced access to physical plants, will allow them to use IoT techniques to enable remote monitoring and control. Accenture’s ongoing investments in mobility and IoT service offerings over the past five years, along with the recent push into product engineering offerings, largely enabled through acquisitions, will enable the company to address demand and increase client stickiness.
  3. I, technologist: The democratization of technology, which has enabled workforces to do more with less and orient their productivity to higher-value tasks largely enabled by automation, while not a new trend, has certainly reached a pivotal point, given the changes over the past 12 months in how employees perform their work. Accenture’s rigorous approach to and ongoing investments in training — including spending $1 billion per year on reskilling and upskilling personnel, with efforts most recently focused on building cloud consulting, architecting and delivery skills — enable it to drive internal change at scale, and then sell its capabilities “as a Service” to clients.

On Feb. 17, 2021, Accenture held a one-hour virtual session introducing its Accenture Technology Vision 2021. While the format was different than in previous years, the 21st iteration of the summit had a similar goal: to portray Accenture’s technology prowess and appetite for innovation and scale. Hosted by Accenture Group Chief Executive – Technology and Chief Technology Officer Paul Daugherty, Accenture Senior Managing Director and Lead – Technology Innovation and Accenture Labs Marc Carrel-Billiard, and Managing Director – Accenture Technology Vision Michael Blitz, the virtual delivery of the content was both a sign of the times and a demonstration of Accenture’s ability to coordinate, deliver and manage virtual events in collaboration with ecosystem partners — in this case, Touchcast.

Who’s there?: The rise of multienterprise business networks

Not everything about business is technology, but every business has to leverage technology everywhere

Over the last few years, executives have discussed redesigning their businesses for the safe, secure and accurate flow of actionable data with as little human involvement and oversight as possible, a change Google describes as removing the “human toil” from economic activity. Business leaders called this process optimization, a process often resisted by employees, which in turn slows an organization’s digital efforts. Organizations big and small have been forced to embrace a cloud- and digital-first posture to maintain business continuity and participate in everyday economic activity. In short, these efforts are being made to maintain relevance. As a result, nontechnology-savvy executives and employees will exit the workforce in growing numbers over the next five years.

In this transformative period, future managers are now training in new entry-level IT jobs, even as IT services vendors and other players in the technology ecosystem complain about a shortage of STEM talent in the hiring markets. The talent that does come on board in new roles spread across a digitally savvy enterprise understands application interfaces, which align human interaction with technology and data platforms. By entering the business in this capacity, incoming talent gains experience across the various elements of the business operation that executive managers require while also ensuring they are fully digitally versed for the Business of One.

Adding further complexity has been the disaggregation of business functions or value among different business entities. In technology we see this as the IP-centric elements of a business being split away from the labor, or task-centric functions. Looking at semiconductors, for example, some on Wall Street are calling for Intel to be split between the IP-laden aspect of chip design and the capital-intensive aspect of fabrication plants capable of manufacturing those designs reliably at scale. They are two businesses with entirely different rhythms and economic drivers, yet neither can thrive without the other.

The work-around to this business disaggregation is to establish a network of businesses with complementary value propositions. This network is increasingly being called the multienterprise business network (MEBN). Many technology-centric firms describe this as their platform. But a platform is a stage on which something is performed, and that performance is an outcome enabled by multiple different parties. As such, viewing MEBNs solely through a technology-centric lens can miss the point entirely.

As the Business of One evolves, legacy technology vendors selling on technical merits, or speeds and feeds, and selling just to IT face tremendous market pressures to pivot to selling business outcomes. Today’s reality requires understanding customers’ business objectives and speaking directly to business decision makers.

For Technology Business Research’s (TBR) Digital Practice, this necessitates taking our core value proposition of vendor-centric business analysis of technology companies across a standard technology business value chain and combining it with additional considerations about industries and the operating best practices of business ecosystems that tie back to the specific use case and the personas integral to that use case. After establishing those core frameworks, the analysis then ties back to time horizon and MEBN participant. In short, the analysis asks what is in it for each MEBN participant at each stage (commonly referred to as Horizons 1, 2 and 3 in today’s frameworks) of the MEBN product road map.

To illustrate the intent here, consider the creation of an MEBN for the utilization, storage and maintenance of autonomous vehicles. Having autonomous vehicles moving about a defined geography would clearly be the Horizon 3 aspiration, which is nowhere near commercial reality today.

Horizon 1 would deliver an immediate level of business value to entice the participants necessary for the Horizon 3 aspiration. For example, gas stations, mechanics and parking garages, at a minimum, will need to be recruited into the MEBN for autonomous vehicles. Later, additional services for auto owners could be added, such as online ordering with brick-and-mortar pickup across various nontech-centric small businesses providing localized services. A Horizon 1 buyer network for today’s cars and owners has to provide sufficient business value to enroll participants.

The capital investment in the technology infrastructure likely must come from the Horizon 3 business benefactor and be viewed as a long-term investment to facilitate the recruitment of the necessary member participants. In the end, those autonomous vehicles will need the fueling, maintenance and parking services to function and the adjacent human services of pickup and delivery to increase their utilization rates beyond a source of human transport. Yes, it requires a technology value chain as its backbone, but nontechnology participants are just as necessary to flesh it out into a thriving MEBN of buyers and sellers who may not even concern themselves with the technology underpinnings at all.

More colloquially, few singer-songwriters would have the capital necessary to build the technology assets for downloading music over the internet. But once Apple took a long view of its investment posture, it was able to build out a robust MEBN in the form of the iTunes platform, which profited many artists, disrupted traditional nontechnology businesses and delivered value to many customers. iTunes has itself since been disrupted by streaming services such as Spotify and Pandora.

TBR’s Digital Practice remit is to take its core value proposition of discrete company business model analysis and apply it to the MEBNs by isolating the different components through a series of frameworks. In doing this, we will then be able to assess the financial impact for the different member participants across the near-term, mid-term and long-term horizons.

Industries have different automation leverage points, enabling different personas; inexpensive tech makes myriad use cases possible

Compute ubiquity has been well documented. The multimillion-dollar supercomputer performance of yesteryear is now contained in smartphones. The first IBM PC chip, the 8088, is now matched by CPUs the size of a grain of sand that cost $0.10 to produce. Historically, the heavily regulated industries of financial services and healthcare were early technology adopters, given the risk exposure of noncompliance with government regulation. As the cost of compute fell to incredibly inexpensive price points, compute expanded from those back-office functions into front offices. Today, we are at a point where, as one EY executive summed it up in an analyst interaction when peppered with multiple questions, “We can do whatever you want; you just have to make up your minds.”

Making up our minds translates into the codification of standard business results to digitize activity in a consistent way, and this sits at the heart of multiple game-changing technologies, including AI, machine learning and blockchain. These are horizontal technological capabilities that cascade through a variety of industries. Retail, once cost-conscious, was one of the later industries to adopt technology, and Amazon, as we know, has disrupted this sector to the detriment of many high-profile brick-and-mortar brands of yesteryear.

TBR will use this construct to incubate standard coverage of markets, facilitating a way to bring analysis of that market to a vendor-centric view. TBR’s Digital Transformation research portfolio will serve as the vehicle to introduce these frameworks. The inaugural Digital Transformation Blockchain Market Landscape is set to publish in April 2021 and Digital Transformation IIoT Market Landscape will be published in June 2021. These reports will follow a semiannual publication cadence.

Complexity and trust: EY’s evolving approach to risk

Internal risk professionals may have the best internal intelligence

Setting the stage for changes at EY and in the broader market, Frank Leenders, EY’s Digital & Innovation lead based in the Netherlands, explained that the firm helps clients “reframe the future” and focus on “trusted transformation,” which comes through six different lenses: Investor Trust, Organizational Trust, Third-party Trust, Customer Trust, Technology Trust and Regulatory Trust. Leenders added that the COVID-19 pandemic helped expose in greater detail how clients think about risk and trust and how different lines of defense can become sources of organizational intelligence.

Risk-oriented functions within clients’ organizations brought forward insights using data analytics and provided timely analysis on strengths and weaknesses, as revealed by internal responses to operational challenges created by the pandemic (and echoed by EY’s own Megatrends pandemic-related survey findings). While many clients’ digital agendas had been accelerating over the past four years, 2020 became an inflection point in understanding how using data and technology for timely insights related to risk could show not only what could go wrong but also how clients could improve their operations and enhance overall risk management. In short, internal audit and risk professionals likely have the best intelligence and insight into their own organizations — skills that are critical to running the business and optimizing opportunities during a prolonged crisis.

After walking through details on EY Resilience Edge — an AI-powered emerging risk modeling and scenario planner developed with IBM Watson and IBM Research — the EY partners described the EY VIA (Virtual Internal Audit) platform, a tool for end-to-end digitalization of the internal audit process and activities, including continuously ingesting data and developing analytics on clients’ ERP environments. EY uses the platform, which includes risk monitoring and what EY has named its Flexible Audit Response Model, not only as a tool for delivering on its internal audit engagements but also as a stand-alone Software as a Service offering. In addition to the technology tools and bespoke configurations, EY has the opportunity to provide change management consulting as clients adopt new tools and processes.

Regulatory Trust as the gateway to trusted complexity

Shifting to Regulatory Trust, which EY defines as managing “the regulatory burden with innovative frameworks that make compliance an enabler, allowing organizations to pursue sustainable pathways,” Federico Guerreri, EY’s Global Financial Services Risk leader, noted that stakeholders and customers have increased pressures around understanding and evaluating an enterprise’s full ecosystem, including suppliers, particularly as the end of the COVID-19 pandemic is in sight. For EY, “compliance and conduct” have become “the most important offerings” as clients in highly regulated sectors, including financial services and energy and utilities, recognize new risks associated with ecosystem partners’ behaviors and the regulators’ view of those risks.

For EY, this leads to “working from the future back to transform compliance” and infusing technology to create “continuous, dynamic monitoring.” Guerreri specifically pointed out that EY’s clients see the potential risk impacts of new regulations as a board-level issue, further raising the profile of risk professionals as well as the need for EY’s services and solutions centered on compliance.

Building on that point, Amy Gennarini, EY’s Americas FSO Risk Technology leader, said the organizations most successfully addressing risk have explicitly tied together regulatory obligations and business attributes. By integrating and making complex linkages across an entire organization, a business can enable faster and more comprehensive transformation. For TBR, this insight stands out as critical to understanding how EY sees the future of risk, trust and digital transformation: Complex linkages help identify risks and facilitate transformations. Complexity, usually a byword for making things too complicated, can be hugely beneficial for enterprises if managed properly.

In late January, TBR spoke with leaders in EY’s risk consulting services practice about recent portfolio developments and expectations for 2021. Three critical elements stood out for TBR. First, the maturation of EY’s risk consulting services practice (which sits in the firm’s Business Consulting domain) provides the firm with a solid foundation to build new offerings and help clients with the transformational opportunities connected to risk, not simply the obligatory or compliance-related aspects of risk management. Second, the firm remains committed to making technology an enabler, through innovation and at scale, while keeping the fundamental consulting business model intact. Third, and most critically for understanding EY’s overall thinking on risk, the firm fully embraces the complexities that arise when applying technologies at scale to every component of a client’s organization and utilizes these complexities to build trust while addressing risk. In EY’s approach, complex linkages between data, technology platforms and internal business groups help identify risk and thus help clients’ transformations. In short, complexity can be good if handled well.

Peraton’s purchase of Perspecta: The latest move in the quest for scale in federal IT

Scale is king

Peraton’s purchase of Northrop Grumman’s (NYSE: NOC) IT services business and pending acquisition of Perspecta (NYSE: PRSP) are clearly aimed at obtaining the scale necessary to compete for large enterprise and digital transformation deals, which have become common in the public sector IT services market.

Peraton is hardly the first in this space to make such transformative purchases. SAIC (NYSE: SAIC) made two large acquisitions in two years with Engility and Unisys Federal in 2019 and 2020, respectively; General Dynamics IT (NYSE: GD) purchased CSRA in 2018; and Leidos (NYSE: LDOS) perhaps started the trend with its purchase of Lockheed Martin (NYSE: LMT) Information Systems & Global Solutions (IS&GS) in 2016. As federal agencies seek to modernize and transform their operations to take advantage of emerging technologies such as cloud, 5G, AI, machine learning, and AR and VR, large monolithic deals, such as the Next Generation Enterprise Networks Recompete (NGEN-R), Defense Enterprise Office Solution (DEOS), Global Solutions Management – Operations II (GSM-O II) and Joint Enterprise Defense Infrastructure (JEDI), among others, illustrate the importance of being able to deliver these technologies and surrounding services at scale.

Companies such as Leidos, General Dynamics Technologies (GDT) and Booz Allen Hamilton (NYSE: BAH) have come out as the clear winners on the vast majority of multibillion-dollar deals like the ones mentioned above, thanks largely to their ability to deliver digital transformation at scale and proven past performance. TBR believes this trend is only going to become more pervasive in 2021 as the federal government pursues continued IT modernization across defense, intelligence and civilian agencies. Alternatively, if the federal government begins to move toward smaller contracts in terms of total value and/or duration, Peraton’s newly acquired scale would no longer be an asset. However, this is likely only a long-term concern, as the federal government shows no signs of ramping down contract sizes or duration for the foreseeable future.  

Why Perspecta had to die

Perhaps nothing illustrates the importance of scale more than the death of Perspecta. When the company was formed from the merger of DXC Technology’s (NYSE: DXC) public sector business with Vencore and KeyPoint Government Solutions in 2018, the clear intention was to create a federally focused contractor of scale that could compete for the large transformative deals that have become commonplace. Most important among these was the NGEN-R contract, whose predecessor, the NGEN contract, was held by Perspecta and represented nearly 20% of the company’s total revenue.

Despite this, Perspecta was unable to win the $7.7 billion NGEN-R contract, which was awarded to Leidos and will begin to ramp up in 2H21. The loss leaves Perspecta without 19% of its total revenue, which the company cannot replace quickly enough to avoid steep year-over-year declines.

Losing the NGEN-R bid put Perspecta in a very difficult place, beyond the obvious financial burden. The company's leadership has fielded tough questions from Wall Street about where the company is headed without NGEN-R. Perspecta has been unable to win any comparable deals, such as DEOS or GSM-O II, on which it has bid in the last year or two. Additionally, the company does not have as strong a portfolio in emerging technologies as many of its competitors, and it is highly unlikely Perspecta on its own could have returned to growth quickly enough to appease its stakeholders. In this context, it is clear that Perspecta needed to die. Through its pending sale to Peraton, the company has the opportunity to reemerge as part of a more formidable competitor in the federal IT services market, free from the burdens associated with its past failures.

On Jan. 27, Perspecta announced its purchase by Peraton, a Veritas Capital portfolio company, for an all-cash price of $7.1 billion. This acquisition comes on the heels of Peraton’s purchase of Northrop Grumman’s IT services business, which closed Feb. 1 (outlined in TBR’s special report End game for Northrop Grumman’s IT services business). The resulting company, which will retain the Peraton name, will be a $7.6 billion to $7.9 billion business on a pro forma basis with approximately 24,300 employees, in TBR’s estimates.

Partnership with Palantir further unlocks IBM’s AI value

Since Arvind Krishna took the helm as CEO in April, IBM has engaged in a series of acquisitions and partnerships to support its transformative shift to fully embrace an open hybrid cloud strategy. The company is further solidifying the strategy with the announcement that IBM and Palantir are coming together in a partnership that combines AI, hybrid cloud, operational intelligence and data processing into an enterprise offering. The partnership will leverage Palantir Foundry, a data integration and analysis platform that enables users to easily manage and visualize complex data sets, to create a new solution called Palantir for IBM Cloud Pak for Data. The new offering, which will be available in March, will leverage AI capabilities to help enterprises further automate data analysis across a wide variety of industries and reduce inherent silos in the process.

Combining IBM Cloud Pak for Data with Palantir Foundry supports IBM’s vision of connecting hybrid cloud and AI

A core benefit that customers will derive from the collaboration between IBM (NYSE: IBM) and Palantir (NYSE: PLTR) is the easing of pain points associated with adopting a hybrid cloud model, including integration across multiple data sources and the lack of visibility into the complexities of cloud-native development. By partnering with Palantir, IBM will be able to make its AI software more user-friendly, especially for those customers who are not technical by nature or trade. Palantir's software requires minimal, if any, coding and enhances the accessibility of IBM's cloud and AI business.

According to Rob Thomas, IBM’s senior vice president of software, cloud and data, the new offering will help to boost the percentage of IBM’s customers using AI from 20% to 80% and will be sold to “180 countries and thousands of customers,” which is “a pretty fundamental change for us.” Palantir for IBM Cloud Pak for Data will extend the capabilities of IBM Cloud Pak for Data and IBM Cloud Pak for Automation, and according to a recent IBM press release, the new solution is expected to “simplify how businesses build and deploy AI-infused applications with IBM Watson and help users access, analyze and take action on the vast amounts of data that is scattered across hybrid cloud environments, without the need for deep technical skills.”

By drawing on the no-code and low-code capabilities of Palantir’s software as well as the automated data governance capabilities embedded into the latest update of IBM Cloud Pak for Data, IBM is looking to drive AI adoption across its businesses, which, if successful, can serve as a ramp to access more hybrid cloud workloads. IBM perhaps summed it up best during its 2020 Think conference, with the comment: “AI is only as good as the ecosystem that supports it.” While many software companies are looking to democratize AI, Red Hat’s open hybrid cloud approach, underpinned by Linux and Kubernetes, positions IBM to bring AI to chapter 2 of the cloud.

For historical context, it is important to remember that the acquisition of Red Hat marked the beginning of IBM’s dramatic transformation into a company that places the values of flexibility, openness, automation and choice at the core of its strategic agenda. IBM Cloud Paks, which are modular AI-powered solutions that enable customers to efficiently and securely move workloads to the cloud, have been a central component of IBM’s evolving identity.

After more than a year of messaging to the market the critical role Red Hat OpenShift plays in IBM's hybrid cloud strategy, Big Blue is now tasked with delivering on top of the foundational layer with the AI capabilities it has been tied to since the inception of Watson. By leveraging the openness and flexibility of OpenShift, IBM continues to emphasize its Cloud Pak portfolio, which serves as the middleware layer, allowing clients to run IBM software as close or as far away from the data as they desire. This architectural approach supports IBM's cognitive applications, such as Watson AIOps and Watson Analytics, while new integrations, such as those with Palantir Foundry, will support the data integration process for customers' SaaS offerings.

The partnership will provide IBM and Palantir with symbiotic benefits in scale, customer reach and capability

The partnership with IBM is a landmark relationship for Palantir that provides access to a broad network of internal sales and research teams as well as IBM’s expansive global customer base. To start, Palantir will now have access to the reach and influence of IBM’s Cloud Paks sales force, which is a notable expansion from its current team of 30. The company already primarily sells to companies that have over $500 million in revenue, and many of them already have relationships with IBM. By partnering with IBM, Palantir will not only be able to deepen its reach into its existing customer base but also have access to a much broader customer base across multiple industries. The partnership additionally provides Palantir with access to the IBM Data Science and AI Elite Team, which helps organizations across industries address data science use cases as well as the challenges inherent in AI adoption.

Partners such as Palantir support IBM, including by helping the company scale Red Hat software and double down on industry cloud efforts

As a rebrand of its partner program, IBM unveiled the Public Cloud Ecosystem program nearly one year ago, onboarding key global systems integrators, such as inaugural partner Infosys, to push out IBM Cloud Paks solutions to customers on a global scale. As IBM increasingly looks up the technology stack, where enterprise value is ultimately generated, the company is emphasizing the IBM Cloud Pak for Data, evidenced by the November launch of version 3.5 of the solution, which offers support for new services.

In addition, IBM refreshed the IBM Cloud Pak for Automation while integrating robotic process automation technology from the acquisition of WDG Automation. Alongside the product update, IBM announced there are over 50 ISV partners that offer services integrated with IBM Cloud Pak for Data, which is also now available on the Red Hat Marketplace. IBM’s ability to leverage technology and services partners to draw awareness to its Red Hat portfolio has become critical and has helped accelerate the vendor’s efforts in industry cloud following the launch of the financial services-ready public cloud and the more recent telecommunications cloud. New Cloud Pak updates such as these highlight IBM’s commitment to OpenShift as well as its growing ecosystem of partners focused on AI-driven solutions.

Palantir’s software, which serves over 100 clients in 150 countries, is diversified across various industries, and the new partner solution will support IBM’s industry cloud strategy by targeting AI use cases. Palantir for IBM Cloud Pak for Data was created to mitigate the challenges faced by multiple industries, including retail, financial services, healthcare and telecommunications — in other words, “some of the most complex, fast-changing industries in the world,” according to Thomas. For instance, many financial services organizations have been involved in extensive M&A activity, which results in a fragmented and dispersed environment involving multiple pools of data.

Palantir for IBM Cloud Pak for Data will remediate associated challenges with rapid data integration, cleansing and organization. According to IBM’s press release, Guy Chiarello, chief administrative officer and head of technology at Fiserv (Nasdaq: FISV), an enterprise focused on supporting financial services institutions, reacted positively to the announcement, stating, “This partnership between two of the world’s technology leaders will help companies in the financial services industry provide business-ready data and scale AI with confidence.” 

EY Blockchain Asia: The revolution starts now

EY’s blockchain world

EY’s Asia-Pacific Blockchain Summit started with the firm’s Global Blockchain leader, Paul Brody, making three clear points. First, EY is committed to China and to the region, seeing huge potential for blockchain growth. Second, EY is committed to public blockchain as the long-term solution for most businesses and governments. Third, Brody’s concept of blockchain as the bridge between enterprises — as the tool to tackle the previously uncrossable chasm between different enterprises’ data and business processes — remains a driving force behind how EY sees the future of blockchain, in Asia and the rest of the world.

TBR’s December 2020 special report EY 2021: Hybrid and omnipresent discussed these latter two points: “Public blockchain, in Brody’s words, ‘will do for networks of enterprises and business ecosystems what ERP did for the single company.’ Brody added that conducting B2B [business-to-business] transactions over a public blockchain increases transparency and compliance with commercial terms.” The February event carried that discussion further, and specifically into Asia. 

EY and public blockchain in China  

Brody outlined a few major developments for EY in China, with all his comments reinforced by the subsequent panel speakers and EY professionals who provided additional color, both for the China-specific elements and developments impacting the entire region. In short:

  • EY has formally joined the Financial Blockchain Shenzhen Consortium (FISCO) and made the firm’s EY OpsChain solution available on the FISCO BCOS (Be Credible, Open & Secure) platform.
  • EY intends to deploy its entire Ethereum suite of solutions to users in China.
  • EY has fully localized its blockchain entrée — blockchain.ey.com — for the Chinese market.

In addition, Brody touched on the opportunity blockchain presents in Asia, highlighting China and the Chinese market’s emphasis on digital payments as a precursor to blockchain adoption as well as a robust startup scene. He also highlighted three sectors where EY has been “making exceptionally large” investments: financial services, supply chain and the public sector, which underscored one of Brody’s main points around the importance of public blockchain as the core, foundational building block. He noted that “money and stuff are tokens … contracts are a mix of legal agreements and business processes,” so all business could be conducted on the public blockchain, which is the focus of EY’s enterprise solutions.

On Feb. 2, EY hosted an Asia-Pacific Blockchain Summit, a virtual event run by the EY Blockchain practice based in Singapore that included EY professionals and clients, startup executives, and industry experts who are primarily, but not exclusively, based in Asia. The three-hour event included a keynote from EY Global Blockchain Leader Paul Brody, a blockchain solution demonstration, and panel discussions covering the technology, including the challenges and opportunities associated with blockchain and the broader emerging technology space. The following is TBR’s commentary on noteworthy announcements and participants’ assertions made during the event as well as EY’s overall blockchain strategy.

IBM unveils 5-year full-stack quantum road map

IBM’s quantum road map includes dynamic circuits in 2022

IBM (NYSE: IBM) spent the second half of the 2010s laying the foundation for its quantum business. This foundation predominantly focused on hardware development and hardening until IBM’s available quantum systems could support more sophisticated software capabilities. In the 2020s, IBM is now able to pivot its strategy toward more sophisticated aspects of quantum computing, mainly software and control, while maintaining a constant current of hardware innovation to support that software work.

Reminiscent of Intel’s tick-tock development cycle, in which a manufacturing-process shrink (“tick”) alternated with a new chip microarchitecture (“tock”), IBM now has sufficiently stable quantum componentry within its systems to begin working on the next evolutionary step: the creation of dynamic circuits within the next two years. Because IBM has decided to build off existing, classical programming languages, Python is at the core of IBM Quantum’s software strategy. This gives IBM access to roughly 8 million existing classical computing Python coders, who need minimal quantum-specific training to pivot into this new world of computing.

A key pillar of IBM’s quantum road map and a game-changer in scalability and speed to insight is the development of dynamic circuits, which IBM has listed as a 2022 goal on its road map. Dynamic circuits will enable quantum computation to more closely mimic classical computation in that if/then statements will become possible on quantum computing. Without dynamic circuits, quantum algorithms cannot pivot midway through a process. Therefore, one must run an algorithm through completion, analyze that data and then run another circuit based on insights gained halfway through the process. Dynamic circuits enable an algorithm to measure a qubit’s state — a 0 or a 1 — at a predetermined point in the process and react accordingly, reducing the need to rerun algorithms and reducing the time to insight as well as the volume of qubits consumed.

SOURCE: IBM
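The run-analyze-rerun contrast described above can be sketched in a toy example. This is plain Python, not a quantum SDK; the workflow names, the measurement stand-in and the submission counts are illustrative assumptions, not IBM's implementation:

```python
# Toy contrast between "static" and "dynamic" circuit execution.
# A real dynamic circuit measures a qubit mid-run and branches on the
# result; here a prepared classical bit stands in for that measurement.

def measure(prepared_bit: int) -> int:
    """Stand-in for measuring a qubit; returns the prepared 0 or 1."""
    return prepared_bit

def static_workflow(prepared_bit: int) -> tuple[str, int]:
    """Static circuits: run one circuit to completion, analyze the
    result classically, then submit a second circuit based on it.
    Total cost: two submissions."""
    submissions = 1
    outcome = measure(prepared_bit)            # first full run
    followup = "correction" if outcome == 1 else "continue"
    submissions += 1                           # second full run submitted
    return followup, submissions

def dynamic_workflow(prepared_bit: int) -> tuple[str, int]:
    """Dynamic circuits: a mid-circuit measurement feeds an if/then
    branch inside the same run. Total cost: one submission."""
    branch = "correction" if measure(prepared_bit) == 1 else "continue"
    return branch, 1
```

Both paths reach the same branch, but the dynamic version does so in a single submission, which is the reduction in reruns and time to insight the road map targets.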

Market overview: IBM’s five-year quantum road map comprises developments across the entire quantum infrastructure stack, including hardware, software, services, ecosystem and use-case-specific goals. General focus areas include an emphasis on application modules through 2022 and on application services from 2023 to 2025. Underpinning these broad goals is the systematic development of hardware, software and services capabilities, much of which hinges on the quantum ecosystem IBM has invested in and built, the foundation of which is Qiskit and the IBM Quantum Network. Cloud-based delivery is a theme across these developments, as COVID-19 has both highlighted and accelerated customers’ need and desire to consume compute capabilities via the cloud.