Demand pull and cost push: Two sides of the inflation coin

A nonfactor for decades, inflation is now parsed into demand pull and cost push

Pricing analysts across the tech industry will have to take a new look at the mix of inflationary pressures, sort those pressures by cost push and demand pull, and adjust their own business muscle memory for both short- and long-term instability stemming from rising energy costs, destabilized supply chains and increased component demand.


To excel at textbook economic analysis, you are taught to look at one variable, “all other things being equal,” or ceteris paribus. But the real world never works that way. The real world becomes particularly alarming for economists when their textbook explanations are not borne out by what is happening.


Such a situation occurred in the late 1970s. Inflation rose from 7.5% to 10.4% between Jimmy Carter’s presidency and Ronald Reagan’s, with unemployment climbing alongside it, and then-chair of the U.S. Federal Reserve Paul Volcker began raising interest rates to choke off inflation. The term “stagflation” was coined to explain the inexplicable. As we seek to understand inflation today, we have a better parsing of the multiple factors that can drive it.

Demand pull inflation: The classic definition

Demand pull inflation is classic 20th century thinking. It loosely corresponds to “too many dollars chasing too few goods.” Traditional Keynesian economic thinking asserts that this situation means the economy is overheating and that the solution is for central banks to raise interest rates to slow demand and bring it in line with current supply. Low unemployment typically accompanies this kind of economic overheating, and with the slowdown, labor market pressure likewise abates.
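The “too many dollars chasing too few goods” intuition is often formalized with the quantity equation from monetary economics (a monetarist rather than strictly Keynesian framing); a rough sketch:

```latex
% Quantity equation: money supply (M) times velocity of money (V)
% equals the price level (P) times real output (Y)
MV = PY \quad\Rightarrow\quad P = \frac{MV}{Y}
```

If the money supply M grows faster than real output Y while velocity V holds roughly steady, the price level P must rise; raising interest rates is intended to slow the growth of money and demand, pulling prices back toward what current supply can bear.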

Cost push inflation: A fancier definition for resource scarcity (and potentially disruption)

Cost push drove the high inflationary cycle of the 1970s. Oil price shocks, driven by OPEC production decisions, triggered choke points across the entire economy. Companies had little recourse but to pass along price increases, or to “push” the costs onto the consumers at a time when unemployment was high.

The current blend of demand pull and cost push

Today demand pull influences are seen more on the consumer side, with record-high savings during the pandemic shutdown, plus consumer purchases and housing prices spiking due to various government relief payments. Similarly, unemployment remains low, and certain skilled positions, especially acute in the tech sector, remain hard to fill.


Textbook moves to raise interest rates to slow the economy are being deployed to cool this economic overheating. Whether it results in a soft or hard landing of the economy on more stable ground remains open to, at times, contentious debate, given cost push inflation is also rampant.


Cost push is readily apparent in any purchases of basic staples. Energy and food are about as basic as it gets, and there are no simple solutions to those issues. Energy is a heavily regulated industry, with public support for moving away from fossil fuels mounting, especially among younger individuals. Food prices are being hit by both the energy cost increases and the supply chain disruptions resulting from the conflict in Ukraine. Neither fuel nor food disruptions are likely to abate soon, and the food issues around wheat and the attendant products manufactured from wheat are likely to worsen before they improve.


If that news is not dire enough, energy supply chain constraints do not appear to be a short-term blip on the way to resuming normal productive capacity. A Grid News article lays out in great detail the supply chain constraints — rising demand, conversion of refinery capacity to renewable diesel, and shuttering of older domestic refinery capacity during the pandemic — impacting the refinery segment.


Further, expansion of U.S. refinery capacity is unlikely given regulatory and environmental risk, as well as a fundamental shift in financial outlook. The article expects gas prices to rise over the summer, which is a normal occurrence, but it is not optimistic about an increase in U.S. refining capacity due to cost, environmental regulation, and a long-term industry outlook shaped by growing public sentiment for green alternatives.

What does this mean for the technology industry?

Within the tech sector, multiple inflationary influences apply pressure. Rising energy costs will often be remediated with temporary surcharges; Amazon indicated in the company’s last earnings call that it was considering such measures. Surcharges can work for finished goods; however, component shortages trigger demand pull inflationary pressures that carry their own blend of cost considerations.


Some cost spikes will work themselves out as supply chains react to pandemic-induced disruptions. Other cost spikes will persist until productive capacity increases, given how many more finished goods outside of the technology space itself consume embedded components in smart devices.


For infrastructure manufacturers, this means year-to-year price declines in the “faster, better, cheaper” trendline, predicated on Moore’s law for decades, should at least temporarily flatten out. TBR analysis of the silicon shortages suggests these challenges will be ameliorated in the next two to three fiscal quarters.


The other pressure and risk consideration will be the impact on long-term pricing agreements for managed services and “as a Service” subscriptions. A backlog of recurring revenue priced with lower inflation forecasts will likely apply margin pressure to developing “as a Service” revenue streams of pure play companies and independent operating units within multiline businesses. Pricing adjustments may be baked into new agreements and renewals moving forward, but the runoff of booked contractual commitments could impact profit margins. Noncurrent bookings will be a telling indicator.


Services firms basing project fees on billable hours should be reasonably well insulated given hourly rates can change in real time to adjust for rising labor and fringe expenses.


As stated, pricing analysts will have to take a new look at the mix of inflationary pressures, sort those pressures by cost push and demand pull, and adjust their own business muscle memory for the short- and long-term instability that rising energy costs, destabilized supply chains and increased component demand bring to bear on their respective market segments and the monitoring of their commercial offers.

PwC’s Industry Cloud strategy delivers on 3 major cloud trends

PwC’s ambitious Industry Cloud strategy aims directly at the heart of current trends

After spending an afternoon at PwC’s Boston Seaport offices in late March, TBR came away with a clearer picture of how the firm is centralizing its capabilities into solutions to be utilized in client engagements. It is a strategy that has been developed cautiously and thoughtfully over time, mirroring the firm’s overall evolution of the last few years, which has been both methodical and ambitious. The new Industry Cloud strategy is firmly in line with the company’s DNA but is also aligned with the most current trends in the cloud market, namely services, collaboration with partners, and industry alliances and preconfigured ecosystems.

The importance of services in cloud adoption and utilization has only increased over the last two years. The migration of mission-critical workloads and skills shortages have stoked demand for third-party firms to help implement and manage cloud solutions. PwC is tightly integrating services with all the cloud assets being deployed for the firm’s customers, which is an evolution of the long-standing Integrated Solutions program, incorporating the best of PwC’s consulting business across all platforms.

PwC’s Alliance strategy is integral to the Industry Cloud strategy, and through these collaborations, PwC is selecting well-accepted and widely adopted cloud technologies to include in the firm’s recommended cloud solution frameworks, then filling the gaps between those individual technologies. The key is not trying to reinvent the wheel with technology that already exists but using alliances to bring the leading solutions together across multiple vendors. This ties into broader PwC strategies to use automation, scale and commonalities to reduce deployment times by as much as half in some cases.

A key tenet of PwC’s strategy is also to build common cloud services that bring industry and sector-specific practices and prebuilt configurations to accelerate adoption timelines and reduce custom work. For a variety of reasons, customers are looking for diversity in their IT and cloud vendor landscapes, and PwC’s open solution frameworks cater to that desire. Lastly, industry specificity is an emergent trend in cloud. PwC is addressing the industry specialization void in the market by bringing together industry-leading technologies, tying them together with an integration fabric, and filling any gaps with its own services and innovation based on PwC’s deep experience and investments. These solutions can then enable customer business transformation spanning the front, middle and back office.

Industry customization ties the solutions together, as it reduces the need for custom services and is done in tight collaboration with cloud vendors’ technology. In this special report we detail these trends and PwC’s cloud strategy. In short, we see PwC’s strategy as well developed and aligned not only to its core DNA but also to some of the most current trends and developments occurring in the market.

Industry cloud is moving from a nice-to-have to a must-have

Enterprise maturity around horizontal cloud capabilities has resulted in a growing appetite for solution customization built around highly nuanced, industry-centric needs. This rising need will be addressed by both cloud vendors and services firms like PwC. Vendors have traditionally leveraged partnerships to add vertical functionality and go-to-market support to their solution sets, but that strategy has become even more aggressive recently, with multiple acquisitions being announced.

Oracle’s (NYSE: ORCL) intended acquisition of Cerner, Microsoft’s (Nasdaq: MSFT) purchase of Nuance, and Salesforce’s (NYSE: CRM) strong alliance with Veeva (NYSE: VEEV) are all examples of how vendors are investing to offer more industry functionality to customers. Cloud vendors are also supporting industry-based go-to-market ambitions by augmenting their approach with an increased reliance on ecosystem partners across the IT continuum.

While tech partnerships have accelerated industry-based solution design and development, evidenced by Microsoft’s partnerships with both Rockwell (NYSE: ROK) and Honeywell (Nasdaq: HON), engagement with IT services entities will be just as critical to facilitating adoption among customers with industry-fluent advisory, road-mapping and implementation support services.

Specifically, in venues like industrial manufacturing, client DNA is rooted in hardware legacy organizational models and waterfall innovation, and many clients lack not only the knowledge to support software-driven business models but also an understanding of the outcomes emerging technology — be it cloud, IoT or AI — can bring to their operations. This knowledge gap plays to the strengths of the professional services side of the IT spectrum, where innovation centers pair educational resources with business cases to provide prospective clients with an understanding of what their own digital transformation (DT) could look like.

Not only has vendor activity with industry cloud picked up, so too have financial results as end customers increase adoption of these solutions. According to TBR’s 2H21 Cloud Applications Customer Research, customers see industry cloud capabilities as value-add elements of their cloud technologies: the ability of industry cloud offerings to first meet regulatory requirements and then also match the unique business and IT workflows within certain industries are the most compelling benefits, while the ability to free up resources is the least cited.

TBR’s perspective on PwC’s alignment with industry cloud trends: ‘Micro alliance activation’

      • PwC is not “boiling the ocean” with its approach to Industry Cloud, instead focusing on heavily regulated industries as the firm looks for ways to not only meet regulatory requirements but also leverage investments to competitively differentiate itself with enhanced time to market and ongoing operational excellence. While many vendors on the technology side have taken an even more focused approach to industries, we believe PwC’s strategy is appropriate for the firm, given its partner-driven engagement focus and existing presence within the industries.
      • PwC’s approach aligns to the third most selected benefit of industry cloud: “We are in the early phase of cloud adoption and are pursuing industry cloud services as a preliminary step in the process.” Many companies are still in an early stage of their cloud adoption. Regulations are more stringent in these industries, creating real and perceived barriers to adoption. In many ways, industry cloud is the ramp these customers need to get started using cloud in more significant ways as part of their IT strategy.

Cloud partnerships are moving from important to critical

The shift to partner-led growth is not a new trend but is being further legitimized in 2022. Growth from indirect, partner-led revenue streams has been outpacing direct go-to-market efforts for several years, but indirect revenue is reaching a new level of scale and significance in the market.

TBR estimates indirect cloud revenue is approaching 25% of the total cloud market opportunity, which is a significant milestone. For reference, in traditional IT and software, indirect revenue represents somewhere between 30% and 40% of revenue streams. We expect the indirect portion of the cloud segment to surpass that level within five years, approaching half of the market opportunity within the next decade. For all cloud vendors, the combination of short-term growth and long-term scale makes partnerships an increasingly critical element of their business strategy.
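The trajectory described above can be checked with back-of-the-envelope arithmetic; a minimal sketch in Python, treating the approximate share figures from the text (25% today, roughly 50% within a decade) as assumptions:

```python
# Back-of-the-envelope check of the indirect-share trajectory described above.
# The 25% and 50% figures are the approximate shares cited in the text.
current_share = 0.25   # indirect cloud revenue as a share of the market today
target_share = 0.50    # share it could approach within the next decade
years = 10

# Implied compound annual growth rate (CAGR) of the indirect share itself;
# indirect *revenue* would grow faster still, since the overall market is also growing.
implied_cagr = (target_share / current_share) ** (1 / years) - 1
print(f"Indirect share must grow ~{implied_cagr:.1%} per year to double in {years} years")
```

In other words, the share of the market flowing through partners only needs to compound at roughly 7% annually to hit the decade-out milestone, which helps explain why the report treats the shift as structural rather than speculative.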

Partner ecosystems have been a core part of the IT business model for decades, but the developments around cloud will be different for various reasons, primarily because the labor-based, logistical tasks of traditional IT are largely unnecessary in the cloud model.

For cloud vendors and their partners to succeed in growing the cloud market, both need to be focused on enabling business value for the end customer. Traditional custom development becomes cloud solution integration. Outsourcing and hosting are less valuable, while managed services are far more valuable for cloud solutions. To capture this growing and sizeable opportunity in 2022, we expect companies will adapt their partner business models and vendor program structures to align with vibrant cloud ecosystems.

TBR’s perspective on PwC’s partner strategy

      • PwC is being proactive in how it leverages alliances, recognizing that winners in industry cloud rely on alliances and that the industry data model is only as good as the ISV solutions that run on top. Within PwC, these relationships are supported by joint business relationships and alliance groups with front-office, middle-office and back-office players, as well as the cloud service providers (CSPs) that go to market with PwC as part of the Journeys model. PwC is being selective about the vendors and technologies it recommends, focusing on leading providers like Amazon Web Services (AWS) (Nasdaq: AMZN) and Microsoft to both offer the most widely used solutions and simplify its alliances.
      • By combining the IaaS and SaaS capabilities of alliances with its own products and accelerators, PwC enables integration points in a platform-like approach. While not a PaaS offering in itself, PwC’s Common Cloud Services Platform, which targets custom Journeys for a specific industry in an end-to-end fashion, should create a high degree of stickiness.
      • PwC is emulating some best practices of its alliances, including the leading CSPs and ERP vendors. Further, some of ServiceNow’s success stems from selective innovation and deciding early on where it wishes to develop versus leveraging partners. PwC takes a similar approach, focusing custom development investments on whitespace markets while layering the capabilities of its partners on top of new solutions.
      • One of the most notable obstacles facing PwC is a degree of competitive overlap between PwC and cloud vendors it has collaborated with that are similarly working with industry consortiums to stitch together end-to-end systems. Where PwC stands to benefit in this regard is through its roots as a services firm; unlike some of the product-first competitors overlapping with the Industry Cloud strategy, PwC is going to market first with tech-enabled services that can then get clients exposed to products.

Traditional designations are morphing as value moves to IP development and managed services

In the traditional IT partner model, the business models of partners — such as reseller, systems integrator and ISV — were used to segment partner programs. Cloud has disrupted the traditional model, with born-in-the-cloud partners competing in various activities to optimize their revenue streams and traditional partners expanding their business models to sustain their financials.

As a result, resellers can develop their own solutions and IP, while systems integrators sell and resell their own software solutions and ISVs offer their own managed services. It is common for partners to have multiple business models, making the traditional designations too restrictive.

The other area of strong demand from customers, driving enhanced focus from cloud vendors, is in managed services. Increased cloud adoption has led to higher cloud complexity for many customers, leading to more challenging tasks to provide ongoing administration, integration and operations of the environment. This increasing complexity coincides with a historic shortage of personnel with cloud expertise, driving demand for managed service offerings from third-party providers to fill the gap. As a result, we expect managed services to be the fastest-growing segment of the cloud professional services market, reaching $75 billion by 2026.

Cloud vendors like AWS, Google (Nasdaq: GOOGL) and Microsoft have a vested interest in nurturing their managed service ecosystems to facilitate new investments from their cloud customers. Considering these trends and the likely erosion of legacy services lines by software and managed services, it is critical for consulting-led firms to diversify with serviceable assets that go beyond the underlying modules. While some of its Big Four competitors are similarly recognizing this trend, PwC appears to have caught on to the fact that software and services require vastly different sales models and dedicated teams for successful execution.

With Industry Cloud, PwC serves as consultant, ISV and managed service provider

Using the term Journeys is an apt description of how PwC intends to engage with customers around these solutions. It is not just a cloud technology implementation; there is upfront design and consulting, implementation of both off-the-shelf cloud technology and custom PwC IP to align solutions to industry, and finally provision of managed services to simplify ongoing operations. That is a lot of activity, but it reflects what customers need and want from these types of implementations. It is taking PwC beyond traditional services and value propositions with clients, but it aligns with where customers and the market are heading.

While the framework for Industry Cloud is compelling, it will no doubt be a challenge to execute on the vision. Expanding beyond traditional consulting business roles and activities and maintaining cohesiveness can be challenging, but as we have seen in recent years, PwC has been quite adept at reinventing itself, so we expect the firm to overcome these challenges. Alliance management, cloud service development, packaging and pricing are all competencies being developed within PwC to execute on more Industry Cloud opportunities.


How Informatica uses the cloud to empower a data-driven enterprise

Overview

Setting the stage for what ended up being the primary theme at Informatica World 2022 — Data is your platform — Informatica CEO Amit Walia walked attendees through two emerging trends: the importance of scalable infrastructure through cloud computing, and how AI and machine learning (ML) are no longer just about automating processes but also about enabling predictive intelligence. These trends, while well recognized in theory, are more challenging for businesses to put into practice, particularly due to the proliferation of data and the number of users looking to access said data, including both technical and nontechnical personas.

Informatica’s solution to data complexity is rooted in one of the company’s core values — platform centricity — but the move to essentially replace Intelligent Data Platform with the Intelligent Data Management Cloud (IDMC), after years of innovation and a slight disruption from COVID-19, is now taking Informatica’s approach to data management and integration to new heights. With IDMC in the cloud, Informatica is better positioning itself to help clients translate data into valuable insights at a level that cannot be realized on premises.

In addition to being cloud-native, IDMC is infused with AI, addressing the other emerging trend called out by Walia — the need for AI-powered intelligence. All Informatica capabilities are built on CLAIRE, an automated metadata engine that processes 32 trillion transactions per month, and tie back into IDMC. While the ROI for AI technology is still hard to justify for many businesses, another key factor in the low adoption of the technology is that many businesses are working with complex, siloed data, which means AI models could fall short and lead to inaccuracies.

CLAIRE is designed to address a range of operational, runtime and automation use cases — from auto-scaling to anomaly detection — and acts as a wrapper around IDMC to enable fully automated data management and governance processes. By bringing the power of cloud and AI into one integrated platform, Informatica uses IDMC to help customers focus on the only thing they truly own in the cloud: their data. The result of a $1 billion, six-year investment, IDMC consists of seven core modules, with its value proposition largely stemming from its modularity and the ability to allow customers to pick and choose capabilities and services based on their industry, business and use case.

Informatica expands platform capabilities, driving additional value for its comprehensive, cloud-native solution

New innovations emphasize uniting IT and business functions to improve efficiency

With IDMC, Informatica has solidified its platform approach, but as cited by various customers, the company’s ability to continually offer new capabilities is what drives additional value, by addressing more horizontal and vertical use cases in the data life cycle. Perhaps the most notable announcement at Informatica World 2022, which seemed to garner particular excitement from product leaders and customers, was the general availability of Informatica Data Loader. Jitesh Ghai, Informatica’s chief product officer, led a demo of Data Loader, which is a free, self-service tool that ingests data from over 30 out-of-the-box systems into Google Cloud’s popular data warehouse solution, BigQuery.

As part of the demo, we saw a scenario play out in which a marketing analyst needs access to more data to effectively run a campaign. The hypothetical marketing analyst accesses the Integration module within IDMC, pulls data from Marketo, and uses a drop-down tool to select BigQuery as the destination, loading the data in only a few steps. This integration could end up acting as a time-saver for large organizations and speaks to the innovative ways Informatica is getting data into the hands of line-of-business teams.

At the event, Informatica also announced INFACore, which targets more technical users, such as data scientists and engineers, allowing them to clean and manage data in a single function. As a low-code plug-in for popular frameworks, such as Jupyter notebooks, INFACore is designed to improve the productivity of the technical user, but naturally this productivity trickles up to business functions. For instance, after using INFACore to cleanse data through a single function, the data scientist can publish a clean data set to the Informatica Marketplace, where other teams within an organization can access it.

Another key innovation called out in opening talks with Ghai was ModelServe, which allows users to upload, monitor and manage ML models within their Informatica data pipelines. There are many ML models in production, but businesses are still looking for ways to scale them from an operational perspective. In talks with more than one customer at the event, the common interface within IDMC came up as a value-add when attempting to scale a data team, suggesting customers are awaiting ModelServe’s general availability as it will allow users to register and manage ML models directly within IDMC.

Informatica strengthens SaaS portfolio, building in intelligence from the data model up

While Informatica’s platform capabilities get much of the market’s attention, the company also has a broad portfolio of IDMC-enabled SaaS offerings, which play a key role in the data management journey, complementing warehousing, integration and automation. As a native service within Informatica’s Master Data Management (MDM) solution, 360 applications act as a gateway for transforming customer experience in the cloud, something we saw in action through the product demo of Supplier 360 SaaS.

Through IDMC, CLAIRE recognized a defective part from a supplier of a hypothetical company, and teams were able to use Supplier 360 SaaS to identify which customers were impacted by the faulty part and automatically notify customer service so they can launch a refund program to keep customers satisfied. Informatica also released various industry and domain extensions for its 360 applications and will continue to offer new packaged offerings available in a SaaS model, providing customers more ways to onboard and manage data.

Joining the industry cloud bandwagon, Informatica verticalizes IDMC

It is no secret that industry specialization is re-emerging as a leading trend in the cloud space, as a maturing enterprise customer base demands solutions that suit their unique IT and business processes. During the event, Informatica unveiled new IDMC customizations for financial services, healthcare and life sciences. These three offerings join IDMC for Retail in Informatica’s industry cloud portfolio to further address demand for purpose-built solutions that will limit the need for customization.

Findings from TBR’s Cloud Infrastructure & Platforms Customer Research continue to indicate that some enterprises are wary of industry cloud solutions, dismissing them as marketing ploys. Other enterprises, however, find them worth evaluating. For instance, in talks with a representative from a hedge fund, we found that the company initially chose a competing MDM solution because it specialized in asset management with its own specific data dictionary but was torn as it viewed Informatica’s MDM as ahead of the competition in terms of capabilities. We can expect Informatica to expand in other industries, including specific subverticals, with additional data models, custom user interfaces and data quality rules to appeal to these customers.

Continued integrations and go-to-market synergies with hyperscalers help Informatica maintain data neutrality

For a company that markets itself as the “Switzerland of data,” Informatica’s ability to make its offerings accessible across leading cloud platforms is critical. Partnering across the cloud landscape is no longer a differentiator; it is a necessity and something customers clearly find value in as they gravitate toward multicloud environments. During the event, Walia welcomed several partner executives both in person and virtually to discuss new joint offerings and go-to-market synergies the company is forming with cloud service providers to deliver more choice and flexibility for joint clients.

      • The ubiquity of Microsoft’s cloud portfolio allows Informatica to provide clients a unified data architecture. Informatica and Microsoft (Nasdaq: MSFT) have a well-established relationship, which at its core is focused on migrating data warehouses to the cloud but is evolving and making Informatica relevant across the Microsoft Cloud stack, including Azure, Power Platform and 365 applications. For example, Informatica is typically well known for its integration with Azure Synapse, but the company also integrates with the Dynamics 365 SaaS data model to enable Customer 360 analytics. Expanding its presence throughout the Microsoft Cloud stack, Informatica announced MDM on Azure. With this announcement, customers can deploy MDM as a SaaS offering on Azure via the Azure Marketplace, which could appeal to the large number of Microsoft shops looking to enhance their Azure Data Lakes with a feature-rich MDM solution. Both companies also launched Informatica Data Governance with Power BI, which, as highlighted by Scott Guthrie, EVP of Cloud and AI at Microsoft, brings Informatica’s data catalog scanners to Power BI, allowing customers to have a single view of their data processes from ingestion to consumption. This offering could serve as a more strategic way for customers to modernize their analytics workloads through Azure.
      • Given their respective strengths in data analytics and data management, Google Cloud and Informatica are complementary partners. The Google Cloud-Informatica relationship took a major step forward with the launch of Informatica Data Loader, which could expand client usage of BigQuery and help Google Cloud (Nasdaq: GOOGL) address a wider set of customer needs, including those outside the IT department. In TBR’s own discussions with enterprise buyers, BigQuery is often cited as a leading solution due to its ability to handle petabytes of data at a favorable price point. Walia reaffirmed this notion in discussions with two customers, ADT and Telus, both of which are migrating legacy data warehouses and/or front-end ETL (extract, transform, load) capabilities into their BigQuery instances and using IDMC for cloud-based data management.
      • Oracle awards Informatica preferred partner status for data integration. Informatica and Oracle (NYSE: ORCL) struck a new partnership agreement that offers IDMC on Oracle Cloud Infrastructure (OCI). Addressing the large number of customers running legacy Oracle databases and potentially those that are also deploying on-premises Informatica products, IDMC on OCI provides customers an integrated gateway to the cloud by enabling back-end connections with Oracle Autonomous Database, Exadata Database Service and OCI Object Storage. For example, with IDMC on OCI, customers can import data from legacy Oracle E-Business Suite applications into Autonomous Database and connect to other data sources, such as Azure SQL or Amazon Redshift, through IDMC. As a preferred Oracle partner, Informatica will recommend customers use IDMC with Oracle’s cloud services. Oracle’s EVP of database server technologies, Andy Mendelsohn, walked through numerous incentives to assist customers’ cloud migrations, such as Bring Your Own License, Informatica Migration Factory and Oracle Cloud Lift Services.

Informatica also has close relationships with Amazon Web Services (AWS) (Nasdaq: AMZN), Snowflake (NYSE: SNOW) and Databricks, all of which are expanding their commitments to Informatica to help customers look beyond ETL and handle data in an end-to-end fashion. Given that Informatica offers analytics, integration, automation, governance and management capabilities across leading clouds, the company naturally runs up against a high degree of competitive overlap with its partners, which offer similar native tooling as part of a customer’s environment.

However, in talks with customers, the general perception seems to be that the hyperscalers’ capabilities are still relatively immature and that there is also significant value in deploying a vendor-neutral platform like IDMC to avoid vendor lock-in and address the training and skill challenges typically associated with a multicloud environment. While we can expect the hyperscalers to enhance their capabilities, at the end of the day, the primary goal for AWS, Microsoft and Google Cloud is to win compute, so the benefits of partnering with Informatica to capture legacy platform-layer workloads outweigh the downsides of coopetition.

Conclusion

With IDMC, Informatica has fostered a value proposition catered to three core areas: platform-centricity, connecting IT and business ecosystems, and infrastructure agnosticism. The numerous announcements made at Informatica World 2022 show the data management company is building on these strategic pillars by better aligning with cutting-edge trends in the cloud industry, such as industry customization, out-of-the-box integrations and data democratization. With these enhancements in place, along with close partnerships across the IaaS ecosystem, Informatica is positioning itself favorably to assist clients with the large number of on-premises workloads ready to be migrated and modernized in the cloud while enabling the cloud-native enterprise to transition from digital to data-driven.



Market and competitive intelligence straight to your inbox each month, absolutely FREE

Subscribe to Insights Flight today!

Drawing on its partner network and Red Hat’s open posture, IBM enables full-stack transformation

TBR attended IBM Think in a virtual format for the third consecutive year, and this time around we sensed a new IBM. No longer beholden to its low-margin managed infrastructure services business, IBM is emerging as a more agile, streamlined and focused organization, especially as it looks to lead the digital revolution through two overarching areas: getting customers to embrace a hybrid architecture and helping them unlock data-driven insights through AI.

This strategic pivot was driven home not only by high-level executives, including CEO Arvind Krishna himself in an exclusive Q&A session with the analyst community, but also through the various partnership announcements, service launches and upskilling programs unveiled over the course of the interactive, two-day event.

Through Red Hat, Software and Consulting, IBM has created an end-to-end approach to unlocking hybrid cloud’s value

Closing in on the three-year anniversary of its acquisition of Red Hat, IBM (NYSE: IBM) continues to execute on its hybrid cloud vision, offering the services and software needed to integrate and orchestrate enterprise workloads across multiple environments. With the exception of some mono-cloud and data center-only customers, enterprises are largely heterogeneous in how they consume IT, drawing on multiple architectures, vendors and environments.

Considering IBM’s large legacy software install base and ties to the mainframe, this trend bodes well for the company as it can leverage Red Hat OpenShift — which now has roughly four times the number of customers it had prior to the acquisition — to unlock siloed data and extend it to any public cloud. The challenge, however, as articulated by Roger Premo, general manager, corporate development and strategy, is that getting greenfield applications to the cloud is only Step 1 in achieving a scalable hybrid cloud framework, yet the amount of time, level of skills needed and executive-level pushback are some of the factors that keep enterprises from expanding on their lift-and-shift investments.

 

Hoping to advance customers through the containerization, operational change and replatforming phases of hybrid cloud adoption, IBM is revamping its go-to-market model, closely aligning the Software and Consulting business units to address customer needs end to end. For instance, IBM Consulting is invested in the technology behind IBM’s hybrid cloud and AI vision, providing clients the tools needed to provision their own hybrid environments, which, as phases of adoption become more complicated, will naturally pull in more automation, observability and AI assets, as well as additional advisory assistance to help determine which clouds are best suited to which workloads.

Specifically, Premo highlighted the data fabric, which has grown synonymous with IBM Cloud Pak for Data, as one of the technology pieces underpinning IBM Consulting’s value proposition for building and modernizing applications in a hybrid cloud environment. While IBM is still committed to supporting legacy data warehouses and on-premises databases, the company is likely encouraging customers to adopt the data fabric for integrated capabilities that help simplify data management, such as cataloging and automated governance. Essentially an ecosystem of data powered by active metadata, IBM’s data fabric allows various AI offerings, from decision intelligence to machine learning, to run in any environment, while maintaining a common, governed framework.

IBM’s partner strategy continues to evolve post-Red Hat

IBM has always prided itself on having a broad partner ecosystem but appears to be taking a page out of Red Hat’s playbook by creating a more open position in how it goes to market. For instance, as a full-stack vendor specializing in infrastructure, platform software and professional services, IBM naturally runs up against competition in many areas but appears more willing to risk coopetition to do what is in the best interests of the customer.

TBR notes this is a stark contrast to the SoftLayer days, when IBM seemed more concerned with protecting its direct business interests. Today, Big Blue is absorbing more of Red Hat’s operational best practices and is investing in dedicated teams across the ecosystem, including niche ISVs, hyperscalers, global systems integrators (GSIs), advisory firms and monolithic SaaS companies. At the same time, preserving Red Hat’s independence remains equally important, and as Premo indicates, the relationship between IBM and Red Hat is asymmetrical in that IBM is biased toward Red Hat but Red Hat is not biased toward IBM.

 

IBM inks strategic partner agreement with AWS to scale ‘as a Service’ software

In one of the more newsworthy announcements at IBM Think Digital 2022, IBM unveiled it is working with Amazon Web Services (AWS) (Nasdaq: AMZN) as part of a multiyear agreement that brings the IBM Software portfolio, delivered “as a Service,” to AWS’ cloud infrastructure. Customers can now take advantage of the popular click-to-buy experience on the AWS Marketplace to run IBM data and automation assets, including Db2, API Connect and Watson Orchestrate, among others, in an AWS environment. This partnership announcement is a testament to the major strategy shift IBM made three years ago when it acquired Red Hat and standardized on the OpenShift platform, which, being based on Linux and containers, makes the platform and subsequent IBM software applicable on any infrastructure, including AWS.

This platform approach is also providing IBM the flexibility to adapt alongside changing customer buying habits, including a shift toward cloud managed services, which is the fastest-growing usage of OpenShift and prompted the launch of Red Hat OpenShift on AWS (ROSA) at last year’s Red Hat Summit. Customers looking to offload operations to site reliability engineers (SREs) will be able to deploy IBM SaaS offerings integrated with ROSA as a managed service, although IBM is continuing to support customers looking to protect their capex investments as there are over 30 IBM licensed software offerings available on the AWS Marketplace. Expanding service availability is only one part of the partner agreement as IBM indicates it will work with AWS in other areas, including co-selling and co-marketing initiatives that could engage AWS sales teams and help IBM further tap into AWS’ expansive customer base.

 

Strategically, IBM is staying the course, leveraging Red Hat’s neutral status and integrations with hyperscalers to sell more software and attached services. Offering IBM SaaS on AWS is a strategic move as it will allow IBM to address customers that have years of experience running IBM software but want the scale of AWS’ cloud infrastructure, which TBR interprets as IBM prioritizing partner clouds at the expense of its own so it can focus solely on OpenShift and Software. Further, as IBM looks to grow its software business, particularly through the monetization of “as a Service” models built on OpenShift, leveraging partner marketplaces will be key, especially considering IBM lacks marketplace capabilities at scale and IT procurement continues to rally around the digital catalogs of AWS, Microsoft (Nasdaq: MSFT) and Google Cloud (Nasdaq: GOOGL).

 

Use of RISE with SAP internally aligns with IBM’s vision to bring legacy ERP to the hybrid cloud

IBM joined the roster of 1,000-plus RISE with SAP customers, announcing it is migrating to SAP Business Suite 4 HANA (S/4HANA) to streamline business operations across its Software, Infrastructure and Consulting units. This announcement comes just months after IBM unveiled a new supplier option via the BREAKTHROUGH with IBM for RISE with SAP program, which enables customers to bundle professional services with IBM IaaS offerings as part of a unified contract and set of service-level agreements (SLAs).

IBM’s new migration project will leverage the premium supplier option and bring over 375 terabytes of on-premises data to IBM Power on Red Hat Enterprise Linux (RHEL) on IBM Cloud. While IBM is partnering with GSIs in many areas, SAP (NYSE: SAP) implementations are likely an area where competition between IBM and its peers is fiercest, especially as the end-of-life deadline for legacy SAP R/3 approaches. However, the premium supplier option paired with IBM’s over 38,000 trained SAP consultants could help the company better tap into SAP’s base of over 30,000 on-premises ERP customers and challenge the likes of Accenture (NYSE: ACN) and Deloitte.




One of tech’s largest acquisitions will place VMware as strategic and financial centerpiece of Broadcom Software

Broadcom will position VMware at forefront of its software strategy

On May 26, Broadcom (Nasdaq: AVGO) agreed to purchase VMware (NYSE: VMW) at an enterprise value of $69 billion, making it one of the largest tech acquisitions in history. While Broadcom is no stranger to software acquisitions, this transaction will be its most transformative as VMware becomes both the brand and growth driver behind Broadcom Software. If the transaction closes, the new Broadcom will find itself evenly balanced between its semiconductor and infrastructure software businesses. After market close on the day of the announcement, investors on each side of the transaction viewed the proposed deal favorably, signaling shareholders’ confidence in management’s ability to use past experiences to generate free cash flow through the integration of the two companies, bolstered by VMware’s cost structure and pervasive role in enterprise IT.

Should the deal close, VMware will be led by Broadcom Software Group’s current president, Tom Krause, who has a financial background and will report to Broadcom CEO Hock Tan. As with past acquisitions, Broadcom’s primary goal will be to improve profitability through cost synergies, mostly related to redundant headcount. While margins will certainly benefit, VMware’s innovative agenda, spearheaded by Pat Gelsinger and since adopted by current CEO Raghu Raghuram, hangs in the balance, with the outcome dependent upon Broadcom’s desire to drive synergies with VMware in both R&D and go to market. If Broadcom’s acquisitions of CA Technologies and Symantec are any indication, VMware’s future in the cloud and at the edge may be muted. But it is still early days, and commentary from Broadcom management suggests a different course of action relative to past acquisitions with a strong intent to invest in VMware’s core software-defined data center (SDDC) stack.

A deal could bring VMware back to its data center roots

Since the 2016 launch of VMware Cloud Foundation (VCF), VMware has insisted on making its trusted virtualization software relevant beyond data center walls by delivering native, turnkey solutions with all major cloud service providers (CSPs). The rise of cloud-native development through containers and Kubernetes has presented VMware customers with an alternate route to the public cloud, but the 2019 acquisition of Pivotal and resulting Tanzu portfolio — while still built and delivered via ESXi — allowed VMware to position as a complement to containers, rather than a competitive threat.

Often still defined as the company that pioneered enterprise virtualization, VMware has proven its ability to adapt over the past two decades alongside market trends, including cloud computing and containerization, both of which have accelerated VMware’s transition to a Subscription & SaaS company, with related revenue comprising 29% of total business in 1Q22. Broadcom plans to upsell Subscription & SaaS alternatives to legacy customers, including those demanding “as a Service” software inside the data center.

However, given the growth in Broadcom’s software business stems from mainframe customers, we cannot help but wonder if VMware’s push to the cloud will be stalled should the deal close. From a cost perspective, customers may be less incentivized to move their VMware workloads to the cloud, and instead could containerize applications to avoid incurring the cost of VMware or could simply keep their VMware applications on premises, which would erode some cross-selling opportunities for Broadcom. Further, given Broadcom’s focus on revenue-rich products, we can expect diminished focus on the Tanzu initiative, which could bring VMware further back to its data center roots and, in a worst-case scenario, put it back at war with the hyperscalers, as was seen in the early days of EMC.

With VMware’s success hinging on partners, Broadcom cannot afford to decelerate partner investment

Historically, Broadcom’s corporate sales model has been largely direct, but considering the scale of VMware’s partner network, the pivot toward indirect sales motions is inevitable, especially as Broadcom looks to build out a $20 billion software enterprise. Management indicated it will sell directly into 1,500 core accounts while likely providing hands-on professional and support services to these customers, which Broadcom chalks up to a simplification of its overall business model. This suggests, however, that there will be over 300,000 vSphere adopters still left in the hands of partners — and given Broadcom’s lack of comparative experience navigating channel relationships, the company will be most successful if it lets VMware go to market independently while preserving its relationships with strategic resellers, especially Dell Technologies, which is responsible for roughly one-third of VMware’s revenue.

Further, despite a thin R&D budget, Broadcom will still deliver new product integrations with VMware, which could present opportunities for distributors, VARs and potentially ISVs looking to integrate and package their solutions with VMware and Broadcom. However, management has been unclear regarding acquisition synergies, suggesting opportunities could be minimal, and except for some OEMs potentially hoping Broadcom will help level the playing field, partners are likely concerned.

This is particularly true because, prior to the announcement, VMware was in the middle of overhauling its partner program, promising to improve co-selling motions between direct sales teams and VARs, in addition to investing in digital and automation technologies designed to lower implementation costs and improve partner profitability. With Broadcom’s cost structure in place, investments in VMware resources and training programs for partners could decrease, which, when combined with the higher prices we can expect for VMware products, will present a challenge for partners across the spectrum.

For Broadcom, it is all about profitability

The proposed acquisition can be viewed as another one of Broadcom’s attempts to diversify its hardware portfolio through high-margin software, and with VMware, Broadcom will use redundant costs and license prices as levers for margin expansion. Profit growth will have to come in the form of cost consolidation as VMware’s top line decelerates, especially while profitable software maintenance revenue streams erode with customers transitioning from licenses to subscriptions. For context, in 2021 VMware’s SG&A costs accounted for 40% of revenue, a high percentage relative to peers, leaving room for Broadcom to offload redundant resources, particularly in back-office positions.

Meanwhile, as Broadcom prioritizes margins at the expense of top-line growth, at least in the near term, we can expect the sales and marketing line to be impacted, with Broadcom making use of its existing sales teams and channel distribution partners to sell into existing strategic accounts. R&D is perhaps the biggest question mark weighing on the pro forma company, which we expect will require a minimum 15% reduction in spend to meet EBITDA targets, when applying the S&M and G&A estimates shown in Figure 1. The R&D budget will undoubtedly be cut, but the degree depends on the level of “central engineering” synergies Broadcom is willing to form with VMware to deliver new products, with at least basic CI/CD (continuous integration/continuous delivery) procedures in place.

By leveraging VMware’s relationships with the cloud providers, specifically Amazon Web Services (AWS) (Nasdaq: AMZN), new product synergies could be formed without significant R&D investment. However, this would still require a commitment from Broadcom to invest in the VMware portfolio beyond SDDC, which does not appear to be on the company’s radar. This stance could also impact existing offerings like SASE and Project Monterey, which happens to align with Broadcom’s gradual shift away from x86 architectures. This is especially true as Broadcom figures out where its existing software portfolio, which already has plays in security, infrastructure management and FC SAN (Fibre Channel storage area network), overlaps with VMware’s.
Figure 1: Broadcom Software acquires VMware
At the end of the day, cost actions will run through the income statement over the next three years in a way that gets Broadcom to $8.5 billion in pro forma adjusted EBITDA. Currently estimated at $4.7 billion for FY22, Broadcom would need to grow adjusted EBITDA by a 22% CAGR to achieve this goal, resulting in a drastic operational change for VMware and potentially a loss of momentum outside vSphere, vSAN, NSX and the vRealize suite, which may not have an impact on near-term results but certainly risks VMware’s long-term attractiveness.
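The 22% figure cited above is straightforward to verify. The short Python sketch below is our own back-of-the-envelope check, not from the report; the `cagr` helper name is illustrative.

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate required to move from start to end over years."""
    return (end / start) ** (1 / years) - 1

# Report's figures: ~$4.7B FY22 adjusted EBITDA, $8.5B pro forma target in three years
required = cagr(4.7, 8.5, 3)
print(f"{required:.1%}")  # prints 21.8%, consistent with the ~22% CAGR cited
```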

Rival bid seems unlikely despite go-shop provision

While the premium pledged by Broadcom in its bid for VMware is likely to ward off most, if not all, potential rival bids, the current agreement contains a 40-day go-shop provision that allows VMware to explore other buyers. Ultimately, any potential bidder would need to have a significant amount of capital ready to be utilized and be willing to push VMware’s valuation further. Given their respective sizes, a hyperscaler is the most likely candidate, with AWS top of mind considering its strategic reseller and product alliance with VMware.

However, TBR believes this is still unlikely, and if any of the cloud providers were to buy VMware, it would be widely perceived as an attempt to buy IaaS revenue. Further, we believe that the cloud providers, while some are more prone to locking in customers than others, generally respect VMware’s neutral position in the market and are cognizant of the fact that owning VMware could create a host of challenges for customers. It is also plausible some of the hardware vendors would like to get in on the deal, but OEMs could be skeptical following last year’s spinoff by Dell Technologies.

TBR takeaway

Considering Broadcom’s aggressive profit targets and its history of running acquired software businesses, customers, partners and employees appear to share concern regarding what will become of Broadcom Software should the deal close. With cost reductions bound to occur across business functions, including R&D, the resulting lack of investment raises questions as to how VMware will remain competitive in markets beyond traditional virtualization.

However, Broadcom management has also indicated that VMware will not operate like Symantec and CA Technologies, given its unique market position — and if VMware can leverage R&D to drive new product synergies, the company could at a minimum maintain its trajectory of mid-single-digit growth. VMware’s well-established relationships with channel partners will also help Broadcom establish a large software empire, but this would be contingent on the company’s willingness to invest in less profitable, yet emerging business units, with the final decision coming down to whether management believes the initiative will be accretive to free cash flow.



Past is prologue for EY and the blockchain ecosystem

Gathering in person again for the first time since 2019, EY hosted around 200 blockchain enthusiasts for a full day of presentations, panel discussions and deep dives into the technologies, business use cases and ongoing challenges around the entire blockchain ecosystem, from cryptocurrencies to decentralized autonomous organizations (DAOs) to smart contracts. TBR attended EY Blockchain Summit both in person and virtually and spoke with EY leaders, EY clients, and entrepreneurs using the event to better understand blockchain. Following the in-person event, EY held virtual sessions for three additional days, tailored to practitioners and focused on specific use cases and technologies.

Evolving public blockchain for the masses to enterprise-ready solutions positions EY among the key ecosystem enablers

At every EY Blockchain Summit, TBR has been bowled over by the vision, clarity and passion EY brings and the diverse perspectives and commercial opportunities discussed both freely and critically. No good idea goes unspoken, and no questionable idea passes unscathed. In all these aspects, the May 17 summit in New York City — a welcome return to in-person gatherings — echoed previous summits, including an opening presentation by Paul Brody, EY’s unique blockchain proselytizer (and the firm’s global blockchain leader).

The overarching theme, in contrast to past events, centered on unlocking enterprise use cases, with EY facilitating adoption and adequately addressing privacy on the public blockchain. While last year’s summit featured extensive examinations of cryptocurrencies, central bank digital currencies and decentralized finance (DeFi), Brody and EY kept this year’s focus on getting to scaled adoption of blockchain such that blockchains do for business ecosystems what ERP did for the enterprise.

Numerous presenters and panelists took the discussion far afield, into questions such as the future of the dollar and the value of decentralized autonomous organizations, but Brody and his EY colleagues consistently presented a firm with the right strategy, investments, tool sets, alliances and leadership to act as a good shepherd for blockchain, advising clients on adoption and helping to shape a sustained push to Ethereum as the dominant ecosystem platform.

In TBR’s view, unrestrained passion for blockchain, bolstered by R&D investments (see below) and combined with a Big Four mentality around risk, compliance and consulting for large-scale enterprises, will continue to differentiate EY from peers, a separation that will become financially significant should Brody’s optimistic projections for blockchain’s revenue potential play out.

EY plus Polygon Nightfall makes Ethereum enterprise ready

Brody’s opening monologue covered the vast blockchain space, including three “killer apps”: cryptocurrencies, DeFi and DAOs, and predicted exponential growth for blockchain over the next 15 years. He hammered home the dominance of the Ethereum platform, which he described as “demonstrating all the process maturity you would expect from essential infrastructure.” And he described non-fungible tokens (NFTs) as one of the “most mature use cases” and heading for “mainstream adoption.” In this constantly changing space, Brody centered EY’s value on helping enterprises build, run and manage secure business processes on the Ethereum blockchain. To make EY’s case, Brody helpfully provided his firm’s “secret plan for world domination” and its four component parts — essentially, advise, build, enable, and manage (tax included).

Circling back to a theme that has surfaced repeatedly at these blockchain summits, Brody said that EY understands enterprises will move to public blockchains when they are assured of privacy — not anonymity — and that the firm has worked to make that privacy possible through a partnership with Polygon Nightfall, a “privacy-centric Layer 2 network built on technology developed by EY teams and placed in the public domain.” TBR cannot assess the technological aspects of Polygon Nightfall, but two critical elements stand out from Brody’s presentation of it: First, EY dedicated people and money toward developing the technology, likely included as part of the firm’s planned $200 million in blockchain R&D spend in 2021, up from $100 million the prior year. Second, the firm released the technology into the public domain, demonstrably committing to public blockchains and EY’s role as a positive force in the ecosystem. Critically, Polygon Nightfall neatly complements EY’s existing blockchain solutions EY OpsChain and EY Blockchain Analyzer, which Brody explained the firm had expanded in the last year.

    • EY OpsChain, which notarizes documents, tokenizes assets, mints NFTs, traces raw materials and manages procurement, had a full production launch for traceability and a beta launch for application programming interface (API) services and inventory management. The latter two are critical to connecting networks and enabling the shift to smart supply chains, tying back to Brody’s suggestion that blockchain will be the ERP equivalent change agent for business networks.
    • EY Blockchain Analyzer, previously only available to EY audit clients, has been opened to non-audit clients, broadening the reach of EY blockchain software with an eye toward the 35x investment yield Brody stressed happens as emerging technologies move into early and later majority adoption over a 15-year period. The product, which reconciles transactions, tests smart contracts and calculates capital gains, has added functionality for reviewing and more options for testing smart contracts (see below). Users can now create, save and share custom tests.


Brody netted out the two suites as covering the essentials of every asset, business process and industry, with every transaction consisting, in his words, of “money, stuff, swap, subject to agreement.”

Vision, execution, results: EY’s track record in blockchain has yet to be challenged

Brody’s opening tour d’horizon highlighted the biggest blockchain trends and EY’s latest developments while also, in TBR’s view, subtly understating EY’s core value to its blockchain clients and the blockchain ecosystem. The firm’s investments include R&D and people — not just the techies capable of developing solutions like Blockchain Analyzer and the rest but also the consultants who can explain the business value and the tax, audit and risk experts who can help clients understand the effects of blockchain on their enterprise. The tool sets, which may be the most underrated but critical aspect of EY’s approach, demonstrate EY goes beyond just hyping, advising and implementing others’ technologies and into developing its own solutions and putting the EY brand — trusted, humans at the center — behind those solutions. A yearslong effort, these tools, along with the people, institutional knowledge and stress-tested capabilities, cannot be easily replicated by competitors. In essence, EY brings consulting and trusted technology into a space littered with hype and opportunities.

We cannot help but repeat what we said one year ago: “But trust, along with translating government intentions to trackable compliance checks, will remain the last bastion of business value in an otherwise commoditized state of the technology industry as we will come to know it as more legacy players fall victim to creative destruction and Moore’s Law Economics. EY, and more specifically, Brody, has a more clear line of sight on how public blockchain networks will evolve on par with the way the public internet evolved than anyone in the technology industry today. It would be foolish to bet against them and wise to partner with them.”

TBR did note the seeming absence of at least one of EY’s traditional blockchain partners, indicating the firm’s maturity in this space may be outpacing previously strategic partners, a development TBR will watch closely over the remainder of 2022. After Brody’s opening, the next round of presentations and panels dove deeper into specific themes and challenges in the blockchain space. Everyone — academics, bitcoin bros, bankers and solarpunks — buys into Brody’s assertion that $1 in blockchain revenue today will be $36 to $40 in blockchain revenue in 15 years.
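Brody’s $1-to-$36–$40 projection implies a specific annual growth rate, which is easy to sanity-check. The sketch below is our own illustration of the implied math, not EY’s model; the `implied_cagr` name is ours.

```python
def implied_cagr(multiple: float, years: int) -> float:
    """Annual growth rate implied by an end-value multiple over a given horizon."""
    return multiple ** (1 / years) - 1

# $1 of blockchain revenue today -> $36 to $40 in 15 years (Brody's projection)
low, high = implied_cagr(36, 15), implied_cagr(40, 15)
print(f"{low:.0%} to {high:.0%} per year")  # roughly 27% to 28% annually
```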

Smart contracts are proven use cases, helping EY scale up its blockchain portfolio

In addition to the morning plenary, TBR attended an afternoon session on testing the functionality of smart contracts on EY’s Blockchain Analyzer. The presentation and demonstration, led by Sam Davies, EY global blockchain platform lead and engineering manager, and Karin Flieswasser, product owner of EY Blockchain Analyzer: Smart Contract & Token Review, helped participants understand EY’s tools, beginning with the strategies and philosophies behind specific capabilities, restrictions and attributes. (Note: TBR has listened to countless product demonstrations and has rarely heard a description of the mindset going into improving a product and the very basic “why” a solution could and should be changed. This was a welcome change from the assumption everyone would know the thinking behind the technology.)

Over the course of the hour, Davies and Flieswasser demonstrated various permutations of a use case that undoubtedly resonates with administrators of smart contracts wondering, “How can I be sure this thing will work the right way?” Davies began the discussion by detailing a few smart contracts gone wrong, and Flieswasser then described how EY’s Blockchain Analyzer Smart Contract Testing and Review system could have forestalled those issues.

In recent years, blockchain clients (and potential adopters) have consistently told TBR that reluctance to adopt smart contracts begins with uncertainty about the human element, not the technology. With that in mind, two elements of Davies and Flieswasser’s presentation stood out for TBR. First, the tool itself appeared to be intuitive and user-friendly, with every option, drop-down, task and function self-explanatory — a welcome respite from the usual hyper-tech talk around blockchain. Considering people tasked with administering smart contracts may more likely reside in procurement, supply chain management or even human resources, keeping the tech simple to use will likely accelerate adoption. Second, all of the testing and review perfectly mimic on-chain realities without actually using, compromising or changing any on-chain data.

While that should be an obvious characteristic, Flieswasser repeatedly emphasized the point — and took clarifying questions on it — leading TBR to believe this feature figures prominently in the risk management concerns of enterprise smart contract administrators. Lastly, the two presenters themselves, hailing from the U.K. and Israel, reinforced the global nature of EY’s blockchain practice, and during a post-session discussion, Flieswasser noted the Blockchain Analyzer team is relatively small and geographically diverse. In TBR’s view, smart contracts can be a readily understood blockchain use case and may be one of the quiet catalysts for enterprise ecosystems’ blockchain adoption. Making smart contracts less risky by deploying easy-to-use test and review systems will likely be a critical element to accelerating adoption.
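The principle Flieswasser stressed — exercising contract logic against a copy of ledger state so the real, on-chain state is never touched — can be sketched in a few lines. This is a purely hypothetical illustration of the dry-run idea, not EY’s Blockchain Analyzer:

```python
import copy

class TokenContract:
    """Toy token-transfer logic standing in for a real smart contract."""
    def __init__(self, balances: dict[str, int]):
        self.balances = balances

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

def dry_run(on_chain_state: dict, sender: str, recipient: str, amount: int) -> dict:
    """Simulate a transfer against a deep copy; the original state never changes."""
    contract = TokenContract(copy.deepcopy(on_chain_state))
    contract.transfer(sender, recipient, amount)
    return contract.balances  # simulated post-transaction state

ledger = {"alice": 100, "bob": 0}
simulated = dry_run(ledger, "alice", "bob", 40)
print(simulated)   # {'alice': 60, 'bob': 40}
print(ledger)      # {'alice': 100, 'bob': 0} -- untouched
```

The deep copy is the whole trick: a tester can probe failure modes (overdrafts, bad recipients) as aggressively as they like without compromising production state.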

Crypto’s hope and hype are dashed by the history of money, bolstering EY’s role as the community shepherd

If the past is prologue for EY innovation, then EY’s foray into smart money tied to smart contracts will likely start in the consumer space. Echoing EY’s first scaled blockchain use case, which helped Microsoft track developer royalty payments, the concept has test cases starting with loyalty rewards programs and consumer gaming. In this manner, smart money use cases with small-dollar impacts will not roil capital markets. If the technology works, it can then be applied to higher-value situations in both wholesale and retail financial settings.

In his talk, Brody drew a clear distinction between privacy and anonymity. One blockchain camp stresses anonymity; Brody and EY are in the privacy camp. To audit and attest business transactions to regulatory agencies, there cannot be anonymity. Privacy, however, protects the information on a need-to-know basis, leaving competitors unable to garner valuable business information regarding private matters such as unit pricing and discount structures.

When it comes to the overall merit of and need for cryptocurrencies, University of Southern California (USC) professor and former U.S. Federal Reserve executive Rodney Ramcharan’s keynote provided a history lesson on the U.S. dollar, offering ample evidence of lessons learned from not having a reserve bank to backstop against runs on a currency. In this regard, fiat currencies and stablecoins tied to fiat currencies rather than to algorithms appear to provide the kind of risk mitigation that will be necessary for commerce. Crypto as a wealth store on par with gold is a different application area where risk is unquestionably higher.

In the past two iterations of TBR’s Digital Transformation Blockchain Market Landscape, we have provided some initial analysis on central bank digital currencies (CBDCs) and DeFi with a few developments worth noting, including the recently published paper by the Federal Reserve Board focused on CBDCs, in particular the digital dollar; the U.S. Securities and Exchange Commission’s approval of a Boston-based exchange — BOX Exchange — that will use blockchain for faster settlements and potentially enable the exchange of tokenized securities; and lastly President Joe Biden’s executive order on ensuring the responsible development of digital assets, including CBDCs.

The U.S. government’s awareness of and initial interest in CBDCs are steps in the right direction toward recognizing the implications of digital assets for the economy and everyday consumers. However, given the complexity, particularly around reaching consensus among community participants on the governance side, we believe it will be a while before a digital U.S. dollar will be deployed at scale for everyday merchant transactions and trade. Wholesale and retail CBDCs carry very different implications for risk, scale, speed and reward. Connecting Main Street and Wall Street economies through blockchain is a necessary step that we believe will have a bigger, broader impact on enterprise buyers’ digital transformation (DT) initiatives. One might see such a framework as a bit of a long shot, but historically, financial services institutions have paved the way in new tech adoption.

Below is a direct quote from a CTO and blockchain executive we recently spoke with that neatly summarizes the implications of CBDCs.

“First, you have to differentiate between wholesale and retail. So if I’m talking about wholesale, then I’m probably talking about cross-border transactions between central banks or Tier 1 banks, for example. And so those are low transaction volume but high-value transactions. So that’s very important to get that right, more than anything else. And I can’t afford to have that hack because we’re talking billions of dollars. So, again, the experiments have proven that it can be done cross-protocol. I know I’ve seen some standards proposed in this space, mostly by some folks at Bank for International Settlements.

So they’ve done a lot of CBDC work. There’s a gentleman in Singapore who has proposed that, if you peel back the covers, he’s basically proposing everything should be on Quorum, everything should be on JPM Coin, which I don’t think that’s going to happen. But nice try, buddy. But you could maybe argue, OK, somebody like SWIFT could say, ‘OK, for international banking, at the wholesale level delivery versus payment kinds of scenarios or end up day netting between multiple banks, we can help you come up with a standard between the banks.’ Again, the technology will have to evolve to meet that because if you’re doing integration between the two different protocols, that’s a weak spot. That’s an attack vector for a hack right off the bat. So if I’m a hacker, I’d be looking at that kind of cross-border protocol switching, or integration play.

“Now at the retail level, let’s say we’re talking about replacing U.S. dollars, for example, with digital dollars, whatever. First of all, I’ll believe it when I see it, because the technology has to scale up to those, that level of transaction. But same thing, it could be, ‘OK, I’ve got my digital dollar, I’ve got an app on my iPhone, now I traveled to Japan, should there be an app, or should there be some bridge between the digital yen and the digital dollar?’ I think that’s decades off. If I’m a central bank in Japan, I’m going to be really, really careful about letting people plug into my letting travelers, for example, plug into my network or do conversions of a digital dollar to a digital yen, just again, for fear of the hacks, the fear of attacks. That loss of control, perhaps over the circulation of that digital yen, the only place where that might work. And now we’re really getting political here.

“But you could probably argue that the whole reason that China’s doing its digital yuan, for example, is really about social control. So they have the social scoring in China, where, OK, if [someone] talks negatively about the Communist Party, then he gets points added to this, or points deducted from a social score, however it works. But it prevents you from getting credit, for example, prevents you from getting a plane ticket, things like that.

So they’re really trying to control behavior, social behavior with this point scoring system. And forcing everybody to use digital money really plays into that, because OK, now that [someone] has a negative score, I can block his account, I can prevent him from spending money, I can deduct money from his account, that sort of thing. To me, it seems like the digital currency in China really is just an extension of control of the population. And so maybe in that sense, like, if I go visit China, they really would want me to convert to their digital currency, because they could control it. They could see what I do, they could see where I spend it. And they could block me from accessing it if they want to. So yeah, that’s the negative side of that integration that you were talking about. OK. They would let me use their digital currency because they have ulterior motives for doing so.”

Conclusion

In-person events provide opportunities to gather insights and information not shared on a screen or on the plenary stage. Perhaps the two-year absence from being live in New York City helped make the participants more eager to talk. From conversations with blockchain entrepreneurs, crypto-enthusiasts and EY professionals, TBR heard two common themes.

First, the skepticism around cryptocurrencies has not run deep enough given what is already in the market and what is coming. The current split on crypto falls along the lines of regulation versus total anonymity, with regulated, stable currencies having greater potential than the unregulated coins that have roiled capital markets of late. Further, bad actors, present in any ecosystem, would be shaken out if governments regulate the new instruments (history as prologue), provided total anonymity does not win out.

Second, enterprises and the blockchain providers servicing them increasingly see smart contracts as the use case most likely to scale and accelerate blockchain adoption across the enterprise ecosystem. A final nugget specific to EY was the (persuasive) argument that EY’s most successful blockchain-related engagements to date reside in the firm’s Tax and Risk practices. In TBR’s view, the fact that EY is doubling its R&D spend in blockchain yet earning the most blockchain-related revenue in its legacy practices may be the most compelling evidence of the firm’s all-in bet on blockchain.


Market and competitive intelligence straight to your inbox each month, absolutely FREE

Subscribe to Insights Flight today!

Instantaneous interconnectivity: Inside the Department of Defense’s ambitious plan for JADC2

What is Joint All-Domain Command and Control?

Joint All-Domain Command and Control (JADC2) is an evolving Department of Defense (DOD) vision to revamp the Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) programs currently in use across all U.S. military branches. The infrastructures in place at, for example, the U.S. Army, are largely unable to function at a seamless level with the networks of other branches, such as the U.S. Space Force. Additionally, these infrastructures do not meet the DOD’s requirements to handle rapidly evolving and highly complex new-age battlefield situations that require urgent, coordinated responses from U.S. armed forces.

 

JADC2 is an effort to rectify these dilemmas by creating a cloudlike environment that enables the rapid receipt and transmission of intelligence, surveillance and reconnaissance (ISR) data to interconnected networks. By developing a unified network that enables sensors on Internet of Military Things (IoMT) devices to instantly pass on mission-critical information to leaders, more informed and coordinated decision making is possible across the U.S. military’s branches. Decision makers can act faster and establish more cohesive battlefield tactics, factoring in land, sea and air threats with additional support from each other’s assets due to this common operating picture (COP) being immediately relayed to the relevant parties via machine learning (ML) and AI support.

 

Vendors covered in TBR’s series of Public Sector and Mission Systems reports have been increasingly involved in JADC2, which provides a sizable opportunity for vendors with expertise in these areas.

What will be needed to enable JADC2?

In March, the Pentagon published its official JADC2 strategy, which included five “lines of effort” that the JADC2 Cross-Functional Team (CFT) will work on to bring the DOD’s vision closer to reality. The first goal is to set up a uniform “data enterprise,” which includes creating guidelines for baseline metadata tagging. Next, the JADC2 CFT will leverage digital tools like AI to support decision makers and engage in efforts to advance integral technology. The Space Development Agency (SDA) will then establish a network that enables communication across branches and weave nuclear command, control and communications (NC3) systems into the overarching JADC2 program. Lastly, the DOD will strive to better connect mission partners by streamlining the exchange of data.

 

This lofty goal of rapidly parsing relevant data from battlefield situations and enabling decision makers to be more agile will require a lot of support. For example, DevSecOps teams will build out customizable capabilities for JADC2 based on a department’s needs. The electromagnetic battle management system (EMBM), a core piece of the DOD’s vision, will be underpinned by DevSecOps and will aid branches of the U.S. military, such as the U.S. Air Force, with tasks like identifying and connecting data. Advancing AI technology will also be critical to JADC2’s success and require contractors to increasingly expand their capabilities.

For example, Booz Allen Hamilton (NYSE: BAH) has been positioning itself to capitalize on AI and analytics demand since 2018 with a series of inorganic and organic investments. TBR anticipates Booz Allen Hamilton will play a key role in helping to produce new tactical support systems leveraging AI and familiarize warfighters with newer technologies like directed energy weapons. Additionally, Peraton Labs has been building out its Operational Spectrum Comprehension, Analytics and Response (OSCAR) solution, which will bolster the DOD’s efforts to bring interoperability across the nation’s military branches by leveraging AI as well as 5G technologies.

 

JADC2 will also require an anti-fragile cloud environment underpinned by 5G technology, which is where military contractors like Lockheed Martin (NYSE: LMT) and Northrop Grumman (NYSE: NOC) have been looking to capitalize. In November 2021 Lockheed Martin formed an alliance with Verizon (NYSE: VZ) to enable interoperability among legacy networks and devices already in use as part of the contractor’s efforts to provide 5G connectivity through its 5G.MIL unified infrastructure. Lockheed Martin has since expanded its partner network to include Keysight Technologies (Nasdaq: KEYS), Microsoft (Nasdaq: MSFT), Intel (Nasdaq: INTC) and Omnispace to assist with 5G.MIL, streamlining network communications for both IP and non-IP users.

Meanwhile Northrop Grumman formed an alliance with AT&T (NYSE: T) in April to analyze digital battle networks and integrate Northrop Grumman’s systems with 5G commercial capabilities and AT&T’s 5G private networks to establish a scalable open architecture for the DOD. To do this at the scale the DOD wants, Lockheed Martin and Northrop Grumman will need to build out their partner networks among startups and fringe players while continuing to build out relationships with major names like Verizon and Microsoft.

 

The military/DOD will increasingly require IT assistance to underpin the JADC2 initiative. While the military’s outsourcing efforts will certainly play a part in bringing JADC2 closer to fruition, the branches are expected to bring on more IT workers of their own and invest in systems integration as well as methods to educate these employees and retain them to help build, maintain and troubleshoot applications.

 

Currently, the military branches are working on their own programs compatible with the DOD’s JADC2 vision. For example, the U.S. Air Force is developing its Advanced Battle Management System (ABMS), which has undergone periodic testing in public since December 2019. Recent efforts indicate the U.S. Air Force is trying to fit KC-46 Pegasus tanker aircraft with pods linking F-22 aircraft and other solutions on the ABMS network, which would allow more information to be exchanged. Meanwhile, the U.S. Navy has been working on Project Overwatch while the U.S. Army has been expanding Project Convergence to include additional features that will contribute to its success. For example, the Army’s FIRESTORM system leverages AI that scans relevant points with sensors, maps out a digital battleground, tags hostiles and selects the optimal weapon for the circumstances.

What are the fears surrounding JADC2?

While JADC2 has a lot of potential, there are several concerns with the DOD’s vision, beyond just getting these systems to communicate through one language.

Security

Fears about JADC2’s adaptability and resiliency are prevalent, particularly because China and other countries have invested in disruptive technologies, such as anti-access/area denial (A2/AD) conflict deterrence systems, that could impede JADC2 and other communication networks’ functions. There has been very little discussion about how JADC2 would combat these disruptions or function in contested environments outside of test settings when facing the brunt of foreign adversaries’ disruptive technologies. The DOD will need to ensure it can generate as much relevant information as possible from a limited number of sensors while maintaining undetectable networks capable of surviving enemies’ efforts to degrade or disrupt the relaying of information.

Design

Accenture (NYSE: ACN) Federal Services Managing Director Bill Marion emphasized that human-centered design will be necessary throughout JADC2’s framework to ensure that warfighters and decision makers can easily navigate these interconnected networks and learn all of their capabilities to maximize their use.

Affordability

Targeted internal investments are necessary to implement JADC2. Companies like Raytheon Intelligence & Space of Raytheon Technologies (NYSE: RTX) will need to develop and connect new IT infrastructure and update legacy systems to ensure they are compatible with JADC2 in a cost-effective way. Simultaneously, affordable and functional multilevel cybersecurity solutions that can support the DOD’s desired instantaneous relaying of data and commands will be needed. Currently, there are concerns about enemies being able to hack into the MIL-STD-1553 serial data buses found in IoMT weapon systems. External parties might be able to breach the 1553 data bus and either shut down these connected armaments or actively use them on U.S. personnel.

Contractors will need to find ways to protect the 1553 data bus from these threats, and Peraton Labs is already collaborating with military branches to establish Bus Defender capabilities. With the DOD looking to interconnect IT systems across all military branches, TBR anticipates that General Dynamics (NYSE: GD) Technologies is aiming to be the DOD’s preferred IT vendor by utilizing Agile methods to expedite the construction of tailored prototypes after first consulting with clients and showcasing the contractor’s base zero-trust solutions.

Ultimately, the journey to JADC2’s implementation will be long and complex. The DOD’s ambitious project will certainly face an ever-shifting road to implementation as there is no true endpoint for the project. Key components like hardware will need to be updated, policies will be amended, and the scope of JADC2 will grow, especially as the U.S. eyes getting allies involved with JADC2 in the future to establish a more unified cloudlike environment capable of streamlining the transfer of data to all participating nations. If all goes well, the U.S. will be able to truly integrate its military branches, allowing them to overwhelm adversaries by using mission-critical data to make better, more informed and coordinated tactical decisions. The U.S. will aim to control the next-generation battlefield by gaining the upper hand on intelligence and rapid communication.



Cloud ERP picking up speed: How vendors are capitalizing on vast opportunity

It’s clear now that ERP solutions, which are key to solving business problems, are fair game for cloud migration. The barriers of performance, security and compliance are all but gone, leaving only the pace of change organizations can muster as the determiner of when ERP workloads can be reimaged with cloud technologies.

 

Join Practice Manager Allan Krans, Senior Analyst Evan Woollacott and Senior Analyst Catie Merrill Thursday, June 16, 2022, for a look at how end customers are using cloud technologies to solve real-world business problems.

 

In this FREE webinar you’ll learn:

  • How barriers to ERP migration have eroded, opening the floodgates to cloud deployments, even in regulated industries
  • How industry customization of cloud technologies is becoming an expectation from customers
  • How platform vendors like Amazon Web Services, Microsoft and Google are encroaching on applications and services markets in the quest for growth

 

Mark your calendars for Thursday, June 16, 2022, at 1 p.m. EDT,
and REGISTER to reserve your space.


Related content:

  1. Hyperscalers’ cloud-based modern network architecture provides strategic advantage over legacy network technologies
  2. Russian aggression will not dampen pandemic-driven cloud demand

 

Click here to register for more TBR Webinars

WEBINAR FAQs

Top priorities for IT infrastructure investments: What’s more important than business transformation?

TBR’s recently launched Infrastructure Strategy Customer Research report surveys 300 IT decision makers responsible for IT infrastructure globally and by industry vertical, such as technology, public sector and healthcare & life sciences, and by organization size, including small, medium and enterprise.

 

Join Principal Analyst and Engagement Manager Angela Lambert for insights, data and analysis on exactly what IT buyers are concerned with in the post-COVID-19 transition, with billions of dollars of IT investment on the line. Angela will discuss the challenges and priorities guiding investment plans, key areas of infrastructure expansion, plans for data center consolidation, and expectations for edge computing and multicloud adoption.

 

In this FREE webinar you’ll learn:

  • The top priorities influencing IT infrastructure investments today, and the top challenges slowing business transformation
  • Key insights for OEM, ODM, cloud, service provider, software and security professionals
  • Differences in needs across small, midsize and enterprise businesses
  • How data center consolidation will impact infrastructure investment, edge adoption and shifts to public cloud resources

 

Mark your calendars for Thursday, June 30, 2022, at 1 p.m. EDT,
and REGISTER to reserve your space.


Related content:

  1. Free Copy: Top Predictions for Data Center in 2022

 


Top vendors positioned to capitalize on TIS market trends

Communication service providers (CSPs) are in the middle of a robust investment cycle that requires significant spend on telecom infrastructure services (TIS). Current trends are expected to play out over the remainder of this decade, with spend peaking in the next few years.

 

Join Senior Analyst Michael Soper for a deep dive into growth catalysts such as 5G deployments globally, network and IT application migration to the cloud, and an influx of government funds to close the digital divide. Michael will also look at geopolitics driving government spend and CSPs’ actions to rip and replace Huawei infrastructure over the next few years.

 

In this FREE webinar you’ll learn:

  • Key growth drivers and growth detractors expected in the TIS market through 2026, according to findings in TBR’s Telecom Infrastructure Services Global Market Forecast 2021-2026
  • How government spend and geopolitics will influence the TIS market
  • Which vendors are positioned to capitalize on trends in the TIS market

 

Mark your calendars for Thursday, June 23, 2022, at 1 p.m. EDT,
and REGISTER to reserve your space.

 

Related content:

  1. Free copy: Top 3 Predictions for Telecom in 2022

 
