
How Informatica uses the cloud to empower a data-driven enterprise

Overview

Setting the stage for what ended up being the primary theme at Informatica World 2022 — Data is your platform — Informatica CEO Amit Walia walked attendees through two emerging trends: the importance of scalable infrastructure through cloud computing, and how AI and machine learning (ML) are no longer just about automating processes but also about enabling predictive intelligence. These trends, while well recognized in theory, are more challenging for businesses to put into practice, particularly due to the proliferation of data and the number of users looking to access said data, including both technical and nontechnical personas.

Informatica’s solution to data complexity is rooted in one of the company’s core values — platform centricity — but the move to essentially replace the Intelligent Data Platform with the Intelligent Data Management Cloud (IDMC), after years of innovation and a slight disruption from COVID-19, is now taking Informatica’s approach to data management and integration to new heights. With IDMC in the cloud, Informatica is better positioned to help clients translate data into valuable insights at a level that cannot be realized on premises.

In addition to being cloud-native, IDMC is infused with AI, addressing the other emerging trend called out by Walia — the need for AI-powered intelligence. All Informatica capabilities are built on CLAIRE, an automated metadata engine that processes 32 trillion transactions per month, and all tie back into IDMC. While the ROI for AI technology is still hard for many businesses to justify, another key factor in the technology’s low adoption is that many businesses are working with complex, siloed data, which means AI models can fall short and produce inaccurate results.

CLAIRE is designed to address a range of operational, runtime and automation use cases — from auto-scaling to anomaly detection — and acts as a wrapper around IDMC to enable fully automated data management and governance processes. By bringing the power of cloud and AI into one integrated platform, Informatica uses IDMC to help customers focus on the only thing they truly own in the cloud: their data. The result of a $1 billion, six-year investment, IDMC consists of seven core modules, with its value proposition largely stemming from its modularity and the ability to allow customers to pick and choose capabilities and services based on their industry, business and use case.

Informatica expands platform capabilities, driving additional value for its comprehensive, cloud-native solution

New innovations emphasize uniting IT and business functions to improve efficiency

With IDMC, Informatica has solidified its platform approach, but as various customers noted, the company’s ability to continually offer new capabilities is what drives additional value by addressing more horizontal and vertical use cases across the data life cycle. Perhaps the most notable announcement at Informatica World 2022, and one that seemed to generate particular excitement among product leaders and customers, was the general availability of Informatica Data Loader. Jitesh Ghai, Informatica’s chief product officer, led a demo of Data Loader, a free, self-service tool that ingests data from more than 30 out-of-the-box systems into Google Cloud’s popular data warehouse solution, BigQuery.

As part of the demo, we saw a scenario play out in which a marketing analyst needs access to more data to effectively run a campaign. The hypothetical analyst opens the Integration module within IDMC, pulls data from Marketo and selects BigQuery as the destination from a drop-down tool, loading the data in only a few steps. This integration could act as a time-saver for large organizations and speaks to the innovative ways Informatica is getting data into the hands of line-of-business teams.
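
For a concrete sense of what Data Loader automates, below is a minimal sketch of the hand-coded plumbing a data team would otherwise write to push a Marketo export into BigQuery using Google’s Python client library. The file, project, dataset and table names are hypothetical, and the code is our illustration, not Informatica’s implementation.

    # A minimal sketch of the hand-coded ingestion work Data Loader abstracts
    # away; project, dataset and table names below are hypothetical.
    import pandas as pd
    from google.cloud import bigquery

    def load_marketo_export(csv_path: str, table_id: str) -> None:
        """Load a Marketo CSV export into a BigQuery table."""
        df = pd.read_csv(csv_path)
        client = bigquery.Client()  # picks up default GCP credentials
        job = client.load_table_from_dataframe(df, table_id)
        job.result()  # block until the load job completes
        print(f"Loaded {len(df)} rows into {table_id}")

    load_marketo_export("marketo_campaign.csv", "my-project.marketing.campaign_leads")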

At the event, Informatica also announced INFACore, which targets more technical users, such as data scientists and engineers, allowing them to clean and manage data in a single function. As a low-code plug-in for popular frameworks, such as Jupyter notebooks, INFACore is designed to improve the productivity of the technical user, but naturally this productivity trickles up to business functions. For instance, after using INFACore to cleanse data through a single function, the data scientist can publish a clean data set to the Informatica Marketplace, where other teams within an organization can access it.
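
The “single function” idea maps to a familiar notebook pattern. As a rough, generic illustration (the actual INFACore SDK is not shown here), the pandas sketch below collapses a typical cleansing pass into one call a data scientist could run in a Jupyter cell before publishing the result:

    # Illustration of the "cleanse in a single function" pattern using pandas;
    # this is a generic stand-in, not Informatica's INFACore SDK.
    import pandas as pd

    def cleanse(df: pd.DataFrame) -> pd.DataFrame:
        """Apply a standard cleansing pass in one call."""
        out = df.drop_duplicates()
        out = out.dropna(subset=["customer_id"])              # drop rows missing a key
        out["email"] = out["email"].str.strip().str.lower()   # normalize emails
        return out[out["email"].str.contains("@", na=False)]  # keep plausible emails

    # In a notebook: clean_df = cleanse(raw_df), then publish downstream.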

Another key innovation called out in opening talks with Ghai was ModelServe, which allows users to upload, monitor and manage ML models within their Informatica data pipelines. There are many ML models in production, but businesses are still looking for ways to scale them from an operational perspective. In TBR’s talks with more than one customer at the event, the common interface within IDMC came up as a value-add when scaling a data team, suggesting customers are awaiting ModelServe’s general availability, as it will allow users to register and manage ML models directly within IDMC.
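
ModelServe’s interface was not detailed at the event, but the register-then-manage pattern it targets is straightforward to picture. The sketch below is a deliberately simplified, hypothetical registry, with names and fields of our own invention rather than Informatica’s API, showing the versioning bookkeeping ModelServe would handle natively within IDMC:

    # A simplified, hypothetical model registry illustrating the
    # register-and-manage pattern ModelServe targets; names and fields
    # here are illustrative, not Informatica's API.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ModelRecord:
        name: str
        version: int
        uri: str  # where the serialized model artifact lives
        registered_at: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc))

    class ModelRegistry:
        def __init__(self) -> None:
            self._models: dict[tuple[str, int], ModelRecord] = {}

        def register(self, name: str, uri: str) -> ModelRecord:
            # Auto-increment the version for each new upload of a model
            version = 1 + max((v for n, v in self._models if n == name), default=0)
            record = ModelRecord(name, version, uri)
            self._models[(name, version)] = record
            return record

        def latest(self, name: str) -> ModelRecord:
            return self._models[max((n, v) for n, v in self._models if n == name)]

    registry = ModelRegistry()
    registry.register("churn-classifier", "s3://models/churn/v1.pkl")
    print(registry.latest("churn-classifier"))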

Informatica strengthens SaaS portfolio, building in intelligence from the data model up

While Informatica’s platform capabilities get much of the market’s attention, the company also has a broad portfolio of IDMC-enabled SaaS offerings, which play a key role in the data management journey, complementing warehousing, integration and automation. The 360 applications, native services within Informatica’s Master Data Management (MDM) solution, act as a gateway for transforming customer experience in the cloud, something we saw in action through the product demo of Supplier 360 SaaS.

In the demo, CLAIRE recognized through IDMC a defective part from a supplier of a hypothetical company, and teams used Supplier 360 SaaS to identify which customers were impacted by the faulty part and automatically notify customer service so it could launch a refund program to keep customers satisfied. Informatica also released various industry and domain extensions for its 360 applications and will continue to offer new packaged offerings in a SaaS model, providing customers more ways to onboard and manage data.

Joining the industry cloud bandwagon, Informatica verticalizes IDMC

It is no secret that industry specialization is re-emerging as a leading trend in the cloud space, as maturing enterprise customers demand solutions that suit their unique IT and business processes. During the event, Informatica unveiled new IDMC customizations for financial services, healthcare and life sciences. These three offerings join IDMC for Retail in Informatica’s industry cloud portfolio to further address demand for purpose-built solutions that limit the need for customization.

Findings from TBR’s Cloud Infrastructure & Platforms Customer Research continue to indicate that some enterprises are wary of industry cloud solutions, dismissing them as marketing ploys. Other enterprises, however, find them worth evaluating. For instance, in talks with a representative from a hedge fund, we found the company initially chose a competing MDM solution because that vendor specialized in asset management and offered its own industry-specific data dictionary, but the company was torn, as it viewed Informatica’s MDM as ahead of the competition in terms of capabilities. We can expect Informatica to expand into other industries, including specific subverticals, with additional data models, custom user interfaces and data quality rules to appeal to these customers.

Continued integrations and go-to-market synergies with hyperscalers help Informatica maintain data neutrality

For a company that markets itself as the “Switzerland of data,” Informatica’s ability to make its offerings accessible across leading cloud platforms is critical. Partnering across the cloud landscape is no longer a differentiator; it is a necessity, and something customers clearly value as they gravitate toward multicloud environments. During the event, Walia welcomed several partner executives, both in person and virtually, to discuss the new joint offerings and go-to-market synergies the company is forming with cloud service providers to deliver more choice and flexibility for joint clients.

      • The ubiquity of Microsoft’s cloud portfolio allows Informatica to provide clients a unified data architecture. Informatica and Microsoft (Nasdaq: MSFT) have a well-established relationship, which at its core is focused on migrating data warehouses to the cloud but is evolving and making Informatica relevant across the Microsoft Cloud stack, including Azure, Power Platform and 365 applications. For example, Informatica is typically well known for its integration with Azure Synapse, but the company also integrates with the Dynamics 365 SaaS data model to enable Customer 360 analytics. Expanding its presence throughout the Microsoft Cloud stack, Informatica announced MDM on Azure. With this announcement, customers can deploy MDM as a SaaS offering on Azure via the Azure Marketplace, which could appeal to the large number of Microsoft shops looking to enhance their Azure Data Lakes with a feature-rich MDM solution. Both companies also launched Informatica Data Governance with Power BI, which, as highlighted by Scott Guthrie, EVP of Cloud and AI at Microsoft, brings Informatica’s data catalog scanners to Power BI, allowing customers to have a single view of their data processes from ingestion to consumption. This offering could serve as a more strategic way for customers to modernize their analytics workloads through Azure.
      • Given their respective strengths in data analytics and data management, Google Cloud and Informatica are complementary partners. The Google Cloud-Informatica relationship took a major step forward with the launch of Informatica Data Loader, which could expand client usage of BigQuery and help Google Cloud (Nasdaq: GOOGL) address a wider set of customer needs, including those outside the IT department. In TBR’s own discussions with enterprise buyers, BigQuery is often cited as a leading solution due to its ability to handle petabytes of data at a favorable price point. Walia reaffirmed this notion in discussions with two customers, ADT and Telus, both of which are migrating legacy data warehouses and/or front-end ETL (extract, transform, load) capabilities into their BigQuery instances and using IDMC for cloud-based data management.
      • Oracle awards Informatica preferred partner status for data integration. Informatica and Oracle (NYSE: ORCL) struck a new partnership agreement that offers IDMC on Oracle Cloud Infrastructure (OCI). Addressing the large number of customers running legacy Oracle databases, and potentially those also deploying on-premises Informatica products, IDMC on OCI provides customers an integrated gateway to the cloud by enabling back-end connections with Oracle Autonomous Database, Exadata Database Service and OCI Object Storage. For example, with IDMC on OCI, customers can import data from legacy Oracle E-Business Suite applications into Autonomous Database and connect to other data sources, such as Azure SQL or Amazon Redshift, through IDMC. As a preferred Oracle partner, Informatica will recommend customers use IDMC with Oracle’s cloud services. Oracle’s EVP of database server technologies, Andy Mendelsohn, walked through numerous incentives to assist customers’ cloud migrations, such as Bring Your Own License, Informatica Migration Factory and Oracle Cloud Lift Services.

Informatica also has close relationships with Amazon Web Services (AWS) (Nasdaq: AMZN), Snowflake (NYSE: SNOW) and Databricks, all of which are expanding their commitments to Informatica to help customers look beyond ETL and handle data in an end-to-end fashion. Given that Informatica offers analytics, integration, automation, governance and management capabilities across leading clouds, the company naturally runs up against a high degree of competitive overlap with its partners, which offer similar native tooling as part of a customer’s environment.

However, in talks with customers, the general perception seems to be that the hyperscalers’ capabilities are still relatively immature and that there is also significant value in deploying a vendor-neutral platform like IDMC to avoid vendor lock-in and address the training and skill challenges typically associated with a multicloud environment. While we can expect the hyperscalers to enhance their capabilities, at the end of the day, the primary goal for AWS, Microsoft and Google Cloud is to win compute, so the benefits of partnering with Informatica to capture legacy platform-layer workloads outweigh the downsides of coopetition.

Conclusion

With IDMC, Informatica has built a value proposition centered on three core areas: platform centricity, connecting IT and business ecosystems, and infrastructure agnosticism. The numerous announcements made at Informatica World 2022 show the data management company is building on these strategic pillars by better aligning with cutting-edge trends in the cloud industry, such as industry customization, out-of-the-box integrations and data democratization. With these enhancements in place, along with close partnerships across the IaaS ecosystem, Informatica is positioning itself favorably to assist clients with the large number of on-premises workloads ready to be migrated and modernized in the cloud while enabling the cloud-native enterprise to transition from digital to data-driven.




Instantaneous interconnectivity: Inside the Department of Defense’s ambitious plan for JADC2

What is Joint All-Domain Command and Control?

Joint All-Domain Command and Control (JADC2) is an evolving Department of Defense (DOD) vision to revamp the Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) programs currently in use across all U.S. military branches. The infrastructures in place at individual branches, such as the U.S. Army, are largely unable to function seamlessly with the networks of other branches, such as the U.S. Space Force. Additionally, these infrastructures do not meet the DOD’s requirements for handling rapidly evolving, highly complex battlefield situations that require urgent, coordinated responses from U.S. armed forces.

JADC2 is an effort to rectify these dilemmas by creating a cloudlike environment that enables the rapid receipt and transmission of intelligence, surveillance and reconnaissance (ISR) data across interconnected networks. By developing a unified network that lets sensors on Internet of Military Things (IoMT) devices instantly pass mission-critical information to leaders, the DOD enables more informed and coordinated decision making across the U.S. military’s branches. Decision makers can act faster and establish more cohesive battlefield tactics, factoring in land, sea and air threats with additional support from one another’s assets, because this common operating picture (COP) is immediately relayed to the relevant parties with machine learning (ML) and AI support.

Vendors covered in TBR’s series of Public Sector and Mission Systems reports have been increasingly involved in JADC2, which provides a sizable opportunity for vendors with these areas of expertise.

What will be needed to enable JADC2?

In March, the Pentagon published its official JADC2 strategy, which included five “lines of effort” that the JADC2 Cross-Functional Team (CFT) will work on to bring the DOD’s vision closer to reality. The first goal is to set up a uniform “data enterprise,” which includes creating guidelines for baseline metadata tagging. Next, the JADC2 CFT will leverage digital tools like AI to support decision makers and engage in efforts to advance integral technology. The Space Development Agency (SDA) will then establish a network that enables communication across branches and weave nuclear command, control and communications (NC3) systems into the overarching JADC2 program. Lastly, the DOD will strive to better connect mission partners by streamlining the exchange of data.
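
As a concrete, if simplified, illustration of the first line of effort, the sketch below wraps raw sensor output in a common metadata envelope that any branch’s network could parse; the field names are our assumptions, not the DOD’s published schema:

    # Hypothetical illustration of baseline metadata tagging: a common
    # envelope stamped on sensor data so other branches' networks can
    # discover and interpret it. Field names are assumed, not the DOD's.
    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone
    import json

    @dataclass
    class BaselineTags:
        source_branch: str   # e.g., "USAF" or "USN"
        sensor_type: str     # e.g., "EO/IR" or "SAR"
        classification: str  # handling caveat for the payload
        collected_at: str    # ISO 8601 timestamp
        geohash: str         # coarse collection location

    def tag_payload(payload: bytes, tags: BaselineTags) -> bytes:
        """Prefix raw sensor data with a JSON header of baseline metadata."""
        header = json.dumps({"tags": asdict(tags), "payload_size": len(payload)})
        return header.encode() + b"\n" + payload

    message = tag_payload(b"<raw image bytes>", BaselineTags(
        "USAF", "EO/IR", "UNCLASSIFIED",
        datetime.now(timezone.utc).isoformat(), "9q8yy"))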


This lofty goal of rapidly parsing relevant data from battlefield situations and enabling decision makers to be more agile will require a lot of support. For example, DevSecOps teams will build out customizable capabilities for JADC2 based on a department’s needs. The electromagnetic battle management system (EMBM), a core piece of the DOD’s vision, will be developed using DevSecOps and will aid branches of the U.S. military, such as the U.S. Air Force, with tasks like identifying and connecting data. Advancing AI technology will also be critical to JADC2’s success and will require contractors to continually expand their capabilities.

For example, Booz Allen Hamilton (NYSE: BAH) has been positioning itself to capitalize on AI and analytics demand since 2018 with a series of inorganic and organic investments. TBR anticipates Booz Allen Hamilton will play a key role in helping to produce new tactical support systems leveraging AI and familiarize warfighters with newer technologies like directed energy weapons. Additionally, Peraton Labs has been building out its Operational Spectrum Comprehension, Analytics and Response (OSCAR) solution, which will bolster the DOD’s efforts to bring interoperability across the nation’s military branches by leveraging AI as well as 5G technologies.


JADC2 will also require an anti-fragile cloud environment underpinned by 5G technology, which is where military contractors like Lockheed Martin (NYSE: LMT) and Northrop Grumman (NYSE: NOC) have been looking to capitalize. In November 2021 Lockheed Martin formed an alliance with Verizon (NYSE: VZ) to enable interoperability among legacy networks and devices already in use as part of the contractor’s efforts to provide 5G connectivity through its 5G.MIL unified infrastructure. Lockheed Martin has since expanded its partner network to include Keysight Technologies (Nasdaq: KEYS), Microsoft (Nasdaq: MSFT), Intel (Nasdaq: INTC) and Omnispace to assist with 5G.MIL, streamlining network communications for both IP and non-IP users.

Meanwhile Northrop Grumman formed an alliance with AT&T (NYSE: T) in April to analyze digital battle networks and integrate Northrop Grumman’s systems with 5G commercial capabilities and AT&T’s 5G private networks to establish a scalable open architecture for the DOD. To do this at the scale the DOD wants, Lockheed Martin and Northrop Grumman will need to build out their partner networks among startups and fringe players while continuing to build out relationships with major names like Verizon and Microsoft.


The military/DOD will increasingly require IT assistance to underpin the JADC2 initiative. While the military’s outsourcing efforts will certainly play a part in bringing JADC2 closer to fruition, the branches are expected to bring on more IT workers of their own and invest in systems integration as well as methods to educate these employees and retain them to help build, maintain and troubleshoot applications.


Currently, the military branches are working on their own programs compatible with the DOD’s JADC2 vision. For example, the U.S. Air Force is developing its Advanced Battle Management System (ABMS), which has undergone periodic public testing since December 2019. Recent efforts indicate the U.S. Air Force is trying to fit KC-46 Pegasus tanker aircraft with pods linking F-22 aircraft and other solutions on the ABMS network, which would allow more information to be exchanged. Meanwhile, the U.S. Navy has been working on Project Overmatch while the U.S. Army has been expanding Project Convergence to include additional features that will contribute to its success. For example, the Army’s FIRESTORM system leverages AI that scans relevant points with sensors, maps out a digital battleground, tags hostiles and selects the optimal weapon for the circumstances.

What are the fears surrounding JADC2?

While JADC2 has a lot of potential, there are several concerns with the DOD’s vision, beyond just getting these systems to communicate through one language.

Security

Fears about JADC2’s adaptability and resiliency are prevalent, particularly because China and other countries have invested in disruptive technologies, like anti-access/area denial (A2/AD) conflict deterrence systems, that could impede the functions of JADC2 and other communication networks. There has been very little discussion about how JADC2 would combat these disruptions or function in contested environments, outside of test settings, when facing the brunt of foreign adversaries’ disruptive technologies. The DOD will need to ensure it can generate as much relevant information as possible from a limited number of sensors while maintaining undetectable networks capable of surviving enemies’ efforts to degrade or disrupt the relaying of information.

Design

Accenture (NYSE: ACN) Federal Services Managing Director Bill Marion emphasized that human-centered design will be necessary throughout JADC2’s framework to ensure that warfighters and decision makers can easily navigate these interconnected networks and learn all of their capabilities to maximize their use.

Affordability

Targeted internal investments are necessary to implement JADC2. Companies like Raytheon Intelligence & Space, part of Raytheon Technologies (NYSE: RTX), will need to develop and connect new IT infrastructure and update legacy systems in a cost-effective way to ensure they are compatible with JADC2. Simultaneously, affordable, functioning multilevel cybersecurity solutions that can support the DOD’s desired instantaneous relaying of data and commands will be needed. Currently, there are concerns about enemies being able to hack into the MIL-STD-1553 serial data buses found in IoMT weapon systems. External parties might be able to breach the 1553 data bus and either shut down these connected armaments or actively use them against U.S. personnel.
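
One common defensive pattern for bus traffic, offered here purely as an illustration and not as a description of any vendor’s product, is to decode each 1553 command word and flag messages involving remote terminal (RT) addresses outside a known-good set:

    # Illustrative only: allow-listing MIL-STD-1553 command words by remote
    # terminal (RT) address; the authorized set here is hypothetical.
    AUTHORIZED_RTS = {0x03, 0x0A, 0x14}

    def parse_command_word(word: int) -> dict:
        """Decode the fields of a 16-bit MIL-STD-1553 command word."""
        return {
            "rt_address": (word >> 11) & 0x1F,   # bits 15-11
            "transmit": bool((word >> 10) & 1),  # bit 10: transmit/receive
            "subaddress": (word >> 5) & 0x1F,    # bits 9-5
            "word_count": word & 0x1F,           # bits 4-0
        }

    def is_suspicious(word: int) -> bool:
        """Flag command words addressed to unauthorized RTs."""
        return parse_command_word(word)["rt_address"] not in AUTHORIZED_RTS

    for word in (0x1822, 0xF822):  # second word targets RT 31, not on the list
        print(hex(word), "suspicious" if is_suspicious(word) else "ok")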

Contractors will need to find ways to protect the 1553 data bus from these threats, and Peraton Labs is already collaborating with military branches to establish Bus Defender capabilities. With the DOD looking to interconnect IT systems across all military branches, TBR anticipates that General Dynamics (NYSE: GD) Technologies is aiming to be the DOD’s preferred IT vendor by using Agile methods to expedite the construction of tailored prototypes after first consulting with clients and showcasing the contractor’s baseline zero-trust solutions.

Ultimately, the journey to JADC2’s implementation will be long and complex. The DOD’s ambitious project will face an ever-shifting road to implementation, as there is no true endpoint for the effort. Key components like hardware will need to be updated, policies will be amended, and the scope of JADC2 will grow, especially as the U.S. eyes getting allies involved with JADC2 in the future to establish a more unified cloudlike environment capable of streamlining the transfer of data among all participating nations. If all goes well, the U.S. will be able to truly integrate its military branches, allowing them to overwhelm adversaries by using mission-critical data to make better, more informed and coordinated tactical decisions. The U.S. will aim to control the next-generation battlefield by gaining the upper hand on intelligence and rapid communication.



EY on sustainability reporting: Data, credibility and transformation  

Lessons learned from the EY-hosted webcast “How the Corporate Sustainability Reporting Directive will transform your organization”

Atos future-proofs compute ahead of Great Acceleration

As the world awaits the scientific discoveries needed to bring quantum processors to commercial applicability, Atos’ BullSequana XH3000 allows for ecosystem participation within the compute platform itself and future-proofs any early buyer investments. In its Feb. 16 official announcement of the XH3000 supercomputer, for which TBR was provided pre-briefing access, Atos claims the product will have a six-year life cycle and that it is an open architecture capable of housing up to 38 blades. The blades can accommodate a mix of different XPU processors, with more under consideration and development.

The rapid rise in large data sets and evolving AI/machine learning (ML) algorithms have driven this global appetite for greater compute capacity — an appetite that many data scientists believe will only be sated once quantum computers reach commercial viability. Atos’ early lead in quantum simulators and alliances with various quantum systems vendors imply the company will be capable of pivoting its high-performance computing (HPC) offerings quickly to accommodate the addition of commercial-grade quantum processors when they arrive. Atos’ flexible hybrid supercomputing architecture will sell well in Europe for a variety of reasons and may enable Atos to gain share against notable HPC vendors in North America and Asia.

Data and AI require new compute platforms to address intractable problems

Atos correctly asserts that the state of compute trails the size of the data sets now available to run algorithms against. Specifically, the world is running out of computational capacity to address the complex problems that can now be simulated and analyzed through increasing digitization.

Proof points offered in the Atos announcement included:

  • Average HPC job durations are growing as larger data sets are applied against systems with as many as 10,000 nodes and 25,000 endpoints.
  • Application refactoring and algorithm refinements can provide as much as a 22x speed improvement.
  • Data centricity and edge processing are growing in use case applicability, requiring greater hierarchical depth and more localized compute near the application.
  • Hybrid simulation/AI workflows for approximate computing are nearing reality. Atos offered the example of AlphaFold 2 reaching over 90% accuracy in protein folding prediction, whereas classical methods currently achieve between 30% and 40% accuracy.
  • The industry is once again predicted to be approaching the physical limits of Moore’s Law now that 3nm process technology has arrived.
  • Extending the performance gains of classical computing while quantum discovery and commercialization advance will require greater innovation around multiple XPU architectures. These hybrid, or heterogeneous, compute architectures need a new compute system structure, which Atos believes the XH3000 system provides.

The Atos Exascale strategy is a hybrid approach that serves many masters

Atos states the future of supercomputing will be hybrid, consisting in the near term of a blend of classical CPU configurations and specialized processor architectures to address specific workload requirements. Presently, Atos collaborates with AMD (Nasdaq: AMD), Intel (Nasdaq: INTC), Nvidia (Nasdaq: NVDA), SiPearl and Graphcore, among others. Eurocentric chips based on Arm designs are also in the news and have been discussed by Atos.

Atos has addressed the need for future-proof flexibility by building the standard chassis of the BullSequana XH3000 to accommodate up to 38 compute/switch blades in one rack, which can be mixed and matched as workflows require from the blades available today and those arriving in the future.

This hybrid architectural design approach serves many masters, such as those addressing:

  • Sustainability: Different cooling and processing designs not only generate greater computational capacity but also, when coupled with the hybrid configurations and algorithm innovations, can lead to lower power consumption, and therefore lower carbon footprints.
  • Sovereignty: Technonationalism is not going away, and Atos is a flagship European technology vendor. Former Atos CEO Thierry Breton is now the commissioner for internal market affairs within the European Union (EU) and has been tasked with managing many elements pertinent to digitization and “enhancing Europe’s technical sovereignty.” The EU has clearly stated its intentions to ensure there are European-controlled processors in market. Hybrid computing structures enable companies to select different processors to address the computational requirements amid the increased attention nation states place on compute access as a strategic national interest.
  • Higher performance: The HPC market increasingly takes on the dynamics of emerging ecosystem business models and requires a physical compute stack that can accommodate the many tech stack variations the ecosystem can create to address the world’s compute and AI challenges. Atos claims it also has built the architecture to be resilient and adaptable for six years without forklift upgrades. This flexibility, Atos asserts, can accommodate new discoveries as the unknowns around deep learning, algorithm development and new processor developments in the classical and quantum computing realms come into view.

Informatica returns to the public market with an emphasis on data democratization and hyperscale partnerships

Informatica’s fall 2021 launch, which consisted of a new cloud-native marketplace, automated data quality features and new data scanners, comes alongside the company’s return to the market in an $840 million IPO. The announced offerings, from new services to partner integrations, largely complement the Intelligent Data Management Cloud (IDMC) platform — the key announcement at Informatica World 2021 in April — and align with what is now Informatica’s cloud-first approach to data governance and management. After six years under private ownership and a significant business model shift to subscription-based revenue, which now contributes over 90% of total revenue, Informatica returns to the public eye ready to convince investors it is fully embracing cloud as the operating model required for a successful, data-led business strategy.

Fall 2021 release targets data consumers

Informatica’s new offerings hit the market at a time when distributed workforces continue to be the norm in light of the COVID-19 pandemic and businesses are requiring more and more data to make critical decisions. In addition, a persistent lack of technical skills is weighing on business leaders and pushing them to look to third-party sources, such as marketplaces, to improve data literacy. Informatica hopes to support an underserved audience of citizen analysts and lines of business (LOBs) while staying true to its technical roots by offering developers a new set of automated tools and features.

Announcing Cloud Data Marketplace

One of the key announcements in Informatica’s fall 2021 launch was Cloud Data Marketplace, a one-stop data shop helping to meet the vast demand for a simpler data delivery process. Available as a service within IDMC, Cloud Data Marketplace allows data owners to publish assets from various on-premises and cloud data catalogs and offer analytics, AI and machine learning (ML) models to end users. The one-stop-shop experience targets data consumers, which may include LOB leaders and their key stakeholders looking for packages (AI models and data sets) to support a number of data-driven use cases, from price optimization to improved operational efficiency. When marketplace users request the data set that best fits their particular need, program administrators can approve the request and ask about intended usage patterns.

By bridging the gaps between technical specialists and business leaders, Informatica strives to make data more readily accessible across the enterprise. Cloud Data Marketplace will support this strategy by complementing Informatica’s expertise in the early phases of the data pipeline — from data discovery to manipulation — and will place the company’s metadata catalog in front of business leaders.

Ensuring data quality in the cloud

Informatica remains committed to data and analytics governance, leveraging its embedded AI engine CLAIRE to help automate tasks throughout the data process and provide clients with better control over their data. In the fall 2021 launch, Informatica brought many features previously available within the legacy Intelligent Data Platform (IDP) to IDMC. For instance, Informatica is offering its existing Data Quality tool to enable customers to profile, transform and manage data in the cloud the same way they could with on-premises data. Customers can also leverage natural-language processing (NLP) capabilities in the back end, allowing business users to create their own Data Quality rules. Lastly, Informatica is infusing more automation into the platform, eliminating the need to manually create Data Quality tasks, such as applying health checks.
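
For a sense of what such rules reduce to once authored, the sketch below applies a small set of declarative data quality checks with pandas; the rule names and format are our illustration, and Informatica’s NLP front end and rule engine are not shown:

    # A generic sketch of declarative data quality rules, e.g., a rule a
    # business user might phrase as "email must be present and contain @".
    # The rule format is illustrative, not Informatica's.
    import pandas as pd

    RULES = {
        "email_present": lambda df: df["email"].notna() & (df["email"] != ""),
        "email_wellformed": lambda df: df["email"].str.contains("@", na=False),
        "amount_nonnegative": lambda df: df["amount"] >= 0,
    }

    def run_health_check(df: pd.DataFrame) -> pd.Series:
        """Return each rule's pass rate across the data set."""
        return pd.Series({name: rule(df).mean() for name, rule in RULES.items()})

    sample = pd.DataFrame({"email": ["a@x.com", "", None, "bad"],
                           "amount": [10, -5, 3, 0]})
    print(run_health_check(sample))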

Informatica reaffirms commitment to cloud partners

To protect its position as a neutral vendor supporting customers regardless of underlying infrastructure or deployment method, Informatica closely aligns itself with leading hyperscalers, offering native integrations with cloud providers’ well-known platform and infrastructure offerings. Expanding on its strategic, multiyear relationship with Amazon Web Services (AWS), Informatica announced it is supporting AWS Graviton, AWS’ own processors based on the Arm architecture. This will help Informatica position itself as a viable integration option for customers looking to run general-purpose workloads as well as compute-intensive applications, such as high-performance computing (HPC), AI and ML. AWS has been emphasizing its Graviton processors for some time, especially as it looks to push out more modern Elastic Compute Cloud (EC2) instance types to customers and capture more critical workloads.

TBR notes Informatica is early to market, as many of AWS’ other data partners and Informatica competitors have yet to offer support for Graviton instances. Further, Informatica introduced application ingestion capabilities, a module under Cloud Mass Ingestion (CMI), to allow customers to ingest and synchronize data from SaaS and on-premises application sources into cloud data warehouses. These capabilities support Informatica’s partner strategy, specifically with vendors like Microsoft, which continues to work with Informatica to move clients’ data warehouses to the cloud. Additional partner announcements in the fall launch included the ability to scan data from Amazon Redshift, Azure Data Factory for cloud ETL (extract, transform, load) and SAP BusinessObjects Data Services into Informatica’s AI-powered data catalog offering.

TBR releases exclusive webinar content from July 2021

Technology Business Research, Inc. (TBR) announces on-demand availability of its July 2021 webinars for market intelligence and competitive intelligence teams. July webinars include a demonstration of TBR’s new data visualization tool and a look at how management consultancies are adjusting to post-pandemic challenges around ways of working.

TBR Insight Center™: An overview

Senior Data Analyst Matt Bowden and Senior Vice President of Sales & Marketing Dan Demers demonstrate TBR’s new digital platform, TBR Insight Center. TBR Insight Center™ is a powerful data visualization tool that allows clients to configure and curate analysis customized to their needs using a simple and intuitive interface.

Innovation requires in-person: Digital transformation consulting in a post-pandemic world

Principal Analyst and Practice Manager Patrick Heffernan, Senior Analyst Kelly Lesiczka and Analyst John Croll discuss how management consultancies have adjusted to post-pandemic challenges around hybrid and in-person innovation sessions, digital burnout and shifting client needs, particularly regarding technology and strategy consulting. 

TBR webinars are typically held Wednesdays at 1 p.m. EST and include a 15-minute Q&A following the main presentation. To find out what we are discussing next month, check out the Webinars page of our website.


Innovation, Amsterdam and an arena: How KPMG teams excel at transformations and technology

After KPMG highlighted the firm’s relationship with Johan Cruijff Arena in Amsterdam at a recent analyst event, TBR requested a follow-up discussion to better understand how the innovation team at the arena had been excelling at many of the key characteristics TBR has identified in consultancies’ and IT services vendors’ innovation and transformation centers around the world. TBR met with Sander van Stiphout, the arena’s innovation lead, and Wilco Leenslag, the KPMG partner leading his firm’s efforts with the arena.

Framed within the context of TBR’s recently published Innovation and Transformation Centers Market Landscape, three key elements of van Stiphout’s work at the arena stood out

Trust is crucial  

First, the arena’s innovation team works with external clients on a subscription basis, a business model rarely deployed by consultancies and IT services vendors. The arena’s clients, which include startups and enterprises testing new technologies and means of enhancing the customer experience, have to fully trust that the arena’s innovation team will deliver value for their subscription investment. TBR believes this business model may be directly related to the unique nature of an arena but could be replicated by a consulting firm or IT services vendor willing to bet on collaboration consistently leading to valuable, and deployable, innovation.

Make your pitch and test your tech  

A second key element that stood out was how the Johan Cruijff Arena serves as a test bed in multiple ways, benefiting both the arena’s startup clients and the arena itself. Startups not only test their technology solutions in a real-world environment with continuous access to all the variables found in any sporting or entertainment event but, as van Stiphout noted, also pitch the solutions to internal operational professionals at the arena. For example, the arena’s marketing department must approve a marketing solution prior to testing, enabling startups to pitch and refine solutions with a real-world client before taking them to other clients.

Innovations, particularly from startups, often stall when they meet real-world requirements and clients making investment decisions beyond prototypes. By creating a stage for startups to test run both their product pitch and their product, the Johan Cruijff Arena innovation team helps these companies overcome that innovation roadblock. Additionally, this prototyping method helps to overcome issues associated with the traditional engagement model of working with clients’ innovation departments on pilot projects. Specifically, TBR often hears of emerging technologies becoming “stuck in pilot mode,” a challenge that we feel is directly related to the sheer number of ecosystem participants that are required to scale a solution after proving its value to a client. With the arena-led engagement model, ecosystem entities must first work together ahead of a live trial in the arena, addressing the issue of scaling before testing, not after. 

PwC accelerates SaaS strategy as latest round of solutions aim to solve marketers’ business challenges

In a series of conversations with PwC leaders during the past quarter, TBR learned more about the company’s growing products portfolio, including PwC Customer Link and PwC Media Intelligence, in addition to receiving an update on PwC’s CMO advisory practice. TBR spoke with Brian Morris, Customer Analytics and Marketing lead overseeing PwC Customer Link, and Derek Baker, CMO Advisory lead overseeing PwC Media Intelligence. While each capability serves a specific client need, a common approach and business model suggest PwC is accelerating its portfolio transformation without losing sight of the need to deliver outcomes.

Productizing knowledge while relying on trust expands PwC’s addressable market opportunities with the marketing department and beyond

As PwC continues to evolve its business model, the firm’s push into selling products not only expands PwC’s addressable market opportunities but also elevates its brand, compelling software incumbents to pay closer attention. Both the PwC Customer Link and PwC Media Intelligence solutions are part of the PwC Products catalog and support the firm’s goal of driving SaaS and managed services sales. While both products enable marketing departments’ transformation discussions, each also bolsters PwC’s value proposition with noncore buyers, including chief digital officers and chief data officers, as well as internal audit departments in the case of PwC Media Intelligence.

Relying heavily on its PwC CMO Advisory practice, as well as other areas of the firm, such as its network of Experience Centers, as the medium to introduce these offerings helps PwC drive conversations for cross-selling and upselling services. Solving complex issues around managing customer data is an ever-challenging task for clients. Productizing knowledge through the development of pointed solutions helps PwC address client pain points and close business technology gaps. As PwC continues to build client use cases by selling, deploying and managing these solutions, we expect the firm to continue to approach clients through its fundamental lens: helping marketers solve business challenges.  

Solution overview

PwC Customer Link differentiates on its ability to not only connect offline and online data but also integrate third-party data and provide analytics around it, as the solution draws on various data repositories. Key features include Data Manager, which handles first-party and all digital data; Insights Manager, which allows PwC to perform better analytics segmentation down to the audience level; and Orchestration Manager, which supports buyers’ omnichannel campaigns. Additional features include PwC’s ability to work through a technology-agnostic lens and offer supplemental capabilities with cloud data providers such as Salesforce and Adobe.

Aiven’s managed services capabilities bring the best of open-source data technologies to multicloud enterprises

With a core portfolio of platform services, Aiven meets the needs of developers, partners and the cloud-native enterprise  

Aiven was founded in 2016 by a team of open-source and cloud experts based in Helsinki who sought to develop a data management platform that caters to the needs of more mature customers who are increasingly leveraging open-source software. As such, many of Aiven’s clients come already knowing what they want in terms of stream processing frameworks, databases, search engines, visualization and analytics. The core driver of value is Aiven’s ability to orchestrate data on a single management platform, getting customers up and running with minimal deployment lag and enabling integration with existing tool sets on any cloud. A dedication to a robust support model and transparent pricing, with lower costs than many competitors, further positions Aiven to continue growing on pace with the expanding Database as a Service (DBaaS) market. For reference, in just the last eight months the company has doubled in size to about 150 employees, backed by a strong venture capital engine, including the company’s latest round of Series C funding worth $100 million, as well as solid new and recurring revenue streams.

Despite coopetitive dynamics, Aiven benefits from allying with leading hyperscalers to support clients’ need for multicloud

As multicloud is a core component of its value proposition, Aiven partners with all three major cloud service providers (CSPs): Amazon Web Services (AWS) (Nasdaq: AMZN), Microsoft (Nasdaq: MSFT) and Google Cloud (Nasdaq: GOOGL). In addition to Aiven’s services being publicly available on the marketplaces of both Google Cloud and AWS, these relationships allow Aiven to provide enterprises with a way to build database-enabled services and applications on leading public cloud infrastructures, as well as offer a simple migration path for legacy customers. TBR notes that over 10% of Aiven’s customers are provisioning different services to multiple clouds and that many of Aiven’s adopters come knowing specifically which databases or monitoring tools they want to use and where they want to deploy them. While many customers often start with a preferred cloud partner, they ultimately seek to expand to other platforms for greater development autonomy and to avoid vendor lock-in. As a result, TBR believes Aiven’s role as an orchestrator for multiple database services across clouds positions the company uniquely in the market, as Aiven provides customers the degree of neutrality and third-party support required to navigate and manage various dispersed open-source projects.

However, as Aiven offers nine core services — namely the widely deployed open-source tools Apache Kafka, PostgreSQL, MySQL, Cassandra, Redis, Elasticsearch, InfluxDB, M3 and Grafana — there is a large degree of coopetition, as Aiven’s partners offer related services on their own clouds and, in some cases, the clouds of other CSPs. The increasingly open, hybrid multicloud approaches of vendors like Google Cloud and even IBM (NYSE: IBM) will prove competitive, yet TBR believes Aiven still challenges its partners when it comes to enabling open-source innovation and helping enterprises deliver this innovation at scale. Meanwhile, as customers increasingly look for a partner to help them avoid vendor lock-in, Aiven is well positioned to challenge the many vendors that trail the market in providing a degree of vendor agnosticism.
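
The flip side of this coopetition is that, because Aiven exposes vanilla open-source interfaces, standard clients work unchanged no matter which cloud a service lands on. The connection sketch below makes the point; hostnames, ports and credentials are hypothetical placeholders:

    # Standard open-source clients work against Aiven services regardless of
    # the underlying cloud; hostnames, ports and credentials are placeholders.
    import psycopg2                   # PostgreSQL client
    from kafka import KafkaProducer   # kafka-python client

    # A PostgreSQL service, provisioned on, say, Google Cloud
    pg = psycopg2.connect(
        host="pg-demo.example.aivencloud.com", port=13039,
        dbname="defaultdb", user="avnadmin", password="...",
        sslmode="require")

    # A Kafka service, provisioned on, say, AWS; the code is the same either way
    producer = KafkaProducer(
        bootstrap_servers="kafka-demo.example.aivencloud.com:13041",
        security_protocol="SSL",
        ssl_cafile="ca.pem", ssl_certfile="service.cert", ssl_keyfile="service.key")
    producer.send("events", b'{"hello": "multicloud"}')
    producer.flush()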

Open-source technology has become less of a value differentiator and more of a foundational attribute that customers in the cloud database market have come to expect. Vendors in the space must now embed other feature sets and functionality to stand out and navigate the common challenges faced when it comes to modern app development and operational management in the cloud. However, customer expectations go beyond avoiding vendor lock-in, one of the known benefits of open-source technology, to include reducing TCO while improving time to market, security and reliability. Aiven is a managed cloud database services vendor that delivers a unified data platform for both traditional and cloud-native customers looking to deploy data architectures seamlessly and across multiple clouds. By capitalizing on managed cloud services, Aiven has created a way for customers to build, deploy and manage various open-source database management and analytics tools in a self-service manner. With a variety of deployment methods available to customers in conjunction with the benefits of automated security, scalability and resilience, Aiven has demonstrated this value proposition by building a customer base that crosses multiple industries and highlights both customer-facing and back-office analytics use cases.

Eyeing the future: Accenture’s fundamentals drive human-centric technology change at scale

‘Leaders Wanted — Masters of Change at a Moment of Truth’

Accenture’s (NYSE: ACN) recent virtual event to introduce its Accenture Technology Vision 2021 kicked off with a quick recap of the socioeconomic headwinds of 2020. These headwinds include four new concerns facing people personally and professionally: an increasing global population driving a need for new ways of interacting; the evolution of “Every business is a tech business” as technology’s role changes with the changing environment; the workforce of the future; and sustainability. Accenture Group Chief Executive – Technology and Chief Technology Officer Paul Daugherty then outlined in detail the five major trends of its 2021 vision.

Delivered under the slogan “Leaders Wanted — Masters of Change at a Moment of Truth,” the vision highlights five key areas, which we expect to drive investments not just from Accenture but also peers and enterprises, given the company’s market-making status in multiple domains.

  1. Stack strategically: While this trend at its core applies to architecting and redesigning organizations’ technology stacks to support the enterprise of the future, which includes attributes from the customer experience to the security layer, it also maps to Accenture’s core value proposition of joining consultants, designers, researchers, solution architects and delivery personnel, all through the umbrella of Accenture Innovation Architecture.
  2. Mirrored world: The resurgence of the digital twin is moving beyond experimental phases, and large enterprises are seeing an opportunity to invest in an area that, in the era of COVID-19, which has led to social distancing and reduced access to physical plants, will allow them to use IoT techniques to enable remote monitoring and control. Accenture’s ongoing investments in mobility and IoT service offerings over the past five years, along with the recent push into product engineering offerings, largely enabled through acquisitions, will enable the company to address demand and increase client stickiness.
  3. I, technologist: The democratization of technology, which has enabled workforces to do more with less and orient their productivity to higher-value tasks largely enabled by automation, while not a new trend, has certainly reached a pivotal point, given the changes over the past 12 months in how employees perform their work. Accenture’s rigorous approach to and ongoing investments in training — including spending $1 billion per year on reskilling and upskilling personnel, with efforts most recently focused on building cloud consulting, architecting and delivery skills — enable it to drive internal change at scale, and then sell its capabilities “as a Service” to clients.

On Feb. 17, 2021, Accenture held a one-hour virtual session introducing its Accenture Technology Vision 2021. While the format was different than in previous years, the 21st iteration of the summit had a similar goal: to portray Accenture’s technology prowess and appetite for innovation and scale. Hosted by Accenture Group Chief Executive – Technology and Chief Technology Officer Paul Daugherty; Accenture Senior Managing Director and Lead – Technology Innovation and Accenture Labs Marc Carrel-Billiard; and Managing Director – Accenture Technology Vision Michael Blitz, the virtual delivery of the content was both a sign of the times and a demonstration of Accenture’s ability to coordinate, deliver and manage virtual events in collaboration with ecosystem partners — in this case, Touchcast.