ICO as a ‘medicine show’: EY finds abysmal performance in wild west of initial coin offerings

Last December, EY Global Blockchain Leader Paul Brody recognized the breakout market for initial coin offerings (ICOs) and launched a longitudinal study centered on the class of 2017 companies fueled by this new way of raising money for software startups. One year later, as detailed in EY’s report published today, market valuations for the top 10 ICOs were off 55% — abysmal performance by any standard. Buried in the bad news for almost all the companies are a few bits of success, particularly among companies providing blockchain infrastructure. The incredibly poor performance coming out of incubation makes a strong case, to use a “Deadwood” metaphor, that snake oil salesmen made up most of those 2017ers. As this year comes to a close, only around one-quarter of the initial ICO-backed companies have a product in the market — further evidence the breakout included a number of outright frauds. In addition, of the 25 companies that had products, seven devalued their utility tokens by allowing payment in fiat currency, facing up to enterprises’ persistent reluctance to conduct business transactions in anything but hard currencies.

Curiously, paying in tokens, according to Brody in a discussion with TBR prior to today’s announcement, came across as only the second-biggest obstacle to commercial adoption; the biggest is the desire for transaction privacy — a desire pure public blockchains cannot satisfy. In EY’s previous report on ICOs, issued last December, the firm anticipated the third-greatest objection, concerns over full regulatory compliance — an insight that tracks closely with EY’s tax and audit credentials.

Today’s report includes a few nuggets revealing the depth of EY’s study:

  • “Companies that have made meaningful progress toward working products only increased by 13% in 2018. 71% have no offering in the market at all. Typically, within one year of a traditional venture-backed software startup, you would expect to see a significantly higher percentage of the companies with a functional early stage product.”
  • “Seven out of 25 reviewed projects accept other currencies, rendering utility tokens less valuable. Some projects have altogether dropped their utility tokens to focus on functionality. To become a means of payment, utility tokens have to be stable. If it remains stable, the token is of little interest to speculative investors.”
  • “Globally, sources of funding will likely shift away from retail investors toward entities that can understand and manage the downside risks, such as venture capital and digital asset-focused investment funds.”
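The ratios behind those nuggets are simple to verify. A minimal sketch in Python (the figures are taken from the report excerpts above; the helper function is illustrative, not part of EY’s methodology):

```python
# Sanity-check the ratios quoted from EY's ICO study.
# Figures come from the report excerpts above; pct() is an illustrative helper.

def pct(part, whole):
    """Return part/whole as a percentage, rounded to the nearest point."""
    return round(100 * part / whole)

projects_with_products = 25  # reviewed projects that had working products
accepting_fiat = 7           # of those, projects also accepting other currencies

# Seven of 25 shipping projects accepting fiat undermines the utility-token
# model for more than a quarter of them.
print(pct(accepting_fiat, projects_with_products))  # 28

# EY: 71% of the class of 2017 have no offering in the market at all,
# leaving 29% with a product -- consistent with TBR's "around one-quarter."
print(100 - 71)  # 29
```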

Will next year be better? The blockchain infrastructure companies will likely be surpassed by a second wave of ICO-funded companies, with most of these taking an asset-backed approach to token issuance, essentially creating a product that is enterprise-ready at a time when buyers are not convinced of the benefits of placing all their assets on the public blockchain domain. This raises the question: Do new-wave ICO-funded companies need to rip pages from Ethereum’s playbook or simply play within its orbit? Ethereum is not a one-size-fits-all solution, but it certainly provides a solid foundation for many to learn from, especially around its “smart” contract functionality. Further advancing along some of the must-do steps EY pointed out in its December 2017 report, this second wave will more adequately address the need for clear justifications for blockchains and tokens; an ICO process more closely aligned to the initial public offering (IPO) process; enhanced security; and something close to legal compliance, or the regulators will simply begin substantial enforcement. In short, privacy trumps transactability.

The regulatory aspect piques my interest, in part because of the know-your-customer (KYC) aspects of post-ICO-linked financial transactions and recent efforts of EY, among others, to better incorporate emerging technologies into anti-money-laundering and KYC operations.

In this wild west, with its unregulated moral hazard, where does EY fit in?

My initial thoughts had the consultancy as the “Deadwood” preacher, known to all and trusted, but neither the law nor the bank. My colleagues convinced me EY will be more like the General Store, providing certified, trustworthy services and goods, helping clients mine for gold without shortcuts and faulty equipment that bring down the whole operation. Now imagine artificial-intelligence-enhanced, blockchain-powered resupply brought into Deadwood.

Four by four by four equals four

Four straight weeks of traveling for work, to four different cities to meet with four different clients, brought out four thoughts about where the IT services and consulting markets stand as we move into the autumn rush of 2018.

First, the next few years will finally bring the shift in the consulting model that we’ve been anticipating for the last decade (since I worked at Deloitte and the partners tasked our team with understanding the other Big Four firms’ moves back into consulting). Outcomes-based pricing won’t become the norm because clients want transparency or because consultancies readily put their own fees and reputations at risk, but because the technology around assessing, measuring and even metering outcomes has improved dramatically in the last couple of years. And asset-based consulting will become the norm because consultancies can finally fully marry their intellectual talent to repeatable, scalable, configurable solutions infused with more than just methodologies and industry knowledge.

Second, the word “maturity” has started creeping into conversations about emerging technologies such as artificial intelligence, enhanced analytics, Internet of Things and even blockchain. The smart consultancies have recognized the buzz around emerging tech has produced clients that have enough experience, both good and bad, to think of themselves as more than just novices needing consulting help to understand the emerging new-world customer and employee. These clients don’t want to be amazed by cool tech. They want their experience to be acknowledged and built upon, and they want to move faster. Recognizing maturity means talking about deeper, more lasting — and more expensive — engagements, much to the benefit of consultancies.

Third, I anticipate an era of internally splintered consultancies competing with globally managed firms, creating a weird market with nominally global players drifting toward highly localized operations, while a couple of large consultancies maintain centralized, uniform cultures and organizations. Mirroring the political, economic and demographic forces behind the recent rise of populism and nationalism, some global consultancies could see local and regional practices pressured by trends in data sovereignty and cybersecurity, combined with a spillover from political populism and accelerated by agile technologies that can be spread rapidly and customized for micro-differences more quickly than before. If this trend develops, the consultancies opting to go all-in on one approach or the other will succeed. Those slow to decide or trying to muddle through a middle-ground arrangement will see the market surpass them.

And fourth, I come back repeatedly to leadership, a topic I’ve written about extensively in our analysis and in a couple of blog posts and special reports. Leading a consulting or IT innovation or systems implementation team today requires mastering new technologies, understanding a client’s industry and their position within it, and navigating shifting centers of budget and decision making. None of that differs greatly from previous generations of IT, except that today the diversity of talent demands more capable leaders and the speed of technological change demands increased humility and adaptability, plus a greater willingness to form, manage and lead flexible teams. Companies I see that recognize the talent shift, including changes brought on by millennials, and understand the impact on their leaders — and the company’s imperative to train and equip those leaders — repeatedly stand apart from the pack.

Postcards from the edge: Complexity is here — wish it were not

“Analog dollars to digital pennies” is a phrase used to describe the continued compression of technology price points as Moore’s Law economics, coupled with continued IP abstraction, creates economic trigger events aimed squarely at legacy business model best practices. Recently, I attended analyst events in New York City — one sponsored by Lenovo and one by Canonical — that outlined these economic trigger events, albeit from different sides of the same coin.

Canonical CEO Mark Shuttleworth used the term “economic trigger events” often in his opening remarks. The idea is that technologies and new price points create trigger events that result in new economic fundamentals where some participants will be disruptors and some will be disrupted.

The rise in the hype cycle around edge computing as it joins forces with cloud, artificial intelligence (AI)/machine learning, and Internet of Things (IoT) creates a veritable Gatling gun of economic trigger events. These events accelerate business model disruption as we pivot to the Business of One era.

The disruption sits atop the continued economic pressure from commoditized hardware. What’s behind it all is that while infrastructure is valuable, it is not valued. In short, the margin moves out of infrastructure and into the business outcome. Technology enablement is less constrained by affordability and more by determining what business value can be derived from the application or use case.

Rather than being the lead decision in business investment decision making, infrastructure acquisition becomes the derived decision. The service attach rate, or services drag, becomes the fuzzy guide point for new inventions and new business models. Broad, ubiquitous ecosystems become imperative to generating sufficient margin in the digital-penny world to justify the ongoing development, monitoring and maintenance of flexible infrastructures that won’t break and will keep data secure and private.

For Canonical, this means focusing on the connection between operating system and cloud control planes to ensure a single code set operates silicon as large as high-performance computers (HPCs) and as small as single-purpose IoT devices. Compute infrastructure is assumed to work, until it breaks, and then users realize just how valuable that hidden infrastructure provisioning is.

While Canonical hardens the abstraction layer to ensure seamless interoperability, Lenovo (like many other hardware manufacturers) creates purpose-built appliances optimized for edge workloads. In some instances, these will be small appliances simply capturing data and routing it back to clouds for ingestion into massive analytics engines. In others, they will be very-high-performance compute engines with GPU accelerators in simple, easy-to-operate form factors, where real-time AI inference has to be performed at the edge. Here again, the assumption will be that the edge appliance can operate (in a retail convenience store, for example) without any technically savvy personnel to monitor, manage or provision the device on-site.

Look for more detailed special reports from TBR on the Lenovo and Canonical industry events in the next few weeks.

 

As the 2018 finish line nears, will McKinsey continue its breakaway or slip back into the peloton?

Every six months, we review McKinsey’s performance, analyze its strategy and consider likely scenarios for the firm in the near term.

Back in June, we said, “If McKinsey truly gains market permission to take digital transformation from design through execution, its peers will be pressured to emulate the firm.” In advance of the November profile, we’ve considered how the firm performed through the first part of 2018 and put together a couple quick thoughts as the end of the year nears, specifically around some McKinsey moves in France and a potentially pressing market trend.

As part of a burst in recruiting at McKinsey France, three partners with specializations in robotics and automation relocated to the firm’s Paris office, filling in portfolio gaps. McKinsey also launched a high-profile and seemingly civic-minded skills-development program to combat stubbornly high unemployment across France. Along with Bain & Co., Oliver Wyman and others, McKinsey seems to be sensing new growth in the France consulting market, likely driven by Brexit and digital transformation. Brexit’s pending implementation constrains consultants working between the U.K. and members of the European Union, making the French consulting market more congested as well as more lucrative for well-positioned and staffed consultancies. French companies have also accelerated efforts around digitization and digital transformation, driving consultancies such as McKinsey to place more emerging technology resources on the ground in Paris.

Not all will be wine and roses — or champagne and irises — as McKinsey faces a market trending toward transparency even as the firm manages the fallout from several high-profile mistakes (e.g., South Africa, Eskom)

Competitors have always complained about McKinsey’s business practices, and clients have always complained about the firm’s pricing and failure to follow through on implementations. Customers and/or markets emphasizing transparency may be far less forgiving of McKinsey’s methods, and larger consultancies with positive brand strategies and market presence — such as EY, PwC and Deloitte — may be well equipped to displace McKinsey, given the opportunities. Even as McKinsey adopts risk-sharing and outcomes-based pricing for some engagements, tech trends such as cloud, General Data Protection Regulation (GDPR) compliance, analytics and SaaS point to increased transparency and visibility into actual results, with clients potentially assessing whether McKinsey’s high consulting fees are justified.

We will be marking all these moves in our upcoming Management Consulting Benchmark, comparing McKinsey against 12 other consultancies across four service lines and countless metrics. In June we quoted a longtime partner at a Big Four firm, who said, “I don’t know what McKinsey will do next. I do know they’ve consistently been a couple steps ahead of us.” Maybe the timing is right for those competitors to catch up.

 

VMware leans on partners IBM and AWS to go increasingly all-in on cloud

What’s new from VMworld on the cloud front?

VMworld 2018 in Las Vegas came to a close just a few short weeks ago, but the impact from the slew of cloud-related announcements from VMware and its partners continues to reverberate. After multiple changes in course over the past 10 years as VMware reacted to the shift toward cloud computing, the company has found a strategy that works. VMworld 2018 showed the company doubling down on its partnerships with leading cloud providers and addressing customers’ cloud management pain points.

To extend VMware’s relevance in the cloud management space, the company announced both an acquisition and a host of organic portfolio updates during the conference. Notably, the company announced its intent to acquire CloudHealth Technologies to further its multicloud management and operations capabilities with CloudHealth’s platform and expertise in Microsoft, Google and Amazon Web Services (AWS) clouds.

Portfolio updates were announced across the vendor’s Workspace ONE, VMware Edge, vRealize, vSAN, vSphere, VMware Cloud Foundation and vCloud Director portfolios. Additionally, partner program enhancements through the VMware Cloud Provider Program and announcements with key partners AWS and IBM in regard to partner-based cloud solutions were made as well. While the portfolio updates are notable, much of VMware’s relevance in the cloud space as of late is coming from its alliances and partnerships. It took the shutdown of VMware’s own vCloud Air service, but the vendor’s partner-led cloud strategy reinforces the value of VMware in cloud and hybrid environments.

VMware’s alliance with IBM is a decades-long, increasingly strategic partnership that now spans customers’ and cloud data centers as the two companies work together to help enterprises modernize their traditional and virtual environments into truly hybrid environments. VMware and IBM look to optimize customers’ existing IT assets with strategic cloud workloads and functions as well as the help of thousands of VMware specialists within IBM Services. Additionally, their tenured relationship turned even more strategic in 2016 as IBM helped VMware re-enter the public cloud market with IBM Cloud for VMware Solutions. At the conference, the two announced vCloud Availability for vCloud Director on IBM Cloud, a disaster recovery solution for multitenant environments that enables IBM Cloud to serve as a failover for workloads on VMware environments. After the successful migration of a few key legacy applications to VMware HCX on IBM Cloud for American Airlines in recent months, IBM and VMware also unveiled JumpStart enhancements, announced at VMworld, to help customers migrate their existing on-premises VMware workloads to IBM Cloud.

VMware’s partnership with AWS is very well known, in large part due to the number of customers using technologies from each of the two vendors and because of both companies’ former reluctance to address new delivery methods. VMware was very much on-premises focused while AWS was solely public-cloud focused, making the partnership and VMware Cloud on AWS a notable shift for both vendors. At VMworld 2018, the two companies announced the expansion of their relationship, including the global extension of VMware Cloud on AWS into Australia, a Cloud Provider Hub that allows partners to offer VMware Cloud on AWS as a managed service, AWS Relational Database Service on VMware, NSX integrations, price reductions, Log Intelligence for VMware Cloud on AWS, Instant Data Center Evacuation and more.

Let’s dig a little deeper into these two partnerships and solution sets

While much of the recent focus in VMware Cloud partner developments has been on AWS, VMware’s partnership with IBM has existed longer and encompasses more than IBM Cloud for VMware Solutions; thus, the bulk of the integrations are well established, and the two announce new features, integrations and functionality as their portfolios evolve. To note, back in August 2016, VMware announced that IBM would provide the first service offering for VMware Cloud Foundation and also train more than 4,000 services professionals with expertise around VMware solutions.

In line with market trends, VMware is partnering with as many leading vendors in their respective technology markets as possible, ultimately to meet and exceed the demands from its customers for multivendor environments, integrations and interoperability across environments. Each of the aforementioned partnerships fits its own customer set, albeit with slight overlap, and addresses specific customer pain points. VMware and AWS are poised to capitalize on midmarket and small enterprise opportunities, with an emphasis on cloud specifically, while VMware and IBM are poised to capitalize on opportunities in the midsize and large enterprise sectors, with a hybrid IT emphasis, optimizing customers’ blends of cloud and legacy IT assets.

While VMware’s partnerships with IBM and AWS may seem like six of one and half a dozen of the other, the differences reveal themselves when looking at hybrid IT as a whole rather than cloud only: IBM and VMware naturally have a longer, more strategic relationship that encompasses virtual and cloud environments spanning customer and vendor data centers.

To make this a little easier to digest, we’ve developed a table that includes some key solutions recently announced by VMware and AWS and compares them to existing and new IBM and VMware solutions in regard to how customer pain points can be addressed. While the technical functions available from both VMware partners are aligned, many of the target customers will be different.

 

It is our perception that the VMware and AWS partnership better suits organizations that embrace public cloud, whether for budgetary reasons, risk sharing or lack of IT staff. Alternatively, IBM is the partner of choice for IBM and VMware large enterprise customers. Joint IBM and VMware solutions are tailor-made for organizations with large on-premises data centers that remain fully functional and thus are not yet ready to be shut down in favor of public cloud only, serving instead as a blend between the old and new.

Sponsored by IBM

The JEDI contract: We don’t see the force; do you?

Considerable ink has been sacrificed parsing both the wisdom and the potential winners and losers of the Joint Enterprise Defense Infrastructure (JEDI) contract award, which is estimated to be worth up to $10 billion over 10 years. TBR wrote two commentaries around the topic in June that handicapped the potential bidders and outlined the fundamental consumption model shifts triggered by continued technological innovations changing public sector procurement factors from affordability to governance compliance — or from “wallet to will.”

The impetus for revisiting the contract flows from the Pentagon again pushing back the timeline on submitting proposals by three weeks. That the Department of Defense once again pumped the brakes despite a desire to accelerate modern technology procurement, while simultaneously ending the Q&A portion of the solicitation period, highlights the tensions and challenges created by the shift from wallet to will.

At TBR, we continue to question the efficacy of a single-source contract for cloud infrastructure services. This concept of working with one cloud vendor for compute was leading edge in the commercial space about a decade ago as enterprises wrestled with how to automate the seamless movement of applications and data between on-premises and cloud compute instances. With this technological problem largely addressed in the hybrid cloud era, the new technological challenge facing leading enterprises is automating that seamless deployment across multiple cloud environments.

In this adoption of new, consumerized technologies, we see the disruptive forces aiming at the public sector IT market opportunity. The Pentagon seeks a single-source captive solution, betting on one firm’s ability to stay ahead of the market on innovation for a decade. Such a bet makes little sense to TBR or to the industry executives with whom TBR has discussed the JEDI contract structure. Furthermore, much has been written lately about the concept of asymmetric competition, which postulates that open platforms actually shorten the competitive advantage windows technological innovations provide to technology vendors. In short, being able to exploit leading edge technology requires that companies lessen their reliance on single-source vendors rather than doubling down on them.

It has been argued that military strategists plan how to fight battles without fully appreciating the changes in warfare they will face going forward. Awarding a single-source, 10-year contract for technology innovation sounds like a continuation of that misaligned planning process. In the private sector, such deals trigger business exits from improper planning. In the public sector, the exposure to lagging strategic planning manifests in threats to national security.

SaaS sweetens the cloud pot but requires vendors to up their ante to participate

Despite the simple graph in Figure 1 depicting SaaS market size, the space remains difficult to sum up. In the eyes of customers, SaaS options are proliferating and spanning a wide swath of business functions and stakeholders. Yes, SaaS is the largest segment of the “as a Service” cloud market — and yes, it will continue to expand. Beyond that, however, SaaS will remain a collection of separate markets, with most vendors specializing in one or two core and adjacent areas, instead of one unified opportunity. Some examples of this fragmented and overlapping landscape include Microsoft leveraging collaboration dominance to reinvigorate its CRM strategy with cloud delivery, SAP returning its focus to SaaS CRM after ceding the market to Salesforce, and Workday investing to build out a financials-focused SaaS business from its HR roots.

This behavior contrasts with the IaaS market, which is highly consolidated around a standard set of often interconnected services and a small collection of vendors. In the SaaS market, growth will be achieved by new vendors addressing new workloads and features. From a vendor standpoint, there will be a greater presence from legacy application providers such as SAP, Oracle and Microsoft, but also plenty of room for niche providers as functional and regional niches develop.

While SaaS will grow the overall cloud opportunity, the challenge for vendors is that the SaaS opportunity will be more difficult to capture. That is not to say the historical model for SaaS adoption will cease to exist; there will still be SaaS purchases that are driven by lines of business (LOBs), transacted with a credit card in some cases, and deployed separately from legacy systems. At least some of the growth will continue to occur in that shadow IT model. However, much of the growth will be from SaaS solutions that deliver more critical services, are procured by joint IT and LOB teams, and are tightly integrated with legacy systems. These scenarios will require vendors both large and small to up their ante, bringing more sales, integration and support services to the table to win these more complex deals.

‘Best of breed’ spawns diversity in the SaaS provider landscape

The vendor landscape may be consolidating on the IaaS side of the cloud market, but that is not the case for SaaS. As seen in Figure 2, customers are most likely to increase the number of SaaS vendors utilized over the next two years, supported by a number of market trends, including new workload and feature adoption, platform ecosystems, and integrated multicloud deployments.

For workload adoption, there is a leveling of the playing field in terms of which services customers are considering for cloud deployment. ERP, for example, used to lag in public cloud adoption but is now much closer to par with commonly adopted services such as CRM and HR. Much of this increased consideration comes from customers’ growing comfort with running sensitive workloads on cloud providers’ infrastructure rather than in their own on-premises data centers.

The second factor is the proliferation of complementary services available via PaaS ecosystems. The most tenured and largest example of this comes from the Salesforce Platform, which supports thousands of ISVs developing and selling solutions that complement and extend core CRM. Salesforce may have been the first, but other SaaS vendors, including SAP, Workday, Microsoft and ServiceNow, are taking the same approach, exponentially growing the number of available SaaS services. The last driver is the continued rise of best-of-breed customer purchasing. For contracting and performance reasons, customers have long yearned for multivendor application environments, and now vendors are actually moving to accommodate that desire. Salesforce’s acquisition of MuleSoft and SAP’s introduction of the Intelligent Enterprise vision are the latest examples of how vendors are supporting customers in choosing and integrating solutions from numerous providers.

Expectation inflation raises the bar for SaaS providers

There may be a growing pool of revenue and room for more providers, but meeting customer expectations for SaaS solutions is anything but easy. Expectations have been on the rise, stoked by the greater control buyers have with cloud solutions versus on-premises software. The days of long-term software contract risk falling entirely on the customer are quickly coming to an end. Not only has the power dynamic shifted, but, as shown in the graph below, customers are successfully using more of their IT dollars to fund innovation over maintenance of existing systems.

As a result, different evaluation criteria are being used for IT investments. Up front, there is a much more collaborative process between IT and LOB teams as they decide which offerings meet their underlying business need, not just what fits into their existing footprint. Calculating the benefits and return from SaaS investments is also a challenging task, as deployments treat business outcomes as the ultimate goal. Although hard ROI calculations elude most customers, it’s clear that enhanced levels of support and “customer success” roles are increasingly valued. Having these post-sale resources available, and putting a greater focus on outcomes and other intangible benefits than on technology benefits, seems to be the best way for SaaS vendors to meet inflated customer expectations for what the solutions can and should do for their business.

 

 

The 4 P’s of marketing – people, process, partners and platforms – emerge behind AI and compel vendors to adopt S-centric frameworks


Market dynamics will evolve in the next 5 years, with voice and video the core conduits for trusted and tangible AI-based marketing campaigns

The digital marketing services (DMS) market will grow at a CAGR of 16.2% from 2017 to 2022, reaching $125 billion, as organizations across geographies adopt artificial intelligence (AI)-enabled, customer experience-based voice and video solutions to run outcome-based campaigns addressing business pain points beyond brand awareness. “Marketing in the moment” frameworks will continue to dictate the shift toward hyper-personalization as consumers’ attention becomes the new currency and creates opportunities in areas such as omnichannel delivery and intelligent operations.
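The growth math implies a sizable starting base. A quick sketch of the compound annual growth rate (CAGR) arithmetic, assuming $125 billion is the 2022 endpoint (the 2017 base below is derived from TBR’s stated CAGR, not quoted from the forecast):

```python
# Derive the implied 2017 DMS market size from TBR's forecast:
# a 16.2% CAGR from 2017 to 2022, reaching $125B in 2022.

cagr = 0.162
years = 5           # 2017 -> 2022
end_value_b = 125.0 # $125 billion in 2022

# CAGR definition: end = base * (1 + cagr) ** years, so the base is:
implied_2017_base_b = end_value_b / (1 + cagr) ** years
print(round(implied_2017_base_b, 1))  # 59.0 -- roughly a $59B market in 2017
```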

The shift from brand awareness to activation and support results in four new P’s of marketing — people, process, partners and platforms — leading to data management issues and opportunities. Winning vendors can adopt “S”-centric frameworks that emphasize closing skills gaps, delivering at scale and being in sync with partners’ visions, and addressing customer data silos through the development of interoperable and secure solutions.

Portfolio and go-to-market transformation and AI solution integration will be among the levers vendors can use to capitalize on a growing DMS market. Feeding the hype of AI could be a double-edged sword if technology and services vendors cannot deliver on the promise to shift the perception of marketing from a cost center to a business value driver.

AI-based voice and video platforms will increasingly take center stage as enablers for delivering campaigns in hybrid marketing environments, helping brands better connect consumers’ offline and online professional, purchasing and social behavior data. Technology partnerships and expertise in integrating platforms such as IBM Watson, Google and Adobe Sensei in the business-to-business segment and Amazon Echo and Google Home in the business-to-consumer segment will be key to services vendors’ success. The inability of vendors to recruit and retain talent with skills in these technologies might hinder market share as vendors are unable to address tasks at speed. Lastly, within the next two years, the broad-based adoption of AI across omnichannel platforms will reduce the need for multiple vendors to support engagements, and will also result in new opportunities in intelligent marketing operations.

For more information, contact Senior Analyst Bozhidar Hristov ([email protected]).

Robots laundering IT budgets?

Automate a bad process or fix the process first?

As consultancies expand their robotic process automation (RPA) offerings, and the software and related services permeate enterprises’ procurement, human resources, IT and even internal audit functions, a curious debate has surfaced: should companies opt for careful, meticulous process assessment, documentation and improvement, or simply throw some robots at a process to capture the cost savings as quickly as possible?

I’m not surprised this discussion surfaced, given the rapid adoption of RPA solutions by companies in a wide variety of industries and the sustained investment by IT services vendors and consultancies in the people and assets needed to implement RPA (see any of our reporting in the last year on EY, Accenture, PwC or Capgemini). What surprises me is where the vendors and clients come down on this debate. At a recent three-day event, I listened to fairly passionate discussions on the topic, with clients taking the position that a company needs to do the standard evaluate-improve-refine work on its processes before applying automation. In contrast, the consultants — the ones best positioned to provide advisory services around that standard approach and charge for those services — argued for the fast fix-and-go approach. One consultant noted, “Throw robots at a bad process if it saves time and money now. … Then reinvest those savings into whatever else you need completed.” One client, who changed her mind by the third day after absorbing the consultants’ lessons, described some processes as tasks that employees “deplore, but must be done accurately, timely and repeatedly to help run the business.” She said RPA could be applied to these, with the savings poured into new artificial intelligence or other desired-but-not-a-priority initiatives.

Of course, it’s not that easy, or robots would be doing every deplorable task and automating every aggravating process

And plenty of consultancies will continue to offer process optimization and change management as core elements of most engagements. I’ll be watching the consultancies that have invested heavily in RPA: how they describe their engagements, which clients they highlight, and how their talent models shift over the next year. I’ll also be looking for examples of companies that embraced rapid RPA deployments knowing not every process had been improved first, and threw the robots at them anyway. Most importantly, we will be asking about the redeployment of those cost savings.

My colleague Jen Hamel’s Digital Transformation Customer Research, published in March, noted that clients haven’t been investing as much in data management, “despite the struggle organizations face with underlying data integrity and standardization issues that hamstring generation of actionable insight and limit analytics solution value.” She went on to note that TBR expects “this trend will be exacerbated by the proliferation of connected devices, ingestion of new data from the incorporation of additional sensor technology and breakdown of silos accelerating data inputs as processes are transformed.” So, if I had to bet, I’d say the smarter enterprises will be plowing RPA savings into the less-exciting, more-impactful data management tools or enhanced capabilities around risk and compliance. As Jen also pointed out, “[As] 66% of DT [digital transformation] services buyers used a vendor from a prior IT services engagement, existing relationships in clients’ IT organizations are a good starting place for ascertaining and accessing DT budgets.” I’ll be watching this closely, all the while wishing I had a robot to do the watching while I do the thinking.


Digital transformation advances analytics and insights

Realizing the dream of AI-embedded business processes must start with people and data management

Every enterprise looks to use emerging technologies to cut costs, grow revenue or create new business models. The combination of changes in how people work and what new technologies can best be applied creates massive opportunities for services vendors. This new market — broadly defined as “digital transformation” (DT) — will evolve through the current hype peak into a long, steady stream of fundamentally traditional services engagements involving a mixture of process knowledge and technical expertise. Though no longer “emerging” technologies, data management and analytics software remain at the core of DT initiatives and adoption of truly emerging technologies such as artificial intelligence (AI). As the analytics and insights (A&I) professional services market matures, competencies around AI, human-centric user experience design and DT-related change management will be key to vendors’ future growth.

To sustain A&I professional services opportunities over the long term, vendors must stay on top of AI technology developments while maintaining a broader perspective on the impact of AI on clients’ business processes and human resource (HR) strategies. As AI adoption grows, so does the technology’s complexity, particularly at the intersection points between humans and machines and between regulatory policy and technological innovation. We expect rising concerns around security and governance, regulatory compliance, and HR implications of AI systems will continue to drive consulting and solution design engagements tied to broader DT initiatives. To capture these expected opportunities, A&I services vendors invest in service and technology offerings to assist clients with AI adoption — from upfront advisory to data integration, application development and managed services. However, despite vendors’ massive investments in AI capabilities and a growing number of high-profile use cases for the technology, TBR’s research around enterprise DT initiatives indicates clients have not fully bought into the value services vendors can provide in creating AI solutions, suggesting vendors have a marketing challenge to overcome.

The model for a successful vendor is indeed a tall order. Our research indicates enterprises undergoing DT want vendors to understand their business problems in both industry and functional contexts, create solutions that mesh with their existing IT environments and maintain security, and cultivate robust ecosystems of best-of-breed technology partners. While building out strategies around each of these pillars, vendors should message how they can address technical challenges such as data preparation and training to help clients start and continue experimenting with AI, as well as provide process transformation and change management advice to enable clients to bring those experiments to scale. Vendors must walk a fine line between establishing a long-term vision for the future of business and directing clients where to take the first step toward achieving their goals. Framing AI adoption in the context of methodical modernization of individual business functions, rather than as an excuse to play with cool new technology, will keep vendors on the right side of that line.

TBR will continue to monitor the impact of AI on vendors’ go-to-market strategies and enterprise customers’ IT and professional services buying behavior through its Analytics & Insights Professional Services Vendor Benchmark, Digital Transformation Services Market Landscape and Digital Transformation Customer Research. For deeper insight on this topic, see our event perspective on the 2018 O’Reilly Artificial Intelligence Conference, held this past April in NYC.

For more information, contact Senior Analyst Jennifer Hamel ([email protected]).