Letting your clients pick you: Tweaking the digital transformation center model

“Start with a new space, furnish it with funky chairs, nontraditional work spaces and all the latest technologies. Recruit creative talent, mixed with some data scientists and wonder-tech folks, plus seasoned strategists. Bring in current clients and consult on digital transformation.”

The last time I talked about getting leadership right at digital transformation centers, I made the comment above and also mentioned three other critical elements: client selection, talent management and technology partner cooperation. I’ve seen steady evolution in client selection, but an event last week showed me how much room consultancies still have to play around to find a winning formula. IT services vendors, led by consultancies, initially designed these centers to cement relationships and expand their footprint with existing clients, often by demonstrating capabilities and services beyond the current engagements. For example, PwC’s supply chain management clients learned the firm also offers cybersecurity, and companies contracting Accenture for BPO could immerse themselves in that consultancy’s digital marketing services. Because initial investments drove the need for cost justification, most consultancies began by opening their doors to any and all clients, with predictably mixed results.

Consultancies learned that the most efficient use of these centers included only clients on either end of a simple digital transformation spectrum, forcing the firms to spend additional sales time and effort ensuring clients came prepared. Consultancies stopped wasting half of a one-day workshop resolving a client’s internal political dysfunction or just beginning to scope core business problems, and instead cemented processes around client selection and preparation.

Which leads me to what I saw last week. We will soon publish a special report on it, but the event was strikingly different from every other one I’ve attended: the consultancy intentionally stayed in the background — a self-described footnote at its own event — allowing clients and non-clients the space and time to collaborate, share cross-industry struggles and innovate. In many cases, attendees realized that their combination of confidence in the ability to change and ambiguity about how to do so pointed to one epiphany: they needed to hire a consultant. Very likely the one who had been in the room for the last two days. Subtle, smart, and maybe possible only because the event was off-site for everyone and intentionally mixed Fortune 500 companies with small to midsized enterprises.

So, a curious twist. Instead of centering on existing clients and ensuring the right ones come to the right collaboration/digital transformation center at the right time, this consultancy allowed clients and potential clients to self-select how much more advice they needed. I don’t expect wholesale adoption of this approach by other vendors, but I believe I will see elements of it repeated as everyone continues learning what works and what doesn’t, always looking for what’s best for the client and what starts returning some of the investment in these centers.

Time to get industrial about healthcare

Internet of Things (IoT) hesitation in the healthcare vertical stems from the industry’s complexity: it is constrained by liability and privacy issues, a general unease about change, legacy equipment, and unevolved processes. These complexities are all rooted in real concerns of customers and vendors in the healthcare space. However, the “Industrial IoT Analytics for the Healthcare Industry” presentation by Glassbeam employees Gopal Sundaramoorthy and Puneet Pandit at PTC’s LiveWorx event highlighted that it is time to shift how vendors go to market within the healthcare industry.

Sundaramoorthy indicated there are not a lot of high-level analytics, or grand-scheme IoT implementations, in healthcare. The challenges mentioned above, especially privacy issues, including healthcare organizations’ desire to keep data internal, prevent them. Instead, Sundaramoorthy explained, vendors need to talk to healthcare organizations the way they talk to manufacturers, focusing on how healthcare organizations can connect equipment to improve asset utilization, save costs and increase efficiencies. This is the operational technology (OT) discussion instead of the IT discussion.

With asset utilization, for example, how is a medical scanning device being used? How many scans are being done and in how much time, what types of scans are being done, and when are the scans happening? Or, a conversation around operator utilization could include aspects such as determining whether operators are fully trained by measuring which functions they use and how long they take compared with average or trained users. Likewise, predictive maintenance, such as noting when a bulb needs to be replaced in an MRI machine, helps avoid costly or dangerous downtime. These simpler-to-implement OT-based measurements will help hospitals run more efficiently and save money just through connecting machines and adding straightforward analytics. They also help medical device manufacturers better understand why things go wrong and how best to improve diagnostic time, shorten repair time and relieve frustration for medical professionals.
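To make the asset utilization conversation concrete, here is a minimal sketch of the arithmetic involved, in Python. The event-log format (scan type, start and end times) and the working-day window are illustrative assumptions, not any vendor’s actual schema.

```python
# Minimal sketch: OT-style utilization metrics from a scanner's event log.
# The log format and field names are illustrative assumptions.
from datetime import datetime, timedelta

scan_log = [
    {"scan_type": "head",  "start": datetime(2018, 6, 4, 8, 5),  "end": datetime(2018, 6, 4, 8, 50)},
    {"scan_type": "spine", "start": datetime(2018, 6, 4, 9, 10), "end": datetime(2018, 6, 4, 10, 5)},
    {"scan_type": "head",  "start": datetime(2018, 6, 4, 13, 0), "end": datetime(2018, 6, 4, 13, 40)},
]

def utilization(log, day_start, day_end):
    """Fraction of the working day the device spent scanning."""
    busy = sum((e["end"] - e["start"] for e in log), timedelta())
    return busy / (day_end - day_start)

def mix_by_type(log):
    """How many scans of each type were performed."""
    counts = {}
    for e in log:
        counts[e["scan_type"]] = counts.get(e["scan_type"], 0) + 1
    return counts

day_start = datetime(2018, 6, 4, 8, 0)
day_end = datetime(2018, 6, 4, 18, 0)
print("Scans today:", len(scan_log))
print("Mix by type:", mix_by_type(scan_log))
print("Utilization: {:.0%}".format(utilization(scan_log, day_start, day_end)))
```

The same log, keyed by operator instead of scan type, would support the operator-utilization comparison Sundaramoorthy described.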

Sundaramoorthy indicated that simple connectivity is healthcare’s biggest problem. To break the hesitation barrier, vendors should focus on solving the first step in IoT: connecting the often woefully out-of-date machinery and building in IoT, in the spirit of OT, to prove ROI to medical organizations. After machines are connected and OT-based IoT is delivering consistent ROI, the move to more transformative IT use cases will be a much easier sell.

Smart city solutions have to think outside the trash bin

The “Connecting Your Business to the Smart Cities We All Live In” panel during PTC’s LiveWorx event included ideas consistent with TBR’s previous views on smart cities. One of the most interesting speakers was Nigel Jacob, the co-founder of the Mayor’s Office of New Urban Mechanics, an R&D organization within Boston’s City Hall. Jacob gave a presentation on the “Boston Smart City Playbook,” compiled by his organization, which lists the following rules for vendor engagement:

  1. Stop sending sales people.
  2. Solve real problems for real people.
  3. Don’t worship efficiency.
  4. Better decisions, not (just) better data.
  5. Platforms make us go ¯\_(ツ)_/¯.
  6. Toward a “public” privacy policy.

All of these points align well with TBR’s view of how vendors need to improve their go-to-market strategy, but a few stood out. “Stop sending sales people” translates well inside and outside smart city applications. The Internet of Things (IoT) is a complex technology, and it is difficult for end users to really understand what IoT can do for them. Public sector officials, just like the CEO, CIO or CTO of any private organization, do not want to listen to a sales pitch about why a technology is great. Instead, in the example of Boston, decision makers want vendor engineers or consultants on-site to explain why IoT is good for their city’s particular challenges, how it can be implemented and how it has worked for others, as well as to provide concrete evidence of what Boston can expect to gain in the long run. Only then will a vendor’s solution be taken seriously.

“Better decisions, not (just) better data” is a point TBR believes vendors should take to heart. Data is a building block to insight, but piles of data with no feasible way to turn them into actionable insight are little more useful than no data at all. Customers seek insight through data, but if there is no easy path to achieving insight, the data’s value is significantly reduced. Customers believe that to get value out of IoT, they need to bolster their IT, operational technology (OT) and data scientist staff. TBR believes incorporating artificial intelligence and improving user interfaces to simplify IoT products is a path to unlocking value for business decision makers, enabling them to make better decisions without incurring huge selling, general and administrative expenses.

“Platforms make us go ¯\_(ツ)_/¯” also parallels customer concerns recorded by TBR. Platforms are exciting to techies, but they do not mean much to customers. Instead, they generally raise fears of platform lock-in, where customers will be unable to access outside technologies or risk becoming tied to a dying standard. Also, the platform-level discussion is often too abstract for customers to understand how IoT will benefit them. Vendors must continue to tout interoperability and focus on use cases or small deployments. Small deployments that solve immediate problems — not technical and platform-based discussions — will be vendors’ gateways to customers. After a few successful small projects, vendors can introduce customers to the grander view centered on a wide platform.

Leila Dillon, Bigbelly’s vice president of North American Distribution and Global Marketing and another presenter during the panel, explained how Bigbelly solved multiple problems for individual cities by thinking outside the box. The company sells solar-powered waste systems, mostly bins, that automatically compact trash and alert waste management when they need to be picked up. This gave cities substantially increased efficiency, not only because automatic compacting eliminated waste buildup but also because the alert system eliminated the wasted time of trucks running routes to check every bin rather than visiting only those that are full. Additionally, Bigbelly observed that by thinking creatively, it could further cities’ smart city goals. It started working with cities to equip waste bins with small-cell technology to enable ubiquitous citizen connectivity. In other cases, the company equipped bins with cameras or sensors to track foot or street traffic to help cities understand congestion. Bigbelly is a great example of a company helping to solve a pointed problem — in this case, making waste collection more efficient — and then working with cities to build additional IoT use cases one success at a time.
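The efficiency gain Dillon described reduces to a simple dispatch rule: send trucks only to bins that report themselves full. A minimal sketch of that logic follows; the fill-level telemetry, field names and 80% threshold are illustrative assumptions, not Bigbelly’s actual system.

```python
# Minimal sketch of smart-bin dispatch: visit only bins reporting (nearly)
# full. Field names and the threshold are illustrative assumptions.
FULL_THRESHOLD = 0.8  # fraction of capacity, after compaction

bins = [
    {"bin_id": "B-101", "location": "Boylston St",  "fill_level": 0.92},
    {"bin_id": "B-102", "location": "Tremont St",   "fill_level": 0.35},
    {"bin_id": "B-103", "location": "Atlantic Ave", "fill_level": 0.81},
]

route = [b for b in bins if b["fill_level"] >= FULL_THRESHOLD]
print("Dispatch truck to:", [b["bin_id"] for b in route])
# B-102 is skipped entirely -- the saved truck time is the efficiency gain
# over routes that check every bin regardless of fill level.
```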

‘Popcorn market’ and ‘shrink-wrapped’ IoT: TBR gets creative with industry terms

Observers of emerging tech trends often seek the “hockey stick” moment, that period when the market takes off following an explosion of activity. However, as TBR Principal Analyst Ezra Gottheil explains in his special report, “‘Shrink-wrapped’ IoT will drive accelerating growth,” an explosion of activity, or huge moment of growth, will likely never occur in the overall commercial IoT market. Gottheil writes:

Each IoT [Internet of Things] solution comes to market at a different time, meaning that as more packaged solutions become available and as some experience rapid growth, the total growth accelerates. The IoT market has been described as a “popcorn” market, in which each submarket “pops” at its own pace — some smaller markets grow explosively, but the total market (the “pot of popcorn”) expands more uniformly.

A popcorn market leads to slowly accelerating overall growth, generating frustration for companies that had anticipated rapid adoption. This is especially true in the IoT market for horizontal IT companies such as Hewlett Packard Enterprise (HPE) and Dell EMC, which find themselves selling into new markets, including product development, operational technology (OT) and data science organizations, instead of traditional IT department constituencies. Gottheil notes that for organizations seeking to benefit from IoT, the key to accelerating growth is developing packaged “off the shelf” — or “shrink-wrapped” — IoT solutions. The increased availability of IoT solutions targeting specific use cases and business processes in industry subverticals will be key to generating IoT-driven vendor revenue for the foreseeable future.
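As a rough illustration of the popcorn dynamic, the sketch below sums several S-curve submarkets that “pop” at staggered times; all parameters are invented for illustration and are not drawn from Gottheil’s report.

```python
# Rough illustration of a "popcorn" market: submarkets that each follow a
# steep S-curve at staggered times sum to a smoothly growing total.
# All parameters are illustrative.
import math

def logistic(t, midpoint, ceiling=1.0, steepness=1.5):
    """Adoption curve for one submarket that 'pops' around `midpoint`."""
    return ceiling / (1 + math.exp(-steepness * (t - midpoint)))

pop_years = [2, 5, 8, 11]  # each submarket takes off in a different year

for year in range(0, 13, 2):
    total = sum(logistic(year, m) for m in pop_years)
    print(f"year {year:2d}: total market size {total:4.2f} of {len(pop_years)}")
# Each submarket jumps from near 0 to near 1 within a couple of years, yet
# the total climbs steadily: accelerating growth, no single hockey stick.
```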

Your digital transformation center needs leadership

Start with a new space, furnish it with funky chairs, nontraditional work spaces and all the latest technologies. Recruit creative talent, mixed with some data scientists and wonder-tech folks, plus seasoned strategists. Bring in current clients and consult on digital transformation.

As companies implement this playbook, a couple of common themes and challenges are emerging, mostly around client selection, talent management and technology partner cooperation. (Look for future blog posts on all three of these.) We had the pleasure of meeting many of the leaders at these new digital transformation centers (in Miami, Dublin, Frankfurt, Dallas, New York City and more), and I noticed common traits among the people charged with running these new places: passionate, invested, visionary. Some places took a kind of “buddy cop” approach, pairing a creative with an executioner (in a good way, for both). Some bolted long-standing capabilities onto an acquisition. The real kicker: these centers need nonstandard leaders, even as the larger firm — the board that just invested $20 million in a new space and new talent — wants to ensure the investment pays off and puts a trusted, almost always longtime company professional in charge. And that makes leadership more critical than ever.

The best we’ve met (a highly subjective and personal assessment) echoed lessons I learned during my brief time in the U.S. Army and my long exposure to U.S. military culture: train everyone, especially the leaders, and train them for their next job; promote them when they’re ready and support them with more training as their responsibilities evolve. One center leader described to me how her company invested in her management skills, ensuring she could handle the diverse set of backgrounds, skills, expectations, and corporate cultural mindsets she would be leading at the new center. Longtime professionals who grew up within a firm might be able to manage teams mixed with experienced and new hires. But leading such a team requires skills not typically gained from serving only in one organization or growing professionally mostly through similar roles.

As much as I’ve enjoyed digging deeper into the substance behind the hype of these centers — the funky chairs and bleeding-edge tech and clients taking journeys to digital transformations — we still want to understand the business case, the strategies and the metrics that determine whether these substantial investments of money and brand are beginning to pay off. From what we can see to date, success still relies on what it always has: leadership and teamwork. Companies recognizing this lead the pack right now, especially as that pack becomes crowded with cloud, network and legacy IT vendors all looking to play in the digital transformation space.

 

Will digital transformation be the catalyst for adoption of new outcome-based pricing models?

Every day I find myself reading about the developments happening in business-to-consumer (B2C) pricing.

Here’s a sample of those that jumped out recently:

These developments highlight the growing momentum behind providing dynamic, value-based and outcome-based pricing models, a movement being driven by companies’ desires to provide personalized customer experiences at scale.

While this push has been most publicized and noteworthy in the B2C world, driven by the likes of Uber, Netflix and MoviePass, it also consistently permeates the complex business-to-business (B2B) IT products and services world that we focus on. “How do we shift from a cost-plus to value-based pricing model? Are companies really doing outcome-based pricing? Who is doing it well, and for what types of customers? How?” These are common questions vendors are trying to sort through as they change their businesses.

Often, we’ve heard that IT vendors are serious about making outcome-based pricing models work but that customers are putting the brakes on these types of arrangements. Customers will ultimately balk at the variability and risk of an outcome-based arrangement at some stage of a deal negotiation and push vendors to offer predictable fixed-price engagements. Customers like the idea of not paying when an outcome is not achieved more than they like sharing the benefit when an outcome is met, and somewhere in that trade-off the fallback becomes a traditional contractual arrangement.
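The trade-off is easiest to see with stylized numbers. The sketch below compares a fixed fee with an outcome-based fee from the customer’s side; every figure is an illustrative assumption, not drawn from TBR’s research.

```python
# Stylized view of the fixed-price vs. outcome-based trade-off.
# All figures are illustrative assumptions.
fixed_fee = 1_000_000      # traditional fixed-price engagement
base_fee = 600_000         # outcome-based: paid regardless of result
success_bonus = 800_000    # paid only if the agreed outcome is met
p_success = 0.6            # customer's estimate of the odds of success

expected_outcome_fee = base_fee + p_success * success_bonus  # 1,080,000
print(f"Fixed price:                 ${fixed_fee:,}")
print(f"Expected outcome-based cost: ${expected_outcome_fee:,.0f}")
# The customer pays less when the outcome fails ($600K vs. $1M) but shares
# more upside when it succeeds ($1.4M vs. $1M); weighing that variability,
# many negotiations historically fell back to the fixed price.
```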

What’s interesting is that based on recent research, this customer hesitance seems to be abating. In our 2H17 Digital Transformation Customer Research, we asked 165 global enterprises that are undertaking digital transformation initiatives to identify the pricing structures they’ve experienced, and outcome-based pricing emerged as the most common model globally.

As my colleague Jen Hamel points out in the report, “This indicates vendors have become more flexible and creative with pricing to convince clients to take the DT [digital transformation] leap but may see delayed ROI from DT skill investments as revenue depends on project success.”

As digital transformation continues to take root, the question of how vendors can shift to outcome-based pricing will only be asked more frequently, particularly as changes in the timing of revenue recognition from engagements impact vendors’ flexibility around resource investments. We are eager to watch (and to report) as best practices develop and new models emerge and would love to hear about what others think on this topic.

Drop a comment here or email me at [email protected].

 

Pricing research is not always about price

I recently read an article summarizing an onstage interview with Amazon CEO Jeff Bezos at the George W. Bush Presidential Center. During the interview, Bezos described Amazon’s data mindset:

“We have tons of metrics. When you are shipping billions of packages a year, you need good data and metrics: Are you delivering on time? Delivering on time to every city? To apartment complexes? … [Data around] whether the packages have too much air in them, wasteful packaging … The thing I have noticed is when the anecdotes and the data disagree, the anecdotes are usually right. There’s something wrong with the way you are measuring it.”

This is critical insight for market research practitioners, including those (like myself) focused on pricing. As analysts, we tend to dive deep into the facts and seek hard evidence. We rely on the data to tell the story and articulate the outcomes. Bezos isn’t saying that we should totally discount data. What he’s saying is that data has value when it is contextualized and re-examined against the actual customer experience.

Pricing is an inherently data-driven exercise. IT and telecom vendors lean on transactional systems, price lists, research firm pricing databases, and other data-centric tools to make pricing decisions and determine appropriate price thresholds. Most of the pricing projects that we do on behalf of our clients start with the question, “Are we competitively priced versus our peers?” That is usually the most basic component of the results that we deliver.

What we’ve found over the years doing this work is that pricing in the ICT space is often more art than science, and that customer anecdotes about pricing are often as valuable and instructive to pricing strategy as the market pricing datasets we produce. Our approach to pricing research is rooted in interviews with representatives of the vendor and enterprise customer communities. In conducting these interviews, we’ll often uncover that the root issues with pricing, which were thought to be associated with the price itself, are really broader issues — something related to value articulation, market segmentation, packaging or delivery efficiency. These aspects influence the customer experience, create pain points, and ultimately dictate willingness to pay and value capture.

When we deliver these results to our pricing research clients, the outcome is often not just a list or street pricing change but a rethinking of a broader pricing, go-to-market or customer engagement strategy. Clients will use customer anecdotes to rethink how they message a product in their marketing campaigns and content, devise a new approach to customer segmentation, or take a hard look at the delivery cost structure and resource pyramid levels that are driving their price position. In designing pricing research initiatives, we encourage our clients to think more broadly about pricing and incorporate multiple organizational stakeholders into the process, as this can uncover truly unforeseen drivers of price position.

How does this compare to your organization’s approach to pricing? How important are customer anecdotes to your pricing processes? Drop a comment here or email me at [email protected].

 

Is the IT hardware market ready for Hardware as a Service?

Hardware as a Service — or maybe you call it PCaaS, DaaS or XaaS — basically refers to bundling some type of hardware (e.g., phones, PCs, servers) with life cycle services and charging a recurring fee over a multiyear contract. The customer never really owns the hardware, and the vendor takes it back at the end of the agreement.

Sure, it’s not a new concept. But the solution hasn’t exactly taken off like a rocket ship, either. So, is it going to? Maybe. Its initial speed may be more like a Vespa than a SpaceX Falcon, but there are a few things working in its favor.

Why do buyers want it?

  • Retiring hardware is a huge pain. I have talked to IT leaders who have literally acquired warehouse space solely to store old hardware they have no idea what to do with.
  • Making it easier to stay up to date with tech. Management can no longer deny the negative impact on morale brought by an unattractive, slow and/or unreliable device.
  • Automation & Internet of Things (IoT) usher in new capabilities. Who doesn’t want to make managing hardware easier? Hardware as a Service is basically IoT for your IT department. Device management features like tracking device location and health are key functions of many IoT deployments and are a core selling point of Hardware as a Service offerings (see the sketch after this list).
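As a small illustration of that “IoT for your IT department” idea, below is a sketch of the kind of periodic health-and-location check-in a managed device might send; the payload fields are hypothetical, not any vendor’s actual agent schema.

```python
# Minimal sketch of a managed-device check-in payload; the field names
# are hypothetical, not any vendor's actual agent schema.
import json
from datetime import datetime, timezone

def build_checkin(device_id, battery_health, disk_free_gb, lat, lon):
    """Serialize a periodic health/location report for a fleet dashboard."""
    return json.dumps({
        "device_id": device_id,
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "battery_health": battery_health,      # 0.0-1.0; flags aging batteries
        "disk_free_gb": disk_free_gb,          # early warning before failures
        "location": {"lat": lat, "lon": lon},  # track lost or stolen assets
    })

print(build_checkin("PC-0042", 0.87, 112.5, 42.36, -71.06))
```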

Why do vendors want to sell it?

  • Business models are changing. That darn cloud computing had to come along and change expense models, not to mention make it easier to switch between vendors. From Spotify and Netflix to Amazon Web Services and Salesforce, “as a Service” is second nature to IT buyers in both their personal and professional lives.
  • Creating stickiness. Hardware is more often perceived as “dumb” with the software providing the real value. If you’re a hardware maker (or a VAR), you need to make the buyer see your relationship as one that’s valuable and service-oriented versus transactional.
  • Vendors desire simplicity. Most vendors will tell you they have been building similar enterprise service agreements on a one-off basis for years. These new programs will hopefully create swim lanes to make it faster and easier for partners to build solutions.

Buyers are used to monthly SaaS pricing models, but that’s not really what creates the main appeal for Hardware as a Service. Buyers really want the value-added services and fewer managerial headaches.

So, how’s it going?

As someone who manages several research streams, I get to peek at results from a lot of different studies. Here are a few snippets of things I’ve heard and seen in the last month or so.

  • Personal devices: It certainly seems like there’s the most buzz around PCs, with Dell, HP Inc. and Lenovo all promoting DaaS offerings. I have also heard from enterprises doing initial DaaS pilots with as many as 5,000 PCs, but we seem to still be in very early stages of adoption. Both PC vendors and their channel partners are beginning to report “legit” pipeline opportunities tied to DaaS.
  • Servers: Either outright purchasing or leasing servers is still the overwhelming choice of purchase method for about 90% of IT buyers recently surveyed by TBR. The perception that an “as a Service” model will be more expensive in the long run is the main customer concern to date, one vendors will need to address by emphasizing the value-added life cycle services (a stylized cost comparison follows this list).
  • Hyperconverged infrastructure (HCI): A bundle of hardware and a services bundle? This is the bundle of bundles! Not too many HCI vendors are openly promoting an “as a Service” pricing model at this point, but 80% of current HCI buyers in TBR’s most recent Hyperconverged Platforms Customer Research indicated they are interested in a consumption-based purchasing model, particularly to enhance their scalability. About 84% of those surveyed are using HCI for a private or hybrid cloud buildout, so maybe a more cloud-like pricing model makes sense. Make no mistake, interest is not synonymous with intent, but it’s safe to say these buyers are at least paying attention to their purchasing options.
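To see where the long-run cost perception comes from, here is a stylized three-year comparison of buying and managing a PC versus a bundled DaaS fee; every number is an illustrative assumption, not TBR survey data.

```python
# Stylized 3-year TCO comparison behind the "more expensive in the long
# run" concern. Every figure is an illustrative assumption.
purchase_price = 1200      # upfront PC purchase
support_per_year = 150     # separately contracted support/management
disposal_cost = 75         # decommissioning at end of life
monthly_fee = 45           # DaaS fee bundling device + life cycle services
years = 3

buy_tco = purchase_price + support_per_year * years + disposal_cost
daas_tco = monthly_fee * 12 * years

print(f"Buy and manage: ${buy_tco:,}")   # $1,725
print(f"DaaS:           ${daas_tco:,}")  # $1,620
# Whether DaaS wins depends entirely on how much of the life cycle burden
# (support, tracking, retirement) the recurring fee actually absorbs.
```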

My general verdict is that things are still moving at Vespa speed. PCs have a head start over data center hardware based on the concerted go-to-market efforts of the big three OEMs and a consumption model that more closely aligns with the consumer services we’re used to. The second half of this year will be an interesting proving ground to see if the reported pipeline growth is converted to actual customers. Depending on how that goes, maybe we’ll see the data center guys making more serious moves in this space.

What do you think? Add a comment or drop me an email at [email protected].

 

Key findings from TBR’s upcoming HCI customer research

Hyperconverged infrastructure (HCI) is a growing market ripe with opportunity for vendors. TBR forecasts the market will reach $11.7 billion by 2022. Although TBR research indicates that incumbent vendors with a strong presence in the converged infrastructure (CI) market, such as Dell EMC and Cisco, have an advantage in the space, findings also indicate that a growing number of smaller vendors are rising in popularity. Add to that the approximately one-quarter of existing customers who indicated that brand is not a key factor in their decision making, and it becomes clear that the opportunity to take share from existing vendors is high. Further, with nearly three-quarters of respondents indicating they have not yet taken the plunge into the HCI space, there is massive opportunity, through strategic marketing and support, for vendors to encourage new adopters to become their customers.

HCI has a significant place in the cloud market

Eighty-four percent of respondents indicated they are leveraging HCI for either hybrid or private cloud installations. TBR believes this suggests that cloud is not necessarily an inhibitor to HCI adoption, as some vendors may perceive. Further, we believe this signals that consumption-based pricing options, which 81% of respondents indicated they would be interested in considering in the future, will encourage more HCI adoption. Consumption-based pricing enables customers to weigh HCI, otherwise a capex solution, against public cloud on equal footing, simply comparing performance and other features between the two to make purchasing decisions. Vendors can capitalize on this flexibility with strategic marketing.

IT leaders play a crucial role in the HCI decision-making process

HCI remains a strategic purchase, as evidenced by the fact that 74% of respondents indicated IT directors and managers were among the decision makers. TBR believes that as customers become more familiar with HCI and their HCI vendor, they will be more likely to make repeat purchases and will be less likely to demand direct-from-vendor sales.

To learn more about TBR’s Hyperconverged Platforms Customer Research, contact Stanley Stevens ([email protected]) or your account executive.

 

Democratization now: It’s good for business

Data democratization is a hot topic now. Spokespeople from Google, SAP, Capgemini and other tech companies have spoken and written about how making data available to as many people as possible will both unleash the power of technology and prevent abuse of closely held data. Microsoft CEO Satya Nadella sprinkles his talks and interviews with references to democratization. TBR agrees that data access is critical to achieving artificial intelligence’s (AI) considerable potential, but access is not enough. For AI to do what it can do, business people with domain knowledge, who are regular users, must be able to work directly with data, without intervening layers of developers and data scientists.

Data access is a conundrum. Ensuring appropriate privacy and security while making data available to as many people as possible is a challenge, one that is inhibiting the growth of widespread AI-driven data analysis. This post will not address that challenge, however. It focuses on one of the other growth inhibitors: the current need for data experts (scientists, engineers and “data janitors”), as well as developers, to extract the value from data.

Business users might see the hierarchy of data experts as a priesthood or a bureaucracy, standing between them and the data, but that is not really what is happening. Currently, there are no tools with which business users can conduct their own analyses, at least not without a lot of preparation by the data experts. Better tools are coming; there are R&D efforts worldwide to make data more directly accessible, which is part of what Nadella and other spokespeople are talking about.

Even before these democratic tools arrive, there is strong growth in AI and the utilization of data analytics, because the value is there. But the need for experts greatly increases the cost of analysis, so only analyses with the highest potential value are performed. As more democratic tools become available, many more analytic projects will become worthwhile and the use of analytics will grow much faster.

The impact of democratized analytics tools will be huge because the costs associated with the data expert hierarchy are great. Those costs go beyond just personnel. Communication between business users and data experts is time-consuming and expensive, and it lowers the quality and value of the analyses. Business users and data experts live in different worlds and have different vocabularies; misunderstandings are common. Misunderstandings are expensive, but what is worse, working through intermediaries slows the iterative process by orders of magnitude. The potential value of data lies in insights, and finding insight is an iterative process.

The history of business technology is one of progress propelled by increasing democratization of tools. The PC itself is a prime example, making computing directly available to business users. The internet rode a wave of disintermediation and self-service to its current global saturation. In terms of democratization of AI analytics, the best parallel is the PC spreadsheet, which made it possible for business people to create and tune their own quantitative models of business activities. Before the spreadsheet, creating those models required coding.

“Spreadsheets for AI,” one of which may well be a version of Microsoft’s Excel, will accelerate growth in analytics, data access, storage, cloud services and the Internet of Things. AI spreadsheets will not retard growth in the use of data experts; they serve a different market. Even with very good first versions, broad adoption will take years, so the acceleration of growth will not be sudden. Over the years, however, the ability of business users to directly analyze their data will contribute substantially to the revenue of IT vendors and to that of their customers.