Pricing research is not always about price

I recently read an article summarizing an onstage interview with Amazon CEO Jeff Bezos at the George W. Bush Presidential Center. During the interview, Bezos described Amazon’s data mindset:

“We have tons of metrics. When you are shipping billions of packages a year, you need good data and metrics: Are you delivering on time? Delivering on time to every city? To apartment complexes? … [Data around] whether the packages have too much air in them, wasteful packaging … The thing I have noticed is when the anecdotes and the data disagree, the anecdotes are usually right. There’s something wrong with the way you are measuring it.”

This is a critical insight for market research practitioners, including those (like myself) focused on pricing. As analysts, we tend to dive deep into the facts and seek hard evidence. We rely on the data to tell the story and articulate the outcomes. Bezos isn’t saying that we should discount data entirely. He’s saying that data has value when it is contextualized and re-examined against the actual customer experience.

Pricing is an inherently data-driven exercise. IT and telecom vendors lean on transactional systems, price lists, research firm pricing databases, and other data-centric tools to make pricing decisions and determine appropriate price thresholds. Most of the pricing projects that we do on behalf of our clients start with the question, “Are we competitively priced versus our peers?” That is usually the most basic component of the results that we deliver.

What we’ve found over the years doing this work is that pricing in the ICT space is often more art than science, and that customer anecdotes about pricing are often as valuable and instructive to pricing strategy as the market pricing datasets we produce. Our approach to pricing research is rooted in interviews with representatives of the vendor and enterprise customer communities. In conducting these interviews, we often uncover that the root issues with pricing, thought to be associated with the price itself, are actually broader: something related to value articulation, market segmentation, packaging or delivery efficiency. These aspects influence the customer experience, create pain points, and ultimately dictate willingness to pay and value capture.

When we deliver these results to our pricing research clients, the outcome is often not just a list or street pricing change but a rethinking of a broader pricing, go-to-market or customer engagement strategy. Clients will use customer anecdotes to rethink how they message a product in their marketing campaigns and content, devise a new approach to customer segmentation, or take a hard look at the delivery cost structure and resource pyramid levels that drive their price position. In designing pricing research initiatives, we encourage our clients to think more broadly about pricing and incorporate multiple organizational stakeholders into the process, as this can uncover previously unforeseen drivers of price position.

How does this compare to your organization’s approach to pricing? How important are customer anecdotes to your pricing processes? Drop a comment here or email me at [email protected].

 

Is the IT hardware market ready for Hardware as a Service?

Hardware as a Service — or maybe you call it PCaaS, DaaS or XaaS — basically refers to bundling some type of hardware (e.g., phones, PCs, servers) with life cycle services and charging a recurring fee over a multiyear contract. The customer never really owns the hardware, and the vendor takes it back at the end of the agreement.
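To make the consumption model concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is a hypothetical assumption for illustration, not TBR data or any vendor’s actual pricing; the point is simply that the buyer trades an upfront capital purchase plus in-house life cycle costs for a single recurring fee, with the hardware returning to the vendor at the end of the term.

```python
# Hypothetical illustration only: compares the three-year cost of buying PCs outright
# against a Device as a Service (DaaS) subscription. All figures are invented
# assumptions, not TBR data or any vendor's actual pricing.

DEVICES = 1000                   # fleet size (assumed)
TERM_MONTHS = 36                 # typical multiyear contract length

# Outright purchase model (assumed figures)
purchase_price = 1200            # per device, paid upfront
support_per_device_year = 150    # in-house deployment, management and disposal
capex_total = DEVICES * purchase_price
opex_total = DEVICES * support_per_device_year * (TERM_MONTHS / 12)
buy_total = capex_total + opex_total

# DaaS model (assumed figure): hardware plus life cycle services in one recurring fee
daas_fee_per_device_month = 55
daas_total = DEVICES * daas_fee_per_device_month * TERM_MONTHS

print(f"Buy outright: ${buy_total:,.0f} (${capex_total:,} capex + ${opex_total:,.0f} opex)")
print(f"DaaS:         ${daas_total:,} (all opex; vendor reclaims the hardware at end of term)")
```

Run with different assumptions, the comparison can tip either way, which is exactly why the value-added services, not the raw monthly math, tend to carry the sales conversation.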

Sure, it’s not a new concept. But the solution hasn’t exactly taken off like a rocket ship, either. So, is it going to? Maybe. Its initial speed may be more like a Vespa than a SpaceX Falcon, but there are a few things working in its favor.

Why do buyers want it?

  • Retiring hardware is a huge pain. I have talked to IT leaders who have literally acquired warehouse space solely to store old hardware they have no idea what to do with.
  • Staying up to date with tech gets easier. Management can no longer deny the negative impact on morale of an unattractive, slow and/or unreliable device.
  • Automation & Internet of Things (IoT) usher in new capabilities. Who doesn’t want to make managing hardware easier? Hardware as a Service is basically IoT for your IT department. Device management features like tracking device location and health are key functions of many IoT deployments and a core selling point of Hardware as a Service offerings.

Why do vendors want to sell it?

  • Business models are changing. That darn cloud computing had to come along and change expense models, not to mention make it easier to switch between vendors. From Spotify and Netflix to Amazon Web Services and Salesforce, “as a Service” is second nature to IT buyers in both their personal and professional lives.
  • Creating stickiness. Hardware is increasingly perceived as “dumb,” with the software providing the real value. If you’re a hardware maker (or a VAR), you need the buyer to see your relationship as valuable and service-oriented rather than transactional.
  • Vendors desire simplicity. Most vendors will tell you they have been building similar enterprise service agreements on a one-off basis for years. These new programs will hopefully create swim lanes to make it faster and easier for partners to build solutions.

Buyers are used to monthly SaaS pricing models, but that’s not really what creates the main appeal for Hardware as a Service. Buyers really want the value-added services and fewer managerial headaches.

So, how’s it going?

As someone who manages several research streams, I get to peek at results from a lot of different studies. Here are a few snippets of things I’ve heard and seen in the last month or so.

  • Personal devices: It certainly seems like there’s the most buzz around PCs, with Dell, HP Inc. and Lenovo all promoting DaaS offerings. I have also heard from enterprises doing initial DaaS pilots with as many as 5,000 PCs, but we still seem to be in the very early stages of adoption. Both PC vendors and their channel partners are beginning to report “legit” pipeline opportunities tied to DaaS.
  • Servers: Outright purchase or leasing remains the overwhelming acquisition method for about 90% of IT buyers recently surveyed by TBR. The perception that an “as a Service” model will be more expensive in the long run is the main customer concern to date, one that vendors will need to address by emphasizing the value-added life cycle services.
  • Hyperconverged infrastructure (HCI): A bundle of hardware and a services bundle? This is the bundle of bundles! Not too many HCI vendors are openly promoting an “as a Service” pricing model at this point, but 80% of current HCI buyers in TBR’s most recent Hyperconverged Platforms Customer Research indicated they are interested in a consumption-based purchasing model, particularly to enhance their scalability. About 84% of those surveyed are using HCI for a private or hybrid cloud buildout, so maybe a more cloud-like pricing model makes sense. Make no mistake, interest is not synonymous with intent, but it’s safe to say these buyers are at least paying attention to their purchasing options.

My general verdict is that things are still moving at Vespa speed. PCs have a head start over data center hardware based on the concerted go-to-market efforts of the big three OEMs and a consumption model that more closely aligns with the consumer services we’re used to. The second half of this year will be an interesting proving ground to see if the reported pipeline growth is converted to actual customers. Depending on how that goes, maybe we’ll see the data center guys making more serious moves in this space.

What do you think? Add a comment or drop me an email at [email protected].

 

Key findings from TBR’s upcoming HCI customer research

Hyperconverged infrastructure (HCI) is a growing market ripe with opportunity for vendors. TBR forecasts the market will reach $11.7 billion by 2022. Although TBR research indicates that incumbent vendors with a strong presence in the converged infrastructure (CI) market, such as Dell EMC and Cisco, have an advantage in the space, findings also indicate that a growing number of smaller vendors are rising in popularity. Add to that the approximately one-quarter of existing customers who indicated that brand is not a key factor in their decision making, and it becomes clear that the opportunity to take share from existing vendors is high. Further, with nearly three-quarters of respondents indicating they have not yet taken the plunge into the HCI space, there is massive opportunity, through strategic marketing and support, for vendors to win these new adopters as customers.

HCI has a significant place in the cloud market

Eighty-four percent of respondents indicated they are leveraging HCI for either hybrid or private cloud installations. TBR believes this suggests that cloud is not necessarily an inhibitor to HCI adoption, as some vendors may perceive. Further, we believe this signals that consumption-based pricing options, which 81% of respondents indicated they would be interested in considering in the future, will encourage more HCI adoption. Consumption-based pricing gives customers the option to acquire HCI as a traditional capex solution or to consume it much as they would public cloud, and to simply compare performance and other features between the two when making purchasing decisions. Vendors can capitalize on this flexibility with strategic marketing.

IT leaders play a crucial role in the HCI decision-making process

HCI remains a strategic purchase, as evidenced by the fact that 74% of respondents indicated IT directors and managers were among the decision makers. TBR believes that as customers become more familiar with HCI and their HCI vendor, they will be more likely to make repeat purchases and will be less likely to demand direct-from-vendor sales.

To learn more about TBR’s Hyperconverged Platforms Customer Research, contact Stanley Stevens ([email protected]) or your account executive.

 

Democratization now: It’s good for business

Data democratization is a hot topic now. Spokespeople from Google, SAP, Capgemini and other tech companies have spoken and written about how making data available to as many people as possible will both unleash the power of technology and prevent abuse of closely held data. Microsoft CEO Satya Nadella sprinkles his talks and interviews with references to democratization. TBR agrees that data access is critical to achieving artificial intelligence’s (AI) considerable potential, but access is not enough. For AI to deliver on that potential, business people with domain knowledge, the regular users, must be able to work directly with data, without intervening layers of developers and data scientists.

Data access is a conundrum. Ensuring appropriate privacy and security while making data available to as many people as possible is a challenge, one that is inhibiting the growth of widespread AI-driven data analysis. This post will not address that challenge, however. It focuses on another growth inhibitor: the current need for data experts (scientists, engineers and so-called data janitors), as well as developers, to extract the value from data.

Business users might see the hierarchy of data experts as a priesthood or a bureaucracy, standing between them and the data, but that is not really what is happening. Currently, there are no tools with which business users can conduct their own analyses, at least not without a lot of preparation by the data experts. Better tools are coming; there are R&D efforts worldwide to make data more directly accessible, which is part of what Nadella and other spokespeople are talking about.

Even before these democratic tools arrive, there is strong growth in AI and the utilization of data analytics, because the value is there. But the need for experts greatly increases the cost of analysis, so only analyses with the highest potential value are performed. As more democratic tools become available, many more analytic projects will become worthwhile and the use of analytics will grow much faster.

The impact of democratized analytics tools will be huge because the costs associated with the data expert hierarchy are great. Those costs go beyond just personnel. Communication between business users and data experts is time-consuming and expensive, and it lowers the quality and value of the analyses. Business users and data experts live in different worlds and have different vocabularies; misunderstandings are common. Misunderstandings are expensive, but what is worse, working through intermediaries slows the iterative process by orders of magnitude. The potential value of data lies in insights, and finding insight is an iterative process.

The history of business technology is a story of progress propelled by the increasing democratization of tools. The PC itself is a prime example, making computing directly available to business users. The internet rode a wave of disintermediation and self-service to its current global saturation. In terms of the democratization of AI analytics, the best parallel is the PC spreadsheet, which made it possible for business people to create and tune their own quantitative models of business activities. Before the spreadsheet, creating those models required coding.

“Spreadsheets for AI,” one of which may well be a version of Microsoft’s Excel, will accelerate growth in analytics, data access, storage, cloud services and the Internet of Things. AI spreadsheets will not retard growth in the use of data experts; they serve a different market. Even with very good first versions, broad adoption will take years, so the acceleration of growth will not be sudden. Over the years, however, the ability of business users to directly analyze their data will contribute substantially to the revenue of IT vendors and to that of their customers.

 

Unchecked cloud IoT costs can quickly spiral upward

The amount of data you transmit and store and analyze is potentially infinite. You can measure things however often you want. And if you measure it often, the amount of data grows without bounds. — Ezra Gottheil, Principal Analyst

Full Article
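Gottheil’s point lends itself to a little arithmetic. The sketch below uses entirely hypothetical fleet sizes, payload sizes and storage rates (not any cloud provider’s actual prices) to show how monthly data volume, and with it ingest and storage cost, scales linearly with how often you measure.

```python
# Hypothetical illustration: how cloud IoT data volume grows with sampling frequency.
# Fleet size, payload size and the storage rate are assumptions, not real provider pricing.

DEVICES = 10_000              # sensors in the fleet (assumed)
PAYLOAD_BYTES = 200           # bytes per reading (assumed)
STORAGE_PRICE_PER_GB = 0.03   # $/GB-month (assumed, illustration only)
SECONDS_PER_MONTH = 30 * 24 * 3600

for interval_s in (3600, 60, 1):   # hourly, per-minute and per-second sampling
    readings = DEVICES * SECONDS_PER_MONTH / interval_s
    gb_per_month = readings * PAYLOAD_BYTES / 1e9
    print(f"sample every {interval_s:>4}s: {gb_per_month:10,.1f} GB/month "
          f"(~${gb_per_month * STORAGE_PRICE_PER_GB:,.2f}/month just to store it)")
```

That is a single month for one modest fleet; keep the data around, add devices or richer payloads, and the curve keeps climbing, which is why the growth is effectively unbounded.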

Why partners are absolutely vital to Lenovo

If you take a look at industry benchmarks, such as those recently published by Technology Business Research, you see Lenovo outscoring its peers in customer satisfaction[1] in almost every attribute, giving Lenovo the highest overall score. And, for the tenth straight year, IBM and Lenovo servers again achieved top rankings[2] in ITIC’s 2017-2018 Global Server Hardware and Server OS Reliability survey. — Daniel Callahan, Analyst

Full Article

Critical success factors for pricing research

In my day-to-day life at TBR, I regularly interact with clients seeking to undertake pricing research. Their needs are varied. Some want to understand pricing for a new product or service or sustain their competitive position for an existing offering, while others seek to design an overarching commercial strategy or to increase the effectiveness of their sales teams by arming them with tactical data and insights — and nearly all are focused on influencing revenue and margin.

Capturing pricing data that can be utilized defensibly for decision making is challenging. All pricing is situational and can be influenced by any number of factors. Pricing decisions influence, and are influenced by, nearly all organizational departments, from sales and finance to product management and business strategy, and thus are often highly politicized within ICT enterprises.

Based on our regular experience in serving the pricing research and consulting needs of our client base across ICT industry segments, we have identified five critical success factors that can help clients navigate these challenges:

  • Start with outcomes: We often find that our customers come to us with a research concept in mind, but not a defined goal or set of operational plans for how the research will be deployed in their organization. Sometimes the request is: “We need to know what vendors like us are charging.” But the real goal of the team may be to answer the question: How can we be more efficient in resourcing deals? By starting with the end goal and use case in mind, we find that we often explore areas adjacent to pricing, and that insights on those topics, in concert with pricing data, unlock business value for our client base.
  • Focus on business impact: For all the research we do at TBR, including in pricing, we advise our clients to frame all research needs around the underlying business impact. We design projects, including the questions we propose to cover in primary research and the data that we seek to capture, to ensure that the recommendations we deliver around pricing aim directly at influencing business strategy, revenue and profitability.
  • Focus on context: Our pricing research methodology relies on interviews with vendors and customers. This approach allows us to capture not only pricing data but also contextual data and insights on topics such as discounting, commercial incentives, pricing structures and portfolios. When paired with core quantitative pricing data, these interview-driven insights provide predictive value around vendor and customer pricing and consumption behavior.
  • Build market constructs: To normalize against deal-specific influencers that can distort a true view of market pricing, we design our research around standardized deal constructs. These constructs are used in all interviews to ensure apples-to-apples comparisons and that we characterize the full spectrum of potential price points. Context on topics such as discounting is addressed through qualitative conversations.
  • Consider adjacent markets: Many times, particularly with clients seeking to stand up pricing models and price levels for new offerings, we find that the direct peer landscape may not be the best basis of comparison. By looking at how similar-yet-adjacent offerings are packaged and delivered, clients gain a fuller view of their peer landscape in its entirety, and they often identify areas to elevate their value proposition and raise prices accordingly.

 

SAIC sees more market stability & another CR in September

To remain competitive in those (SETA) areas may require engaging in M&A to add scale; alternatively, moving up the value chain means investing more in applications development and higher-skilled talent. SAIC has options, and its next choices will determine its fate in a rapidly changing industry. — Joey Cresta, Analyst

Full Article

JEDI’s disruption may go beyond the cloud

That shift is “elevating consensus-building into a prerequisite for embarking on disruptive technology adoption” for desired government outcomes. — Joey Cresta, Analyst

Full Article

Competition will intensify in the U.S. telecom market heading into 2020 due to the launch of 5G services and the potential T-Mobile/Sprint merger

HAMPTON, N.H. (June 8, 2018) — Wireless revenue rose 3.1% year-to-year to $58.4 billion among U.S. carriers covered in Technology Business Research Inc.’s (TBR) 1Q18 U.S. & Canada Mobile Operator Benchmark as higher equipment revenue spurred by the adoption of premium devices offset continuing service revenue declines. Increased adoption of premium devices is benefiting equipment revenue as devices such as the iPhone X have pushed the acceptable average selling price for smartphones to over $1,000. Verizon, AT&T and Sprint expect service revenue declines will gradually moderate in 2018 as the bulk of customers are now on unsubsidized service plans. However, service revenue will be negatively impacted by the new ASC 606 industry accounting standard as well as lower overage revenue stemming from the growing adoption of unlimited data plans.

The report examines how the recently announced T-Mobile and Sprint merger would disrupt the wireless and cable industries should it gain regulatory approval. “The proposed T-Mobile merger would serve as a lifeboat for Sprint, alleviating the company’s long-term financial challenges, including its high debt load and struggles to generate positive net income,” said TBR Telecom Analyst Steve Vachon. “The scale gained from Sprint’s operations would also enable T-Mobile to compete more aggressively in the 5G era and strengthen its margins, which historically have trailed those of Verizon and AT&T.”

The proposed T-Mobile and Sprint merger would also disrupt the cable and pay-TV industries, as the combined company would have a base of over 125 million wireless subscribers to which it could cross-sell T-Mobile’s upcoming Layer3 TV video platform. The combined company’s 5G services may also serve as a replacement for traditional wireline broadband connectivity, as the combination of T-Mobile’s and Sprint’s spectrum would yield an estimated national average speed of 450Mbps.

Combined wireless revenue among Tier 1 Canadian carriers rose 8.8% year-to-year to $6 billion due to continued postpaid additions spurred by shared data programs and higher data usage arising from the accelerated speeds offered by LTE-Advanced services. Despite steady increases in smartphone penetration in Canada, the postpaid market continues to flourish as Rogers and Bell Mobility reported their highest first quarter postpaid net additions in 1Q18 since 2009 and 2011, respectively. Subscriber growth will remain strong throughout 2018 as prepaid customers transition to postpaid plans with higher average revenue per user (ARPU) and as connected device adoption increases in Canada. Carriers will also have the opportunity to target first-time wireless customers in Canada, including youths and young adults as well as immigrants entering the country.

The U.S. & Canada Mobile Operator Benchmark details and compares the activities of the largest U.S. and Canadian operators, including financial performance, go-to-market initiatives and resource management strategies. Covered companies include AT&T, Verizon, Sprint, T-Mobile, U.S. Cellular, Rogers, Telus and Bell Mobility.