Telecom vendors anticipate revenue from 5G in select countries as early as 2H18, but lower China capex drove market decline in 1Q18

HAMPTON, N.H. (June 29, 2018) — According to Technology Business Research, Inc.’s (TBR) 1Q18 Telecom Vendor Benchmark, the conclusion of LTE coverage projects in China hampered revenue for the largest vendors. A reduction in demand from telecom operators for routing and switching products also caused revenue to decline for Cisco, Juniper and Nokia. In this market downturn, vendors are employing various strategies to maintain margins and mitigate revenue declines while eyeing initial commercial 5G rollouts, which are set to begin in the U.S. in 2H18.

“Suppliers are trying to sell into the IT environments of operators, engaging with the webscale customer segment, and expanding software portfolios to partially offset falling telecom operator capex,” said TBR Telecom Senior Analyst Michael Soper. “The shift in the revenue mix toward software is also helping vendors maintain operating margins in spite of lower hardware volume. Vendors are also growing their use of automation and artificial intelligence in service delivery to improve profitability by reducing their reliance on human resources.”

Western-based vendors are preparing their portfolios to build out 5G for U.S.-based operators in 2H18. Several operators have aggressive 5G rollout timetables and intend to leverage the technology for fixed wireless broadband and/or to support their mobile broadband densification initiatives. Vendors that have high exposure to the U.S. and are well aligned with market trends such as 5G, media convergence and digital transformation will likely increase market share over the next two years as operators in the region are expected to aggressively invest in these areas starting in 2H18.

The Telecom Vendor Benchmark details and compares the initiatives and tracks the revenue and performance of the largest telecom vendors in segments including infrastructure, services and applications and in geographies including the Americas, EMEA and APAC. The report includes information on market leaders, vendor positioning, vendor market share, key deals, acquisitions, alliances, go-to-market strategies and personnel developments.

For additional information about this research or to arrange a one-on-one analyst briefing, please contact Dan Demers at +1 603.929.1166 or [email protected].

 

ABOUT TBR

Technology Business Research, Inc. is a leading independent technology market research and consulting firm specializing in the business and financial analyses of hardware, software, professional services, and telecom vendors and operators. Serving a global clientele, TBR provides timely and actionable market research and business intelligence in a format that is uniquely tailored to clients’ needs. Our analysts are available to further address client-specific issues or information needs on an inquiry or proprietary consulting basis.

TBR has been empowering corporate decision makers since 1996. For more information please visit www.tbri.com.

LiveWorx 2018: Better together with others

In just over three years, the Internet of Things (IoT) has rapidly evolved on a path to maturity. But it still has miles to go. When TBR started closely watching IoT, it was a nebulous set of technologies that promised to radically change a customer’s business. IoT was surrounded by tremendous hype, and just about every company claimed it had the winning solution but lacked substantiated proof. While much of the ambiguous messaging and ill-defined solutions remain, leading to customer uncertainty and hesitancy to adopt, the thought leaders among the IoT vendor community are starting to arrive at a new understanding. Shane O’Callaghan of TSM Control Systems, a PTC (Nasdaq: PTC) customer, explained it in the most straightforward manner: “IoT is not a technology project, it’s a business project.” TBR believes leading vendors are molding their IoT go-to-market strategies around solving tactical business problems with solutions proved by case studies and that those vendors are gaining traction because of it.

Many of the murmurs from industry watchers at LiveWorx 2018 suggested vendors are disappointed by IoT. The “technology” (a misnomer, TBR believes, as we consider IoT a technique for solving business problems using a combination of technology components and services rather than a technology in and of itself) has failed to meet forecasts, most likely made by analysts and line-of-business (LOB) managers who fell victim to the hype. PTC CEO Jim Heppelmann told analysts that he agrees that most vendors were victims of the hype cycle, leading to an oversaturated market, and that as a result, many will wash out or retreat slowly if they haven’t already (much like GE is now downsizing its IoT aspirations). He assured us that PTC won’t retreat, however, with that promise centered on PTC’s robust toolkit, use case-oriented go-to-market strategy and partner-friendly stance.

Your digital transformation center needs leadership

Start with a new space, furnish it with funky chairs, nontraditional work spaces and all the latest technologies. Recruit creative talent, mixed with some data scientists and wonder-tech folks, plus seasoned strategists. Bring in current clients and consult on digital transformation.

As companies implement this playbook, a couple of common themes and challenges are emerging, mostly around client selection, talent management and technology partner cooperation. (Look for future blog posts on all three of these.) We had the pleasure of meeting many of the leaders at these new digital transformation centers (in Miami, Dublin, Frankfurt, Dallas, New York City and more), and I noticed common traits among the people charged with running these new places: passionate, invested, visionary. Some places took a kind of “buddy cop” approach, pairing a creative with an executioner (in a good way, for both). Some bolted long-standing capabilities onto an acquisition. The real kicker: these centers need nonstandard leaders, even as the larger firm — the board that just invested $20 million in a new space and new talent — wants to ensure the investment pays off and puts a trusted, almost always longtime company professional in charge. And that makes leadership more critical than ever.

The best we’ve met (a highly subjective and personal assessment) echoed lessons I learned during my brief time in the U.S. Army and my long exposure to U.S. military culture: train everyone, especially the leaders, and train them for their next job; promote them when they’re ready and support them with more training as their responsibilities evolve. One center leader described to me how her company invested in her management skills, ensuring she could handle the diverse set of backgrounds, skills, expectations, and corporate cultural mindsets she would be leading at the new center. Longtime professionals who grew up within a firm might be able to manage teams mixed with experienced and new hires. But leading such a team requires skills not typically gained from serving only in one organization or growing professionally mostly through similar roles.

As much as I’ve enjoyed digging deeper into the substance behind the hype of these centers — the funky chairs and bleeding-edge tech and clients taking journeys to digital transformations — we still want to understand the business case, the strategies and the metrics that determine whether these substantial investments of money and brand are beginning to pay off. From what we can see to date, success still relies on what it always has: leadership and teamwork. Companies recognizing this lead the pack right now, especially as that pack becomes crowded with cloud, network and legacy IT vendors all looking to play in the digital transformation space.

 

When the expected cost does not match the actual cost in cloud

In the relationship between customer and business, expectations are everything. In a lot of ways, the shift to cloud computing has leveled the playing field for what is expected in terms of cost, responsibilities, and the services exchanged between IT customers and providers. With cloud services, customers can experience far more of a service before buying it, see a clear unit price from the outset and understand the constraints of the service-level agreements. However, uncertainty still lingers in the exact specifications for many solutions, as the complexity of the design and the variability of actual utilization continue to make accurately predicting real-world cost for cloud solutions difficult for many customers.

While there is still typically a difference in cost between on-premises IT deployments and cloud, the unexpectedly higher cost of cloud projects impacts the market in several ways. Highly efficient and mature IT organizations can often use their own internal resources to compete with the price points delivered through public cloud options. For those customers, cost overruns make cloud deployments more expensive, rather than slightly more cost-effective, when compared with utilizing internal resources. Beyond those marginal customers, however, cost overruns universally wreak havoc on internal budgeting processes, which depend on predictable cost structures. Particularly compared with more stable internal IT funding, variability on a monthly basis puts serious stress on the finance function. Lastly, cost overruns impact other IT project decisions, serving as a deterrent to new and expanded cloud projects. In this respect, this unpredictability is bad for all cloud vendors, providing motivation for these providers collectively to further clarify the customer’s financial expectations.
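The budgeting strain described above comes down to simple arithmetic: usage-priced services turn a fixed budget line item into a moving target. A minimal sketch of the dynamic (all rates and usage figures below are invented for illustration, not actual cloud prices) shows how modest utilization drift compounds into month-over-month overruns against a fixed estimate:

```python
# Hypothetical illustration: how usage variability turns a "predictable"
# cloud estimate into month-to-month budget overruns.

def monthly_bill(gb_stored, gb_transferred, compute_hours,
                 storage_rate=0.023, transfer_rate=0.09, compute_rate=0.096):
    """Sum the usage-priced line items (all rates are made-up examples)."""
    return (gb_stored * storage_rate
            + gb_transferred * transfer_rate
            + compute_hours * compute_rate)

# The estimate the project was budgeted against ...
budgeted = monthly_bill(gb_stored=5_000, gb_transferred=1_000, compute_hours=2_000)

# ... and three months of actual utilization, drifting upward.
actuals = [
    monthly_bill(5_200, 1_400, 2_100),
    monthly_bill(5_600, 2_000, 2_300),
    monthly_bill(6_100, 2_600, 2_500),
]

for month, actual in enumerate(actuals, start=1):
    overrun = (actual - budgeted) / budgeted * 100
    print(f"Month {month}: ${actual:,.2f} ({overrun:+.1f}% vs. budget)")
```

Even small per-unit rates amplify quickly when several usage dimensions drift at once, which is exactly the variability that strains a finance function built around stable internal IT funding.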

For cloud infrastructure market leader Amazon Web Services (AWS), the problem seems particularly relevant, coming up in a number of discussions with customers. For a vendor that is far ahead of nearest competitor Microsoft, cost overruns are one of the biggest flaws in the current AWS customer experience. Customers’ struggles start with matching the vast catalog of services available from AWS to the actual IT solution needed. Many customers noted that their initial designs, once implemented, lacked the performance needed to meet their requirements. Whether it’s the tier of storage or the amount of data transfer, customers are forced to change the configuration to more expensive options or incur usage charges above their original expectations.

In speaking with decision makers about their experiences, it is clear that more assistance with both design and operational optimization is needed to close the gap between the initial expectations and actual costs of cloud implementations. These budget overages may be positive for AWS in the short term, contributing to some of its continued strong revenue growth, but in the long term they could be the most profound threat. If second-tier public cloud vendors such as IBM, Microsoft and Google can develop and deliver streamlined design and pricing calculators that address these unexpected cost overruns, they could very well carve out a market niche that adds real value for customers. AWS may not have as much incentive to close these expectation gaps, but for competing vendors any advantage is critical, and this could be one that makes a difference long term.

 

Note: This blog is the first in a series driven by TBR’s Cloud Customer Research reports, which focus on applications and infrastructure workloads. More than 50 interviews and 200 surveys are being conducted. Blog posts like this one will highlight the key trends and topics impacting the cloud industry.

Converging trends enable software and IT services firms to continue to grow despite lower operator capex

HAMPTON, N.H. (June 27, 2018) — According to Technology Business Research, Inc.’s (TBR) 1Q18 Telecom Infrastructure Services Benchmark, post-peak RAN spend drove down telecom infrastructure services (TIS) revenue in local currency terms at some of the largest vendors, including Ericsson, Huawei and Nokia. Meanwhile, software- and services-centric firms grew revenue as they capitalized on digitalization trends and/or provided outsourcing and consulting and systems integration (C&SI) to operators and webscales alike.

“Operators’ push toward ICT convergence and digitalization sets IT services firms up for continued growth,” said TBR Telecom Senior Analyst Michael Soper. “IT services firms are able to help telecom and webscale customers migrate to new technologies and implement new business models, both of which are in high demand as these companies pursue digital transformation.”

Webscale spend remains robust, but TIS work is confined to certain segments of vendors. Webscales seek out cutting-edge technology and typically contract with the OEM for product-attached services. Companies benefiting from this process include Ciena, which is providing optical equipment to webscales, and Nokia, which has many opportunities to sell into webscales with its end-to-end optical portfolio and premium IP routing and switching products. IT services firms, particularly Accenture, are providing back-office outsourcing and software SI.

Vendors with high exposure to the U.S. market and those that are well-aligned with market trends such as 5G, media convergence and digital transformation will likely increase their share of the TIS market over the next two years, as TBR expects operators in the region to aggressively invest in these areas starting in 2H18.

TBR’s Telecom Infrastructure Services Benchmark provides quarterly analysis of the deployment, maintenance, professional services and managed services markets for network and IT suppliers. Suppliers covered include Accenture, Amdocs, Atos, Capgemini, CGI, China Communications Services, Ciena, Cisco, CommScope, CSG International, Ericsson, Fujitsu, Hewlett Packard Enterprise, Huawei, IBM, Infosys, Juniper Networks, NEC, Nokia, Oracle, Samsung, SAP, Tata Consultancy Services, Tech Mahindra, Wipro and ZTE.

For additional information about this research or to arrange a one-on-one analyst briefing, please contact Dan Demers at +1 603.929.1166 or [email protected].

 


Will digital transformation be the catalyst for adoption of new outcome-based pricing models?

Every day I find myself reading about the developments happening in business-to-consumer (B2C) pricing.

Here’s a sample of those that jumped out recently:

These developments highlight the growing momentum behind providing dynamic, value-based and outcome-based pricing models, a movement being driven by companies’ desires to provide personalized customer experiences at scale.

While this push has been most publicized and noteworthy in the B2C world, driven by the likes of Uber, Netflix and MoviePass, it also consistently permeates the complex business-to-business (B2B) IT products and services world that we focus on. “How do we shift from a cost-plus to value-based pricing model? Are companies really doing outcome-based pricing? Who is doing it well, and for what types of customers? How?” These are common questions vendors are trying to sort through as they change their businesses.

Often, we’ve heard that IT vendors are serious about making outcome-based pricing models work but that customers are putting the brakes on these types of arrangements. Customers will ultimately balk at the variability and risk of an outcome-based arrangement at some stage of a deal negotiation and push vendors to offer predictable fixed-price engagements. Customers like the idea of not paying when an outcome is missed more than they like sharing the benefit when an outcome is met, and somewhere in that trade-off the fallback becomes a traditional contractual arrangement.
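That negotiation dynamic can be made concrete with a little expected-value arithmetic. A hypothetical sketch (all fee figures and probabilities are invented for illustration) of the customer's view: under an outcome-based deal the vendor charges a success premium but nothing on a miss, so the more confident the customer becomes that the project will succeed, the more attractive a plain fixed price looks.

```python
# Hypothetical comparison of a fixed-price deal vs. an outcome-based deal,
# from the customer's perspective. All figures are invented for illustration.

def expected_customer_cost(p_success, fee_on_success, fee_on_miss):
    """Expected payment under a deal with different fees on hit vs. miss."""
    return p_success * fee_on_success + (1 - p_success) * fee_on_miss

fixed_price = 1_000_000   # traditional engagement: paid regardless of outcome
outcome_fee = 1_400_000   # outcome-based: premium on success, nothing on a miss

for p in (0.6, 0.7, 0.8, 0.9):
    outcome_cost = expected_customer_cost(p, outcome_fee, 0)
    cheaper = "outcome-based" if outcome_cost < fixed_price else "fixed-price"
    print(f"P(success)={p:.0%}: expected outcome-based cost "
          f"${outcome_cost:,.0f} -> {cheaper} looks cheaper")
```

In this toy model the crossover sits where the success probability times the premium equals the fixed price; add in customers' aversion to month-to-month variability and the pull back toward a traditional fixed-price contract gets even stronger.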

What’s interesting is that based on recent research, this customer hesitance seems to be abating. In our 2H17 Digital Transformation Customer Research, we asked 165 global enterprises that are undertaking digital transformation initiatives to identify the pricing structures they’ve experienced, and outcome-based pricing emerged as the most common model globally.

As my colleague Jen Hamel points out in the report, “This indicates vendors have become more flexible and creative with pricing to convince clients to take the DT [digital transformation] leap but may see delayed ROI from DT skill investments as revenue depends on project success.”

As digital transformation continues to take root, the question of how vendors can shift to outcome-based pricing will only be asked more frequently, particularly as changes in the timing of revenue recognition from engagements impact vendors’ flexibility around resource investments. We are eager to watch (and to report) as best practices develop and new models emerge and would love to hear about what others think on this topic.

Drop a comment here or email me at [email protected].

 

Pricing research is not always about price

I recently read an article summarizing an onstage interview with Amazon CEO Jeff Bezos at the George W. Bush Presidential Center. During the interview, Bezos described Amazon’s data mindset:

“We have tons of metrics. When you are shipping billions of packages a year, you need good data and metrics: Are you delivering on time? Delivering on time to every city? To apartment complexes? … [Data around] whether the packages have too much air in them, wasteful packaging … The thing I have noticed is when the anecdotes and the data disagree, the anecdotes are usually right. There’s something wrong with the way you are measuring it.”

This is critical insight for market research practitioners, including those (like myself) focused on pricing. As analysts, we tend to dive deep into the facts and seek hard evidence. We rely on the data to tell the story and articulate the outcomes. Bezos isn’t saying that we should totally discount data. What he’s saying is that data gains value when it is re-examined in the context of the actual customer experience.

Pricing is an inherently data-driven exercise. IT and telecom vendors lean on transactional systems, price lists, research firm pricing databases, and other data-centric tools to make pricing decisions and determine appropriate price thresholds. Most of the pricing projects that we do on behalf of our clients start with the question, “Are we competitively priced versus our peers?” That is usually the most basic component of the results that we deliver.

What we’ve found over the years doing this work is that pricing in the ICT space is often more art than science, and that customer anecdotes about pricing are often as valuable and instructive to pricing strategy as the market pricing datasets we produce. Our approach to pricing research is rooted in interviews with representatives of the vendor and enterprise customer communities. In conducting these interviews, we’ll often uncover that the root issues with pricing, which were thought to be associated with the price itself, are actually broader issues — something related to value articulation, market segmentation, packaging or delivery efficiency. These aspects influence the customer experience, create pain points, and ultimately dictate willingness to pay and value capture.

When we deliver these results to our pricing research clients, the outcome is often not just a list or street price change but a rethinking of a broader pricing, go-to-market or customer engagement strategy. Clients will utilize customer anecdotes to rethink how they message a product in their marketing campaigns and content, devise a new approach to customer segmentation, or take a hard look at the delivery cost structure and resource pyramid levels that are driving their price position. In designing pricing research initiatives, we encourage our clients to think more broadly about pricing and incorporate multiple organizational stakeholders into the process, as this can uncover the true, unforeseen drivers of price position.

How does this compare to your organization’s approach to pricing? How important are customer anecdotes to your pricing processes? Drop a comment here or email me at [email protected].

 

Is the IT hardware market ready for Hardware as a Service?

Hardware as a Service — or maybe you call it PCaaS, DaaS or XaaS — basically refers to bundling some type of hardware (e.g., phones, PCs, servers) with life cycle services and charging a recurring fee over a multiyear contract. The customer never really owns the hardware, and the vendor takes it back at the end of the agreement.

Sure, it’s not a new concept. But the solution hasn’t exactly taken off like a rocket ship, either. So, is it going to? Maybe. Its initial speed may be more like a Vespa than a SpaceX Falcon, but there are a few things working in its favor.

Why do buyers want it?

  • Retiring hardware is a huge pain. I have talked to IT leaders who have literally acquired warehouse space solely to store old hardware they have no idea what to do with.
  • Staying up to date with tech gets easier. Management can no longer deny the negative impact on morale brought by an unattractive, slow and/or unreliable device.
  • Automation & Internet of Things (IoT) usher in new capabilities. Who doesn’t want to make managing hardware easier? Hardware as a Service is basically IoT for your IT department. Device management features like tracking device location and health are key functions of many IoT deployments and are a core selling point of Hardware as a Service offerings.

Why do vendors want to sell it?

  • Business models are changing. That darn cloud computing had to come along and change expense models, not to mention make it easier to switch between vendors. From Spotify and Netflix to Amazon Web Services and Salesforce, “as a Service” is second nature to IT buyers in both their personal and professional lives.
  • Creating stickiness. Hardware is more often perceived as “dumb” with the software providing the real value. If you’re a hardware maker (or a VAR), you need to make the buyer see your relationship as one that’s valuable and service-oriented versus transactional.
  • Vendors desire simplicity. Most vendors will tell you they have been building similar enterprise service agreements on a one-off basis for years. These new programs will hopefully create swim lanes to make it faster and easier for partners to build solutions.

Buyers are used to monthly SaaS pricing models, but that’s not really what creates the main appeal for Hardware as a Service. Buyers really want the value-added services and fewer managerial headaches.

So, how’s it going?

As someone who manages several research streams, I get to peek at results from a lot of different studies. Here are a few snippets of things I’ve heard and seen in the last month or so.

  • Personal devices: It certainly seems like there’s the most buzz around PCs, with Dell, HP Inc. and Lenovo all promoting DaaS offerings. I have also heard from enterprises doing initial DaaS pilots with as many as 5,000 PCs, but we seem to still be in very early stages of adoption. Both PC vendors and their channel partners are beginning to report “legit” pipeline opportunities tied to DaaS.
  • Servers: Outright purchasing or leasing servers is still the overwhelming choice of purchase method for about 90% of IT buyers recently surveyed by TBR. The perception that an “as a Service” model will be more expensive in the long run is the main customer concern to date, one that vendors will need to address by emphasizing the value-added life cycle services.
  • Hyperconverged infrastructure (HCI): A bundle of hardware and a services bundle? This is the bundle of bundles! Not too many HCI vendors are openly promoting an “as a Service” pricing model at this point, but 80% of current HCI buyers in TBR’s most recent Hyperconverged Platforms Customer Research indicated they are interested in a consumption-based purchasing model, particularly to enhance their scalability. About 84% of those surveyed are using HCI for a private or hybrid cloud buildout, so maybe a more cloud-like pricing model makes sense. Make no mistake, interest is not synonymous with intent, but it’s safe to say these buyers are at least paying attention to their purchasing options.

My general verdict is that things are still moving at Vespa speed. PCs have a head start over data center hardware based on the concerted go-to-market efforts of the big three OEMs and a consumption model that more closely aligns with the consumer services we’re used to. The second half of this year will be an interesting proving ground to see if the reported pipeline growth is converted to actual customers. Depending on how that goes, maybe we’ll see the data center guys making more serious moves in this space.

What do you think? Add a comment or drop me an email at [email protected].

 

Key findings from TBR’s upcoming HCI customer research

Hyperconverged infrastructure (HCI) is a growing market ripe with opportunity for vendors. TBR forecasts the market will reach $11.7 billion by 2022. Although TBR research indicates that incumbent vendors with a strong presence in the converged infrastructure (CI) market, such as Dell EMC and Cisco, have an advantage in the space, findings also indicate that a growing number of smaller vendors are rising in popularity. Add to that the approximately one-quarter of existing customers who indicated that brand is not a key factor in their decision making, and it becomes clear that the opportunity to take share from existing vendors is high. Further, with nearly three-quarters of respondents indicating they have not yet taken the plunge into the HCI space, there is massive opportunity, through strategic marketing and support, for vendors to turn new adopters into their customers.

HCI has a significant place in the cloud market

Eighty-four percent of respondents indicated they are leveraging HCI for either hybrid or private cloud installations. TBR believes this suggests that cloud is not necessarily an inhibitor to HCI adoption, as some vendors may perceive. Further, we believe this signals that consumption-based pricing options, which 81% of respondents indicated they would be interested in considering in the future, will encourage more HCI adoption. Consumption-based pricing enables customers to weigh HCI against public cloud on comparable terms, simply comparing performance and other features between the two to make purchasing decisions rather than weighing a capex purchase against a pay-as-you-go service. Vendors can capitalize on this flexibility with strategic marketing.

IT leaders play a crucial role in the HCI decision-making process

HCI remains a strategic purchase, as evidenced by the fact that 74% of respondents indicated IT directors and managers were among the decision makers. TBR believes that as customers become more familiar with HCI and their HCI vendor, they will be more likely to make repeat purchases and less likely to demand direct-from-vendor sales.

To learn more about TBR’s Hyperconverged Platforms Customer Research, contact Stanley Stevens ([email protected]) or your account executive.

 

Democratization now: It’s good for business

Data democratization is a hot topic now. Spokespeople from Google, SAP, Capgemini and other tech companies have spoken and written about how making data available to as many people as possible will both unleash the power of technology and prevent abuse of closely held data. Microsoft CEO Satya Nadella sprinkles his talks and interviews with references to democratization. TBR agrees that data access is critical to achieving artificial intelligence’s (AI) considerable potential, but access is not enough. For AI to deliver on that potential, business people with domain knowledge — the regular users — must be able to work directly with data, without intervening layers of developers and data scientists.

Data access is a conundrum. Ensuring appropriate privacy and security while making data available to as many people as possible is a challenge, one that is inhibiting the growth of widespread AI-driven data analysis. This post will not address that challenge, however. It focuses on one of the other growth inhibitors: the current need for data experts — scientists, engineers and “janitors” — as well as developers to extract the value from data.

Business users might see the hierarchy of data experts as a priesthood or a bureaucracy, standing between them and the data, but that is not really what is happening. Currently, there are no tools with which business users can conduct their own analyses, at least not without a lot of preparation by the data experts. Better tools are coming; there are R&D efforts worldwide to make data more directly accessible, which is part of what Nadella and other spokespeople are talking about.

Even before these democratic tools arrive, there is strong growth in AI and the utilization of data analytics, because the value is there. But the need for experts greatly increases the cost of analysis, so only analyses with the highest potential value are performed. As more democratic tools become available, many more analytic projects will be worthwhile and the use of analytics will grow much faster.

The impact of democratized analytics tools will be huge because the costs associated with the data expert hierarchy are great. Those costs go beyond just personnel. Communication between business users and data experts is time-consuming and expensive, and it lowers the quality and value of the analyses. Business users and data experts live in different worlds and have different vocabularies; misunderstandings are common. Misunderstandings are expensive, but what is worse, working through intermediaries slows the iterative process by orders of magnitude. The potential value of data lies in insights, and finding insight is an iterative process.

The history of business technology is one of progress propelled by increasing democratization of tools. The PC itself is a prime example, making computing directly available to business users. The internet rode a wave of disintermediation and self-service to its current global saturation. In terms of democratization of AI analytics, the best parallel is the PC spreadsheet, which made it possible for business people to create and tune their own quantitative models of business activities. Before the spreadsheet, creating those models required coding.

“Spreadsheets for AI,” one of which may well be a version of Microsoft’s Excel, will accelerate growth in analytics, data access, storage, cloud services and the Internet of Things. AI spreadsheets will not retard growth in the use of data experts; they serve a different market. Even with very good first versions, broad adoption will take years, so the acceleration of growth will not be sudden. Over the years, however, the ability of business users to directly analyze their data will contribute substantially to the revenue of IT vendors and to that of their customers.