We need to talk about the data

TBR believes that creating common terminology and understanding around data is key to successfully implementing an evolutionary digital transformation strategy, one that enables the organization to transform incrementally, as it capitalizes on new opportunities and deals with new challenges. Essential to this approach to digital transformation is an organizational cultural transformation, one that embraces continual innovation and ongoing collaboration across departments and disciplines and that enlists all parties in the process of harnessing organizational data assets to move the organization forward.

The many uses and users of data

It is commonly accepted that IoT represents the intersection of IT with operations technology (OT). This is true, but only part of the story. Business management is another key player in many projects, and TBR believes it should be a component in all IoT projects. In fact, potential users of data from IoT projects extend beyond these three stakeholders to include many departments throughout an organization, such as marketing and sales. To deliver the greatest possible value from any project, including but not confined to IoT, all the potential users of the data should be considered in the design and evolution of each project.

In the early stages of the recent surge in IoT, three to four years ago, the different stakeholders were often brought together for workshops or ideation sessions to invent new solutions made possible by IoT. As IoT has become more common and relevant players are more familiar with common use cases such as status monitoring and asset tracking, there has been less need for this challenging and expensive invention phase of IoT projects. Instead, new projects are often undertaken entirely or almost entirely by OT, sometimes working with IT to ensure compliance with company standards. These projects can confidently deliver a positive ROI while using the data for only a single purpose, usually operational efficiency. Other potential uses for the data, or for data that could be generated by the solution, are often not considered in the design. This can be a waste.

The data generated by an IoT project often has value beyond the immediate purpose of the project. For instance, data from a status monitoring solution can be used to identify patterns that could predict service-related incidents. Similarly, comparing status reports across different assembly lines or factories might help identify superior or deficient configurations. Status reports could be correlated with operations speed to help identify either capacity problems or the potential for greater capacity. Capacity limitations or windfalls affect both marketing and sales.

The same kind of repurposing potential can be found in most IoT projects. Data has multiple uses, and different people within the organization will recognize different ones. Uses can be classified as short term and long term. Status data is valuable immediately; indeed, for the purpose of reacting quickly to status deviations, the data has no long-term value. A solution built for only that purpose would often discard the data to minimize project cost, sacrificing the data's potential value for long-term analyses. To extract the greatest value and meet broader organizational needs, other people in the organization should be involved in the project design.
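To make the short-term/long-term distinction concrete, the minimal sketch below shows how a single status reading can serve both purposes: it triggers an immediate alert for rapid reaction and archives the raw reading for later analyses instead of discarding it. All names, fields and thresholds here are hypothetical, invented purely for illustration.

```python
# Illustrative sketch only: a hypothetical status-monitoring handler that routes
# one IoT reading to both a short-term use (alerting) and a long-term use
# (archiving for later analysis). Names and thresholds are invented.
import json
import time

ALERT_THRESHOLD_C = 85.0  # hypothetical operating limit

def handle_status_reading(reading: dict, alert_queue: list, archive: list) -> None:
    """Route one sensor reading to immediate alerting and long-term storage."""
    # Short-term use: react quickly to a status deviation.
    if reading["temperature_c"] > ALERT_THRESHOLD_C:
        alert_queue.append({
            "asset_id": reading["asset_id"],
            "message": f"Temperature {reading['temperature_c']}C exceeds limit",
            "ts": reading["ts"],
        })

    # Long-term use: keep the raw reading so maintenance, capacity and marketing
    # analyses can run against it later, rather than discarding it after the check.
    archive.append(json.dumps(reading))

# Example usage with a single synthetic reading
alerts, history = [], []
handle_status_reading(
    {"asset_id": "line-3-pump", "temperature_c": 91.2, "ts": time.time()},
    alerts, history,
)
print(len(alerts), "alert(s);", len(history), "reading(s) archived")
```

The design choice is simply to route each reading twice; the incremental cost of the archive path is small compared with the analytical value it preserves for other users of the data.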

5G deployments are expanding, but CSP revenue generation will remain minimal in the short term

5G is becoming a must-deploy technology for an increasing number of CSPs globally, prompting accelerated build-out timelines and broader rollouts

The 5G era is progressing as several countries expanded or commenced commercial 5G deployments in 2019, particularly the U.S., South Korea, Australia, the U.K., Germany, Spain, Italy and Switzerland. An increasing number of communications service providers (CSPs) globally, predominantly in developed countries, are accelerating and broadening the scope of their 5G build-outs. There are a few reasons for this pull forward, including the need for CSPs to stay competitive for customers of traditional mobile broadband and high-speed internet services, to reduce the cost per gigabyte of carrying traffic (network opex efficiencies) and to build a foundation in preparation for new use cases of the network. The availability of 5G devices, including a variety of smartphones, in 2019 is another key driver prompting earlier infrastructure investment.

Though 5G deployments are accelerating, TBR expects CSP 5G revenue generation will be minimal in the short term as consumers will be slow to adopt 5G devices due to their high prices and limited initial 5G service coverage. TBR believes business customers will provide the greatest opportunity for long-term 5G revenue generation as use cases requiring the ultra-low latency and accelerated data speeds enabled by 5G will be more prevalent in the enterprise space. CSPs are positioning to support enterprise 5G use cases by investing in innovation centers and targeting private 5G network customers.

TBR’s 5G Telecom Market Landscape tracks the 5G-related initiatives of leading operators and vendors worldwide. The report provides a comprehensive overview of the global 5G ecosystem and includes insights pertaining to market development, market sizing, use cases, adoption, regional trends, and operator and vendor positioning and strategies.

Will algorithm patents replace economies of scale as the most critical barrier to entry?

Manufacturing scale matters less as we pivot to a knowledge economy

Economies of scale as a barrier to entry have been a fundamental precept taught for years in economics classes worldwide. Capital had to be invested before value could be created, and then people could be hired to staff the capital equipment to produce goods. Having both capital assets and existing volume gave companies a distinct competitive advantage. It drove both vertically integrated companies and horizontal holding company models, the latter made famous by Jack Welch’s oversight of U.S. blue chip company General Electric.

Technology today has greatly reduced scale as a competitive advantage. Virtualization and abstraction have led business theorists to talk increasingly about asset-lite business models and asymmetric competition. Clouding this pivot is the emerging discussion around consumer scale: a competitive edge gained not necessarily from capital scale but from capturing consumer brand loyalty, which generates the scale. This concept is often discussed as the “force multiplier” or the network effect of the ecosystem. It is giving rise to additional new terminology, such as multi-enterprise business networks, in which partnering and the joining of complementary assets enable all participants to benefit from the aggregation of intellectual property, which is fed to the entire ecosystem of loyal customers.

Humans have the big ideas; curating those ideas into scalable advantage requires technical skills, automation and patent protection

When consumer loyalty generates cash, that cash can be deployed to fund small-scale, smaller-dollar-volume projects, with the company in effect acting as an internal venture capital (VC) arm for future product and service innovations. This concept manifests itself in the notion of fast failure and rapid iterations that are anathema to scaled manufacturing best practices. Being successful requires having people who are insightful about what businesses or consumers want and how to turn those wants into an automated piece of software — in short, algorithms.

As virtualization and software abstraction move the economy ever closer to utility computing, first discussed in the late 1980s by technology futurists, and as quantum nears economic advantage, the mission-critical business competency will be writing algorithms to apply against the ubiquitous data traffic being generated and stored throughout the computing utility network. Faster compute leads to faster exploration and discovery. Faster discovery leads to shorter product and service cycles and therefore shorter competitive advantage windows.

As such, algorithms that generate these new insights will increasingly become the way enterprises generate wealth, as well-skilled individuals push the limits of conventional wisdom and then deliver these new insights. Preserving that ever-shortening advantage will come from increased vigilance in protecting intellectual property. Thinking and creativity provide the advantage. We hear time and again at analyst conferences about how skills are in short supply and how people are a firm’s greatest asset. TBR expects to hear more frequently about the patent protections around these automated ideas.

Clean blockchain data fed to quantum will accelerate the value of algorithm patents

Accurate data will be available in real time for these algorithms to run against, generating real-time decision-making guidance. As automation removes more and more human toil from the economy, only individuals at the point of creation or the point of consumption will remain critical to the business. Algorithms will mine consumer demand to test the next big idea from well-skilled humans, converting it into competitive advantage by running against real-time, accurate data.

As explored further in TBR’s Quantum Computing Market Landscape, in the quantum computing realm, where insights and actions can be obtained exponentially faster, the IP advantage is also exponentially greater. Think of the traveling salesman example that comes up regularly in quantum conversations: If a delivery company can patent an algorithm that speeds up delivery rounds and makes deliveries more efficient overall, that could swiftly create extinction events in the delivery market. If we extrapolate this, emerging technology has the potential to fundamentally alter competitive landscapes by generating faster and more accurate insights.
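To show why the traveling salesman example keeps coming up, the sketch below brute-forces the best route through four made-up stops and then counts how quickly the number of possible routes grows. It illustrates the classical bottleneck, not a quantum algorithm, and every coordinate is invented.

```python
# Illustrative sketch only: brute-force route search for a tiny traveling
# salesman instance. The coordinates are made up; real fleets route far more stops.
from itertools import permutations
import math

stops = {"A": (0, 0), "B": (2, 3), "C": (5, 1), "D": (1, 6)}

def route_length(order):
    """Total distance of visiting the stops in the given order and returning home."""
    path = list(order) + [order[0]]
    return sum(math.dist(stops[path[i]], stops[path[i + 1]]) for i in range(len(path) - 1))

best = min(permutations(stops), key=route_length)
print("Best route:", " -> ".join(best), f"({route_length(best):.2f} units)")

# The catch: the search space grows factorially with the number of stops.
for n in (5, 10, 20):
    print(f"{n} stops -> {math.factorial(n - 1):,} route orderings from a fixed depot")
```

Any patented shortcut through that combinatorial wall, quantum-accelerated or otherwise, is exactly the kind of IP advantage described above.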

TBR analysts will be attending the Quantum.Tech conference Sept. 10-11 in Boston. Please contact your account executive to coordinate a conversation with TBR analysts at the event.

HCLT improves position in DMS with portfolio investments, but must leverage its niche expertise to capitalize

Digital marketing services provide HCLT with an entry point for transformation opportunities

As clients look to transform CX and pursue omnichannel projects using technology solutions, the DMS space provides growth opportunities for vendors that can generate engagements by bridging CX offerings with digital platforms to drive clients’ marketing campaigns. Bringing data, collected from sources throughout clients’ organizations and combined with analytics, to the center of the engagement will lead to future initiatives for both the client and vendor.

While HCLT has traditionally avoided large-scale investments around its DMS portfolio, the company has recognized demand for services and growth opportunities within the DMS space, which we believe guided the company’s March 2019 launch of a digital marketing platform, HCL ADvantage Experience. Built on Adobe Experience Cloud, the platform works with multiple marketing sources to collect and store customer data that supports clients’ user experience, and its compatibility with legacy systems and improved user integration on a DevOps framework enable HCLT to quickly scale clients’ marketing campaigns. The platform will strengthen HCLT’s position to capture application services opportunities, but the company will face pressure from other vendors that have developed similar platforms, limiting its ability to differentiate and compete for growth opportunities outside of existing clients.

While HCLT’s partnership with Adobe does not necessarily provide an enhanced position for a vertical play, integrating HCLT’s engineering and R&D services capabilities and legacy data from its manufacturing and automotive expertise would enable HCLT to leverage a vertical strategy, better connect with vertical industry clients and begin to create separation from competitors.

Additionally, HCLT used its April 2019 acquisition of Strong-Bridge Envision, a U.S.-based digital consultancy, to expand the strength of its Mode 2 services and solutions to support business outcomes for clients through data insights. Strong-Bridge Envision joined HCLT’s Digital & Analytics portfolio, bolstering HCLT’s position within the DMS space in the U.S. and supplementing existing offerings, allowing HCLT to pursue consulting-led engagements with more specialized expertise in digital strategy, business transformation, CX and organizational change management. We expect HCLT will look to expand wallet share and mindshare with existing clients as well as generate consulting-led opportunities, but the company may face challenges in earning permission for C-Suite-level conversations. Focusing on its mature verticals, such as financial services, technology and services, and manufacturing, which collectively contributed 57.3% of total revenue for HCLT in 1Q19, may be an easier path for the company to follow as it holds stronger client relationships and market share there. While HCLT is able to pursue opportunities within other verticals, we believe financial services, technology and services, and manufacturing serve as a starting point from which HCLT can build its brand around DMS and DT-related consulting before expanding into other areas.

TBR estimates 30% of total global CSP spend (capex and external opex) will be on or related to NFV/SDN in 2023

5G will push CSPs to accelerate and broaden their NFV/SDN initiatives

According to Technology Business Research, Inc.’s (TBR) latest NFV/SDN Telecom Market Forecast, covering 2018 to 2023, 5G will push CSPs to adopt a new network architecture and both NFV and SDN will be critical aspects of that architecture going forward. As such, TBR expects NFV/SDN-related spend growth will correlate with 5G deployments. Since CSPs will need to upgrade their networks from an end-to-end perspective to realize the full potential of 5G, this will naturally drive CSPs toward the virtualization and cloudification of their networks. This trend will impact most, if not all, of the major network domains from an NFV/SDN perspective over the next five years. TBR notes that 5G core is inherently virtualized and that this will also naturally push CSPs deeper into the NFV/SDN space over the next five years as they transition to a stand-alone 5G network.

Rakuten’s legitimization of vRAN will also drive NFV/SDN market growth

Though significant skepticism remains in the industry about whether Rakuten will be able to make the vRAN model work, should the company succeed, TBR believes its success would embolden CSPs to double down on their own NFV/SDN initiatives, especially as they relate to vRAN. RAN is one of the costliest domains in the construction of a network, and it is a key area CSPs will be keen to virtualize to reap cost savings.

White-box adoption will proliferate, portending significant OEM disruption

TBR expects the use of white-box hardware in NFV/SDN environments will proliferate through the forecast period, accounting for 60% of NFV/SDN hardware spend in 2023, up from 15% in 2018. This industry shift toward white-box hardware will significantly disrupt incumbent OEMs’ business models, prompting them to evolve into software-centric companies. Industry organizations such as the Open Compute Project (OCP) and initiatives spearheaded by leading CSPs such as AT&T will fuel the rapid uptake of white boxes during the forecast period.

Telefonica models the transition of a traditional telco to a digital service provider

Telefonica represents a prime model of the opportunities and challenges telecom operators will experience as they evolve into digital service providers. The digital era will enable telecom operators to become more agile and profitable as they transition away from more costly legacy network technologies and business models. The digital era will place greater expectations on telecom operators, however, as customers will turn to digital service providers to support a broader range of services and use cases.

Telefonica will benefit from recognizing that it is just a single entity competing in a vast digital ecosystem composed of a multitude of players, including other telecom operators, webscales and OTT video providers. Collaborating with the broader technology industry will enable Telefonica to reduce the cost of developing in-house solutions while enabling the company to more effectively support customers’ digital ambitions.

The 2019 Telefonica Industry Analyst Day showcased Telefonica’s (NYSE: TEF) evolution from a traditional telecom operator to a digital service provider (DSP). Telefonica is positioning as a leading global DSP through its progress in virtualizing and cloudifying its network and IT systems as well as the company’s capabilities in emerging technologies, including AI, machine learning (ML), big data and edge computing. Telefonica’s digital transformation initiatives are yielding significant cost savings as the company modernizes its network infrastructure and customer service platforms while creating new services to enhance user experience and support advanced use cases. Telefonica will face challenges in the digital era, however, including growing competition from webscales, regulatory hurdles, and unproven demand for use cases in areas including 5G and edge computing.

Atos at the edge of technology

With the launch of BullSequana Edge and investments in quantum computing, Atos takes a pragmatic approach to executing digital transformation initiatives

For the past few years, big data has created ebbs and flows in buyer behavior, both in end-consumer interactions and in IT purchasing patterns, as is typical of any technology. Consumerization of business applications, demand for data quality and governance, and the adoption of connected technologies compel vendors such as Atos to explore opportunities around managing customer data and to invest in solutions that can help them protect their competitive edge.

During the Atos Technology Days 2019 opening keynote, Atos CEO Thierry Breton announced BullSequana Edge, the company’s first edge server to power AI everywhere. Built to deliver substantial computing power, with 165-teraflop capacity, against data stored locally, the appliance enables Atos to address the coming shift in data localization. Breton stated that while 80% of data globally is stored in data centers and in the cloud, that percentage is expected to shrink to 20% by 2025 as clients seek ways to analyze data in real time at the edge, where it is created. The addition of AI capabilities amplifies Atos’ position as an end-to-end services provider within the IoT space and closes gaps in the asymmetrical relationship between IT and operational technology (OT) departments.

Additionally, BullSequana Edge helps Atos address the challenges of exponential data volumes and heterogeneous data complexity driven by the advent of AI and machine learning (ML), necessary building blocks of the data economy. With optimized security capabilities including intrusion detection, disk encryption and secure boot, the BullSequana server enables Atos to alleviate common pain points of IT and OT, especially as the company builds and offers vertical-centric solutions with the hardware. Although offering a hardware appliance moves Atos further from pure systems integrators (SIs), which typically manage asset-light portfolios, it brings Atos closer to key IT buyers, which remain the centralized governing body for final IT purchases, even in discussions that include the C-Suite.

Atos also continues to enhance its quantum computing capabilities. As TBR wrote in its May 2019 Digital Transformation Insights Report, which focused on quantum, “Atos took its strengths in design computing for appliances and programming and emulation environments and announced several quantum research initiatives, including the opening of a global R&D lab in Yvelines, France, and Atos Quantum Learning Machine (QLM) implementations in Europe and the U.S. to enable clients to experiment with disruptive technologies, tackle the explosion of data and accelerate the number of practical use cases across industries. Additionally, about a year ago, Atos developed a consulting practice around quantum computing to educate and advise clients on whether it is possible to use quantum to accelerate business applications. During Atos Technology Days 2019, Atos announced myQLM, a light version of a QLM, which is an on-premises environment designed for quantum software developers. Users can download myQLM on their desktops and use a set of algorithms to train remotely, at home, at universities and research centers, and simulate the actual QLM. Python-based, myQLM allows students and researchers to develop and share code within the community, creating additional entry points for Atos’ broader services portfolio. With customers ranging from universities and research centers to high-performance computing ecosystems and commercial clients, Atos is building one use case at a time. For France-based oil and gas company Total, Atos is using a QLM simulator to accelerate the analysis of seismic activities, helping Total stay ahead of competitors. Atos is also working with Bayer and RWTH Aachen University in Germany to evaluate the use of quantum computing to research and analyze human disease patterns.”
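For readers curious what a developer’s first myQLM experiment might look like, the sketch below builds and simulates a simple two-qubit entangling circuit. It follows our reading of the publicly documented myQLM interface; module and function names may differ by version and should be checked against Atos’ documentation.

```python
# Rough sketch of a first myQLM experiment: build a two-qubit Bell-state circuit
# and simulate it locally. Module and function names follow our reading of the
# public myQLM documentation and may vary by version.
from qat.lang.AQASM import Program, H, CNOT
from qat.qpus import get_default_qpu

prog = Program()
qbits = prog.qalloc(2)                # allocate two qubits
prog.apply(H, qbits[0])               # put the first qubit in superposition
prog.apply(CNOT, qbits[0], qbits[1])  # entangle the pair

circuit = prog.to_circ()
result = get_default_qpu().submit(circuit.to_job())

for sample in result:
    print(sample.state, sample.probability)  # expect |00> and |11>, roughly 0.5 each
```

Per Atos’ positioning, programs developed this way against the local simulator can later be run against a full QLM.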

Our analysis further states that we expect Atos to “monitor the underlying scientific and engineering breakthroughs of the competing architectural investments and accelerate the commercial utility through its development of leading-edge use case applications” in concert with its business partners as it looks to quantum to gain a competitive advantage. For Atos, partner ecosystems, such as with Fujitsu, 1QBit, Rigetti, IBM Q and D-Wave, will play a critical role in its ability to accelerate the development of business use cases.

Navigating the dynamics of the IT services market demands that vendors demonstrate innovation, which allows them to gain trust with new buyer personas. While many of Atos’ SI peers have invested heavily in areas such as marketing services to better appeal to the CMO buyer for customer experience opportunities, the company relies on its full technology stack to expand its addressable market. While some investments, such as in quantum, may not have a sizable, tangible impact on its performance today, Atos will benefit from being first to market once economic advantage is achieved.

According to a TBR special report on quantum and economic advantage, “What we do not know is how fast quantum computing will take off and what impact it will have on our knowledge as a society. What we do know is that it will take off — algorithm by algorithm, as economic advantage is achieved incrementally — and fundamentally change what we know.” TBR’s Digital Transformation Insights Report further states, “As vendors continue to battle business process and technological hurdles across all three phases of digital transformation — substitution, extension and transformation — quantum will be the one technology that will fundamentally transform enterprises and the way they go to market and sell products and services,” placing Atos in the spotlight through its innovation investments.

Atos Technology Days 2019, held in Paris on May 16 and 17, displayed myriad practical applications of emerging technologies. Held alongside one of the largest events centered on startups in Europe — Viva Technology, with 124,000 attendees, 13,000 startups and 3,300 investors — Atos Technology Days presented an excellent platform for Atos to showcase its innovation capabilities across its entire portfolio stack, including hardware, applications and services. Running under the theme Welcome to the Post-cloud Era, the presentations walked over 200 clients, partners, executives and industry analysts through Atos’ vision of the IT of tomorrow, centered on the edge, quantum, IoT and cybersecurity as well as Atos’ ability to stitch it all together to deliver business outcomes to its customers. Data is exploding, and Atos is preparing to accommodate this phenomenon by effectively managing, storing, securing and analyzing data.

PTC’s innovative outlook, robust solution toolbox, and legacy in CAD and PLM make it a valuable IoT partner

Strategic findings

Shift in focus to AR/VR

In our 2018 LiveWorx EP we suggested a shift from an emphasis on PTC’s ThingWorx IoT platform to PTC being more vocal about Vuforia, its AR/VR solution, and its wider product portfolio. TBR believes that shift has continued, with much of the messaging centered on the business implications of augmented reality and on how PTC’s entire product base works in concert, and less focus on ThingWorx as the tip of the spear for digital transformation.

This shift makes sense. The IoT platform space is saturated with established vendors, along with several smaller entrants, offering some form of IoT platform. PTC has the key components for an IoT platform, but so do others, including the giants Amazon Web Services (AWS), Microsoft, IBM, Oracle and Google, and OT stalwarts such as Bosch and Siemens. It is hard for PTC to stand out by messaging its IoT platform alone, despite a robust offering, as the IoT platform market is crowded. TBR believes the shift could also indicate IoT is not growing quite as fast as PTC hoped.

Instead, PTC has increased its messaging around AR/VR. TBR believes PTC is positioning AR as a new differentiated niche to bring customers into its wider ecosystem: a “wow” factor distinct from peers’ offerings that also enhances the value of other products such as Creo, Windchill and ThingWorx. Based on the compelling presentations, messaging and customer lineup using Vuforia, TBR believes PTC has a competitive AR/VR product.

PTC’s pitch is that AR helps customers add the human element to an IoT solution: instead of insight coming from dashboards in the boardroom, it is delivered in real time on the factory floor. Conversely, in PTC’s view, AR/VR helps feed data into the IoT solution. Information about what workers see, such as a fire, a faulty part, parts that need to be replaced or unsafe conditions, can be fed into a centralized IoT platform, much like readings from a sensor inside a machine. Ultimately, PTC seeks to “decorate” the industrial world with real-time information and extend the value of IoT data through AR. It remains to be seen how well AR contributes to feeding data into an IoT solution. TBR believes AR is not there yet but that PTC did a good job of showing how AR can provide an actionable UI and make an IoT solution more operationally effective.
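As a rough sketch of the “AR as another sensor” idea, the example below shows how an AR headset application could post a worker’s observation to an IoT platform’s event endpoint. The endpoint, payload schema and credential are hypothetical; this is not PTC’s or ThingWorx’s actual API.

```python
# Illustrative sketch only: how an AR client could report a worker's observation
# to an IoT platform as if it were another sensor. The endpoint, payload schema
# and token below are hypothetical, not PTC's or ThingWorx's API.
import json
import urllib.request

def report_observation(asset_id: str, observation: str, severity: str) -> None:
    payload = json.dumps({
        "source": "ar-headset-07",   # hypothetical device identifier
        "asset_id": asset_id,
        "observation": observation,  # what the worker saw, e.g., a frayed belt
        "severity": severity,
    }).encode("utf-8")

    req = urllib.request.Request(
        "https://iot-platform.example.com/api/events",  # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer <token>"},    # placeholder credential
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print("Platform accepted event:", resp.status)

# Example: a worker flags a worn belt on a conveyor through the AR interface
# report_observation("conveyor-12", "drive belt visibly frayed", "high")
```

Once ingested, such an event would be modeled and routed alongside machine telemetry, which is what lets a human observation be treated much like a sensor inside a machine.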

Key outcomes PTC messages around AR/VR include reducing complexity by allowing workers to always have information on parts and machines; ensuring quality control and compliance through step-by-step checklists; and improving efficiency through gamification. AR/VR also offers a drastic reduction in training time, as the Vuforia Expert Capture (formerly Vuforia Waypoint) solution allows expert employees to transfer knowledge to novice workers, or allows a machine or solution vendor to train a new customer’s IT or OT team.

PTC has a lineup of customers leveraging its Vuforia technology as proof points. Customers seem to adopt in two ways: some, such as Howden and Aggreko, leverage PTC’s polished tools Vuforia Expert Capture and Vuforia Studio, while others, such as Fujitsu and Caterpillar, build on PTC’s foundation, leveraging Vuforia Engine to create proprietary solutions.

How well Vuforia is performing monetarily remains unclear to TBR. TBR expects many Vuforia customers are in the pilot and proof-of-concept stages, which could indicate Vuforia is not yet being fully monetized while in multiple trials. However, in speaking about PTC’s strategic partnership with Rockwell Automation, PTC CEO Jim Heppelmann noted 40% of Rockwell Automation’s IoT wins have included AR, with joint customers particularly interested in Vuforia Expert Capture. According to Heppelmann, Vuforia contributes 7% of PTC’s current software revenue, a respectable amount compared to its larger legacy PLM and CAD businesses, with growth of 80% year-to-year (from what TBR expects is a very small base). He also noted the AR-IoT combination is a core growth business for the company and expects it to contribute one-third of PTC’s sales moving forward, with continued growth of nearly 40% year-to-year.

An interesting thread we have not seen PTC discuss, publicly or privately, is offshoots of Vuforia for the consumer market, such as leveraging Vuforia Expert Capture for consumer self-help applications; for example, instead of a YouTube video on how to tie a complicated knot, a VR experience guiding people through the steps could be more impactful. This could be expanded to cooking, exercise or sewing guides, among a huge pool of opportunities. Microsoft and the HoloLens team could be a good partner for these applications, such as by leveraging the Xbox install base to reach consumers (if Microsoft is not already moving in this direction alone), and could help foster a content creator network. The technology could also be leveraged by consumer-focused businesses to educate their end customers, such as sporting goods company Coleman delivering a VR walkthrough of setting up a tent.

In a day of personal stories, EY showcases the results of corporate commitment to talent recruiting

A small but influential group from EY’s leadership team, including incoming Chairman and CEO Carmine Di Sibio, was on hand in a newly redesigned wavespace to recognize the winners of the EY NextWave Data Science Challenge. An extension of the program deployed in Australia last year, this global challenge resulted in 12,000 submissions from 4,500 participants from 477 universities in 15 countries.

The basic challenge: Predict human traffic patterns

The project centered on a data set, provided by EY partner Skyhook, of citizens in the greater Atlanta area. The challenge was to take the citizens’ locations as of 3 p.m. and predict where those citizens would be located at 4 p.m. EY Global Analytics Program Director Antonio Prieto, who spearheaded this effort, which will be expanded on in November, stated the intention was to connect students to a challenge that resonates with EY’s mission of building a better working world, which can be done through analytics-optimized smart cities.
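For context on what contestants were optimizing against, the sketch below scores the simplest possible baseline, assuming each citizen stays put between 3 p.m. and 4 p.m., on synthetic data. It is purely illustrative: it does not use the Skyhook data set, the challenge’s actual scoring metric or anything resembling the winners’ models.

```python
# Illustrative sketch only: a naive persistence baseline for predicting 4 p.m.
# locations from 3 p.m. locations, scored on synthetic data.
import math
import random

random.seed(7)

# Synthetic (x, y) positions in kilometers for 100 hypothetical citizens
positions_3pm = [(random.uniform(0, 30), random.uniform(0, 30)) for _ in range(100)]
# Simulate 4 p.m. as 3 p.m. plus a small random move
positions_4pm = [(x + random.gauss(0, 1.5), y + random.gauss(0, 1.5))
                 for x, y in positions_3pm]

def persistence_predict(p3pm):
    """Baseline: assume nobody moves between 3 p.m. and 4 p.m."""
    return list(p3pm)

predictions = persistence_predict(positions_3pm)
mean_error_km = sum(math.dist(p, a) for p, a in zip(predictions, positions_4pm)) / len(predictions)
print(f"Persistence baseline mean error: {mean_error_km:.2f} km")
```

Simple baselines like this are the usual yardstick in location prediction, which is why the seemingly slight score improvements discussed below mattered.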

Participants were allowed to enter multiple submissions as their models evolved and as they generated new “what if” scenarios. The award winners received cash prizes, EY badges and EY internships. The winners and their locations were:

First Place: Sergio Banchero is studying electronics in Australia and is a native of Brazil.

Second Place (shared): Katherine Edgley and Philipp Barthelme shared the second-place prize and are both studying applied mathematics at the University of Edinburgh.

Third Place: Chia Yew Ken of Singapore has an affinity for natural language processing and finds the parallels to AI pattern recognition interesting.

Each participant presented their basic findings and discussed the underpinning mathematical calculations and manipulations at a depth that challenged this mature worker with a liberal arts background to comprehend. The incremental improvements on the algorithm scores seemed slight until put into context by Banchero, who translated his algorithm’s net improvement over the average of all submissions as ultimately capable of reducing 3,200 pounds of CO2 emissions, which would require rain forest acreage equivalent to 16 football fields to remediate naturally.

The state of cloud profitability has never been stronger

More than a decade after taking a leap of faith, cloud vendors prove profit possibilities

For vendors such as Microsoft (Nasdaq: MSFT), Oracle (NYSE: ORCL) and SAP (NYSE: SAP), offering cloud solutions required them to leave the safe and profitable confines of their traditional software businesses, where they were confident in the business models and drove consistent double-digit operating margins. Even for born-on-the-cloud companies such as Salesforce (NYSE: CRM) and Workday (Nasdaq: WDAY), the lack of short-term profit required them to adjust funding requirements and sell this new business model to potential investors. All vendors that chose to participate in the nascent market had to take on the cloud financial risk without a clear picture of when or how their businesses would reach sustainability and profit.

More than a decade after the initial cloud transition, nine of the leading providers in the space, which come from a variety of business backgrounds, are proving out the benefits of cloud business models. It has taken adjustments to almost every major category of financial and operational strategy, but profitability has improved significantly and is gradually approaching the levels seen with traditional software businesses. In summary, the state of cloud profitability has never been stronger.

Gross profit gets little attention but delivered most of the improvement to cloud profit

The direct costs of delivering a solution, and their complement, gross profit, get little attention in the cloud business model discussion. Although shifts in sales and marketing strategy may be more attention-grabbing, gross profit and cost of goods sold have made the bigger impact on overall cloud profitability. As shown in Figure 2, the “big nine” cloud vendors have increased cloud gross margin by 5 percentage points over the last three years. At 65%, cloud gross margin is still lower than the traditional software gross margin of close to 85%, but it has improved significantly for the cloud businesses. The improvements have been driven by a variety of factors, most notably the following (a simplified illustration of the margin arithmetic appears after the list):

  • Increased scale of data centers: For IaaS vendors that own and operate core data center locations and infrastructure, their growing scale has led to greater cost-effectiveness. The cost of IT infrastructure has gone down, and automation allows vendors to operate data centers more efficiently. Additionally, there is a greater availability of third-party services such as colocation, which allows cloud providers to cost-effectively scale to new regions and expand capacity.
  • Professional services cost declines: As vendors across all cloud service types initially rolled out their services, most of the professional service needs were met by the providing vendor out of necessity. However, as these platforms and services have scaled, the pool of third-party skills has expanded, shifting much of the responsibility and opportunity for service engagements away from the cloud vendors and toward the partner ecosystem, allowing cloud providers to focus on the higher-margin cloud solutions.
  • Declining acquisition-related costs: Acquisitions played a large role in the establishment of cloud computing leaders. IBM (NYSE: IBM) buying SoftLayer, Oracle purchasing NetSuite and SAP buying SuccessFactors are just three examples of the purchases that have shaped the market over the past decade. Many costs of those purchases are reflected in the acquiring organization’s cost of goods sold. As the scale of cloud businesses has grown following the large acquisitions, overall gross margin has rebounded.
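To ground the margin discussion, the sketch below works through the gross margin arithmetic with entirely hypothetical revenue and cost figures. It is not TBR’s Figure 2 data; it only illustrates why a few points of gross margin on a growing revenue base move overall cloud profitability so much.

```python
# Illustrative sketch only: gross margin arithmetic with hypothetical figures
# (in $ millions), not TBR's Figure 2 data.
def gross_margin(revenue: float, cogs: float) -> float:
    """Gross margin = (revenue - cost of goods sold) / revenue."""
    return (revenue - cogs) / revenue

# Hypothetical cloud business: $10B of revenue then, $20B now
rev_then, cogs_then = 10_000, 4_000   # 60% gross margin
rev_now,  cogs_now  = 20_000, 7_000   # 65% gross margin

print(f"Gross margin then: {gross_margin(rev_then, cogs_then):.0%}")
print(f"Gross margin now:  {gross_margin(rev_now, cogs_now):.0%}")

# Each point of gross margin on the larger base is worth $200M of gross profit,
# so a 5-point improvement adds roughly $1B a year before any change to sales
# and marketing spend.
print(f"Value of 5 points on the current base: ${0.05 * rev_now:,.0f}M of gross profit per year")
```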