Voice assistant volume is increasing

A survey conducted by Adobe Analytics found that 32% of U.S. consumers owned a smart speaker in September 2018, compared to 28% in December 2017. The report also projected that nearly half of the U.S. consumer base could own one by the end of December 2018, supported by Adobe Analytics’ finding that nearly 80% of smart speaker sales occur during the holiday season. It is just one study, and there are more conservative studies out there ― but even if the data isn’t completely on the mark, it points to a clear trend: voice-controlled devices are gaining ground inside consumers’ households even though use cases and monetization remain blurry.

I own four Amazon Alexa-enabled devices myself: two Echo Dot smart speakers and two Fire TVs. Of the Echo Dots, one was given to me by a colleague to play around with, and another I bought for about one-third the list price from acquaintances who had received it as a gift from their extended family and left it unopened because they felt it was “too creepy.” In our household, the Echo Dots have been used as glorified hands-free music players in our kitchen and one of our bathrooms. The Fire TVs are used as media players first and foremost. Sometimes, we try some of the new skills Amazon sends along in update emails as a fun diversion, but usually that is a one-off activity. I am deeply invested in the Amazon ecosystem, having been a Prime member since its debut and a fan of Prime Video, but it is still challenging to find ways to use Alexa smart-home devices to enhance my other Prime benefits or drive me to Amazon’s e-commerce business.

Adobe’s research seems to align with my anecdotal experience, noting that the most common voice activities* are asking for music (70% of respondents), asking about the weather (64%) and asking fun questions (53%); these are the only activities above 50%. So yes, people are using these devices, but the activities involved do not require much depth or complexity, nor do they drive additional revenue for Amazon.

Therein lies the problem for voice-platform providers such as Amazon and Google (Microsoft and Apple are also players, but I don’t believe they are as developed as Amazon and Google are in the smart speaker and voice assistant space). In an ideal world, voice assistants would provide platform companies with a wealth of consumer data as users query the devices about their everyday needs. Voice assistants can also be a new conduit to monetization through new applications or — especially in Amazon’s case — through lowering barriers to the purchase of goods. However, most complex tasks, such as ordering a ticket for the movie you’d like to see tonight, finding out when the beach is open, or buying an outfit for an upcoming wedding, are still much easier via a smartphone or laptop interface. The Adobe study found that among the 32% of respondents who own a smart speaker, only 35% use voice interfaces for basic research and only 30% use them for shopping.

Improving the use cases, or “skills,” of voice assistants will be critical for platform vendors to increase the use of these devices for complex tasks and to elevate smart speakers from smart radios and novelties to gleaming data gems. TBR expects this to be the major battleground between voice assistant and smart speaker providers moving forward, now that the form factor is relatively proven. TBR believes Google has a slight advantage due to its heritage in data mining behind the façade of services as well as its Android and Chrome cross-platform tie-ins (a lot of relevant user data, such as contacts, schedules and often email, already resides with Google). Amazon is no slouch either, given its investment spend, growing media empire and robust e-commerce platform, which Google lacks. Apple could be a dark horse; however, Siri still lags on an artificial intelligence (AI) basis, and the HomePod’s pricing makes it an unlikely easy gift.

The next frontier for all of these platform providers is in the commercial space, an area we may see Microsoft put much of its effort into while leaving the consumer space for better-suited peers. In fact, collaboration between Microsoft and Amazon on voice and smart speakers may confirm this. Using voice assistants and smart speakers to query analytics or gain business insights or employing them as a “smart secretary” in conference rooms are areas TBR sees as avenues for commercial expansion. TBR has seen slightly different approaches from Amazon and Google in the commercial space. Amazon, likely with Microsoft support, focuses on the office with Alexa for Business, while Google seems to be positioning its voice AI and smart speaker technology to serve as an interface for a business’s customers.

However, as with the consumer space, the use case must be proved, the skills must be ironed out, and existing commercial infrastructure must be modified to support voice assistants and smart speakers. And despite furious investment in these possibilities by the major platform players, TBR doesn’t expect to see Alexa widely adopted in the boardroom for at least another two to three years. For now, I believe smart speakers will continue to find their way into homes as a novelty or curiosity for tech-excited people and early adopters, contributing to slow but steady growth, or as an easy, cost-effective tech-based gift, driving additional bursts of increased unit sales during the holidays.

*Voice activity data includes devices that are not smart speakers, such as smartphones.

2018 5G Americas Analyst Forum

5G will provide network efficiencies for telcos as they anticipate next-generation use cases

Given the introduction of Verizon’s (NYSE: VZ) 5G Home fixed wireless service in October, as well as the upcoming launch of AT&T’s and T-Mobile’s mobile 5G networks by the end of 2018, the 5G era is edging closer to reality after years of industry speculation regarding the technology’s capabilities. Similar to prior network eras, such as the transition from 3G to LTE, the 5G era will be a gradual evolution of existing network capabilities and will not immediately yield its full benefits or dramatically alter the global wireless market during its inception.

A resounding theme at the 2018 5G Americas Analyst Forum was that the 5G era will essentially be “more of the same” initially. LTE will remain the predominant source of connectivity for most wireless subscribers in the Americas over the next several years until 5G coverage becomes nationwide and customers transition to 5G-capable devices. The accelerated speeds offered by LTE-Advanced services, as well as the cost savings offered by IoT network technologies such as Narrowband IoT (NB-IoT) and LTE-M, are currently more than sufficient to support the demands of most consumers and enterprises.

The wireless industry is anticipating 5G will foster IoT innovations in areas including connected car, healthcare, smart cities and augmented reality (AR)/virtual reality (VR). Though advanced IoT use cases that require the precision promised by 5G, such as remote surgery, are being explored, many of these services will not become commercially available until the mid-2020s at the earliest. Additionally, solutions like remote surgery and V2X automotive services will be burdened by significant regulatory challenges as ensuring 100% network reliability and ultra-low latency will be essential to prevent hazardous outcomes.

Although the end-user benefits of 5G will initially be limited, investments in 5G will ultimately be viable due to the network efficiencies operators will gain from the technology. 5G, which is expected to provide between four- and 10-times greater efficiency on a cost-per-gigabyte basis compared to LTE, will enable operators to more cost-effectively add network capacity to support the prevalence of unlimited data plans as well as continued connected device additions. Offering 5G services will also be essential for operators to remain competitive against their rivals as the marketing of accelerated 5G speeds will help to attract subscribers. Lastly, the deployment of 5G networks will prepare operators to support 5G-dependent use cases when they do come to fruition and spur customer demand.
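To put that efficiency range in concrete terms, the brief sketch below applies the four- to 10-times cost-per-gigabyte figures cited above to a hypothetical baseline; the LTE cost used here is an assumption for illustration, not an operator figure.

```python
# Illustrative only: apply the 4x-10x cost-per-gigabyte efficiency range
# cited above to a hypothetical LTE baseline cost. The baseline value is
# an assumption, not an operator's actual cost.

lte_cost_per_gb = 1.00  # hypothetical LTE cost per gigabyte, in dollars

for efficiency in (4, 10):
    five_g_cost = lte_cost_per_gb / efficiency
    print(f"At {efficiency}x efficiency: ~${five_g_cost:.2f}/GB on 5G vs ${lte_cost_per_gb:.2f}/GB on LTE")
```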

Around 70 representatives from well-known operators and vendors attended the annual 5G Americas event to talk with more than 70 industry analysts about the state of wireless communications in North America and Latin America as well as to discuss challenges and opportunities presented by the rapid development of the mobile ecosystem.

The event kicked off with a presentation from T-Mobile (Nasdaq: TMUS) CTO Neville Ray regarding 5G leadership in the Americas. He discussed topics including projected use cases, the importance of 5G to the U.S. economy, the Americas’ position in the global 5G market, and the different initial approaches U.S. operators are taking to 5G. A panel of network and technology executives from operators including AT&T (NYSE: T), Sprint (NYSE: S), T-Mobile, Telefonica (NYSE: TEF), Cable & Wireless and Shaw (NYSE: SJR) provided additional insights into 5G evolution and activity around 5G by each respective operator.

Day 2 began with panel sessions featuring leaders from top telecom vendors, including Ericsson (Nasdaq: ERIC), Cisco (Nasdaq: CSCO), Nokia (NYSE: NOK), Samsung, Intel (Nasdaq: INTC), Qualcomm (Nasdaq: QCOM) and Commscope (Nasdaq: COMM), to discuss areas such as 5G regulatory challenges, 5G network and technology deployments, and potential 5G go-to-market strategies and use cases. Following these panel sessions, the remainder of the event offered analysts the opportunity to participate in a choice of 34 roundtable discussions focused on key 5G topics, including Internet of Things (IoT), edge computing, artificial intelligence (AI), 5G network infrastructure and technologies, regulatory considerations, and 5G in the automotive industry.

Big Blue opens its arms, and its wallet, to Red Hat

Red Hat’s projected growth is enough to justify the hefty purchase price

On Oct. 28, IBM (NYSE: IBM) and Red Hat (NYSE: RHT) executives announced a proposed acquisition ― one that would be the industry’s third-largest should it gain approval. The deal, valued at $34 billion, would bring Red Hat into IBM’s hybrid cloud team in the Technology Services and Cloud Platforms (TS&CP) group, where IBM’s IaaS (formerly SoftLayer), PaaS (formerly Bluemix) and hybrid management capabilities reside.

While the sheer magnitude of the deal may surprise some, the underlying reasons do not. IBM’s cloud strategy was sorely due for a boost, and Red Hat has been looking for a potential buyer for quite some time. Stefanie Chiras, a 17-year IBM vet, joined Red Hat as the VP and general manager of the Red Hat Enterprise Linux (RHEL) business unit in July, likely to lead that group through the planned acquisition. The potential acquisition would also be aided by portfolio synergies around Linux on IBM hardware and Kubernetes. Additionally, IBM is pervasive in the large enterprise market while much of Red Hat’s revenue is channel-led.

What’s most important is that IBM listened to its stakeholders and the broader market, realizing that while its cloud business was growing consistently at around 20% to 25% year-to-year on a quarterly basis, that growth was not enough to materially move the needle or to make the company a more effective competitor in cloud. The company’s recognition that it should not always promote all-IBM solutions is a noteworthy shift. Though IBM has had technology partnerships for some time, there was always the underlying perception that it would push its own solutions ahead of others, regardless of customer needs. Its recent and ongoing focus on hybrid IT enablement has changed this, and now bringing on an open-source company could change the game for IBM.

Sticker shock fades once you factor in the rest of the numbers

Historically, initial public offerings (IPOs) and sales of more traditional technology and software companies have been valued at around 5x their annual revenue. However, in recent years, as more cloud-native companies with subscription-based business models go public or get acquired, this multiple has steadily shifted upward. As a rather extreme example, Cisco (Nasdaq: CSCO) bought AppDynamics for $3.7 billion, a valuation of nearly 16x AppDynamics’ annual revenue, even though in the week prior to the purchase AppDynamics had been valued at $1.9 billion on annual revenue of approximately $220 million as the company readied for its IPO.

Much of the speculation around this monstrous deal relates to how IBM can and will fund such a hefty purchase. To put the $34 billion figure into perspective, Red Hat’s trailing 12-month revenue for the four quarters ended Aug. 31, 2018, was just shy of $3.1 billion, meaning the deal is valued at roughly 11x Red Hat’s annual revenue. If Red Hat were to stay on its double-digit growth trajectory*, its revenue and operating income would be projected to more than double by the close of 2021, benefiting from access to IBM’s vast enterprise customer base.
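For readers who want to trace the arithmetic, the short sketch below recomputes the revenue multiples cited above and shows what a few double-digit growth rates would imply for Red Hat’s top line; the deal values and revenue figures come from the text, while the growth rates are illustrative assumptions rather than TBR projections.

```python
# Back-of-the-envelope revenue multiples for the deals discussed above.
# Deal values and revenue figures are taken from the text; the growth rates
# used in the projection are illustrative assumptions only.

def revenue_multiple(deal_value_b: float, annual_revenue_b: float) -> float:
    """Deal value divided by annual revenue, both expressed in $B."""
    return deal_value_b / annual_revenue_b

print(f"AppDynamics: ~{revenue_multiple(3.7, 0.22):.1f}x")  # ~16.8x on the approximate $220M figure
print(f"Red Hat:     ~{revenue_multiple(34.0, 3.1):.1f}x")  # ~11x, as cited above

# Hypothetical projection: compound Red Hat's ~$3.1B trailing revenue at a few
# double-digit rates through the close of 2021 (three full years). Doubling
# over that span would require a compound annual growth rate of roughly 26%.
for cagr in (0.15, 0.20, 0.25):
    projected = 3.1 * (1 + cagr) ** 3
    print(f"At {cagr:.0%} annual growth: ~${projected:.1f}B by late 2021")
```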

These projections help IBM justify the large purchase price. Additionally, the per-share purchase price was likely set at least a few weeks before the announcement, when more Red Hat shares were available and trading at a higher price. On Oct. 1, Red Hat was trading at $133 per share, compared with the $117 per share it was trading at on Oct. 26.

Synergies make the acquisition possible; success will come down to execution

Organizational structure

The proposed acquisition poses significant integration challenges for IBM if approved. Though the company has successfully integrated software acquisitions in the past, it has yet to make a purchase this large, and this would be its first major software acquisition since it reorganized a couple of years ago, distributing software subgroups across its various business units and eliminating a dedicated software business unit. Additionally, none of its previously acquired companies have run as stand-alone units, as Red Hat is expected to do.

Red Hat’s proposed position as a stand-alone unit in TS&CP could have varying results. IBM Services’ culture and cumbersome processes could stifle Red Hat’s software-led mindset, culture and innovation. Alternatively, Red Hat’s products could be pulled through an unprecedented number of Services engagements, a scale of opportunity the company has not been able to reach on its own due to its much smaller size. This second scenario, however, would only be possible if IBM Services and consultants can differentiate themselves from Red Hat’s existing systems integration partners to maintain IBM’s status as the largest services provider around Red Hat and Linux. Whether those partnerships will remain at today’s strategic levels, or continue at all, remains unclear.

Red Hat CEO Jim Whitehurst would report to IBM CEO Ginni Rometty. While it is very likely he would stay with IBM for the year or so required and then retire, there is the possibility, and this is pure speculation, that IBM could be priming him to be a contender for the position of IBM CEO should Rometty look to retire soon.

Go to market

Undoubtedly, IBM has set its sights on reaching more midmarket customers, as its large enterprise customer base is slower and more resistant to moving to cloud. Red Hat’s prevalence in the midmarket will surely help open the doors to cross-selling IBM solutions and services to these companies, provided pricing is adjusted for smaller organizations. Additionally, IBM will gain access to a Red Hat developer community of more than 8 million. On the other side, Red Hat can also bring its solutions upmarket to IBM’s largest enterprise customers.

Much of IBM’s focus as of late has been on helping customers link on- and off-premises environments and share data across truly hybrid environments. Its large Services arm and broad portfolio have helped offset some legacy software and services revenue erosion in past quarters. While Linux is already relatively pervasive across the market and OpenStack has yet to garner significant demand or traction, Kubernetes is the open-source solution of choice at the moment and will remain so in coming quarters. IBM continues to update its IBM Cloud Private portfolio, which is centered on Kubernetes and can also run on OpenShift, presenting an area of immediate portfolio synergy between the two companies. The incorporation of additional open-source technologies into the mix, as well as Red Hat’s interoperability with third-party cloud and software solutions, only helps position IBM as an increasingly technology-agnostic hybrid enabler.

Peer implications

Despite the size of the acquisition and the attention it is garnering, IBM’s cloud competitors will not face substantially altered challenges should the deal go through. Amazon Web Services (AWS) and Microsoft (Nasdaq: MSFT) will continue to dominate the public cloud IaaS and PaaS market. Both have increasingly embraced open-source technology integrations within their proprietary ecosystems, which only enables them to grow larger as they can also serve RHEL customers.

We believe that if this acquisition were to materially impact any single company, it could be Google (Nasdaq: GOOGL) and/or Oracle (NYSE: ORCL). Google struggles to compete at scale with AWS and Microsoft and does not yet have the same permission to play in the large enterprise segment. With IBM, Red Hat would gain that permission almost immediately. Oracle’s Linux offerings are based on RHEL, which could complicate a competitive relationship between IBM and Oracle. While Oracle may have more pressing areas to focus on and invest in, such as Kubernetes in tandem with its peers, the company could, should it choose not to work closely with IBM when Red Hat is integrated, look to acquire another Red Hat-like company with expertise and capabilities in open source and Linux in particular, such as Canonical or SUSE, which was just sold by Micro Focus (NYSE: MFGP) to private equity firm EQT for $2.5 billion.

Has Red Hat’s Jim Whitehurst set himself up to succeed CEO Rometty at IBM?

While it is very likely he will stay with IBM for the year or so required and then retire, there is the possibility, and this is pure speculation, that IBM could be priming him to be a contender for the position of IBM CEO should Ginni look to retire soon. — Cassandra Mooshian, Senior Analyst

What Is 5G Technology And When Will It Arrive?

“With undergoing development, 5G technology is expected to get the show on the road by 2020. That can seem to be a long wait, but it still remains an ambitious timeline. According to Technology Business Research Inc., network operators are expected to spend billions of dollars on 5G capital expenses by 2030.”

Whether by R&D or acquisition, money can’t buy SaaS performance

The SaaS market appears to provide an easy opportunity for vendors to garner significant revenue and growth. SaaS is the largest segment of the cloud market — bigger than the IaaS space, which draws so much attention due to leaders Amazon Web Services and Microsoft Azure. The SaaS market is also much more fragmented, littered with thousands of providers, which would seem to imply that consolidation is a foregone conclusion. However, even for three of the largest SaaS providers, the investment required to compete in the space remains high, and even spending billions of dollars on R&D and acquisitions does not guarantee success.

This is not to say that these billions of investment dollars are all for naught. Despite being around for more than a decade, the SaaS space remains quite immature. Customers are still figuring out which of their applications can be moved to cloud delivery, and how, when and with which vendors those moves can take place. Until a longer track record exists for making these decisions and vendors consolidate disparate offerings into packages more closely resembling integrated solutions, the market will remain very much in flux. It is not functionality that is holding back the adoption of hybrid solutions; it is the difficulty of integrating and managing multicloud and multivendor solutions. In the meantime, vendors such as Oracle, SAP and Workday have no choice but to continue accelerating their investments. Their dollars will not buy SaaS performance in the short term, but continued investment is the only way these vendors keep a shot at leadership as the SaaS space becomes more predictable.

Oracle is in too far to turn back now

By virtue of its long legacy across a diverse field of software, Oracle finds itself in a unique position with cloud solutions. Aside from databases, Oracle is a company built on acquisitions, and that approach holds true for its expansion in cloud. After first downplaying the overall concept of cloud delivery, even while acquiring cloud assets, the vendor quickly shifted its messaging and doubled down on both internally and externally driven innovation. The results from a dollar perspective are laid out in Figure 1, representing a steady and significant stream of acquisitions focused on building out mainly SaaS offerings, alongside R&D that funds cloud solutions across the spectrum of IaaS, PaaS and SaaS. The scale of Oracle’s investments is undeniable, but the returns are far from overwhelming. The downfall of Oracle’s SaaS investment plans played out quite publicly, as the company first bet it would become the first SaaS/PaaS vendor to achieve a $10 billion run rate, then changed its reporting structure midyear to blur the actual results.

Oracle maintains worse performance than SAP and Workday for the return on its acquisition and R&D investments, spending more on these investments than the company generated in total cloud revenue during 2016, 2017 and 2018 (estimated). That does not mean Oracle is without successes, however, as the purchase of NetSuite, reflected in Oracle’s large acquisition expense in 2016, contributes to revenue growth and complements the organic development of Fusion Cloud ERP. A lot of Oracle’s struggles in cloud come from organic initiatives, such as its PaaS and IaaS services, which have not taken root with customers despite aggressive sales tactics. Those categories of services account for a significant portion of Oracle’s R&D investments over the past three years, but still generate relatively small revenue streams for the vendor. Nevertheless, despite the investment outweighing the associated revenue contributions, we believe Oracle will and should remain committed to its current cloud strategy. It may not pay off in the near term, but these investments are the best shot for Oracle to execute a longer-term cloud turnaround.

SAP is making all the right financial decisions, but still falling short

Though still acquisitive, SAP’s cloud strategy has been more focused on internal innovation than Oracle’s. A more even mix of R&D and acquisition investments, combined with an earlier commitment to cloud delivery, is producing a better rate of revenue return for SAP, as shown in the graph below. SAP ranks fairly close to Oracle in total cloud revenue but is achieving those run rates with significantly lower R&D and acquisition expenses. TBR estimates SAP’s combined R&D and acquisition investments for cloud were $6 billion over the past three years, compared with more than $21 billion for Oracle over the same period.

Despite the comparatively positive financial returns for SAP in cloud, the vendor is still struggling with multiple elements of its portfolio. After allowing Salesforce to capitalize on the shift of front-office apps to cloud, SAP recently started circling back to carve out territory in that domain. Through multiple acquisitions in the customer experience space and new messaging, SAP is making a concerted push, but it faces an uphill battle in winning more market share. Furthermore, SAP’s effort with SAP Business Suite 4 HANA is a long-term one, and in the meantime, assets such as SAP Cloud Platform are underrepresented in the platform space. The net result is that SAP has managed its investments well and grown cloud revenue but is still not achieving the scale that would ensure the vendor’s leadership in the SaaS space.

Workday is opening its wallet after trying the DIY route

Historically, Workday has relied on internal R&D as the primary means of advancing its cloud strategy, more so than Oracle and SAP have. That certainly does not mean the company was shy about entering new markets or delivering new products, as Workday has rapidly increased its activities in both regards over the past three years. The addition of student, financial and now platform offerings illustrates how broadly Workday has expanded its portfolio beyond core human capital management (HCM) offerings. Part of Workday’s reliance on R&D comes from its core focus on a “single line of code,” which provides simplicity and consistency in the vendor’s offerings to customers. Integrating multiple offerings and services is part of the challenge with acquisitions, which Oracle and SAP know all too well. Workday’s past acquisitions have always been functionality-focused and intermittent. The company’s three acquisitions in 2Q18, including its $1.55 billion purchase of Adaptive Insights, are a departure from that strategy but are likely not indicative of broader plans to acquire more fully baked applications. Workday Cloud Platform will allow Workday to leverage partner-developed, inherently integrated technology to expand portfolio breadth.

The assumption that Workday’s acquisition-lite approach to investment would be advantageous is not necessarily borne out. Even without significant acquisitions, Workday’s investment ratio, calculated as (R&D + acquisitions) divided by cloud revenue, is higher than SAP’s for the three years from 2016 to 2018. Workday had a lower ratio than Oracle, which is spending aggressively on acquisitions, but Workday ranked above SAP in internal R&D investment as a proportion of revenue. Additionally, Workday’s streamlined “single line of code” approach is not guaranteeing success in new product categories. HCM revenue growth remains strong, but Workday’s newer expansions in Financials and Student are not seeing accelerated early revenue growth. The new offerings are certainly growing, but not at the rate one would expect given the strong HCM base into which they can be cross-sold. The large acquisition of Adaptive Insights could be part of a change in strategy to add inorganic revenue and could lead to greater cross-selling possibilities for the Financials business.
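As a quick illustration of that ratio, the sketch below implements (R&D + acquisitions) divided by cloud revenue; the sample figures are placeholders for illustration only, not TBR estimates for any particular vendor.

```python
# Minimal sketch of the investment ratio discussed above:
#   ratio = (R&D spend + acquisition spend) / cloud revenue
# All inputs cover the same period (e.g., 2016-2018) and are in $B.
# The sample figures below are placeholders, not vendor estimates.

def investment_ratio(rd_spend_b: float, acq_spend_b: float, cloud_revenue_b: float) -> float:
    """Cumulative (R&D + acquisitions) divided by cumulative cloud revenue."""
    return (rd_spend_b + acq_spend_b) / cloud_revenue_b

# A vendor that spent $4B on R&D and $2B on acquisitions against $5B of cloud
# revenue carries a ratio of 1.2: it invested $1.20 for every $1.00 of cloud
# revenue generated over the period.
print(investment_ratio(4.0, 2.0, 5.0))  # 1.2
```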

People, methodology and trust: PwC’s Tokyo Experience Center

Uncertainty, globalization and trust: How PwC suits the Japanese market

In describing PwC’s presence in Japan, firm leaders said consulting professionals make up 2,500 of the 7,300 total professionals at the PwC Japan firm, with the practice’s revenues growing more than 20% year-to-year.

Echoing sentiments expressed by PwC consulting leaders last month in New York City, the Japan-based team said systems integration (SI) work, which currently accounts for approximately 20% of consulting revenues, would expand in coming years as the BXT model pulls through long-tail SI opportunities. Speaking more broadly about the Japanese market, PwC’s leaders noted that their own research revealed that Japanese companies believe the U.S. and China matter most with respect to overall growth, with the U.S. economy increasingly more important to Japanese companies than China’s economy. In addition, while global executives have cited overregulation, terrorism and geopolitical uncertainty as the top three threats to growth, Japanese executives are worried most about the availability of key skills, especially in digital and emerging technologies.

Further rounding out the landscape, PwC’s Japan-based leaders said local companies have expressed a renewed interest in overseas M&A opportunities, in part due to saturation of the Japanese market. PwC leaders added that previous “misconduct” by acquired companies and overseas subsidiaries makes some Japanese companies nervous, causing them to exercise caution and restraint when considering potential acquisitions. Even after folding in cybersecurity issues and overall political and economic risk, plus the costs associated with post-merger integration, the M&A picture appears positive, but quietly so. Within this market environment, PwC’s local leaders, including Susumu Adachi, Consulting CEO (Japan); Yukinori Morishita, Group Markets leader; and Nobuaki Otake, Business Transformation lead partner, repeated the message that PwC’s expanding role in Japan revolved around trust—a familiar refrain from previous PwC Experience Center visits and analyst events in Miami, New York, Shanghai, Toronto, and Frankfurt, Germany.

On Oct. 3, PwC’s Tokyo Experience Center hosted its first-ever Analyst Day in Japan, marking a significant expansion of the firm’s BXT approach across the globe. Leading the event, Koichiro Kimura, PwC’s Japan group chairman and territory senior partner, outlined the firm’s growth and strategy in Japan as well as initiatives launched by both the Experience Center and the firm’s Data & Analytics (D&A) practice. PwC leaders and Japan-based clients rounded out the event with detailed examples of the firm’s relationships and work across multiple offerings, including cybersecurity, business process reengineering, artificial intelligence and change management.

Customer preferences are forming around hybrid and shifting around open source as vendors focus on acquisitions

Prebuilt devices are a ray of clarity amid the fogginess of hybrid

Hybrid can be a difficult thing to define in cloud computing. The term “hybrid” is overused by vendors but underused by customers, causing general confusion over its definition and a shortage of solid examples of hybrid solutions. One area of the market that cuts through this confusion is hybrid cloud integrated systems. These are physical devices (appliances) that are designed to integrate with public cloud services and can be deployed in customers’ own data centers. The fact that customers can physically touch the box while it also integrates with external cloud services makes integrated systems one of the easiest and most obvious hybrid scenarios.

Examples of integrated systems solutions include Azure Stack from Microsoft and its hardware partners and Cloud at Customer from Oracle. While adoption and usage of these hybrid cloud solutions remain limited, the trend is picking up momentum and is prompting vendors such as Amazon and Google to move closer to competing in the space, particularly as customer demand from heavily regulated industries favors local versions of vendor-hosted cloud infrastructure. For example, Amazon Web Services (AWS) and Microsoft are the two front-runners in the race to win the U.S. Department of Defense’s Joint Enterprise Defense Infrastructure (JEDI) contract. While AWS has largely been seen as the overall favorite, its Snowball Edge offering does not meet the contract’s bidirectional synchronization requirement for tactical edge devices the way Azure Stack does.

Kubernetes season is in full swing as OpenStack falters

For large enterprise customers, open-source technologies have garnered much interest as part of their cloud strategies. The ability to utilize solutions that provide the same backbone as large cloud providers while maintaining the control associated with open source has been an attractive value proposition for those with the resources to implement and manage them. However, predicting which technologies will be the most commonly adopted has been more challenging, creating uncertainty around frameworks such as OpenStack, which has yet to garner significant momentum in the market.

Compounding OpenStack’s hurdles is the ongoing explosion in growth among public cloud IaaS front-runners AWS, Google, Microsoft and Alibaba. OpenStack founders and former OpenStack pure plays are making notable shifts toward Kubernetes. The difference, though, is that Canonical and Red Hat are still holding onto OpenStack, while others, such as Rackspace, Hewlett Packard Enterprise, IBM and Mirantis, de-emphasize it.

Customers increasingly understand the benefits of containers and container orchestration platforms and embrace the portability and interoperability they provide. According to a recent interview done as part of TBR’s Cloud Customer Research Program, a retail SVP, CIO and CTO said, “You need to make sure there are escape clauses in your contracts in case you need to get out. Once you’re in it, you’re pretty much married, and that divorce is really bad. That’s the reason we have a container. … Because if it starts to get too expensive, we want to pull it off quickly.”

This is just one example of the immediate enterprise benefits of container and container orchestration platforms, which can change the game for enterprises in terms of their cloud adoption road maps and long-term cloud plans.

Hybridization is becoming even more widespread than customers realize

While pre-integrated devices are the most obvious examples of hybrid usage, the vast majority of activity is occurring in more subtle situations. This activity is driven by the desire among vendors to sell broader solutions and the desire among customers to implement services that integrate with existing and other new technologies. The good news for both sides of the market is that there are more capabilities than ever to put those more cohesive, integrated solutions in place.

Salesforce, whose solutions are commonly integrated into hybrid environments, has taken a notable step into the hybrid enablement space by acquiring MuleSoft. The acquisition, which closed on May 1 at the start of Salesforce’s FY2Q19, brings MuleSoft’s well-known integration Platform as a Service (iPaaS) solution and services into Salesforce’s arsenal. The implications for Salesforce, its customers and the market are vast, as the company can create connections between its applications and the variety of other cloud and legacy systems residing in customers’ environments. Salesforce quickly leveraged the iPaaS technology, bringing Salesforce Integration Cloud to market within the first few months of having MuleSoft on board, enabling customers to augment their Salesforce applications and derive greater insights from their non-Salesforce data.
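To make that integration work concrete, the sketch below shows the kind of hand-built, point-to-point syncing an iPaaS such as MuleSoft is meant to centralize and manage: pulling records from one system’s REST API and pushing them into another. The endpoints, tokens and field names are hypothetical placeholders, not Salesforce’s or MuleSoft’s actual APIs.

```python
# Hypothetical point-to-point sync between two systems' REST APIs, the kind
# of hand-built integration an iPaaS is designed to centralize and manage.
# The URLs, token and field names below are placeholders, not real APIs.
import requests

SOURCE_URL = "https://legacy-erp.example.com/api/orders"    # hypothetical
TARGET_URL = "https://crm.example.com/api/opportunities"    # hypothetical
HEADERS = {"Authorization": "Bearer <token>"}               # placeholder

def sync_orders() -> int:
    """Copy orders from the source system into the target system."""
    orders = requests.get(SOURCE_URL, headers=HEADERS, timeout=30).json()
    synced = 0
    for order in orders:
        payload = {
            "external_id": order["id"],   # hypothetical field mapping
            "amount": order["total"],
            "status": order["status"],
        }
        response = requests.post(TARGET_URL, json=payload, headers=HEADERS, timeout=30)
        response.raise_for_status()
        synced += 1
    return synced

if __name__ == "__main__":
    print(f"Synced {sync_orders()} orders")
```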

Webscale competition increases among carrier cloud providers

Combined Cloud as a Service revenue for telecom operators in Technology Business Research Inc.’s (TBR) 2Q18 Carrier Cloud Benchmark rose 26.3% year-to-year in 2Q18 due to strategic acquisitions and alliances, investments in new data centers, and portfolio expansion in growth segments such as SaaS and hybrid cloud. All benchmarked companies sustained year-to-year Cloud as a Service revenue growth in 2Q18 as significant opportunity remains for carriers to target businesses seeking greater cost savings, scalability and efficiency by migrating traditional infrastructure and applications to the cloud.

Certain Asia- and Europe-based operators including China Telecom, Telefonica and Orange accelerated Cloud as a Service revenue growth in 2Q18 as the companies benefit from data sovereignty laws, such as General Data Protection Regulation (GDPR), requiring cloud data to be stored in local data centers, which is slowing the growth momentum of U.S.-based webscale providers in these regions. Pressure from U.S.-based webscale providers will continue to increase over the next five years in Asia and Europe, however, as they ramp up data center investments and partner with local data center providers to gain traction in these regions.

TBR’s Telecom Practice provides semiannual analysis of Cloud as a Service revenue in key segment splits and regions for the top global carrier cloud operators in its Carrier Cloud Benchmark. Operators covered include Bharti Airtel, British Telecom, CenturyLink, China Telecom, Deutsche Telekom, Korea Telecom, NTT, Orange, Singtel, Telefonica and Vodafone.

ICO as a ‘medicine show’: EY finds abysmal performance in wild west of initial coin offerings

Last December, EY Global Blockchain Leader Paul Brody recognized the breakout market for initial coin offerings (ICOs) and launched a longitudinal study centered on the class of 2017 companies fueled by this new way of raising money for software startups. One year later, as detailed in EY’s report published today, market valuations for the top 10 ICOs were off 55% — abysmal performance by any standard. Buried in the bad news for almost all the companies, one can find a few bits of success, particularly among companies providing blockchain infrastructure. The incredibly poor performance around incubation makes a strong case, to use a “Deadwood” metaphor, that snake oil salesmen made up most of those 2017ers. As this year comes to a close, only around one-quarter of the initial ICO-backed companies have a product in the market, further evidence the breakout included a number of outright frauds. In addition, of the 25 companies that had products, seven devalued the use of utility tokens by allowing payment in fiat currency, a concession to enterprises’ persistent reluctance to conduct business transactions in anything but hard currencies. Curiously, paying in tokens, according to Brody in a discussion with TBR prior to today’s announcement, came across as only the second-biggest obstacle to commercial adoption, with the first being the desire for transaction privacy — a desire pure public blockchains cannot satisfy. In EY’s previous report on ICOs, issued last December, the firm anticipated the third-greatest objection, concerns over full regulatory compliance, an insight that tracks closely with EY’s tax and audit credentials.

Today’s report includes a few nuggets revealing the depth of EY’s study:

  • “Companies that have made meaningful progress toward working products only increased by 13% in 2018. 71% have no offering in the market at all. Typically, within one year of a traditional venture-backed software startup, you would expect to see a significantly higher percentage of the companies with a functional early stage product.”
  • “Seven out of 25 reviewed projects accept other currencies, rendering utility tokens less valuable. Some projects have altogether dropped their utility tokens to focus on functionality. To become a means of payment, utility tokens have to be stable. If it remains stable, the token is of little interest to speculative investors.”
  • “Globally, sources of funding will likely shift away from retail investors toward entities that can understand and manage the downside risks, such as venture capital and digital asset-focused investment funds.”

Will next year be better? The blockchain infrastructure companies will likely be surpassed by a second wave of ICO-funded companies, with most of these taking an asset-backed approach to token issuance, essentially creating a product that is enterprise-ready at a time when buyers are not convinced of the benefits of placing all their assets on the public blockchain domain. This then raises the question: Do new wave ICO-funded companies need to rip pages from Ethereum’s playbook or simply play within its orbit? Ethereum is not a one-size-fits-all solution, but it certainly provides a solid foundation for many to learn from, especially around its “smart” contract functionality. Further advancing along some of the must-do steps EY pointed out in its December 2017 report, this second wave will more adequately address the need for clear justifications for blockchains and tokens; an ICO process more closely aligned to the initial public offering (IPO) process; enhanced security; and something close to legal compliance, or regulators will simply begin substantial enforcement. In short, privacy trumps transactability.

The regulatory aspect piques my interest, in part because of the know-your-customer (KYC) aspects of post-ICO-linked financial transactions and recent efforts of EY, among others, to better incorporate emerging technologies into anti-money-laundering and KYC operations.

In this wild west, with its unregulated moral hazard, where does EY fit in?

My initial thoughts had the consultancy as the “Deadwood” preacher, known to all and trusted, but neither the law nor the bank. My colleagues convinced me EY will be more like the General Store, providing certified, trustworthy services and goods, helping clients mine for gold without shortcuts and faulty equipment that bring down the whole operation. Now imagine artificial-intelligence-enhanced, blockchain-powered resupply brought into Deadwood.