East meets West: A comparative tale of two e-commerce giants placing big bets on the cloud

Alibaba Cloud, AWS focus on build-outs of global footprints via infrastructure investments in 5G and expansion of data center and edge locations

While the growth of the two globally dispersed e-commerce giants Alibaba and Amazon is largely fueled by retail, both businesses have shown marked dedication to growing their respective cloud empires, focusing on infrastructure to fuel global expansion and investing in augmenting their portfolios. Cloud has become the backbone of each business as the two compete on the global stage to become leaders in digital transformation (DT).

Alibaba Group’s profit margins, fueled by its B2B operating model, have enabled a hefty investment of $28 billion dedicated to the cloud business. As the world was gripped by the initial effects of the COVID-19 pandemic back in April, the parent company announced the allocation of the sum, which stands as a massive proclamation of the company’s dedication to cloud and related technologies as the core drivers toward the enablement of DT. The sweeping investment, coupled with new leadership and an expanding partnership strategy, solidifies Alibaba’s intent to position its cloud business as a viable contender against AWS, especially in APAC. Alibaba is clearly placing a large majority of its bets on cloud and the future of DT, as the investment equates to 40% of its total 2019 revenue and is 5.7x the revenue of Alibaba Cloud.

The investment will have a profound impact on Alibaba Cloud’s ability to execute on strategies around DT, infrastructure build-out and R&D, and it came at a time when the world could not have been in greater need of capabilities such as increased bandwidth and enterprise and SME support. The medium-term focus is a multifaceted push to gain scale globally, attract new customers and expand wallet share with existing customers. Much of this growth will be propelled by the expansion of Alibaba Cloud’s infrastructure backbone through the build-out of data centers and investment in 5G across EMEA and APAC.

Since solidifying its dominance in China and garnering competitive positioning on the global stage, Alibaba has been frequently referred to as the “Amazon of China.” Both companies have anchored their businesses as e-commerce platforms and have demonstrated parallel growth trajectories, becoming mainstays in the lives and businesses of customers globally. The uniqueness of their respective journeys, which have been significantly shaped by their foundations as e-commerce giants, does not overshadow the companies’ similar strategies. Over time, Alibaba and Amazon have evolved rapidly into diversified companies with a distinct focus on technology and digital transformation. While the companies are in different phases of their growth, in terms of size and global footprint, and have different operating models, the investments in and focus on their respective cloud businesses to drive growth are evident when comparing their evolutions and forward-looking growth strategies.  

Unprecedented government support will help CSPs deploy 5G more quickly and broadly than originally anticipated

CSP spend on 5G infrastructure will scale faster and peak higher than originally anticipated due to the vast amount of support by governments in a range of countries, including but not limited to China, the U.S., the U.K., Japan, South Korea and Singapore. Due to this, typical historical deployment curves for cellular technologies will not apply to the 5G market, which is now expected to be widely deployed globally by the middle of this decade instead of in the later years of the decade. This pull forward and broadening of infrastructure investment are primarily due to attempts by leading countries to support their economies amid the COVID-19 crisis as well as to keep pace with China’s aggressive and broad investment initiative for competitive reasons. Over the past 12 months, 5G has become a highly political issue, and this unprecedented government involvement and funding are being justified on national security, economic competitiveness and public health grounds.

The 5G Telecom Market Forecast details 5G trends among the most influential market players, including both suppliers and operators. This research includes current-year market sizing and a five-year forecast by multiple 5G market segments and geographies, as well as examines growth drivers, top trends and leading market players.

IT services revenue retained its growth trajectory in 1Q20, but the negative effect from the pandemic will intensify in 2Q20

IT services trailing 12-month (TTM) revenue growth, at 1.5% in U.S. dollars (USD), was down 20 basis points sequentially and 170 basis points year-to-year in 1Q20 as the COVID-19 pandemic began to negatively affect vendors’ revenue growth during March. At every level of every organization, the pandemic forced massive changes in human resources management, pushing vendors to quickly reorganize service delivery to work-from-home models and proactively pursue similar activities with clients as they strive to keep operations running. While vendors are strengthening relationships with existing clients, the pandemic disrupted traditional sales motions, making attracting and landing new logos more difficult in an all-virtual environment, and challenging IT services vendors to develop novel ways to promote new offerings to clients. The pandemic substantially boosted demand for cloud and cybersecurity as all-remote working and delivery necessitated massive changes and brought in new risks.

The IT Services Vendor Benchmark details and compares the initiatives of, and tracks the revenue and performance of, the largest global IT services vendors. The report includes information on market leaders, vendor positioning, the IT services market outlook, key deals, acquisitions, alliances, new services and solutions, and personnel developments.

Mipsology’s Zebra looks like a winner

Mipsology is a 5-year-old company, based in France and California, with a differentiated product that solves a real problem for some customers. The company’s product, Zebra, is a deep learning compute engine for neural network inference. While these engines are not uncommon, Zebra unlocks a potentially important platform for inference using the field programmable gate array (FPGA). There are two parts to this story, which is one of the challenges Mipsology faces.

Inference — the phase where deep learning goes to work

Deep learning has two phases: training and inference. In training, the engine learns to do the task for which it is designed. In inference, the operational half of deep learning, the engine performs the task, such as identifying a picture or detecting a computer threat or fraudulent transaction. The training phase can be expensive, but once the engine is trained it performs the inference operation many times, so optimizing inference is critical for containing costs in using deep learning. Inference can be performed in the cloud, in data centers or at the edge. The edge, however, is where there is the greatest growth because the edge is where data is gathered, and the sooner that data can be analyzed and acted upon, the lower the cost in data transmission and storage.
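The economics described above, train once and infer many times, can be illustrated with a deliberately tiny sketch (all names and data here are hypothetical; real deep learning models are vastly larger, but the two phases play the same roles):

```python
# Toy single-weight "network" learning y = 2x + 1.
# Training (expensive, done once) fits the parameters;
# inference (cheap, done many times) is just a forward pass.

def train(samples, epochs=500, lr=0.1):
    """Training phase: fit weight and bias to the labeled examples."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = (w * x + b) - y   # prediction error
            w -= lr * err * x       # gradient step on the weight
            b -= lr * err           # gradient step on the bias
    return w, b

def infer(model, x):
    """Inference phase: apply the trained model to new input."""
    w, b = model
    return w * x + b

data = [(x, 2 * x + 1) for x in (0.0, 0.5, 1.0, 1.5, 2.0)]
model = train(data)                                   # done once
predictions = [infer(model, x) for x in (3.0, 4.0)]   # repeated cheaply
```

Because the trained model is reused for every prediction, any per-inference saving, for example from running the forward pass on an FPGA rather than a general-purpose processor, compounds across the life of the deployment.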

Specialized AI chips are hot, but the mature FPGA is a player too

For both training and inference, specialized processors are emerging that reduce the cost of using deep learning. The most popular deep learning processor is the graphics processing unit (GPU), principally Nvidia’s GPUs. GPUs rose to prominence because Nvidia, seeing the computational potential of its video cards, created a software platform, CUDA, that made it easy for developers and data scientists to use the company’s GPUs in deep learning applications. The GPU is better suited to training than inference, but Nvidia has been enhancing its GPUs’ inference capabilities. Other specialized processors for deep learning inference include Google’s Tensor Processing Unit (TPU) and FPGAs.

FPGAs have been around since the 1980s. They are chips that can be programmed so the desired tasks are implemented in electronic logic, allowing very efficient repetitive execution, which is ideal for some deep learning inference tasks. Mipsology lists several advantages of FPGAs over GPUs for inference, including a lower cost of implementation, a lower cost of ownership and greater durability. While FPGAs have been used in some implementations, including on Microsoft’s Azure platform, these chips have not received the attention that GPUs have.    

Zebra is where inference meets FPGAs

Mipsology’s Zebra compute engine makes it easy for deep learning developers to use FPGAs for inference. Zebra is a software package that provides the interface between the deep learning application and the FPGA, so that specialized FPGA developers do not have to be brought in to exploit the benefits of the processors. Zebra is analogous to Nvidia’s CUDA software; it removes a barrier to implementation.

Bringing together the puzzle pieces

FPGAs are mature and powerful potential solutions that lower the cost of inference, a key to expanding the role of deep learning. However, the programming of FPGAs is often a barrier to their adoption. Zebra is an enabling technology that lowers that barrier. In the world of specialized solutions based on broadly applicable technologies such as deep learning, there are opportunities for products and services to make it easier to assemble the pieces and lower the cost of development. Zebra is exploiting one of these opportunities.

Buoyed by Red Hat profits, IBM’s CEO sees ‘progress’ in shift to cloud and AI

“‘A year into becoming part of IBM, Red Hat has not disappointed and is a major component of the new and diversified life that has been breathed into the IBM portfolio,’ said Nicki Catchpole, senior analyst at TBR Cloud and Software.” — WRAL TechWire

COVID-19 is driving IBM, IT industry to deliver faster ‘edge’ computing

“The use cases for edge computing were already vast and varied prior to the pandemic. Self-driving cars, heart-monitoring devices, crop-sensing machinery and inventory-management sensors are examples that scratch the surface of how the low latency, bandwidth management and advanced analytics afforded by edge computing are valuable in a variety of industries.” — WRAL TechWire

Informatica acquires Compact Solutions to improve metadata management across the enterprise

Enterprises are developing hybrid IT environments at an increasing rate, leveraging cloud-based technologies to achieve flexibility and scalability while keeping workloads on premises that contain sensitive information or require low latency. This creates complex IT environments that need to be integrated, governed and made compliant with local regulations.

Informatica has positioned itself as a leading provider of data integration tools and metadata management offerings, the latter of which provide higher-level information about data, such as where it lives, how it moves and what the data set contains. This better enables IT departments to discover, inventory and organize their data sets across the IT landscape, and allows them to better serve business stakeholders that need easy access to relevant data. Informatica’s Enterprise Data Catalog (EDC) has already been able to connect metadata from sources like on-premises and cloud-based applications, databases, data lakes, data warehouses and cloud platforms. However, while EDC has been able to incorporate the majority of data sources, it was not an all-encompassing solution. The vendor recently filled EDC’s portfolio gaps with its acquisition of Compact Solutions, which brings the acquired company’s MetaDex technology into EDC. TBR spoke with Informatica’s Senior Director of Product Marketing Dharma Kuthanur, who noted that the acquisition will help Informatica “extract metadata and lineage from the most complex data sources across the enterprise as well as legacy sources (e.g., mainframes). Other sources include code and stored procedures that you would find inside databases and data warehouses like Teradata or IBM Db2 warehouse or Oracle warehouse — so really being able to parse static and dynamic code that’s inside the databases that transform the data — and often this used to be a black box, limiting full understanding of data across the enterprise.”

In particular, the integration of MetaDex capabilities into EDC further enhances the ability of customers to catalog and govern their data from a broader range of sources, including mainframes; multivendor extract, transform, load (ETL) tools; code; and statistical and business intelligence tools such as SAS. Adding these capabilities makes Informatica a one-stop shop for metadata connectivity and data lineage across the entire data landscape spanning hybrid IT environments. Providing customers with visibility and control over all of their data better positions Informatica as the foundation of customers’ AI, machine learning and analytics-related initiatives. In addition, this increases the value proposition of the company’s broader suite of data management products and solutions like Axon Data Governance.
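As a conceptual sketch only (the class and method names below are hypothetical, not Informatica’s actual API), a catalog that records both descriptive metadata and data-flow edges can answer end-to-end lineage questions such as “what ultimately feeds this dashboard?”:

```python
# Hypothetical illustration of metadata management and lineage:
# register assets with descriptive metadata, record flows between
# them, then trace every upstream source of a given asset.

class MetadataCatalog:
    def __init__(self):
        self.assets = {}    # asset name -> descriptive metadata
        self.lineage = []   # (source, target) data-flow edges

    def register(self, name, **metadata):
        self.assets[name] = metadata

    def record_flow(self, source, target):
        self.lineage.append((source, target))

    def upstream(self, name):
        """Return every asset that feeds `name`, directly or indirectly."""
        found = set()
        frontier = [name]
        while frontier:
            node = frontier.pop()
            for src, dst in self.lineage:
                if dst == node and src not in found:
                    found.add(src)
                    frontier.append(src)
        return found

catalog = MetadataCatalog()
catalog.register("mainframe.orders", location="on-premises", format="VSAM")
catalog.register("warehouse.orders", location="cloud", format="table")
catalog.register("sales_dashboard", location="cloud", format="report")
catalog.record_flow("mainframe.orders", "warehouse.orders")
catalog.record_flow("warehouse.orders", "sales_dashboard")
```

The value of closing portfolio gaps is visible even in this toy: lineage is only complete if every source, including legacy systems such as mainframes, appears in the catalog; a single unregistered hop turns the trace into a black box.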

MetaDex rounds out EDC’s capabilities, enabling end-to-end metadata connectivity and data lineage

Large vendors such as IBM, Oracle and SAP have competing metadata tools such as Watson Knowledge Catalog, Oracle Metadata Management and SAP Data Hub, respectively, which are mostly used by the vendors’ existing customer bases. These offerings do have some overlap with Informatica’s portfolio, but Kuthanur noted that “for pure stand-alone opportunities we don’t really [compete with] them because we have a much stronger set of capabilities.”

TBR believes Informatica’s strategy around enterprise data management separates it from the larger incumbent vendors like IBM, Oracle and SAP, due to the vendor-agnostic nature of Informatica’s portfolio. According to Kuthanur, “We position our catalog as a ‘catalog of catalogs’ because we can scan all types of data sources across the enterprise, across multiple clouds, including from some of these third-party catalogs like AWS Glue, and then combine that with metadata from across the rest of the enterprise.” There are smaller vendors with vendor-agnostic portfolios for enterprise data management such as Collibra and Alation, but these competitors have smaller global footprints and limited metadata connectivity and lineage capabilities compared to Informatica. Informatica’s acquisition of Compact Solutions also puts the vendor roughly 12 to 18 months ahead of Collibra and Alation in regard to portfolio innovation.

Logicalis’ local scale and investments in services transformation offerings position it well to withstand the COVID-19 headwinds in LATAM

Following one of the last in-person analyst events held in São Paulo, Brazil, just before COVID-19 took over our personal and business lives, TBR had a chance to reconnect virtually with Logicalis LATAM CEO Rodrigo Parreira and Logicalis LATAM Director of Strategy Eduardo Harada. Expanding on our discussion at the analyst event in February, Parreira confirmed many of the regional market trends are still in place, although some initiatives have been paused, with COVID-19 forcing buyers to reorient their budget priorities toward run-the-business awards.

A spike in demand around supporting remote work, implementation and management of collaboration tools, and security plays to the strengths of Logicalis’ value proposition, particularly within the infrastructure services domain. Parreira is even more optimistic about a resurgence of opportunities in 2021 as regional buyers begin to solicit services in areas such as automation and AI, creating increased opportunities around IoT and connected devices. As Parreira positioned it, “The crisis accelerated automation.” He also highlighted that more than 50% of Logicalis’ new bookings have been geared toward services, accelerating the company’s efforts to become an IT services leader.

TBR is not surprised to hear there is an uptick in demand for automation considering the technology’s potential to lower the total cost of ownership, which is of particular importance to highly price-sensitive buyers in the LATAM market. We believe vendors that have experience adopting and scaling automation tools to drive down their own costs, improve remote delivery and retain savings will see immediate rewards. However, Logicalis may face an uphill battle educating regional customers on the value of AI beyond cost optimization, as many continue to see AI as a threat to their jobs. The company, however, is well positioned to capitalize on the trust it has built over the past six decades of operating in the region.

TBR previously wrote, “While the company’s business consulting unit spearheads outcome-based pricing initiatives, we believe Logicalis could further accelerate its value proposition transformation if it approaches every opportunity with scale in mind from the beginning. To execute on such a strategy, the company would need to further build out its consulting and application services capabilities, with acquisitions in these domains highly likely.” COVID-19 will likely fuel market consolidation, including in the LATAM market. Both Parreira and Harada believe consolidation in the LATAM market will be even greater as the challenging macroeconomic conditions will drive many smaller vendors out of business. Acquiring for capabilities, not for scale, would deepen Logicalis’ value proposition in both existing and emerging domains, including AI and SaaS. Larger global peers, though, including the Big Four and multinational corporations, are also scouting for price-competitive targets, possibly pushing Logicalis to take more aggressive action sooner.

Intelligent supply chain and ports: Atos on the present and future of digital transformation in port operations

Applying emerging technologies to supply chains  

In a wide-ranging discussion, Atos Technology and Innovation Lead Erwin Dijkstra and his colleague Bas Stroeken, Scrum Master & Pre-sales Consultant – Intelligent Supply Chain, shared a few key insights into their company’s strategy on integrating emerging technologies, such as AI, blockchain and IoT, into maritime port ecosystems, highlighting Atos’ current clients and use cases. Noting that Atos’ client base includes airports as well as traditional supply chain solutions buyers (such as manufacturers), Dijkstra and Stroeken described Atos’ differentiation as its ability to integrate across an entire enterprise and ecosystem, optimize around delivery times, and build a platform for intelligent supply chain management, which Atos then manages as a service to the client.

A critical factor for Atos’ clients, according to Dijkstra and Stroeken, has been the company’s in-depth examination of actors and roles within an enterprise and how those actors will engage with the platform. Various roles require different information and options in the event of an out-of-plan event, making the ideal platform more than simply a collection of data points and alerts. As Stroeken explained, real-time insights are meaningless if everything is going according to plan (think Homer Simpson working at the nuclear power plant — all good, until it is not). When something deviates from expectations, multiple actors need to be alerted, informed and given options for remediation. With multiple actors involved, real-time information becomes critical as one person’s decision nearly always impacts options or needed actions for others in the ecosystem.

Bringing the discussion back to the broader enterprise level, Stroeken made two observations that resonated with TBR. First, professionals tasked with managing supply chains within many enterprises are not deeply experienced in AI, which necessitates Atos acting as the bridge between the technology and the humans who need to understand it, deploy it and benefit from it. Second, as Stroeken said, “Collaboration begins with the proper sharing of data,” which may be a perfect mantra for digital transformation and emerging technologies.

Atos provided two additional use cases, both tied to port operations, specifically customs, an area in which Atos has expertise. In the first, natural language processing and AI contribute to understanding the text in customs forms, improving and expediting the classification process. In more colorful terms, Dijkstra explained how a drone could be classified as a toy, a military use item, or a camera, all with different tax implications, creating a need for assistance among customs agents to get the classifications correct. In a second use case, Atos helps cargo screeners operate more efficiently and with fewer random checks by scanning containers with X-ray machines and using AI to match the images to the manifests. In both cases, Atos operates as the integrator, bringing together various emerging technologies and providing the platform for clients’ continued operations.

TBR and Atos also discussed blockchain as a tool across the maritime shipping and supply chain ecosystems. While the well-known benefits of increased transparency and a more level playing field appeal to enterprises across the shipping world, including manufacturers, ports and shipping operators, Atos’ role primarily comes through facilitating adoption and overcoming the human barriers, such as lack of trust in the technology and uncertainty around data-sharing (see the collaboration mantra above). In TBR’s view, blockchain solutions apply more readily to supply chain than nearly any other use case outside of bitcoin. Atos’ approach — which assumes the technology has been proved secure and reliable, but the humans need coaching — reflects what TBR believes will be the long-term reality for blockchain.

We continue to be intrigued by ports as test beds for emerging technologies and as starter kits for large-scale smart cities. Following a presentation on IoT by Dijkstra, TBR analysts discussed intelligent supply chain solutions, ports and emerging technologies with Dijkstra and Stroeken, including details about Atos’ use cases and current offerings. The following reflects that discussion as well as TBR’s analysis of the consulting and IT services opportunities around emerging technologies, including insights from TBR’s Digital Transformation portfolio and Management Consulting Benchmark.

COVID-19 pushes IT architecture further to the edge

The growing impetus for edge computing in a pandemic-burdened world

It is an understatement that the COVID-19 pandemic and resulting shutdowns have dramatically altered the ways individuals live and businesses function. Reliance on networks, infrastructure, the cloud and associated technologies has never been greater, and the effect of such dependence has laid bare weaknesses in existing architectures. The result has been a proliferation of opportunities and use cases for technology to address the pandemic-driven impacts on daily life, namely remote work, increased video streaming and surges in virtual collaboration.

Edge computing is one such supporting technology that was already becoming increasingly relevant in a world where low latency, advanced analytics and intelligent data mining were quickly gaining momentum across a wide range of industries. As devices have become more common and require more processing power, an increasing amount of data is being generated at what is referred to as the edge of distributed computing networks. By sending only the most important and least time-sensitive information to the cloud, as opposed to raw data streams, edge computing preserves bandwidth, eases the burden on the cloud and reduces cost. The pandemic thus serves as the quintessential blanket use case for edge computing, as the benefits of computing data at the edge, such as reduced networking burdens and increased processing speed, address many of the issues caused by sudden spikes in network traffic and the resulting burden on systems.
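The bandwidth-preserving pattern described above can be sketched in a few lines (a hypothetical example; the function name, fields and threshold are illustrative): an edge node summarizes a raw stream locally and forwards only the summary plus any anomalous readings.

```python
# Edge-side data reduction: process raw readings locally and send
# only a compact summary, plus readings that need immediate attention,
# to the cloud instead of the full raw stream.

def summarize_at_edge(readings, anomaly_threshold):
    """Reduce a raw stream to summary statistics and a list of anomalies."""
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }
    anomalies = [r for r in readings if r > anomaly_threshold]
    return summary, anomalies

raw_stream = [21.0, 21.3, 20.9, 35.2, 21.1]   # e.g., temperature samples
summary, anomalies = summarize_at_edge(raw_stream, anomaly_threshold=30.0)
# Only `summary` and `anomalies` leave the edge: 2 payloads here
# instead of 5 raw readings, and the gap widens as streams grow.
```

The same shape scales to real deployments: the raw stream stays local, the time-sensitive anomaly is surfaced immediately, and the cloud receives only what it needs for longer-term analytics.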

The COVID-19 pandemic is dramatically accelerating digital transformation timelines for many enterprises while fundamentally changing the ways we interact with and consume data. As remote work and self-isolation measures have resulted in a dramatic uptick in the use of the web, cloud computing has become essential to businesses and people’s personal lives. Edge technology has only recently become recognized as a complementary evolution of cloud computing, and adoption of the technology has become more widespread. Previous use cases centered on leveraging edge computing’s core value proposition of alleviating challenges associated with bandwidth, latency and near real-time analytics. The sudden shift in workloads and network traffic, coupled with bandwidth constraints, has shined a spotlight on how the benefits afforded by edge computing can alleviate the challenges created by the pandemic.