Overview
Despite the macroeconomic challenges that have stalled many IT projects, AWS (Amazon Web Services) continues to invest, test how customers use the cloud and ultimately prepare for a boom in adoption across SaaS, PaaS and IaaS once markets normalize. To support that demand, AWS announced several offerings at re:Invent, which, in addition to the usual cadence of updates and features released in response to customer demand, included solutions less customary for the IaaS leader to offer, such as industry-specific modules and a service that parallels a true SaaS business application. When Adam Selipsky took the helm as CEO in May 2021, we speculated how his experience at Salesforce would tie into an infrastructure-centric company like AWS. A year and a half later, it is clear that Selipsky is bringing AWS into a new chapter, one that looks beyond general-purpose infrastructure toward the data and application services that will support the next wave of innovation and cutting-edge workload requirements.
New Supply Chain service marks AWS’ foray into enterprise SaaS
With infrastructure services responsible for the bulk of its revenue and its industry-leading margins, AWS is naturally less associated with the SaaS space. AWS' SaaS portfolio, often marketed as business applications, consists of productivity apps such as Amazon Chime and Amazon WorkDocs as well as front-end offerings like Amazon Connect. That is, until now.
AWS used re:Invent to preview AWS Supply Chain, an application-level service that combines Amazon software with AWS' IaaS and machine learning (ML) services to provide businesses with more visibility into their supply chain operations. With the new solution, users can connect to ERP systems like SAP S/4HANA, EDI (electronic data interchange) feeds and other data sources to automatically provision a data lake, import data into a unified model and contextualize it in real time. In his opening keynote, Selipsky argued that getting a complete view of supply chains often requires complex, third-party integrations, but with AWS' solution, customers can have a unified place for their data, potentially allowing them to consolidate point solutions. This could end up being a tough sell for AWS, however, considering the hold that vendors like Oracle (NYSE: ORCL) and SAP (NYSE: SAP) have on the market, especially as these vendors offer full-suite SaaS solutions, such as ERP and CRM, thus easing many supply chain integration pain points by default.
In other words, customers will by no means rip-and-replace their current systems, but they may still find themselves evaluating their current supply chain management (SCM) products to see if the level of predictive insights matches that of Amazon SageMaker, the ML engine powering AWS Supply Chain. Currently available in preview, the new solution is already gaining traction among companies like Accenture (NYSE: ACN), Lifetime Brands (Nasdaq: LCUT) and Traeger (NYSE: COOK). Amazon-owned retailer Whole Foods is also deploying the solution to improve inventory management and get a real-time view showing when products will arrive at its distribution centers.
The launch of AWS Supply Chain comes just months after Amazon made a notable organizational shift, moving Dilip Kumar from VP of Physical Retail and Technology at Amazon to VP of Applications at AWS. The change highlights the ongoing intersection between the commerce and cloud sides of the business and how Amazon is leveraging AWS to expand the reach of retail solutions like Just Walk Out while, from AWS' perspective, accessing unique IP that can help it build new industry applications. Because AWS' new solution essentially acts as a repository for supply chain data, we do not see it as directly comparable to the traditional SCM solutions on the market; it is designed more to supplement these companies' systems than to replace them. It is still early days for AWS Supply Chain, but the new service suggests AWS is looking up the stack and building new applications, and Selipsky, with his prior experience at SaaS giant Salesforce, could be giving us an early preview of AWS' transformation into a full-stack vendor.
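AWS has not published much programmatic detail for the preview, but the kind of data plumbing the service is meant to absorb can be sketched with standard AWS tooling. The snippet below is a rough illustration rather than the AWS Supply Chain API itself: it stages an ERP/EDI extract in S3 and catalogs it with an AWS Glue crawler using boto3, with hypothetical bucket, crawler and IAM role names.

```python
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Stage a raw ERP/EDI extract in S3 (bucket and object key are hypothetical).
s3.upload_file(
    "erp_shipments_extract.csv",
    "acme-supply-chain-raw",
    "erp/shipments/2022-12-01.csv",
)

# Create and run a Glue crawler so the extract's schema is cataloged and queryable.
glue.create_crawler(
    Name="supply-chain-raw-crawler",  # hypothetical crawler name
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",  # assumed to exist
    DatabaseName="supply_chain_raw",
    Targets={"S3Targets": [{"Path": "s3://acme-supply-chain-raw/erp/"}]},
)
glue.start_crawler(Name="supply-chain-raw-crawler")
```

AWS Supply Chain's pitch, per the keynote, is that this provisioning, ingestion and modeling happens automatically once source systems are connected, with SageMaker layered on top for predictive insights.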
Customers across industries are building with AWS
With customers becoming more vocal about their need for industry customization, AWS has adapted its go-to-market model to align sales teams by both product and underlying vertical. With AWS' competitors making big bets on industry cloud, Selipsky is prioritizing industry solutions, which for AWS largely consist of adding industry-specific rules and policies within popular tools like SageMaker and, in some cases, layering on ISV solutions to address a specific use case. This approach is not necessarily a big moneymaker, nor does it help AWS home in on a particular industry, but it does help the company get a foot in the door with both large enterprise and midmarket customers.
Showcasing some of its latest innovations in healthcare, Selipsky welcomed Lyell Immunopharma, a T-cell therapy startup, to the stage. The company's head described a compelling edge use case in which Lyell developed an IoT solution on AWS to collect data from medical equipment and process it in real time via the cloud. This eventually led Selipsky to reveal Amazon Omics, a new Health Insurance Portability and Accountability Act (HIPAA)-compliant solution that allows customers to store, query and analyze genomics data and, through SageMaker, predict vulnerabilities to target and prevent diseases.
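Amazon Omics is exposed through the standard AWS SDKs like other data services. As a minimal sketch, assuming the boto3 "omics" client available since the service's launch, a customer might create a sequence store for raw genomics reads and enumerate what already exists in the account; the store name and description below are hypothetical.

```python
import boto3

omics = boto3.client("omics")

# Create a store for raw sequencing reads (name and description are hypothetical).
store = omics.create_sequence_store(
    name="oncology-reads",
    description="Raw genomics reads for downstream analysis",
)
print("Created sequence store:", store["id"])

# Enumerate the sequence stores already present in this account and Region.
for existing in omics.list_sequence_stores()["sequenceStores"]:
    print(existing["name"], existing["arn"])
```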
Meanwhile, in financial services, AWS highlighted a deal win with Trust Bank Singapore, which chose AWS as its primary cloud provider and has onboarded over 300,000 customers to its digital banking platform in the course of six weeks. Trust Bank chose AWS for its user-friendly interfaces and developer tools, which will help the firm build new digital products, and appears willing to fill any industry gaps with third-party software available via AWS Marketplace. AWS has over 1,600 ISV partners listed on the marketplace, many of which may be wary of AWS building out its own software, including industry-specific apps that could be sold into more strategic accounts. That said, partners should be somewhat reassured by the fact that AWS is still an IaaS provider at its core and AWS Marketplace partners are key to helping customers easily buy software and ultimately expand their AWS consumption.
AWS unveils new services and integrations to become more entrenched in its customers’ data pipelines
Swami Sivasubramanian, AWS VP of Data and ML, delivered the third keynote of the conference and spoke of the importance of an end-to-end data strategy, which he contended is made up of three core components: a strong data foundation, data connectivity and data democratization. According to AWS, the data foundation covers the areas customers cannot compromise on, such as reliability, security and scalability. AWS showcased several innovations that address these areas, including Multi-AZ support for Amazon Redshift, which allows a Redshift cluster to run across multiple Availability Zones at once to protect critical applications in the event of a data center outage, and Amazon GuardDuty RDS Protection, which extends GuardDuty's threat detection to data stored in Aurora.
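For customers already running GuardDuty, RDS Protection is a per-detector toggle rather than a new deployment. The sketch below assumes an existing detector and uses boto3's update_detector call with the RDS login-activity feature flag, which reflects our reading of the GuardDuty API and should be verified against current documentation.

```python
import boto3

guardduty = boto3.client("guardduty")

# Assumes a GuardDuty detector is already enabled in this account and Region.
detector_id = guardduty.list_detectors()["DetectorIds"][0]

# Enable RDS Protection (Aurora login-activity monitoring) on that detector.
# The feature name below is our assumption and should be checked against
# current GuardDuty documentation.
guardduty.update_detector(
    DetectorId=detector_id,
    Features=[{"Name": "RDS_LOGIN_EVENTS", "Status": "ENABLED"}],
)
```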
Perhaps the most notable release addressing the data foundation was support for a zero-ETL (extract, transform, load) integration between Amazon Aurora and Redshift, which will allow customers to bring data from Aurora databases into the Redshift data warehouse, without requiring them to build and manage ETL pipelines. In addition to addressing a top pain point among data scientists, this announcement marks AWS’ push toward what the company is calling a “zero-ETL” future, one that heavily automates a process that has defined the database market for decades.
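The integration is configured on the Aurora and Redshift sides rather than in pipeline code, so the developer-facing change sits mostly on the consumption end. As a simple sketch, once transactional data is replicated into the warehouse, an analyst could query it in place via the Redshift Data API; the cluster, database, user and table names below are hypothetical.

```python
import time

import boto3

redshift_data = boto3.client("redshift-data")

# Query Aurora-sourced order data that the zero-ETL integration has replicated
# into Redshift (cluster, database, user and table names are hypothetical).
stmt = redshift_data.execute_statement(
    ClusterIdentifier="analytics-cluster",
    Database="dev",
    DbUser="analyst",
    Sql="SELECT order_status, COUNT(*) FROM orders GROUP BY order_status;",
)

# Poll until the statement completes, then print the aggregated rows.
while redshift_data.describe_statement(Id=stmt["Id"])["Status"] not in (
    "FINISHED",
    "FAILED",
    "ABORTED",
):
    time.sleep(1)

for row in redshift_data.get_statement_result(Id=stmt["Id"])["Records"]:
    print(row)
```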
That said, in our view, zero-ETL, like so many other concepts in the cloud industry, is somewhat of a marketing term; even as customers benefit from having data automatically replicated from Aurora to Redshift, they will still likely have to perform some level of ETL on the data sitting in Redshift, whether that means extracting data from third-party files or transforming it with SQL. AWS is also not the first cloud provider to innovate in this space: Google Cloud (Nasdaq: GOOGL) brought its support for querying Bigtable data from BigQuery into general availability this past August. With this integration, Google Cloud, like AWS, looks to eliminate manual aspects of ETL pipelines and make it easier for customers to copy data into its data warehouse solution, BigQuery.
The new zero-ETL Aurora and Redshift integration is an indicator that AWS is trying to fend off competition from cloud-native database peers, some of which are benefiting from more open, infrastructure-agnostic approaches that AWS has yet to adopt. Even legacy database giant Oracle has taken steps to make its database services available on competing public clouds, including AWS, to retain market share. In general, existing customers — 94% of whom reportedly use 10 or more database and analytics services — will appreciate the steps AWS is taking to automate the data pipeline, but the promise of a zero-ETL future may not be enough to sway customers outside the AWS database ecosystem.
New connectivity, governance and management capabilities help AWS deliver on the promise of an end-to-end data strategy
Already strong in the foundational aspects of the data life cycle, such as storage, querying and analytics, AWS rounded out its portfolio with new connectivity, cataloging and governance solutions. One of the more notable announcements in this area was Amazon DataZone. As highlighted by Shikha Verma, head of product for Amazon DataZone, the new solution offers customers a unified environment where both data producers and consumers can access, share and utilize data. For instance, Verma laid out a scenario where a marketing team wants to run a campaign and therefore needs to access and analyze sales data that is dispersed across a data lake, a data warehouse and third-party systems. The sales team can use DataZone to import and cleanse the data and grant the marketing team access so it can effectively run the campaign. Looking to improve data access, AWS also announced 22 new data connectors for Amazon AppFlow, its integration service, including support for Google Ads, Zoom Meetings, Mailchimp and Stripe.
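AppFlow flows are typically defined once, in the console or through the CreateFlow API, and then run on demand or on a schedule. As a small boto3 sketch, the snippet below lists the flows already configured in an account and triggers a hypothetical, pre-built flow assumed to use one of the newly supported connectors.

```python
import boto3

appflow = boto3.client("appflow")

# List the flows already configured in this account and Region, with their
# source and destination connector types.
for flow in appflow.list_flows()["flows"]:
    print(flow["flowName"], flow["sourceConnectorType"], "->", flow["destinationConnectorType"])

# Trigger a hypothetical, pre-built on-demand flow (the flow name is assumed
# to exist and to have been configured with one of the new connectors).
run = appflow.start_flow(flowName="stripe-payments-to-s3")
print("Started execution:", run.get("executionId"))
```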
Making data more accessible throughout an organization, including to line-of-business domains, remains a key issue that many pure plays and AWS partners are trying to solve. For instance, TBR attended Informatica World 2022 and recalls a similar scenario in which a marketing analyst needed more data to run a campaign and used the Informatica Data Loader tool to pull data from a third-party system and ingest it into Google Cloud's BigQuery. At re:Invent, AWS announced support for Informatica Data Loader on Redshift, so customers can similarly copy data into AWS' data warehouse solution. This integration may provide AWS with inroads to cross-sell newer offerings, including DataZone, and is one example of strong partner synergies at work between the hyperscalers and pure play PaaS vendors.
That said, these synergies do not come without risk. AWS' rapid push into new markets poses a challenge for partners, as AWS offers customers native alternatives that, if competitive enough in features and capabilities, may cause customers to re-evaluate their use of point solutions. Through 2023, AWS will have to walk a fine line between where it innovates directly and where it leans on partners. This is especially true as Microsoft (Nasdaq: MSFT) and Google Cloud are likewise partnering to build out end-to-end cloud data services, including those that can run across clouds and on premises.
In the final stages of his talk, Sivasubramanian touched on the concept of data democratization, which starts with democratizing access to education. In TBR's own findings, executive pushback and a lack of sufficient skills are among the leading factors preventing businesses from productizing their data and viewing it as an asset, which is why AWS and so many of its partners are investing in training and education programs to advance careers in technology. AWS announced it has trained 310,000 developers on ML technology through DeepRacer and launched 18 new courses for the education platform in 2022 alone. At the event, AWS also announced a Machine Learning University program for community college educators that offers courses in areas like natural language processing, computer vision and responsible AI.
AWS leads with custom silicon to power a new wave of workloads
AWS’ ability to look up the stack and cross-sell data services to its existing client base is only as good as the breadth and depth of its underlying infrastructure. Aside from being first to market with EC2 in 2006, AWS continues to benefit from the scale of its infrastructure footprint. And while some competitors may tout a greater number of cloud regions, the capacity of AWS’ facilities, in addition to micro data centers suited for latency-sensitive workloads, remains largely unmatched. This diverse footprint, joined with the Nitro System, which optimizes performance by offloading capabilities like network access and local storage to SmartNICs, is key to AWS’ success and helps the company sell its own proprietary processors and capture more critical workloads.
- Unsurprisingly, AWS used re:Invent to showcase a new set of EC2 instances powered by its own Graviton processors. C7gn instances are designed for network-intensive workloads, such as firewalls and load balancers, while Hpc7g instances target high-performance computing (HPC) workloads like weather forecasting and genomics processing. A key part of AWS’ strategy is expanding the Graviton portfolio in hopes that cloud-native customers will spin up new workloads on AWS chips, especially as existing Intel users may not be itching to move off their industry-standard hardware anytime soon (see the launch sketch after this list). To support adoption, AWS also announced the Graviton Delivery specialization within the AWS Partner Network. Partners with this validation have approved software and professional services offerings designed to help customers adopt Graviton and optimize their EC2 usage.
- As discussed by David Brown, VP of EC2, managing a networking architecture that meets the needs of both administrators, who typically want stricter control over network access, and developers, who want to connect applications without worrying about infrastructure, remains a common challenge. Trying to address both personas, AWS launched VPC Lattice, an application networking service that helps users set up service networks spanning virtual private clouds (VPCs) and apply consistent policies for network access, monitoring and traffic management across AWS accounts. As with some of the new data services announced at re:Invent, VPC Lattice highlights AWS’ strategy of releasing services that appeal to different buyers to help customers consolidate point solutions. For instance, as VPC Lattice expands in global availability and features, network administrators may find they can replace some third-party network solutions. Complementing VPC Lattice, AWS also announced the availability of Network Manager, a new capability within the AWS Management Console that allows customers to monitor network performance in real time.
- Making a push toward HPC workloads, AWS announced the general availability of SimSpace Weaver, a managed compute service that allows customers to run spatial simulations across up to 10 EC2 instances. The service is suited for a range of use cases too demanding for the CPU power and memory typically available in a single compute instance, including smart-city planning or stadium development.
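As referenced in the Graviton bullet above, adopting the new silicon is an ordinary EC2 workflow; the differentiation sits in the instance type and an Arm-compatible AMI, as the minimal boto3 sketch below shows. The AMI ID and key pair name are placeholders, and C7gn availability varies by region.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a single network-optimized, Graviton-based C7gn instance. The AMI must
# be built for the arm64 architecture; the AMI ID and key pair are placeholders.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder arm64 AMI
    InstanceType="c7gn.xlarge",
    KeyName="my-keypair",  # placeholder key pair
    MinCount=1,
    MaxCount=1,
)
print("Launched:", response["Instances"][0]["InstanceId"])
```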
re:Invent showcased several innovations, but multicloud was not among them
Throughout re:Invent there was a consistent theme of supporting customers with data services regardless of the underlying infrastructure (e.g., EC2, EKS [Elastic Kubernetes Service], Lambda). As such, AWS is offering deeper integrations between its products, with DataZone’s connections to Redshift, Athena and QuickSight, as well as the zero-ETL integration between Aurora and Redshift, among the leading examples. Despite these integrations, however, it is abundantly clear that AWS is making it easier to consume multiple products, but only if they are powered by its own infrastructure. Multicloud remains one of the core ways competitors target AWS workloads and brand the company as closed off and prone to locking customers into its cloud stack. While this criticism holds true in some regards, market dynamics like high inflation and labor shortages are giving new weight to monocloud environments, making it easy for AWS to message around product consolidation and cost savings. As such, for the time being, AWS seems content letting customers use tools from Microsoft Azure or Google Cloud Platform to connect to its cloud rather than enabling a true multicloud environment by offering its own services outside AWS.
Marking its shift toward solution selling, AWS unveils new programs and tools for partners
With over 100,000 partners, some of which can pocket $6.40 for every $1 of AWS they sell, AWS has come a long way over the past decade in how it goes to market. AWS does not revamp its partner programs often, but last year’s switch from two partner tracks — Technology and Consulting — to five tracks — Software, Services, Training, Distribution and Hardware — was strategic, providing partners with a simplified way to engage with AWS professionals and customers based on the specific type of product or service they offer. In addition to giving partners more chances to work with AWS resources directly, this model, dubbed Partner Paths, has helped AWS support partners exploring new business models.
The most notable example is the large global systems integrators (GSIs) using both AWS’ Services and Software tracks as they begin to build out IP to supplement traditional services models. Building on Partner Paths, AWS announced Solution Factory, which is very much in line with the re:Invent theme of providing customers end-to-end offerings rather than one-off services. With Solution Factory, partners within a particular track can work with AWS experts via workshops, demos and storyboards to roll out prebuilt solutions that address a specific business or industry challenge.
With customers clearly seeking end-to-end solutions, particularly at the PaaS layer, we expect Solution Factory will resonate with the ecosystem and drive greater collaboration among AWS partners, such as ISVs and GSIs, who may see opportunities to work together and provide specific outcome-based solutions on top of AWS. VP of Worldwide Channels and Alliances Ruba Borno also announced new features within the AWS Marketplace, including Vendor Insights, which makes security and compliance information for different vendors available to buyers, and new QuickSight dashboards for data visualization.
AWS is clearly taking steps to help partners better build and sell with the cloud giant, promising higher profit margins for those who engage in solution selling via multiple Partner Paths, new programs like Solution Factory and the AWS Marketplace. Nonetheless, these new offerings may only do so much to address the top concern among most partners: keeping pace with AWS’ evolving portfolio of over 220 services. Some of the large-scale GSIs and software vendors may find it easier to take advantage of AWS’ solution-based programs and chase a $6.40 multiplier, but the Tier 2 consultancies and ISVs, which are key to landing greenfield accounts, may be left playing catch-up. Further, while AWS indicated throughout various keynotes that it “can’t go it alone,” some of the new SaaS and PaaS innovations suggest AWS is indeed trying to go it alone in hopes of becoming an end-to-end cloud provider. As AWS monetizes new offerings over time, it risks boxing out partners and marginalizing their opportunities, which may leave room for competitors to counter with selective innovation and partner-first approaches.
TBR Takeaway
Leveraging its IaaS foundation to enable entire data pipelines and build new SaaS applications, AWS has a growing aspiration to offer customers “one hand to shake” when it comes to cloud. We expect this approach will help AWS address customers’ concerns around cost and a lack of in-house resources in the near term and win larger deals as customers expand their spend beyond basic data hosting in the long term. With its industry-leading profitability — a 26.3% operating margin in 3Q22 — AWS is in a unique position to build out new product and services capabilities internally, but there are still risks to innovating in too many areas too quickly. Further, as AWS looks up the cloud stack to markets far less saturated than IaaS, the company will invite more competition and will have to be mindful not to encroach on partners — an area where its competitors are adept and one they will continue to leverage to boost margins and offer clients maximum flexibility on premises and across clouds.
AWS re:Invent 2022: This year, TBR analysts were among the more than 300,000 people who live-streamed the immense amount of content at Amazon Web Services’ (AWS) annual conference, including five executive keynotes and 22 leadership sessions. Macroeconomic uncertainty and its impact on global businesses served as the basis for AWS CEO Adam Selipsky to tout the power of cloud computing and argue that AWS — offering not only more than 600 compute instance types but also a broadening portfolio of adjoining platform and application services — is best suited to address the needs of a maturing enterprise.