Snowflake’s Data Warehouse Evolution: Embracing a Single Platform Approach for Success
During Snowflake’s fifth annual Summit Conference, CEO Frank Slootman discussed how the data cloud market has evolved over the past decade. Now less about the technical workload and more about business ecosystems and relationships, the data cloud is emerging as the cornerstone of digital transformation and will play a critical role in democratizing data, making generative AI possible and helping businesses discover new revenue streams. Through a series of customer success stories, product demos and talks from executives, including Snowflake Product SVP Christian Kleinerman and NVIDIA CEO Jensen Huang, Snowflake articulated to investors, customers and partners its vision for becoming the data cloud platform for the broadest set of workloads, including, of course, large language models (LLMs) and generative AI. Snowflake Summit 2023 was the company’s largest event to date, with roughly 23,000 in-person and virtual attendees, 250 customer speakers and over 200 ecosystem partners.
Snowflake Is Adapting to Support the Next Wave of Enterprise Workloads
A decade ago, when Snowflake (NYSE: SNOW) was founded, the data warehouse was in some ways its own industry, offering businesses a powerful tool to ingest, query and analyze structured data adjacent to their transactional databases. As with so many other things in the cloud industry, however, what was true 10 years ago is not true today. With the rise of cloud, the data warehouse has shifted from an entire subsector to just another enterprise workload, and one that customers with data lake architectures increasingly skip over altogether.
This market force, along with many others, including the proliferation of data and increasing demand for AI, is causing Snowflake to adapt its portfolio and messaging to address new workloads and business use cases. We have seen this shift take place through support for new languages, libraries and runtimes, including the new Snowpark Container Services, a developer runtime for containerized workloads that will be particularly attractive to customers that cannot rewrite their legacy runtimes for the cloud.
We suspect these types of capabilities, in addition to features that support semistructured and unstructured data, will help Snowflake move more squarely into the data lake space and better compete for new workloads, positioning the company as an alternative to emerging players like Databricks and, in many cases, its strategic hyperscaler partners.
The Single Platform Approach Is Key to Snowflake’s Success
There are many reasons for Snowflake’s success, but the defining attribute that remains highly relevant today is the company’s single, platform-based product. While offering various capabilities, industry models and features, the Snowflake Data Cloud is essentially one product. As a result, Snowflake is allowing customers to manage their data in the same way: as a product.
With this approach, Snowflake can more readily monetize new capabilities, and the updates the company makes to its core engine, from ML-powered functions to improvements in query duration, become applicable across product capabilities and feature sets, which is key to improving the user experience and potentially helping customers maximize their Snowflake investments over time.
One particularly compelling customer example that highlights the value of the data cloud platform came from Mihir Shah, CIO and enterprise head of Data Architecture at Fidelity Investments. Shah got on stage during the summit to speak about Fidelity’s data cloud strategy, which initially took off three years ago and was created to consolidate the company’s 170 legacy databases into a single, integrated platform.
Three years into the implementation, Shah reports that Fidelity has roughly 800 data sets and 200 applications in production on Snowflake, and the company is not yet finished. There are several takeaways from this success story, but perhaps the most important is how it was only after Fidelity consolidated data in one place that the company could apply governance policies and extend data to key stakeholders throughout the organization. Shah also articulated that Fidelity was only able to adopt a data cloud strategy once it changed its operating model.
This is an important concept, as many large enterprises recognize that consolidating data in one place can be a more scalable, profitable approach than managing and operating multiple siloed database systems, yet many are reluctant to change their established operating practices. Because of this trend, aligning closely with strategic partners to help customers overcome organizational obstacles and treat data as one of their biggest assets will ensure further success and help Snowflake uphold its reputation as a market disruptor.
With Innovations in Snowpark and the Native Application Framework, Snowflake Looks to Disrupt App Development
Abiding by the notion that new software should be built on the data cloud, not on the database, Snowflake is moving further into the app development space, using its expertise at the data warehouse layer to support read/write operations and empower a new set of data-intensive applications. One of the key announcements at the summit was that the Snowflake Native Application Framework is currently available for public preview on Amazon Web Services (AWS) (Nasdaq: AMZN). With this announcement, developers can build applications directly within their Snowflake accounts on AWS, thereby eliminating the need to transfer data.
Customers have been able to take advantage of the Snowflake Native Application Framework in private preview for over a year now, but with public availability on AWS and potentially other clouds in the future, Snowflake is taking steps to better help customers build and test applications natively on Snowflake and potentially monetize those apps by publishing them on the Snowflake Marketplace.
It is hard to argue against the idea that in today’s world, every company is a software company, and brands like Capital One (NYSE: COF) and Bloomberg are using the Native Application Framework to develop B2B apps directly in their Snowflake environments and make them available to customers.
For historical context, Snowflake’s push into application development took off with Snowpark, a developer environment that allows users to code in three languages — Python, Java and Scala — outside Snowflake’s core SQL interface. Since it became generally available in 2021, Snowpark has accelerated customers’ use of the Snowflake platform in areas such as data engineering and ML and is now being used by 35% of total Snowflake customers and 85% of Snowflake customers spending over $1 million per year.
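For readers less familiar with Snowpark, the sketch below illustrates what the Python flavor of that development experience can look like. The connection parameters, table and column names are placeholders rather than anything tied to a real deployment; the key point is that the DataFrame operations are translated to SQL and executed inside Snowflake rather than on the client.

```python
# Minimal Snowpark for Python sketch. Connection details, table and column
# names are placeholders; the DataFrame work below runs inside Snowflake.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

# Connection details would normally come from a config file or secrets store.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Build a lazily evaluated DataFrame over an existing table.
orders = session.table("ORDERS")

# A typical data engineering step: filter, aggregate and persist the result.
avg_order_totals = (
    orders.filter(col("STATUS") == "SHIPPED")
          .group_by(col("ORDER_DATE"))
          .agg(avg(col("ORDER_TOTAL")).alias("AVG_ORDER_TOTAL"))
)

# Materialize the result as a table inside Snowflake for downstream use.
avg_order_totals.write.save_as_table("DAILY_AVG_ORDER_TOTAL", mode="overwrite")

session.close()
```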
Announced at Summit, Snowpark Container Services is a big advancement in the broader Snowpark vision of empowering application development. By allowing customers to run code as containers directly within their Snowflake accounts, customers can use their Snowflake data to support their internally developed containerized applications and/or those from third-party providers on the Snowflake Marketplace.
With an ongoing market shift from monolithic applications to microservices, support for containers is a natural opportunity for vendors, and this new engine, which supports NVIDIA’s (Nasdaq: NVDA) high-performing GPUs, could help Snowflake capture another wave of mission-critical, born-in-the-cloud applications.
Generative AI Will Help Bring Unstructured Data into Snowflake and Unlock New Use Cases and Growth Opportunities
In 2023, no tech conference would be complete without the mention of generative AI, which is being actively deployed by 43% of customers surveyed in TBR’s 1H23 Cloud Infrastructure & Platforms Customer Research. Snowflake used Summit to showcase Document AI, which uses an internally developed LLM based on technology from Applica, a Polish company Snowflake acquired last year that built an AI platform capable of extracting insights from documents. The resulting Document AI service, currently in private preview, will allow customers to extract information from documents stored in Snowflake. As highlighted in a product demo, this could include an equipment inspection form, whereby the user asks questions such as “Which machine part was defective?” and “Which employee inspected this part?”
Behind the scenes, customers will be using the LLM to extract information from documents that can then be embedded into their data pipelines, essentially feeding unstructured data into the analytical processes that business users rely on. The opportunity to get unstructured data into the hands of business users makes generative AI a promising opportunity for Snowflake and may support the company’s expansion into other workloads and use cases.
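Document AI was still in private preview at the time of Summit and its interface may change, so the sketch below is only meant to illustrate the pattern described above: answers extracted from unstructured documents land in an ordinary table that downstream pipelines and business users can query. The extract_inspection_fields function and the file and table names are hypothetical stand-ins, not Snowflake APIs.

```python
# Illustrative sketch of the extraction-to-pipeline pattern. The extraction
# step is a hypothetical stub, not a Snowflake API; the goal is to show
# extracted answers landing in a structured table for downstream analytics.
from snowflake.snowpark import Session


def extract_inspection_fields(doc_id, questions):
    """Stand-in for the extraction service. A real implementation would send
    the document and each question to Document AI (or a comparable model);
    fixed sample values are returned here purely for illustration."""
    sample_answers = {
        "Which machine part was defective?": "hydraulic valve",
        "Which employee inspected this part?": "J. Smith",
    }
    return [(doc_id, q, sample_answers.get(q, "unknown")) for q in questions]


# Placeholder connection details, as in the earlier Snowpark sketch.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
}).create()

questions = [
    "Which machine part was defective?",
    "Which employee inspected this part?",
]

rows = extract_inspection_fields("inspection_form_001.pdf", questions)

# Land the extracted answers in a regular table so they can join the same
# pipelines and dashboards business users already rely on.
session.create_dataframe(
    rows, schema=["DOCUMENT_ID", "QUESTION", "ANSWER"]
).write.save_as_table("INSPECTION_FINDINGS", mode="append")
```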
Like its peers, Snowflake is interested not only in using its own LLMs, including the one behind Document AI, but also in supporting partner models, including those from AI21 Labs and Reka. As highlighted in a discussion between Slootman and Huang, Snowflake will also support NVIDIA’s NeMo platform for LLM development as part of a new partnership agreement that may help facilitate AI and application workloads within Snowflake. Specifically, Snowflake will host LLMs from its three inaugural partners in the new Snowpark Container Services, which is backed by NVIDIA GPUs, and customers will be able to run and fine-tune these models on their proprietary data within Snowflake.
What Is Next?
While there are several innovations to unpack from Snowflake Summit 2023, from support for Apache Iceberg Tables to the Snowflake Performance Index, all tie into the Snowflake strategy of supporting the broadest set of workloads and use cases through a single, unified platform. With its high-performing technical engine and customers’ need for a more integrated development environment, Snowflake may continue to attract data engineering pipelines to its platform and become the cornerstone for modern app development and, by extension, the de facto data cloud platform.