Friends of Data

June 18, 2025


Christian Kleinerman, Snowflake: mapping the next era for data and AI


On the third day of Snowflake Summit, inside Club Hex, Hex CEO Barry McCardel sat down for a fireside chat with Christian Kleinerman, Snowflake's Executive Vice President of Product.

Christian brings a rare perspective shaped by decades at the helm of some of the most foundational data and infrastructure products in tech. He started his career on the SQL Server team at Microsoft and later led infrastructure for YouTube at Google; now he steers product strategy at Snowflake. Along the way, he’s seen the data ecosystem evolve from on-prem relational systems to cloud-scale platforms — and now, into the AI era.

We talked with him about the surprises he’s encountered with AI’s rapid ascent, what he finds valuable about ecosystem partnerships, and why he thinks the next big opportunity for Snowflake — and the broader ecosystem — lies in becoming trusted custodians of enterprise data.

This transcript has been edited for clarity.

What's the biggest lesson you’ve internalized about building products?

You have to be a user of the product. Even to this day, I use Snowflake — even for things that I should not be using Snowflake for. This is embarrassing, but my wine cellar at home runs on Snowflake. It's very different from when you're just trying to do a demo.

The simple answer is to use the product. Use the product, but not in the 'I clicked on a couple of things, and I think I know it now' way. Use it in the way that it's intended. Find the problem and use it from beginning to end to answer the question, and then you're going to realize what works and what doesn't work as well.

What have been the biggest surprises for you about AI in the last few years?

I have no problem admitting that I underestimated the potential and impact of AI. One of our board members, Mike Speiser, started sounding the alarm about three years ago. He was saying this thing is bigger than the Internet. We were like, 'What are you talking about?' In my mind, it was generative AI, so it was all about, 'What can you create?' Yeah, you can have an editor that helps you write SQL. That was the main use case that we focused on and it is still a legitimate use case.

I remember telling him that this was not going to be as impactful to the world of data. That's how naive I was. The piece that I missed was its ability to understand, synthesize, reason, and even think, which is a strong word. But that was not clear two years ago. Two years ago, we built Document AI, which lets you extract fields from documents, and we built Snowpark Container Services, where you can host your own LLM. At the time, hosting LLMs was like doing inference on a machine learning model. These days, it's like running a fancy cluster: very complex, with a lot of Python code around it. The models have gotten really, really advanced.

What’s unclear about the immediate future?

A lot of things are up in the air. Everyone is now an agent platform. And because nobody wants to be in a silo, everyone's agents could potentially talk to everyone else's agents. Everyone then repeats MCP and A2A as if they need to be buzzword compliant.

The reality is that there are going to be next-generation workflows and next-generation chains of operations. There's going to be interoperability between the systems. I was at a dinner last night, and someone was talking about, 'Oh, you remember WSDL?' That was how you used to discover what a SOAP service could do. This is a similar thing: 'Hey, what can I do with you?' And then there's a standard calling convention.

I think some of those things are going to get reincarnated. What agents do you know about? And how do I invoke your agent? And how do I know what you can do? This is literally an MCP type of protocol. There's a lot that is going to evolve.

At this point, I am very careful about making any predictions on AI and how things play out. It's moving very fast. But the thing that’s most interesting to me is that the technology right now is way more capable than what most people are already using — and that's the opportunity.
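
To make the agent-discovery idea above concrete, here is a small, purely illustrative sketch in Python of the 'what can you do, and how do I call you' handshake, loosely modeled on the JSON-RPC style used by protocols like MCP. The endpoint URL, tool name, and arguments are hypothetical, and real protocols layer transport negotiation and capability exchange on top of this.

```python
# Hypothetical sketch of agent/tool discovery and invocation.
# The endpoint, tool name, and arguments are made up; the method names
# mirror MCP's "tools/list" / "tools/call" convention, but a real MCP
# client also performs initialization and capability negotiation.
import requests

AGENT_URL = "https://agents.example.com/rpc"  # hypothetical endpoint


def rpc(method: str, params: dict | None = None, req_id: int = 1) -> dict:
    """Send one JSON-RPC 2.0 request and return its 'result' payload."""
    payload = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params or {}}
    resp = requests.post(AGENT_URL, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()["result"]


# "What can I do with you?" -- discover the tools/agents on the other side.
for tool in rpc("tools/list").get("tools", []):
    print(tool["name"], "-", tool.get("description", ""))

# "How do I invoke you?" -- a standard calling convention for one tool.
result = rpc(
    "tools/call",
    {"name": "lookup_order_status", "arguments": {"order_id": "12345"}},
    req_id=2,
)
print(result)
```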

How do we map the next couple of years for AI?

It's complicated, and whatever I say might change a couple of months from now. I think one thing that is clearer is that pure-play model companies are not viable. The way the models monetize is through the application of the models and the experiences. OpenAI's money comes from ChatGPT more so than from the pure API, which is trending toward becoming a) a commodity, and b) pricing-wise, a semi-race to the bottom. Not all the way to zero, but that's where it trends. Meanwhile, the application layer is replacing entire industries.

In many ways, Google search itself is a formidable franchise, a very profitable franchise, and it's getting threatened. So I think that's how it plays out: the model companies move up into applications. All of them will either build or buy apps, or they'll get acquired by an apps company, an application of AI.

In the enterprise world, I think everything will revolve around enterprise data. Yes, there are some use cases for ChatGPT for the enterprise, and OpenAI is selling it incredibly aggressively, and I think they're getting traction. But the true value is when you can understand enterprise data — and enterprise data, as many people have realized in the last couple of months, is easy until things like security, compliance, and governance kick in. That's where I think we, and even Databricks, and the cloud providers are in a good position, because we have become trusted custodians of data. I think what we're trying to do, which is bringing AI and building value on top of AI while preserving and respecting governance, is an opportunity for us.

You all are leaning hard into Iceberg – which on its face is about data portability, something database vendors traditionally try to avoid! Why?

This is what customers want. You can choose to battle your customers, and it might work for some period of time, but it doesn't work in the long run. At the end of the day, many companies have been burned or are still in the process of being burned from getting locked into a bunch of proprietary technology.

That's a horrible situation to be in. You want people using your technology and handing you dollars because they're getting value out of it, not because getting out of it is too painful. For many of those customers, when Snowflake was introduced, what we were doing was so much better than any other alternative that they said, 'Hey, we'll happily go and use your format,' and everything was proprietary. But in parallel, the Hadoop type of movement evolved into having the capabilities of Snowflake with open data, which is a much better world from a customer perspective. If that's better for customers and being proprietary is no longer a true market differentiator, then open data is dramatically better.

If we could be a thousand times faster than the competition, we could probably say we're going to delay leaning into Iceberg for a little bit. But when things start to get good enough, that's when you need to be very clear. This is what customers want, so go help them succeed. And we are wholeheartedly leaning into Iceberg.

How do you think about partnerships?

The world of tech partnerships is incredibly complicated, but it's not a new thing. Go look at IBM or Microsoft in the old days. AWS famously was like, 'Which companies are gonna die at this re:Invent?' They would announce something and entire categories of companies would be in trouble.

The way I think about it is: as long as we are very clear and transparent with our partners on where we intend to go, where we think we're going to draw some lines, where we end, and where there's opportunity — there's enough time for partners, which hopefully are smaller and more nimble than we are, to adjust.

I got this perspective when I was at Microsoft — and they compete with everyone and anyone under the sun. The folks who were running partnerships there at the time made sure that we were clear on our intent for the next 6, 12, 18 months, and then let partners figure out how to adjust. I also think there's a philosophy that as long as there are APIs for extensibility, you can make it work.

I believe in the idea that if you want to go fast, go alone; if you want to go far, go together.

What's something you've heard from a customer recently that surprised you?

A lot of people have bought AI model capacity, but they don't yet know how to use that capacity. And some of the largest companies in the world are like, 'I have $10 million that I am not using. I would love to use Cortex.' They would love to be able to plug their own models and keys into Cortex to drive that capacity.

Are there any use cases you’re really excited about?

I am excited about how the economy will grow and how the world will get better when productivity goes up. But the models are ahead of our capability to apply them. If that capability, the application of the models, catches up, there may be a truly painful state of the world for a lot of people. What Sam Altman was saying on Monday is that it's like an entry-level employee, but it's going to get better over time.

I met with someone here at Summit over the last 48 hours and they’re not going to renew their entire call center team in India because the current tests with AI agents are better. This is not someone becoming 20% more productive. This is an entire team that's going to disappear.

That gives me anxiety. It also gives me hope in some ways. I have young kids; what should they study? It's a hard question and these improvements are going to apply everywhere. My 11-year-old is already saying AI is bad because people will lose jobs. This is a true story. So I think a lot about that part – the social impact and the economic impact. It's good, but we need to be careful that it doesn't turn into a really negative thing.

Audience Q1: How is Snowflake optimizing costs?

Cost management is a priority for us. We added all sorts of controls and monitoring solutions for people to better optimize their use of Snowflake. No one at Snowflake wants a customer to pay us a dollar if they're not getting a dollar of value. Zero interest.

We announced consumption views and visibility across regions and clouds. We announced spend anomaly detection — we forecast what each customer's spend should be, and if actual spend falls outside a given range, it triggers an alert.
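
As a rough illustration of the forecasting-and-alerting pattern he describes (and not Snowflake's actual implementation), a spend-anomaly check can be as simple as flagging days that fall outside an expected band derived from recent history:

```python
# Conceptual sketch only -- not Snowflake's implementation. It flags days whose
# spend falls outside an expected band computed from a trailing window.
import statistics


def spend_anomalies(daily_spend: list[float], window: int = 28, z: float = 3.0):
    """Yield (day_index, spend) for days outside mean +/- z * stdev of the trailing window."""
    for i in range(window, len(daily_spend)):
        history = daily_spend[i - window : i]
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history) or 1e-9  # avoid divide-by-zero on flat history
        if abs(daily_spend[i] - mean) > z * stdev:
            yield i, daily_spend[i]


# Example: a sudden jump on the last day gets flagged and would trigger an alert.
spend = [100.0] * 30 + [340.0]
print(list(spend_anomalies(spend)))  # [(30, 340.0)]
```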

We announced object tags and query tags so that you can take a workload and associate it with a reporting structure — this lets you see the trend of a tag and determine whether it’s going up or going down.
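
For example, here is a minimal sketch of attaching a query tag to a workload so its spend can be grouped and trended, assuming the snowflake-connector-python package; the credentials, tag value, and table name are placeholders.

```python
# Minimal sketch: tag a session's queries so their cost can be attributed to a
# team or report. Credentials, tag value, and table name are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="...",            # placeholder
    warehouse="REPORTING_WH",  # placeholder
)

cur = conn.cursor()
# Every statement that follows in this session is recorded with this tag, so
# its usage can be grouped, trended, and tied back to a reporting structure.
cur.execute("ALTER SESSION SET QUERY_TAG = 'team:finance;report:weekly_revenue'")
cur.execute("SELECT COUNT(*) FROM analytics.public.orders")  # example workload
print(cur.fetchone())

cur.close()
conn.close()
```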

And I'm delighted that there are a lot of partners helping optimize Snowflake spend. We also introduced semantic views as a way to capture and store a semantic model. I have been told that Hex is an amazing early adopter.

Audience Q2: How do you think about making onerous infrastructure tasks easy?

For better or for worse, most of us were there in the beginning, including our founders. All of us are database people, which is why I think of myself as a database person and why you're going to get me excited by showing me a new SQL feature. But we also acknowledge that there are many areas where you may want either Python APIs or REST APIs directly. We have been increasing the number of areas of the product that can be configured with both Python and REST, but many of them still get bootstrapped with SQL.

Audience Q3: What advice would you give to a startup trying to build traction?

My answer for many years has been: If you're driving consumption, our sales team will find you. That continues to be true, but I think now we also value clarity in the use case.

What you want to do is tap into our sales team, who has only one goal: to hit their numbers with big contracts or consumption. That's one piece, but I think clarity on the use cases, and on what the sales play is, has proven to be almost more impactful than pure consumption.

With Hex, data teams become high-ROI partners for the business. Want to learn more?