
A First Look at Snowflake Intelligence

My team and I had the rare privilege of previewing and working with Snowflake Intelligence over a few weeks in August and September of this year, before it was made available to the general public. Like most technical people, we are always excited about new tools. With all the buzz around AI, however, our excitement got to a point where we were ready to thank anyone and everyone for just allowing us to spend time with this groundbreaking technology. The amount of work or difficulties we could face did not matter. Just give us the thing, please!

A Real World Use Case

A client in the media industry engaged us to conduct a Snowflake Intelligence experiment aimed at assessing its capability to deliver on the broader promises of AI. Our objective was to develop an agentic AI chatbot that would enable users to ask natural-language questions and receive instant insights powered by the enterprise data cloud. The Client was keen to evaluate Snowflake Intelligence's ability to:

  • Enable true data self-service by allowing users to independently access, explore, and visualize data without relying on technical teams or waiting for dashboards or reports to be delivered.
  • Support real-time, data-driven decision-making by delivering timely, contextual insights and enabling users to explore "what-if" scenarios so they can respond quickly to changing business conditions.
  • Provide frictionless exploration and insight discovery by handling drill-down follow-up questions, helping to identify patterns or anomalies, and generating projections or recommendations.
  • Improve productivity and cost efficiency by reducing ad-hoc data requests, thereby freeing analysts and engineers to focus on higher-value work, such as modelling, forecasting, and governance.
  • Enhance the ROI of existing data investments by expanding access to insights throughout the organization and transforming data into actionable intelligence at every level.

Of course, there were concerns as well. How accurate would the data be? Could sensitive data be protected from unauthorized access? What would be the cost of such a tool?

Being new to an agentic AI implementation and Snowflake Intelligence itself, the team had its own questions. How do we approach this project? What would the implementation look like? How much effort would it take to stand up the chatbot? With Snowflake Intelligence still in preview, how reliable can it be expected to be? We had a timebox of only four person-weeks to complete this task, including assembling the underlying data, so there was little room for trial and error.

Being geeks by job description and, apparently, by nature, we dug in, eager to learn and try everything. We'd had prior exposure to Snowflake's ML and AI features, but hadn't built a chatbot like this, so we connected with the Snowflake Product Team and fellow engineers and read documentation voraciously, trying to figure out the best (and fastest) way to develop our chatbot. Below you will find some of the key lessons we learned along the way, as well as some musings on what we anticipate will matter for a production-ready implementation. We hope they help you on your own journey.


First Steps: Start Small

It is challenging to determine whether a chatbot will be of any value unless it can address topics that are top of mind for its users. Having said that, the promises and benefits of agentic AI may tempt you to boil the ocean and use all the data you have available at once. Recognize, however, that you are entering new territory and take it one step at a time. Start with a small data set and allow yourself to get a feel for how data is enabled in Snowflake Intelligence before making a bigger commitment. A good starting point is 3 to 5 tables, each with 10 to 20 columns.

Our Client identified subscription growth patterns as a key concern, but they could only access this data with ad-hoc queries against their legacy data warehouse. We identified three tables that we could create for them, which would contain the subscriber data they would need. We then asked the Client for a list of 20 sample questions they would want the chatbot to be able to answer using that data. Here are a few of them, with a SQL sketch of the first one after the list:

  • How many customers who cancelled their subscription in July were originally trial subscribers?
  • How did the number of trial-to-full-rate subscription conversions in Aug 2025 compare to Aug 2024?
  • What is the total monthly revenue from trial subscribers who have cancelled and then re-subscribed within the last year?
  • What is the distribution of our subscriber base by the month their subscription started?
  • What are the primary reasons for cancellation, e.g., price, product dissatisfaction, etc? How do these correlate with other data points, such as subscription tenure?
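
To appreciate what the agent has to do behind the scenes, here is roughly what the first question could translate to in SQL. This is only a sketch against a hypothetical subscriptions table; the actual SQL is generated by Cortex Analyst against whatever schema your semantic model describes.

```sql
-- Hypothetical sketch: the first sample question expressed as SQL.
-- Table and column names are illustrative only.
SELECT COUNT(DISTINCT customer_id) AS cancelled_former_trial_subscribers
FROM subscriptions
WHERE cancellation_date >= '2025-07-01'
  AND cancellation_date <  '2025-08-01'
  AND was_trial_subscriber = TRUE;
```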

Stock Your Cortex Agent Toolbox

Snowflake Intelligence utilizes Cortex Agents to orchestrate all the tasks necessary to deliver a result to the user. The Cortex Agent, in turn, uses the tools made available to it. The typical “vanilla” scenario involves using Cortex Analyst to query structured data in your databases and Cortex Search to retrieve data from unstructured sources, such as PDFs stored in your data lake.

However, you can also create and provide the Cortex Agent with custom tools that use stored procedures and user-defined functions to achieve almost anything, even interacting with external APIs. With these tools at your disposal, you can build not just a chatbot that retrieves information, but a true agent that can perform actions on the user's behalf, such as scheduling meetings and making flight reservations. With the necessary permissions, of course!
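
As a minimal sketch of what a custom tool can look like, here is a hypothetical stored procedure that an agent could call to look up a subscriber's status. All object names are ours for illustration; the procedure itself is plain Snowflake Scripting, and once created it can be registered as a custom tool for the agent.

```sql
-- Hypothetical custom tool: a stored procedure the agent can invoke.
CREATE OR REPLACE PROCEDURE get_subscriber_status(subscriber_email STRING)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  status STRING;
BEGIN
  -- MAX() guarantees exactly one row, so SELECT INTO cannot fail on no match.
  SELECT MAX(subscription_status) INTO :status
  FROM subscribers
  WHERE email = :subscriber_email;
  IF (status IS NULL) THEN
    RETURN 'No subscriber found for ' || subscriber_email;
  END IF;
  RETURN status;
END;
$$;
```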

Our use case was relatively simple, so we primarily utilized Cortex Analyst for our data queries, as well as Cortex Search to help the agent understand the numerous interchangeable terms business users used for the same concept. Our next iteration, however, will likely delve deeper into more tooling to empower our implementation.

Get Results Fast

You can leverage your existing engineering methodology by taking a code-first, programmatic approach. However, you would be better off getting a feel for the dynamics of building out the components that underpin Snowflake Intelligence first. You may find that, rather than just needing a single set of requirements at the starting point, you will require extensive collaboration with your business stakeholders, subject matter experts, and data stewards throughout the implementation.

Luckily, Snowflake has made a fundamental commitment to enabling its users to access and work with all components of Snowflake Intelligence right in its Snowsight UI, and you'd be well advised to anchor your initial implementation on this approach. The Snowsight UI gives you the ability to work with all the underlying components of the agentic implementation, allowing rapid prototyping and refinement:

  • You won't need to build a user interface, as Snowflake Intelligence already provides one at https://ai.snowflake.com. Context-awareness is supported out of the box, so users can have multi-turn conversations with follow-up questions without you implementing anything yourself. Charts and graphs are also supported, enabling a rich user experience.
  • You won't have to worry about deploying the various objects, as you can create and edit them directly in Snowsight.
  • You can leverage the Cortex Analyst capability to generate the initial draft of your semantic definitions, which will save you considerable time compared to building these from scratch.
  • You can add a Cortex Search service, which allows you to ask questions about unstructured data, such as PDF documents residing in your data lake, or to implement categorical-value dimensions (see the sketch after this list).
  • The interface allows you to issue prompts to Snowflake Intelligence side by side with the object editor, so you can test and refine your semantics in real time. You can even supply your semantic models with expressions and verified queries for complex query patterns where hand-written SQL is unavoidable, right in the UI.
  • You can configure how the Cortex Agent orchestrates all the tools you have made available to it: Cortex Analyst and Cortex Search, as well as any custom tools you choose to provide.
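To make the Cortex Search bullet above concrete, here is a minimal sketch of standing up a search service over the distinct values of a hypothetical cancellation-reason column, which is how you enable a categorical dimension. The names are ours for illustration; the statement shape follows the standard CREATE CORTEX SEARCH SERVICE syntax.

```sql
-- Sketch: a Cortex Search service over hypothetical categorical values.
CREATE OR REPLACE CORTEX SEARCH SERVICE cancellation_reason_search
  ON reason_text                -- the column to index for search
  WAREHOUSE = si_wh             -- warehouse used to build and refresh the index
  TARGET_LAG = '1 day'          -- how fresh the index must stay
  AS (
    SELECT DISTINCT cancellation_reason AS reason_text
    FROM subscriptions
  );
```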

As a result, you will be able to stand up your initial implementation in days rather than weeks and demonstrate results that can help you pursue executive buy-in and commitment towards broader-scope implementations.

Budget for Improvement Iterations

Data governance is not a sexy topic in most organizations. Maintaining a data catalog, or even clear definitions of what data objects represent in the business realm, is often overlooked in favour of the next feature the business wants. Wading into agentic AI, however, you will find that you need to spend the time to close some of these gaps. The less your data is annotated and documented, the more time you will spend refining your semantic definitions, and the more you will depend on business subject matter experts who can help you fill in the blanks. We ran squarely into this challenge in our experiment. It could be because this was our first attempt, or because we lacked a well-defined data catalog, or both.

For us as data engineers, this was a shift into a new realm. Once we had the initial draft generated by Cortex Analyst, we found ourselves going through multiple iterations of refinement to arrive at definitions that Cortex Analyst would interpret correctly. You might not expect it at first, but Cortex Analyst needs to know precisely what you mean when you say "trial user", "customer", "subscriber", "recipient", or "lapsed account"; how those terms relate to each other and to which data objects; and what search parameters it needs to return what you are asking for. Enter Cortex Search, which can not only search unstructured data but also power the categorical dimensions that solve exactly this kind of challenge.

Luckily, Snowflake Intelligence exposes all the details of its reasoning and the queries it uses to fetch data, right in the UI, making iterations quick and easy. For each prompt, it lists the steps it goes through as it interprets the request and reasons about it, the tools the Cortex Agent selects to fetch the data it needs, the queries it runs, and any subsequent refinement steps, before finally providing its response.

Plan for the Future

Once you have demonstrated the value Snowflake Intelligence can deliver and have secured commitment for an expansion of the implementation scope, you can start planning for your production-ready implementation. Take everything that you have already learned in your POC and recognize that you will have some decisions to make.

Programmatic Approach

Snowflake has enabled control of most Snowflake Intelligence components through a comprehensive REST API, allowing you to take a code-first, programmatic, code-versioned, SDLC-anchored approach and leverage your environment's infrastructure, with its CI/CD pipelines and deployment processes. This sounds great, but it may require you to build a framework to interact with the API and manipulate the target objects; make sure you budget the time and effort for that work.

In the long run, Snowflake has proposed the Open Semantic Interchange (OSI) initiative, which would enable the integration of Snowflake Intelligence with the rich tooling ecosystem of its partners, including notable companies such as Atlan, BlackRock, dbt Labs, Hex, Salesforce, Sigma, and ThoughtSpot. You will need to weigh the risk of waiting for this ecosystem to consolidate and become fully integrated against the engineering investment and the value you could derive from agentic AI today.

Semantic Models vs. Semantic Views

Both semantic models and semantic views serve to provide the LLMs behind Snowflake Intelligence with the semantic meaning of your data objects. They enable the Cortex Agent to translate plain-language questions into SQL and to interpret the results for human consumption. They capture data modelling concepts, such as facts and dimensions, and encapsulate object relationships. Without them, agentic AI over your data is not possible.

Semantic models are simply YAML files deployed to a Snowflake stage. They have been available for some time now, and there may already be tooling that lets you carry an existing semantic implementation over to Snowflake. For example, Snowflake has open-sourced tooling that converts dbt Semantic Layer definitions into semantic models.

Snowflake, however, has raised the bar by introducing semantic views, which are intended to replace semantic models in the long run. Semantic views are first-class Snowflake objects with the complete set of platform features, including access control and, most importantly, the ability to query them like a regular SQL view. Yes, you can issue a SELECT statement, even though it may not look quite like what you expect; see Querying semantic views in the Snowflake Documentation for more information.

The breakthrough of semantic views is that they provide a semantic paradigm for accessing data, abstracting away the actual underlying Snowflake objects. You don't specify tables and columns in your queries; instead, you query metrics and facts, with dimensional groupings applied automatically. Even more powerful is their promise of a unified view of your data across BI and AI tools, once BI tools catch up. Having BI and AI speak the same language and consistently deliver the same results is no small feat, as anyone who has had to align output from different tools will know.
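
To make this concrete, here is a minimal sketch of a semantic view over a hypothetical subscriptions table, followed by a query against it. The object and column names are ours; the CREATE SEMANTIC VIEW and SEMANTIC_VIEW() constructs follow the Snowflake documentation.

```sql
-- Sketch: a semantic view over a hypothetical subscriptions table.
CREATE OR REPLACE SEMANTIC VIEW subscriber_analysis
  TABLES (
    subscriptions AS analytics.public.subscriptions
      PRIMARY KEY (subscription_id)
  )
  FACTS (
    subscriptions.monthly_price AS monthly_price
  )
  DIMENSIONS (
    subscriptions.start_month AS DATE_TRUNC('month', start_date)
  )
  METRICS (
    subscriptions.monthly_recurring_revenue AS SUM(subscriptions.monthly_price)
  );

-- You query metrics and dimensions, not tables and columns.
SELECT * FROM SEMANTIC_VIEW(
  subscriber_analysis
  METRICS monthly_recurring_revenue
  DIMENSIONS start_month
)
ORDER BY start_month;
```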

Semantic views are relatively new and may not yet offer full parity with the functionality of semantic models, although this is likely to change in the near future. Tooling support may be lacking in these early days as well; however, the OSI initiative holds considerable promise.

Security Considerations

Data masking and row-level access control are enforced; however, as with any other object, pay particular attention to users who have been granted roles with elevated privileges and have secondary roles enabled. As a general rule, it is best to set a user's lowest-privilege role as their default role.
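
As a minimal sketch with a hypothetical user, this is how you would pin the default role to the lowest-privilege role and disable secondary roles:

```sql
-- Hypothetical user: default to the lowest-privilege role and disable
-- secondary roles so sessions don't silently pick up elevated grants.
ALTER USER analyst_user SET DEFAULT_ROLE = reporting_ro;
ALTER USER analyst_user SET DEFAULT_SECONDARY_ROLES = ();
```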

Controlling for Cost

As usual with Snowflake, monitoring and controlling cost is straightforward. Plan to implement dedicated virtual warehouses for your Snowflake Intelligence implementations, and assign resource monitors that issue alerts and/or suspend the warehouse(s) at your predefined thresholds.
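
A minimal sketch of that setup, with illustrative names, sizes, and thresholds:

```sql
-- Dedicated warehouse for Snowflake Intelligence workloads.
CREATE OR REPLACE WAREHOUSE si_wh
  WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;

-- Resource monitor: alert at 80% of quota, suspend at 100%.
CREATE OR REPLACE RESOURCE MONITOR si_monitor
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE si_wh SET RESOURCE_MONITOR = si_monitor;
```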

Additionally, review the 'Understanding Cost for Cortex Search Services' section of the Snowflake Documentation, and budget for usage of the Cortex Agents REST API, which is billed as outlined there as well.

Improving Accuracy

You will find that, as long as your semantic definitions are accurate and your data is well organized, the model does a great job of giving you the results you want. However, you will often encounter obscure, domain-specific, or legacy-bound business rules that require complex queries to express. That is where expressions, metrics, and verified queries help give Snowflake Intelligence the context it needs to deliver accurate results.

Also consider the orchestration instructions you can add to your Cortex Agent(s). They let you set parameters on how the agent's tools work together, as well as pass system prompts to the underlying LLM to control the output format.
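
As a purely hypothetical illustration (the wording is ours, not from the engagement), an agent's instructions might read something like:

```
Use Cortex Analyst for questions about subscriber counts, revenue, and
conversions. Use Cortex Search when the user mentions cancellation reasons
or other free-form categorical terms. Always report revenue rounded to the
nearest dollar, and present monthly trends as a table.
```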

Conclusion

Our first experience with Snowflake Intelligence left us both impressed and inspired. Even in its preview stage, the platform demonstrated remarkable potential to make AI-powered data interaction truly practical, transforming how organizations make data-driven decisions. From rapid prototyping in Snowsight to the semantic depth of Cortex components, we found ourselves working within a system that makes AI approachable, powerful, and deeply integrated into the data ecosystem we already know.

Of course, as with any new technology, there are caveats. Data quality, semantic precision, and governance all become even more critical in an agentic AI context, and organizations must plan accordingly. Yet, the lessons from our experiment — start small, iterate fast, engage domain experts early, and embrace Snowflake's evolving toolset — point to a clear path forward. With thoughtful implementation, Snowflake Intelligence has the potential to lay the foundation for a truly data-empowered enterprise.

I’m Ivaylo Boiadjiev, Practice Manager, Data Engineering at Infostrux Solutions. You can follow me on LinkedIn.

Subscribe to Infostrux’s Blog at https://blog.infostrux.com for the most interesting Data Engineering and Snowflake news. Follow Infostrux’s open-source efforts through GitHub.

Explore also how Infostrux helped WHOOP implement an agentic chatbot for their analytics. Read the case study here.