How to Use Snowflake Cortex for SQL Generation


Most teams treat SQL generation like a magic trick. You type a question and get a query.

But here’s the reality: Snowflake Cortex Analyst only works as well as the semantic model you build first, and that setup isn’t trivial. Once you learn how to use Snowflake Cortex for SQL generation, your data team can transform natural language into complex, executable queries in seconds.

This guide walks you through the actual implementation process, from defining your YAML semantic model to querying your data warehouse using natural language, so you understand both the power and the prerequisites before you start.

We also look at where Snowflake Cortex falls short and how ClickUp can support the broader workflows that surround SQL generation.

Summarize this article with AI ClickUp Brain not only saves you precious time by instantly summarizing articles, it also leverages AI to connect your tasks, docs, people, and more, streamlining your workflow like never before.
ClickUp Brain
Avatar of person using AI Summarize this article for me please

What Is Snowflake Cortex Analyst?

Snowflake Cortex Analyst is a fully managed service that allows you to build conversational applications over your analytical data.

It uses a specialized text-to-SQL agent to turn natural language questions into accurate, executable queries. This service bridges the gap between complex data structures and business users who need answers without writing code.

Key capabilities include:

  • Providing a high-accuracy interface for interacting with structured data
  • Using semantic models to understand your specific business logic and terminology
  • Offering a REST API for easy integration into custom applications or BI tools
  • Maintaining data privacy by processing requests within the Snowflake security boundary

📮 ClickUp Insight: 88% of our survey respondents use AI for their personal tasks, yet over 50% shy away from using it at work. The three main barriers? Lack of seamless integration, knowledge gaps, or security concerns.

But what if AI is built into your workspace and is already secure? ClickUp Brain, ClickUp’s built-in AI assistant, makes this a reality. It understands prompts in plain language, solving all three AI adoption concerns while connecting your chat, tasks, docs, and knowledge across the workspace.

Find answers and insights with a single click!


Prerequisites for Snowflake Cortex SQL Generation

Jumping into Snowflake Cortex without the right setup leads to frustration. You might get inaccurate results, waste time troubleshooting, and wrongly conclude the tool is broken when the real issue is a weak foundation.

To avoid this, you need to get three foundational elements in place first.

1. Set up your database and tables

Your AI is only as smart as the data it can access. If your database schema is a maze of cryptic column names like cust_dat_v2_final, both your analysts and the AI will struggle to make sense of it.

This confusion leads to the AI generating incorrect joins or pulling from the wrong columns, and your team wastes hours just trying to decipher the schema before they can even write a query.

Start by ensuring your data warehouse contains the tables you want Cortex Analyst to query. Whenever possible, use clear, descriptive column names. For example, a column named customer_lifetime_value is far more intuitive for both humans and AI than clv_01.

To proceed with the setup, your Snowflake role will need the following permissions:

  • USAGE: On the database and schema containing your tables
  • SELECT: On the tables you want Cortex Analyst to query
  • CREATE STAGE: On the schema, which is required to upload your semantic model file
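Granting those privileges to a role might look like the following sketch (the role, database, and schema names here are hypothetical placeholders):

```sql
-- Hypothetical role, database, and schema names
GRANT USAGE ON DATABASE sales_db TO ROLE analyst_role;
GRANT USAGE ON SCHEMA sales_db.public TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO ROLE analyst_role;
GRANT CREATE STAGE ON SCHEMA sales_db.public TO ROLE analyst_role;
```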

2. Create your semantic model file

The biggest hurdle with any text-to-SQL tool is that the AI doesn’t speak your company’s unique language. It doesn’t inherently know that “ARR” means “Annual Recurring Revenue” or that your customers’ table joins to your orders table on the customer_id field.

Without this context, the AI might generate SQL that is technically valid but logically wrong, giving you answers that look correct but are dangerously misleading.

The semantic model is the solution. It’s a YAML file that acts as your custom “translation layer,” teaching Cortex Analyst the specific vocabulary and logic of your business. Building and maintaining this file is a collaborative effort between data engineers who know the schema and business analysts who know the terminology.

Your semantic model file should contain these key components:

  • Tables: Lists each table with a plain-language description of its purpose
  • Columns: Defines each column’s semantic type (like category or metric) and can include sample values
  • Relationships: Specifies how tables connect through joins, removing any guesswork for the AI
  • Verified queries: Provides example question-and-SQL pairs that act as powerful guides for the LLM
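A minimal sketch of how those components fit together in the YAML file is shown below. The keys follow the general shape of Snowflake’s semantic model specification, but treat the exact field names as assumptions to verify against the current docs; all table and column names are hypothetical:

```yaml
name: sales_analytics
tables:
  - name: orders
    description: One row per customer order, including amount and status.
    base_table:
      database: sales_db
      schema: public
      table: orders
    dimensions:
      - name: region
        expr: region
        data_type: varchar
        sample_values: ["EMEA", "APAC", "AMER"]
    measures:
      - name: total_revenue
        description: Sum of order amounts.
        expr: order_amount
        data_type: number
        default_aggregation: sum
relationships:
  - name: orders_to_customers
    left_table: orders
    right_table: customers
    join_type: left_outer
    relationship_type: many_to_one
    relationship_columns:
      - left_column: customer_id
        right_column: customer_id
verified_queries:
  - name: revenue_by_region
    question: What was total revenue by region?
    sql: SELECT region, SUM(order_amount) AS total_revenue FROM orders GROUP BY region
```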

3. Configure Cortex Search Service (optional)

Sometimes, the answers you need are hidden in unstructured text, like product descriptions, support tickets, or call transcripts. Standard SQL queries can’t touch this data, which means you’re often missing the ‘why’ behind the ‘what.’

You can optionally add Snowflake Cortex Search Service here. It’s a search-as-a-service layer that lets you query your structured tables and your unstructured text data at the same time.

You should configure Cortex Search if your analysts need to ask questions that require pulling context from text before generating SQL. For example, you could first search for all product reviews containing the phrase ‘battery issue’ and then generate a SQL query to aggregate the sales data for only those products.

For pure SQL generation against structured tables, this service isn’t necessary.

🧠 Fun Fact: In the early 1970s, IBM researchers Donald Chamberlin and Raymond Boyce created ‘Structured English Query Language.’ They had to change the name to SQL because ‘SEQUEL’ was already trademarked by a British aircraft company.


Step-by-Step Guide to Generate SQL with Cortex Analyst

You’ve done the prep work, but now you’re facing a blank screen, unsure of the actual workflow. How do you go from a question in your head to an executable SQL query? When the workflow isn’t clear, new tools often go unused, and the investment in setup is wasted.

The hands-on process is refreshingly straightforward. Here’s a closer look!

Step #1: Prepare your data in Snowflake

Before anything else, your structured data needs to live inside Snowflake. Each Cortex Analyst application is pointed at either a single table or a view composed of one or more tables. Make sure your tables are created and populated.

If you’re loading from flat files:

  • Upload your data files (e.g., CSVs) to a Snowflake Stage
  • Use the COPY INTO command to load data from the stage into your tables
  • Verify the data loaded successfully before moving on
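For example, loading a CSV into an orders table might look like the sketch below (stage, file, and table names are hypothetical; PUT runs from a client such as SnowSQL and compresses the file to .gz by default):

```sql
CREATE STAGE IF NOT EXISTS raw_data;

-- Client-side upload; PUT auto-compresses the file to orders.csv.gz
PUT file:///tmp/orders.csv @raw_data;

COPY INTO orders
  FROM @raw_data/orders.csv.gz
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Verify the load before moving on
SELECT COUNT(*) FROM orders;
```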

Step #2: Build a semantic model (or semantic view)

This is the most critical setup step. Cortex Analyst’s power comes from combining large language models (LLMs) with a semantic model: a YAML file that sits alongside your database schema and encodes business context.

Semantic Views are now Snowflake’s recommended method for Cortex Analyst. They store business metrics, relationships, and definitions directly inside Snowflake. Legacy YAML semantic model files still work, but Snowflake directs new implementations toward Semantic Views.

Your semantic model or view should include:

  • Table and column descriptions: Plain-language explanations of what each field means
  • Business metrics: Definitions for calculated fields like revenue, churn, or conversion rate
  • Filters and synonyms: Alternate terms users might use (e.g., ‘cancelled’ mapped to a specific status value)
  • Verified queries: Snowflake’s Verified Query Repository stores approved question-and-SQL pairs. When a user question resembles one of those entries, Cortex Analyst can reference it during SQL generation

🤝 Friendly Reminder: Snowflake suggests using no more than 10 tables and no more than 50 selected columns for optimal performance in the Snowsight workflow.

Step #3: Upload the semantic model to a Snowflake stage

If you’re using a YAML-based semantic model, it needs to be staged so Cortex Analyst can reference it at runtime.

  • Upload your .yaml file to a Snowflake internal stage (e.g., RAW_DATA)
  • Confirm the file appears in the stage via the Snowsight UI or a LIST @stage_name command
  • Note the stage path; you’ll reference it in your API calls or app configuration
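From SnowSQL, staging the file might look like this (the RAW_DATA stage name reuses the example above; AUTO_COMPRESS is disabled so the file keeps its plain .yaml extension for Cortex Analyst to reference):

```sql
PUT file:///local/path/semantic_model.yaml @raw_data AUTO_COMPRESS = FALSE;

-- Confirm the file is in the stage
LIST @raw_data;
```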

If you’re using a Semantic View, this step is handled natively within Snowflake, and no separate upload is needed.

🔍 Did You Know? NULL in SQL does not mean zero or empty. It represents unknown or missing data, which leads to unintuitive behavior like comparisons that return neither true nor false.

Step #4: Send a natural language question via the REST API

Now the actual SQL generation begins. The REST API generates a SQL query for a given question using a semantic model or semantic view provided in the request.

Structure your API request with:

  • messages: an array containing your user question with role: “user”
  • A reference to your semantic model or semantic view
  • Your preferred model (or leave it as auto for Cortex to select the best one)

Multi-turn conversations are supported, so you can ask follow-up questions that build on previous queries.
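A minimal sketch of that request body in Python, assuming the documented /api/v2/cortex/analyst/message endpoint; the account URL, auth token, and stage path are placeholders, and the field names should be verified against the current API reference:

```python
import json

def build_analyst_request(question: str, semantic_model_file: str) -> dict:
    """Build a Cortex Analyst request body (field names per the documented
    message format; treat them as assumptions to verify)."""
    return {
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": question}]}
        ],
        # Stage path to the YAML model; a semantic view is referenced
        # through a different field per the docs.
        "semantic_model_file": semantic_model_file,
    }

payload = build_analyst_request(
    "What was total revenue by region last quarter?",
    "@raw_data/semantic_model.yaml",
)
body = json.dumps(payload)

# POST this body to
#   https://<account>.snowflakecomputing.com/api/v2/cortex/analyst/message
# with an Authorization header carrying a valid Snowflake token.
```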

Step #5: Parse the API response

Each message in a response can contain multiple content blocks of different types. Three values are currently supported for the type field: SQL, text, and suggestions.

Here’s what each type means:

  • SQL: Cortex successfully generated a query; this is what you’ll execute
  • text: A natural language explanation or answer accompanying the SQL
  • suggestions: Included only when the question was too ambiguous for Cortex Analyst to return a SQL statement. Use these to clarify or refine the question
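The three block types can be split out with a small parser. The statement and suggestions field names below are assumptions based on the documented response shape, and the sample payload is illustrative:

```python
def parse_analyst_response(response: dict) -> dict:
    """Split an Analyst message's content blocks by type."""
    out = {"sql": None, "text": [], "suggestions": []}
    for block in response.get("message", {}).get("content", []):
        if block.get("type") == "sql":
            out["sql"] = block.get("statement")
        elif block.get("type") == "text":
            out["text"].append(block.get("text"))
        elif block.get("type") == "suggestions":
            out["suggestions"].extend(block.get("suggestions", []))
    return out

# Illustrative response payload
sample = {
    "message": {
        "role": "analyst",
        "content": [
            {"type": "text", "text": "Revenue by region for Q3:"},
            {"type": "sql",
             "statement": "SELECT region, SUM(order_amount) FROM orders GROUP BY region"},
        ],
    }
}
parsed = parse_analyst_response(sample)
```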

🔍 Did You Know? The order you write SQL is not the order it runs. Even though you write SELECT first, databases actually process FROM and WHERE before selecting columns. This confuses beginners and experienced users alike.

Step #6: Execute the generated SQL in Snowflake

Once you have the SQL block from the response, run it against your Snowflake virtual warehouse to produce the final output. Data stays within Snowflake’s governance boundary.

Key things to know at execution time:

  • Cortex Analyst fully integrates with Snowflake’s role-based access control (RBAC) policies, ensuring that SQL queries generated and executed adhere to all established access controls
  • If a user lacks access to a table, the query will fail at execution, just as it would with hand-written SQL
  • Warehouse compute costs apply at this stage, separate from Cortex Analyst’s own usage charges
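Execution itself is ordinary Snowflake SQL. A small wrapper is sketched below against a stub connection, because a real one (from snowflake.connector.connect) needs credentials; the stub exists only to illustrate the flow:

```python
def run_generated_sql(conn, sql: str):
    """Run Analyst-generated SQL on a DB-API style connection.

    RBAC applies at this point: if the caller's role lacks SELECT on a
    referenced table, execute() fails just as it would for hand-written SQL.
    """
    cur = conn.cursor()
    try:
        cur.execute(sql)
        return cur.fetchall()
    finally:
        cur.close()

# Stub standing in for a real Snowflake connection, for illustration only.
class _StubCursor:
    def execute(self, sql):
        self.last_sql = sql
    def fetchall(self):
        return [("EMEA", 1200)]
    def close(self):
        pass

class _StubConn:
    def cursor(self):
        return _StubCursor()

rows = run_generated_sql(
    _StubConn(), "SELECT region, SUM(order_amount) FROM orders GROUP BY region"
)
```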

Step #7: Refine and iterate

Getting a perfect query on the first try isn’t always guaranteed. Here’s how to improve results over time:

  • Add verified queries to your semantic model for questions that come up repeatedly
  • Enrich your semantic model with better descriptions, synonyms, and filters when Cortex misinterprets a term
  • Use multi-turn conversation to follow up on previous queries, for example, ‘Now filter that by region’
  • Monitor usage via CORTEX_ANALYST_USAGE_HISTORY and Snowflake query history to spot patterns in failed or inaccurate queries
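A periodic check of recent Analyst activity might look like the sketch below. The assumption that the view surfaces under SNOWFLAKE.ACCOUNT_USAGE, and its exact columns, should be verified against the current documentation:

```sql
-- Assumption: the usage view lives in the ACCOUNT_USAGE share;
-- confirm the exact location and columns in the Snowflake docs.
SELECT *
FROM snowflake.account_usage.cortex_analyst_usage_history
LIMIT 100;
```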

🧠 Fun Fact: One missing JOIN condition can create massive problems. Forgetting a join condition can produce a Cartesian product, multiplying rows dramatically and sometimes crashing systems.


Best Practices for Snowflake Text-to-SQL Accuracy

The quality of your semantic model directly determines the accuracy of the queries it generates. Here are the best practices that improve accuracy. 🛠️

  • Add verified queries to your semantic model: This is the most impactful thing you can do. Include plenty of example question-SQL pairs that reflect how your team genuinely asks questions
  • Use descriptive column and table names: The model performs better when column and table names are self-explanatory. If you can’t change the schema, add clear descriptions in your YAML file for any cryptic column names
  • Include sample values: Adding example data for categorical columns (like status or region) helps the model understand the valid filter options available
  • Test with edge cases: During development, intentionally ask ambiguous or tricky questions to identify where your semantic model needs more context or clarification
  • Iterate on your semantic model: Treat your semantic model as a living document. It should be continuously updated through an iterative process based on which queries succeed and which ones fail

ClickUp: A Simpler Alternative to Snowflake Cortex

Snowflake Cortex works well when teams want to generate SQL and run queries across structured data. Teams define schemas, map relationships, and write queries to extract insights. That setup makes sense for data-heavy environments, especially when analysts own reporting.

Many teams, however, don’t need a full SQL layer to answer everyday operational questions. Product managers, program leads, and operations teams often want quick answers tied to active work. 

ClickUp offers a more accessible path. Teams ask questions in plain language, review live dashboards, and act on insights without writing SQL or building semantic models.

Generate and refine SQL faster

Snowflake Cortex focuses on generating SQL queries from structured datasets inside a warehouse environment. That works well when your data already lives in Snowflake, and you have schemas mapped out.

Refine SQL queries and QA tasks using ClickUp Brain

ClickUp Brain supports SQL generation in a more flexible, execution-focused way. Teams generate, refine, and store SQL queries directly inside their workspace where analysis, discussions, and decisions already happen.

Suppose a product analyst works on a retention analysis task inside ClickUp. Instead of switching tools to write queries, they ask ClickUp Brain:

📌 Try this prompt: Write a SQL query to calculate seven-day retention for users grouped by signup cohort.

ClickUp Brain generates a structured query that includes cohort grouping, date filters, and retention logic. The analyst pastes the query into Snowflake or another warehouse and runs it immediately.
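The generated query might resemble this illustrative sketch, where the users and events tables and their signup_date and event_date columns are hypothetical, and “retained” means the user was active exactly seven days after signup:

```sql
-- Illustrative schema: users(user_id, signup_date), events(user_id, event_date)
SELECT
    DATE_TRUNC('week', u.signup_date) AS signup_cohort,
    COUNT(DISTINCT u.user_id) AS cohort_size,
    COUNT(DISTINCT e.user_id) AS retained_day_7,
    COUNT(DISTINCT e.user_id) / COUNT(DISTINCT u.user_id) AS day_7_retention
FROM users u
LEFT JOIN events e
    ON e.user_id = u.user_id
   AND e.event_date = DATEADD('day', 7, u.signup_date)
GROUP BY 1
ORDER BY 1;
```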

It helps:

  • Write joins across multiple tables, such as users, orders, and events
  • Convert plain English product questions into SQL logic ready for execution
  • Debug broken queries and explain issues, such as incorrect joins or missing conditions
  • Rewrite queries for better performance or readability

For example, during a growth experiment review, a marketer asks: ‘Write a SQL query to compare conversion rates between two landing pages over the last 14 days’.

ClickUp Brain generates the query using conditional aggregation and date filters. The team runs it in Snowflake and validates experiment results.

📌 Try this prompt: Fix this SQL query where the join duplicates rows and explain the issue.

ClickUp Brain identifies the join problem, corrects the query, and explains how duplicate rows occurred due to incorrect join conditions.

Replace SQL-driven reporting

Visualize project data using ClickUp Dashboards without SQL queries

Snowflake Cortex workflows often involve generating SQL, running queries, and visualizing results in a separate layer. ClickUp Dashboards remove that multi-step process and present insights directly from live work.

A program management team tracking release readiness can build a dashboard without writing queries. For instance, a release dashboard may include:

  • A task list card filtered to show overdue tasks across all product teams
  • A workload card that shows task distribution across engineers
  • A bar chart comparing completed versus pending tasks by sprint
  • A calculation card tracking average completion time

Suppose a program lead reviews this dashboard before a release meeting. They immediately see that backend services show higher delay rates. They open the task list card and inspect the exact tasks causing risk.

A real-life ClickUp user shares:

ClickUp allows us to QUICKLY pass projects to each other, EASILY check the status of projects, and gives our supervisor a window into our workload at any time without her having to interrupt us. We have certainly saved one day per week using ClickUp, if not more. The number of emails is SIGNIFICANTLY reduced.

Act on insights without pipelines

Snowflake Cortex focuses on generating insights from data. Teams still need to interpret results and trigger actions separately.

Monitor and act on workspace insights using ClickUp Super Agents

ClickUp AI Super Agents close that gap and turn insights into execution. They operate as AI teammates that monitor workspace data continuously and take action based on conditions.

Suppose a program manager oversees multiple product initiatives. A Super Agent can:

  • Monitor tasks across projects and detect when overdue tasks exceed a defined threshold
  • Identify patterns such as repeated delays in the same workflow stage
  • Create a task that summarizes impacted projects and assigns it to the program lead
  • Notify team owners when critical tasks remain unresolved past deadlines

For example, during a release cycle, a Super Agent detects that more than 10 high-priority tasks missed deadlines across two teams. It creates a ClickUp Task titled ‘Release risk: missed deadlines’, attaches all relevant tasks, and assigns the program manager for immediate review.

Teams can also interact directly with the Super Agent: ‘Analyze all active projects and highlight delivery risks for this sprint’.

The Super Agent reviews deadlines, dependencies, and task status, then posts a structured summary inside the workspace.



Centralize Your Data Workflows With ClickUp

Text-to-SQL tools like Snowflake Cortex make data more accessible. At the same time, getting reliable results still takes effort.

Teams need clean schemas, strong semantic models, and continuous iteration to keep outputs accurate. Even after generating the right query, work doesn’t stop there. Someone still needs to interpret the results, share insights, and turn them into decisions.

ClickUp brings a different approach. Instead of separating analysis from execution, ClickUp connects both. Teams generate SQL, document insights, collaborate on findings, and act on them inside the same workspace.

ClickUp Brain helps write and refine queries, while Dashboards and AI Agents help teams track outcomes and move work forward without jumping across tools.

Snowflake Cortex helps you get answers. ClickUp helps you do something with them. Sign up for ClickUp today!


Frequently Asked Questions

What is Snowflake Cortex Analyst and how does it differ from Cortex AI?

Snowflake Cortex Analyst is a specialized service within the broader Snowflake Cortex AI suite. Cortex Analyst is focused specifically on text-to-SQL generation using semantic models, while Cortex AI includes a wider range of LLM functions, machine learning model inference, and search capabilities.

Does Snowflake Cortex Analyst work with external data sources like Iceberg tables?

Yes, Cortex Analyst can query Apache Iceberg tables that are managed through Snowflake. As long as the tables are accessible within your Snowflake environment and properly defined in your semantic model, you can generate queries against them.

How accurate is Snowflake text-to-SQL generation for complex queries?

The accuracy for complex queries depends almost entirely on the quality of your semantic model. A model with well-defined table relationships, numerous verified queries, and descriptive metadata will produce significantly more accurate results for multi-table joins and complex aggregations.

What does Snowflake Cortex cost for SQL generation features?

Snowflake Cortex Analyst pricing follows Snowflake’s consumption-based model, which means you are billed based on the compute credits used during the query generation process. For the most current rates, you should always refer to Snowflake’s official pricing documentation.
