<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Sagara</title>
    <description>The latest articles on Forem by Sagara (@sagara).</description>
    <link>https://forem.com/sagara</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2915647%2F4dbf205e-b9aa-4a7e-873a-5736d0352296.jpg</url>
      <title>Forem: Sagara</title>
      <link>https://forem.com/sagara</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/sagara"/>
    <language>en</language>
    <item>
      <title>Trying Out Snowflake's Adaptive Warehouse — Auto-Scaling Compute Without Manual Sizing</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Thu, 16 Apr 2026 00:59:22 +0000</pubDate>
      <link>https://forem.com/sagara/trying-out-snowflakes-adaptive-warehouse-auto-scaling-compute-without-manual-sizing-j8n</link>
      <guid>https://forem.com/sagara/trying-out-snowflakes-adaptive-warehouse-auto-scaling-compute-without-manual-sizing-j8n</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;This is an English translation of the original Japanese article:&lt;br&gt;
&lt;a href="https://dev.classmethod.jp/articles/snowflake-try-adaptive-warehouse/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-try-adaptive-warehouse/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Until now, Snowflake warehouses required you to manually select and manage a size (XS, S, M, L, etc.). With &lt;strong&gt;Adaptive Warehouse&lt;/strong&gt;, compute resources automatically scale based on query characteristics. This eliminates the need for manual warehouse sizing and scale-up/scale-out configuration.&lt;/p&gt;

&lt;p&gt;I learned about this feature from &lt;a href="https://x.com/tshowis" rel="noopener noreferrer"&gt;@tshowis&lt;/a&gt;'s post:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://x.com/tshowis/status/2044446136556282293" rel="noopener noreferrer"&gt;https://x.com/tshowis/status/2044446136556282293&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Although release notes haven't been published yet, the official documentation already covers the feature in full. The Public Preview regions include &lt;strong&gt;AWS Tokyo Region (AP Northeast 1)&lt;/strong&gt;, so anyone using the Tokyo region can try it right away. I tested the behavior using TPC-H sample data and summarize the steps and results below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/warehouses-adaptive" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/warehouses-adaptive&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Feature Overview
&lt;/h2&gt;

&lt;p&gt;Adaptive Warehouse is a compute service that recognizes workloads and allocates resources automatically. Each account draws on its own dedicated shared compute pool, and Snowflake determines the optimal resources for each query based on its characteristics.&lt;/p&gt;

&lt;p&gt;Key features include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;No warehouse sizing required&lt;/strong&gt;: No need to manually select or change sizes&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automatic per-query scaling&lt;/strong&gt;: Small queries use fewer resources, large queries automatically use more&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Online conversion&lt;/strong&gt;: Converting existing standard warehouses to Adaptive Warehouses requires no downtime&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Behavior controlled by 2 parameters&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;
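
&lt;p&gt;For the online conversion, a statement along the following lines seems plausible. To be clear, this is my assumption modeled on the existing &lt;code&gt;WAREHOUSE_TYPE&lt;/code&gt; parameter — consult the official documentation linked above for the actual conversion syntax:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Hypothetical sketch: convert an existing standard warehouse in place
-- (the exact clause is an assumption; verify against the docs)
ALTER WAREHOUSE my_standard_wh SET WAREHOUSE_TYPE = ADAPTIVE;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;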

&lt;p&gt;The two controllable parameters are:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Parameter&lt;/th&gt;
&lt;th&gt;Default&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;MAX_QUERY_PERFORMANCE_LEVEL&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;XLARGE&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Upper performance limit per query (XSMALL–X4LARGE)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;QUERY_THROUGHPUT_MULTIPLIER&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;2&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Scale factor for concurrent execution (0 for unlimited, positive integer for cap)&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Setting &lt;code&gt;QUERY_THROUGHPUT_MULTIPLIER&lt;/code&gt; to &lt;code&gt;N&lt;/code&gt; ensures capacity for N queries to run concurrently at the &lt;code&gt;MAX_QUERY_PERFORMANCE_LEVEL&lt;/code&gt; resource level.&lt;/p&gt;
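
&lt;p&gt;Both parameters should also be adjustable on an existing Adaptive Warehouse. The &lt;code&gt;ALTER&lt;/code&gt; form below is an assumption that simply mirrors the &lt;code&gt;CREATE&lt;/code&gt; syntax used later in this article:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Assumed ALTER syntax mirroring CREATE ADAPTIVE WAREHOUSE
-- (verify the parameter names against the official docs)
ALTER WAREHOUSE my_adaptive_wh SET
  MAX_QUERY_PERFORMANCE_LEVEL = LARGE
  QUERY_THROUGHPUT_MULTIPLIER = 4;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;With these values, each query would be capped at LARGE-equivalent resources, with capacity reserved for four such queries to run concurrently.&lt;/p&gt;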

&lt;h3&gt;
  
  
  Limitations
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;As of April 16, 2026, this is a Public Preview feature. Specifications may change before GA&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Available regions (Public Preview)&lt;/strong&gt;: US West 2 (Oregon), EU West 1 (Ireland), &lt;strong&gt;AP Northeast 1 (Tokyo)&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Required edition: &lt;strong&gt;Enterprise Edition or higher&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Conversion from X5LARGE/X6LARGE warehouses to Adaptive Warehouse is not supported&lt;/li&gt;
&lt;li&gt;Conversion to or from Snowpark-optimized warehouses and Interactive warehouses is likewise not supported&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Cost
&lt;/h3&gt;

&lt;p&gt;Adaptive Warehouse uses a query-based billing model. Each query's cost is calculated based on the compute resources and software resources consumed. The total warehouse cost is the sum of all executed query costs.&lt;/p&gt;

&lt;p&gt;Creating a warehouse itself incurs no cost — billing begins when queries are executed. For cost management, you can use Budgets, Resource Monitors, and views in &lt;code&gt;SNOWFLAKE.ACCOUNT_USAGE&lt;/code&gt; for monitoring.&lt;/p&gt;
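
&lt;p&gt;For example, per-query credit consumption can be checked in the &lt;code&gt;QUERY_ATTRIBUTION_HISTORY&lt;/code&gt; view. The view and columns below are standard &lt;code&gt;ACCOUNT_USAGE&lt;/code&gt; objects (the warehouse name is a placeholder, and whether Adaptive Warehouse queries appear there during the preview is my assumption):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Recent per-query compute credits for one warehouse
-- (ACCOUNT_USAGE views can lag by a few hours)
SELECT query_id,
       warehouse_name,
       credits_attributed_compute,
       start_time
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_ATTRIBUTION_HISTORY
WHERE warehouse_name = 'MY_ADAPTIVE_WH'
ORDER BY start_time DESC
LIMIT 10;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;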

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Snowflake&lt;/strong&gt;: AWS US Oregon region, Enterprise Edition

&lt;ul&gt;
&lt;li&gt;I happened to have a trial account, so I tested on the US Oregon region&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  Preparation
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Verify TPC-H Sample Data
&lt;/h3&gt;

&lt;p&gt;For this test, I used the TPC-H sample data provided by default in Snowflake. TPC-H is a decision support benchmark dataset well-suited for evaluating large-scale data processing with complex joins and aggregations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/sample-data-tpch" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/sample-data-tpch&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Multiple schemas with different scale factors (SF) are available in the &lt;code&gt;SNOWFLAKE_SAMPLE_DATA&lt;/code&gt; database:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Schema&lt;/th&gt;
&lt;th&gt;Approximate Data Scale&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;TPCH_SF1&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Millions of records&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;TPCH_SF10&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Tens of millions of records&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;TPCH_SF100&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Hundreds of millions of records&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;TPCH_SF1000&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Billions of records&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;To clearly observe the automatic scaling behavior, I used &lt;strong&gt;&lt;code&gt;TPCH_SF1000&lt;/code&gt;&lt;/strong&gt; (billions of records).&lt;/p&gt;

&lt;p&gt;First, verify that you can access the sample data:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Verify sample data access&lt;/span&gt;
&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;SNOWFLAKE_SAMPLE_DATA&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;TPCH_SF1000&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Check table list&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see 8 tables (&lt;code&gt;LINEITEM&lt;/code&gt;, &lt;code&gt;ORDERS&lt;/code&gt;, &lt;code&gt;CUSTOMER&lt;/code&gt;, &lt;code&gt;SUPPLIER&lt;/code&gt;, &lt;code&gt;PART&lt;/code&gt;, &lt;code&gt;PARTSUPP&lt;/code&gt;, &lt;code&gt;NATION&lt;/code&gt;, &lt;code&gt;REGION&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9vqdvepod7bpd7vjgk10.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9vqdvepod7bpd7vjgk10.png" alt="2026-04-16_08h10_03" width="800" height="512"&gt;&lt;/a&gt;&lt;/p&gt;
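
&lt;p&gt;To get a feel for the scale, a quick count of the largest table (Snowflake answers an unfiltered &lt;code&gt;COUNT(*)&lt;/code&gt; from table metadata, so it returns almost instantly even at SF1000):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Unfiltered COUNT(*) is served from metadata, no scan needed
SELECT COUNT(*) AS lineitem_rows
FROM SNOWFLAKE_SAMPLE_DATA.TPCH_SF1000.LINEITEM;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This should report roughly 6 billion rows.&lt;/p&gt;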

&lt;h3&gt;
  
  
  Disable Account-Level Cache
&lt;/h3&gt;

&lt;p&gt;To accurately compare execution times before and after parameter tuning, disable the query result cache. With caching enabled, subsequent identical queries would be served from cache, making execution time comparison impossible.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/sql-reference/parameters#use-cached-result" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/sql-reference/parameters#use-cached-result&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Disable caching at the account level:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;ACCOUNTADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Disable query result cache at account level&lt;/span&gt;
&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;ACCOUNT&lt;/span&gt; &lt;span class="k"&gt;SET&lt;/span&gt; &lt;span class="n"&gt;USE_CACHED_RESULT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;false&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Verify the parameter is set to &lt;code&gt;false&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Confirm the setting&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="k"&gt;PARAMETERS&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'%USE_CACHED_RESULT%'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwxyzijzcp3ho76miuqze.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwxyzijzcp3ho76miuqze.png" alt="2026-04-16_07h45_15" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;⚠️ If you're using a production account, don't forget to re-enable caching after testing:&lt;/p&gt;


&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;ACCOUNTADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;ACCOUNT&lt;/span&gt; &lt;span class="k"&gt;SET&lt;/span&gt; &lt;span class="n"&gt;USE_CACHED_RESULT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Hands-On Testing
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. Create an Adaptive Warehouse (XLARGE Setting)
&lt;/h3&gt;

&lt;p&gt;First, create a new Adaptive Warehouse with &lt;code&gt;MAX_QUERY_PERFORMANCE_LEVEL = XLARGE&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;SYSADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Create Adaptive Warehouse with XLARGE setting&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="n"&gt;ADAPTIVE&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="n"&gt;adaptive_wh_xlarge&lt;/span&gt;
  &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;MAX_QUERY_PERFORMANCE_LEVEL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;XLARGE&lt;/span&gt;
       &lt;span class="n"&gt;QUERY_THROUGHPUT_MULTIPLIER&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Verify the configuration:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Check warehouse settings&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSES&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'adaptive_wh_xlarge'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Confirm that the &lt;code&gt;type&lt;/code&gt; column shows &lt;code&gt;ADAPTIVE&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwrz00t5mkx5zy4hsmqb6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwrz00t5mkx5zy4hsmqb6.png" alt="2026-04-16_08h52_02" width="800" height="339"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Run TPC-H Queries with XLARGE Setting
&lt;/h3&gt;

&lt;p&gt;Run representative TPC-H queries using &lt;code&gt;adaptive_wh_xlarge&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Set &lt;code&gt;QUERY_TAG&lt;/code&gt; so we can compare execution times later:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Set QUERY_TAG for later comparison&lt;/span&gt;
&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="k"&gt;SESSION&lt;/span&gt; &lt;span class="k"&gt;SET&lt;/span&gt; &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'adaptive_xlarge_m2'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="n"&gt;adaptive_wh_xlarge&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;SNOWFLAKE_SAMPLE_DATA&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;TPCH_SF1000&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  TPC-H Q1: Aggregation Query (Full LINEITEM Table Scan)
&lt;/h4&gt;

&lt;p&gt;First, run TPC-H Q1 (Pricing Summary Report) — one of the simplest TPC-H queries. It performs a full table scan and aggregation on the &lt;code&gt;LINEITEM&lt;/code&gt; table (approximately 6 billion records at SF1000).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- TPC-H Q1: Pricing Summary Report&lt;/span&gt;
&lt;span class="c1"&gt;-- Aggregates the LINEITEM table (~6 billion records at SF1000)&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt;
    &lt;span class="n"&gt;L_RETURNFLAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;L_LINESTATUS&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_QUANTITY&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                       &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;SUM_QTY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                  &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;SUM_BASE_PRICE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;L_DISCOUNT&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;               &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;SUM_DISC_PRICE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;L_DISCOUNT&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;L_TAX&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;SUM_CHARGE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_QUANTITY&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                       &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;AVG_QTY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                  &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;AVG_PRICE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_DISCOUNT&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                       &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;AVG_DISC&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                              &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;COUNT_ORDER&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt;
    &lt;span class="n"&gt;LINEITEM&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt;
    &lt;span class="n"&gt;L_SHIPDATE&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;=&lt;/span&gt; &lt;span class="n"&gt;DATEADD&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;DAY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;90&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;TO_DATE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'1998-12-01'&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt;
    &lt;span class="n"&gt;L_RETURNFLAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;L_LINESTATUS&lt;/span&gt;
&lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt;
    &lt;span class="n"&gt;L_RETURNFLAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;L_LINESTATUS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Results after 5 runs:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Run&lt;/th&gt;
&lt;th&gt;Execution Time (sec)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;4.47&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;4.55&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;3.68&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;3.85&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;3.77&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
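
&lt;p&gt;Incidentally, these timings can be pulled directly from query history using the tag set earlier (&lt;code&gt;TOTAL_ELAPSED_TIME&lt;/code&gt; is reported in milliseconds):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Average elapsed time of all runs under this tag
SELECT query_tag,
       COUNT(*)                       AS runs,
       AVG(total_elapsed_time) / 1000 AS avg_elapsed_sec
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
WHERE query_tag = 'adaptive_xlarge_m2'
  AND query_type = 'SELECT'
GROUP BY query_tag;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;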

&lt;h4&gt;
  
  
  TPC-H Q5: Multi-Table Join Query
&lt;/h4&gt;

&lt;p&gt;Next, run Q5 (Local Supplier Volume), which joins multiple tables:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- TPC-H Q5: Local Supplier Volume&lt;/span&gt;
&lt;span class="c1"&gt;-- Complex query joining 6 tables&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt;
    &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;N_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;L_DISCOUNT&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;REVENUE&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt;
    &lt;span class="n"&gt;CUSTOMER&lt;/span&gt;   &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;ORDERS&lt;/span&gt;     &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;LINEITEM&lt;/span&gt;   &lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;SUPPLIER&lt;/span&gt;   &lt;span class="n"&gt;S&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;NATION&lt;/span&gt;     &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;REGION&lt;/span&gt;     &lt;span class="n"&gt;R&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt;
    &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;C_CUSTKEY&lt;/span&gt;    &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;O_CUSTKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;L_ORDERKEY&lt;/span&gt;  &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;O_ORDERKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;L_SUPPKEY&lt;/span&gt;   &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;S&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;S_SUPPKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;C_NATIONKEY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;S&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;S_NATIONKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;S&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;S_NATIONKEY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;N_NATIONKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;N_REGIONKEY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;R&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;R_REGIONKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;R&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;R_NAME&lt;/span&gt;      &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'ASIA'&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;O_ORDERDATE&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;TO_DATE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'1994-01-01'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;O_ORDERDATE&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;  &lt;span class="n"&gt;DATEADD&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;YEAR&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;TO_DATE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'1994-01-01'&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt;
    &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;N_NAME&lt;/span&gt;
&lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt;
    &lt;span class="n"&gt;REVENUE&lt;/span&gt; &lt;span class="k"&gt;DESC&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Results after 5 runs:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Run&lt;/th&gt;
&lt;th&gt;Execution Time (sec)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;5.74&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;3.17&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;3.03&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;3.03&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;2.83&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  3. Create a Comparison Adaptive Warehouse (SMALL Setting)
&lt;/h3&gt;

&lt;p&gt;Create a separate Adaptive Warehouse with &lt;code&gt;MAX_QUERY_PERFORMANCE_LEVEL = SMALL&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;SYSADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Create Adaptive Warehouse with SMALL setting&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="n"&gt;ADAPTIVE&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="n"&gt;adaptive_wh_small&lt;/span&gt;
  &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;MAX_QUERY_PERFORMANCE_LEVEL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;SMALL&lt;/span&gt;
       &lt;span class="n"&gt;QUERY_THROUGHPUT_MULTIPLIER&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Verify the configuration:&lt;br&gt;
&lt;/p&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Check warehouse settings&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSES&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'adaptive_wh_small'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Confirm that the &lt;code&gt;type&lt;/code&gt; column shows &lt;code&gt;ADAPTIVE&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvguxi8l19bly3cbpu9r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvguxi8l19bly3cbpu9r.png" alt="2026-04-16_08h58_26" width="800" height="311"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Run TPC-H Queries with SMALL Setting
&lt;/h3&gt;

&lt;p&gt;Run the same Q1 and Q5 queries using &lt;code&gt;adaptive_wh_small&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Set QUERY_TAG for SMALL setting runs&lt;/span&gt;
&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="k"&gt;SESSION&lt;/span&gt; &lt;span class="k"&gt;SET&lt;/span&gt; &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'adaptive_small_m2'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="n"&gt;adaptive_wh_small&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;SNOWFLAKE_SAMPLE_DATA&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;TPCH_SF1000&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  TPC-H Q1: Aggregation Query
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Q1 execution (MAX_QUERY_PERFORMANCE_LEVEL = SMALL)&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt;
    &lt;span class="n"&gt;L_RETURNFLAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;L_LINESTATUS&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_QUANTITY&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                       &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;SUM_QTY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                  &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;SUM_BASE_PRICE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;L_DISCOUNT&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;               &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;SUM_DISC_PRICE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;L_DISCOUNT&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;L_TAX&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;SUM_CHARGE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_QUANTITY&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                       &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;AVG_QTY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                  &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;AVG_PRICE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L_DISCOUNT&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                       &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;AVG_DISC&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                              &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;COUNT_ORDER&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt;
    &lt;span class="n"&gt;LINEITEM&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt;
    &lt;span class="n"&gt;L_SHIPDATE&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;=&lt;/span&gt; &lt;span class="n"&gt;DATEADD&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;DAY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;90&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;TO_DATE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'1998-12-01'&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt;
    &lt;span class="n"&gt;L_RETURNFLAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;L_LINESTATUS&lt;/span&gt;
&lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt;
    &lt;span class="n"&gt;L_RETURNFLAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;L_LINESTATUS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Results after 5 runs:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Run&lt;/th&gt;
&lt;th&gt;Execution Time (sec)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;21.57&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;31.73&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;27.73&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;21.74&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;34.35&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h4&gt;
  
  
  TPC-H Q5: Multi-Table Join Query
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Q5 execution (MAX_QUERY_PERFORMANCE_LEVEL = SMALL)&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt;
    &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;N_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;L_EXTENDEDPRICE&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;L_DISCOUNT&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;REVENUE&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt;
    &lt;span class="n"&gt;CUSTOMER&lt;/span&gt;   &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;ORDERS&lt;/span&gt;     &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;LINEITEM&lt;/span&gt;   &lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;SUPPLIER&lt;/span&gt;   &lt;span class="n"&gt;S&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;NATION&lt;/span&gt;     &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;REGION&lt;/span&gt;     &lt;span class="n"&gt;R&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt;
    &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;C_CUSTKEY&lt;/span&gt;    &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;O_CUSTKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;L_ORDERKEY&lt;/span&gt;  &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;O_ORDERKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;L&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;L_SUPPKEY&lt;/span&gt;   &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;S&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;S_SUPPKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;C_NATIONKEY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;S&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;S_NATIONKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;S&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;S_NATIONKEY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;N_NATIONKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;N_REGIONKEY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;R&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;R_REGIONKEY&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;R&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;R_NAME&lt;/span&gt;      &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'ASIA'&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;O_ORDERDATE&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;TO_DATE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'1994-01-01'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;O_ORDERDATE&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;  &lt;span class="n"&gt;DATEADD&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;YEAR&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;TO_DATE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'1994-01-01'&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt;
    &lt;span class="n"&gt;N&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;N_NAME&lt;/span&gt;
&lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt;
    &lt;span class="n"&gt;REVENUE&lt;/span&gt; &lt;span class="k"&gt;DESC&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Results after 5 runs:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Run&lt;/th&gt;
&lt;th&gt;Execution Time (sec)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;42.18&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;38.49&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;38.41&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;38.69&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;36.97&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  5. Compare Execution Times Using INFORMATION_SCHEMA.QUERY_HISTORY
&lt;/h3&gt;

&lt;p&gt;After completing the runs on both warehouses, compare the two &lt;code&gt;QUERY_TAG&lt;/code&gt; values using &lt;code&gt;INFORMATION_SCHEMA.QUERY_HISTORY()&lt;/code&gt;. Unlike the &lt;code&gt;ACCOUNT_USAGE&lt;/code&gt; views, which can lag by up to 45 minutes, this table function can be queried without delay.&lt;/p&gt;
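
&lt;p&gt;One caveat I would add here (my own precaution, not part of the original procedure): the session may still carry the &lt;code&gt;QUERY_TAG&lt;/code&gt; from the last benchmark batch, in which case the comparison queries below — which themselves contain &lt;code&gt;L_RETURNFLAG&lt;/code&gt; and &lt;code&gt;N_NAME&lt;/code&gt; — could match their own filters and pollute the aggregation. Clearing the tag first avoids that:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Clear the session tag so the comparison queries themselves
-- are not picked up by the QUERY_TAG filter
ALTER SESSION UNSET QUERY_TAG;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;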

&lt;h4&gt;
  
  
  View All 20 Individual Run Times
&lt;/h4&gt;

&lt;p&gt;First, check all 20 individual execution records (2 queries × 5 runs × 2 warehouses). Queries whose &lt;code&gt;QUERY_TEXT&lt;/code&gt; contains &lt;code&gt;L_RETURNFLAG&lt;/code&gt; are classified as Q1, and those containing &lt;code&gt;N_NAME&lt;/code&gt; as Q5.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- All 20 execution records (check execution times)&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt;
    &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;WAREHOUSE_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;CASE&lt;/span&gt;
        &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;QUERY_TEXT&lt;/span&gt; &lt;span class="k"&gt;ILIKE&lt;/span&gt; &lt;span class="s1"&gt;'%L_RETURNFLAG%'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'Q1'&lt;/span&gt;
        &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;QUERY_TEXT&lt;/span&gt; &lt;span class="k"&gt;ILIKE&lt;/span&gt; &lt;span class="s1"&gt;'%N_NAME%'&lt;/span&gt;       &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'Q5'&lt;/span&gt;
        &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="s1"&gt;'OTHER'&lt;/span&gt;
    &lt;span class="k"&gt;END&lt;/span&gt;                                       &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;query_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;START_TIME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;ROUND&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;EXECUTION_TIME&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;           &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;exec_sec&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;INFORMATION_SCHEMA&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;QUERY_HISTORY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;END_TIME_RANGE_START&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;DATEADD&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;HOUR&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;CURRENT_TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;()),&lt;/span&gt;
        &lt;span class="n"&gt;END_TIME_RANGE_END&lt;/span&gt;   &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;CURRENT_TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
        &lt;span class="n"&gt;RESULT_LIMIT&lt;/span&gt;         &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE_NAME&lt;/span&gt;   &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'ADAPTIVE_WH_XLARGE'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'ADAPTIVE_WH_SMALL'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt;        &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'adaptive_xlarge_m2'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'adaptive_small_m2'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;EXECUTION_STATUS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'SUCCESS'&lt;/span&gt;
  &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;QUERY_TYPE&lt;/span&gt;       &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'SELECT'&lt;/span&gt;
&lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;START_TIME&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6gjzaadikqkr6r3wj96d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6gjzaadikqkr6r3wj96d.png" alt="2026-04-16_09h15_18" width="800" height="416"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;QUERY_TAG&lt;/th&gt;
&lt;th&gt;WAREHOUSE_NAME&lt;/th&gt;
&lt;th&gt;Query&lt;/th&gt;
&lt;th&gt;Run&lt;/th&gt;
&lt;th&gt;Execution Time (sec)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;4.47&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;4.55&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;3.68&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;3.85&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;3.77&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;5.74&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;3.17&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;3.03&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;3.03&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;2.83&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;21.57&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;31.73&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;27.73&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;21.74&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;34.35&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;42.18&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;38.49&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;38.41&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;38.69&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;36.97&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h4&gt;
  
  
  Aggregated Comparison
&lt;/h4&gt;

&lt;p&gt;Check whether lowering &lt;code&gt;MAX_QUERY_PERFORMANCE_LEVEL&lt;/code&gt; is reflected in &lt;code&gt;execution_time&lt;/code&gt;. I also included &lt;code&gt;queued_overload_time&lt;/code&gt; to observe any queuing trends.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Aggregate execution times by QUERY_TAG × query type for comparison&lt;/span&gt;
&lt;span class="c1"&gt;-- Only target the most recent 5 SELECT statements per QUERY_TAG × query_name&lt;/span&gt;
&lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;ranked&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt;
        &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;WAREHOUSE_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;CASE&lt;/span&gt;
            &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;QUERY_TEXT&lt;/span&gt; &lt;span class="k"&gt;ILIKE&lt;/span&gt; &lt;span class="s1"&gt;'%L_RETURNFLAG%'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'Q1'&lt;/span&gt;
            &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;QUERY_TEXT&lt;/span&gt; &lt;span class="k"&gt;ILIKE&lt;/span&gt; &lt;span class="s1"&gt;'%N_NAME%'&lt;/span&gt;       &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'Q5'&lt;/span&gt;
            &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="s1"&gt;'OTHER'&lt;/span&gt;
        &lt;span class="k"&gt;END&lt;/span&gt;                  &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;query_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;EXECUTION_TIME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;TOTAL_ELAPSED_TIME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;QUEUED_OVERLOAD_TIME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ROW_NUMBER&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="n"&gt;OVER&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="k"&gt;PARTITION&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                         &lt;span class="k"&gt;CASE&lt;/span&gt;
                             &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;QUERY_TEXT&lt;/span&gt; &lt;span class="k"&gt;ILIKE&lt;/span&gt; &lt;span class="s1"&gt;'%L_RETURNFLAG%'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'Q1'&lt;/span&gt;
                             &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;QUERY_TEXT&lt;/span&gt; &lt;span class="k"&gt;ILIKE&lt;/span&gt; &lt;span class="s1"&gt;'%N_NAME%'&lt;/span&gt;       &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'Q5'&lt;/span&gt;
                             &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="s1"&gt;'OTHER'&lt;/span&gt;
                         &lt;span class="k"&gt;END&lt;/span&gt;
            &lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;START_TIME&lt;/span&gt; &lt;span class="k"&gt;DESC&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;rn&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;INFORMATION_SCHEMA&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;QUERY_HISTORY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;END_TIME_RANGE_START&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;DATEADD&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;HOUR&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;CURRENT_TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;()),&lt;/span&gt;
            &lt;span class="n"&gt;END_TIME_RANGE_END&lt;/span&gt;   &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;CURRENT_TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
            &lt;span class="n"&gt;RESULT_LIMIT&lt;/span&gt;         &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE_NAME&lt;/span&gt;   &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'ADAPTIVE_WH_XLARGE'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'ADAPTIVE_WH_SMALL'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt;        &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'adaptive_xlarge_m2'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'adaptive_small_m2'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;EXECUTION_STATUS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'SUCCESS'&lt;/span&gt;
      &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;QUERY_TYPE&lt;/span&gt;       &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'SELECT'&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt;
    &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;WAREHOUSE_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;query_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;                                           &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;run_count&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;ROUND&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;EXECUTION_TIME&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;       &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;        &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;avg_exec_sec&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;ROUND&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;MIN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;EXECUTION_TIME&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;       &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;        &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;min_exec_sec&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;ROUND&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;EXECUTION_TIME&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;       &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;        &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;max_exec_sec&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;ROUND&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;TOTAL_ELAPSED_TIME&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;   &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;        &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;avg_elapsed_sec&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;ROUND&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;QUEUED_OVERLOAD_TIME&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;        &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;avg_queued_overload_sec&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;ranked&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;rn&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;=&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query_name&lt;/span&gt;
&lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;QUERY_TAG&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query_name&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb6qllq5h0tpckexknz7s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb6qllq5h0tpckexknz7s.png" alt="2026-04-16_09h17_51" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here are the same results summarized in table form.&lt;/p&gt;

&lt;p&gt;You can see that &lt;code&gt;adaptive_small_m2&lt;/code&gt; has a consistently longer &lt;code&gt;avg_exec_sec&lt;/code&gt; than &lt;code&gt;adaptive_xlarge_m2&lt;/code&gt;: execution time increases as the performance cap is lowered.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;QUERY_TAG&lt;/th&gt;
&lt;th&gt;WAREHOUSE_NAME&lt;/th&gt;
&lt;th&gt;Query&lt;/th&gt;
&lt;th&gt;Runs&lt;/th&gt;
&lt;th&gt;Avg Exec (sec)&lt;/th&gt;
&lt;th&gt;Min (sec)&lt;/th&gt;
&lt;th&gt;Max (sec)&lt;/th&gt;
&lt;th&gt;Avg Elapsed (sec)&lt;/th&gt;
&lt;th&gt;Avg Queue Wait (sec)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;4.06&lt;/td&gt;
&lt;td&gt;3.68&lt;/td&gt;
&lt;td&gt;4.55&lt;/td&gt;
&lt;td&gt;7.17&lt;/td&gt;
&lt;td&gt;0.00&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_xlarge_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_XLARGE&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;3.56&lt;/td&gt;
&lt;td&gt;2.83&lt;/td&gt;
&lt;td&gt;5.74&lt;/td&gt;
&lt;td&gt;6.86&lt;/td&gt;
&lt;td&gt;0.00&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;27.42&lt;/td&gt;
&lt;td&gt;21.57&lt;/td&gt;
&lt;td&gt;34.35&lt;/td&gt;
&lt;td&gt;25.82&lt;/td&gt;
&lt;td&gt;0.00&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;adaptive_small_m2&lt;/td&gt;
&lt;td&gt;ADAPTIVE_WH_SMALL&lt;/td&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;38.95&lt;/td&gt;
&lt;td&gt;36.97&lt;/td&gt;
&lt;td&gt;42.18&lt;/td&gt;
&lt;td&gt;42.72&lt;/td&gt;
&lt;td&gt;0.00&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h4&gt;Analysis&lt;/h4&gt;

&lt;p&gt;Both Q1 and Q5 showed significantly shorter execution times on the XLARGE warehouse compared to the SMALL warehouse.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Query&lt;/th&gt;
&lt;th&gt;XLARGE Avg Exec Time&lt;/th&gt;
&lt;th&gt;SMALL Avg Exec Time&lt;/th&gt;
&lt;th&gt;Ratio&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Q1&lt;/td&gt;
&lt;td&gt;4.06 sec&lt;/td&gt;
&lt;td&gt;27.42 sec&lt;/td&gt;
&lt;td&gt;~6.8x&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Q5&lt;/td&gt;
&lt;td&gt;3.56 sec&lt;/td&gt;
&lt;td&gt;38.95 sec&lt;/td&gt;
&lt;td&gt;~10.9x&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Q5 showed a larger ratio than Q1. Q5 joins six tables, making it the more complex query, so the gap in available resources has a more pronounced effect.&lt;/p&gt;

&lt;h4&gt;Testing Note: Warehouse Data Cache&lt;/h4&gt;

&lt;p&gt;There's an important point to keep in mind about this test. Snowflake warehouses have a &lt;strong&gt;data cache (local disk cache)&lt;/strong&gt; separate from the &lt;strong&gt;query result cache&lt;/strong&gt;. This mechanism retains micro-partitions read from remote storage on the warehouse's local SSD.&lt;/p&gt;

&lt;p&gt;In this test, I disabled the query result cache at the account level with &lt;code&gt;USE_CACHED_RESULT = false&lt;/code&gt;, but &lt;strong&gt;I did not disable the data cache&lt;/strong&gt;. Therefore, when running the same query consecutively on the same warehouse, the 2nd run onward reads data from local SSD, making it faster than the first run.&lt;/p&gt;
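
&lt;p&gt;Disabling the query result cache at the account level is a one-liner (run with ACCOUNTADMIN; remember to restore the default once testing is done):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;ALTER ACCOUNT SET USE_CACHED_RESULT = FALSE;

-- Restore the default after testing
ALTER ACCOUNT SET USE_CACHED_RESULT = TRUE;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;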

&lt;p&gt;Looking at the actual data, the XLARGE warehouse's Q5 first run took 5.74 seconds, while runs 2–5 dropped to 2.83–3.17 seconds, confirming the data cache effect.&lt;/p&gt;

&lt;p&gt;Checking the query profile, the 2nd run shows a nonzero value for &lt;code&gt;Percentage scanned from cache&lt;/code&gt;, confirming that the warehouse data cache was in effect.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SMALL Q5: 1st Run Query Profile&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fupotbrch2xv86vvwpu8x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fupotbrch2xv86vvwpu8x.png" alt="2026-04-16_09h31_43" width="800" height="715"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SMALL Q5: 2nd Run Query Profile&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcffrxp1jq5b1ckc97dmy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcffrxp1jq5b1ckc97dmy.png" alt="2026-04-16_09h32_36" width="800" height="718"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since the purpose of this test was to confirm the "relative difference between SMALL and XLARGE," both warehouses were compared under the same conditions (same cache state), making the relative comparison valid. However, if you want to measure pure cold-start performance, note that you need to suspend and restart the warehouse to clear the data cache before measuring.&lt;/p&gt;
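
&lt;p&gt;If you do want cold-start numbers, clearing the data cache before each measurement looks like this (suspending a warehouse releases its compute nodes and, with them, the local SSD cache):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;ALTER WAREHOUSE ADAPTIVE_WH_SMALL SUSPEND;
ALTER WAREHOUSE ADAPTIVE_WH_SMALL RESUME;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;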

&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;I tested Snowflake's new &lt;strong&gt;Adaptive Warehouse&lt;/strong&gt; feature using TPC-H sample data.&lt;/p&gt;

&lt;p&gt;Comparing &lt;code&gt;MAX_QUERY_PERFORMANCE_LEVEL&lt;/code&gt; at XLARGE vs. SMALL, I observed approximately a 6.8x difference for Q1 (aggregation query) and approximately a 10.9x difference for Q5 (6-table JOIN). More complex JOIN queries tend to show a more pronounced difference based on resource allocation, confirming that the &lt;code&gt;MAX_QUERY_PERFORMANCE_LEVEL&lt;/code&gt; setting directly impacts performance.&lt;/p&gt;

&lt;p&gt;The most impressive aspect was being able to control the "cost vs. performance balance" with just two parameters (&lt;code&gt;MAX_QUERY_PERFORMANCE_LEVEL&lt;/code&gt; and &lt;code&gt;QUERY_THROUGHPUT_MULTIPLIER&lt;/code&gt;) without thinking about warehouse sizes at all. Previously, you had to make specific size choices like "XL or XXL," but Adaptive Warehouse automates that while letting you simply specify an upper limit — a much simpler design.&lt;/p&gt;

&lt;p&gt;This feature is available in Public Preview for Enterprise Edition and above. In addition to the US Oregon region (US West 2) used in this test, it's also available in EU West 1 (Ireland) and &lt;strong&gt;AP Northeast 1 (Tokyo)&lt;/strong&gt;. If you're using the Tokyo region, you can try it immediately with the same steps described here — give it a try!&lt;/p&gt;

</description>
      <category>snowflake</category>
      <category>dataengineering</category>
    </item>
    <item>
      <title>PDFs with Graphs? Just Ask the Agent: Cross-Analyzing Unstructured and Structured Data on Snowflake Cortex Agent</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Mon, 30 Mar 2026 23:13:39 +0000</pubDate>
      <link>https://forem.com/sagara/pdfs-with-graphs-just-ask-the-agent-cross-analyzing-unstructured-and-structured-data-on-snowflake-1ld4</link>
      <guid>https://forem.com/sagara/pdfs-with-graphs-just-ask-the-agent-cross-analyzing-unstructured-and-structured-data-on-snowflake-1ld4</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;This is an English translation of the original Japanese article:&lt;br&gt;
&lt;a href="https://dev.classmethod.jp/articles/snowflake-multi-modal-analytics-with-cortex-agent/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-multi-modal-analytics-with-cortex-agent/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; Since this is a translated article, some images contain Japanese text.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Previously, analyzing unstructured data on Snowflake required Cortex Search, which meant parsing text and loading it into tables — making it difficult to work with PDFs containing graphs and charts. However, now that the AI_COMPLETE function can directly query PDF files on stages, you can pass entire PDFs to an LLM without text extraction or chunk splitting.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/snowflake-cortex/ai-complete-document-intelligence" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/snowflake-cortex/ai-complete-document-intelligence&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This means we can wrap AI_COMPLETE in a stored procedure as a &lt;strong&gt;PDF custom tool&lt;/strong&gt; and combine it with &lt;strong&gt;Cortex Analyst (Semantic View)&lt;/strong&gt; to enable natural language analysis across both "unstructured data like PDFs on stages" and "table data" — all within a single Cortex Agent. I decided to put this to the test.&lt;/p&gt;

&lt;p&gt;The idea for this article was inspired by the following blog post. The approach of using a stored procedure that reads PDFs via AI_COMPLETE as a custom tool for Cortex Agent was extremely helpful.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://zenn.dev/truestar/articles/d2431ccd4aa127" rel="noopener noreferrer"&gt;https://zenn.dev/truestar/articles/d2431ccd4aa127&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;⚠️ &lt;strong&gt;Important note:&lt;/strong&gt; The table data (monthly sales details) used in this article is &lt;strong&gt;entirely dummy data&lt;/strong&gt;. While the PDF financial reports are actual published financial data from Classmethod, the monthly breakdown by service line and region is fictional. The dummy data was generated by proportionally distributing values so that the PDF's annual totals match the table's monthly totals.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;Background &amp;amp; Challenges&lt;/h2&gt;

&lt;p&gt;When analyzing unstructured data on Snowflake, the main approaches available until now were:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cortex Search&lt;/strong&gt;: Extract and chunk text into tables, then search and answer via RAG (Retrieval-Augmented Generation)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Document AI&lt;/strong&gt;: Structured data extraction (table conversion) from PDFs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These work well for text-centric documents, but had the following limitations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;PDFs with graphs and charts&lt;/strong&gt; lose information when only text is extracted&lt;/li&gt;
&lt;li&gt;A preprocessing pipeline for text parsing and table conversion is required, adding setup overhead&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Technical Approach&lt;/h2&gt;

&lt;p&gt;Using AI_COMPLETE's document intelligence feature (&lt;code&gt;TO_FILE&lt;/code&gt; + &lt;code&gt;PROMPT&lt;/code&gt;), you can pass PDFs on stages directly to an LLM without text extraction. Since graphs and tables can be referenced as visual elements, information that was previously lost through text extraction can now be analyzed.&lt;/p&gt;

&lt;p&gt;By wrapping this in a Python stored procedure and registering it as a &lt;strong&gt;custom tool&lt;/strong&gt; for Cortex Agent, then combining it with Cortex Analyst (via Semantic View's &lt;code&gt;cortex_analyst_text_to_sql&lt;/code&gt; tool), we achieve the following architecture:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;AGENT_CM_FINANCIAL_ANALYST (Integrated Agent)
├── SP_ASK_CM_FINANCIALS (generic tool)
│     └── AI_COMPLETE + TO_FILE → Directly query PDFs on stage
└── AnalystMonthlySales (cortex_analyst_text_to_sql tool)
      └── Semantic View → SQL aggregation on monthly sales table
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Tool selection is automatic by the Agent.&lt;/strong&gt; Based on the question content, it calls the PDF tool, the Analyst tool, or both, and integrates the results into a unified answer.&lt;/p&gt;

&lt;h3&gt;Limitations&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;AI_COMPLETE with documents is in Public Preview as of March 30, 2026

&lt;ul&gt;
&lt;li&gt;Stages must use &lt;strong&gt;server-side encryption (&lt;code&gt;SNOWFLAKE_SSE&lt;/code&gt;)&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;File size limits: Up to 10MB (900 pages) for Gemini 3.1 Pro, up to 4.5MB for Claude models&lt;/li&gt;
&lt;li&gt;Document count limits per request: Up to 20 for Gemini, up to 5 for Claude&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;Cost&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;AI_COMPLETE costs are determined by &lt;strong&gt;processed token count, not file size&lt;/strong&gt;. Text and visual elements are both tokenized, and you are billed for the combined input and output tokens&lt;/li&gt;
&lt;li&gt;Cortex Analyst (Semantic View) has a lower unit price when accessed through Cortex Agents (see &lt;a href="https://www.snowflake.com/legal-files/CreditConsumptionTable.pdf" rel="noopener noreferrer"&gt;Consumption Table&lt;/a&gt; Table 6(f) and 6(h))&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Prerequisites&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Snowflake&lt;/strong&gt;: Enterprise edition&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Feature status&lt;/strong&gt;: Public Preview as of March 30, 2026 (AI_COMPLETE with documents, Cortex Agents, Semantic View)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Required privileges&lt;/strong&gt;: SYSADMIN role, &lt;code&gt;SNOWFLAKE.CORTEX_USER&lt;/code&gt; database role grant&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cross-region inference&lt;/strong&gt;: Setting &lt;code&gt;CORTEX_ENABLED_CROSS_REGION = 'ANY_REGION'&lt;/code&gt; may be required to use &lt;code&gt;gemini-3.1-pro&lt;/code&gt; (set with ACCOUNTADMIN)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Model used&lt;/strong&gt;: &lt;code&gt;gemini-3.1-pro&lt;/code&gt; (chosen for PDF reading as it supports up to 10MB)&lt;/li&gt;
&lt;/ul&gt;
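
&lt;p&gt;The cross-region inference setting mentioned above can be enabled as follows (ACCOUNTADMIN required; note that this allows Cortex requests to be processed in other regions):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;USE ROLE ACCOUNTADMIN;
ALTER ACCOUNT SET CORTEX_ENABLED_CROSS_REGION = 'ANY_REGION';
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;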

&lt;h2&gt;Setup&lt;/h2&gt;

&lt;h3&gt;Overall Architecture&lt;/h3&gt;

&lt;p&gt;Here's the complete picture of Snowflake objects we'll create:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;POC_CM.SAGARA_TEST
├── STG_CM_FINANCIAL_REPORTS    -- PDF stage
├── V_CM_FINANCIAL_METADATA     -- PDF metadata view
├── T_MONTHLY_SALES             -- Dummy sales table
├── SV_MONTHLY_SALES            -- Semantic View (for Cortex Analyst)
├── SP_ASK_CM_FINANCIALS        -- PDF custom tool (stored procedure)
└── AGENT_CM_FINANCIAL_ANALYST  -- Integrated Agent
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;Target Data&lt;/h3&gt;

&lt;h4&gt;PDFs (Unstructured Data)&lt;/h4&gt;

&lt;p&gt;We use the following 4 PDFs. In addition to 3 fiscal years of financial reports, we include a &lt;strong&gt;company introduction presentation (slide format) with numerous graphs and charts&lt;/strong&gt; to also verify AI_COMPLETE's visual reading capabilities.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;File Name&lt;/th&gt;
&lt;th&gt;Type&lt;/th&gt;
&lt;th&gt;Period&lt;/th&gt;
&lt;th&gt;Coverage&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;financial-results_202306.pdf&lt;/td&gt;
&lt;td&gt;Financial Report&lt;/td&gt;
&lt;td&gt;19th Period&lt;/td&gt;
&lt;td&gt;2022/7/1 - 2023/6/30&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;financial-results_202406.pdf&lt;/td&gt;
&lt;td&gt;Financial Report&lt;/td&gt;
&lt;td&gt;20th Period&lt;/td&gt;
&lt;td&gt;2023/7/1 - 2024/6/30&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;financial-results_202506.pdf&lt;/td&gt;
&lt;td&gt;Financial Report&lt;/td&gt;
&lt;td&gt;21st Period&lt;/td&gt;
&lt;td&gt;2024/7/1 - 2025/6/30&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;会社紹介資料_20251031.pdf&lt;/td&gt;
&lt;td&gt;Company Introduction&lt;/td&gt;
&lt;td&gt;—&lt;/td&gt;
&lt;td&gt;As of October 2025&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The company introduction presentation contains visual information that would be lost through text extraction: bar graphs of revenue trends, bar graphs of employee count trends, pie charts of team composition, office location maps, etc. The key question is whether AI_COMPLETE can directly read this type of material, which was difficult to handle with the conventional Cortex Search (text parsing approach).&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Financial report example (obtained from &lt;a href="https://classmethod.jp/company/finance/" rel="noopener noreferrer"&gt;Classmethod's official website&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu90b4htgq37g7k2jj2n5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu90b4htgq37g7k2jj2n5.png" alt="2026-03-30_16h45_55" width="800" height="1056"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2cbth110p8uhftvl0jq2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2cbth110p8uhftvl0jq2.png" alt="2026-03-30_17h37_17" width="764" height="996"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Company introduction example (obtained from &lt;a href="https://speakerdeck.com/classmethod_jinji/introduction-to-classmethod-for-engineers" rel="noopener noreferrer"&gt;Classmethod's official SpeakerDeck&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyll8vthfhhi2lgr2sf1e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyll8vthfhhi2lgr2sf1e.png" alt="2026-03-30_16h48_04" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fku3zj3kuiuqleapu0nxf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fku3zj3kuiuqleapu0nxf.png" alt="2026-03-30_16h49_34" width="800" height="443"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;Table (Structured Data) — Dummy Data&lt;/h4&gt;

&lt;p&gt;We generate &lt;strong&gt;dummy data that aligns with the PDF financial figures&lt;/strong&gt;. The key demo point is verifying that "the annual revenue from the PDF side matches the monthly sales aggregation from the table side."&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Column&lt;/th&gt;
&lt;th&gt;Type&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;SALE_ID&lt;/td&gt;
&lt;td&gt;NUMBER&lt;/td&gt;
&lt;td&gt;Surrogate key (sequential)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;FISCAL_YEAR&lt;/td&gt;
&lt;td&gt;NUMBER&lt;/td&gt;
&lt;td&gt;Fiscal year (19, 20, 21)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;YEAR_MONTH&lt;/td&gt;
&lt;td&gt;DATE&lt;/td&gt;
&lt;td&gt;First day of month (36 months)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;SERVICE_LINE&lt;/td&gt;
&lt;td&gt;VARCHAR&lt;/td&gt;
&lt;td&gt;Service line (6 categories)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;CUSTOMER_SEGMENT&lt;/td&gt;
&lt;td&gt;VARCHAR&lt;/td&gt;
&lt;td&gt;Customer segment (5 categories)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;REGION&lt;/td&gt;
&lt;td&gt;VARCHAR&lt;/td&gt;
&lt;td&gt;Region (6 categories)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;REVENUE&lt;/td&gt;
&lt;td&gt;NUMBER&lt;/td&gt;
&lt;td&gt;Revenue (thousands of yen)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;COGS&lt;/td&gt;
&lt;td&gt;NUMBER&lt;/td&gt;
&lt;td&gt;Cost of goods sold (thousands of yen)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;GROSS_PROFIT&lt;/td&gt;
&lt;td&gt;NUMBER&lt;/td&gt;
&lt;td&gt;Gross profit (thousands of yen)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;DEAL_COUNT&lt;/td&gt;
&lt;td&gt;NUMBER&lt;/td&gt;
&lt;td&gt;Deal count&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Alignment rules with PDFs:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Total REVENUE per period = PDF's revenue (19th: ¥59,005,311K, 20th: ¥77,190,340K, 21st: ¥95,056,018K)&lt;/li&gt;
&lt;li&gt;Total COGS per period = PDF's cost of sales (19th: ¥52,634,720K, 20th: ¥69,320,565K, 21st: ¥85,782,527K)&lt;/li&gt;
&lt;/ul&gt;
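
&lt;p&gt;Once the table is loaded, this alignment can be sanity-checked with a simple aggregation; each period's totals should match the PDF figures above:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;SELECT
    fiscal_year,
    SUM(revenue)      AS total_revenue,      -- thousands of yen
    SUM(cogs)         AS total_cogs,
    SUM(gross_profit) AS total_gross_profit
FROM POC_CM.SAGARA_TEST.T_MONTHLY_SALES
GROUP BY fiscal_year
ORDER BY fiscal_year;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;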

&lt;h2&gt;Implementation&lt;/h2&gt;

&lt;h3&gt;1. Stage Creation &amp;amp; PDF Upload&lt;/h3&gt;

&lt;p&gt;First, create an internal stage for PDF storage. &lt;strong&gt;Note that the encryption type must be &lt;code&gt;SNOWFLAKE_SSE&lt;/code&gt; or AI_COMPLETE won't be able to read the files.&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="n"&gt;STAGE&lt;/span&gt; &lt;span class="n"&gt;STG_CM_FINANCIAL_REPORTS&lt;/span&gt;
  &lt;span class="n"&gt;DIRECTORY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ENABLE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;TRUE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;ENCRYPTION&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;TYPE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'SNOWFLAKE_SSE'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Stage for Classmethod financial report PDFs'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After creating the stage, upload the 4 PDFs via Snowsight: &lt;code&gt;Data&lt;/code&gt; &amp;gt; &lt;code&gt;Databases&lt;/code&gt; &amp;gt; &lt;code&gt;POC_CM&lt;/code&gt; &amp;gt; &lt;code&gt;SAGARA_TEST&lt;/code&gt; &amp;gt; &lt;code&gt;Stages&lt;/code&gt; &amp;gt; &lt;code&gt;STG_CM_FINANCIAL_REPORTS&lt;/code&gt; &amp;gt; &lt;code&gt;+ Files&lt;/code&gt; button.&lt;/p&gt;

&lt;p&gt;After uploading, it should look like the following:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkusipim5l9aawgekeats.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkusipim5l9aawgekeats.png" alt="2026-03-30_16h50_42" width="800" height="498"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After uploading, refresh the directory table and verify the files:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;STAGE&lt;/span&gt; &lt;span class="n"&gt;STG_CM_FINANCIAL_REPORTS&lt;/span&gt; &lt;span class="n"&gt;REFRESH&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;DIRECTORY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="n"&gt;STG_CM_FINANCIAL_REPORTS&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see 4 files listed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8gaw0ntmgvfxlexku8gr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8gaw0ntmgvfxlexku8gr.png" alt="2026-03-30_16h51_58" width="800" height="518"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, create a metadata view that extracts document type and fiscal year from file names. Since financial reports and company introduction materials have different naming conventions, we use &lt;code&gt;CASE&lt;/code&gt; statements to handle both patterns.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="k"&gt;VIEW&lt;/span&gt; &lt;span class="n"&gt;V_CM_FINANCIAL_METADATA&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt;
    &lt;span class="n"&gt;RELATIVE_PATH&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;FILE_URL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;SIZE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;LAST_MODIFIED&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="c1"&gt;-- Determine document type&lt;/span&gt;
    &lt;span class="k"&gt;CASE&lt;/span&gt;
        &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;RELATIVE_PATH&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'financial-results%'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'決算報告書'&lt;/span&gt;
        &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;RELATIVE_PATH&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'会社紹介資料%'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'会社紹介資料'&lt;/span&gt;
        &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="s1"&gt;'その他'&lt;/span&gt;
    &lt;span class="k"&gt;END&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;DOC_TYPE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="c1"&gt;-- Extract fiscal year only for financial reports&lt;/span&gt;
    &lt;span class="k"&gt;CASE&lt;/span&gt;
        &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;RELATIVE_PATH&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'financial-results%'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt;
            &lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="n"&gt;SPLIT_PART&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;REPLACE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;RELATIVE_PATH&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'.pdf'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="s1"&gt;'_'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="s1"&gt;'202306'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="mi"&gt;19&lt;/span&gt;
                &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="s1"&gt;'202406'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt;
                &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="s1"&gt;'202506'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="mi"&gt;21&lt;/span&gt;
            &lt;span class="k"&gt;END&lt;/span&gt;
        &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;
    &lt;span class="k"&gt;END&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;FISCAL_YEAR&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;CASE&lt;/span&gt;
        &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;RELATIVE_PATH&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'financial-results%'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt;
            &lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="n"&gt;SPLIT_PART&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;REPLACE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;RELATIVE_PATH&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'.pdf'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="s1"&gt;'_'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="s1"&gt;'202306'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'2022/7/1 - 2023/6/30'&lt;/span&gt;
                &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="s1"&gt;'202406'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'2023/7/1 - 2024/6/30'&lt;/span&gt;
                &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="s1"&gt;'202506'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'2024/7/1 - 2025/6/30'&lt;/span&gt;
            &lt;span class="k"&gt;END&lt;/span&gt;
        &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;RELATIVE_PATH&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'会社紹介資料%'&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="s1"&gt;'2025年10月時点'&lt;/span&gt;
        &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;
    &lt;span class="k"&gt;END&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;FISCAL_PERIOD&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;DIRECTORY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="n"&gt;STG_CM_FINANCIAL_REPORTS&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;RELATIVE_PATH&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'%.pdf'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Querying the created view shows the following:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrusycfaacmrf7d57drz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrusycfaacmrf7d57drz.png" alt="2026-03-30_16h52_47" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;2. Standalone AI_COMPLETE Verification&lt;/h3&gt;

&lt;p&gt;Before building the PDF custom tool, let's verify that AI_COMPLETE can directly read PDFs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;AI_COMPLETE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;MODEL&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="s1"&gt;'gemini-3.1-pro'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;PROMPT&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;PROMPT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="s1"&gt;'この決算報告書の売上高と売上原価を教えてください: {0}'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;TO_FILE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'@POC_CM.SAGARA_TEST.STG_CM_FINANCIAL_REPORTS'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'financial-results_202506.pdf'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The PDF specified with &lt;code&gt;TO_FILE&lt;/code&gt; is passed to the &lt;code&gt;{0}&lt;/code&gt; placeholder in the &lt;code&gt;PROMPT&lt;/code&gt; function. If the correct revenue and cost of sales figures are returned, we're good to go.&lt;/p&gt;
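&lt;p&gt;One detail worth noting before the stored procedure later in this article: inside a Python f-string, the placeholder has to be written as &lt;code&gt;{{0}}&lt;/code&gt; so that &lt;code&gt;PROMPT&lt;/code&gt; receives the literal &lt;code&gt;{0}&lt;/code&gt;. A minimal sketch of that escaping:&lt;/p&gt;

```python
question = "この決算報告書の売上高と売上原価を教えてください"

# In an f-string, {{0}} is emitted as the literal text {0}; Snowflake's
# PROMPT() function later fills that placeholder with the TO_FILE reference.
ai_prompt_template = f"{question}: {{0}}"
print(ai_prompt_template)  # ...教えてください: {0}
```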

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F867dlmd7o9d84lbs7y0u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F867dlmd7o9d84lbs7y0u.png" alt="2026-03-30_16h54_00" width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;3. Creating the Stored Procedure for the PDF Custom Tool&lt;/h3&gt;

&lt;p&gt;Since AI_COMPLETE cannot be directly registered as a Cortex Agent tool, we &lt;strong&gt;wrap it in a Python stored procedure&lt;/strong&gt;. It dynamically retrieves the PDF file list from the directory table, filters by &lt;strong&gt;document type (DOC_TYPE)&lt;/strong&gt; and fiscal year, then executes AI_COMPLETE against each PDF.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="k"&gt;PROCEDURE&lt;/span&gt; &lt;span class="n"&gt;POC_CM&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SAGARA_TEST&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SP_ASK_CM_FINANCIALS&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;QUESTION&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;FILTER_DOC_TYPE&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;FILTER_FISCAL_YEAR&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;RETURNS&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;
&lt;span class="k"&gt;LANGUAGE&lt;/span&gt; &lt;span class="n"&gt;PYTHON&lt;/span&gt;
&lt;span class="n"&gt;RUNTIME_VERSION&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'3.12'&lt;/span&gt;
&lt;span class="n"&gt;PACKAGES&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'snowflake-snowpark-python'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;HANDLER&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'main'&lt;/span&gt;
&lt;span class="k"&gt;EXECUTE&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="k"&gt;OWNER&lt;/span&gt;
&lt;span class="k"&gt;AS&lt;/span&gt;
&lt;span class="err"&gt;$$&lt;/span&gt;
&lt;span class="n"&gt;import&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;

&lt;span class="n"&gt;def&lt;/span&gt; &lt;span class="n"&gt;_normalize&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="nv"&gt;"&lt;/span&gt;&lt;span class="se"&gt;""&lt;/span&gt;&lt;span class="nv"&gt;Treat string 'NULL'/'null'/empty string as None (ignore)&lt;/span&gt;&lt;span class="se"&gt;""&lt;/span&gt;&lt;span class="nv"&gt;"&lt;/span&gt;
    &lt;span class="n"&gt;if&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt; &lt;span class="k"&gt;is&lt;/span&gt; &lt;span class="k"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;None&lt;/span&gt;
    &lt;span class="n"&gt;if&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;strip&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;upper&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'NULL'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;None&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;

&lt;span class="n"&gt;def&lt;/span&gt; &lt;span class="n"&gt;main&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;session&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;filter_doc_type&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;None&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;filter_fiscal_year&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;None&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;filter_doc_type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_normalize&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;filter_doc_type&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;filter_fiscal_year&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_normalize&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;filter_fiscal_year&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;query&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nv"&gt;"&lt;/span&gt;&lt;span class="se"&gt;""&lt;/span&gt;&lt;span class="nv"&gt;
        SELECT RELATIVE_PATH, DOC_TYPE, FISCAL_YEAR, FISCAL_PERIOD
        FROM POC_CM.SAGARA_TEST.V_CM_FINANCIAL_METADATA
        WHERE 1=1
    &lt;/span&gt;&lt;span class="se"&gt;""&lt;/span&gt;&lt;span class="nv"&gt;"&lt;/span&gt;
    &lt;span class="n"&gt;if&lt;/span&gt; &lt;span class="n"&gt;filter_doc_type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;safe_doc_type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;filter_doc_type&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;"'"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;"''"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;query&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="nv"&gt;" AND DOC_TYPE = '{safe_doc_type}'"&lt;/span&gt;
    &lt;span class="n"&gt;if&lt;/span&gt; &lt;span class="n"&gt;filter_fiscal_year&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;query&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="nv"&gt;" AND FISCAL_YEAR = {filter_fiscal_year}"&lt;/span&gt;

    &lt;span class="n"&gt;files_df&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;session&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;sql&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="n"&gt;collect&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="n"&gt;if&lt;/span&gt; &lt;span class="k"&gt;not&lt;/span&gt; &lt;span class="n"&gt;files_df&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nv"&gt;"error"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"No PDF files matched the specified conditions"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="n"&gt;ensure_ascii&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;False&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;stage_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'@POC_CM.SAGARA_TEST.STG_CM_FINANCIAL_REPORTS'&lt;/span&gt;
    &lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;

    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="k"&gt;row&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;files_df&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;pdf_file&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;row&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'RELATIVE_PATH'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
        &lt;span class="n"&gt;safe_question&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;"'"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;"''"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;safe_pdf_file&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pdf_file&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;"'"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;"''"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;ai_query&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="nv"&gt;"&lt;/span&gt;&lt;span class="se"&gt;""&lt;/span&gt;&lt;span class="nv"&gt;
            SELECT AI_COMPLETE(
                MODEL =&amp;gt; 'gemini-3.1-pro',
                PROMPT =&amp;gt; PROMPT(
                    '{safe_question}: {{0}}',
                    TO_FILE('{stage_path}', '{safe_pdf_file}')
                )
            ) AS answer
        &lt;/span&gt;&lt;span class="se"&gt;""&lt;/span&gt;&lt;span class="nv"&gt;"&lt;/span&gt;
        &lt;span class="n"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;session&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;sql&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ai_query&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="n"&gt;collect&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="n"&gt;answer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="s1"&gt;'ANSWER'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="n"&gt;if&lt;/span&gt; &lt;span class="k"&gt;result&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="s1"&gt;'Error: No result'&lt;/span&gt;
        &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="n"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;answer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="s1"&gt;'Error: {str(e)}'&lt;/span&gt;

        &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
            &lt;span class="s1"&gt;'file'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;pdf_file&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="s1"&gt;'doc_type'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;row&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'DOC_TYPE'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
            &lt;span class="s1"&gt;'fiscal_year'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;row&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'FISCAL_YEAR'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="n"&gt;if&lt;/span&gt; &lt;span class="k"&gt;row&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'FISCAL_YEAR'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="k"&gt;None&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="s1"&gt;'fiscal_period'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;row&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'FISCAL_PERIOD'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
            &lt;span class="s1"&gt;'answer'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ensure_ascii&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;False&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="err"&gt;$$&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The procedure includes a &lt;strong&gt;&lt;code&gt;_normalize()&lt;/code&gt; function&lt;/strong&gt;. When Cortex Agent passes parameters to the stored procedure, it sometimes sends the &lt;strong&gt;string "NULL"&lt;/strong&gt; instead of SQL NULL. This function converts those to Python &lt;code&gt;None&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;For verification, let's call it with several patterns:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Across all PDFs&lt;/span&gt;
&lt;span class="k"&gt;CALL&lt;/span&gt; &lt;span class="n"&gt;POC_CM&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SAGARA_TEST&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SP_ASK_CM_FINANCIALS&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'Please summarize the key points of this document'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;-- Financial reports only, 21st period&lt;/span&gt;
&lt;span class="k"&gt;CALL&lt;/span&gt; &lt;span class="n"&gt;POC_CM&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SAGARA_TEST&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SP_ASK_CM_FINANCIALS&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="s1"&gt;'What are the revenue and operating profit?'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'決算報告書'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'21'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;-- Company introduction only (graph reading)&lt;/span&gt;
&lt;span class="k"&gt;CALL&lt;/span&gt; &lt;span class="n"&gt;POC_CM&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SAGARA_TEST&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SP_ASK_CM_FINANCIALS&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="s1"&gt;'Read the revenue trend from the performance graph'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'会社紹介資料'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The third query in particular tests AI_COMPLETE's visual reading capability — &lt;strong&gt;reading numerical values from bar graphs&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;If JSON-formatted answers from each PDF are returned as shown below, everything is working correctly.&lt;/p&gt;
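&lt;p&gt;Since the procedure returns a single VARCHAR containing a JSON array, a caller consuming the result programmatically would parse it first. A small sketch (the payload here only mimics the shape of the procedure's output; the answer text is made up for the example):&lt;/p&gt;

```python
import json

# Illustrative payload with the same keys the procedure emits;
# the field values are invented for this example.
raw = json.dumps([{
    "file": "financial-results_202506.pdf",
    "doc_type": "決算報告書",
    "fiscal_year": "21",
    "fiscal_period": "2025-06",
    "answer": "(model answer text)",
}], ensure_ascii=False)

results = json.loads(raw)
for r in results:
    print(f"{r['file']} [{r['doc_type']}]: {r['answer']}")
```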

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjvhvj236034b7jzeczza.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjvhvj236034b7jzeczza.png" alt="2026-03-30_16h56_46" width="800" height="440"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;4. Dummy Data Generation &amp;amp; Table Creation&lt;/h3&gt;

&lt;p&gt;Generate table data for Cortex Analyst.&lt;/p&gt;

&lt;p&gt;We create a &lt;code&gt;T_MONTHLY_SALES&lt;/code&gt; table and INSERT approximately 6,000 rows of dummy data that aligns with the PDF financial figures.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Column&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;FISCAL_YEAR&lt;/td&gt;
&lt;td&gt;Fiscal year (19, 20, 21)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;YEAR_MONTH&lt;/td&gt;
&lt;td&gt;First day of month (36 months)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;SERVICE_LINE&lt;/td&gt;
&lt;td&gt;Service line (AWS Resale, Cloud Migration Support, etc. — 6 categories)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;CUSTOMER_SEGMENT&lt;/td&gt;
&lt;td&gt;Customer segment (Enterprise, Mid-Market, etc. — 5 categories)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;REGION&lt;/td&gt;
&lt;td&gt;Region (Kanto, Kansai, etc. — 6 categories)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;REVENUE / COGS / GROSS_PROFIT&lt;/td&gt;
&lt;td&gt;Revenue / Cost of sales / Gross profit (thousands of yen)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;DEAL_COUNT&lt;/td&gt;
&lt;td&gt;Deal count&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
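&lt;p&gt;The row count follows directly from the cartesian product of these dimensions: 3 fiscal years × 12 months × 6 service lines × 5 customer segments × 6 regions = 6,480 rows, which is the "approximately 6,000" above. A quick sketch (only the category names shown in the table are from the article; the rest are placeholders to reach the stated counts):&lt;/p&gt;

```python
from itertools import product

# "AWS Resale", "Cloud Migration Support", "Enterprise", "Mid-Market",
# "Kanto" and "Kansai" appear in the article; the remaining names are
# placeholders that only serve to reach the stated category counts.
fiscal_years  = [19, 20, 21]                                     # 3 years
months        = range(1, 13)                                     # 12 months each
service_lines = ["AWS Resale", "Cloud Migration Support",
                 "Service C", "Service D", "Service E", "Service F"]  # 6
segments      = ["Enterprise", "Mid-Market",
                 "Segment C", "Segment D", "Segment E"]               # 5
regions       = ["Kanto", "Kansai",
                 "Region C", "Region D", "Region E", "Region F"]      # 6

rows = list(product(fiscal_years, months, service_lines, segments, regions))
print(len(rows))  # 6480
```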

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3g74g4es5h9f8w6gxy1l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3g74g4es5h9f8w6gxy1l.png" alt="2026-03-30_16h58_53" width="800" height="658"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;5. Semantic View Creation&lt;/h3&gt;

&lt;p&gt;Create a Semantic View for Cortex Analyst. Setting Japanese &lt;code&gt;SYNONYMS&lt;/code&gt; improves the accuracy of natural language queries.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="n"&gt;SEMANTIC&lt;/span&gt; &lt;span class="k"&gt;VIEW&lt;/span&gt; &lt;span class="n"&gt;POC_CM&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SAGARA_TEST&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SV_MONTHLY_SALES&lt;/span&gt;

  &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;POC_CM&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SAGARA_TEST&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;
      &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;SALE_ID&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'月次売上'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'売上データ'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'売上明細'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'売上実績'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Monthly sales detail data for Classmethod Inc.'&lt;/span&gt;
  &lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="n"&gt;DIMENSIONS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;FISCAL_YEAR&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;FISCAL_YEAR&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'期'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'年度'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'会計期間'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Fiscal year (19=19th period, 20=20th period, 21=21st period)'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;YEAR_MONTH&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;YEAR_MONTH&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'月'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'年月'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'対象月'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Target month (first day of month)'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SERVICE_LINE&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SERVICE_LINE&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'サービス'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'事業'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'事業区分'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'サービス区分'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Service line'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_SEGMENT&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_SEGMENT&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'顧客区分'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'セグメント'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'顧客タイプ'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Customer segment'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;REGION&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;REGION&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'地域'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'エリア'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'拠点'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Region'&lt;/span&gt;
  &lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="n"&gt;METRICS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;TOTAL_REVENUE&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;REVENUE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'売上'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'売上額'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'売上金額'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'収益'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'売上高'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Revenue (thousands of yen)'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;TOTAL_COGS&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;COGS&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'原価'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'売上原価'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'コスト'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Cost of goods sold (thousands of yen)'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;TOTAL_GROSS_PROFIT&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;GROSS_PROFIT&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'粗利'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'粗利益'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'売上総利益'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'GP'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Gross profit (thousands of yen)'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

    &lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;TOTAL_DEAL_COUNT&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;T_MONTHLY_SALES&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DEAL_COUNT&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;SYNONYMS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'案件数'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'取引件数'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'ディール数'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Deal count'&lt;/span&gt;
  &lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Classmethod monthly sales dummy data (for Cortex Analyst)'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;6. Integrated Agent Creation&lt;/h3&gt;

&lt;p&gt;Next, create the Agent that integrates the PDF custom tool and the Cortex Analyst tool. The key point is to clearly specify how each tool should be used in the instructions.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="n"&gt;AGENT&lt;/span&gt; &lt;span class="n"&gt;POC_CM&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SAGARA_TEST&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AGENT_CM_FINANCIAL_ANALYST&lt;/span&gt;
  &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Classmethod financial report integrated analysis agent (PDF × Table)'&lt;/span&gt;
  &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;SPECIFICATION&lt;/span&gt;
&lt;span class="err"&gt;$$&lt;/span&gt;
&lt;span class="n"&gt;models&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
  &lt;span class="n"&gt;orchestration&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;auto&lt;/span&gt;

&lt;span class="n"&gt;orchestration&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
  &lt;span class="n"&gt;budget&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;seconds&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;120&lt;/span&gt;
    &lt;span class="n"&gt;tokens&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;32000&lt;/span&gt;

&lt;span class="n"&gt;instructions&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
  &lt;span class="k"&gt;system&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="err"&gt;あなたはクラスメソッド株式会社の企業情報・財務分析アシスタントです。&lt;/span&gt;
    &lt;span class="err"&gt;以下の&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="err"&gt;つのデータソースを使って質問に回答できます。&lt;/span&gt;

    &lt;span class="err"&gt;【データソース&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;PDF&lt;/span&gt;&lt;span class="err"&gt;資料】&lt;/span&gt;
    &lt;span class="n"&gt;SP_ASK_CM_FINANCIALS&lt;/span&gt;&lt;span class="err"&gt;ツールで参照できます。&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;A&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="err"&gt;決算報告書（&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="err"&gt;期分）&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;FILTER_DOC_TYPE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="err"&gt;決算報告書&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;B&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="err"&gt;会社紹介資料&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;FILTER_DOC_TYPE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="err"&gt;会社紹介資料&lt;/span&gt;

    &lt;span class="err"&gt;【データソース&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="err"&gt;月次売上テーブル】&lt;/span&gt;
    &lt;span class="n"&gt;AnalystMonthlySales&lt;/span&gt;&lt;span class="err"&gt;ツールで参照できます。&lt;/span&gt;

    &lt;span class="err"&gt;【ツール選択ルール】&lt;/span&gt;
    &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt; &lt;span class="err"&gt;決算書の定性的な内容&lt;/span&gt; &lt;span class="err"&gt;→&lt;/span&gt; &lt;span class="n"&gt;PDF&lt;/span&gt;&lt;span class="err"&gt;ツール（&lt;/span&gt;&lt;span class="n"&gt;FILTER_DOC_TYPE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="err"&gt;決算報告書）&lt;/span&gt;
    &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt; &lt;span class="err"&gt;会社概要、拠点、従業員数、経営理念、業績推移グラフ等&lt;/span&gt;
       &lt;span class="err"&gt;→&lt;/span&gt; &lt;span class="n"&gt;PDF&lt;/span&gt;&lt;span class="err"&gt;ツール（&lt;/span&gt;&lt;span class="n"&gt;FILTER_DOC_TYPE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="err"&gt;会社紹介資料）&lt;/span&gt;
    &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt; &lt;span class="err"&gt;数値の集計・比較・推移分析&lt;/span&gt; &lt;span class="err"&gt;→&lt;/span&gt; &lt;span class="n"&gt;Analyst&lt;/span&gt;&lt;span class="err"&gt;ツール&lt;/span&gt;
    &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt; &lt;span class="n"&gt;PDF&lt;/span&gt;&lt;span class="err"&gt;×テーブルの横断分析&lt;/span&gt; &lt;span class="err"&gt;→&lt;/span&gt; &lt;span class="err"&gt;両方呼び出して比較&lt;/span&gt;

  &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="err"&gt;日本語で回答してください。&lt;/span&gt;
    &lt;span class="err"&gt;数値を含む回答ではテーブル形式で見やすく整理してください。&lt;/span&gt;

  &lt;span class="n"&gt;sample_questions&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"第21期の売上高は？"&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"サービスライン別の売上推移を教えて"&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"第20期の監査報告書の内容は？"&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"PDFの年間売上とテーブルの月次合計を突合して"&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"会社の拠点一覧を教えて"&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"業績推移のグラフから売上高の成長率を教えて"&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"従業員数の推移と売上高の推移を比較して"&lt;/span&gt;

&lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
  &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;tool_spec&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"generic"&lt;/span&gt;
      &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"SP_ASK_CM_FINANCIALS"&lt;/span&gt;
      &lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="err"&gt;クラスメソッドの&lt;/span&gt;&lt;span class="n"&gt;PDF&lt;/span&gt;&lt;span class="err"&gt;資料（決算報告書・会社紹介資料）に対して質問を行い、回答を取得するツール。&lt;/span&gt;
        &lt;span class="n"&gt;FILTER_DOC_TYPE&lt;/span&gt;&lt;span class="err"&gt;（決算報告書&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="err"&gt;会社紹介資料）と&lt;/span&gt;&lt;span class="n"&gt;FILTER_FISCAL_YEAR&lt;/span&gt;&lt;span class="err"&gt;（&lt;/span&gt;&lt;span class="mi"&gt;19&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;21&lt;/span&gt;&lt;span class="err"&gt;）でフィルタ可能。&lt;/span&gt;
      &lt;span class="n"&gt;input_schema&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"object"&lt;/span&gt;
        &lt;span class="n"&gt;properties&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
          &lt;span class="n"&gt;QUESTION&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"string"&lt;/span&gt;
            &lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"PDF資料に対して問い合わせる質問文"&lt;/span&gt;
          &lt;span class="n"&gt;FILTER_DOC_TYPE&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"string"&lt;/span&gt;
            &lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"資料種別フィルタ（決算報告書, 会社紹介資料）。指定しない場合はNULL。"&lt;/span&gt;
          &lt;span class="n"&gt;FILTER_FISCAL_YEAR&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"string"&lt;/span&gt;
            &lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"会計年度フィルタ（19, 20, 21）。指定しない場合はNULL。"&lt;/span&gt;
        &lt;span class="n"&gt;required&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
          &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="nv"&gt;"QUESTION"&lt;/span&gt;

  &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;tool_spec&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"cortex_analyst_text_to_sql"&lt;/span&gt;
      &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"AnalystMonthlySales"&lt;/span&gt;
      &lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="err"&gt;月次売上明細テーブルに対して&lt;/span&gt;&lt;span class="k"&gt;SQL&lt;/span&gt;&lt;span class="err"&gt;集計を行い、数値データを分析するツール。&lt;/span&gt;

&lt;span class="n"&gt;tool_resources&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
  &lt;span class="n"&gt;SP_ASK_CM_FINANCIALS&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"procedure"&lt;/span&gt;
    &lt;span class="n"&gt;identifier&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"POC_CM.SAGARA_TEST.SP_ASK_CM_FINANCIALS"&lt;/span&gt;
    &lt;span class="n"&gt;execution_environment&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"warehouse"&lt;/span&gt;
      &lt;span class="n"&gt;warehouse&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"COPUTE_WH"&lt;/span&gt;
      &lt;span class="n"&gt;query_timeout&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;120&lt;/span&gt;
  &lt;span class="n"&gt;AnalystMonthlySales&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;semantic_view&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"POC_CM.SAGARA_TEST.SV_MONTHLY_SALES"&lt;/span&gt;
    &lt;span class="n"&gt;execution_environment&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"warehouse"&lt;/span&gt;
      &lt;span class="n"&gt;warehouse&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"COPUTE_WH"&lt;/span&gt;
      &lt;span class="n"&gt;query_timeout&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt;
&lt;span class="err"&gt;$$&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
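
&lt;p&gt;Before moving on to the UI, it's worth sanity-checking that the Agent object was registered. The following assumes the standard &lt;code&gt;SHOW&lt;/code&gt; / &lt;code&gt;DESCRIBE&lt;/code&gt; commands are available for AGENT objects, per the Snowflake docs:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Confirm the Agent exists and inspect its specification
SHOW AGENTS IN SCHEMA POC_CM.SAGARA_TEST;
DESCRIBE AGENT POC_CM.SAGARA_TEST.AGENT_CM_FINANCIAL_ANALYST;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;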



&lt;h3&gt;
  
  
  7. Testing via Snowflake Intelligence
&lt;/h3&gt;

&lt;p&gt;Let's interactively test the created Agent from the Snowflake Intelligence UI.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;⚠️ Once again, the table data (monthly sales details) used in this verification is &lt;strong&gt;entirely dummy data&lt;/strong&gt;. While the PDF financial reports are actual published financial data from Classmethod, the monthly breakdown by service line and region is fictional. The dummy data was generated by proportionally distributing values so that the PDF's annual totals match the table's monthly totals. Please keep this in mind.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;We'll run the following test questions to verify that tool selection works correctly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test 1: "What is the revenue for the 21st period?"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;→ Answered using the Analyst tool or PDF tool. Either way, the same value (¥95,056,018K) should be returned.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsh0ox918cw4oy0xltqei.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsh0ox918cw4oy0xltqei.png" alt="2026-03-30_17h33_00" width="800" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test 2: "Show me the revenue trend by service line"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;→ The Analyst tool is selected, and SQL aggregation is executed against the table data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkrvk30zq2xfjhmaijc26.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkrvk30zq2xfjhmaijc26.png" alt="2026-03-30_17h34_14" width="800" height="646"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffr0st6mmbrf0o51fkxrd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffr0st6mmbrf0o51fkxrd.png" alt="2026-03-30_17h34_52" width="800" height="954"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test 3: "What does the audit report for the 20th period say?"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;→ The PDF tool is selected, and the content of the audit report within the financial report PDF is returned.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvqe4nffko6cx35ot2hvj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvqe4nffko6cx35ot2hvj.png" alt="2026-03-30_17h36_11" width="800" height="893"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test 4: "Reconcile the annual revenue from the PDF with the monthly totals from the table"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;→ &lt;strong&gt;Both tools are called&lt;/strong&gt;, and the response confirms that the annual revenue from the PDF side matches the monthly sales total from the table side. This is the highlight of this verification.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F46hg9q6ddaoqj7dspcp0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F46hg9q6ddaoqj7dspcp0.png" alt="2026-03-30_17h38_10" width="800" height="758"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fph24kqt753x7oh465zcm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fph24kqt753x7oh465zcm.png" alt="2026-03-30_17h38_40" width="800" height="811"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test 5: "List the company's office locations"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;→ The PDF tool (&lt;code&gt;FILTER_DOC_TYPE=会社紹介資料&lt;/code&gt;) is selected, and a list of 8 domestic and 5 overseas offices is returned from the office-locations slide (the one with the map) in the company introduction deck.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgx2g9groayrlo3v36s2v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgx2g9groayrlo3v36s2v.png" alt="2026-03-30_17h40_58" width="800" height="787"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test 6: "What is the revenue growth rate based on the performance trend graph?"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;→ The PDF tool (&lt;code&gt;FILTER_DOC_TYPE=会社紹介資料&lt;/code&gt;) is selected, and the response &lt;strong&gt;reads numerical values from the bar graph&lt;/strong&gt; and calculates growth rates. This demonstrates that graph reading — which was impossible with text extraction — works properly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ialzb314y5tpzvtxomg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ialzb314y5tpzvtxomg.png" alt="2026-03-30_17h42_50" width="800" height="975"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test 7: "Tell me about Classmethod's evaluation system, career paths, and salary ranges"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;→ The PDF tool (&lt;code&gt;FILTER_DOC_TYPE=会社紹介資料&lt;/code&gt;) is selected, and it properly reads the grade list and salary correlation expressed in complex graphs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff79hcfojv0aakg3hwk6v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff79hcfojv0aakg3hwk6v.png" alt="2026-03-30_17h52_42" width="800" height="896"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm7tto438o3qqofrbxwgz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm7tto438o3qqofrbxwgz.png" alt="2026-03-30_17h53_00" width="800" height="896"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test 8: "Compare the employee count trend with the revenue trend"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;→ &lt;strong&gt;Both tools are called&lt;/strong&gt; — the PDF tool (employee count trend graph from the company introduction) and the Analyst tool (revenue aggregation from the table) — and the response provides a cross-cutting comparative analysis of employee growth and revenue growth.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6zn523bsw5mykiltjs61.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6zn523bsw5mykiltjs61.png" alt="2026-03-30_17h44_39" width="800" height="733"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1k9th9tdd4ozoiwsb78l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1k9th9tdd4ozoiwsb78l.png" alt="2026-03-30_17h45_38" width="800" height="466"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;With AI_COMPLETE's document intelligence feature, PDFs can be passed directly to an LLM without building a RAG pipeline, which makes it extremely simple to incorporate PDF question answering as a Cortex Agent custom tool.&lt;/p&gt;

&lt;p&gt;Here's what this verification confirmed:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cross-analysis of PDFs and tables is now possible within a single Agent.&lt;/strong&gt; Based on the question content, the Agent automatically selects the appropriate PDF tool, Analyst tool, or both&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reconciliation between PDF annual totals and table monthly aggregations&lt;/strong&gt; is achievable by calling both tools and comparing results&lt;/li&gt;
&lt;li&gt;Even PDFs containing graphs and charts can be analyzed directly, with no text extraction or chunk splitting, making the &lt;strong&gt;setup dramatically simpler than with Cortex Search&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the other hand, keep in mind that AI_COMPLETE with documents is in Public Preview (as of March 2026), that file size limits apply, and that costs accrue based on the number of tokens processed.&lt;/p&gt;

&lt;p&gt;This approach feels like an excellent fit for use cases where you want to perform cross-cutting analysis of unstructured and structured data on Snowflake. Give it a try!&lt;/p&gt;

</description>
      <category>snowflake</category>
      <category>ai</category>
    </item>
    <item>
      <title>Snowflake DCM Projects: Building DEV/PROD Environments with Declarative IaC Using Template Variables</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Fri, 20 Mar 2026 22:29:36 +0000</pubDate>
      <link>https://forem.com/sagara/snowflake-dcm-projects-building-devprod-environments-with-declarative-iac-using-template-4klj</link>
      <guid>https://forem.com/sagara/snowflake-dcm-projects-building-devprod-environments-with-declarative-iac-using-template-4klj</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This is an English version of the original Japanese article:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://dev.classmethod.jp/articles/snowflake-dcm-projects-preview/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-dcm-projects-preview/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Snowflake has released a new feature called &lt;strong&gt;DCM Projects&lt;/strong&gt;. It allows you to declaratively define Snowflake objects using &lt;code&gt;DEFINE&lt;/code&gt; statements and deploy them through a Plan → Deploy workflow — essentially an Infrastructure-as-Code (IaC) feature native to Snowflake.&lt;/p&gt;
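
&lt;p&gt;As a rough illustration, a definition file is just SQL-like &lt;code&gt;DEFINE&lt;/code&gt; statements (mirroring the corresponding &lt;code&gt;CREATE&lt;/code&gt; syntax) plus Jinja2 template variables. The sketch below is hypothetical; object names and the exact grammar should be checked against the docs:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Hypothetical definition file; {{ env }} is a Jinja2 template variable
DEFINE DATABASE {{ env }}_ANALYTICS;
DEFINE SCHEMA {{ env }}_ANALYTICS.RAW;
DEFINE TABLE {{ env }}_ANALYTICS.RAW.ORDERS (
    ORDER_ID NUMBER,
    ORDER_TS TIMESTAMP_NTZ
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;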

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/release-notes/2026/other/2026-03-20-dcm-projects" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/release-notes/2026/other/2026-03-20-dcm-projects&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-overview" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-overview&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Previously, IaC management of Snowflake objects required external tools such as Terraform or schemachange. With DCM Projects, you can now manage objects declaratively, natively within Snowflake.&lt;/p&gt;

&lt;p&gt;In this article, I verified a workflow in which &lt;strong&gt;a single DCM project folder (one set of definition files)&lt;/strong&gt; defines two targets and, by switching template variables, deploys to &lt;strong&gt;separate DCM project objects for DEV and PROD&lt;/strong&gt;. I'll walk through the steps and share the results.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; DCM Projects is in Preview as of March 20, 2026. Specifications may change before GA. Also, the official docs recommend separating DEV/PROD environments by using different accounts. The single-account setup in this article is purely for verification purposes — if you operate within a single account, careful access management is required.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Feature Overview
&lt;/h2&gt;

&lt;p&gt;DCM Projects (Database Change Management Projects) is a feature that lets you define the desired state of Snowflake objects in code and manage them declaratively.&lt;/p&gt;

&lt;p&gt;Key characteristics include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Declarative definitions&lt;/strong&gt;: Define the desired state of objects using &lt;code&gt;DEFINE&lt;/code&gt; statements. Dependencies are automatically resolved, so you don't need to worry about the order of declarations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Plan → Deploy workflow&lt;/strong&gt;: Review diffs with &lt;code&gt;PLAN&lt;/code&gt; before deploying, then apply changes with &lt;code&gt;DEPLOY&lt;/code&gt;. It follows the same concept as Terraform's &lt;code&gt;plan&lt;/code&gt; → &lt;code&gt;apply&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Jinja2 templates&lt;/strong&gt;: Supports variable substitution, conditional branching, loops, and macros. You can expand different parameters per environment from a single set of definition files&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Supported objects&lt;/strong&gt;: Database, Schema, Table, Dynamic Table, (Secure) View, Internal Stage, Warehouse, Role / Database Role, Grant, Data Metric Function, Task, SQL Function, Tag, Authentication Policy, and more. However, only a subset of Snowflake objects is supported — for example, File Format is not included&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pipeline management&lt;/strong&gt;: &lt;code&gt;REFRESH ALL&lt;/code&gt; for bulk-refreshing Dynamic Tables, and &lt;code&gt;TEST ALL&lt;/code&gt; for bulk-running data quality tests&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Management interfaces&lt;/strong&gt;: Operable from Snowsight, Snowflake CLI (v3.16+), SQL, and Cortex Code CLI&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-supported-entities" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-supported-entities&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Limitations
&lt;/h3&gt;

&lt;p&gt;The following are the main limitations confirmed as of March 20, 2026, based on the official docs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;General&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;As a Preview feature, specifications may change&lt;/li&gt;
&lt;li&gt;Only a subset of Snowflake objects is supported&lt;/li&gt;
&lt;li&gt;Changes during deploy are subject to CREATE OR ALTER constraints. For some objects (e.g., Tables), partial changes may be applied on failure&lt;/li&gt;
&lt;li&gt;Maximum of 1,000 source files / 10,000 rendered object definitions. Exceeding these limits may cause performance degradation or execution failures&lt;/li&gt;
&lt;li&gt;During Preview, changesets may not fully capture all granular changes&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;_snow&lt;/code&gt; is a reserved identifier&lt;/li&gt;
&lt;li&gt;Sensitive information should not be placed in template variables&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Object-specific limitations&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Database / Schema / Table / View / Dynamic Table&lt;/strong&gt;: Rename is not supported&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Table&lt;/strong&gt;: Column rename, incompatible type changes, adding Search Optimization, and adding tags / masking policies / row access policies to column definitions are not supported. Column order changes are also not supported&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dynamic Table&lt;/strong&gt;: &lt;code&gt;INITIALIZE&lt;/code&gt; is immutable. Body changes or refresh mode changes may require re-initialization / full refresh. Column order changes and rename are not supported&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;View&lt;/strong&gt;: Rename and column order changes are not supported&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Internal Stage&lt;/strong&gt;: Only internal stages are supported, and encryption type is immutable&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Warehouse&lt;/strong&gt;: &lt;code&gt;INITIALLY_SUSPENDED&lt;/code&gt; is immutable&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Role / Database Role&lt;/strong&gt;: Application Roles are not supported&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Grant&lt;/strong&gt;: APPLICATION ROLE grants / CALLER grants are not supported&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tag&lt;/strong&gt;: &lt;code&gt;PROPAGATE&lt;/code&gt; is not supported&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Jinja2 templates&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;import&lt;/code&gt;, &lt;code&gt;extends&lt;/code&gt;, and &lt;code&gt;include&lt;/code&gt; syntax are not supported&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Snowflake&lt;/strong&gt;: AWS US West (Oregon) region, Enterprise Edition (The feature was not enabled in an AWS Tokyo region environment, so a trial account was created in AWS US West.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Feature status&lt;/strong&gt;: Preview as of March 20, 2026&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Required privileges&lt;/strong&gt;: &lt;code&gt;CREATE DCM PROJECT ON SCHEMA&lt;/code&gt; privilege. Additionally, the project owner needs sufficient privileges to deploy all objects defined in the project&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Project owner role&lt;/strong&gt;: Since this article includes role creation via &lt;code&gt;DEFINE ROLE&lt;/code&gt;, &lt;code&gt;ACCOUNTADMIN&lt;/code&gt; (which has &lt;code&gt;CREATE ROLE&lt;/code&gt; privilege) is used&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DCM project object location&lt;/strong&gt;: DCM project objects are schema-level objects, so a database / schema to host them is required (this is a separate concept from the deployment target database)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Operating environment&lt;/strong&gt;: Snowsight Workspace&lt;/li&gt;
&lt;/ul&gt;
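
&lt;p&gt;If you use a dedicated deployment role instead of &lt;code&gt;ACCOUNTADMIN&lt;/code&gt;, the privilege grant would look roughly like this (the role name is hypothetical):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Allow a role to create DCM project objects in the hosting schema
GRANT CREATE DCM PROJECT ON SCHEMA DCM_ADMIN.PROJECTS TO ROLE DCM_DEPLOY_ROLE;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;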

&lt;h2&gt;
  
  
  Preparation
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Creating a Workspace
&lt;/h3&gt;

&lt;p&gt;Create a Workspace in Snowsight.&lt;/p&gt;

&lt;p&gt;Navigate to "Projects" → "Workspaces" from the left menu, click the Workspace name at the top left, select "Private Workspace", enter a name, and create it.&lt;/p&gt;

&lt;p&gt;If the Workspace is created as shown below, you're good to go.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9ns8nc783x337fabkt4n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9ns8nc783x337fabkt4n.png" alt="2026-03-20_18h19_10" width="800" height="415"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Creating a Database / Schema for DCM Project Objects
&lt;/h3&gt;

&lt;p&gt;DCM project objects are schema-level objects and require a database and schema to host them. In this article, we create a dedicated &lt;code&gt;DCM_ADMIN.PROJECTS&lt;/code&gt; database / schema and place both the DEV and PROD DCM project objects there.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;ACCOUNTADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="n"&gt;SECONDARY&lt;/span&gt; &lt;span class="n"&gt;ROLES&lt;/span&gt; &lt;span class="k"&gt;NONE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- For hosting DCM project objects&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="n"&gt;IF&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;EXISTS&lt;/span&gt; &lt;span class="n"&gt;DCM_ADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;IF&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;EXISTS&lt;/span&gt; &lt;span class="n"&gt;DCM_ADMIN&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PROJECTS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; The official DCM Projects docs recommend running &lt;code&gt;USE SECONDARY ROLES NONE;&lt;/code&gt; because secondary roles are considered during Plan / Deploy execution, and you want to avoid depending on privileges beyond the project owner role. This article also uses &lt;code&gt;USE SECONDARY ROLES NONE;&lt;/code&gt; during preparation and deploy-related SQL execution.&lt;/p&gt;
&lt;/blockquote&gt;
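&lt;p&gt;As an optional sanity check (not part of the original steps), you can confirm that no secondary roles are active in the current session:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- An empty roles list confirms the session relies only on the primary role
SELECT CURRENT_SECONDARY_ROLES();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;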

&lt;p&gt;After running the queries, confirm that the &lt;code&gt;DCM_ADMIN&lt;/code&gt; database and &lt;code&gt;PROJECTS&lt;/code&gt; schema have been created.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnoyuabbpaif0sa39eyhy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnoyuabbpaif0sa39eyhy.png" alt="2026-03-20_18h21_03" width="513" height="178"&gt;&lt;/a&gt;&lt;/p&gt;
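&lt;p&gt;The same confirmation can be done in SQL, if you prefer the console over the UI:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Optional verification of the objects created above
SHOW DATABASES LIKE 'DCM_ADMIN';
SHOW SCHEMAS IN DATABASE DCM_ADMIN;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;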

&lt;h2&gt;Hands-On Walkthrough&lt;/h2&gt;

&lt;p&gt;Let's build DEV/PROD environments using DCM Projects!&lt;/p&gt;

&lt;h3&gt;1. Creating a DCM Project Folder&lt;/h3&gt;

&lt;p&gt;Create &lt;strong&gt;a single DCM project folder&lt;/strong&gt; in the Snowsight Workspace. Within this project folder, define two targets (DEV / PROD) in &lt;code&gt;manifest.yml&lt;/code&gt; and deploy the same set of definition files to each environment.&lt;/p&gt;

&lt;p&gt;In the created Workspace, click the "+" button and select "DCM Project". &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyrlspcqz9knsinw8lkdg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyrlspcqz9knsinw8lkdg.png" alt="2026-03-21_05h11_12" width="607" height="661"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Check "Define default target environment" and configure as shown below. Select the database and schema created earlier, and set the Target name to "DEV".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0p8c3gzs6rjqsyu2w85x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0p8c3gzs6rjqsyu2w85x.png" alt="2026-03-21_05h25_32" width="800" height="953"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The folders and files are automatically generated as shown below. The &lt;code&gt;manifest.yml&lt;/code&gt; also contains the role, database, and schema information configured earlier.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgv7h7spfe1ijdwiuc28u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgv7h7spfe1ijdwiuc28u.png" alt="2026-03-21_05h26_07" width="800" height="548"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; In DCM Projects, the &lt;strong&gt;DCM project folder&lt;/strong&gt; (the set of definition files consisting of &lt;code&gt;manifest.yml&lt;/code&gt; + &lt;code&gt;sources/definitions/&lt;/code&gt; in the Workspace) and the &lt;strong&gt;DCM project object&lt;/strong&gt; (a Snowflake schema-level object that is the target of deployment) are separate concepts. A single project folder can have multiple targets, each deploying to a different DCM project object. If you deploy different configurations to a single DCM project object, objects no longer included in the rendered result may become drop targets. Therefore, when managing multiple environments like DEV/PROD, specify a separate DCM project object for each environment.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;2. Deleting the Default .sql Files&lt;/h3&gt;

&lt;p&gt;Since this walkthrough uses its own set of definition files, delete the default &lt;code&gt;.sql&lt;/code&gt; files that were generated.&lt;/p&gt;

&lt;p&gt;For reference, let's review what each file contains before deleting them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/examples.sql&lt;/code&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This file contains the most common syntax for defining objects using DCM Projects. The key point is defining each object's settings with &lt;code&gt;define&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftzztwu979zuklb58mm0h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftzztwu979zuklb58mm0h.png" alt="2026-03-21_05h38_57" width="800" height="780"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/jinja_demo.sql&lt;/code&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Since DCM Projects supports Jinja2 templates, you can programmatically define objects using &lt;code&gt;for&lt;/code&gt; and &lt;code&gt;if&lt;/code&gt;. This file demonstrates creating multiple team tables by looping through the &lt;code&gt;teams&lt;/code&gt; values defined in &lt;code&gt;manifest.yml&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fks4099xinspu6musuj8m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fks4099xinspu6musuj8m.png" alt="2026-03-21_05h45_53" width="800" height="604"&gt;&lt;/a&gt;&lt;/p&gt;
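&lt;p&gt;For reference, the general shape of such a Jinja2 loop looks like the sketch below. This is an illustration of the pattern, not the exact contents of the generated file; the table name and column are made up:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Illustrative sketch: loop over a `teams` list defined in manifest.yml
{% for team in teams %}
DEFINE TABLE {{ db_name }}.RAW.{{ team }}_EVENTS (ID INT);
{% endfor %}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;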

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/macros/grants_macro.sql&lt;/code&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Since Jinja2 templates are available, you can also create macros like in dbt. This macro creates &lt;code&gt;DEVELOPER&lt;/code&gt; and &lt;code&gt;USAGE&lt;/code&gt; roles for a given team and grants privileges to each.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjxpp9v932azb6ycc062b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjxpp9v932azb6ycc062b.png" alt="2026-03-21_05h56_14" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;Folder Structure After Deleting Default .sql Files&lt;/h4&gt;

&lt;p&gt;It should look like the image below. The &lt;code&gt;macros&lt;/code&gt; folder was also deleted since it's not needed for this exercise.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: &lt;code&gt;tmp.sql&lt;/code&gt; is a file where I wrote the queries to create the database / schema earlier.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frmda6d1b1d24q3nutuk9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frmda6d1b1d24q3nutuk9.png" alt="2026-03-21_05h57_40" width="536" height="388"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;3. Configuring manifest.yml (DEV/PROD Targets)&lt;/h3&gt;

&lt;p&gt;To separate DEV/PROD environments at the database level within the same account, configure &lt;code&gt;manifest.yml&lt;/code&gt; as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;manifest_version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;2&lt;/span&gt;
&lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DCM_PROJECT&lt;/span&gt;

&lt;span class="na"&gt;default_target&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DEV&lt;/span&gt;

&lt;span class="na"&gt;targets&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;DEV&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;account_identifier&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;MY_ACCOUNT&lt;/span&gt;
    &lt;span class="na"&gt;project_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DCM_ADMIN.PROJECTS.MY_DCM_PROJECT_DEV&lt;/span&gt;
    &lt;span class="na"&gt;project_owner&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ACCOUNTADMIN&lt;/span&gt;
    &lt;span class="na"&gt;templating_config&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DEV&lt;/span&gt;
  &lt;span class="na"&gt;PROD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;account_identifier&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;MY_ACCOUNT&lt;/span&gt;
    &lt;span class="na"&gt;project_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DCM_ADMIN.PROJECTS.MY_DCM_PROJECT_PROD&lt;/span&gt;
    &lt;span class="na"&gt;project_owner&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ACCOUNTADMIN&lt;/span&gt;
    &lt;span class="na"&gt;templating_config&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PROD&lt;/span&gt;

&lt;span class="na"&gt;templating&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;defaults&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;wh_auto_suspend&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;60&lt;/span&gt;
  &lt;span class="na"&gt;configurations&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;DEV&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;db_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DEV_DB&lt;/span&gt;
      &lt;span class="na"&gt;wh_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DEV_WH&lt;/span&gt;
      &lt;span class="na"&gt;wh_size&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;X-SMALL&lt;/span&gt;
      &lt;span class="na"&gt;role_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DEV_EXPLORER_ROLE&lt;/span&gt;
    &lt;span class="na"&gt;PROD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;db_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PROD_DB&lt;/span&gt;
      &lt;span class="na"&gt;wh_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PROD_WH&lt;/span&gt;
      &lt;span class="na"&gt;wh_size&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;SMALL&lt;/span&gt;
      &lt;span class="na"&gt;role_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PROD_EXPLORER_ROLE&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fur0bxygr1m4ti231985w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fur0bxygr1m4ti231985w.png" alt="2026-03-21_06h01_13" width="800" height="670"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Key points:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Under &lt;code&gt;targets&lt;/code&gt;, specify &lt;code&gt;account_identifier&lt;/code&gt; (same value since it's the same account), &lt;code&gt;project_name&lt;/code&gt; (fully qualified name of the DCM project object), &lt;code&gt;project_owner&lt;/code&gt; (owner role during deployment), and &lt;code&gt;templating_config&lt;/code&gt; (configuration name to use)&lt;/li&gt;
&lt;li&gt;Define default values in &lt;code&gt;templating.defaults&lt;/code&gt; and environment-specific values in &lt;code&gt;templating.configurations&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Variable resolution order is three stages: "defaults → configuration → runtime override"&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; Since this article includes role creation via &lt;code&gt;DEFINE ROLE&lt;/code&gt;, both DEV and PROD use &lt;code&gt;project_owner: ACCOUNTADMIN&lt;/code&gt; to satisfy the &lt;code&gt;CREATE ROLE&lt;/code&gt; privilege requirement. This is to simplify the verification and is &lt;strong&gt;not&lt;/strong&gt; a recommendation to routinely use &lt;code&gt;ACCOUNTADMIN&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The official docs' CI/CD best practices advise against using general administrator roles (like &lt;code&gt;ACCOUNTADMIN&lt;/code&gt;) as the project owner in production. In practice, consider creating dedicated deployment roles like &lt;code&gt;DCM_DEV_DEPLOYER&lt;/code&gt; / &lt;code&gt;DCM_PROD_DEPLOYER&lt;/code&gt; or service users for each environment.&lt;/p&gt;

&lt;p&gt;Also, &lt;strong&gt;by default, the role that executes the deploy (project owner) holds OWNERSHIP of the deployed objects&lt;/strong&gt;. In this article's setup, all objects after deploy are owned by &lt;code&gt;ACCOUNTADMIN&lt;/code&gt;. In practice, consider a setup where a dedicated deployment role is the owner, and plan your &lt;code&gt;GRANT OWNERSHIP&lt;/code&gt; design accordingly. Note that if the project owner transfers OWNERSHIP to another role it doesn't hold, it may become unable to continue managing that object in subsequent deploys.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;4. Defining Databases, Schemas, and Warehouses&lt;/h3&gt;

&lt;p&gt;Now let's start defining the objects.&lt;/p&gt;

&lt;p&gt;Create definition files for databases, schemas, and warehouses under the &lt;code&gt;sources/definitions/&lt;/code&gt; directory.&lt;/p&gt;

&lt;p&gt;Since the Jinja2 template variable &lt;code&gt;{{ db_name }}&lt;/code&gt; is used, it automatically expands to &lt;code&gt;DEV_DB&lt;/code&gt; for the DEV configuration and &lt;code&gt;PROD_DB&lt;/code&gt; for the PROD configuration.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/databases.sql&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftxr4a2tkady4750hozm8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftxr4a2tkady4750hozm8.png" alt="2026-03-21_05h59_12" width="800" height="267"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/schemas.sql&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;STG&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;MART&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdvzpmjmxcfk40dhid80.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdvzpmjmxcfk40dhid80.png" alt="2026-03-21_05h59_54" width="800" height="311"&gt;&lt;/a&gt;&lt;/p&gt;
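&lt;p&gt;To make the template expansion concrete: for the DEV target, the two files above render to the following (PROD substitutes &lt;code&gt;PROD_DB&lt;/code&gt;):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Rendered result for the DEV target (db_name = DEV_DB)
DEFINE DATABASE DEV_DB;
DEFINE SCHEMA DEV_DB.RAW;
DEFINE SCHEMA DEV_DB.STG;
DEFINE SCHEMA DEV_DB.MART;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;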

&lt;p&gt;The warehouse size is also parameterized with &lt;code&gt;{{ wh_size }}&lt;/code&gt; from &lt;code&gt;manifest.yml&lt;/code&gt;, so DEV gets &lt;code&gt;X-SMALL&lt;/code&gt; and PROD gets &lt;code&gt;SMALL&lt;/code&gt;. &lt;code&gt;{{ wh_auto_suspend }}&lt;/code&gt; uses the shared value defined in &lt;code&gt;templating.defaults&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/warehouses.sql&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
  &lt;span class="k"&gt;WITH&lt;/span&gt;
    &lt;span class="n"&gt;WAREHOUSE_SIZE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'{{ wh_size }}'&lt;/span&gt;
    &lt;span class="n"&gt;AUTO_SUSPEND&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_auto_suspend&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
    &lt;span class="n"&gt;AUTO_RESUME&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;TRUE&lt;/span&gt;
    &lt;span class="n"&gt;INITIALLY_SUSPENDED&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;TRUE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frw0nhfokypd67ffsvbvg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frw0nhfokypd67ffsvbvg.png" alt="2026-03-21_06h03_33" width="800" height="271"&gt;&lt;/a&gt;&lt;/p&gt;
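&lt;p&gt;Expanded with the DEV configuration and the shared default from &lt;code&gt;templating.defaults&lt;/code&gt;, this renders to:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Rendered result for the DEV target
-- (wh_name = DEV_WH, wh_size = X-SMALL, wh_auto_suspend = 60)
DEFINE WAREHOUSE DEV_WH
  WITH
    WAREHOUSE_SIZE = 'X-SMALL'
    AUTO_SUSPEND = 60
    AUTO_RESUME = TRUE
    INITIALLY_SUSPENDED = TRUE;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;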

&lt;h3&gt;5. Defining Roles and GRANTs&lt;/h3&gt;

&lt;p&gt;Create a data exploration role for each environment and grant the necessary privileges. With the template variable &lt;code&gt;{{ role_name }}&lt;/code&gt;, &lt;code&gt;DEV_EXPLORER_ROLE&lt;/code&gt; is created for DEV and &lt;code&gt;PROD_EXPLORER_ROLE&lt;/code&gt; for PROD.&lt;/p&gt;

&lt;p&gt;In DCM Projects, you can write both &lt;code&gt;DEFINE&lt;/code&gt; statements and &lt;code&gt;GRANT&lt;/code&gt; statements in the same file.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/roles.sql&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;

&lt;span class="c1"&gt;-- USAGE privilege on database&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;USAGE&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;

&lt;span class="c1"&gt;-- USAGE privilege on each schema&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;USAGE&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;USAGE&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;STG&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;USAGE&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;MART&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;

&lt;span class="c1"&gt;-- RAW schema: SELECT privilege on tables (read-only)&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="k"&gt;ALL&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;

&lt;span class="c1"&gt;-- STG/MART schemas: SELECT privilege on Dynamic Tables (read-only)&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="k"&gt;ALL&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;STG&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="k"&gt;ALL&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;MART&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;

&lt;span class="c1"&gt;-- USAGE privilege on warehouse&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;USAGE&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}};&lt;/span&gt;

&lt;span class="c1"&gt;-- Role hierarchy to SYSADMIN&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;role_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;SYSADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fek3xrvd8jsd3wxqpwins.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fek3xrvd8jsd3wxqpwins.png" alt="2026-03-21_06h04_48" width="800" height="489"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; DCM Projects tracks only GRANTs deployed through DCM Projects. If you remove a &lt;code&gt;GRANT&lt;/code&gt; statement from the definition files, that GRANT will become a revoke target on the next Deploy.&lt;/p&gt;
&lt;/blockquote&gt;
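&lt;p&gt;Once a deploy has run (covered later), the grants actually held by a role can be inspected with standard commands. A hypothetical check using this walkthrough's DEV role name:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Lists every privilege granted to the role, including those applied by DCM Projects
SHOW GRANTS TO ROLE DEV_EXPLORER_ROLE;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;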

&lt;h3&gt;6. Defining Tables&lt;/h3&gt;

&lt;p&gt;Define tables in the RAW schema to store e-commerce order data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/raw_tables.sql&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMERS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;CUSTOMER_ID&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;CUSTOMER_NAME&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="n"&gt;EMAIL&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="n"&gt;REGION&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="n"&gt;CREATED_AT&lt;/span&gt; &lt;span class="n"&gt;TIMESTAMP_NTZ&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;CURRENT_TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDERS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;ORDER_ID&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;CUSTOMER_ID&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;ORDER_DATE&lt;/span&gt; &lt;span class="nb"&gt;DATE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;ORDER_STATUS&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="n"&gt;TOTAL_AMOUNT&lt;/span&gt; &lt;span class="nb"&gt;DECIMAL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="n"&gt;CREATED_AT&lt;/span&gt; &lt;span class="n"&gt;TIMESTAMP_NTZ&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;CURRENT_TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_ITEMS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;ORDER_ITEM_ID&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;ORDER_ID&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;PRODUCT_NAME&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="n"&gt;QUANTITY&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;UNIT_PRICE&lt;/span&gt; &lt;span class="nb"&gt;DECIMAL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="n"&gt;CREATED_AT&lt;/span&gt; &lt;span class="n"&gt;TIMESTAMP_NTZ&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;CURRENT_TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fweck43d9swbjo6wz93ge.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fweck43d9swbjo6wz93ge.png" alt="2026-03-21_06h07_05" width="800" height="498"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  7. Defining an Internal Stage
&lt;/h3&gt;

&lt;p&gt;Define an Internal Stage for data ingestion. Directory table settings can also be configured here; enabling the directory table lets you list the staged files with SQL.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/stages.sql&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="n"&gt;STAGE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DATA_STAGE&lt;/span&gt;
  &lt;span class="n"&gt;DIRECTORY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ENABLE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;TRUE&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
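&lt;p&gt;Once the stage exists, you can upload files and, because the directory table is enabled, list them with SQL. A minimal sketch (the local file path and the &lt;code&gt;MY_DB&lt;/code&gt; database name are placeholders standing in for the templated &lt;code&gt;db_name&lt;/code&gt;):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Upload a CSV into the customers/ prefix (from SnowSQL or the Snowflake CLI)
PUT file:///tmp/customers.csv @MY_DB.RAW.DATA_STAGE/customers;

-- Refresh the directory table, then list the staged files
ALTER STAGE MY_DB.RAW.DATA_STAGE REFRESH;
SELECT RELATIVE_PATH, SIZE, LAST_MODIFIED
FROM DIRECTORY(@MY_DB.RAW.DATA_STAGE);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;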



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F01mh83z0vq58gsfv0q5a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F01mh83z0vq58gsfv0q5a.png" alt="2026-03-21_06h09_30" width="800" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  8. Defining Dynamic Tables
&lt;/h3&gt;

&lt;p&gt;Define Dynamic Tables in the STG and MART schemas to build a RAW → STG → MART pipeline.&lt;/p&gt;

&lt;p&gt;In this three-layer architecture, the STG layer performs data cleansing (&lt;code&gt;TRIM&lt;/code&gt;, &lt;code&gt;LOWER&lt;/code&gt;, &lt;code&gt;UPPER&lt;/code&gt;, etc.) and the MART layer performs aggregation and joins. STG layer Dynamic Tables are set to &lt;code&gt;TARGET_LAG = DOWNSTREAM&lt;/code&gt;, so they refresh only when a downstream table needs fresh data, while MART layer tables use &lt;code&gt;TARGET_LAG = '1 hour'&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/stg_dynamic_tables.sql&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;STG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;STG_CUSTOMERS&lt;/span&gt;
  &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
  &lt;span class="n"&gt;TARGET_LAG&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;DOWNSTREAM&lt;/span&gt;
  &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt;
      &lt;span class="n"&gt;CUSTOMER_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;TRIM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_NAME&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;CUSTOMER_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;LOWER&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;EMAIL&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;EMAIL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;UPPER&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;REGION&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;REGION&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;CREATED_AT&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMERS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;STG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;STG_ORDERS&lt;/span&gt;
  &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
  &lt;span class="n"&gt;TARGET_LAG&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;DOWNSTREAM&lt;/span&gt;
  &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt;
      &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_DATE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;UPPER&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_STATUS&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;ORDER_STATUS&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;TOTAL_AMOUNT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CREATED_AT&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDERS&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;STG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;STG_ORDER_ITEMS&lt;/span&gt;
  &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
  &lt;span class="n"&gt;TARGET_LAG&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;DOWNSTREAM&lt;/span&gt;
  &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt;
      &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_ITEM_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;TRIM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PRODUCT_NAME&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;PRODUCT_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;QUANTITY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;UNIT_PRICE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;QUANTITY&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;UNIT_PRICE&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;LINE_TOTAL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CREATED_AT&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_ITEMS&lt;/span&gt; &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fynwzjegliuazarw6gscz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fynwzjegliuazarw6gscz.png" alt="2026-03-21_06h12_33" width="800" height="686"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/mart_dynamic_tables.sql&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;MART&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;MART_CUSTOMER_ORDERS&lt;/span&gt;
  &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
  &lt;span class="n"&gt;TARGET_LAG&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'1 hour'&lt;/span&gt;
  &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt;
      &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;EMAIL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;REGION&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_ID&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;TOTAL_ORDERS&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;TOTAL_AMOUNT&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;TOTAL_SPENT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;MIN&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_DATE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;FIRST_ORDER_DATE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_DATE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;LAST_ORDER_DATE&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;STG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;STG_CUSTOMERS&lt;/span&gt; &lt;span class="k"&gt;C&lt;/span&gt;
    &lt;span class="k"&gt;LEFT&lt;/span&gt; &lt;span class="k"&gt;JOIN&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;STG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;STG_ORDERS&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;
      &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_ID&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;O&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_ID&lt;/span&gt;
    &lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMER_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;EMAIL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;C&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;REGION&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;MART&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;MART_PRODUCT_SALES&lt;/span&gt;
  &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
  &lt;span class="n"&gt;TARGET_LAG&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'1 hour'&lt;/span&gt;
  &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt;
      &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PRODUCT_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;DISTINCT&lt;/span&gt; &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_ID&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;ORDER_COUNT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;QUANTITY&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;TOTAL_QUANTITY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LINE_TOTAL&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;TOTAL_REVENUE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="k"&gt;AVG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;UNIT_PRICE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;AVG_UNIT_PRICE&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;STG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;STG_ORDER_ITEMS&lt;/span&gt; &lt;span class="n"&gt;OI&lt;/span&gt;
    &lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;OI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PRODUCT_NAME&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
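&lt;p&gt;After deployment, you can check that the lag settings behave as intended from the refresh history: STG tables on &lt;code&gt;DOWNSTREAM&lt;/code&gt; should only refresh when a MART table pulls from them. A sketch using the &lt;code&gt;DYNAMIC_TABLE_REFRESH_HISTORY&lt;/code&gt; table function (&lt;code&gt;MY_DB&lt;/code&gt; is a placeholder for the templated database name; the exact output columns may vary by version):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Recent refreshes across the pipeline's Dynamic Tables
SELECT *
FROM TABLE(MY_DB.INFORMATION_SCHEMA.DYNAMIC_TABLE_REFRESH_HISTORY())
ORDER BY REFRESH_START_TIME DESC
LIMIT 20;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;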



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiibocgqilecmt6lk8dkj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiibocgqilecmt6lk8dkj.png" alt="2026-03-21_06h12_51" width="800" height="570"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  9. Defining Tasks (Data Ingestion via COPY INTO)
&lt;/h3&gt;

&lt;p&gt;Define tasks using &lt;code&gt;DEFINE TASK&lt;/code&gt; to load CSV files uploaded to the Internal Stage into each table.&lt;/p&gt;

&lt;p&gt;Each task is configured with a CRON schedule to run daily at 2:00 AM (JST). With &lt;code&gt;PURGE = TRUE&lt;/code&gt; specified, files on the stage are automatically deleted after a successful COPY INTO.&lt;/p&gt;

&lt;p&gt;Note that tasks are created in a suspended state when deployed.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; File Format is not included in DCM Projects' supported objects (as of March 20, 2026), so it cannot be defined with &lt;code&gt;DEFINE FILE FORMAT&lt;/code&gt;. In this article, format options are specified inline in the &lt;code&gt;FILE_FORMAT&lt;/code&gt; of the COPY INTO statement to avoid a separate File Format object creation step. For objects not supported by DEFINE, you can supplement them with separate SQL scripts (&lt;code&gt;EXECUTE IMMEDIATE FROM&lt;/code&gt; or &lt;code&gt;snow sql&lt;/code&gt;) after DCM PLAN/DEPLOY.&lt;/p&gt;
&lt;/blockquote&gt;
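&lt;p&gt;As a concrete example of that workaround, a File Format could be created by a plain SQL script stored on a stage and executed after deployment. The script path and format name below are hypothetical:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- scripts/file_formats.sql, uploaded to the stage beforehand
CREATE OR REPLACE FILE FORMAT MY_DB.RAW.CSV_FORMAT
  TYPE = 'CSV'
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
  SKIP_HEADER = 1
  NULL_IF = ('NULL', 'null', '');

-- Run the script after DCM PLAN/DEPLOY
EXECUTE IMMEDIATE FROM @MY_DB.RAW.DATA_STAGE/scripts/file_formats.sql;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;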

&lt;p&gt;&lt;strong&gt;&lt;code&gt;sources/definitions/tasks.sql&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- COPY INTO task for CUSTOMERS table&lt;/span&gt;
&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="n"&gt;TASK&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LOAD_CUSTOMERS&lt;/span&gt;
  &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
  &lt;span class="n"&gt;SCHEDULE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'USING CRON 0 2 * * * Asia/Tokyo'&lt;/span&gt;
  &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;COPY&lt;/span&gt; &lt;span class="k"&gt;INTO&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMERS&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DATA_STAGE&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;customers&lt;/span&gt;
    &lt;span class="n"&gt;FILE_FORMAT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;TYPE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CSV'&lt;/span&gt; &lt;span class="n"&gt;FIELD_OPTIONALLY_ENCLOSED_BY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'"'&lt;/span&gt; &lt;span class="n"&gt;SKIP_HEADER&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="n"&gt;NULL_IF&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'NULL'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'null'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="n"&gt;ON_ERROR&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CONTINUE'&lt;/span&gt;
    &lt;span class="n"&gt;PURGE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;TRUE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- COPY INTO task for ORDERS table&lt;/span&gt;
&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="n"&gt;TASK&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LOAD_ORDERS&lt;/span&gt;
  &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
  &lt;span class="n"&gt;SCHEDULE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'USING CRON 0 2 * * * Asia/Tokyo'&lt;/span&gt;
  &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;COPY&lt;/span&gt; &lt;span class="k"&gt;INTO&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDERS&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DATA_STAGE&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;orders&lt;/span&gt;
    &lt;span class="n"&gt;FILE_FORMAT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;TYPE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CSV'&lt;/span&gt; &lt;span class="n"&gt;FIELD_OPTIONALLY_ENCLOSED_BY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'"'&lt;/span&gt; &lt;span class="n"&gt;SKIP_HEADER&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="n"&gt;NULL_IF&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'NULL'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'null'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="n"&gt;ON_ERROR&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CONTINUE'&lt;/span&gt;
    &lt;span class="n"&gt;PURGE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;TRUE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- COPY INTO task for ORDER_ITEMS table&lt;/span&gt;
&lt;span class="n"&gt;DEFINE&lt;/span&gt; &lt;span class="n"&gt;TASK&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LOAD_ORDER_ITEMS&lt;/span&gt;
  &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;wh_name&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
  &lt;span class="n"&gt;SCHEDULE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'USING CRON 0 2 * * * Asia/Tokyo'&lt;/span&gt;
  &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;COPY&lt;/span&gt; &lt;span class="k"&gt;INTO&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_ITEMS&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;db_name&lt;/span&gt; &lt;span class="p"&gt;}}.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DATA_STAGE&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;order_items&lt;/span&gt;
    &lt;span class="n"&gt;FILE_FORMAT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;TYPE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CSV'&lt;/span&gt; &lt;span class="n"&gt;FIELD_OPTIONALLY_ENCLOSED_BY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'"'&lt;/span&gt; &lt;span class="n"&gt;SKIP_HEADER&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="n"&gt;NULL_IF&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'NULL'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'null'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="n"&gt;ON_ERROR&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CONTINUE'&lt;/span&gt;
    &lt;span class="n"&gt;PURGE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;TRUE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
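&lt;p&gt;Since the tasks come out of deployment suspended, the CRON schedule does nothing until they are resumed. You can also fire one manually to test the load. (&lt;code&gt;MY_DB&lt;/code&gt; is a placeholder for the templated database name.)&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Resume each task so its schedule takes effect
ALTER TASK MY_DB.RAW.LOAD_CUSTOMERS RESUME;
ALTER TASK MY_DB.RAW.LOAD_ORDERS RESUME;
ALTER TASK MY_DB.RAW.LOAD_ORDER_ITEMS RESUME;

-- Or trigger a single run immediately
EXECUTE TASK MY_DB.RAW.LOAD_CUSTOMERS;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;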



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8a8cmaznl07vybk1e764.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8a8cmaznl07vybk1e764.png" alt="2026-03-21_06h14_29" width="800" height="640"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  10. Plan &amp;amp; Deploy to the DEV Target
&lt;/h3&gt;

&lt;p&gt;Now that the definition files are ready, let's run Plan against the DEV target first.&lt;/p&gt;

&lt;p&gt;In the Snowsight Workspace DCM Project screen, switch the target to "DEV", ensure "Plan" is selected, and click the play button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fey729bpxdd2f837w1yab.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fey729bpxdd2f837w1yab.png" alt="2026-03-21_06h15_36" width="800" height="157"&gt;&lt;/a&gt;&lt;/p&gt;
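&lt;p&gt;The same Plan and Deploy operations can also be issued in SQL rather than through the UI. Based on my reading of the DCM Projects documentation, the statement looks roughly like the following; treat the exact syntax (and the project name) as an assumption and verify it against the current reference:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Preview the changes for the DEV configuration
EXECUTE DCM PROJECT MY_DB.PUBLIC.MY_DCM_PROJECT PLAN USING CONFIGURATION dev;

-- Apply them
EXECUTE DCM PROJECT MY_DB.PUBLIC.MY_DCM_PROJECT DEPLOY USING CONFIGURATION dev;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;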

&lt;p&gt;A list of objects to be created is displayed. Clicking on each object shows the detailed configuration values.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft9fnwubbjoo1pgl157fl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft9fnwubbjoo1pgl157fl.png" alt="2026-03-21_06h16_48" width="800" height="1048"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By the way, clicking &lt;code&gt;Summarize&lt;/code&gt; in the upper right provides a summary of what changes will be made.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fprvrbeenjcdscf1ps9bt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fprvrbeenjcdscf1ps9bt.png" alt="2026-03-21_06h18_53" width="800" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F223047alpdy7dyksaac0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F223047alpdy7dyksaac0.png" alt="2026-03-21_06h20_17" width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If everything looks good, click the "Deploy" button. According to the &lt;a href="https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-use" rel="noopener noreferrer"&gt;official docs&lt;/a&gt;, the "Alias" is described as &lt;code&gt;Think of the deployment alias like a commit message for your code change.&lt;/code&gt;, so I entered "first deploy".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3ad0ui74le4d3bulvvyz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3ad0ui74le4d3bulvvyz.png" alt="2026-03-21_06h20_56" width="800" height="207"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe70syos7eeahb9xcw70k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe70syos7eeahb9xcw70k.png" alt="2026-03-21_06h23_49" width="800" height="440"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When the deployment completes successfully, it displays as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fih2yvi05plkeo09210ht.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fih2yvi05plkeo09210ht.png" alt="2026-03-21_06h24_41" width="800" height="296"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After deployment, verify that the objects have been created. (Below is an example using queries.)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;SCHEMAS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;TASKS&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;STG&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;MART&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmkxj4h97a56bp9s78ojz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmkxj4h97a56bp9s78ojz.png" alt="2026-03-21_06h25_52" width="800" height="407"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Verify roles and grants&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;ROLES&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'DEV_EXPLORER_ROLE'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;GRANTS&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;DEV_EXPLORER_ROLE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqwmslzhj81xvteehirqm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqwmslzhj81xvteehirqm.png" alt="2026-03-21_06h26_28" width="800" height="572"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  11. Loading Sample Data and Verifying Operation
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Preparing Sample CSVs
&lt;/h4&gt;

&lt;p&gt;Load sample data into the DEV environment to verify that the COPY INTO tasks and Dynamic Table pipeline work correctly.&lt;/p&gt;

&lt;p&gt;First, prepare the following three CSV files locally.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;customers.csv&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CUSTOMER_ID,CUSTOMER_NAME,EMAIL,REGION,CREATED_AT
1,Taro Tanaka,tanaka@example.com,tokyo,2025-12-01 10:00:00
2,Hanako Sato,sato@example.com,osaka,2025-12-05 11:30:00
3,Ichiro Suzuki,suzuki@example.com,tokyo,2025-12-10 09:15:00
4,Misaki Takahashi,takahashi@example.com,fukuoka,2025-12-15 14:00:00
5,Kenta Ito,ito@example.com,osaka,2025-12-20 16:45:00
6,Yumi Watanabe,watanabe@example.com,nagoya,2026-01-03 08:30:00
7,Takuya Yamamoto,yamamoto@example.com,tokyo,2026-01-08 13:00:00
8,Sakura Nakamura,nakamura@example.com,sapporo,2026-01-12 10:20:00
9,Daisuke Kobayashi,kobayashi@example.com,fukuoka,2026-01-18 15:10:00
10,Ai Kato,kato@example.com,tokyo,2026-01-25 11:00:00
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;code&gt;orders.csv&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ORDER_ID,CUSTOMER_ID,ORDER_DATE,ORDER_STATUS,TOTAL_AMOUNT,CREATED_AT
1001,1,2026-01-15,completed,15000.00,2026-01-15 10:30:00
1002,2,2026-01-18,completed,8500.00,2026-01-18 14:00:00
1003,1,2026-01-22,completed,23000.00,2026-01-22 09:45:00
1004,3,2026-02-01,completed,4200.00,2026-02-01 11:20:00
1005,4,2026-02-05,shipped,31500.00,2026-02-05 16:00:00
1006,5,2026-02-10,completed,9800.00,2026-02-10 13:30:00
1007,2,2026-02-14,completed,12000.00,2026-02-14 10:15:00
1008,6,2026-02-20,shipped,7600.00,2026-02-20 15:45:00
1009,7,2026-03-01,completed,18500.00,2026-03-01 08:00:00
1010,8,2026-03-05,pending,5400.00,2026-03-05 12:30:00
1011,3,2026-03-08,completed,11200.00,2026-03-08 14:20:00
1012,9,2026-03-10,shipped,26800.00,2026-03-10 09:00:00
1013,10,2026-03-12,completed,3900.00,2026-03-12 11:45:00
1014,1,2026-03-15,pending,14700.00,2026-03-15 16:30:00
1015,5,2026-03-18,completed,8900.00,2026-03-18 10:00:00
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;code&gt;order_items.csv&lt;/code&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ORDER_ITEM_ID,ORDER_ID,PRODUCT_NAME,QUANTITY,UNIT_PRICE,CREATED_AT
1,1001,Wireless Earbuds,1,8000.00,2026-01-15 10:30:00
2,1001,Smartphone Case,2,3500.00,2026-01-15 10:30:00
3,1002,USB Cable,5,1700.00,2026-01-18 14:00:00
4,1003,Monitor Arm,1,15000.00,2026-01-22 09:45:00
5,1003,Mouse Pad,2,4000.00,2026-01-22 09:45:00
6,1004,Keyboard Cover,3,1400.00,2026-02-01 11:20:00
7,1005,4K Monitor,1,29000.00,2026-02-05 16:00:00
8,1005,HDMI Cable,1,2500.00,2026-02-05 16:00:00
9,1006,Wireless Mouse,2,4900.00,2026-02-10 13:30:00
10,1007,Webcam,1,12000.00,2026-02-14 10:15:00
11,1008,USB Hub,2,3800.00,2026-02-20 15:45:00
12,1009,Mechanical Keyboard,1,18500.00,2026-03-01 08:00:00
13,1010,Mouse Pad,1,4000.00,2026-03-05 12:30:00
14,1010,Keyboard Cover,1,1400.00,2026-03-05 12:30:00
15,1011,Wireless Earbuds,1,8000.00,2026-03-08 14:20:00
16,1011,USB Cable,2,1600.00,2026-03-08 14:20:00
17,1012,4K Monitor,1,29000.00,2026-03-10 09:00:00
18,1013,Smartphone Case,1,3900.00,2026-03-12 11:45:00
19,1014,Webcam,1,12000.00,2026-03-15 16:30:00
20,1014,USB Cable,3,900.00,2026-03-15 16:30:00
21,1015,Wireless Mouse,1,4900.00,2026-03-18 10:00:00
22,1015,Mouse Pad,1,4000.00,2026-03-18 10:00:00
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Uploading CSV Files
&lt;/h4&gt;

&lt;p&gt;Upload the prepared CSV files to the internal stage from Snowsight.&lt;/p&gt;

&lt;p&gt;Since the task definitions specify sub-paths like &lt;code&gt;FROM @{{ db_name }}.RAW.DATA_STAGE/customers&lt;/code&gt;, separate the upload path for each file:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;customers.csv&lt;/code&gt; → Path: &lt;code&gt;/customers/&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;orders.csv&lt;/code&gt; → Path: &lt;code&gt;/orders/&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;order_items.csv&lt;/code&gt; → Path: &lt;code&gt;/order_items/&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
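
If you prefer a terminal over the Snowsight upload UI, the same layout can be created with `PUT` from SnowSQL or another Snowflake client. The local file paths below are assumptions; adjust them to wherever you saved the CSVs:

```sql
-- Upload each CSV into its own sub-path on the internal stage.
-- AUTO_COMPRESS = FALSE keeps the files as plain .csv
-- (adjust if your file format expects gzip).
PUT file:///tmp/customers.csv    @DEV_DB.RAW.DATA_STAGE/customers/    AUTO_COMPRESS = FALSE;
PUT file:///tmp/orders.csv       @DEV_DB.RAW.DATA_STAGE/orders/      AUTO_COMPRESS = FALSE;
PUT file:///tmp/order_items.csv  @DEV_DB.RAW.DATA_STAGE/order_items/ AUTO_COMPRESS = FALSE;
```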

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fel0hj5z4r3mrs6ozpuwh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fel0hj5z4r3mrs6ozpuwh.png" alt="2026-03-21_06h31_40" width="800" height="803"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After uploading, verify that the files are placed on the stage. You should see three CSV files in their respective sub-paths.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;LIST&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DATA_STAGE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvzhl62htjqzhfbnzq11.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvzhl62htjqzhfbnzq11.png" alt="2026-03-21_06h32_58" width="800" height="369"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Resuming and Manually Executing Tasks
&lt;/h4&gt;

&lt;p&gt;Tasks are created in a suspended state (&lt;code&gt;state&lt;/code&gt; is &lt;code&gt;suspended&lt;/code&gt;) when deployed. First, check their status with &lt;code&gt;SHOW TASKS&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;TASKS&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7b6fr4keup3qiw576dzn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7b6fr4keup3qiw576dzn.png" alt="2026-03-21_06h33_33" width="800" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can confirm that the &lt;code&gt;state&lt;/code&gt; column shows &lt;code&gt;suspended&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;To enable the tasks, run &lt;code&gt;ALTER TASK ... RESUME&lt;/code&gt;. Since these are three independent tasks (no parent-child relationship), resume each one individually.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;TASK&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LOAD_CUSTOMERS&lt;/span&gt; &lt;span class="n"&gt;RESUME&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;TASK&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LOAD_ORDERS&lt;/span&gt; &lt;span class="n"&gt;RESUME&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;TASK&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LOAD_ORDER_ITEMS&lt;/span&gt; &lt;span class="n"&gt;RESUME&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run &lt;code&gt;SHOW TASKS&lt;/code&gt; again and confirm the &lt;code&gt;state&lt;/code&gt; column shows &lt;code&gt;started&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;TASKS&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3tvr8almhfvf9vvkr72b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3tvr8almhfvf9vvkr72b.png" alt="2026-03-21_06h34_23" width="800" height="382"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The tasks are scheduled to run daily at 2:00 AM (JST), but for verification purposes, run them immediately with &lt;code&gt;EXECUTE TASK&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;EXECUTE&lt;/span&gt; &lt;span class="n"&gt;TASK&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LOAD_CUSTOMERS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;EXECUTE&lt;/span&gt; &lt;span class="n"&gt;TASK&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LOAD_ORDERS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;EXECUTE&lt;/span&gt; &lt;span class="n"&gt;TASK&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LOAD_ORDER_ITEMS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Verify that data has been loaded into each table.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CUSTOMERS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;  &lt;span class="c1"&gt;-- 10 rows&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDERS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;     &lt;span class="c1"&gt;-- 15 rows&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ORDER_ITEMS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="c1"&gt;-- 22 rows&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4m5bft2g1pt332y37xww.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4m5bft2g1pt332y37xww.png" alt="2026-03-21_06h35_09" width="800" height="242"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; As verified in the bonus section below, the &lt;a href="https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-supported-entities#label-dcm-projects-object-type-task" rel="noopener noreferrer"&gt;official docs&lt;/a&gt; state: &lt;code&gt;When definition changes are deployed for a task that is already started, Snowflake automatically suspends that task (or its root task) temporarily, applies the change, and then resumes it again.&lt;/code&gt; This means you don't need to manually suspend tasks even when modifying task definitions through DCM. This is fantastic!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h4&gt;
  
  
  Bulk-Refreshing Dynamic Tables with REFRESH ALL
&lt;/h4&gt;

&lt;p&gt;After executing COPY INTO, you could wait for Dynamic Tables to auto-refresh based on &lt;code&gt;TARGET_LAG&lt;/code&gt;, but DCM Projects' &lt;code&gt;REFRESH ALL&lt;/code&gt; command lets you immediately bulk-refresh all Dynamic Tables under the project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-pipelines" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/dcm-projects/dcm-projects-pipelines&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When executed via SQL, it looks like this. &lt;code&gt;REFRESH ALL&lt;/code&gt; triggers a refresh for all Dynamic Tables managed by the project and their upstream Dynamic Tables. The result is returned in JSON format, showing &lt;code&gt;inserted_rows&lt;/code&gt;, &lt;code&gt;deleted_rows&lt;/code&gt;, and &lt;code&gt;data_timestamp&lt;/code&gt; (a timestamp indicating data freshness) for each Dynamic Table.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;EXECUTE&lt;/span&gt; &lt;span class="n"&gt;DCM&lt;/span&gt; &lt;span class="n"&gt;PROJECT&lt;/span&gt; &lt;span class="n"&gt;DCM_ADMIN&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PROJECTS&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;MY_DCM_PROJECT_DEV&lt;/span&gt;
  &lt;span class="n"&gt;REFRESH&lt;/span&gt; &lt;span class="k"&gt;ALL&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiyqrgnl2kzurgdhoezhs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiyqrgnl2kzurgdhoezhs.png" alt="2026-03-21_06h38_52" width="800" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Verify that data has been reflected in the MART layer Dynamic Tables. Customer order summaries and product sales summaries should display correctly.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;MART&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;MART_CUSTOMER_ORDERS&lt;/span&gt; &lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;TOTAL_SPENT&lt;/span&gt; &lt;span class="k"&gt;DESC&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;DEV_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;MART&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;MART_PRODUCT_SALES&lt;/span&gt; &lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;TOTAL_REVENUE&lt;/span&gt; &lt;span class="k"&gt;DESC&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjlkq6v9oqrxnyeapmxf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjlkq6v9oqrxnyeapmxf.png" alt="2026-03-21_06h39_46" width="800" height="543"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  12. Plan &amp;amp; Deploy to the PROD Target
&lt;/h3&gt;

&lt;p&gt;Next, deploy to the PROD target in the same manner. Since the PROD target specifies &lt;code&gt;MY_DCM_PROJECT_PROD&lt;/code&gt; as a separate DCM project object in &lt;code&gt;manifest.yml&lt;/code&gt;, the DEV environment is not affected.&lt;/p&gt;

&lt;p&gt;First, run the following query to create the PROD DCM project object.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;ACCOUNTADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="n"&gt;DCM&lt;/span&gt; &lt;span class="n"&gt;PROJECT&lt;/span&gt; &lt;span class="n"&gt;IF&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;EXISTS&lt;/span&gt; &lt;span class="n"&gt;DCM_ADMIN&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PROJECTS&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;MY_DCM_PROJECT_PROD&lt;/span&gt;
    &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'DCM project for PROD'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, in the Workspace DCM Project screen, switch the target to "PROD", ensure "Plan" is selected, and click the play button. (If the PROD DCM project object is not recognized and an error occurs, refreshing the browser page should resolve it.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnlfk1a4ebbamg3kxix1g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnlfk1a4ebbamg3kxix1g.png" alt="2026-03-21_06h45_29" width="800" height="108"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Similar to DEV, the objects are displayed as new creation diffs against &lt;code&gt;PROD_DB&lt;/code&gt;. Confirm that the warehouse size is &lt;code&gt;SMALL&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjqmgastkm8o8kgfv0i0p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjqmgastkm8o8kgfv0i0p.png" alt="2026-03-21_06h48_03" width="800" height="1016"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If everything looks good, click the "Deploy" button to execute the deployment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj2tyoclz5u4ap41hhzqk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj2tyoclz5u4ap41hhzqk.png" alt="2026-03-21_06h48_45" width="800" height="183"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxe9fluaiizngaf0us824.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxe9fluaiizngaf0us824.png" alt="2026-03-21_06h49_09" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After the deployment completes successfully, verify the PROD environment objects. Confirm that &lt;code&gt;PROD_DB&lt;/code&gt; has the same schema structure and objects as &lt;code&gt;DEV_DB&lt;/code&gt;, the warehouse &lt;code&gt;PROD_WH&lt;/code&gt; size is &lt;code&gt;SMALL&lt;/code&gt;, and &lt;code&gt;PROD_EXPLORER_ROLE&lt;/code&gt; has been granted privileges on each object.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="n"&gt;PROD_DB&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;SCHEMAS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;TASKS&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;RAW&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;STG&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="k"&gt;DYNAMIC&lt;/span&gt; &lt;span class="n"&gt;TABLES&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;MART&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSES&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'PROD_WH'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg1oqvg0s1hxvufmswki7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg1oqvg0s1hxvufmswki7.png" alt="2026-03-21_06h50_43" width="800" height="339"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Verify roles and grants&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;ROLES&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'PROD_EXPLORER_ROLE'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;GRANTS&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;PROD_EXPLORER_ROLE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F087xiw7kkwkikxu3bwv4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F087xiw7kkwkikxu3bwv4.png" alt="2026-03-21_06h51_08" width="800" height="569"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This confirms that using &lt;strong&gt;a single DCM project folder (set of definition files)&lt;/strong&gt;, we can deploy the same configuration to &lt;strong&gt;separate DCM project objects for DEV and PROD&lt;/strong&gt; by switching template variables!&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: Task enablement and Dynamic Table refresh for the PROD environment need to be done separately.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Bonus: What Happens When You Modify or Remove Objects from Definition Files
&lt;/h2&gt;

&lt;p&gt;In DCM Projects, removing a &lt;code&gt;DEFINE&lt;/code&gt; statement from definition files is expected to make that object a drop target on the next Deploy. Let's verify this behavior.&lt;/p&gt;
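&lt;p&gt;For reference, a definition file entry is just a &lt;code&gt;DEFINE&lt;/code&gt; statement. The sketch below is hypothetical — the warehouse name and properties are illustrative, not taken from this project's actual files:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Hypothetical sketch of an entry in sources/definitions/warehouses.sql.
-- Removing (or commenting out) this statement makes the object
-- a DROP target on the next Plan/Deploy.
DEFINE WAREHOUSE DEV_WH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;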

&lt;h3&gt;
  
  
  Change #1: Commenting Out the Warehouse Definition
&lt;/h3&gt;

&lt;p&gt;Comment out the contents of &lt;code&gt;sources/definitions/warehouses.sql&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq6v28b9axymq9thgnc6b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq6v28b9axymq9thgnc6b.png" alt="2026-03-21_06h52_24" width="800" height="394"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Change #2: Modifying a Task Definition
&lt;/h3&gt;

&lt;p&gt;Change just one task to run at 3:00 AM daily instead of 2:00 AM.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3d2r0ccrh8xdgqchwq4i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3d2r0ccrh8xdgqchwq4i.png" alt="2026-03-21_06h55_31" width="800" height="887"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Running Plan
&lt;/h3&gt;

&lt;p&gt;Run Plan against the DEV target.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm3z1wdfrb7z2g3xycp72.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm3z1wdfrb7z2g3xycp72.png" alt="2026-03-21_06h56_16" width="536" height="126"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Plan result is shown below. &lt;code&gt;DEV_WH&lt;/code&gt; is displayed as a DROP target, and the task shows as an ALTER target with clear visibility into which configuration values will change.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp307axxuag7u9vrih1tq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp307axxuag7u9vrih1tq.png" alt="2026-03-21_06h57_53" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Running Deploy
&lt;/h3&gt;

&lt;p&gt;Proceed with the Deploy. (As a bonus note, the Alias also works fine with Japanese characters.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvn2h4hxelol6mk36juc4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvn2h4hxelol6mk36juc4.png" alt="2026-03-21_06h58_43" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After Deploy, verify whether the warehouse was actually deleted. As shown below, nothing is returned, confirming it has been dropped.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SHOW&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSES&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'DEV_WH'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpef82ww0ol7raj4qef5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpef82ww0ol7raj4qef5.png" alt="2026-03-21_06h59_32" width="800" height="392"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For the task, we can confirm that the configuration value has been updated and the &lt;code&gt;state&lt;/code&gt; remains &lt;code&gt;started&lt;/code&gt;. This is fantastic!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpb7dvqlji0xbiooe7qvd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpb7dvqlji0xbiooe7qvd.png" alt="2026-03-21_07h00_59" width="800" height="244"&gt;&lt;/a&gt;&lt;/p&gt;
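&lt;p&gt;If you prefer checking from SQL, something like the following works (the task name and schema here are illustrative placeholders, not the ones from this project):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Illustrative check; substitute your own task name and schema.
-- The "schedule" column should show the updated cron expression,
-- and "state" should still read "started".
SHOW TASKS LIKE 'MY_TASK' IN SCHEMA DEV_DB.PUBLIC;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;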

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I tried building DEV/PROD environments within a single Snowflake account using Snowflake DCM Projects.&lt;/p&gt;

&lt;p&gt;What stood out most from hands-on experience was how &lt;strong&gt;incredibly simple it is to separate DEV/PROD environments using template variables&lt;/strong&gt;. Simply defining &lt;code&gt;db_name&lt;/code&gt; and &lt;code&gt;wh_size&lt;/code&gt; in &lt;code&gt;manifest.yml&lt;/code&gt;'s &lt;code&gt;templating.configurations&lt;/code&gt; lets you deploy environments with different parameters from the same set of definition files. (If you're familiar with dbt, you'll find the onboarding very smooth.)&lt;/p&gt;
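&lt;p&gt;As a rough illustration, the &lt;code&gt;templating.configurations&lt;/code&gt; block might look like the sketch below. The configuration names and values are hypothetical; only the idea of defining per-environment values for &lt;code&gt;db_name&lt;/code&gt; and &lt;code&gt;wh_size&lt;/code&gt; is the point:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;# Hypothetical manifest.yml sketch; the keys under each configuration
# are the template variables referenced from the definition files.
templating:
  configurations:
    dev:
      db_name: DEV_DB
      wh_size: XSMALL
    prod:
      db_name: PROD_DB
      wh_size: MEDIUM
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;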

&lt;p&gt;Another great aspect is that &lt;strong&gt;the Plan → Deploy safety workflow is natively built in&lt;/strong&gt;. If you're accustomed to Terraform's &lt;code&gt;plan&lt;/code&gt; → &lt;code&gt;apply&lt;/code&gt;, you can use it with the same mindset. The integration of Dynamic Table &lt;code&gt;REFRESH ALL&lt;/code&gt; and data quality test &lt;code&gt;TEST ALL&lt;/code&gt; as DCM Project features is also very convenient for data pipeline management.&lt;/p&gt;

&lt;p&gt;On the other hand, it's important to note that some objects, such as File Formats, are not yet supported by DCM management, and as a Preview feature it comes with various limitations, including rename restrictions (as of March 20, 2026).&lt;/p&gt;

&lt;p&gt;That said, this is a feature I'm very excited to see evolve with future updates!!&lt;/p&gt;

</description>
      <category>snowflake</category>
      <category>iac</category>
    </item>
    <item>
      <title>Automatically Generating Omni Semantic Layer YAML from Tableau .twb Files Using Claude Code</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Mon, 09 Mar 2026 05:06:05 +0000</pubDate>
      <link>https://forem.com/sagara/automatically-generating-omni-semantic-layer-yaml-from-tableau-twb-files-using-claude-code-4548</link>
      <guid>https://forem.com/sagara/automatically-generating-omni-semantic-layer-yaml-from-tableau-twb-files-using-claude-code-4548</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This is an English translation of the original Japanese article published at: &lt;a href="https://dev.classmethod.jp/articles/try-twb-to-omni-yaml-with-claude-code/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/try-twb-to-omni-yaml-with-claude-code/&lt;/a&gt;&lt;/em&gt;&lt;br&gt;
Please note that some images contain Japanese text, as they are from the original article.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;When considering adopting a new BI tool like Omni, the biggest hurdle is usually &lt;strong&gt;"how to migrate from your existing BI tool."&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;However, with the recent advancement of AI technologies like Claude Code, migrating between code-based and text-based content is becoming surprisingly feasible.&lt;/p&gt;

&lt;p&gt;With that in mind, I tried using Claude Code to automatically generate Omni's Semantic Layer YAML code from Tableau's &lt;code&gt;.twb&lt;/code&gt; files, assuming a Tableau-to-Omni migration scenario. This article summarizes what I did and what I learned.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; This does not cover dashboard migration itself. The scope is limited to "generating Omni YAML code and creating Markdown instructions for how to build charts in Omni." If Omni dashboards become definable via code in the future, that could change things significantly.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;DWH:&lt;/strong&gt; Snowflake (AWS Tokyo region, Enterprise edition)

&lt;ul&gt;
&lt;li&gt;Starting from a &lt;code&gt;.twb&lt;/code&gt; file connected to Snowflake via Tableau, I created a Connection and base Model in Omni, then attempted the migration.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Claude Code:&lt;/strong&gt; 2.1.70 (Pro plan, using Opus 4.6)&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Python environment:&lt;/strong&gt; uv 0.10.9

&lt;ul&gt;
&lt;li&gt;Python is required because the Skill I created uses Python scripts.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Tableau Desktop:&lt;/strong&gt; 2025.3.3&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Local environment:&lt;/strong&gt; Ubuntu 24.04 LTS (via WSL2)&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  Preparation
&lt;/h2&gt;

&lt;p&gt;First, I set things up on the Omni, Tableau, and Claude Code sides.&lt;/p&gt;

&lt;h3&gt;
  
  
  Omni
&lt;/h3&gt;

&lt;p&gt;For Omni preparation, I created a Connection to the same Snowflake instance used in Tableau, created a base Model, and set up Git integration so that Claude Code could easily edit the Model files.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuqp09cpoq32t0wns128j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuqp09cpoq32t0wns128j.png" alt="Omni Connection setup" width="800" height="748"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fexkfpqkqktnga93okjj2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fexkfpqkqktnga93okjj2.png" alt="Omni Model setup" width="800" height="720"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsu7hqa05m2iuzgbb7uc3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsu7hqa05m2iuzgbb7uc3.png" alt="Omni Git integration" width="800" height="530"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Tableau
&lt;/h3&gt;

&lt;p&gt;I prepared a &lt;code&gt;.twb&lt;/code&gt; file with the following characteristics:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Relationships defined across 4 tables, with a data source filter restricting to the last 180 days&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxtknn2vp58lfmrna2nrh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxtknn2vp58lfmrna2nrh.png" alt="Tableau data source relationships" width="800" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Custom dimensions, measures, and parameters added (the Japanese-labeled dimensions are the custom ones)

&lt;ul&gt;
&lt;li&gt;Including LoD (Level of Detail) fields such as &lt;code&gt;地域別_平均フルフィル日数&lt;/code&gt; (Average Fulfillment Days by Region)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flz8gbeftexr1hsnehpf7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flz8gbeftexr1hsnehpf7.png" alt="Tableau custom fields" width="495" height="1176"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw3prgrk6bvmrzyrbshwc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw3prgrk6bvmrzyrbshwc.png" alt="Tableau parameters" width="800" height="307"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A dashboard created within the workbook&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc2snkkck89mi6qvs168u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc2snkkck89mi6qvs168u.png" alt="Tableau dashboard" width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Installing model-local-editor
&lt;/h3&gt;

&lt;p&gt;Since Omni is typically developed within its in-browser IDE, you need to install &lt;strong&gt;model-local-editor&lt;/strong&gt; to validate locally developed content on the Omni instance.&lt;/p&gt;

&lt;p&gt;This tool runs as a persistent process, and any files edited during that time are automatically synced to the Omni instance.&lt;/p&gt;

&lt;p&gt;I've written a separate blog post on how to use it, so please refer to it for details:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/sagara/omnis-model-local-editor-sync-your-locally-developed-semantic-layer-code-to-your-omni-instance-5034"&gt;https://dev.to/sagara/omnis-model-local-editor-sync-your-locally-developed-semantic-layer-code-to-your-omni-instance-5034&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Claude Code (Skill Implementation)
&lt;/h3&gt;

&lt;p&gt;I created a Skill to generate Omni YAML from Tableau &lt;code&gt;.twb&lt;/code&gt; files. The implementation approach was to have Claude Code's &lt;code&gt;skill-creator&lt;/code&gt; Skill generate a base, then iterate through trial and error to refine it.&lt;/p&gt;

&lt;p&gt;The Skill details are published in the following repository:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; This Skill does not guarantee perfect conversion of all .twb files to Omni's Semantic Layer. Please use it with that caveat in mind. Also, the Markdown descriptions within the linked Skill are written in Japanese.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://github.com/SSchneider22/tableau-twb-to-omni-semantic-layer" rel="noopener noreferrer"&gt;https://github.com/SSchneider22/tableau-twb-to-omni-semantic-layer&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Auto-Generating Omni YAML Files from .twb Files Using Claude Code
&lt;/h2&gt;

&lt;p&gt;I cloned the Git-integrated Omni repository to my local environment and created a branch. Then I created a &lt;code&gt;tableau&lt;/code&gt; folder.&lt;/p&gt;

&lt;p&gt;Inside it, I placed the target Tableau workbook file (&lt;code&gt;.twb&lt;/code&gt; file) for migration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F38nfphrzun4rctrqxv84.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F38nfphrzun4rctrqxv84.png" alt="Folder structure with .twb file" width="704" height="401"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, I started model-local-editor so that local edits would be reflected on the Omni instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffv7hc7uyynp82gndmowr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffv7hc7uyynp82gndmowr.png" alt="model-local-editor running" width="800" height="327"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then I launched Claude Code and ran &lt;code&gt;/skills&lt;/code&gt; to verify that the Skill created during preparation was properly loaded. If it displays as shown below, everything is fine.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnm7ciggvenmcjhyv0d50.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnm7ciggvenmcjhyv0d50.png" alt="Claude Code skills verification" width="759" height="236"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With that confirmed, I submitted the following prompt in Plan mode (adjust folder paths to match your setup):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Using the Skill "tableau-twb-to-omni-semantic-layer", generate all Omni migration files based on the .twb file in the develop/manufacturing_dataops_test/tableau folder.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Claude Code created a plan as shown below, and I then asked it to generate the code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk4zh16sippyxjcyg90ez.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk4zh16sippyxjcyg90ez.png" alt="Claude Code plan" width="800" height="647"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After code generation was complete, I reviewed the generated code. Here are some examples of what was produced:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Dimensions/measures/filters definitions added to existing View files&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffjwcld3r1ebxbkpunuyw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffjwcld3r1ebxbkpunuyw.png" alt="Generated View file - dimensions" width="800" height="523"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3lwed7amv5hklx9remtf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3lwed7amv5hklx9remtf.png" alt="Generated View file - measures" width="800" height="704"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Topics generated per data source&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj6ngfe038qlg3qcr4rlp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj6ngfe038qlg3qcr4rlp.png" alt="Generated Topic file" width="800" height="983"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Markdown files with step-by-step instructions for reproducing each Tableau worksheet/dashboard in Omni&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr2015fkep8wen34pvwtx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr2015fkep8wen34pvwtx.png" alt="Generated Markdown - overview" width="800" height="588"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6b59pu9po4sqgfekiahl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6b59pu9po4sqgfekiahl.png" alt="Generated Markdown - worksheet steps" width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flw5s1oz5hpes7ioxb7et.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flw5s1oz5hpes7ioxb7et.png" alt="Generated Markdown - dashboard steps" width="800" height="579"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After this, I performed Git operations to update the main branch with the latest content.&lt;/p&gt;

&lt;h2&gt;
  
  
  Verifying on Omni
&lt;/h2&gt;

&lt;p&gt;Since I was using model-local-editor, the code generated by Claude Code was already reflected in Omni on the specified branch.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fymbokh5qjnmyprehp3zb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fymbokh5qjnmyprehp3zb.png" alt="Omni IDE showing synced files" width="800" height="790"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From this state, I clicked Explore and followed the Markdown file created by Claude Code to see if I could reproduce the charts originally built in Tableau.&lt;/p&gt;

&lt;p&gt;When reproducing each Query as a chart, I confirmed that Tableau's data source filters were applied as Topic default filters, and that the parameter for switching measures and the LoD fields all worked as expected.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0kbgff6ntvbj7ebh2ddy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0kbgff6ntvbj7ebh2ddy.png" alt="Omni chart reproduction 1" width="800" height="702"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpes7092scj7fawp5e6i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpes7092scj7fawp5e6i.png" alt="Omni chart reproduction 2" width="800" height="702"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhrcnx8o232aazb3ab3hc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhrcnx8o232aazb3ab3hc.png" alt="Omni chart reproduction 3" width="800" height="702"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, I also created a dashboard. While the positions of filters and legends differ, it covers the same content and functionality as the Tableau dashboard.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwmu6jn3dvf6eymp2syxa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwmu6jn3dvf6eymp2syxa.png" alt="Omni dashboard" width="800" height="688"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Bonus: Examples of Issues Encountered During Testing
&lt;/h2&gt;

&lt;p&gt;This article highlights the successful results, but during testing I encountered various issues and had to repeatedly feed error logs back to Claude Code to refine the Skill. (I went through this code generation → Skill refinement loop at least 10 times...)&lt;/p&gt;

&lt;p&gt;Here are some examples of actual issues I encountered:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The generated Omni Semantic Layer code used non-existent parameters, causing error logs during model-local-editor sync

&lt;ul&gt;
&lt;li&gt;Example 1: Specifying non-existent values for &lt;code&gt;aggregation_type&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Example 2: Not creating LoD fields according to Omni's syntax&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Even when sync succeeded, errors/warnings appeared on the Omni UI

&lt;ul&gt;
&lt;li&gt;Example 1: Fields used in LoD calculations were not included in the Topic's field list&lt;/li&gt;
&lt;li&gt;Example 2: Filter fields already defined under &lt;code&gt;filters:&lt;/code&gt; were also duplicated under &lt;code&gt;dimensions:&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Example 3: A measure with &lt;code&gt;aggregate_type: count_distinct&lt;/code&gt; also had COUNT DISTINCT in its &lt;code&gt;sql:&lt;/code&gt; definition&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
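&lt;p&gt;To make the last of those UI warnings concrete, here is a hedged sketch of the pattern. The field name is hypothetical and the exact Omni YAML shape may differ; only the &lt;code&gt;aggregate_type&lt;/code&gt;/&lt;code&gt;sql&lt;/code&gt; interplay is the point:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;# Hypothetical Omni measure sketch.
# Problematic: aggregation applied twice, in both aggregate_type and sql:
#   customer_count:
#     aggregate_type: count_distinct
#     sql: COUNT(DISTINCT customer_id)
# Fixed: sql holds only the raw column; aggregate_type does the aggregating.
customer_count:
  aggregate_type: count_distinct
  sql: customer_id
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;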

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I tried using Claude Code to automatically generate Omni Semantic Layer YAML code from Tableau &lt;code&gt;.twb&lt;/code&gt; files.&lt;/p&gt;

&lt;p&gt;While I haven't achieved automatic dashboard migration, being able to automate the generation of Semantic Layer definitions, Topics, and even Markdown instructions for reproducing charts in Omni gave me a strong sense that a significant portion of migration work can be streamlined.&lt;/p&gt;

&lt;p&gt;In particular, the fact that complex Tableau definitions like data source filters, LoD fields, and parameters could be reproduced on the Omni side through iterative Skill improvements was a major win.&lt;/p&gt;

&lt;p&gt;That said, perfect code isn't generated on the first try — the Skill required many rounds of refinement. While it worked well for the &lt;code&gt;.twb&lt;/code&gt; file I tested, other &lt;code&gt;.twb&lt;/code&gt; files may require additional adjustments.&lt;/p&gt;

&lt;p&gt;Nevertheless, for BI tool migrations — which tend to be daunting undertakings — the advancement of AI technology is making "migration between code-based artifacts" an increasingly realistic option.&lt;/p&gt;

&lt;p&gt;I hope this article helps anyone considering a BI tool migration.&lt;/p&gt;

</description>
      <category>tableau</category>
      <category>omni</category>
      <category>claudecode</category>
      <category>bi</category>
    </item>
    <item>
      <title>Omni's model-local-editor: Sync Your Locally Developed Semantic Layer Code to Your Omni Instance</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Fri, 06 Mar 2026 09:34:24 +0000</pubDate>
      <link>https://forem.com/sagara/omnis-model-local-editor-sync-your-locally-developed-semantic-layer-code-to-your-omni-instance-5034</link>
      <guid>https://forem.com/sagara/omnis-model-local-editor-sync-your-locally-developed-semantic-layer-code-to-your-omni-instance-5034</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;This is an English translation of the original Japanese article below:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://dev.classmethod.jp/articles/try-model-local-editor/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/try-model-local-editor/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This is Sagara.&lt;/p&gt;

&lt;p&gt;In Omni, you can create Semantic Layer code through GUI operations on the instance itself. However, with the recent rise of AI coding agents like Claude Code, there are increasing cases where you'd want to develop Omni's Semantic Layer locally.&lt;/p&gt;

&lt;p&gt;I actually faced this exact situation. I developed Omni's Semantic Layer code locally, issued a pull request, and merged it into the main branch — only to find that the latest code wasn't reflected on the Omni instance. (I also tried using the API, but it didn't work as expected.)&lt;/p&gt;

&lt;p&gt;When I consulted with an Omni engineer about this, they told me about a package called &lt;strong&gt;"model-local-editor"&lt;/strong&gt;. By using this package, you can sync locally developed Omni Semantic Layer code to your Omni instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.npmjs.com/package/@omni-co/model-local-editor" rel="noopener noreferrer"&gt;https://www.npmjs.com/package/@omni-co/model-local-editor&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this article, I'll walk through my experience trying out this package.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;OS: Ubuntu 24.04 LTS (running on WSL2)&lt;/li&gt;
&lt;li&gt;Package manager: npm 11.11.0&lt;/li&gt;
&lt;li&gt;model-local-editor: 0.2.0&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I'm assuming that the Model you develop in Omni is connected to Git, and you've cloned that repository locally for development. (For Omni setup details, please refer to &lt;a href="https://dev.classmethod.jp/articles/omni-try-initial-setup/" rel="noopener noreferrer"&gt;this blog post&lt;/a&gt;.)&lt;/p&gt;

&lt;p&gt;Also, I'm assuming this repository manages not only Omni code but also dbt code and other resources.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fef8agokoq1ozua8iohip.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fef8agokoq1ozua8iohip.png" alt="2026-03-06_17h25_32" width="765" height="962"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Preparation
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Installing model-local-editor
&lt;/h3&gt;

&lt;p&gt;First, install model-local-editor:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-g&lt;/span&gt; @omni-co/model-local-editor
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Setting Environment Variables
&lt;/h3&gt;

&lt;p&gt;Set the following two environment variables. (If you want them to persist, consider adding them to &lt;code&gt;~/.bashrc&lt;/code&gt; or a &lt;code&gt;.env&lt;/code&gt; file.)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;OMNI_API_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"your-api-key-here"&lt;/span&gt;
&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;OMNI_BASE_URL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"https://my-org.omniapp.co"&lt;/span&gt;  &lt;span class="c"&gt;# or your Omni instance URL&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
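&lt;p&gt;To confirm both variables are visible to your current shell before running &lt;code&gt;omni-sync&lt;/code&gt;, a quick check like the following works (the values shown are placeholders, not real credentials):&lt;/p&gt;

```shell
# Placeholder values -- substitute your real API key and instance URL
export OMNI_API_KEY="your-api-key-here"
export OMNI_BASE_URL="https://my-org.omniapp.co"

# Count the OMNI_ variables visible to this shell (should print at least 2)
env | grep -c '^OMNI_'
```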



&lt;h2&gt;
  
  
  Trying It Out
&lt;/h2&gt;

&lt;p&gt;Now that preparation is complete, let's sync to the Omni instance using model-local-editor.&lt;/p&gt;

&lt;h3&gt;
  
  
  Recommended: Create a Local Git Branch with the Same Name as the Omni Branch
&lt;/h3&gt;

&lt;p&gt;Before using model-local-editor, I recommend creating a local branch with the same name as the branch you'll create on the Omni side.&lt;/p&gt;

&lt;p&gt;The reason is that model-local-editor only creates a branch on the Omni instance side, so the sync process will push to the specified branch regardless of which local branch you're on.&lt;/p&gt;

&lt;p&gt;For this walkthrough, I'll create and switch to a branch called &lt;code&gt;20260306-local-sync-test&lt;/code&gt; locally:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git switch &lt;span class="nt"&gt;-c&lt;/span&gt; 20260306-local-sync-test
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Initial Setup and Branch Creation
&lt;/h3&gt;

&lt;p&gt;Run the following command, specifying the Omni Model's ID, pointing the &lt;code&gt;--dir&lt;/code&gt; option at the folder that manages the target Model, and creating a new branch on Omni. (You can find the &lt;code&gt;model-id&lt;/code&gt; in the URL bar when you open the target Model.)&lt;/p&gt;

&lt;p&gt;&lt;em&gt;For other options, please refer to the &lt;a href="https://www.npmjs.com/package/@omni-co/model-local-editor" rel="noopener noreferrer"&gt;package documentation&lt;/a&gt;.&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;omni-sync init &amp;lt;model-id&amp;gt; &lt;span class="nt"&gt;--dir&lt;/span&gt; &amp;lt;folder-path-managing-the-target-Omni-Model&amp;gt; &lt;span class="nt"&gt;--branch&lt;/span&gt; &amp;lt;branch-name-to-create-on-Omni&amp;gt; &lt;span class="nt"&gt;--create-branch&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Running this command displays the following:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5296j91s7k6qnccpdnpi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5296j91s7k6qnccpdnpi.png" alt="2026-03-06_17h36_20" width="800" height="247"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Additionally, a &lt;code&gt;.omni-sync.json&lt;/code&gt; file is created in the directory specified by the &lt;code&gt;--dir&lt;/code&gt; option. This file tracks the change history of each file. (This file is used internally by model-local-editor, so users generally should not edit it directly.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgru05m4bj8k6fjsoixsc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgru05m4bj8k6fjsoixsc.png" alt="2026-03-06_17h38_52" width="800" height="650"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you open the target Model in Omni and view the branch list, you can see that the &lt;code&gt;20260306-local-sync-test&lt;/code&gt; branch (specified with &lt;code&gt;--branch 20260306-local-sync-test&lt;/code&gt;) has been created.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj7o03dfplifi6paeb2ar.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj7o03dfplifi6paeb2ar.png" alt="2026-03-06_17h41_10" width="800" height="549"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Syncing Local Edits to Omni
&lt;/h3&gt;

&lt;p&gt;Next, let's sync locally edited content to Omni.&lt;/p&gt;

&lt;p&gt;First, run the following command locally to start and keep the model-local-editor sync process running. &lt;strong&gt;Be careful to include the &lt;code&gt;--dir&lt;/code&gt; option — without it, Omni's folders and files will be automatically moved to the root of your repository.&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;omni-sync start &amp;lt;model-id&amp;gt; &lt;span class="nt"&gt;--dir&lt;/span&gt; &amp;lt;folder-path-managing-the-target-Omni-Model&amp;gt; &lt;span class="nt"&gt;--branch&lt;/span&gt; &amp;lt;branch-name-specified-in-omni-sync-init&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You'll see output like the following. Once &lt;code&gt;File watcher ready&lt;/code&gt; appears at the end, you're good to go. &lt;strong&gt;The key point is to keep this process running.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp8pb3ll92uu2nk8303xi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp8pb3ll92uu2nk8303xi.png" alt="2026-03-06_18h00_35" width="800" height="271"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the process running, edit and save any Omni file, and you'll see log output like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6a65ehso1ifvvpy3avel.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6a65ehso1ifvvpy3avel.png" alt="2026-03-06_18h03_04" width="800" height="635"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, when you switch to the branch on the Omni instance and check the target file, you'll see it has been synced! (Note: there were a few cases where syncing took some time, possibly due to browser caching.)&lt;/p&gt;

&lt;p&gt;This means you can develop locally and immediately verify your changes on the Omni instance.&lt;/p&gt;
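&lt;p&gt;As a concrete example of the kind of local edit that gets synced, adding a dimension to a view file might look like the sketch below. This assumes Omni's LookML-style view syntax; the column name and description are illustrative:&lt;/p&gt;

```yaml
# Sketch of an edit to an Omni view file (column and description are illustrative)
dimensions:
  updated_at_20260306:
    sql: '"UPDATED_AT"'
    description: Timestamp of the record's last update
```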

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fegpkd8wp9k8vnavl7e6t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fegpkd8wp9k8vnavl7e6t.png" alt="2026-03-06_18h04_53" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Bonus: I Recommend Creating Pull Requests from the Omni UI
&lt;/h3&gt;

&lt;p&gt;By now, you should have a good picture of how to use model-local-editor to develop Omni code locally and verify it on the Omni instance.&lt;/p&gt;

&lt;p&gt;However, when it comes to creating pull requests after you've finished development, &lt;strong&gt;I recommend opening them from the Omni instance UI&lt;/strong&gt;. The reason is that even if you create and merge a pull request from your local environment, the main branch code on the Omni instance won't be updated.&lt;/p&gt;

&lt;p&gt;Below is the flow of creating a pull request from the Omni UI. You can see that the dimension &lt;code&gt;updated_at_20260306&lt;/code&gt; that was added earlier gets merged into the main branch and is reflected in the Shared model on the Omni instance (the shared model that syncs with the main branch code).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fksb8r9627ziwosblj8ek.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fksb8r9627ziwosblj8ek.png" alt="2026-03-06_18h13_02" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpt7egn9yuedmbnx4ksj7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpt7egn9yuedmbnx4ksj7.png" alt="2026-03-06_18h13_41" width="800" height="848"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi7289udwn119gj9wwna5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi7289udwn119gj9wwna5.png" alt="2026-03-06_18h14_38" width="589" height="245"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fya8u61c39f3oxiqt7swf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fya8u61c39f3oxiqt7swf.png" alt="2026-03-06_18h15_41" width="800" height="294"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Bonus: If There Are Syntax Errors
&lt;/h3&gt;

&lt;p&gt;If the file you want to sync to Omni contains syntax errors, an error message is displayed as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2rs27gc5171k4akyxvhm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2rs27gc5171k4akyxvhm.png" alt="2026-03-07_07h24_17" width="800" height="522"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I tried out "model-local-editor," which lets you sync locally developed Omni code with your Omni instance.&lt;/p&gt;

&lt;p&gt;While Omni's built-in IDE is excellent, as someone who has recently fallen in love with Claude Code, I found this capability to develop locally and immediately verify on the Omni instance incredibly convenient!&lt;/p&gt;

&lt;p&gt;If you're using coding agents locally to develop Omni's Semantic Layer, I highly recommend giving this package a try.&lt;/p&gt;

</description>
      <category>omni</category>
    </item>
    <item>
      <title>Snowflake's New Feature: AI-Powered Data Quality Checks You Can Set Up Directly in Snowsight</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Thu, 26 Feb 2026 01:40:17 +0000</pubDate>
      <link>https://forem.com/sagara/snowflakes-new-feature-ai-powered-data-quality-checks-you-can-set-up-directly-in-snowsight-2mpm</link>
      <guid>https://forem.com/sagara/snowflakes-new-feature-ai-powered-data-quality-checks-you-can-set-up-directly-in-snowsight-2mpm</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;This is an English translation of the original Japanese article: &lt;a href="https://dev.classmethod.jp/articles/snowflake-setup-data-quality-checks-in-snowsight/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-setup-data-quality-checks-in-snowsight/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Snowflake has released a new preview feature that allows you to set up data quality checks directly from the Snowsight UI. This feature leverages Cortex AI to automatically suggest quality checks, and also supports manual quality check definitions — all through the Snowsight GUI.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.snowflake.com/en/release-notes/2026/other/2026-02-23-data-quality-monitoring-setup" rel="noopener noreferrer"&gt;Release Notes: Data Quality Monitoring Setup&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/data-quality-ui-setup" rel="noopener noreferrer"&gt;User Guide: Data Quality UI Setup&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I tried it out, so let me walk you through how it works.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Is Data Quality Check Setup in Snowsight?
&lt;/h2&gt;

&lt;p&gt;Previously, to perform data quality checks in Snowflake, you had to define and configure Data Metric Functions (DMFs) using SQL. With this new feature, the following capabilities are now available directly from the Snowsight GUI:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cortex Data Quality (AI-suggested):&lt;/strong&gt; Cortex AI analyzes metadata characteristics and data usage patterns to automatically suggest quality check definitions (DMFs). Once you accept the suggestions, it periodically detects data quality issues.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Manual definition:&lt;/strong&gt; You can manually define quality check types and criteria based on your own knowledge of the data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Execution schedule adjustment:&lt;/strong&gt; You can configure how often quality checks run: on a fixed time interval, on a cron schedule, or triggered by DML changes to the table.&lt;/li&gt;
&lt;/ul&gt;
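&lt;p&gt;For context, these GUI operations build on Snowflake's existing SQL interface for DMFs. A rough sketch of the equivalent statements (the table, column, and schedule values are illustrative):&lt;/p&gt;

```sql
-- Schedule metric evaluation on the table, then attach a built-in metric
-- (object names and schedule are illustrative)
ALTER TABLE ORDERS SET DATA_METRIC_SCHEDULE = '60 MINUTE';
ALTER TABLE ORDERS
  ADD DATA METRIC FUNCTION SNOWFLAKE.CORE.NULL_COUNT ON (CUSTOMER_NAME);
```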

&lt;p&gt;Cortex Data Quality leverages Snowflake Cortex's &lt;code&gt;AI_COMPLETE&lt;/code&gt; function, and all data and metadata remain securely within Snowflake. It also fully respects Snowflake's access control — suggestions are made based only on data the user has access to.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Quoting from the &lt;a href="https://docs.snowflake.com/en/user-guide/data-quality-ui-setup" rel="noopener noreferrer"&gt;official documentation&lt;/a&gt;, you need to be aware of the following regarding editions, roles, and permissions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Snowflake Edition:&lt;/strong&gt; Enterprise or higher

&lt;ul&gt;
&lt;li&gt;For this walkthrough, I used a Snowflake trial account with Enterprise edition on the AWS Tokyo region.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Required privileges for the operating role:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;OWNERSHIP&lt;/code&gt; privilege on the target table&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;EXECUTE DATA METRIC FUNCTION&lt;/code&gt; privilege on the account&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;SNOWFLAKE.DATA_METRIC_USER&lt;/code&gt; database role&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;SNOWFLAKE.CORTEX_USER&lt;/code&gt; database role&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;LLM Models:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;code&gt;mistral-7b&lt;/code&gt; and &lt;code&gt;llama3.1-8b&lt;/code&gt; models must be allowed in the &lt;code&gt;CORTEX_MODELS_ALLOWLIST&lt;/code&gt; account parameter (they are allowed by default).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
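&lt;p&gt;If the role you plan to use is missing any of these, the grants can be applied along the following lines (run with ACCOUNTADMIN; the target role is illustrative):&lt;/p&gt;

```sql
-- Grant the account-level privilege and the two database roles to the working role
USE ROLE ACCOUNTADMIN;
GRANT EXECUTE DATA METRIC FUNCTION ON ACCOUNT TO ROLE SYSADMIN;
GRANT DATABASE ROLE SNOWFLAKE.DATA_METRIC_USER TO ROLE SYSADMIN;
GRANT DATABASE ROLE SNOWFLAKE.CORTEX_USER TO ROLE SYSADMIN;

-- Optionally confirm the model allowlist (all models are allowed by default)
SHOW PARAMETERS LIKE 'CORTEX_MODELS_ALLOWLIST' IN ACCOUNT;
```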

&lt;h2&gt;
  
  
  Preparing Test Data and Setting Permissions
&lt;/h2&gt;

&lt;p&gt;Run the following SQL to create the objects and sample data for testing.&lt;/p&gt;

&lt;p&gt;This data intentionally includes the following quality issues:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CUSTOMER_NAME is NULL (ORDER_ID: 5, 13)&lt;/li&gt;
&lt;li&gt;EMAIL is NULL (ORDER_ID: 8, 13)&lt;/li&gt;
&lt;li&gt;EMAIL has an invalid format (ORDER_ID: 22 → invalid-email)&lt;/li&gt;
&lt;li&gt;QUANTITY is negative (ORDER_ID: 17 → -1)&lt;/li&gt;
&lt;li&gt;QUANTITY is zero (ORDER_ID: 19 → 0)&lt;/li&gt;
&lt;li&gt;TOTAL_AMOUNT is negative (ORDER_ID: 17 → -25.00)&lt;/li&gt;
&lt;li&gt;STATUS contains an unexpected value (ORDER_ID: 25 → INVALID_STATUS)
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Create database and schema for testing&lt;/span&gt;
&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;SYSADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="n"&gt;IF&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;EXISTS&lt;/span&gt; &lt;span class="n"&gt;DATA_QUALITY_DEMO&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="n"&gt;IF&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;EXISTS&lt;/span&gt; &lt;span class="n"&gt;DATA_QUALITY_DEMO&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;PUBLIC&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Create a warehouse for testing (you can use an existing one if available)&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="n"&gt;IF&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;EXISTS&lt;/span&gt; &lt;span class="n"&gt;DATA_QUALITY_WH&lt;/span&gt;
  &lt;span class="n"&gt;WAREHOUSE_SIZE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'XSMALL'&lt;/span&gt;
  &lt;span class="n"&gt;AUTO_SUSPEND&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt;
  &lt;span class="n"&gt;AUTO_RESUME&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;TRUE&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="n"&gt;DATA_QUALITY_DEMO&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;SCHEMA&lt;/span&gt; &lt;span class="k"&gt;PUBLIC&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="n"&gt;DATA_QUALITY_WH&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Create a test table (simulating e-commerce order data)&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;ORDERS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;ORDER_ID&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;CUSTOMER_NAME&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;EMAIL&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;ORDER_DATE&lt;/span&gt; &lt;span class="nb"&gt;DATE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;PRODUCT_NAME&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;QUANTITY&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;UNIT_PRICE&lt;/span&gt; &lt;span class="nb"&gt;DECIMAL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;TOTAL_AMOUNT&lt;/span&gt; &lt;span class="nb"&gt;DECIMAL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;STATUS&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;SHIPPING_COUNTRY&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;CREATED_AT&lt;/span&gt; &lt;span class="n"&gt;TIMESTAMP_NTZ&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;CURRENT_TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;-- Insert sample data (20+ records, intentionally including data quality issues)&lt;/span&gt;
&lt;span class="k"&gt;INSERT&lt;/span&gt; &lt;span class="k"&gt;INTO&lt;/span&gt; &lt;span class="n"&gt;ORDERS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ORDER_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;CUSTOMER_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;EMAIL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ORDER_DATE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;PRODUCT_NAME&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;QUANTITY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;UNIT_PRICE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;TOTAL_AMOUNT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;STATUS&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;SHIPPING_COUNTRY&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;VALUES&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Taro Yamada'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'taro.yamada@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-05'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Wireless Mouse'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;25&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Hanako Suzuki'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'hanako.suzuki@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-06'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Mechanical Keyboard'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;89&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;89&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Delivered'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'John Smith'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'john.smith@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-07'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'USB-C Hub'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;35&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;105&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'USA'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Emily Johnson'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'emily.j@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-08'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Monitor Stand'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;45&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;45&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Processing'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'USA'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'unknown@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-09'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Webcam HD'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;59&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;59&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Canada'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Kenji Tanaka'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'kenji.tanaka@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-10'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Laptop Sleeve'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;29&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;59&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;98&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Delivered'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Maria Garcia'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'maria.garcia@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-11'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Wireless Earbuds'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;79&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;79&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Spain'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;8&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Li Wei'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-12'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Phone Case'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;12&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Delivered'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'China'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;9&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Sakura Ito'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'sakura.ito@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-13'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Desk Lamp'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;42&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;42&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Cancelled'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'David Brown'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'david.brown@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-14'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Portable Charger'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;30&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'UK'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;11&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Yuki Watanabe'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'yuki.w@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-15'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Bluetooth Speaker'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;55&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;55&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Processing'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;12&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Anna Mueller'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'anna.mueller@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-16'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Tablet Stand'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;25&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;25&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Delivered'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Germany'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;13&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-17'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'HDMI Cable'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;8&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;89&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;90&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'France'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;14&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Ryo Nakamura'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'ryo.nakamura@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-18'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Mouse Pad XL'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;19&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;19&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Delivered'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Sophie Martin'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'sophie.martin@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-19'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'USB Flash Drive'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;45&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'France'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;16&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Takeshi Kobayashi'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'takeshi.k@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-20'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Webcam HD'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;59&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;59&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Delivered'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;17&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Chen Mei'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'chen.mei@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-21'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Wireless Mouse'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;25&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;25&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Returned'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'China'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;18&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'James Wilson'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'james.wilson@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-22'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Mechanical Keyboard'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;89&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;89&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Processing'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Australia'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;19&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Aoi Sato'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'aoi.sato@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-23'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Monitor Stand'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;45&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Cancelled'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Lucas Dubois'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'lucas.dubois@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-24'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Laptop Sleeve'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;29&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;99&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;59&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;98&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'France'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;21&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Mika Yoshida'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'mika.yoshida@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-25'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Desk Lamp'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;42&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;42&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Delivered'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;22&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Robert Taylor'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'invalid-email'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-26'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Portable Charger'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;30&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;30&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'USA'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;23&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Haruto Kimura'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'haruto.k@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-27'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Bluetooth Speaker'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;55&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;110&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Delivered'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;24&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Isabella Rossi'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'isabella.rossi@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-28'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Phone Case'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;12&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;36&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Shipped'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Italy'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;25&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Yuto Hayashi'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'yuto.hayashi@example.com'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'2025-01-29'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'USB-C Hub'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;35&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;35&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'INVALID_STATUS'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'Japan'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, grant the necessary privileges to the role you'll be using. In this case, I'm granting permissions to SYSADMIN.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;USE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;ACCOUNTADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Grant EXECUTE DATA METRIC FUNCTION privilege&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;EXECUTE&lt;/span&gt; &lt;span class="k"&gt;DATA&lt;/span&gt; &lt;span class="n"&gt;METRIC&lt;/span&gt; &lt;span class="k"&gt;FUNCTION&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="n"&gt;ACCOUNT&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;SYSADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Grant database roles&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;SNOWFLAKE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DATA_METRIC_USER&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;SYSADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;GRANT&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;SNOWFLAKE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CORTEX_USER&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="k"&gt;ROLE&lt;/span&gt; &lt;span class="n"&gt;SYSADMIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Setting Up Data Quality Checks Manually
&lt;/h2&gt;

&lt;p&gt;Let's set up a data quality check manually against the data we just created.&lt;/p&gt;

&lt;p&gt;Open the target table from Snowsight's Database Explorer and click the &lt;code&gt;Data Quality&lt;/code&gt; tab.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr3779gj7zgo4w6nm366n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr3779gj7zgo4w6nm366n.png" alt="2026-02-26_08h35_00" width="800" height="227"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The following screen appears. Click &lt;code&gt;Setup manually&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy6phn5a3awlg70tjgy6e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy6phn5a3awlg70tjgy6e.png" alt="2026-02-26_08h36_10" width="800" height="659"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A list of available data quality check candidates is displayed. Click the one you want to configure. (In this case, I'll click &lt;code&gt;Nulls&lt;/code&gt;.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5epv1txaq4laf8cn32gk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5epv1txaq4laf8cn32gk.png" alt="2026-02-26_08h36_38" width="800" height="521"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A configuration screen appears in which you define the Nulls check by filling in the dropdown lists embedded in a natural-language sentence.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpoznz7p0f547juliok8w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpoznz7p0f547juliok8w.png" alt="2026-02-26_08h39_18" width="800" height="365"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I configured it as shown below. Clicking &lt;code&gt;Edit SQL&lt;/code&gt; lets you see the DMF definition query that will be generated. If everything looks good, click &lt;code&gt;Save&lt;/code&gt; in the bottom right.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fubn8t0lc1729qhisg0ms.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fubn8t0lc1729qhisg0ms.png" alt="2026-02-26_08h41_22" width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpgd6q6mx1edmomcm3edd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpgd6q6mx1edmomcm3edd.png" alt="2026-02-26_08h41_45" width="800" height="364"&gt;&lt;/a&gt;&lt;/p&gt;
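&lt;p&gt;For reference, this wizard is a front end for Snowflake's system data metric functions. As a rough sketch only (the table and column names &lt;code&gt;ORDERS&lt;/code&gt; and &lt;code&gt;CUSTOMER_EMAIL&lt;/code&gt; are assumptions; adjust them to your environment), the equivalent SQL association looks like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Attach the system NULL_COUNT DMF to the email column
ALTER TABLE orders
  ADD DATA METRIC FUNCTION SNOWFLAKE.CORE.NULL_COUNT
  ON (customer_email);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;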

&lt;p&gt;Looking at the &lt;code&gt;Data Quality&lt;/code&gt; Monitoring view, you can see that the quality check has been added under the Accuracy section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl15la3q4lyqm3r60n4mn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl15la3q4lyqm3r60n4mn.png" alt="2026-02-26_08h44_15" width="800" height="626"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Also, clicking &lt;code&gt;Settings&lt;/code&gt; in the &lt;code&gt;Monitoring&lt;/code&gt; section allows you to change the frequency of data quality checks. (The default was set to run every hour.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4sedscmvfzh0iahk4f4u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4sedscmvfzh0iahk4f4u.png" alt="2026-02-26_09h02_21" width="800" height="586"&gt;&lt;/a&gt;&lt;/p&gt;
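&lt;p&gt;The schedule chosen here corresponds to the table's &lt;code&gt;DATA_METRIC_SCHEDULE&lt;/code&gt; parameter, which you can also set in SQL. A sketch, assuming a table named &lt;code&gt;ORDERS&lt;/code&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Run all DMFs attached to the table every 60 minutes
ALTER TABLE orders SET DATA_METRIC_SCHEDULE = '60 MINUTE';

-- Or run them whenever the table's data changes
ALTER TABLE orders SET DATA_METRIC_SCHEDULE = 'TRIGGER_ON_CHANGES';
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;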

&lt;h2&gt;
  
  
  Setting Up AI-Powered Data Quality Checks (Cortex Data Quality)
&lt;/h2&gt;

&lt;p&gt;Next, let's set up AI-powered data quality checks. (This feature is officially called &lt;code&gt;Cortex Data Quality&lt;/code&gt;.)&lt;/p&gt;

&lt;p&gt;On the &lt;code&gt;Data Quality&lt;/code&gt; tab of the target data, click &lt;code&gt;+ Add quality check&lt;/code&gt;, then select &lt;code&gt;Generate with Cortex Data Quality&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F34ol3upnxs7v43byb4ps.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F34ol3upnxs7v43byb4ps.png" alt="2026-02-26_09h37_02" width="800" height="374"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The following screen appears, where AI scans the data content and generates the necessary data quality checks.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F17bhqrxtw6v6qn3kekit.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F17bhqrxtw6v6qn3kekit.png" alt="2026-02-26_09h37_32" width="800" height="798"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For my data, after about 10 seconds, 10 data quality checks were generated as shown below.&lt;/p&gt;

&lt;p&gt;It's great that the &lt;code&gt;WHY IS THIS RECOMMENDED?&lt;/code&gt; column on the far right explains why each data quality check was suggested.&lt;/p&gt;

&lt;p&gt;Check the boxes for the quality checks you need, then click &lt;code&gt;Apply&lt;/code&gt; in the bottom right.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0b77adykjsk4bf4wdf5o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0b77adykjsk4bf4wdf5o.png" alt="2026-02-26_09h43_29" width="800" height="659"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By the way, after clicking &lt;code&gt;Apply&lt;/code&gt;, the following message appeared for my test data. I believe the permissions were sufficient, but I wasn't able to determine the exact cause...&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvzxqz9olw7yx21i4z78p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvzxqz9olw7yx21i4z78p.png" alt="2026-02-26_09h45_27" width="800" height="1056"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After this, looking at the Data Quality Monitoring screen, you can see that the data quality checks have been added as shown below. (The one check showing an error is the manually configured check from earlier, which had already run once after its one-hour schedule elapsed.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjlyjrvqhoop9uvtvagyo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjlyjrvqhoop9uvtvagyo.png" alt="2026-02-26_09h48_48" width="800" height="706"&gt;&lt;/a&gt;&lt;/p&gt;
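&lt;p&gt;Under the hood, checks like these are custom data metric functions. Purely as an illustration (not the exact definition Cortex Data Quality generates), an email-format check for a hypothetical &lt;code&gt;ORDERS&lt;/code&gt; table could be written as follows:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- Count rows whose non-NULL email does not match a basic email pattern
CREATE OR REPLACE DATA METRIC FUNCTION invalid_email_count(
  t TABLE(customer_email VARCHAR)
)
RETURNS NUMBER
AS
$$
  SELECT COUNT(*)
  FROM t
  WHERE customer_email IS NOT NULL
    AND NOT REGEXP_LIKE(customer_email, '^[^@]+@[^@]+\\.[^@]+$')
$$;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;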

&lt;h2&gt;
  
  
  Running the Configured Data Quality Checks
&lt;/h2&gt;

&lt;p&gt;Finally, let's run the data quality checks we've configured.&lt;/p&gt;

&lt;p&gt;From &lt;code&gt;Settings&lt;/code&gt;, change the frequency to &lt;code&gt;Every 5 minutes&lt;/code&gt; and wait about 5 minutes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb2cepi9wn5y6ic3rqs7j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb2cepi9wn5y6ic3rqs7j.png" alt="2026-02-26_09h50_19" width="800" height="595"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After about 5 minutes, the &lt;code&gt;Data Quality&lt;/code&gt; Monitoring screen displayed the results as shown below. Since you can also review the history of past quality check results, this is more than adequate as a quality monitoring feature. (If I had one wish, it would be notifications when a data quality check fails; at least as of now, that doesn't seem to be configurable from Snowsight alone.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd1o8adhiq2gxeudny33z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd1o8adhiq2gxeudny33z.png" alt="2026-02-26_10h06_52" width="800" height="682"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3s4uv9baorwea5lcqtpj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3s4uv9baorwea5lcqtpj.png" alt="2026-02-26_10h07_12" width="800" height="767"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr9dx6g6yi6wa4qof2w8g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr9dx6g6yi6wa4qof2w8g.png" alt="2026-02-26_10h08_37" width="800" height="665"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;I tried out the new feature that lets you set up data quality checks using AI directly from Snowsight, and summarized my findings in this article.&lt;/p&gt;

&lt;p&gt;I had previously &lt;a href="https://dev.classmethod.jp/articles/snowsight-data-quality-tab/" rel="noopener noreferrer"&gt;tested the Data Quality tab when it was first introduced&lt;/a&gt;, and at that time I felt that defining DMFs via SQL queries was time-consuming and a bottleneck. With this new feature, however, being able to generate DMF definitions easily by relying on AI is truly impressive!&lt;/p&gt;

&lt;p&gt;This is a feature you can try right away, so I highly encourage you to give it a shot.&lt;/p&gt;

</description>
      <category>snowflake</category>
      <category>dataengineering</category>
    </item>
    <item>
      <title>Building a dbt Incremental Model for Parsing and Chunking PDFs for Snowflake Cortex Search Service</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Tue, 10 Feb 2026 23:11:12 +0000</pubDate>
      <link>https://forem.com/sagara/building-a-dbt-incremental-model-for-parsing-and-chunking-pdfs-for-snowflake-cortex-search-service-3eik</link>
      <guid>https://forem.com/sagara/building-a-dbt-incremental-model-for-parsing-and-chunking-pdfs-for-snowflake-cortex-search-service-3eik</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;This is an English translation of the following article:&lt;br&gt;
&lt;a href="https://dev.classmethod.jp/articles/snowflake-dbt-incremental-model-for-cortex-search-service/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-dbt-incremental-model-for-cortex-search-service/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;p&gt;To build RAG using Cortex Search Service, you need to prepare a table where documents such as PDFs have been parsed using functions like &lt;code&gt;AI_PARSE_DOCUMENT&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;I explored how to build this table using dbt, and in this article I'll share what I came up with.&lt;/p&gt;

&lt;h2&gt;The dbt Model Implementation&lt;/h2&gt;

&lt;p&gt;Let's jump right in — here is the dbt model I implemented. It parses and chunks the documents listed in the directory table of an external stage pointing to an S3 bucket that contains &lt;a href="https://www.snowflake.com/ja/customers/all-customers/" rel="noopener noreferrer"&gt;Snowflake customer case study PDFs&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The key points of this dbt model are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Configured as an Incremental model with &lt;code&gt;unique_key='relative_path'&lt;/code&gt; + &lt;code&gt;incremental_strategy='delete+insert'&lt;/code&gt;. Data with the same file path is deleted and re-inserted.

&lt;ul&gt;
&lt;li&gt;If a record with the same path and the same or newer modification timestamp already exists in the target table, it is skipped. This means only unprocessed or updated files are processed, reducing the cost of &lt;code&gt;AI_PARSE_DOCUMENT&lt;/code&gt; parsing.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Chunking uses the &lt;code&gt;SNOWFLAKE.CORTEX.SPLIT_TEXT_MARKDOWN_HEADER&lt;/code&gt; function.

&lt;ul&gt;
&lt;li&gt;I adopted this approach after reading &lt;a href="https://zenn.dev/snowflakejp/articles/028b164c7a1a7b" rel="noopener noreferrer"&gt;a blog post by Takada-san from Snowflake&lt;/a&gt;.
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
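&lt;p&gt;The &lt;code&gt;not exists&lt;/code&gt; filter in the incremental branch can be illustrated outside dbt. Here is a minimal Python sketch (the function name and row shapes are my own, purely for illustration) of which staged files survive the filter and get re-parsed:&lt;/p&gt;

```python
from datetime import datetime

def files_to_process(directory_rows, target_rows):
    """Mimic the model's incremental filter: keep a staged file only if the
    target table has no row with the same relative_path whose last_modified
    is equal to or newer than the staged file's."""
    latest_in_target = {}
    for row in target_rows:
        path, ts = row["relative_path"], row["last_modified"]
        if path not in latest_in_target or ts > latest_in_target[path]:
            latest_in_target[path] = ts
    return [
        row for row in directory_rows
        if row["relative_path"] not in latest_in_target
        or latest_in_target[row["relative_path"]] < row["last_modified"]
    ]

directory = [  # roughly what DIRECTORY(@stage) would return
    {"relative_path": "snowflake-case-studies/a.pdf", "last_modified": datetime(2026, 2, 1)},
    {"relative_path": "snowflake-case-studies/b.pdf", "last_modified": datetime(2026, 2, 10)},
]
target = [  # rows already present in {{ this }}
    {"relative_path": "snowflake-case-studies/a.pdf", "last_modified": datetime(2026, 2, 1)},
]
# Only b.pdf is unprocessed, so only it incurs AI_PARSE_DOCUMENT cost.
print([r["relative_path"] for r in files_to_process(directory, target)])
```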

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="p"&gt;{{&lt;/span&gt;
    &lt;span class="n"&gt;config&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;materialized&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'incremental'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;unique_key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'relative_path'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;incremental_strategy&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'delete+insert'&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}}&lt;/span&gt;

&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;directory_files&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="n"&gt;relative_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;last_modified&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;TO_FILE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'@RAW_DB.SAMPLE_SOURCE_SNOWPIPE.STAGE_CORPUS_SAMPLE'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;relative_path&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;doc_file&lt;/span&gt;
    &lt;span class="k"&gt;from&lt;/span&gt;
        &lt;span class="n"&gt;DIRECTORY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="n"&gt;RAW_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SAMPLE_SOURCE_SNOWPIPE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;STAGE_CORPUS_SAMPLE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;d&lt;/span&gt;
    &lt;span class="k"&gt;where&lt;/span&gt;
        &lt;span class="n"&gt;relative_path&lt;/span&gt; &lt;span class="k"&gt;like&lt;/span&gt; &lt;span class="s1"&gt;'snowflake-case-studies/%.pdf'&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="n"&gt;if&lt;/span&gt; &lt;span class="n"&gt;is_incremental&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="k"&gt;not&lt;/span&gt; &lt;span class="k"&gt;exists&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
            &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="n"&gt;this&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;t&lt;/span&gt;
            &lt;span class="k"&gt;where&lt;/span&gt; &lt;span class="n"&gt;t&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;relative_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;d&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;relative_path&lt;/span&gt;
              &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;t&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;last_modified&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;d&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;last_modified&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="n"&gt;endif&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="p"&gt;),&lt;/span&gt;

&lt;span class="n"&gt;parsed&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="n"&gt;relative_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;last_modified&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;SNOWFLAKE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CORTEX&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AI_PARSE_DOCUMENT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;doc_file&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="s1"&gt;'mode'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'LAYOUT'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'page_split'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;TRUE&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;parsed_result&lt;/span&gt;
    &lt;span class="k"&gt;from&lt;/span&gt;
        &lt;span class="n"&gt;directory_files&lt;/span&gt;

&lt;span class="p"&gt;),&lt;/span&gt;

&lt;span class="n"&gt;concatenated&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;relative_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;last_modified&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;LISTAGG&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="n"&gt;within&lt;/span&gt; &lt;span class="k"&gt;group&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;order&lt;/span&gt; &lt;span class="k"&gt;by&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;index&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;full_content&lt;/span&gt;
    &lt;span class="k"&gt;from&lt;/span&gt;
        &lt;span class="n"&gt;parsed&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;lateral&lt;/span&gt; &lt;span class="n"&gt;flatten&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;input&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;parsed_result&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;pages&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;
    &lt;span class="k"&gt;group&lt;/span&gt; &lt;span class="k"&gt;by&lt;/span&gt;
        &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;relative_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;last_modified&lt;/span&gt;

&lt;span class="p"&gt;),&lt;/span&gt;

&lt;span class="n"&gt;chunked&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="k"&gt;c&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;relative_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;c&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;last_modified&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;header_1&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nb"&gt;varchar&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;header_1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;header_2&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nb"&gt;varchar&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;header_2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;header_3&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nb"&gt;varchar&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;header_3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nb"&gt;varchar&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;index&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;chunk_index&lt;/span&gt;
    &lt;span class="k"&gt;from&lt;/span&gt;
        &lt;span class="n"&gt;concatenated&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="k"&gt;c&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;lateral&lt;/span&gt; &lt;span class="n"&gt;flatten&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;input&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt;
            &lt;span class="n"&gt;SNOWFLAKE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CORTEX&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SPLIT_TEXT_MARKDOWN_HEADER&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                &lt;span class="k"&gt;c&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;full_content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="n"&gt;OBJECT_CONSTRUCT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'#'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'header_1'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'##'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'header_2'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'###'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'header_3'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
                &lt;span class="mi"&gt;10000&lt;/span&gt;
            &lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;ch&lt;/span&gt;

&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;select&lt;/span&gt;
    &lt;span class="n"&gt;SHA2&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;relative_path&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="s1"&gt;':'&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;chunk_index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;256&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;doc_chunk_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;relative_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;SPLIT_PART&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;relative_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'/'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;file_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;REPLACE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;REPLACE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;SPLIT_PART&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;relative_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'/'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="s1"&gt;'Case_Study_'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="s1"&gt;'.pdf'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;case_study_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;header_1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;header_2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;header_3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;chunk_index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;last_modified&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;CURRENT_TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;_parsed_at&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt;
    &lt;span class="n"&gt;chunked&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
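&lt;p&gt;One note on the final select: &lt;code&gt;doc_chunk_id&lt;/code&gt; is just SHA-256 over &lt;code&gt;relative_path || ':' || chunk_index&lt;/code&gt;, so the same ID can be reproduced outside Snowflake. A small Python sketch (assuming the index concatenates as a plain decimal string, matching how Snowflake casts the number to varchar):&lt;/p&gt;

```python
import hashlib

def doc_chunk_id(relative_path: str, chunk_index: int) -> str:
    # Equivalent of SHA2(relative_path || ':' || chunk_index, 256) in the model;
    # Snowflake's SHA2 returns the digest as lowercase hex.
    return hashlib.sha256(f"{relative_path}:{chunk_index}".encode("utf-8")).hexdigest()

chunk_id = doc_chunk_id("snowflake-case-studies/Case_Study_Bourbon.pdf", 0)
print(len(chunk_id))  # 64 hex characters
```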



&lt;p&gt;Additionally, in &lt;code&gt;dbt_project.yml&lt;/code&gt;, I configured &lt;code&gt;pre_hook&lt;/code&gt; and &lt;code&gt;post_hook&lt;/code&gt; as shown below. The assumption is that a &lt;code&gt;corpus&lt;/code&gt; folder is created within the &lt;code&gt;models&lt;/code&gt; folder, and models for parsing and chunking for Cortex Search Service are defined inside it.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;pre_hook&lt;/code&gt;: Refreshes the external stage pointing to the S3 bucket containing the PDFs, so the directory table queried by the dbt model is updated before the model is built.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;post_hook&lt;/code&gt;: Since tables for Cortex Search Service require &lt;code&gt;CHANGE_TRACKING&lt;/code&gt; to be enabled, a query to enable it is executed after the model is built.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;sample_dbt_project'&lt;/span&gt;
&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;1.9.0'&lt;/span&gt;
&lt;span class="na"&gt;config-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;2&lt;/span&gt;

&lt;span class="na"&gt;profile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;sample_project'&lt;/span&gt;

&lt;span class="na"&gt;models&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;sample_dbt_project&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;corpus&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;sample&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;+schema&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;corpus_sample&lt;/span&gt;
        &lt;span class="na"&gt;+pre-hook&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ALTER&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;STAGE&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;RAW_DB.SAMPLE_SOURCE_SNOWPIPE.STAGE_CORPUS_SAMPLE&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;REFRESH"&lt;/span&gt;
        &lt;span class="na"&gt;+post-hook&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ALTER&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;TABLE&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;this&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;SET&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;CHANGE_TRACKING&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;=&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;TRUE"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;First dbt Model Run &amp;amp; Cortex Search Service Setup&lt;/h2&gt;

&lt;p&gt;In S3, I initially placed only two case study PDFs — one for Bourbon and one for Aisan Takaoka. (I'll omit the stage creation queries here.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frd6qiclb0ltl1irwa94q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frd6qiclb0ltl1irwa94q.png" alt="2026-02-11_06h59_06"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Running the dbt model above with &lt;code&gt;dbt build&lt;/code&gt; creates a table in Snowflake with the parsed and chunked case study PDFs, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1gvnds4obsifuswd8zy6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1gvnds4obsifuswd8zy6.png" alt="2026-02-11_07h03_28"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I then created a Cortex Search Service and a Cortex Agent for this dbt model using the following queries.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Create Cortex Search Service&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="n"&gt;CORTEX&lt;/span&gt; &lt;span class="k"&gt;SEARCH&lt;/span&gt; &lt;span class="n"&gt;SERVICE&lt;/span&gt; &lt;span class="n"&gt;PROD_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AIML_SAMPLE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CSS_SNOWFLAKE_CASE_STUDIES&lt;/span&gt;
    &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="n"&gt;chunk&lt;/span&gt;
    &lt;span class="n"&gt;ATTRIBUTES&lt;/span&gt; &lt;span class="n"&gt;file_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;case_study_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;header_1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;header_2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;header_3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;chunk_index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;last_modified&lt;/span&gt;
    &lt;span class="n"&gt;WAREHOUSE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;CORTEX_SEARCH_WH&lt;/span&gt;
    &lt;span class="n"&gt;TARGET_LAG&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'365 day'&lt;/span&gt;
    &lt;span class="n"&gt;EMBEDDING_MODEL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'voyage-multilingual-2'&lt;/span&gt;
    &lt;span class="k"&gt;INITIALIZE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ON_CREATE&lt;/span&gt;
    &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Cortex Search Service for chunked Snowflake customer case study PDFs'&lt;/span&gt;
&lt;span class="k"&gt;AS&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt;
    &lt;span class="n"&gt;file_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;case_study_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;header_1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;header_2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;header_3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;chunk_index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;last_modified&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;PROD_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CORPUS_SAMPLE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;COR_SNOWFLAKE_CASE_STUDIES_PARSED&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Create Agent&lt;/span&gt;
&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="n"&gt;AGENT&lt;/span&gt; &lt;span class="n"&gt;PROD_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AIML_SAMPLE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AGENT_SNOWFLAKE_CASE_STUDIES&lt;/span&gt;
    &lt;span class="k"&gt;COMMENT&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'Cortex Agent for searching and answering questions about Snowflake customer case study PDFs'&lt;/span&gt;
    &lt;span class="n"&gt;PROFILE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'{"display_name": "Snowflake Case Study Search", "avatar": "search", "color": "#29B5E8"}'&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;SPECIFICATION&lt;/span&gt;
    &lt;span class="err"&gt;$$&lt;/span&gt;
    &lt;span class="n"&gt;instructions&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="k"&gt;system&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;
        &lt;span class="n"&gt;You&lt;/span&gt; &lt;span class="k"&gt;are&lt;/span&gt; &lt;span class="n"&gt;an&lt;/span&gt; &lt;span class="n"&gt;assistant&lt;/span&gt; &lt;span class="n"&gt;that&lt;/span&gt; &lt;span class="n"&gt;answers&lt;/span&gt; &lt;span class="n"&gt;questions&lt;/span&gt; &lt;span class="n"&gt;about&lt;/span&gt; &lt;span class="n"&gt;the&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="n"&gt;Snowflake&lt;/span&gt; &lt;span class="n"&gt;customer&lt;/span&gt; &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="n"&gt;study&lt;/span&gt; &lt;span class="n"&gt;PDFs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;Use&lt;/span&gt; &lt;span class="n"&gt;the&lt;/span&gt; &lt;span class="n"&gt;Cortex&lt;/span&gt; &lt;span class="k"&gt;Search&lt;/span&gt; &lt;span class="n"&gt;Service&lt;/span&gt; &lt;span class="k"&gt;to&lt;/span&gt; &lt;span class="k"&gt;search&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;relevant&lt;/span&gt; &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="n"&gt;study&lt;/span&gt; &lt;span class="n"&gt;information&lt;/span&gt;
        &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;provide&lt;/span&gt; &lt;span class="n"&gt;answers&lt;/span&gt; &lt;span class="n"&gt;based&lt;/span&gt; &lt;span class="k"&gt;on&lt;/span&gt; &lt;span class="n"&gt;accurate&lt;/span&gt; &lt;span class="n"&gt;information&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;If&lt;/span&gt; &lt;span class="n"&gt;the&lt;/span&gt; &lt;span class="n"&gt;information&lt;/span&gt; &lt;span class="k"&gt;is&lt;/span&gt; &lt;span class="k"&gt;not&lt;/span&gt; &lt;span class="k"&gt;found&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;the&lt;/span&gt; &lt;span class="k"&gt;search&lt;/span&gt; &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="k"&gt;not&lt;/span&gt; &lt;span class="n"&gt;speculate&lt;/span&gt; &lt;span class="err"&gt;—&lt;/span&gt; &lt;span class="n"&gt;inform&lt;/span&gt; &lt;span class="n"&gt;the&lt;/span&gt; &lt;span class="k"&gt;user&lt;/span&gt; &lt;span class="n"&gt;accordingly&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;
      &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt;
        &lt;span class="n"&gt;Please&lt;/span&gt; &lt;span class="n"&gt;respond&lt;/span&gt; &lt;span class="n"&gt;concisely&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;
        &lt;span class="n"&gt;Include&lt;/span&gt; &lt;span class="n"&gt;the&lt;/span&gt; &lt;span class="n"&gt;referenced&lt;/span&gt; &lt;span class="k"&gt;case&lt;/span&gt; &lt;span class="n"&gt;study&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;case_study_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;your&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;

    &lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;tool_spec&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
          &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"cortex_search"&lt;/span&gt;
          &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"SearchCaseStudies"&lt;/span&gt;
          &lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"Search Snowflake customer case study PDFs"&lt;/span&gt;

    &lt;span class="n"&gt;tool_resources&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
      &lt;span class="n"&gt;SearchCaseStudies&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"PROD_DB.AIML_SAMPLE.CSS_SNOWFLAKE_CASE_STUDIES"&lt;/span&gt;
        &lt;span class="n"&gt;max_results&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nv"&gt;"5"&lt;/span&gt;
    &lt;span class="err"&gt;$$&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;-- Add to Snowflake Intelligence&lt;/span&gt;
&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;SNOWFLAKE&lt;/span&gt; &lt;span class="n"&gt;INTELLIGENCE&lt;/span&gt; &lt;span class="n"&gt;SNOWFLAKE_INTELLIGENCE_OBJECT_DEFAULT&lt;/span&gt;
    &lt;span class="k"&gt;ADD&lt;/span&gt; &lt;span class="n"&gt;AGENT&lt;/span&gt; &lt;span class="n"&gt;PROD_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AIML_SAMPLE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AGENT_SNOWFLAKE_CASE_STUDIES&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After querying through Snowflake Intelligence, I was able to get results as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpnz2f15vjm7l6eta3xa4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpnz2f15vjm7l6eta3xa4.png" alt="2026-02-11_07h10_29"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Updating S3 Data and Running the dbt Model a Second Time
&lt;/h2&gt;

&lt;p&gt;To verify that incremental updates work correctly, I made the following two changes to the S3 bucket:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Deleted the second page of the Aisan Takaoka case study PDF and re-uploaded it to S3 under the same file name&lt;/li&gt;
&lt;li&gt;Added a new Chiba Bank case study PDF to the S3 bucket&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnabvh2gn37ykwpl7mo7b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnabvh2gn37ykwpl7mo7b.png" alt="2026-02-11_07h16_02"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After running &lt;code&gt;dbt build&lt;/code&gt; again, you can see that the chunk count for Aisan Takaoka decreased (due to the removed page), and new records for Chiba Bank were added.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshzr76wk2j0w9548ifux.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshzr76wk2j0w9548ifux.png" alt="2026-02-11_07h36_34"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Looking at the &lt;code&gt;_PARSED_AT&lt;/code&gt; column, which records the parsing timestamp, you can confirm that only the records for Aisan Takaoka and Chiba Bank — the files that had changes — had their timestamps updated.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo2wkaptq4d8lknestk7y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo2wkaptq4d8lknestk7y.png" alt="2026-02-11_07h47_39"&gt;&lt;/a&gt;&lt;/p&gt;
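&lt;p&gt;This change detection comes from dbt's incremental materialization. Here is a minimal sketch of the pattern; the model, source, and column names are illustrative, not the exact code from this walkthrough:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;{{ config(
    materialized='incremental',
    incremental_strategy='delete+insert',
    unique_key='relative_path'
) }}

-- parsing/chunking is applied only to the rows this query selects;
-- delete+insert keyed on the file path replaces every chunk of a changed
-- file, which is why the chunk count can shrink when a page is removed
select
    relative_path,
    last_modified,
    current_timestamp() as _parsed_at
from {{ source('s3_stage', 'case_study_pdfs') }}

{% if is_incremental() %}
-- on incremental runs, pick up only files modified since the last build
where last_modified &amp;gt; (select max(last_modified) from {{ this }})
{% endif %}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;With this shape, only new or modified files pass the &lt;code&gt;is_incremental()&lt;/code&gt; filter, which is why only the Aisan Takaoka and Chiba Bank records were reparsed.&lt;/p&gt;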

&lt;p&gt;With the table updated, I refreshed the Cortex Search Service by running the following query:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;CORTEX&lt;/span&gt; &lt;span class="k"&gt;SEARCH&lt;/span&gt; &lt;span class="n"&gt;SERVICE&lt;/span&gt; &lt;span class="n"&gt;PROD_DB&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AIML_SAMPLE&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CSS_SNOWFLAKE_CASE_STUDIES&lt;/span&gt; &lt;span class="n"&gt;REFRESH&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Querying through Snowflake Intelligence again confirmed that the updated data was being used for responses.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhhztnhg5lsdd0vmqk2f7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhhztnhg5lsdd0vmqk2f7.png" alt="2026-02-11_07h41_02"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this article, I walked through building a dbt incremental model that parses PDFs and chunks them for a Cortex Search Service.&lt;/p&gt;

&lt;p&gt;By using dbt's Incremental model, only files with changes are detected and processed by the &lt;code&gt;AI_PARSE_DOCUMENT&lt;/code&gt; function, which helps reduce costs.&lt;/p&gt;

&lt;p&gt;I hope you find this useful!&lt;/p&gt;

</description>
      <category>snowflake</category>
      <category>dbt</category>
    </item>
    <item>
      <title>Git-Driven dbt + BI Development in Omni: Dynamic Schemas for Safe Dev/Prod Switching</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Sat, 01 Nov 2025 02:22:55 +0000</pubDate>
      <link>https://forem.com/sagara/git-driven-dbt-bi-development-in-omni-dynamic-schemas-for-safe-devprod-switching-35h1</link>
      <guid>https://forem.com/sagara/git-driven-dbt-bi-development-in-omni-dynamic-schemas-for-safe-devprod-switching-35h1</guid>
      <description>&lt;p&gt;Note: This is the English translation of the original Japanese article:&lt;a href="https://dev.classmethod.jp/articles/omni-dynamic-schema-with-dbt/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/omni-dynamic-schema-with-dbt/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Omni offers several features that work great with dbt, and one standout is Dynamic Schemas, which lets you dynamically switch data references between your dbt production and development environments.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.omni.co/docs/integrations/dbt#getting-started-with-dynamic-schemas-and-dbt-environments" rel="noopener noreferrer"&gt;https://docs.omni.co/docs/integrations/dbt#getting-started-with-dynamic-schemas-and-dbt-environments&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I tried this feature, so I’ll summarize what I did in this post.&lt;/p&gt;

&lt;h2&gt;
  
  
  Preparation
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;p&gt;First, this walkthrough assumes the following are already in place.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A Connection to Snowflake has been created

&lt;ul&gt;
&lt;li&gt;Within the Connection, the &lt;a href="https://dev.classmethod.jp/articles/omni-dbt-integration-setup-202510/" rel="noopener noreferrer"&gt;dbt integration setup&lt;/a&gt; is complete&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqjgy6697vowvg3r8ommz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqjgy6697vowvg3r8ommz.png" alt="2025-10-17_15h45_58" width="800" height="628"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffu0yfzpuscb3yhe4fe79.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffu0yfzpuscb3yhe4fe79.png" alt="2025-10-20_11h30_32" width="800" height="764"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You have created one Model for the Connection, integrated it with GitHub, and required pull requests for changes to Shared Models (reference: &lt;a href="https://docs.omni.co/docs/integrations/git/setup" rel="noopener noreferrer"&gt;official docs&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy8mfz1ko5nupv2brp1y1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy8mfz1ko5nupv2brp1y1.png" alt="2025-10-17_15h47_24" width="800" height="501"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Create a new Environment for the dbt development environment
&lt;/h3&gt;

&lt;p&gt;Next, create a new Environment that corresponds to your dbt development environment.&lt;/p&gt;

&lt;p&gt;From the dbt tab of the target Connection, go to the Environments section at the bottom and click &lt;code&gt;Add Environment&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3q8g3pyhuzpwpco94z4o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3q8g3pyhuzpwpco94z4o.png" alt="2025-11-01_04h51_17" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmtt8yf44ue3zg06swgt0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmtt8yf44ue3zg06swgt0.png" alt="2025-11-01_04h52_21" width="800" height="239"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Configure the Environment as shown below. The key points are to set &lt;code&gt;Default Schema&lt;/code&gt; to the schema you use for development, and set &lt;code&gt;Target Name&lt;/code&gt; to the dbt target name you use for your development environment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1dosrhhx3t4gc1849f09.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1dosrhhx3t4gc1849f09.png" alt="2025-11-01_04h54_20" width="800" height="791"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Save&lt;/code&gt;. You’ll see a list of schemas used for the dbt development environment as shown below. If everything looks good, simply close the dialog with the X in the upper right.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvl2gvsxehlnbxy2n5frz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvl2gvsxehlnbxy2n5frz.png" alt="2025-11-01_04h58_12" width="800" height="699"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Enable Virtual Schemas
&lt;/h3&gt;

&lt;p&gt;Next, from the dbt tab of the target Connection, turn on &lt;code&gt;Enable Virtual Schemas&lt;/code&gt; and click &lt;code&gt;Save&lt;/code&gt;. Enabling this creates a virtual schema named &lt;code&gt;omni_dbt&lt;/code&gt;. When referencing each view, you can use the form &lt;code&gt;omni_dbt__&amp;lt;view name&amp;gt;&lt;/code&gt;. When you switch the dbt environment using the Dynamic Schema feature, this virtual schema abstracts away the environment so that references dynamically switch between production and development schemas.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8rj27yickyc477wyzl5f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8rj27yickyc477wyzl5f.png" alt="2025-11-01_05h10_21" width="800" height="1050"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once Virtual Schemas are enabled, a &lt;code&gt;Migration&lt;/code&gt; panel should appear on the right. Before enabling Virtual Schemas, the existing Omni Semantic Layer code referenced a fixed schema for a particular environment. Running this migration automatically adds code that references the &lt;code&gt;omni_dbt&lt;/code&gt; schema.&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Run migration&lt;/code&gt; to perform the migration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0fedd59j17jcg2azq5ck.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0fedd59j17jcg2azq5ck.png" alt="2025-11-01_05h20_30" width="800" height="648"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You’ll see the screen below. Check &lt;code&gt;Overwrite&lt;/code&gt; in the lower left and click &lt;code&gt;Run migration&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frydshdxsdzb7aafgfemc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frydshdxsdzb7aafgfemc.png" alt="2025-11-01_05h24_52" width="800" height="459"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After this, if you look at the Shared Model code, you’ll see a &lt;code&gt;VIRTUAL SCHEMAS&lt;/code&gt; section has been added, and each view file has been updated to reference the &lt;code&gt;omni_dbt&lt;/code&gt; schema.&lt;/p&gt;

&lt;p&gt;(Judging by the generated VIRTUAL SCHEMAS names, they appear to follow the pattern &lt;code&gt;omni_dbt_&amp;lt;custom schema name of the dbt production environment&amp;gt;&lt;/code&gt;.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuychdvedjesfc2hm0nlx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuychdvedjesfc2hm0nlx.png" alt="2025-11-01_05h27_08" width="800" height="804"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Update existing content to reference the Virtual Schema views
&lt;/h3&gt;

&lt;p&gt;If you’ve already built content in Omni, you’ll need to update existing references to point to the Virtual Schema views.&lt;/p&gt;

&lt;p&gt;Create a new branch and perform the following steps. Once done, open a pull request and merge it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuuoi5lz7mkbu0qh5mkdr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuuoi5lz7mkbu0qh5mkdr.png" alt="2025-11-01_05h35_34" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Update the view references in topics
&lt;/h4&gt;

&lt;p&gt;FYI, you can use &lt;code&gt;Ctrl + F&lt;/code&gt; to search/replace text within each file.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Before&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3dhv5xsjmdxwybzg8mgk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3dhv5xsjmdxwybzg8mgk.png" alt="2025-11-01_05h39_49" width="800" height="835"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;After&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsl96dvzl419l7mohui8m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsl96dvzl419l7mohui8m.png" alt="2025-11-01_05h41_35" width="800" height="831"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Update the view references in each Workbook query
&lt;/h4&gt;

&lt;p&gt;This is necessary if your Workbook queries reference raw views rather than topics.&lt;/p&gt;

&lt;p&gt;In the Content Validator, click &lt;code&gt;Show all documents&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqto56clwkbqfzesinanc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqto56clwkbqfzesinanc.png" alt="2025-11-01_05h51_52" width="800" height="363"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click Replace, confirm you’re on the &lt;code&gt;View&lt;/code&gt; tab, then configure a replacement from the raw table view names to the Virtual Schema view names as shown below, and click &lt;code&gt;Replace&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiulysrhic6iak76h8i9v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiulysrhic6iak76h8i9v.png" alt="2025-11-01_05h53_41" width="800" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fya89nl6ladwn1fchcu7a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fya89nl6ladwn1fchcu7a.png" alt="2025-11-01_05h56_44" width="800" height="663"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After this, while still on the branch, review each affected Workbook query. Verify that charts render correctly and that clicking &lt;code&gt;Go to definition&lt;/code&gt; on any field links to the view definition within the Virtual Schema.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F40s3ix3mqtm14559d2ny.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F40s3ix3mqtm14559d2ny.png" alt="2025-11-01_05h59_44" width="800" height="824"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsy66lkbxkpo93lui8sla.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsy66lkbxkpo93lui8sla.png" alt="2025-11-01_06h01_03" width="800" height="829"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Trying development with Dynamic Schemas pointing to the dbt development environment
&lt;/h2&gt;

&lt;p&gt;With the setup done, let’s see how development works when Dynamic Schemas point to dbt’s development environment.&lt;/p&gt;

&lt;h3&gt;
  
  
  Task
&lt;/h3&gt;

&lt;p&gt;One of the dbt Models used by Omni has a column named &lt;code&gt;ordered_at&lt;/code&gt;; we’ll rename it to &lt;code&gt;purchased_at&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwm8p2nrj1qbvk8mvkg80.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwm8p2nrj1qbvk8mvkg80.png" alt="2025-11-01_07h31_57" width="800" height="526"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Create a branch in dbt and develop
&lt;/h3&gt;

&lt;p&gt;Create a branch in dbt and make the following changes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fezthryecq48g0z3xsodt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fezthryecq48g0z3xsodt.png" alt="2025-11-01_07h34_29" width="800" height="747"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn46hreq17cpfemwnbsx4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn46hreq17cpfemwnbsx4.png" alt="2025-11-01_07h34_50" width="800" height="625"&gt;&lt;/a&gt;&lt;/p&gt;
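&lt;p&gt;In the model SQL, the change amounts to a simple rename. A sketch, with the surrounding model and column names assumed for illustration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- before
select
    order_id,
    customer_id,
    ordered_at
from {{ ref('stg_orders') }}

-- after: expose the column under its new name
select
    order_id,
    customer_id,
    ordered_at as purchased_at
from {{ ref('stg_orders') }}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;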

&lt;p&gt;Then run &lt;code&gt;dbt build&lt;/code&gt; in the dbt Cloud IDE; the model with the updated column name is built into the dbt development schema in Snowflake.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiw5w8ac05ttlmbw81ivj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiw5w8ac05ttlmbw81ivj.png" alt="2025-11-01_07h53_28" width="800" height="523"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Create a branch in Omni and make fixes while referencing the dbt dev environment via Dynamic Schemas
&lt;/h3&gt;

&lt;p&gt;Next, in Omni, create a branch on the affected Shared Model. (For reference, dbt and Omni are connected to separate GitHub repositories.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fefliseljhtlg5dwx95p5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fefliseljhtlg5dwx95p5.png" alt="2025-11-01_07h55_31" width="800" height="329"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;First, run &lt;code&gt;Refresh Schema&lt;/code&gt; to reflect the latest dbt schema structure into Omni’s Schema Model.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fevdi7esq5xcg0806v5ci.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fevdi7esq5xcg0806v5ci.png" alt="2025-11-01_10h19_50" width="800" height="537"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, click the dbt icon next to the branch name at the top and switch to the dbt development Environment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu3pbcfxjw8uq4zkmqwhw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu3pbcfxjw8uq4zkmqwhw.png" alt="2025-11-01_07h56_35" width="800" height="308"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the Environment switched to dbt’s dev environment, open the target view file and you’ll see &lt;code&gt;ordered_at&lt;/code&gt; is gone and &lt;code&gt;purchased_at&lt;/code&gt; is present.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fups0nccdjjguj94g5en6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fups0nccdjjguj94g5en6.png" alt="2025-11-01_10h23_11" width="800" height="829"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Also, when viewing dashboards while on the branch pointing at the dbt dev Environment, any dashboard that referenced the old &lt;code&gt;ordered_at&lt;/code&gt; column now errors. Since we’re working on a branch, published dashboards still reference the dbt production Environment and do not error.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On a branch referencing the dbt dev environment: errors occur&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxpvytqbbpbk942mygpf2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxpvytqbbpbk942mygpf2.png" alt="2025-11-01_07h59_50" width="800" height="579"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When you exit the branch (published dashboards): no errors&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrwp5kn86q89clu3qmeu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrwp5kn86q89clu3qmeu.png" alt="2025-11-01_08h00_25" width="800" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Switch back to the branch, open the Shared Model IDE, and launch the Content Validator. As shown below, errors are detected due to mismatched field references.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flcnrqm1mzsip32sousn0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flcnrqm1mzsip32sousn0.png" alt="2025-11-01_08h03_25" width="800" height="649"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Replace&lt;/code&gt; and run a field-level replacement, swapping the old &lt;code&gt;ordered_at&lt;/code&gt; field references for &lt;code&gt;purchased_at&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5jea4pdcku24am26t9dt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5jea4pdcku24am26t9dt.png" alt="2025-11-01_08h05_52" width="800" height="661"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you display the dashboard again, the field references have been updated and the charts render correctly, even while still working on the branch. That completes the development and fixes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmba9jan8aigziltoqmo4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmba9jan8aigziltoqmo4.png" alt="2025-11-01_08h07_19" width="800" height="579"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Open a PR to main in dbt, merge, and run dbt build against production
&lt;/h3&gt;

&lt;p&gt;Commit on the dbt side, open a pull request, merge into main, and run dbt build against the dbt production environment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F65td1nl6nc5rusono63q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F65td1nl6nc5rusono63q.png" alt="2025-11-01_08h10_46" width="800" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi0p0kwxy3pyeyer40a2i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi0p0kwxy3pyeyer40a2i.png" alt="2025-11-01_08h11_20" width="800" height="621"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgi3nljchydnychjvkysi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgi3nljchydnychjvkysi.png" alt="2025-11-01_08h14_30" width="800" height="552"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now the &lt;code&gt;purchased_at&lt;/code&gt; column has been added to the dbt production schema in Snowflake.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc5posgmqv2hrd1rqcr0e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc5posgmqv2hrd1rqcr0e.png" alt="2025-11-01_08h16_08" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Open a PR to main in Omni and merge
&lt;/h3&gt;

&lt;p&gt;Finally, open a pull request to main in Omni and merge it.&lt;/p&gt;

&lt;p&gt;First, switch the Environment in Omni back to the dbt production environment. If you don’t do this before opening the pull request, all dashboards using this Shared Model will end up referencing the development environment, so be careful. (If you try to open a pull request while the Environment is set to development, Omni will show a warning.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3sw8rsrijjfw7671ju9v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3sw8rsrijjfw7671ju9v.png" alt="2025-11-01_09h33_45" width="800" height="334"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, run &lt;code&gt;Sync dbt metadata&lt;/code&gt;. (Without this step, the view file definitions for the production Environment remained outdated in my case.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi9mx0bojb6mmh0a7s9vv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi9mx0bojb6mmh0a7s9vv.png" alt="2025-11-01_09h35_55" width="800" height="472"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, when you look at the target view file, &lt;code&gt;ordered_at&lt;/code&gt; is gone and &lt;code&gt;purchased_at&lt;/code&gt; is present.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2wlh12awtjl6ymyhrdel.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2wlh12awtjl6ymyhrdel.png" alt="2025-11-01_10h37_36" width="800" height="838"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Also, when viewing the target dashboard with the Environment set to production, charts display correctly. (If the chart looks different from the first screenshot, that’s due to the dbt logic used—please ignore it.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc2hhbztau9u94vobq4nm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc2hhbztau9u94vobq4nm.png" alt="2025-11-01_10h43_39" width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With everything confirmed, open a pull request while the Environment is set to production. (If no code changes exist, you won’t be able to open a pull request, so add something minimal like a space to a view file’s description.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc8o5zw9v0a5irbg7flcx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc8o5zw9v0a5irbg7flcx.png" alt="2025-11-01_10h53_28" width="800" height="297"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa50uhc01ougwi5zqvfc8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa50uhc01ougwi5zqvfc8.png" alt="2025-11-01_10h47_12" width="800" height="326"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After merging the pull request, switch out of the branch and check the target dashboard. It renders without errors. This completes the end-to-end flow for changing a field definition in a dbt Model already referenced by Omni.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F81xpnsd2uhltswnf349z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F81xpnsd2uhltswnf349z.png" alt="2025-11-01_10h56_07" width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Thoughts
&lt;/h2&gt;

&lt;p&gt;I tried Omni’s “Dynamic Schemas,” which let you update Semantic Layer definitions and fix dashboards while referencing data from dbt’s development environment. When I first read the docs I thought “this could be big,” and indeed it’s a very exciting feature. Being able to do Git-driven development that accounts for not just dbt but also Omni dashboards is excellent.&lt;/p&gt;

&lt;p&gt;On the other hand, a few things stood out at this stage. If these improve, I think we’ll get a truly top-tier dbt + BI development experience!&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You can’t inject user attributes into the dbt Environment defined in Omni. In other words, if you want to use personal dev schemas like dbt_ssagara, you need to add one Environment per person. (To avoid that, you’d have to insert an intermediate branch such as develop or staging before main and define the Omni Environment against that branch.)

&lt;ul&gt;
&lt;li&gt;Omni already has &lt;a href="https://docs.omni.co/docs/connections/manage/dynamic-database-environments" rel="noopener noreferrer"&gt;Dynamic Database Environments based on user attributes&lt;/a&gt;, so I’m hopeful for an update here!&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;The dashboard reference definitions surfaced by the Content Validator are not captured as code, so, as in this post, I had to add a half-width space to a view file’s description just to produce a diff for the final Omni pull request/merge.&lt;/li&gt;

&lt;li&gt;After running dbt build against production, you still need to open and merge a pull request on the Omni side to update the dbt production schema and apply the Content Validator fixes. While dbt-side field changes are in flight, published Omni dashboards may not reference the correct fields, causing temporary downtime.

&lt;ul&gt;
&lt;li&gt;There may be a smoother approach for this…&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

</description>
      <category>omni</category>
      <category>dbt</category>
    </item>
    <item>
      <title>Personal Picks: Data Product News (October 1, 2025)</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Wed, 01 Oct 2025 02:35:06 +0000</pubDate>
      <link>https://forem.com/sagara/personal-picks-data-product-news-october-1-2025-2p0a</link>
      <guid>https://forem.com/sagara/personal-picks-data-product-news-october-1-2025-2p0a</guid>
      <description>&lt;h1&gt;
  
  
  Modern Data Stack Information Summary - October 2025
&lt;/h1&gt;

&lt;p&gt;&lt;em&gt;This article is an English translation of the original Japanese version: &lt;a href="https://dev.classmethod.jp/articles/modern-data-stack-info-summary-20251001/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/modern-data-stack-info-summary-20251001/&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Hello, this is Sagara.&lt;/p&gt;

&lt;p&gt;As a consultant specializing in Modern Data Stack, I observe that the Modern Data Stack ecosystem generates a tremendous amount of information daily.&lt;/p&gt;

&lt;p&gt;Among this wealth of information, this article summarizes the Modern Data Stack-related updates that caught my attention over the past two weeks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Disclaimer:&lt;/strong&gt; This is not an exhaustive list of all the latest updates for the mentioned products. The information included here is based on &lt;strong&gt;my personal judgment and interests&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Modern Data Stack General
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Launch of "Open Semantic Interchange (OSI)"
&lt;/h3&gt;

&lt;p&gt;Snowflake, Salesforce, dbt Labs, and other companies have announced the launch of an open-source initiative called "Open Semantic Interchange (OSI)" to promote data utilization for AI.&lt;/p&gt;

&lt;p&gt;This initiative aims to build a common semantic data framework by standardizing, through a vendor-neutral open specification, the Semantic Layer definitions that are currently fragmented across products.&lt;/p&gt;

&lt;p&gt;The following vendors are listed as Launch Partners:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1tvx13hwbdgnusipyq2d.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1tvx13hwbdgnusipyq2d.webp" alt="press-release-open-semantic-interchange-1200x500-blackrock" width="800" height="343"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Below are the press releases from Snowflake and Salesforce regarding this announcement:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.snowflake.com/en/news/press-releases/snowflake-salesforce-dbt-labs-and-more-revolutionize-data-readiness-for-ai-with-open-semantic-interchange-initiative/" rel="noopener noreferrer"&gt;https://www.snowflake.com/en/news/press-releases/snowflake-salesforce-dbt-labs-and-more-revolutionize-data-readiness-for-ai-with-open-semantic-interchange-initiative/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.salesforce.com/blog/agentic-future-demands-open-semantic-layer/" rel="noopener noreferrer"&gt;https://www.salesforce.com/blog/agentic-future-demands-open-semantic-layer/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While other participating products have also published blogs about this announcement, I found Select Star's post particularly interesting. As shown in the figure quoted from their blog, if this can be realized, Select Star could act as a hub to coordinate Semantic Layer definitions with BI tools not participating in the Open Semantic Interchange initiative, which I find exciting.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.selectstar.com/resources/snowflake-ai-ready-semantic-model" rel="noopener noreferrer"&gt;https://www.selectstar.com/resources/snowflake-ai-ready-semantic-model&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbi1ejh2xqwzd0des00er.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbi1ejh2xqwzd0des00er.png" alt="2025-10-01_10h36_14" width="800" height="692"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  "Everyone's Strongest Data Platform Architecture Vol. 5 - All-Star Special!!" Event Held
&lt;/h3&gt;

&lt;p&gt;On September 25th, "Everyone's Strongest Data Platform Architecture Vol. 5 - All-Star Special!!" was held.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://datatech-jp.connpass.com/event/360596/" rel="noopener noreferrer"&gt;https://datatech-jp.connpass.com/event/360596/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The event had over 100 in-person attendees and more than 500 online participants. You can get a sense of the event's excitement by checking the hashtag "みん強" (Min-Kyō) at the following link:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://x.com/hashtag/%E3%81%BF%E3%82%93%E5%BC%B7?src=hashtag_click" rel="noopener noreferrer"&gt;https://x.com/hashtag/%E3%81%BF%E3%82%93%E5%BC%B7?src=hashtag_click&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Below are links to the presentation materials from each speaker that I was able to find:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://speakerdeck.com/kaz3284/minqiang-di-5hui-kubellnodetaji-pan-kai-fa-nozui-xin-zhuang-kuang-toainohuo-yong-noshi-jian-nituite" rel="noopener noreferrer"&gt;https://speakerdeck.com/kaz3284/minqiang-di-5hui-kubellnodetaji-pan-kai-fa-nozui-xin-zhuang-kuang-toainohuo-yong-noshi-jian-nituite&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://speakerdeck.com/tenajima/data-vaultwoyong-itamarutipurodakutonotamenodetaji-pan-kai-fa" rel="noopener noreferrer"&gt;https://speakerdeck.com/tenajima/data-vaultwoyong-itamarutipurodakutonotamenodetaji-pan-kai-fa&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://speakerdeck.com/pei0804/revops-practice-learned" rel="noopener noreferrer"&gt;https://speakerdeck.com/pei0804/revops-practice-learned&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://speakerdeck.com/foursue/20250924-lt2ben-yaru" rel="noopener noreferrer"&gt;https://speakerdeck.com/foursue/20250924-lt2ben-yaru&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://speakerdeck.com/genshun9/minqiang-nokoremadetokorekara" rel="noopener noreferrer"&gt;https://speakerdeck.com/genshun9/minqiang-nokoremadetokorekara&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Extract/Load
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Airbyte
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Airbyte 2.0 Released
&lt;/h4&gt;

&lt;p&gt;Airbyte has released version 2.0, a major version upgrade. (The OSS version has not yet moved to 2.0.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://airbyte.com/v2" rel="noopener noreferrer"&gt;https://airbyte.com/v2&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://airbyte.com/blog/airbyte-2-0" rel="noopener noreferrer"&gt;https://airbyte.com/blog/airbyte-2-0&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As quoted from the links above, the following features have been released:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Enterprise Flex&lt;/strong&gt;: An architecture that separates the control plane and data plane, providing a hybrid model where management is done in the cloud while actual data remains within the customer's infrastructure&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data Activation&lt;/strong&gt;: A feature that directly syncs insights from data warehouses to business applications like Salesforce and HubSpot. This allows the Reverse ETL process to be completed within the platform&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Speed&lt;/strong&gt;: Connector architecture has been redesigned to improve data sync speed by 4-10x. For example, MySQL to S3 sync is 4.7x faster, and Postgres to S3 is 12x faster&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;New Pricing Plans&lt;/strong&gt;: A new plan structure tailored to team growth stages. The "Capacity Based Pricing" introduced for Pro plans and above is particularly notable, as it's based on required parallel processing capacity (Data Workers) rather than data transfer volume

&lt;ul&gt;
&lt;li&gt;Core (formerly OSS): Free open-source version&lt;/li&gt;
&lt;li&gt;Standard (formerly Cloud): Pay-as-you-go managed service&lt;/li&gt;
&lt;li&gt;Pro (formerly Teams): Capacity-based pricing with governance features like RBAC and SSO&lt;/li&gt;
&lt;li&gt;Enterprise Flex: All Pro features plus the ability to deploy data planes anywhere—cloud, multi-cloud, or on-premises&lt;/li&gt;
&lt;li&gt;Self-Managed Enterprise: Fully self-managed enterprise version for organizations with strict security requirements&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  Data Warehouse/Data Lakehouse
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Snowflake
&lt;/h3&gt;

&lt;h4&gt;
  
  
  FILE Data Type Generally Available
&lt;/h4&gt;

&lt;p&gt;The FILE data type for handling unstructured data in Snowflake is now generally available.&lt;/p&gt;

&lt;p&gt;This means images and document files can now be used with confidence in generative AI workloads via Cortex AI SQL!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/release-notes/2025/other/2025-09-25-file-data-type-ga" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/release-notes/2025/other/2025-09-25-file-data-type-ga&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Cortex Analyst Feature Enhancements
&lt;/h4&gt;

&lt;p&gt;Cortex Analyst has received functional updates with two new features added. Derived metrics is a capability that other Semantic Layers already had, and since actual business often requires calculations using multiple metrics, this is a welcome addition!&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Private facts and metrics&lt;/strong&gt;: A feature that defines metrics in the Semantic Model but prevents end users from directly querying these metrics (primarily intended for metrics used only in Derived metrics)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Derived metrics&lt;/strong&gt;: A new type of metric that allows defining metrics based on calculations between multiple metrics&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/release-notes/2025/other/2025-09-30-semantic-model-improvements" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/release-notes/2025/other/2025-09-30-semantic-model-improvements&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  dbt Projects on Snowflake Now Supports docs generate
&lt;/h4&gt;

&lt;p&gt;dbt Projects on Snowflake received a silent update that enables &lt;code&gt;docs generate&lt;/code&gt; functionality.&lt;/p&gt;

&lt;p&gt;While I haven't tested it yet, this should allow the &lt;code&gt;execute dbt project&lt;/code&gt; command to perform docs generate when hosting docs with GitHub Actions, eliminating the need to rewrite &lt;code&gt;profiles.yml&lt;/code&gt; for dbt Core!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://x.com/SS_chneider/status/1973154146976145839" rel="noopener noreferrer"&gt;https://x.com/SS_chneider/status/1973154146976145839&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Claude Sonnet 4.5 Now Available in Snowflake
&lt;/h4&gt;

&lt;p&gt;Claude Sonnet 4.5 is now available within Snowflake. The &lt;a href="https://docs.snowflake.com/en/user-guide/snowflake-cortex/aisql" rel="noopener noreferrer"&gt;official documentation&lt;/a&gt; doesn't mention it yet.&lt;/p&gt;

&lt;p&gt;Additionally, it's accessible from otherwise unsupported regions by enabling cross-region inference.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.snowflake.com/en/blog/cortex-ai-claude-sonnet-4-5/" rel="noopener noreferrer"&gt;https://www.snowflake.com/en/blog/cortex-ai-claude-sonnet-4-5/&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  SELECT's Summary Article on Snowflake Features Released in Summer 2025
&lt;/h4&gt;

&lt;p&gt;SELECT has published a summary article on Snowflake features released in summer 2025.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://select.dev/posts/snowflake-summer-2025-product-updates" rel="noopener noreferrer"&gt;https://select.dev/posts/snowflake-summer-2025-product-updates&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Best Practices Article for Combining Snowflake × Power BI
&lt;/h4&gt;

&lt;p&gt;phData has published a best practices article for combining Snowflake × Power BI.&lt;/p&gt;

&lt;p&gt;The article mainly covers the following topics:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use Power BI's native Snowflake Connector&lt;/li&gt;
&lt;li&gt;Carefully select connection mode (Import, DirectQuery, or Composite) based on use case&lt;/li&gt;
&lt;li&gt;Properly model data, including adopting star schema&lt;/li&gt;
&lt;li&gt;Configure Microsoft Entra SSO for Snowflake&lt;/li&gt;
&lt;li&gt;Use appropriate Azure VMs for gateways&lt;/li&gt;
&lt;li&gt;Minimize distance between Snowflake and Power BI data centers&lt;/li&gt;
&lt;li&gt;Increase concurrent query limits for data models&lt;/li&gt;
&lt;li&gt;Leverage AI features like Copilot&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://www.phdata.io/blog/how-to-optimize-power-bi-and-snowflake-for-advanced-analyitcs/" rel="noopener noreferrer"&gt;https://www.phdata.io/blog/how-to-optimize-power-bi-and-snowflake-for-advanced-analyitcs/&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  BigQuery
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Column-Level Lineage Now Available in Dataplex
&lt;/h4&gt;

&lt;p&gt;As a new feature in Dataplex, column-level lineage viewing is now available (generally available).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/dataplex/docs/release-notes#September_29_2025" rel="noopener noreferrer"&gt;https://cloud.google.com/dataplex/docs/release-notes#September_29_2025&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/dataplex/docs/lineage-views#column-level-lineage" rel="noopener noreferrer"&gt;https://cloud.google.com/dataplex/docs/lineage-views#column-level-lineage&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feezpj73bfn5pympnaln3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feezpj73bfn5pympnaln3.png" alt="column-level-lineage" width="800" height="200"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Array Unnesting Feature Using Gemini Released
&lt;/h4&gt;

&lt;p&gt;A feature using Gemini that can expand each element of an array into independent rows has been released.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/bigquery/docs/release-notes#September_29_2025" rel="noopener noreferrer"&gt;https://cloud.google.com/bigquery/docs/release-notes#September_29_2025&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/bigquery/docs/data-prep-get-suggestions#unnest-arrays" rel="noopener noreferrer"&gt;https://cloud.google.com/bigquery/docs/data-prep-get-suggestions#unnest-arrays&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Summary Article on New BigQuery SQL Features
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://x.com/nii_yan" rel="noopener noreferrer"&gt;yu yamada&lt;/a&gt; from Google Cloud has published an article summarizing five new features related to BigQuery SQL, including UNION based on column names and simplified array operations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://zenn.dev/google_cloud_jp/articles/3b20a94df7624e" rel="noopener noreferrer"&gt;https://zenn.dev/google_cloud_jp/articles/3b20a94df7624e&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Databricks
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Databricks One in Public Preview
&lt;/h4&gt;

&lt;p&gt;"Databricks One," a simple user interface designed for business users, has entered public preview.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.databricks.com/aws/ja/workspace/databricks-one" rel="noopener noreferrer"&gt;https://docs.databricks.com/aws/ja/workspace/databricks-one&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As shown in the figure below, it features a UI where you can ask questions about data in natural language and directly link to related dashboards.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsjaejntspl41h7139k6b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsjaejntspl41h7139k6b.png" alt="landing-page-d83506567dae89e178878be9b9506725" width="800" height="565"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Lakeflow Pipelines Editor in Public Preview
&lt;/h4&gt;

&lt;p&gt;Databricks has released "Lakeflow Pipelines Editor," a new IDE for developing and debugging ETL pipelines, as a public preview.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.databricks.com/aws/en/dlt/dlt-multi-file-editor" rel="noopener noreferrer"&gt;https://docs.databricks.com/aws/en/dlt/dlt-multi-file-editor&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As shown in the figure quoted from the link above, it's not just for editing pipeline code but also allows viewing dependencies between tables.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl7ms0sk61ns2dn2kkz8i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl7ms0sk61ns2dn2kkz8i.png" alt="dlt-multi-file-editor-overview-bd4eb971616acd036963cdd1560b1d8f" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  OpenAI GPT-5 and Claude Sonnet 4.5 Now Available in Databricks
&lt;/h4&gt;

&lt;p&gt;While these are separate announcements, both GPT-5 and Sonnet 4.5 are now available within Databricks.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.databricks.com/blog/run-openai-models-directly-databricks" rel="noopener noreferrer"&gt;https://www.databricks.com/blog/run-openai-models-directly-databricks&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.databricks.com/blog/claude-sonnet-45-here" rel="noopener noreferrer"&gt;https://www.databricks.com/blog/claude-sonnet-45-here&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  MotherDuck/DuckDB
&lt;/h3&gt;

&lt;h4&gt;
  
  
  DuckDB ducklake Extension and DuckLake 0.3 Released
&lt;/h4&gt;

&lt;p&gt;The DuckDB ducklake extension and DuckLake 0.3 have been released. Using the ducklake extension requires DuckDB v1.4.0.&lt;/p&gt;

&lt;p&gt;The main updates appear to be data copying between DuckLake and Iceberg using DuckDB's iceberg extension, and using the MERGE statement released in DuckDB v1.4.0 through the ducklake extension.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://duckdb.org/2025/09/17/ducklake-03.html" rel="noopener noreferrer"&gt;https://duckdb.org/2025/09/17/ducklake-03.html&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  MotherDuck Announces First European Cloud Region in Private Preview
&lt;/h4&gt;

&lt;p&gt;MotherDuck has announced its first European cloud region as a private preview.&lt;/p&gt;

&lt;p&gt;This new region runs on AWS &lt;code&gt;eu-central-1&lt;/code&gt;, with official release planned for this fall.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://motherduck.com/blog/motherduck-in-europe/" rel="noopener noreferrer"&gt;https://motherduck.com/blog/motherduck-in-europe/&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Business Intelligence
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Looker
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Looker Accessible from Gemini CLI
&lt;/h4&gt;

&lt;p&gt;A feature to access Looker has been released as an extension for Gemini CLI.&lt;/p&gt;

&lt;p&gt;It appears you can list the available Explores, inspect the dimensions and measures in a given Explore, and even create Looks and dashboards in Looker.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/looker/docs/release-notes#September_23_2025" rel="noopener noreferrer"&gt;https://cloud.google.com/looker/docs/release-notes#September_23_2025&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/gemini-cli-extensions/looker" rel="noopener noreferrer"&gt;https://github.com/gemini-cli-extensions/looker&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Activation (Reverse ETL)
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Hightouch
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Dashboards Now Available in Hightouch
&lt;/h4&gt;

&lt;p&gt;As a new feature in Hightouch, functionality to consolidate multiple charts into dashboards has been released.&lt;/p&gt;

&lt;p&gt;This should be useful for cases where you want to check everything in Hightouch, such as dashboards for confirming campaign performance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://changelog.hightouch.io/" rel="noopener noreferrer"&gt;https://changelog.hightouch.io/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://hightouch.com/docs/campaign-intelligence/dashboards" rel="noopener noreferrer"&gt;https://hightouch.com/docs/campaign-intelligence/dashboards&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd9kjifoqmeqkeoghwykl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd9kjifoqmeqkeoghwykl.png" alt="dashboard_add_additional_chart" width="800" height="376"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Orchestration
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Airflow
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Airflow 3.1 Released
&lt;/h4&gt;

&lt;p&gt;Airflow's latest version 3.1 has been released.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/apache/airflow/releases/tag/3.1.0" rel="noopener noreferrer"&gt;https://github.com/apache/airflow/releases/tag/3.1.0&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Astronomer has published a blog post summarizing the added features.&lt;/p&gt;

&lt;p&gt;It appears that improvements to AI workflow support, updates to the React-based UI, and DAG favorites functionality have been added.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.astronomer.io/blog/introducing-apache-airflow-3-1/" rel="noopener noreferrer"&gt;https://www.astronomer.io/blog/introducing-apache-airflow-3-1/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>dataengineering</category>
      <category>snowflake</category>
      <category>databricks</category>
    </item>
    <item>
      <title>Personal Picks: Data Product News (August 20, 2025)</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Wed, 20 Aug 2025 01:46:10 +0000</pubDate>
      <link>https://forem.com/sagara/personal-picks-data-product-news-august-20-2025-85j</link>
      <guid>https://forem.com/sagara/personal-picks-data-product-news-august-20-2025-85j</guid>
      <description>&lt;p&gt;&lt;em&gt;Note: This is an English translation of the Japanese article at &lt;a href="https://dev.classmethod.jp/articles/modern-data-stack-info-summary-20250820/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/modern-data-stack-info-summary-20250820/&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Hi, this is Sagara.&lt;/p&gt;

&lt;p&gt;As a consultant specializing in Modern Data Stack, I'm constantly exposed to the vast amount of information being shared in this space daily.&lt;/p&gt;

&lt;p&gt;Among the numerous updates, I've compiled the Modern Data Stack-related information that caught my attention over the past two weeks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note: This article doesn't cover all the latest information about the mentioned products. It only includes information that I found interesting based on my personal judgment and preferences.&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Warehouse/Data Lakehouse
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Snowflake
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Database Backup "Snapshots" That Cannot Be Deleted or Edited Even by ACCOUNTADMIN Now in Public Preview
&lt;/h4&gt;

&lt;p&gt;Snowflake's new Snapshot feature is now in public preview.&lt;/p&gt;

&lt;p&gt;The key point is that, like clones, a snapshot replicates data with zero-copy; combined with the retention lock feature, it can be kept as a backup that cannot be deleted or edited.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/release-notes/2025/other/2025-08-18-worm-snapshots" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/release-notes/2025/other/2025-08-18-worm-snapshots&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/snapshots" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/snapshots&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I've actually tried it myself and written a blog post about it - please check it out!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.classmethod.jp/articles/snowflake-try-snapshot/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-try-snapshot/&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Stored Procedure "AI_GENERATE_TABLE_DESC" for Generating Descriptions Using Generative AI Now in Public Preview
&lt;/h4&gt;

&lt;p&gt;Snowflake's new stored procedure "AI_GENERATE_TABLE_DESC" for generating descriptions using generative AI is now in public preview. Previously, descriptions could only be generated by clicking buttons in Snowsight, but now they can be generated with generative AI via SQL commands.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/release-notes/2025/other/2025-08-14-sql-object-descriptions" rel="noopener noreferrer"&gt;https://docs.snowflake.com/release-notes/2025/other/2025-08-14-sql-object-descriptions&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/sql-cortex-descriptions" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/sql-cortex-descriptions&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here's my blog post about trying this feature. While AI_GENERATE_TABLE_DESC returns descriptions in English, I've also written about a custom stored procedure that translates and stores them in Japanese using the Translate function - please take a look!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.classmethod.jp/articles/snowflake-try-generate-table-desc/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-try-generate-table-desc/&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Cortex Knowledge Extensions Now Generally Available
&lt;/h4&gt;

&lt;p&gt;Snowflake's Cortex Knowledge Extensions is now generally available. This feature allows content that can be referenced by agent functions like Snowflake Intelligence to be obtained from the Marketplace. In essence, databases with embedded Cortex Search Service can now be obtained through the Marketplace.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.snowflake.com/en/blog/easy-button-context-rich-ai-agents/" rel="noopener noreferrer"&gt;https://www.snowflake.com/en/blog/easy-button-context-rich-ai-agents/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I tried the official Snowflake documentation's Cortex Knowledge Extensions with Snowflake Intelligence, and the answer accuracy clearly improved! This is a great example of how good data leads to good AI results.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.classmethod.jp/articles/snowflake-try-cortex-knowledge-extensions-with-snowflake-intelligence/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-try-cortex-knowledge-extensions-with-snowflake-intelligence/&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Workload Identity Federation Now Generally Available
&lt;/h4&gt;

&lt;p&gt;Snowflake has released Workload identity federation as a new authentication mechanism.&lt;/p&gt;

&lt;p&gt;With Workload identity federation, you can build service-to-service authentication mechanisms that authenticate to Snowflake using cloud provider identity systems such as AWS IAM, Microsoft Entra ID, and Google Cloud.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/release-notes/2025/other/2025-08-14-wif" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/release-notes/2025/other/2025-08-14-wif&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/workload-identity-federation" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/workload-identity-federation&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For practical usage, the following blog is very helpful:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://zenn.dev/jimatomo/articles/c514c6e322bf1a" rel="noopener noreferrer"&gt;https://zenn.dev/jimatomo/articles/c514c6e322bf1a&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Looking ahead, if various SaaS/OSS tools that require authentication when integrating with Snowflake support Workload identity federation, we can connect these tools to Snowflake more securely and easily! (For example, looking at the latest roadmap for terraform-provider-snowflake linked below, it seems to be planned for implementation by the end of 2025.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/snowflakedb/terraform-provider-snowflake/blob/main/ROADMAP.md" rel="noopener noreferrer"&gt;https://github.com/snowflakedb/terraform-provider-snowflake/blob/main/ROADMAP.md&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Snowpipe Billing Model Changed to Simple Volume-Based for Business Critical and Above Plans
&lt;/h4&gt;

&lt;p&gt;With &lt;a href="https://docs.snowflake.com/en/release-notes/2025/9_21#simplified-snowpipe-pricing" rel="noopener noreferrer"&gt;the 9.21 release around August 1st&lt;/a&gt;, the Snowpipe billing model has been changed to a simple volume-based system for Business Critical and above plans.&lt;/p&gt;

&lt;p&gt;This makes estimation much easier than before, and the new billing model seems better for cases where you want to load many small files!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bppkxtuw7m7wup5x5oh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bppkxtuw7m7wup5x5oh.png" alt="2025-08-14_14h20_38_720" width="720" height="527"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For details, please check the official information below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/data-load-snowpipe-billing" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/data-load-snowpipe-billing&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.snowflake.com/legal-files/CreditConsumptionTable.pdf" rel="noopener noreferrer"&gt;https://www.snowflake.com/legal-files/CreditConsumptionTable.pdf&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  New Community-Built Snowflake MCP Server
&lt;/h4&gt;

&lt;p&gt;A new Snowflake MCP Server has been released by the international community. While the official MCP Server works through Cortex Agents, this one connects via the Python Connector, allowing a broader range of requests to Snowflake.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/snowflake/the-general-purpose-snowflake-mcp-server-sql-operation-through-natural-language-ddd33bba4fa7" rel="noopener noreferrer"&gt;https://medium.com/snowflake/the-general-purpose-snowflake-mcp-server-sql-operation-through-natural-language-ddd33bba4fa7&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/uniquejtx/snowflake-generic-mcp" rel="noopener noreferrer"&gt;https://github.com/uniquejtx/snowflake-generic-mcp&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Databricks
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Unity Catalog Adds User Access Request Feature for Data Objects (Public Preview)
&lt;/h4&gt;

&lt;p&gt;Unity Catalog in Databricks has added a new feature for user access requests to data objects.&lt;/p&gt;

&lt;p&gt;The functionality allows you to pre-configure notification destinations such as email addresses or Slack channels, and when users make access requests, notifications are sent to the specified destinations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.databricks.com/aws/en/data-governance/unity-catalog/manage-privileges/access-request-destinations" rel="noopener noreferrer"&gt;https://docs.databricks.com/aws/en/data-governance/unity-catalog/manage-privileges/access-request-destinations&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Unity Catalog REST API Usage Guide
&lt;/h4&gt;

&lt;p&gt;The Unity Catalog blog published an article summarizing baseline usage patterns for the Unity Catalog REST API.&lt;/p&gt;

&lt;p&gt;Specifically, it explains common operations like listing (GET), creating (POST), updating (PATCH), and deleting (DELETE) for Catalogs and Tables using Python's requests library, with concrete code examples.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.unitycatalog.io/blogs/how-to-use-the-unity-catalog-rest-api" rel="noopener noreferrer"&gt;https://www.unitycatalog.io/blogs/how-to-use-the-unity-catalog-rest-api&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Transform
&lt;/h2&gt;

&lt;h3&gt;
  
  
  dbt
&lt;/h3&gt;

&lt;h4&gt;
  
  
  dbt Fusion and Official VS Code Extension Move from Beta to Preview
&lt;/h4&gt;

&lt;p&gt;dbt Fusion and the official VS Code Extension, previously available as Beta, have moved to Preview status.&lt;/p&gt;

&lt;p&gt;According to the article below, they defined a metric called "Fusion conformance" to verify that Fusion behaves exactly like dbt Core on a given dbt project. A sufficient percentage of users' dbt projects now pass this metric, which gave them the confidence to move to a preview release.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.getdbt.com/blog/fusion-and-dbt-vs-code-extension-preview-launch" rel="noopener noreferrer"&gt;https://www.getdbt.com/blog/fusion-and-dbt-vs-code-extension-preview-launch&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Business Intelligence
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Looker
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Looker 25.14 Release Notes Published
&lt;/h4&gt;

&lt;p&gt;The release notes for Looker's latest version 25.14 have been published.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/looker/docs/release-notes#August_13_2025" rel="noopener noreferrer"&gt;https://cloud.google.com/looker/docs/release-notes#August_13_2025&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I'm particularly excited about the ability to define &lt;code&gt;synonyms&lt;/code&gt; in views! Since internal conversations often use abbreviations for metrics, defining these abbreviations as synonyms should enable more natural interactions in Conversational Analytics to get the desired data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/looker/docs/reference/param-field-synonyms" rel="noopener noreferrer"&gt;https://cloud.google.com/looker/docs/reference/param-field-synonyms&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Omni
&lt;/h3&gt;

&lt;h4&gt;
  
  
  "Omni Spreadsheets" Released - Perform Data Processing and Aggregation with Spreadsheet-Like Operations
&lt;/h4&gt;

&lt;p&gt;Omni has released "Omni spreadsheets," a new feature that allows data processing and aggregation with almost the same operations as spreadsheets.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://omni.co/blog/building-our-financial-models-with-omni-spreadsheets" rel="noopener noreferrer"&gt;https://omni.co/blog/building-our-financial-models-with-omni-spreadsheets&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.omni.co/docs/querying-and-sql/workbook/spreadsheet-tabs" rel="noopener noreferrer"&gt;https://docs.omni.co/docs/querying-and-sql/workbook/spreadsheet-tabs&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Looking at the demo below, it's almost like a spreadsheet, making it a great feature for creating rich tabular reports that can only be made in spreadsheets. However, a concern is the risk of accumulating data with various calculated metrics in spreadsheets separate from Omni's defined Semantic Layer. It would be nice if this could be controlled well with permissions!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=aBjnn8FUHxE" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=aBjnn8FUHxE&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Omni Can Now Push dbt Exposures
&lt;/h4&gt;

&lt;p&gt;I've only seen this in the changelog, but Omni has added a new feature to push dbt exposures.&lt;/p&gt;

&lt;p&gt;This allows you to output how Omni content is linked to dbt Models as exposures and view them in lineage.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://omni.co/changelog" rel="noopener noreferrer"&gt;https://omni.co/changelog&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Catalog
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Select Star
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Select Star's August Release Summary
&lt;/h4&gt;

&lt;p&gt;Select Star's ChangeLog published a summary of August releases.&lt;/p&gt;

&lt;p&gt;Notable updates include ER diagram refresh and MCP Server release.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.selectstar.com/changelog/aug-7-2025-clearer-erd-ai-ready-metadata-mcp-smarter-metrics-and-more" rel="noopener noreferrer"&gt;https://docs.selectstar.com/changelog/aug-7-2025-clearer-erd-ai-ready-metadata-mcp-smarter-metrics-and-more&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Quality / Data Observability
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Elementary
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Elementary's July Update Summary
&lt;/h4&gt;

&lt;p&gt;Elementary's official blog published an article summarizing July updates.&lt;/p&gt;

&lt;p&gt;I was particularly interested in the MCP Server and the feature to exclude anomalous data during training for anomaly detection.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.elementary-data.com/post/july-product-update" rel="noopener noreferrer"&gt;https://www.elementary-data.com/post/july-product-update&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Orchestration
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Dagster
&lt;/h3&gt;

&lt;h4&gt;
  
  
  MCP Server Released
&lt;/h4&gt;

&lt;p&gt;Dagster has released an MCP Server.&lt;/p&gt;

&lt;p&gt;According to the blog below, use cases include creating project templates and building workflows by integrating with dbt and Snowflake MCP Servers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dagster.io/blog/dagsters-mcp-server" rel="noopener noreferrer"&gt;https://dagster.io/blog/dagsters-mcp-server&lt;/a&gt;&lt;/p&gt;

</description>
      <category>dataengineering</category>
      <category>snowflake</category>
      <category>mcp</category>
    </item>
    <item>
      <title>Running the "Quickstart for dbt and Snowflake" Tutorial with dbt Projects on Snowflake</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Wed, 13 Aug 2025 05:35:24 +0000</pubDate>
      <link>https://forem.com/sagara/running-the-quickstart-for-dbt-and-snowflake-tutorial-with-dbt-projects-on-snowflake-5amm</link>
      <guid>https://forem.com/sagara/running-the-quickstart-for-dbt-and-snowflake-tutorial-with-dbt-projects-on-snowflake-5amm</guid>
      <description>&lt;p&gt;&lt;em&gt;This article is an English translation of: &lt;a href="https://dev.classmethod.jp/articles/dbt-quickstart-for-dbt-and-snowflake-with-dbt-projects-on-snowflake/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/dbt-quickstart-for-dbt-and-snowflake-with-dbt-projects-on-snowflake/&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Hi, this is Sagara.&lt;/p&gt;

&lt;p&gt;An official tutorial called "Quickstart for dbt and Snowflake" is available for dbt Cloud. This tutorial is recommended as a first step to learn about dbt itself.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=1" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=1&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Additionally, as of the end of June 2025, dbt Projects on Snowflake, which allows you to develop and run dbt within Snowflake, is in public preview.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/user-guide/data-engineering/dbt-projects-on-snowflake" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/user-guide/data-engineering/dbt-projects-on-snowflake&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.classmethod.jp/articles/snowflake-pupr-dbt-projects/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-pupr-dbt-projects/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I tried running this "Quickstart for dbt and Snowflake" tutorial with dbt Projects on Snowflake, and I'll summarize the experience in this article.&lt;/p&gt;

&lt;h2&gt;
  
  
  Important Notes
&lt;/h2&gt;

&lt;p&gt;The following dbt Cloud-specific features are &lt;strong&gt;out of scope&lt;/strong&gt; for this article. Please be aware:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Using Managed Repository

&lt;ul&gt;
&lt;li&gt;In this tutorial, we'll develop without Git integration&lt;/li&gt;
&lt;li&gt;For information about Git integration in dbt Projects on Snowflake, please refer to &lt;a href="https://dev.classmethod.jp/articles/try-dbt-projects-on-snowflake-with-github/" rel="noopener noreferrer"&gt;this blog post&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Viewing documentation&lt;/li&gt;

&lt;li&gt;Environment and Job configuration (dbt Cloud-only features)

&lt;ul&gt;
&lt;li&gt;In dbt Projects on Snowflake, these are replaced by &lt;code&gt;profiles.yml&lt;/code&gt; definitions and by deploying the dbt project to run via Snowflake tasks&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  0. Prerequisites
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Creating a Snowflake Trial Account
&lt;/h3&gt;

&lt;p&gt;We'll use a Snowflake trial account for this tutorial, so please create one.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://signup.snowflake.com/?owner=classmethodlead&amp;amp;plan=enterprise&amp;amp;cloud=aws&amp;amp;region=ap-northeast-1&amp;amp;utm_source=dev.classmethod.jp&amp;amp;utm_medium=banner&amp;amp;utm_content=snowflake&amp;amp;utm_campaign=single_article_foot_ads" rel="noopener noreferrer"&gt;https://signup.snowflake.com/?owner=classmethodlead&amp;amp;plan=enterprise&amp;amp;cloud=aws&amp;amp;region=ap-northeast-1&amp;amp;utm_source=dev.classmethod.jp&amp;amp;utm_medium=banner&amp;amp;utm_content=snowflake&amp;amp;utm_campaign=single_article_foot_ads&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Enabling Secondary Roles (if not already enabled)
&lt;/h3&gt;

&lt;p&gt;Next, check if secondary roles are enabled in your worksheet.&lt;/p&gt;

&lt;p&gt;Run the following query and confirm that &lt;code&gt;default_secondary_roles&lt;/code&gt; is set to &lt;code&gt;["ALL"]&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;show&lt;/span&gt; &lt;span class="n"&gt;users&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If it shows a different value, run the following query to enable secondary roles:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;alter&lt;/span&gt; &lt;span class="k"&gt;user&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;username&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;set&lt;/span&gt; &lt;span class="n"&gt;default_secondary_roles&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'all'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  1. Introduction
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=1" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=1&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This chapter explains what we'll do in this tutorial.&lt;/p&gt;

&lt;p&gt;Here's what we'll cover in order:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a new Snowflake worksheet&lt;/li&gt;
&lt;li&gt;Load sample data into your Snowflake account&lt;/li&gt;
&lt;li&gt;Convert sample queries into models in your dbt project (dbt models are SELECT statements)&lt;/li&gt;
&lt;li&gt;Add sources to your dbt project. Sources allow you to name and describe raw data already loaded into Snowflake&lt;/li&gt;
&lt;li&gt;Add tests to your models&lt;/li&gt;
&lt;li&gt;Document your models&lt;/li&gt;
&lt;li&gt;Schedule job runs

&lt;ul&gt;
&lt;li&gt;Note: Instead of jobs, we'll deploy the dbt project and run it via a Snowflake task&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
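&lt;p&gt;To give a sense of what the model and source steps look like in practice, here is a minimal sketch (my own illustration, not code from the tutorial) of a staging model that selects from a declared source. The source and column names follow the jaffle_shop data we load later in this article:&lt;/p&gt;

```sql
-- models/staging/stg_customers.sql : a dbt model is just a SELECT statement.
-- It assumes a source named 'jaffle_shop' is declared in a .yml file, e.g.:
--   sources:
--     - name: jaffle_shop
--       database: raw
--       tables: [customers, orders]
select
    id as customer_id,
    first_name,
    last_name
from {{ source('jaffle_shop', 'customers') }}
```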

&lt;h2&gt;
  
  
  2. Create a new Snowflake worksheet
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=2" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=2&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, we'll create a worksheet to run queries in Snowflake.&lt;/p&gt;

&lt;p&gt;From the left menu in Snowflake, go to &lt;code&gt;Projects&lt;/code&gt;, then click &lt;code&gt;Worksheets&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw9o23kt6yb7sczdlolni.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw9o23kt6yb7sczdlolni.png" alt="2025-08-13_08h51_05" width="292" height="256"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click the &lt;code&gt;+&lt;/code&gt; button in the upper right to create a new worksheet.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiy0p49af5wapb7mj6l38.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiy0p49af5wapb7mj6l38.png" alt="2025-08-13_08h52_45" width="800" height="243"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Load data
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=3" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=3&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, we'll load the data needed for the tutorial.&lt;/p&gt;

&lt;p&gt;First, run the following query to create the necessary database, schemas, tables, and warehouse for the tutorial. The tables will also load data from public S3.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Important: For line 8 &lt;code&gt;create schema analytics.dbt_ssagara;&lt;/code&gt;, please use your own name or something that identifies it as your schema.&lt;/strong&gt; (Since my name is Sagara Satoshi, I'm using &lt;code&gt;dbt_ssagara&lt;/code&gt;.)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;use&lt;/span&gt; &lt;span class="k"&gt;role&lt;/span&gt; &lt;span class="n"&gt;accountadmin&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="n"&gt;warehouse&lt;/span&gt; &lt;span class="n"&gt;transforming&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;database&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;database&lt;/span&gt; &lt;span class="n"&gt;analytics&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;schema&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;schema&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;stripe&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;schema&lt;/span&gt; &lt;span class="n"&gt;analytics&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dbt_ssagara&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; 
&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;schema&lt;/span&gt; &lt;span class="n"&gt;analytics&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prod&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;table&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;customers&lt;/span&gt; 
&lt;span class="p"&gt;(&lt;/span&gt; &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;integer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;first_name&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;last_name&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;copy&lt;/span&gt; &lt;span class="k"&gt;into&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;customers&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;first_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;last_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="s1"&gt;'s3://dbt-tutorial-public/jaffle_shop_customers.csv'&lt;/span&gt;
&lt;span class="n"&gt;file_format&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="k"&gt;type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CSV'&lt;/span&gt;
    &lt;span class="n"&gt;field_delimiter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;','&lt;/span&gt;
    &lt;span class="n"&gt;skip_header&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
    &lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;table&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;orders&lt;/span&gt;
&lt;span class="p"&gt;(&lt;/span&gt; &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;integer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;user_id&lt;/span&gt; &lt;span class="nb"&gt;integer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;order_date&lt;/span&gt; &lt;span class="nb"&gt;date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;_etl_loaded_at&lt;/span&gt; &lt;span class="nb"&gt;timestamp&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="k"&gt;current_timestamp&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;copy&lt;/span&gt; &lt;span class="k"&gt;into&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;orders&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;user_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="s1"&gt;'s3://dbt-tutorial-public/jaffle_shop_orders.csv'&lt;/span&gt;
&lt;span class="n"&gt;file_format&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="k"&gt;type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CSV'&lt;/span&gt;
    &lt;span class="n"&gt;field_delimiter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;','&lt;/span&gt;
    &lt;span class="n"&gt;skip_header&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
    &lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;table&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;stripe&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;payment&lt;/span&gt; 
&lt;span class="p"&gt;(&lt;/span&gt; &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;integer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;orderid&lt;/span&gt; &lt;span class="nb"&gt;integer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;paymentmethod&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;amount&lt;/span&gt; &lt;span class="nb"&gt;integer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;created&lt;/span&gt; &lt;span class="nb"&gt;date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;_batched_at&lt;/span&gt; &lt;span class="nb"&gt;timestamp&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="k"&gt;current_timestamp&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;copy&lt;/span&gt; &lt;span class="k"&gt;into&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;stripe&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;payment&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;orderid&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;paymentmethod&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;amount&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;created&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="s1"&gt;'s3://dbt-tutorial-public/stripe_payments.csv'&lt;/span&gt;
&lt;span class="n"&gt;file_format&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="k"&gt;type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'CSV'&lt;/span&gt;
    &lt;span class="n"&gt;field_delimiter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;','&lt;/span&gt;
    &lt;span class="n"&gt;skip_header&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
    &lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, run the following queries in order to confirm that data has been loaded successfully into each table:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;customers&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;orders&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;stripe&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;payment&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F190odfiuvnra44cyo6hc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F190odfiuvnra44cyo6hc.png" alt="2025-08-13_08h58_02" width="800" height="522"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  4. Connect dbt to Snowflake
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=4" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=4&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This chapter covers setting up the connection from dbt Cloud to Snowflake, but since this is not needed for dbt Projects on Snowflake, we'll skip it.&lt;/p&gt;

&lt;h2&gt;
  
  
  5. Set up a dbt managed repository
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=5" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=5&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This chapter covers setting up the Managed Repository, which is a dbt Cloud-specific feature. Since this is not relevant to dbt Projects on Snowflake, we'll skip it.&lt;/p&gt;

&lt;p&gt;As mentioned in the important notes at the beginning of this article, we'll develop without Git integration in this tutorial.&lt;/p&gt;

&lt;p&gt;For reference, you can learn about Git integration in dbt Projects on Snowflake in the following blog post:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.classmethod.jp/articles/try-dbt-projects-on-snowflake-with-github/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/try-dbt-projects-on-snowflake-with-github/&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  6. Initialize your dbt project and start developing
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=6" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=6&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, we'll set up the dbt Project. Let's move on to dbt Projects on Snowflake operations!&lt;/p&gt;

&lt;p&gt;From the left menu in Snowflake, go to &lt;code&gt;Projects&lt;/code&gt;, then click &lt;code&gt;Workspaces&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bhtasy9895lrnacr8rk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bhtasy9895lrnacr8rk.png" alt="2025-08-13_09h09_12" width="389" height="568"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;+ Add new&lt;/code&gt; in the upper left, then click &lt;code&gt;dbt Project&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flbv7oq1ss30ehnouudqa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flbv7oq1ss30ehnouudqa.png" alt="2025-08-13_09h13_28" width="600" height="505"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the popup that appears, enter the following information and click &lt;code&gt;Create&lt;/code&gt; in the bottom right:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;Project Name&lt;/code&gt;: dbt_quickstart

&lt;ul&gt;
&lt;li&gt;Any name is fine&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;
&lt;code&gt;Role&lt;/code&gt;: accountadmin

&lt;ul&gt;
&lt;li&gt;The role used when queries are executed through dbt Projects in the Workspace. We're using ACCOUNTADMIN for this tutorial, but any role will work as long as it has read permissions on the source databases, schemas, and tables, and create permissions on the target database and schema where dbt will create objects.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;
&lt;code&gt;Warehouse&lt;/code&gt;: transforming

&lt;ul&gt;
&lt;li&gt;The warehouse used when queries are executed through dbt Projects in the Workspace&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;
&lt;code&gt;Database&lt;/code&gt;: analytics

&lt;ul&gt;
&lt;li&gt;The target database for dbt output&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;
&lt;code&gt;Schema&lt;/code&gt;: dbt_ssagara

&lt;ul&gt;
&lt;li&gt;The development output schema for dbt&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fphl7c7g9q7cmkzr6ftso.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fphl7c7g9q7cmkzr6ftso.png" alt="2025-08-13_09h24_55" width="800" height="631"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This operation creates a new dbt Project!&lt;/p&gt;

&lt;p&gt;The information you entered is written to &lt;code&gt;profiles.yml&lt;/code&gt;. The official documentation for &lt;code&gt;profiles.yml&lt;/code&gt; is here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/docs/core/connect-data-platform/profiles.yml" rel="noopener noreferrer"&gt;https://docs.getdbt.com/docs/core/connect-data-platform/profiles.yml&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Looking at the contents of &lt;code&gt;profiles.yml&lt;/code&gt;, you'll see &lt;code&gt;dev:&lt;/code&gt; listed. This is the name of a &lt;a href="https://docs.getdbt.com/reference/dbt-jinja-functions/target" rel="noopener noreferrer"&gt;target&lt;/a&gt;, which lets you switch between the environments (development, production, etc.) that dbt commands run against. (You can use a name other than &lt;code&gt;dev&lt;/code&gt;, but since the currently configured profile is for development, &lt;code&gt;dev&lt;/code&gt; is appropriate.)&lt;/p&gt;

&lt;p&gt;Also, &lt;code&gt;target: dev&lt;/code&gt; means "use &lt;code&gt;dev&lt;/code&gt; if no target is specified when running dbt commands." We'll add a production profile to &lt;code&gt;profiles.yml&lt;/code&gt; later, but to avoid accidentally modifying the production environment with careless commands, it's recommended to keep &lt;code&gt;target: dev&lt;/code&gt; unchanged.&lt;/p&gt;
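&lt;p&gt;As a rough sketch (the values follow this tutorial's setup; the file generated in your Workspace may differ slightly), &lt;code&gt;profiles.yml&lt;/code&gt; looks something like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;dbt_quickstart:
  target: dev        # default target used when none is specified
  outputs:
    dev:
      type: snowflake
      role: accountadmin
      warehouse: transforming
      database: analytics
      schema: dbt_ssagara
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Once a &lt;code&gt;prod&lt;/code&gt; output is added later, you opt into it explicitly with &lt;code&gt;dbt run --target prod&lt;/code&gt;, while a plain &lt;code&gt;dbt run&lt;/code&gt; keeps using &lt;code&gt;dev&lt;/code&gt;.&lt;/p&gt;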

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fujvdna5c8bwijjpi68g3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fujvdna5c8bwijjpi68g3.png" alt="2025-08-13_09h27_20" width="800" height="495"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's also try running &lt;code&gt;dbt run&lt;/code&gt; at this point. &lt;code&gt;dbt run&lt;/code&gt; is a command that executes the SELECT statements written in the &lt;code&gt;.sql&lt;/code&gt; files in the &lt;code&gt;models&lt;/code&gt; folder as CREATE TABLE AS SELECT (CTAS) or CREATE VIEW AS SELECT statements against the specified database and schema.&lt;/p&gt;
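&lt;p&gt;For example, with the default view materialization, a hypothetical model file &lt;code&gt;some_model.sql&lt;/code&gt; containing only a SELECT is wrapped roughly like this when &lt;code&gt;dbt run&lt;/code&gt; executes it (a simplified sketch; the DDL dbt actually generates includes additional boilerplate):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- models/some_model.sql (what you write)
select 1 as id;

-- roughly what dbt run executes against Snowflake
create or replace view analytics.dbt_ssagara.some_model as (
    select 1 as id
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;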

&lt;p&gt;Select &lt;code&gt;Run&lt;/code&gt; from the command selection button in the upper right and press the play button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo7nzm2mup8y0zxv1pvy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo7nzm2mup8y0zxv1pvy.png" alt="2025-08-13_09h48_18" width="800" height="530"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Logs will appear at the bottom of the Workspace.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F29nc56puxqy446wunqyj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F29nc56puxqy446wunqyj.png" alt="2025-08-13_09h49_08" width="755" height="397"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmm8of11zg65cdinvqo0h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmm8of11zg65cdinvqo0h.png" alt="2025-08-13_09h49_25" width="800" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once processing is complete, check the &lt;code&gt;dbt_ssagara&lt;/code&gt; schema from the &lt;code&gt;Object Explorer&lt;/code&gt; in the lower left of the Workspace. You can see that &lt;code&gt;MY_FIRST_DBT_MODEL&lt;/code&gt; and &lt;code&gt;MY_SECOND_DBT_MODEL&lt;/code&gt;, which were in the default &lt;code&gt;models&lt;/code&gt; folder, have been created!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fclaugt35m68xmp1laaye.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fclaugt35m68xmp1laaye.png" alt="2025-08-13_09h50_58" width="631" height="518"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  7. Build your first model
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=7" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=7&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, we'll create a new &lt;code&gt;.sql&lt;/code&gt; file using the data loaded into Snowflake and run &lt;code&gt;dbt run&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The word "model" appears here - in dbt, &lt;code&gt;.sql&lt;/code&gt; files stored in the &lt;code&gt;models&lt;/code&gt; folder are called "models."&lt;/p&gt;

&lt;p&gt;Click the &lt;code&gt;+&lt;/code&gt; next to the &lt;code&gt;models&lt;/code&gt; folder, select &lt;code&gt;SQL File&lt;/code&gt;, and create a file called &lt;code&gt;customers.sql&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftgs6e3960139lovfmkd7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftgs6e3960139lovfmkd7.png" alt="2025-08-13_09h58_05" width="748" height="578"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqntoa33ya4tpmbr3bjhb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqntoa33ya4tpmbr3bjhb.png" alt="2025-08-13_09h58_40" width="599" height="297"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can now edit &lt;code&gt;customers.sql&lt;/code&gt; in the editor pane on the right. Copy and paste the following SQL:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;customers&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;first_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;last_name&lt;/span&gt;

    &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;customers&lt;/span&gt;

&lt;span class="p"&gt;),&lt;/span&gt;

&lt;span class="n"&gt;orders&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;order_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;user_id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;status&lt;/span&gt;

    &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;orders&lt;/span&gt;

&lt;span class="p"&gt;),&lt;/span&gt;

&lt;span class="n"&gt;customer_orders&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

        &lt;span class="k"&gt;min&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;order_date&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;first_order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;order_date&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;most_recent_order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;count&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;order_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;number_of_orders&lt;/span&gt;

    &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;orders&lt;/span&gt;

    &lt;span class="k"&gt;group&lt;/span&gt; &lt;span class="k"&gt;by&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

&lt;span class="p"&gt;),&lt;/span&gt;

&lt;span class="k"&gt;final&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="n"&gt;customers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;customers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;first_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;customers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;last_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;customer_orders&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;first_order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;customer_orders&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;most_recent_order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;coalesce&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;customer_orders&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;number_of_orders&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;number_of_orders&lt;/span&gt;

    &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;customers&lt;/span&gt;

    &lt;span class="k"&gt;left&lt;/span&gt; &lt;span class="k"&gt;join&lt;/span&gt; &lt;span class="n"&gt;customer_orders&lt;/span&gt; &lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="k"&gt;final&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After pasting, it should look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk5fjv3kbk6lfjtkg5t1k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk5fjv3kbk6lfjtkg5t1k.png" alt="2025-08-13_09h59_57" width="800" height="497"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To run &lt;code&gt;dbt run&lt;/code&gt;, select &lt;code&gt;Run&lt;/code&gt; from the command selection button in the upper right and press the play button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo7nzm2mup8y0zxv1pvy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo7nzm2mup8y0zxv1pvy.png" alt="2025-08-13_09h48_18" width="800" height="530"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once processing is complete, check the &lt;code&gt;dbt_ssagara&lt;/code&gt; schema from the &lt;code&gt;Object Explorer&lt;/code&gt; in the lower left of the Workspace. You can see that a view called &lt;code&gt;CUSTOMERS&lt;/code&gt; has been created.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fny0spvjmo4pf7iqwrgp1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fny0spvjmo4pf7iqwrgp1.png" alt="2025-08-13_10h04_13" width="800" height="412"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  8. Change the way your model is materialized
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=8" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=8&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, we'll experiment with &lt;a href="https://docs.getdbt.com/docs/build/materializations" rel="noopener noreferrer"&gt;Materialization&lt;/a&gt;, which determines how objects are created in Snowflake (as tables, views, etc.).&lt;/p&gt;

&lt;p&gt;First, open &lt;code&gt;dbt_project.yml&lt;/code&gt;. This is an essential file for dbt that handles overall project settings and folder path specifications.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flft3etccd8575pv4zmqt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flft3etccd8575pv4zmqt.png" alt="2025-08-13_10h20_02" width="576" height="227"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Modify the &lt;code&gt;models:&lt;/code&gt; code at the bottom as follows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Before
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;models&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;dbt_quickstart&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;+materialized&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;view&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;After
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;models&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;dbt_quickstart&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;+materialized&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;table&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's run &lt;code&gt;dbt run&lt;/code&gt; in this state to see how the behavior changes. Select &lt;code&gt;Run&lt;/code&gt; from the command selection button in the upper right and press the play button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo7nzm2mup8y0zxv1pvy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo7nzm2mup8y0zxv1pvy.png" alt="2025-08-13_09h48_18" width="800" height="530"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once processing is complete, check the &lt;code&gt;dbt_ssagara&lt;/code&gt; schema from the &lt;code&gt;Object Explorer&lt;/code&gt; in the lower left of the Workspace. You should see that everything has been created as tables.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzp9zf9syp9rh68ixue1y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzp9zf9syp9rh68ixue1y.png" alt="2025-08-13_10h23_56" width="589" height="291"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For reference, Materialization settings can be configured at the project or folder level in &lt;code&gt;dbt_project.yml&lt;/code&gt;, and also specified within each &lt;code&gt;.sql&lt;/code&gt; file. For example, looking at &lt;code&gt;my_first_dbt_model.sql&lt;/code&gt;, you'll see a &lt;code&gt;config&lt;/code&gt; specification as shown below. When the same config setting exists in both &lt;code&gt;dbt_project.yml&lt;/code&gt; and a &lt;code&gt;.sql&lt;/code&gt; file, the &lt;code&gt;.sql&lt;/code&gt; file takes precedence.&lt;/p&gt;
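&lt;p&gt;The in-file form is a &lt;code&gt;config&lt;/code&gt; block at the top of the model, along the lines of this minimal sketch:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;{{ config(materialized='view') }}

-- this model is built as a view even if dbt_project.yml says table
select
    id as customer_id,
    first_name,
    last_name
from raw.jaffle_shop.customers
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;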

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo4m0u5zxeqy06fsapxvu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo4m0u5zxeqy06fsapxvu.png" alt="2025-08-13_10h25_37" width="800" height="365"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As additional reference, Materializations include not only tables and views but also &lt;code&gt;incremental&lt;/code&gt;, which updates a table incrementally, and &lt;code&gt;ephemeral&lt;/code&gt;, which creates no database object and is instead inlined into downstream models as a CTE (WITH clause). For details on these, please refer to the official documentation below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/docs/build/materializations" rel="noopener noreferrer"&gt;https://docs.getdbt.com/docs/build/materializations&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  9. Delete the example models
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=9" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=9&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, we'll delete the sample &lt;code&gt;.sql&lt;/code&gt; files that were added by default.&lt;/p&gt;

&lt;p&gt;From the &lt;code&gt;models&lt;/code&gt; folder, click the "..." next to &lt;code&gt;my_first_dbt_model.sql&lt;/code&gt; and &lt;code&gt;my_second_dbt_model.sql&lt;/code&gt;, and click &lt;code&gt;Delete&lt;/code&gt; for each.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvswko3m7911blf3uh8f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvswko3m7911blf3uh8f.png" alt="2025-08-13_10h32_14" width="725" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw754k0be301imt4qvwv9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw754k0be301imt4qvwv9.png" alt="2025-08-13_10h32_36" width="767" height="366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This completes what we need to do in this chapter, but it's important to note that &lt;strong&gt;dbt doesn't automatically delete the corresponding tables/views when &lt;code&gt;.sql&lt;/code&gt; files are deleted&lt;/strong&gt;. After deleting the &lt;code&gt;.sql&lt;/code&gt; files, you need to drop the corresponding tables/views yourself, for example with DROP statements.&lt;/p&gt;
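&lt;p&gt;For instance, the sample objects from this tutorial could be cleaned up with something like the following (object names follow this tutorial's setup; check whether each object is a table or a view in your schema before running):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;drop table if exists analytics.dbt_ssagara.my_first_dbt_model;
drop table if exists analytics.dbt_ssagara.my_second_dbt_model;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;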

&lt;h2&gt;
  
  
  10. Build models on top of other models
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=10" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=10&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, following &lt;a href="https://docs.getdbt.com/best-practices/how-we-structure/1-guide-overview" rel="noopener noreferrer"&gt;dbt best practices&lt;/a&gt;, we'll break down the data cleanup logic and create separate &lt;code&gt;.sql&lt;/code&gt; files (corresponding to the &lt;a href="https://docs.getdbt.com/best-practices/how-we-structure/2-staging" rel="noopener noreferrer"&gt;Staging layer&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;First, create the following two new &lt;code&gt;.sql&lt;/code&gt; files directly under the &lt;code&gt;models&lt;/code&gt; folder for the Staging layer:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;stg_customers.sql&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt;
    &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;first_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;last_name&lt;/span&gt;

&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;customers&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;stg_orders.sql&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt;
    &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;order_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;user_id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;status&lt;/span&gt;

&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;raw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;jaffle_shop&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;orders&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, update the previously created &lt;code&gt;customers.sql&lt;/code&gt; with the following content. The key point is the &lt;code&gt;{{ ref() }}&lt;/code&gt; notation, which references existing models by file name; this builds a dependency graph that ensures this model runs only after the referenced models have completed.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;customers&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="k"&gt;ref&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'stg_customers'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;

&lt;span class="p"&gt;),&lt;/span&gt;

&lt;span class="n"&gt;orders&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="k"&gt;ref&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'stg_orders'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;

&lt;span class="p"&gt;),&lt;/span&gt;

&lt;span class="n"&gt;customer_orders&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

        &lt;span class="k"&gt;min&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;order_date&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;first_order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;order_date&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;most_recent_order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;count&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;order_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;number_of_orders&lt;/span&gt;

    &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;orders&lt;/span&gt;

    &lt;span class="k"&gt;group&lt;/span&gt; &lt;span class="k"&gt;by&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

&lt;span class="p"&gt;),&lt;/span&gt;

&lt;span class="k"&gt;final&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;

    &lt;span class="k"&gt;select&lt;/span&gt;
        &lt;span class="n"&gt;customers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;customers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;first_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;customers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;last_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;customer_orders&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;first_order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;customer_orders&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;most_recent_order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;coalesce&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;customer_orders&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;number_of_orders&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;number_of_orders&lt;/span&gt;

    &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;customers&lt;/span&gt;

    &lt;span class="k"&gt;left&lt;/span&gt; &lt;span class="k"&gt;join&lt;/span&gt; &lt;span class="n"&gt;customer_orders&lt;/span&gt; &lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="k"&gt;final&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To view these model dependencies in the Workspace, run the &lt;code&gt;dbt compile&lt;/code&gt; command, which compiles each model's SQL.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1i37ch01an493sgat605.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1i37ch01an493sgat605.png" alt="2025-08-13_10h52_07" width="569" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After this, click &lt;code&gt;DAG&lt;/code&gt; in the bottom right of the Workspace to see the lineage showing the dependencies.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff6w7ymypjak5hvzprnou.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff6w7ymypjak5hvzprnou.png" alt="2025-08-13_10h53_14" width="800" height="492"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally for this chapter, let's run &lt;code&gt;dbt run&lt;/code&gt; to output to the development schema. Select &lt;code&gt;Run&lt;/code&gt; from the command selection button in the upper right and press the play button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo7nzm2mup8y0zxv1pvy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo7nzm2mup8y0zxv1pvy.png" alt="2025-08-13_09h48_18" width="800" height="530"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once processing is complete, check the &lt;code&gt;dbt_ssagara&lt;/code&gt; schema from the &lt;code&gt;Object Explorer&lt;/code&gt; in the lower left of the Workspace. You can see that the two newly created &lt;code&gt;.sql&lt;/code&gt; files have been output as tables.&lt;/p&gt;
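&lt;p&gt;If you prefer SQL over the Object Explorer, you can check the same thing from a worksheet (schema names are the ones used in this tutorial):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- list the objects dbt materialized in the development schema
show tables in schema analytics.dbt_ssagara;
show views in schema analytics.dbt_ssagara;
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;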

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fha7x0ajbwmiy6saezqec.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fha7x0ajbwmiy6saezqec.png" alt="2025-08-13_10h55_51" width="800" height="350"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  11. Build models on top of sources
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=11" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=11&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, we'll configure "sources", which define the source data for dbt transformations. Configuring sources lets you display source data in the lineage and run tests on the source data before performing dbt transformations (covered in Chapter 12).&lt;/p&gt;

&lt;p&gt;First, create &lt;code&gt;sources.yml&lt;/code&gt; under the &lt;code&gt;models&lt;/code&gt; folder with the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;2&lt;/span&gt;

&lt;span class="na"&gt;sources&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;jaffle_shop&lt;/span&gt;
      &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;This is a replica of the Postgres database used by our app&lt;/span&gt;
      &lt;span class="na"&gt;database&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;raw&lt;/span&gt;
      &lt;span class="na"&gt;schema&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;jaffle_shop&lt;/span&gt;
      &lt;span class="na"&gt;tables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;customers&lt;/span&gt;
            &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;One record per customer.&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;orders&lt;/span&gt;
            &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;One record per order. Includes cancelled and deleted orders.&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, modify the two &lt;code&gt;.sql&lt;/code&gt; files we created earlier for the Staging layer to reference sources. When referencing sources, use notation like &lt;code&gt;{{ source('jaffle_shop', 'customers') }}&lt;/code&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;stg_customers.sql&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt;
    &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;first_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;last_name&lt;/span&gt;

&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="k"&gt;source&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'jaffle_shop'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'customers'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;stg_orders.sql&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt;
    &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;order_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;user_id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;customer_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;order_date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;status&lt;/span&gt;

&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="p"&gt;{{&lt;/span&gt; &lt;span class="k"&gt;source&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'jaffle_shop'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'orders'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;}}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To view the dependencies including sources in the Workspace, run the &lt;code&gt;dbt compile&lt;/code&gt; command to compile each SQL file.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1i37ch01an493sgat605.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1i37ch01an493sgat605.png" alt="2025-08-13_10h52_07" width="569" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After this, click &lt;code&gt;DAG&lt;/code&gt; in the bottom right of the Workspace to see the lineage including sources.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F847osq7pknz8lgrz48zl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F847osq7pknz8lgrz48zl.png" alt="2025-08-13_11h07_49" width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Also, by running &lt;code&gt;dbt compile&lt;/code&gt;, you can check the compiled &lt;code&gt;.sql&lt;/code&gt; files in the &lt;code&gt;compiled&lt;/code&gt; folder within the &lt;code&gt;target&lt;/code&gt; folder. As shown below, you can see that the source references have been replaced with the appropriate table names.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3yxh0a8lqrmheobpumcv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3yxh0a8lqrmheobpumcv.png" alt="2025-08-13_11h10_48" width="800" height="358"&gt;&lt;/a&gt;&lt;/p&gt;
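&lt;p&gt;For example, the compiled &lt;code&gt;stg_customers.sql&lt;/code&gt; should look roughly like this, with the &lt;code&gt;{{ source() }}&lt;/code&gt; call resolved to the fully qualified table name:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;select
    id as customer_id,
    first_name,
    last_name

from raw.jaffle_shop.customers
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;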

&lt;h2&gt;
  
  
  12. Add tests to your models
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=12" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=12&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, we'll try data testing, one of dbt's signature features.&lt;/p&gt;

&lt;p&gt;Data tests are defined in YAML files, so create &lt;code&gt;schema.yml&lt;/code&gt; in the &lt;code&gt;models&lt;/code&gt; folder with the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;2&lt;/span&gt;

&lt;span class="na"&gt;models&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;customers&lt;/span&gt;
    &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;One record per customer&lt;/span&gt;
    &lt;span class="na"&gt;columns&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;customer_id&lt;/span&gt;
        &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Primary key&lt;/span&gt;
        &lt;span class="na"&gt;data_tests&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;unique&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;not_null&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;first_order_date&lt;/span&gt;
        &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;NULL when a customer has not yet placed an order.&lt;/span&gt;

  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;stg_customers&lt;/span&gt;
    &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;This model cleans up customer data&lt;/span&gt;
    &lt;span class="na"&gt;columns&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;customer_id&lt;/span&gt;
        &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Primary key&lt;/span&gt;
        &lt;span class="na"&gt;data_tests&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;unique&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;not_null&lt;/span&gt;

  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;stg_orders&lt;/span&gt;
    &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;This model cleans up order data&lt;/span&gt;
    &lt;span class="na"&gt;columns&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;order_id&lt;/span&gt;
        &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Primary key&lt;/span&gt;
        &lt;span class="na"&gt;data_tests&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;unique&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;not_null&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;status&lt;/span&gt;
        &lt;span class="na"&gt;data_tests&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;accepted_values&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
              &lt;span class="na"&gt;values&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;placed'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;shipped'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;completed'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;return_pending'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;returned'&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;customer_id&lt;/span&gt;
        &lt;span class="na"&gt;data_tests&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;not_null&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;relationships&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
              &lt;span class="na"&gt;to&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ref('stg_customers')&lt;/span&gt;
              &lt;span class="na"&gt;field&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;customer_id&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the above &lt;code&gt;schema.yml&lt;/code&gt;, the key part of the data test definitions is the &lt;code&gt;data_tests&lt;/code&gt; section, where you define the tests each column needs using dbt's four built-in generic test types:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;unique

&lt;ul&gt;
&lt;li&gt;Tests whether the column values are unique across all records&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;not_null

&lt;ul&gt;
&lt;li&gt;Tests whether the column values are not null across all records&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;accepted_values

&lt;ul&gt;
&lt;li&gt;Tests whether every value in the column is one of the values specified in the list&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;relationships

&lt;ul&gt;
&lt;li&gt;Tests referential integrity: every value in the column must also exist in a specified column of another model&lt;/li&gt;
&lt;li&gt;In the above code, it confirms that every &lt;code&gt;customer_id&lt;/code&gt; in &lt;code&gt;stg_orders&lt;/code&gt; also exists in the &lt;code&gt;customer_id&lt;/code&gt; column of &lt;code&gt;stg_customers&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
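&lt;p&gt;Under the hood, each generic test compiles to a query that returns failing rows, and the test passes when that query returns zero rows. As a sketch, the &lt;code&gt;unique&lt;/code&gt; test on &lt;code&gt;stg_customers.customer_id&lt;/code&gt; compiles to something like the following (the exact SQL varies by dbt version; the schema name is the one used in this tutorial):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;-- any row returned here is a duplicate, i.e. a test failure
select
    customer_id,
    count(*) as n_records

from analytics.dbt_ssagara.stg_customers
where customer_id is not null
group by customer_id
having count(*) &gt; 1
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;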

&lt;p&gt;After creating &lt;code&gt;schema.yml&lt;/code&gt;, run &lt;code&gt;dbt test&lt;/code&gt; to actually perform the data tests.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjvnr48rfvo6b9s81j6r2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjvnr48rfvo6b9s81j6r2.png" alt="2025-08-13_11h23_08" width="800" height="384"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After execution completes, check the logs in the bottom right of the Workspace. Each test that shows &lt;code&gt;PASS&lt;/code&gt; has passed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6rzhhvv1v98vtp174uhs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6rzhhvv1v98vtp174uhs.png" alt="2025-08-13_11h25_25" width="800" height="1064"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  13. Document your models
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=13" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=13&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This chapter covers trying dbt's documentation generation feature. However, as of August 13, 2025, dbt Projects on Snowflake cannot execute &lt;code&gt;dbt docs generate&lt;/code&gt; as a standard feature, so we'll skip this.&lt;/p&gt;

&lt;h2&gt;
  
  
  14. Commit your changes
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=14" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=14&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This chapter covers Git commit and push, assuming the use of dbt Cloud's Managed Repository. However, since we're excluding Git integration for this tutorial, we'll skip this chapter.&lt;/p&gt;

&lt;p&gt;For reference, once again, you can learn about Git integration in dbt Projects on Snowflake in the following blog post:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.classmethod.jp/articles/try-dbt-projects-on-snowflake-with-github/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/try-dbt-projects-on-snowflake-with-github/&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  15. Deploy dbt
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/guides/snowflake?step=15" rel="noopener noreferrer"&gt;https://docs.getdbt.com/guides/snowflake?step=15&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this chapter, we'll deploy the developed content to the production environment. Note that dbt Projects on Snowflake requires completely different operations from dbt Cloud.&lt;/p&gt;

&lt;h3&gt;
  
  
  Adding production environment information to &lt;code&gt;profiles.yml&lt;/code&gt;
&lt;/h3&gt;

&lt;p&gt;First, add the connection information for the production output schema to &lt;code&gt;profiles.yml&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Open &lt;code&gt;profiles.yml&lt;/code&gt; and modify it as follows. The key point is adding a new target called &lt;code&gt;prod&lt;/code&gt; with &lt;code&gt;schema: PROD&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;dbt_quickstart&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;target&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dev&lt;/span&gt;
  &lt;span class="na"&gt;outputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;dev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;snowflake&lt;/span&gt;
      &lt;span class="na"&gt;role&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ACCOUNTADMIN&lt;/span&gt;
      &lt;span class="na"&gt;warehouse&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;TRANSFORMING&lt;/span&gt;
      &lt;span class="na"&gt;database&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ANALYTICS&lt;/span&gt;
      &lt;span class="na"&gt;schema&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DBT_SSAGARA&lt;/span&gt;
      &lt;span class="na"&gt;account&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;'&lt;/span&gt;
      &lt;span class="na"&gt;user&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;'&lt;/span&gt;
    &lt;span class="na"&gt;prod&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;snowflake&lt;/span&gt;
      &lt;span class="na"&gt;role&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ACCOUNTADMIN&lt;/span&gt;
      &lt;span class="na"&gt;warehouse&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;TRANSFORMING&lt;/span&gt;
      &lt;span class="na"&gt;database&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ANALYTICS&lt;/span&gt;
      &lt;span class="na"&gt;schema&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;PROD&lt;/span&gt;
      &lt;span class="na"&gt;account&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;'&lt;/span&gt;
      &lt;span class="na"&gt;user&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
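&lt;p&gt;With the &lt;code&gt;prod&lt;/code&gt; target in place, the same project can be pointed at either environment from the command line; since &lt;code&gt;target: dev&lt;/code&gt; is the default, a sketch would be:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# runs against the dev target (schema DBT_SSAGARA) by default
dbt run

# runs the same models against the production target (schema PROD)
dbt run --target prod
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;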



&lt;h3&gt;
  
  
  Deploying the dbt Project to the specified schema
&lt;/h3&gt;

&lt;p&gt;dbt Projects on Snowflake can be scheduled to run as tasks, but this requires deploying the dbt Project to a specified schema.&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Connect&lt;/code&gt; in the upper right of the Workspace, then click &lt;code&gt;Deploy dbt project&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4h9165ggfnx0sapce80.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4h9165ggfnx0sapce80.png" alt="2025-08-13_11h41_26" width="490" height="264"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Enter the following information and click &lt;code&gt;Deploy&lt;/code&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;Select location&lt;/code&gt;

&lt;ul&gt;
&lt;li&gt;Database: ANALYTICS&lt;/li&gt;
&lt;li&gt;Schema: PROD *Note: We're using the same schema as the data output destination, but a different schema is also fine.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Select or create dbt project

&lt;ul&gt;
&lt;li&gt;Click &lt;code&gt;+ Create dbt project&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;Enter Name&lt;/code&gt;: dbt_quickstart&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2zbd59ugnro7yrc43d62.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2zbd59ugnro7yrc43d62.png" alt="2025-08-13_11h43_30" width="800" height="535"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fotxcikv8o4io2oiscnn5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fotxcikv8o4io2oiscnn5.png" alt="2025-08-13_11h45_11" width="800" height="682"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If deployment is successful, it will display as shown below. Looking at the &lt;code&gt;prod&lt;/code&gt; schema, you can see that the dbt Project has been deployed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxmkzi65yvzb41mewkl32.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxmkzi65yvzb41mewkl32.png" alt="2025-08-13_11h46_10" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq8pitvm21if9yon13xnz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq8pitvm21if9yon13xnz.png" alt="2025-08-13_11h47_42" width="640" height="707"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, go to a worksheet and run the following query to define a task that uses the dbt Project. The key point is &lt;code&gt;args='build --target prod'&lt;/code&gt;: &lt;code&gt;build&lt;/code&gt; executes the &lt;code&gt;dbt build&lt;/code&gt; command, which runs &lt;code&gt;dbt run&lt;/code&gt; and &lt;code&gt;dbt test&lt;/code&gt; for each model in dependency order, and &lt;code&gt;--target prod&lt;/code&gt; runs the dbt command against the production target &lt;code&gt;prod&lt;/code&gt; we created earlier.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;use&lt;/span&gt; &lt;span class="k"&gt;role&lt;/span&gt; &lt;span class="n"&gt;accountadmin&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;create&lt;/span&gt; &lt;span class="k"&gt;or&lt;/span&gt; &lt;span class="k"&gt;alter&lt;/span&gt; &lt;span class="n"&gt;task&lt;/span&gt; &lt;span class="n"&gt;analytics&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prod&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dbt_execute&lt;/span&gt;
  &lt;span class="n"&gt;warehouse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;transforming&lt;/span&gt;
  &lt;span class="n"&gt;schedule&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'using cron 0 9 1 1 * Asia/Tokyo'&lt;/span&gt; &lt;span class="c1"&gt;-- Runs at 9 AM on January 1st every year&lt;/span&gt;
&lt;span class="k"&gt;as&lt;/span&gt;
&lt;span class="k"&gt;execute&lt;/span&gt; &lt;span class="n"&gt;dbt&lt;/span&gt; &lt;span class="n"&gt;project&lt;/span&gt; &lt;span class="n"&gt;analytics&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prod&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dbt_quickstart&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'build --target prod'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After this, run the following command to manually execute the task:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;execute&lt;/span&gt; &lt;span class="n"&gt;task&lt;/span&gt; &lt;span class="n"&gt;analytics&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prod&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dbt_execute&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For reference, you can check the execution status of the dbt project itself by clicking &lt;code&gt;dbt projects&lt;/code&gt; from &lt;code&gt;Monitoring&lt;/code&gt; on the left side of the screen.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhyzvbbcqo0zergr9oduz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhyzvbbcqo0zergr9oduz.png" alt="2025-08-13_11h57_07" width="404" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflpnn06cmma70imr1swe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fflpnn06cmma70imr1swe.png" alt="2025-08-13_11h57_47" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After confirming the task succeeded, look inside the &lt;code&gt;prod&lt;/code&gt; schema: the tables have been successfully created!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmesmqlc9xe7fvsz363f8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmesmqlc9xe7fvsz363f8.png" alt="2025-08-13_11h59_39" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I tried running the "Quickstart for dbt and Snowflake" tutorial with dbt Projects on Snowflake.&lt;/p&gt;

&lt;p&gt;I think this is perfect content for learning about dbt itself while trying out dbt Projects on Snowflake! Please give it a try.&lt;/p&gt;

</description>
      <category>snowflake</category>
      <category>dbt</category>
      <category>dataengineering</category>
    </item>
    <item>
      <title>Personal Picks: Data Product News (July 23, 2025)</title>
      <dc:creator>Sagara</dc:creator>
      <pubDate>Wed, 23 Jul 2025 05:04:05 +0000</pubDate>
      <link>https://forem.com/sagara/personal-picks-data-product-news-july-23-2025-6kf</link>
      <guid>https://forem.com/sagara/personal-picks-data-product-news-july-23-2025-6kf</guid>
      <description>&lt;p&gt;※This is an English translation of the following article:&lt;br&gt;
&lt;a href="https://dev.classmethod.jp/articles/modern-data-stack-info-summary-20250723/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/modern-data-stack-info-summary-20250723/&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;Hi, this is Sagara.&lt;/p&gt;

&lt;p&gt;As a consultant in the Modern Data Stack field, I see a vast amount of information being released every day.&lt;/p&gt;

&lt;p&gt;With so much happening, I'd like to use this article to summarize the Modern Data Stack-related news that has caught my eye over the past couple of weeks.&lt;/p&gt;

&lt;p&gt;*Disclaimer: This is not an exhaustive list of all product updates. It only includes information that I found interesting based on &lt;strong&gt;my personal judgment and bias&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  General Modern Data Stack
&lt;/h2&gt;

&lt;h3&gt;
  
  
  "Data Engineering Study #30 - Celebrating 30 Sessions! A Look Back and Forward at Data Engineering Tech and Careers with Past Speakers" was held.
&lt;/h3&gt;

&lt;p&gt;Data Engineering Study #30 has taken place. To mark the 30th milestone, 10 past speakers returned for 5-minute lightning talks on what has happened since the technical initiatives and career stories they previously presented.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://forkwell.connpass.com/event/357942/" rel="noopener noreferrer"&gt;https://forkwell.connpass.com/event/357942/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For more details on Data Engineering Study #30, the following event report is very helpful. Please check it out as well.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://zenn.dev/shinyaa31/articles/aaadbefa457197" rel="noopener noreferrer"&gt;https://zenn.dev/shinyaa31/articles/aaadbefa457197&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Also, &lt;a href="https://x.com/yuzutas0" rel="noopener noreferrer"&gt;Yuzutaso&lt;/a&gt;, who has been the advisor for Data Engineering Study, has stepped down after this #30 event, and a new advisor team of the following three members has been formed. &lt;strong&gt;I, Sagara, have also become a member of this new advisor team!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://x.com/syou6162" rel="noopener noreferrer"&gt;Yasuhisa Yoshida&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://x.com/ikki_mz" rel="noopener noreferrer"&gt;ikki&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://x.com/SS_chneider" rel="noopener noreferrer"&gt;Sagara&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As someone who has learned a lot about the technology and careers in the data engineering world by watching Data Engineering Study, I will do my best to make it even more exciting than before!&lt;/p&gt;

&lt;p&gt;The next event, #31, will feature lightning talks and a public planning meeting by the three members of the advisor team. Please join us!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://forkwell.connpass.com/event/363198/" rel="noopener noreferrer"&gt;https://forkwell.connpass.com/event/363198/&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Summer Data Engineering Roadmap
&lt;/h3&gt;

&lt;p&gt;MotherDuck's blog has published an article outlining a roadmap for how to learn data engineering.&lt;/p&gt;

&lt;p&gt;It's broken down into three levels—Foundation, Core, and Advanced—making it an easy-to-understand guide on where to start.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://motherduck.com/blog/summer-data-engineering-roadmap/" rel="noopener noreferrer"&gt;https://motherduck.com/blog/summer-data-engineering-roadmap/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Additionally, MotherDuck's blog has previously published articles summarizing which tools are useful for data engineering. These are also systematically organized and serve as a great reference.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://motherduck.com/blog/data-engineering-toolkit-essential-tools/" rel="noopener noreferrer"&gt;https://motherduck.com/blog/data-engineering-toolkit-essential-tools/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://motherduck.com/blog/data-engineering-toolkit-infrastructure-devops/" rel="noopener noreferrer"&gt;https://motherduck.com/blog/data-engineering-toolkit-infrastructure-devops/&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Databricks vs. Snowflake: The Final Chapter
&lt;/h3&gt;

&lt;p&gt;A blog post from Orchestra explains their perspective that the rivalry between Databricks and Snowflake is over, and both companies are moving into new markets with different strategies (developer-focused vs. business-focused).&lt;/p&gt;

&lt;p&gt;What follows is only a brief summary of the article, but it outlines these predictions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Databricks → To a Developer Platform&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Leveraging its strong developer community, it might aim to provide the best development environment on top of hyperscalers like Azure.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Snowflake → To the Business App Market&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Utilizing its customer base of business users, it might aim to become a "composable" application platform that could replace giant SaaS products like Salesforce.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://www.getorchestra.io/blog/databricks-vs-snowflake-the-final-chapter" rel="noopener noreferrer"&gt;https://www.getorchestra.io/blog/databricks-vs-snowflake-the-final-chapter&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Data Extract/Load
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Fivetran
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Fivetran's Comparison Article with Airbyte
&lt;/h4&gt;

&lt;p&gt;Fivetran's official blog has published an article comparing their service with Airbyte.&lt;/p&gt;

&lt;p&gt;It's important to consider that this article is "written by Fivetran," and I felt some of the descriptions of Airbyte were a bit dated. (For example, there was no mention of Airbyte's &lt;a href="https://docs.airbyte.com/platform/using-airbyte/schema-change-management" rel="noopener noreferrer"&gt;Schema Change Management&lt;/a&gt; or &lt;a href="https://docs.airbyte.com/platform/connector-development/connector-builder-ui/overview" rel="noopener noreferrer"&gt;Connector Builder&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;However, it's still a useful reference for a rough understanding of the differences between the two companies.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.fivetran.com/blog/fivetran-vs-airbyte-features-pricing-services-and-more" rel="noopener noreferrer"&gt;https://www.fivetran.com/blog/fivetran-vs-airbyte-features-pricing-services-and-more&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Data Warehouse/Data Lakehouse
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Snowflake
&lt;/h3&gt;

&lt;h4&gt;
  
  
  "QUERY_INSIGHTS" View Released, Providing Analysis and Improvement Suggestions for Queries Executed in Snowflake
&lt;/h4&gt;

&lt;p&gt;A new ACCOUNT_USAGE view, &lt;code&gt;QUERY_INSIGHTS&lt;/code&gt;, has been released. It automatically analyzes the execution of queries within Snowflake and stores the results, identifying areas that may be impacting performance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.snowflake.com/en/release-notes/2025/other/2025-07-03-query-insights" rel="noopener noreferrer"&gt;https://docs.snowflake.com/en/release-notes/2025/other/2025-07-03-query-insights&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I tried it out myself, and I found it excellent that Snowflake handles the detection automatically. All the user has to do is check the &lt;code&gt;QUERY_INSIGHTS&lt;/code&gt; view, review the findings, and take action. It simplifies the process greatly!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.classmethod.jp/articles/snowflake-query-insights-view/" rel="noopener noreferrer"&gt;https://dev.classmethod.jp/articles/snowflake-query-insights-view/&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Official Snowflake MCP Server with Cortex AI Functionality Released
&lt;/h4&gt;

&lt;p&gt;Snowflake has officially released an MCP Server that supports Cortex AI features. It currently supports Cortex Search and Cortex Analyst.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/Snowflake-Labs/mcp" rel="noopener noreferrer"&gt;https://github.com/Snowflake-Labs/mcp&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;An official blog post about this release also mentioned that Snowflake-managed MCP servers, where Snowflake manages the infrastructure, are planned for a future release.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.snowflake.com/en/blog/mcp-servers-unify-extend-data-agents/" rel="noopener noreferrer"&gt;https://www.snowflake.com/en/blog/mcp-servers-unify-extend-data-agents/&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  BigQuery
&lt;/h3&gt;

&lt;h4&gt;
  
  
  "source_column_match" and "null_markers NULL" Options Added to CREATE EXTERNAL TABLE and LOAD DATA
&lt;/h4&gt;

&lt;p&gt;BigQuery has added &lt;code&gt;source_column_match&lt;/code&gt; and &lt;code&gt;null_markers NULL&lt;/code&gt; options to &lt;code&gt;CREATE EXTERNAL TABLE&lt;/code&gt; and &lt;code&gt;LOAD DATA&lt;/code&gt;, which are used for querying and loading data from external storage.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/bigquery/docs/release-notes#July_22_2025" rel="noopener noreferrer"&gt;https://cloud.google.com/bigquery/docs/release-notes#July_22_2025&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Databricks
&lt;/h3&gt;

&lt;h4&gt;
  
  
  RSS Feed for Databricks Release Notes
&lt;/h4&gt;

&lt;p&gt;Databricks has started providing an RSS feed for its release notes, which includes the latest product information and other feature release notes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.google.com/search?q=https://docs.databricks.com/aws/en/release-notes/%23feed" rel="noopener noreferrer"&gt;https://docs.databricks.com/aws/en/release-notes/#feed&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Recursive CTEs are in Public Preview in Databricks
&lt;/h4&gt;

&lt;p&gt;As a new feature in Databricks, Recursive CTEs are now in public preview.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.databricks.com/blog/introducing-recursive-common-table-expressions-databricks" rel="noopener noreferrer"&gt;https://www.databricks.com/blog/introducing-recursive-common-table-expressions-databricks&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  MotherDuck/DuckDB
&lt;/h3&gt;

&lt;h4&gt;
  
  
  MotherDuck Announces New "Mega" and "Giga" Instances for Large, Complex Data Processing
&lt;/h4&gt;

&lt;p&gt;To meet the demand for more intensive data processing that exceeds the capabilities of the existing "Jumbo" ducklings, MotherDuck has announced two new, larger instance sizes: "Mega" and "Giga."&lt;/p&gt;

&lt;p&gt;According to the article, Mega is designed for large-scale workloads, while Giga is intended for extremely complex, massive transformation jobs for which no other tool is a viable alternative.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://motherduck.com/blog/announcing-mega-giga-instance-sizes-huge-scale/" rel="noopener noreferrer"&gt;https://motherduck.com/blog/announcing-mega-giga-instance-sizes-huge-scale/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For a list of instance types offered by MotherDuck, please see the official documentation below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://motherduck.com/docs/about-motherduck/billing/instances/" rel="noopener noreferrer"&gt;https://motherduck.com/docs/about-motherduck/billing/instances/&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Data Transform
&lt;/h2&gt;

&lt;h3&gt;
  
  
  dbt
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Build Iceberg Tables via BigLake Metastore in dbt-bigquery
&lt;/h4&gt;

&lt;p&gt;Starting with the dbt-bigquery 1.10 release, it is now possible to build Iceberg tables via BigLake Metastore.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.getdbt.com/blog/dbt-supports-apache-iceberg-tables-bigquery" rel="noopener noreferrer"&gt;https://www.getdbt.com/blog/dbt-supports-apache-iceberg-tables-bigquery&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.getdbt.com/docs/mesh/iceberg/bigquery-iceberg-support" rel="noopener noreferrer"&gt;https://docs.getdbt.com/docs/mesh/iceberg/bigquery-iceberg-support&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Although it's still a preview feature, BigLake Metastore also provides Apache Iceberg REST Catalog functionality. This means it's now theoretically possible to "build an Iceberg table using dbt with BigLake Metastore as the catalog, and then query that Iceberg table from an external engine." (I'd love to try this out sometime...)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/bigquery/docs/blms-rest-catalog?hl=en" rel="noopener noreferrer"&gt;https://cloud.google.com/bigquery/docs/blms-rest-catalog?hl=en&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Business Intelligence
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Tableau
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Analyzing Pro Wrestling Match Videos with Generative AI on Vertex AI and Tableau
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://zenn.dev/cavernaria" rel="noopener noreferrer"&gt;rtama&lt;/a&gt; published an article on converting pro wrestling match videos into structured data and analyzing it with Tableau.&lt;/p&gt;

&lt;p&gt;This article is an excellent reference for understanding what kind of prompts can be used to convert video into structured data suitable for analysis. (The choice of pro wrestling as a subject is also brilliant!)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://zenn.dev/cavernaria/articles/ec04775eec5c4b" rel="noopener noreferrer"&gt;https://zenn.dev/cavernaria/articles/ec04775eec5c4b&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Lightdash
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Lightdash Now Supports dbt Fusion and dbt 1.10
&lt;/h4&gt;

&lt;p&gt;Lightdash has announced support for dbt Fusion and dbt 1.10.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://changelog.lightdash.com/we-now-support-dbt-fusion-and-dbt-1-10-projects-319440" rel="noopener noreferrer"&gt;https://changelog.lightdash.com/we-now-support-dbt-fusion-and-dbt-1-10-projects-319440&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One key point to note is that from dbt 1.10 onwards, &lt;code&gt;meta:&lt;/code&gt; must be defined under &lt;code&gt;config:&lt;/code&gt;. For Lightdash users, migrating this part will be a significant hurdle.&lt;/p&gt;

&lt;p&gt;Recognizing this challenge, Lightdash has released a Migration Guide and a migration tool called &lt;a href="https://github.com/lightdash/metamove" rel="noopener noreferrer"&gt;MetaMove&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.lightdash.com/dbt-guides/dbt-1.10-migration" rel="noopener noreferrer"&gt;https://docs.lightdash.com/dbt-guides/dbt-1.10-migration&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.lightdash.com/dbt-guides/dbt-fusion-migration" rel="noopener noreferrer"&gt;https://docs.lightdash.com/dbt-guides/dbt-fusion-migration&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Omni
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Official Omni MCP Server Released
&lt;/h4&gt;

&lt;p&gt;Omni has released its official MCP Server. This enhancement expands the potential of Omni, as it allows for connecting the semantic models built in Omni with other tools to obtain more accurate answers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://omni.co/blog/introducing-omnis-mcp-server" rel="noopener noreferrer"&gt;https://omni.co/blog/introducing-omnis-mcp-server&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.omni.co/docs/ai/mcp" rel="noopener noreferrer"&gt;https://docs.omni.co/docs/ai/mcp&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Data Catalog
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Select Star
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Select Star Releases MCP Server
&lt;/h4&gt;

&lt;p&gt;Select Star has released a new managed MCP Server. It uses a Select Star API token for authentication.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.selectstar.com/resources/data-mcp-for-ai-agents" rel="noopener noreferrer"&gt;https://www.selectstar.com/resources/data-mcp-for-ai-agents&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.selectstar.com/features/mcp-server" rel="noopener noreferrer"&gt;https://docs.selectstar.com/features/mcp-server&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  OpenMetadata
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Collate Raises $10M in Series A Funding
&lt;/h4&gt;

&lt;p&gt;Collate, the company that supports the OpenMetadata project and offers a SaaS version, has raised $10 million in Series A funding.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.open-metadata.org/collate-raises-series-a-funding-to-accelerate-openmetadata-growth-4c859f0b9813" rel="noopener noreferrer"&gt;https://blog.open-metadata.org/collate-raises-series-a-funding-to-accelerate-openmetadata-growth-4c859f0b9813&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Data Activation (Reverse ETL)
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Census (A Fivetran Company)
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Census Releases "Mesh Datasets" for Cross-Warehouse Queries
&lt;/h4&gt;

&lt;p&gt;Census has released a new feature called "Mesh Datasets," which allows users to execute queries across multiple data warehouses.&lt;/p&gt;

&lt;p&gt;It seems you just need to write queries using PostgreSQL syntax.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.getcensus.com/blog/break-down-data-silos-with-cross-warehouse-sql-introducing-mesh-datasets" rel="noopener noreferrer"&gt;https://www.getcensus.com/blog/break-down-data-silos-with-cross-warehouse-sql-introducing-mesh-datasets&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.getcensus.com/datasets/overview/mesh-datasets" rel="noopener noreferrer"&gt;https://docs.getcensus.com/datasets/overview/mesh-datasets&lt;/a&gt;&lt;/p&gt;

</description>
      <category>dataengineering</category>
      <category>snowflake</category>
      <category>dbt</category>
      <category>iceberg</category>
    </item>
  </channel>
</rss>
