What are Snowflake Tasks?
Snowflake Tasks are user-defined objects in Snowflake that automate repetitive SQL operations. They let you schedule SQL statements to run at defined intervals and chain tasks together into dependent workflows. Snowflake Tasks help streamline data ingestion, transformation, and monitoring by offering an efficient and flexible mechanism for managing these operations.
Source: Snowflake Tasks Documentation
What are the use cases for Snowflake Tasks?
There are numerous use cases for Snowflake Tasks, including:
Scheduled queries
Snowflake Tasks can be employed to automate the execution of recurring SQL queries. By scheduling queries, you can ensure that data is consistently refreshed and that analytics are always up-to-date. These scheduled queries may be employed for various purposes, such as data validation, monitoring, and generating reports or dashboards.
Transformations
Data transformations are a critical aspect of any data pipeline. Snowflake Tasks can facilitate these transformations by allowing the scheduling of SQL statements that manipulate and transform raw data into a consumable format. This enables users to maintain clean and structured data while reducing manual intervention.
Change data capture
In combination with Snowflake Streams, Tasks can be used for change data capture (CDC). This process involves monitoring changes to your source data and applying those changes to a target system. Snowflake Tasks and Streams make CDC efficient and reliable, allowing you to keep your data synchronized across multiple systems.
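A minimal sketch of the pattern, assuming a hypothetical source table `orders`, a target table `orders_replica`, and a warehouse `task_wh` (created in the example later in this post); all names are illustrative:

```sql
-- Capture row-level changes on the source table.
CREATE STREAM IF NOT EXISTS orders_stream ON TABLE orders;

-- Apply pending changes every 5 minutes, but only when the stream has data.
CREATE TASK IF NOT EXISTS apply_orders_changes
  WAREHOUSE = task_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
AS
  INSERT INTO orders_replica
  SELECT order_id, order_amount, order_ts
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';
```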
Source: Snowflake Streams and Tasks for CDC
Snowflake Tasks vs. Airflow: Key Differences
While Snowflake Tasks and Apache Airflow both provide automation and scheduling capabilities, there are key differences between the two:
- Infrastructure: Snowflake Tasks run natively inside Snowflake with no additional infrastructure, while Airflow requires deploying and operating a separate scheduler and workers.
- Scope: Tasks execute SQL and stored procedures against Snowflake; Airflow is a general-purpose, Python-based orchestrator that can coordinate work across many systems.
- Extensibility and monitoring: Airflow offers more extensibility and richer monitoring and alerting features, whereas Snowflake Tasks favor simplicity and native integration.
Snowflake Task example: Creating a Task for a Scheduled Query
To create a Snowflake Task for a scheduled query, follow the steps below:
1. Create a warehouse for the task:
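A minimal sketch; the warehouse name `task_wh`, size, and auto-suspend settings are illustrative:

```sql
-- A small, dedicated warehouse that suspends when idle to save credits.
CREATE WAREHOUSE IF NOT EXISTS task_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;
```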
2. Create the scheduled SQL statement:
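For illustration, assume a hypothetical `raw_orders` table feeding a `daily_order_summary` table; this is the statement the task will run:

```sql
-- Refresh the summary with yesterday's orders.
INSERT INTO daily_order_summary (order_date, total_orders, total_revenue)
SELECT
  DATE_TRUNC('DAY', order_ts) AS order_date,
  COUNT(*)                    AS total_orders,
  SUM(order_amount)           AS total_revenue
FROM raw_orders
WHERE order_ts >= DATEADD('DAY', -1, CURRENT_DATE())
GROUP BY 1;
```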
3. Create the Snowflake Task:
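A sketch that wraps the statement above in a task; the task name and schedule are illustrative:

```sql
CREATE TASK IF NOT EXISTS daily_summary_task
  WAREHOUSE = task_wh
  SCHEDULE = 'USING CRON 0 6 * * * UTC'  -- every day at 06:00 UTC
AS
  INSERT INTO daily_order_summary (order_date, total_orders, total_revenue)
  SELECT
    DATE_TRUNC('DAY', order_ts),
    COUNT(*),
    SUM(order_amount)
  FROM raw_orders
  WHERE order_ts >= DATEADD('DAY', -1, CURRENT_DATE())
  GROUP BY 1;
```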
4. Describe the task you just created. You will notice that it is created in a suspended state.
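```sql
DESCRIBE TASK daily_summary_task;
-- The "state" column reads "suspended" for a newly created task.
```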
5. Enable the task:
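```sql
-- Tasks are created suspended; RESUME starts the schedule.
ALTER TASK daily_summary_task RESUME;
```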
Chaining Snowflake Tasks
Snowflake Tasks can be chained together to create complex data processing workflows. Chaining tasks allows you to define dependencies between tasks, ensuring that a task is executed only after its predecessor tasks have completed successfully. This feature enables you to create a directed acyclic graph (DAG) of tasks, making the execution of multi-stage workflows more manageable.
Source: Snowflake CREATE TASK Documentation
Chaining Tasks Example
Consider a scenario where you need to perform two sequential operations on your data: first, you aggregate the data at a daily level, and then you compute the moving average over a seven-day window. You can achieve this by chaining two Snowflake Tasks.
1. Create the table for storing the daily summary:
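A hypothetical schema; adapt the columns to your data:

```sql
CREATE TABLE IF NOT EXISTS daily_summary (
  summary_date DATE,
  total_amount NUMBER(38, 2)
);
```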
2. Create the table for storing the seven-day moving average:
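```sql
CREATE TABLE IF NOT EXISTS seven_day_moving_avg (
  summary_date DATE,
  moving_avg   NUMBER(38, 2)
);
```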
3. Create the first task for the daily summary:
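A sketch assuming a hypothetical `raw_events` source table; the task name and schedule are illustrative:

```sql
-- Root task: aggregates yesterday's events once a day.
CREATE TASK IF NOT EXISTS aggregate_daily_task
  WAREHOUSE = task_wh
  SCHEDULE = 'USING CRON 0 1 * * * UTC'
AS
  INSERT INTO daily_summary
  SELECT
    DATE_TRUNC('DAY', event_ts) AS summary_date,
    SUM(amount)                 AS total_amount
  FROM raw_events
  WHERE event_ts >= DATEADD('DAY', -1, CURRENT_DATE())
    AND event_ts < CURRENT_DATE()
  GROUP BY 1;
```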
4. Create the second task for the moving average, with a dependency on the first task. Note that the second task does not have a predefined schedule. It runs after the first task.
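A sketch of the dependent task; note the `AFTER` clause in place of a `SCHEDULE`:

```sql
-- Child task: runs when aggregate_daily_task completes successfully.
CREATE TASK IF NOT EXISTS moving_avg_task
  WAREHOUSE = task_wh
  AFTER aggregate_daily_task
AS
  INSERT INTO seven_day_moving_avg
  SELECT
    CURRENT_DATE() - 1 AS summary_date,
    AVG(total_amount)  AS moving_avg
  FROM daily_summary
  WHERE summary_date BETWEEN CURRENT_DATE() - 7 AND CURRENT_DATE() - 1;
```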
5. Enable the tasks:
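Resume the child task before the root task so the whole DAG is runnable:

```sql
ALTER TASK moving_avg_task RESUME;
ALTER TASK aggregate_daily_task RESUME;

-- Alternatively, resume the root task and all of its dependents in one call:
SELECT SYSTEM$TASK_DEPENDENTS_ENABLE('aggregate_daily_task');
```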
Visual DAG
In the example above, the DAG consists of just two nodes (the daily summary task and the moving average task) and a single edge representing the dependency between them. A larger DAG might have a root such as `PARENT_TASK` fanning out to `CHILD_TASK1`, `CHILD_TASK2`, and `CHILD_TASK3`, with an edge for each parent-child dependency.
The DAG can be a bit tricky to find in the Snowflake UI. Make sure you are in Snowsight (the new UI), then go to “Data”, then “Databases”, then “Tasks”.
How to debug Snowflake Tasks
Debugging Snowflake Tasks is essential to ensure that your data pipelines run smoothly and as expected. This subsection covers the process of debugging tasks.
Debugging Tasks
When a run of a Snowflake Task encounters an error, that run fails, and in a chained DAG its downstream tasks do not execute for that run. To investigate issues with tasks, you can use the following techniques:
1. Examine the task history: Snowflake maintains a history of task executions, including their status, the query ID, and the execution time. Use the `TASK_HISTORY` table function to review this information:
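For example, to review the most recent runs of the hypothetical task created earlier:

```sql
SELECT name, state, error_code, error_message, query_id, scheduled_time, completed_time
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(
  TASK_NAME => 'DAILY_SUMMARY_TASK',
  RESULT_LIMIT => 20
))
ORDER BY scheduled_time DESC;
```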
2. Inspect the query history: Once you have identified a problematic task execution from the task history, examine the corresponding query using the `QUERY_HISTORY` function to gain more insight into the issue:
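For example, using the query ID captured from the task history (placeholder shown):

```sql
SELECT query_id, query_text, error_code, error_message, start_time, total_elapsed_time
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
WHERE query_id = '<query_id_from_task_history>';
```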
3. Review the error message: The error message associated with the failed run can provide valuable information about the root cause of the issue. The `ERROR_CODE` and `ERROR_MESSAGE` columns returned by `TASK_HISTORY` surface it directly:
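For example, to list only failed runs with their error details:

```sql
SELECT name, scheduled_time, error_code, error_message
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(RESULT_LIMIT => 100))
WHERE state = 'FAILED'
ORDER BY scheduled_time DESC;
```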
In summary, debugging Snowflake Tasks is a crucial step in ensuring that your data pipelines run smoothly. By employing the techniques and examples provided in this subsection, you can effectively monitor your tasks, identify issues, and promptly address any problems that arise. In a future blog post, we’ll cover how to send failure notifications to AWS SNS.
Conclusion
Snowflake Tasks offer a powerful and flexible way to automate SQL operations, making them an essential tool for managing data pipelines. With a variety of use cases, such as scheduled queries, transformations, and change data capture, Snowflake Tasks help streamline and optimize data processing workflows. While Apache Airflow provides more extensibility and advanced monitoring features, Snowflake Tasks' native integration and simplicity make them a compelling choice for Snowflake users.
By understanding the fundamentals of Snowflake Tasks and leveraging their capabilities, you can improve the efficiency and reliability of your data pipelines, ensuring that your data is always up-to-date and ready for analysis.
Further reading
- How to build a Snowflake API
- Propel UI Kit: Data visualization and dashboard React components
- 5-Minute demo: How to expose your Snowflake data via a blazing-fast GraphQL API
If you don’t have a Propel account yet, you can try Propel for free and start building data apps.