Mastering data loading in BigQuery using Dataform

Efficient data loading is crucial for managing and updating tables in Dataform. Various strategies exist to handle different use cases, including truncate and load, appending data, and leveraging incremental tables with unique keys. This blog explores these primary methods and more:

Truncate and Load

In this method, the target table is rebuilt from scratch: all existing records are dropped and replaced with the latest query results. This approach works well when a full table refresh is necessary or when working with slowly changing data.

Implementation in Dataform:

  • Set the table type to table.

Example:

config {
  type: "table"
}

SELECT
  id,
  name,
  timestamp
FROM
  ${ref("source_table")}

By defining the table type as table, Dataform ensures that each run recreates the table, effectively performing a truncate and load operation.
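
Under the hood, the table type compiles to a full recreation of the target table in BigQuery. With placeholder project and dataset names, the generated statement is roughly equivalent to:

-- Approximate compiled output for a "table" type action (names are placeholders)
CREATE OR REPLACE TABLE `my_project.my_dataset.target_table` AS
SELECT
  id,
  name,
  timestamp
FROM
  `my_project.my_dataset.source_table`;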

Append or Insert-Only Loads

This method appends new data to an existing table while preserving historical records. It is ideal for use cases where past data must remain unchanged, and only new records are added.

Implementation in Dataform:

  • Set the table type to incremental.
  • Define an incremental condition to capture only new records.

Example:

config {
  type: "incremental"
}

SELECT
  id,
  name,
  timestamp
FROM
  ${ref("source_table")}
WHERE
  timestamp = CURRENT_DATE() - 2

This ensures that only records dated exactly two days ago (CURRENT_DATE() - 2) are appended to the target table.
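
Note that a fixed date filter like this is applied on every run, including full refreshes. If you only want the filter on incremental runs (for example, loading everything newer than what is already in the table), Dataform's when(incremental(), ...) helper can wrap the condition. A minimal sketch of that variant, assuming the same columns:

config {
  type: "incremental"
}

SELECT
  id,
  name,
  timestamp
FROM
  ${ref("source_table")}
  -- Filter only on incremental runs; a full refresh reloads everything
  ${when(incremental(), `WHERE timestamp > (SELECT MAX(timestamp) FROM ${self()})`)}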

Incremental Loads with Unique Keys

This method ensures that only one row per unique key is retained and updates existing records if changes occur. This is useful for deduplicating or updating records efficiently.

Implementation in Dataform:

  • Set the table type to incremental.
  • Define a uniqueKey so that existing records are updated with new data instead of inserting duplicates.

Example:

config {
  type: "incremental",
  uniqueKey: "id"
}

SELECT
  id,
  name,
  position,
  timestamp
FROM
  ${ref("source_table")}
WHERE
  timestamp = CURRENT_DATE() - 2

Dataform automatically handles the merge process—if a record with the same id already exists, it gets updated. If a new id appears, it is inserted as a new row.
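
Behind the scenes, this is handled with a MERGE statement against the target table. With placeholder project and dataset names, the generated SQL is roughly equivalent to:

-- Approximate merge behaviour for uniqueKey: "id" (names are placeholders)
MERGE `my_project.my_dataset.target_table` T
USING (
  SELECT id, name, position, timestamp
  FROM `my_project.my_dataset.source_table`
  WHERE timestamp = CURRENT_DATE() - 2
) S
ON T.id = S.id
WHEN MATCHED THEN
  UPDATE SET name = S.name, position = S.position, timestamp = S.timestamp
WHEN NOT MATCHED THEN
  INSERT (id, name, position, timestamp)
  VALUES (S.id, S.name, S.position, S.timestamp);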

Incremental Load with Rolling Delete

This method ensures that before inserting new data incrementally, the system deletes records from the last two days to accommodate any late-arriving updates. This is useful for ensuring data freshness while still maintaining an incremental approach.

Implementation in Dataform:

  • Set the table type to incremental.
  • Use a pre-operation delete step to remove data from the last two days before inserting new records.

Example:

config {
  type: "incremental"
}

pre_operations {
  DELETE
  FROM
    ${self()}
  WHERE
    timestamp >= CURRENT_DATE() - 2
}

SELECT
  id,
  name,
  position,
  timestamp
FROM
  ${ref("source_table")}
WHERE
  timestamp >= CURRENT_DATE() - 2

This ensures that any late-arriving updates from the last two days are reflected correctly while keeping the rest of the data intact.
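
One caveat: on the very first run the target table may not exist yet, so the DELETE in pre_operations can fail. A common safeguard is to wrap the delete in Dataform's when(incremental(), ...) helper so it only runs on incremental executions. A sketch of that variant:

config {
  type: "incremental"
}

pre_operations {
  -- Only delete on incremental runs; skipped when the table is first created
  ${when(incremental(), `DELETE FROM ${self()} WHERE timestamp >= CURRENT_DATE() - 2`)}
}

SELECT
  id,
  name,
  position,
  timestamp
FROM
  ${ref("source_table")}
WHERE
  timestamp >= CURRENT_DATE() - 2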

Choosing the Right Method in Dataform

  • Truncate and Load: when a full table refresh is needed.
  • Append or Insert-Only: when historical records must be preserved.
  • Incremental Load with Unique Keys: when deduplication and updates are required.
  • Incremental Load with Rolling Delete: when handling late-arriving updates for the last n days.

Understanding these approaches in Dataform allows you to optimise your ETL/ELT workflows and effectively manage data changes for various use cases in BigQuery and other data warehouses.

Have you used any of these methods in Dataform? Reach out to let us know, and contact us if we can help you with anything Dataform/BigQuery related! 🚀

Written by

Prasanna is a Senior Analytics Engineer at Measurelab. With a passion for building scalable and efficient data solutions, he is dedicated to making data more accessible and actionable for businesses. Recently, he has been exploring Machine Learning and AI to enhance data engineering processes and drive deeper insights. As a certified Google Cloud and AWS Data Engineer, Prasanna brings expertise in cloud-based data architectures, automation, and advanced analytics to help organizations unlock the full potential of their data.
