Changelog

dbt adapter: automated blue/green deployments

04.19.2024

ICYMI: we've been working on supporting blue/green deployments in Materialize, which let you deploy schema changes to production without downtime. Safely cutting over to a new version of your data model in a continually running system takes a non-trivial amount of coordination, and we wouldn't want to burden you with that.

To make blue/green deployments easier to manage, we built the workflow into the dbt-materialize adapter.

Important

The latest release of dbt-materialize (v1.7.6), which allows including sources in blue/green deployments, introduced a breaking change to the syntax of source and sink materializations. You must migrate your models to the new syntax before upgrading! ✋

Blue/green deployment workflow

As a recap: in a blue/green deployment, you first deploy your code changes to a deployment environment (green) that is a clone of your production environment (blue), so you can validate the results without causing unavailability. These environments are later swapped transparently.

We baked each step of this workflow into dbt macros that you can invoke in sequence to perform a deployment:

dbt run-operation deploy_init    # Create a clone of your production environment
dbt run --vars 'deploy: True'    # Deploy the changes to the new deployment environment
dbt run-operation deploy_await   # Wait for all objects in the deployment environment to be hydrated (i.e. lag < 1s)
                                 # Validate the results (important!)
dbt run-operation deploy_promote # Swap environments
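The deploy_await step above waits until every object in the deployment environment has caught up with its inputs before you start validating. If you'd rather eyeball that yourself, you can peek at the system catalog. A minimal sketch, assuming the mz_internal.mz_materialization_lag relation available in recent Materialize versions (mz_internal objects are unstable, so double-check the relation and column names against the documentation for your version):

-- List any materializations still lagging behind their direct inputs by
-- more than the ~1s threshold that deploy_await waits for. An empty
-- result means everything is hydrated. This covers objects across all
-- schemas; filter further if you only care about the deployment schemas.
SELECT o.name, l.local_lag
FROM mz_internal.mz_materialization_lag l
JOIN mz_catalog.mz_objects o ON o.id = l.object_id
WHERE l.local_lag > INTERVAL '1 second';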

Behind the scenes, these macros take care of details like preserving object permissions, guarding against potentially destructive changes (e.g. cutting over too soon, concurrent operations), and letting you know when it's safe to promote your changes. For a full rundown of the workflow and each of its steps, check out the brand new dbt development guide in the Materialize documentation!
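For the curious: the promotion step boils down to a name swap between the production and deployment environments, so downstream queries never see a half-migrated state. A rough sketch of the kind of statements deploy_promote issues under the hood, assuming a production cluster named quickstart and deployment objects suffixed with _dbt_deploy (the names are illustrative, and the macro additionally handles permissions and sanity checks):

-- Swap the deployment environment into production (illustrative names).
-- After the swap, the 'public' schema and 'quickstart' cluster point at
-- the objects that were just built and validated; the previous production
-- objects live on under the *_dbt_deploy names until cleanup.
ALTER SCHEMA public SWAP WITH public_dbt_deploy;
ALTER CLUSTER quickstart SWAP WITH quickstart_dbt_deploy;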

Breaking change: new syntax for source and sink models

To allow including sources in the blue/green deployment workflow, we finally came around to reworking the syntax of source (and sink) materializations to be consistent with the other materialization types.

The new syntax omits the CREATE { SOURCE | SINK } clause, and now accepts configuration options like cluster (🤝). Please adjust your models accordingly before upgrading to the latest version of the dbt-materialize adapter (v1.7.6)!

New syntax

{{ config(
     materialized='source',
     cluster='quickstart'
   ) }}
FROM KAFKA CONNECTION kafka_connection (TOPIC 'test-topic')
FORMAT BYTES

Old syntax

{{ config(
     materialized='source'
   ) }}
CREATE SOURCE {{ this }} IN CLUSTER 'quickstart'
FROM KAFKA CONNECTION kafka_connection (TOPIC 'test-topic')
FORMAT BYTES
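The same pattern applies to sink models: the CREATE SINK clause goes away and the cluster moves into the config block. A minimal sketch of what that could look like, assuming a Kafka connection named kafka_connection and an upstream model my_model (topic, format, and envelope are illustrative and should match your setup):

{# Sink model in the new syntax: stream my_model out to Kafka. #}
{{ config(
     materialized='sink',
     cluster='quickstart'
   ) }}
FROM {{ ref('my_model') }}
INTO KAFKA CONNECTION kafka_connection (TOPIC 'test-topic-sink')
FORMAT JSON
ENVELOPE DEBEZIUM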

To upgrade, run:

pip install --upgrade dbt-materialize

And remember to migrate your source and sink models! If you have any feedback on this new workflow, or requests for new features, ping our team on Slack. 🫡