ICYMI: we’ve been working on supporting blue/green deployments in Materialize, which allow you to deploy schema changes to production without downtime. Now — safely cutting over to a new version of your data model in a continually running system takes a non-trivial amount of coordination…and we wouldn’t want to burden you with that.
To make blue/green deployments easier to manage, we built the workflow into the `dbt-materialize` adapter.
**Important:** The latest release of `dbt-materialize` (v1.7.6), which allows including sources in blue/green deployments, introduced a breaking change to the syntax of `source` and `sink` materializations. You must migrate your models to the new syntax before upgrading! ✋
## Blue/green deployment workflow
As a recap: in a blue/green deployment, you first deploy your code changes to a deployment environment (**green**) that is a clone of your production environment (**blue**), so you can validate the results without causing unavailability. These environments are later swapped transparently.
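For the curious: in Materialize, that final swap boils down to atomically trading names between the deployment objects and their production counterparts, so downstream consumers keep querying the same schema and cluster names. A minimal sketch of the idea, assuming a production schema `public` on a cluster `quickstart` and deployment clones suffixed `_dbt_deploy` (the names and suffix here are illustrative; the dbt macros below handle this for you):

```sql
-- Atomically swap names between production and deployment objects,
-- so consumers keep reading from the same schema/cluster names.
ALTER SCHEMA public SWAP WITH public_dbt_deploy;
ALTER CLUSTER quickstart SWAP WITH quickstart_dbt_deploy;
```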
We baked each step of this workflow into dbt macros that you can invoke in sequence to perform a deployment:
```bash
dbt run-operation deploy_init    # Create a clone of your production environment
dbt run --vars 'deploy: True'    # Deploy the changes to the new deployment environment
dbt run-operation deploy_await   # Wait for all objects in the deployment environment to be hydrated (i.e. lag < 1s)
                                 # Validate the results (important!)
dbt run-operation deploy_promote # Swap environments
```
Behind the scenes, these macros take care of details like preserving object permissions, guarding against potentially destructive changes (e.g. cutting over too soon, concurrent operations), and letting you know when it’s safe to promote your changes. For a full rundown of the workflow, and each workflow step, check out the brand new dbt development guide in the Materialize documentation!
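To tell the macros which parts of production to clone and cut over, you list the relevant clusters and schemas under a `deployment` variable in `dbt_project.yml`. A minimal sketch with illustrative names (check the development guide for the exact options supported by your adapter version):

```yaml
# dbt_project.yml
vars:
  deployment:
    default:
      clusters:
        - quickstart   # production cluster(s) to clone and swap
      schemas:
        - public       # production schema(s) to clone and swap
```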
## Breaking change: new syntax for `source` and `sink` models
To allow including sources in the blue/green deployment workflow, we finally came around to making the syntax of `source` (and `sink`) materializations a little uglier, but much more consistent with the other materialization types. The new syntax omits the `CREATE { SOURCE | SINK }` clause, and now accepts configuration options like `cluster` (🤝).
Please adjust your models accordingly before upgrading to the latest version of the `dbt-materialize` adapter (v1.7.6)!
**New syntax:**

```sql
{{ config(
    materialized='source',
    cluster='quickstart'
) }}

FROM KAFKA CONNECTION kafka_connection (TOPIC 'test-topic')
FORMAT BYTES
```
**Old syntax:**

```sql
{{ config(
    materialized='source'
) }}

CREATE SOURCE {{ this }} IN CLUSTER 'quickstart'
FROM KAFKA CONNECTION kafka_connection (TOPIC 'test-topic')
FORMAT BYTES
```
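The same pattern applies to `sink` models: drop the `CREATE SINK ... IN CLUSTER ...` clause and move the cluster into the config block. A rough sketch under the new syntax, with an illustrative upstream model, connection, and topic:

```sql
{{ config(
    materialized='sink',
    cluster='quickstart'
) }}

-- Model name, connection, and topic are illustrative.
FROM {{ ref('my_model') }}
INTO KAFKA CONNECTION kafka_connection (TOPIC 'sink-topic')
FORMAT JSON
ENVELOPE DEBEZIUM
```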
To upgrade, run:

```bash
pip install --upgrade dbt-materialize
```
And remember to migrate your `source` and `sink` models! If you have any feedback on this new workflow, or requests for new features, ping our team on Slack. 🫡