Drizly: reinventing the alcohol shopping experience
Drizly is an e-commerce platform that allows consumers to buy alcohol online and have it delivered to their door in under 60 minutes. In an industry predicated on speed of delivery and ease of shopping, Drizly is committed to providing a first-class customer experience.
Reducing cart abandonment: triggering a notification in the absence of an event
The likelihood that an online shopper will not purchase the goods added to their cart increases as more time passes. This is even more true for spirits and entertaining, where day-of planning and spur-of-the-moment events are part of Drizly’s target persona.
Drizly sought to use real-time data to reduce the time required to generate a cart abandonment notification. This is a common challenge across ecommerce companies: triggering a notification in the absence of an event is difficult. In Drizly’s case, existing solutions like ksqlDB and Faust could not solve the problem due to their limited support for joining a large number of data streams.
Tracking cart abandonment requires joining several event types: user actions (add to cart, remove from cart), session status (active or inactive), and checkouts. It is also stateful, because you need the history of a user’s orders to take action.
| Before Materialize | After Materialize |
| --- | --- |
| When a Drizly shopper abandoned their cart, they received a shopping cart notification the next day, as batch processing in Snowflake required a 24-hour window. | Once a Drizly customer abandons their cart, they are notified of their pending order within 30 minutes, enabled by streaming data into Materialize. |
| Abandoned carts were likely to stay abandoned, resulting in lost revenue and a suboptimal user experience. | By easily switching from batch processing to streaming data with Materialize, Drizly was able to reduce cart abandonment, increase revenue, and provide a more seamless customer experience. |
Moving easily from batch to streaming: an architectural view
Pictured Above: Drizly Architecture Diagram (Stream infrastructure = Kafka, Stream Processing = Materialize, Stream Modeling = dbt)
Drizly uses Confluent Cloud to manage their schema registries and Kafka topics. User actions such as “Add to Cart” and “Checkout” are streamed as events into Kafka and then read into Materialize as sources. The Drizly team writes SQL (managed in dbt) to join and transform the data and identify users with an abandoned cart — that is, users with no corresponding checkout event within 30 minutes of adding items to their cart. Flagged users are added to an incrementally updated materialized view.
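The flagging logic above can be sketched as a temporal filter in Materialize SQL. The source, view, and column names below are illustrative, not Drizly’s actual model names, and the exact syntax varies by Materialize version (older versions spell the temporal-filter function `mz_logical_timestamp()` rather than `mz_now()`):

```sql
-- Most recent cart activity per user (names are illustrative).
CREATE VIEW last_cart_activity AS
SELECT user_id, max(event_ts) AS last_ts
FROM cart_events
WHERE event_type IN ('add_to_cart', 'remove_from_cart')
GROUP BY user_id;

-- Flag users with no checkout since their last cart action, once
-- 30 minutes have elapsed. The mz_now() comparison is a Materialize
-- temporal filter, so results change as time passes even when no
-- new events arrive.
CREATE MATERIALIZED VIEW abandoned_carts AS
SELECT a.user_id, a.last_ts
FROM last_cart_activity a
LEFT JOIN checkout_events c
  ON c.user_id = a.user_id
 AND c.event_ts >= a.last_ts
WHERE c.user_id IS NULL
  AND mz_now() >= a.last_ts + INTERVAL '30 minutes';
```

Because the view is incrementally maintained, a user enters `abandoned_carts` automatically 30 minutes after their last cart action, and leaves it as soon as a matching checkout event arrives.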
Materialize then streams each flagged user into a separate Kafka topic. Those enriched topics trigger downstream services that send users reminder notifications to complete the purchase remaining in their cart.
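Pushing the view’s changes back out to Kafka is done with a sink. A minimal sketch, assuming the `abandoned_carts` view, broker, topic, and registry addresses are placeholders (sink syntax also varies by Materialize version):

```sql
-- Stream every change to the abandoned_carts view into a Kafka topic
-- consumed by the downstream notification services. Broker, topic,
-- and schema-registry addresses are placeholders.
CREATE SINK abandoned_carts_sink
FROM abandoned_carts
INTO KAFKA BROKER 'broker:9092' TOPIC 'abandoned-carts'
FORMAT AVRO USING CONFLUENT SCHEMA REGISTRY 'http://schema-registry:8081';
```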
Materialize + dbt offer standard SQL for streaming data
Whereas other solutions could not support this kind of workflow, Materialize and dbt offered an out-of-the-box connection. Drizly was an established dbt user, and as both tools use standard SQL, Drizly was able to significantly reduce the time required to build the service and quickly ramp the team on the new infrastructure. Leveraging standard SQL allowed Drizly to avoid the engineering bottlenecks commonly associated with tools that require learning a new programming language. Both data scientists and analysts could use Materialize and dbt, empowering the data teams to own more of the transformation logic rather than relying on engineering effort.
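In dbt, such a model is plain SQL with a config header telling the dbt-materialize adapter to create a materialized view. A minimal sketch, with assumed model and column names (the materialization keyword has varied across adapter versions):

```sql
-- models/abandoned_carts.sql
-- The dbt-materialize adapter turns this model into a Materialize
-- materialized view; 'materializedview' is the adapter's custom
-- materialization name in early versions.
{{ config(materialized='materializedview') }}

select
    user_id,
    last_ts
from {{ ref('last_cart_activity') }}
where mz_now() >= last_ts + interval '30 minutes'
```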
As Emily Hawkins, Data Infrastructure Lead at Drizly, states: “Working with Materialize has been an incredibly seamless process as we can continue to write real-time SQL, exactly the same way as we already are in Snowflake with batch, so it was a much lower barrier to entry. It was also a huge plus for us that we could continue using dbt within the real-time platform to help us address our online cart conversion challenges where customers can be reminded of a pending purchase in the span of minutes versus hours.”
From a Drizly presentation on their streaming decision: “Why Materialize?” and “Why dbt?”
What’s next for Drizly and Materialize?
Building real-time personalization and recommendation services
As the service matures, Drizly plans to implement more data science models for increased personalization in their real-time customer alerts — including recommendations for related items based on their selections or past purchases.
Drizly will also use Materialize to improve real-time dynamic experiences through customer-triggered workflows and actions. For example, a first-time browser of wine might receive a quiz on wine preferences and tastes, which would then suggest specific wines in real time for the customer’s first Drizly purchase.
“Abandon browse” is another customer-triggered workflow, in which Drizly will notify users who view products on-site but leave without adding anything to their cart. In this case, Drizly will send a notification after the “abandon browse” takes place, predicting three items the user might be interested in given their recent session and any sessions preceding it.
Moving to Materialize Cloud
Drizly plans to transition to Materialize Cloud so they can spin up multiple deployments within the same account and have Materialize host both development and production instances. This would relieve Drizly of having to host and manage everything themselves.
Want to learn more about Materialize?
Materialize is a streaming database for real-time analytics. Materialize simplifies how developers build with real-time data, using incremental computation to provide low latency, correct answers — all using standard SQL. With nearly a decade of technical research behind it, Materialize was launched to address the growing need to build real-time applications easily and efficiently on streaming data so that businesses can obtain actionable intelligence.
For a how-to guide on implementing the workflows Drizly used for real-time notifications and alerting, read our Temporal Filters blog post here. Temporal filters, which power “windowed” or time-sensitive queries, are the mechanism Drizly used to flag a user as idle for 30 minutes. You can also check out our blog post on the Materialize-dbt integration here, which shows how the adapter helps you manage your models and SQL for both batch and streaming.
We encourage you to try us out by signing up for Materialize or joining the discussion in our Community!
Update: When Materialize was implemented, Drizly was using Confluent Cloud to manage their schema registries and Kafka topics. They have since migrated from Confluent Cloud to Amazon MSK. Materialize continues to work seamlessly, powering Drizly’s abandoned cart services, regardless of their choice of Kafka service.