Building application architectures for agent-scale at Delphi

Delphi

Overview: Scaling Human Expertise Through Interactive AI

Delphi is a platform that lets people create a digital version of their mind. Whether it’s a coach, DJ, author, or investor, people with something to teach or share can scale their expertise by making it interactively available through Delphi. As Sam Spelsberg, co-founder and CTO at Delphi, put it:

Delphi is the first way that people can actually create a digital representation that scales one-to-one to one-to-many—scalable interactively, which is a first.

This novel approach to knowledge transfer brought new challenges—particularly as the volume of usage and complexity of views increased, and users expected fast, seamless access no matter where they were in the world.

Challenges Faced: Bottlenecks and Outages from Growing Demand

As Delphi’s user base expanded, the insights and audience views inside Studio—the backend creator dashboard—quickly became performance bottlenecks. These pages aggregated data from multiple sources, including SQL tables and S3, and grew increasingly expensive and time-consuming to render.

What started as slow page loads soon escalated into a more serious issue: system fragility. In some cases, a user opening a particularly complex view could max out database connections and crash the container entirely, halting all new queries.

We even had some customers where we would think to ourselves, ‘I hope this person doesn't open their studio today because it's just going to cause one of our database connections to die.’

The team needed a more reliable, scalable foundation—one that could support deep, real-time insights without putting system stability at risk.

Choosing Materialize: Avoiding Infrastructure Overhead to Stay Product-Focused

Delphi initially debated whether they needed to move toward a data warehouse or invest in a more complex data pipeline. But the team didn’t want to spend precious engineering cycles on infrastructure work that didn’t directly improve product experience.

What’s our data pipeline going to look like? We didn’t want our competitive advantage to be how effective our data warehouse is. We wanted engineers focused on product and delivering incredible user experience.

Materialize allowed the team to avoid building that whole stack—no need for a full data warehouse or orchestration tools. Instead, they just wired Materialize into their existing RDS database and wrote views in SQL. The setup was so straightforward that, as they noted, even an intern was able to get it up and running.
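That "wire in and write SQL" workflow might look roughly like the sketch below. The table and column names are hypothetical, but the pattern is the one the team describes: a materialized view pre-aggregates what the Studio page used to compute on every load, and Materialize keeps it incrementally up to date as the underlying data changes.

```sql
-- Hypothetical Studio insights view. Materialize maintains this
-- aggregation incrementally, so the dashboard reads precomputed
-- results instead of re-running the joins and aggregates on
-- every page load.
CREATE MATERIALIZED VIEW creator_insights AS
SELECT
    c.creator_id,
    count(m.id)               AS total_messages,
    count(DISTINCT m.user_id) AS unique_audience,
    max(m.created_at)         AS last_interaction
FROM creators c
JOIN messages m ON m.creator_id = c.creator_id
GROUP BY c.creator_id;

-- The application then serves the page with a cheap lookup:
SELECT * FROM creator_insights WHERE creator_id = 42;
```

The key design point is that the expensive work moves from query time to write time: the view is updated as changes arrive, so reads stay fast no matter how complex the underlying logic is.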

Transformational Results: Faster Loads, More Users, More Velocity

The performance and productivity impact of Materialize was immediate:

  • 120× reduction in page load times: Dropped from over two minutes (or failed loads) to under a second.
  • 60% improvement in engineering productivity: Avoided weeks of full-time engineering effort building pipelines and a warehouse to serve these views—just plugged Materialize into RDS and wrote SQL.

Evolving Architecture: Streaming + SQL-Powered Views for Real-Time Scale

Delphi began their integration with Materialize using a pattern similar to a smart read replica: streaming changes directly from their Postgres database into Materialize and writing SQL views to serve application data. This allowed them to offload complex queries from their transactional database without disrupting their existing infrastructure.

Materialize’s VPC peering setup made the connection process simple. With a few AWS configuration changes, Delphi was able to establish a secure link between their RDS instance and Materialize, enabling continuous ingestion of change data.
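In Materialize's SQL dialect, that smart-read-replica setup is roughly a connection plus a source. Everything below (host, credentials, publication name) is a placeholder; the RDS instance is reached over the peered VPC, so no public endpoint is needed.

```sql
-- Placeholder credentials, stored as a Materialize secret.
CREATE SECRET pg_password AS '<password>';

-- Connection to the RDS Postgres instance over the peered VPC.
CREATE CONNECTION rds_pg TO POSTGRES (
    HOST '<rds-endpoint>',
    PORT 5432,
    USER 'materialize',
    PASSWORD SECRET pg_password,
    DATABASE 'delphi'
);

-- Continuously ingest change data via a Postgres publication.
CREATE SOURCE pg_replica
    FROM POSTGRES CONNECTION rds_pg (PUBLICATION 'mz_source')
    FOR ALL TABLES;
```

From here, SQL views defined on the replicated tables stay continuously in sync with the transactional database without adding query load to it.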

As their architecture evolves, Delphi is adopting a new integration pattern that unifies data across sources. One upcoming change involves offloading a high-volume table from RDS into Kafka, and then using Materialize to join the Kafka event stream with relational data:

We have a table that is starting to get too big to put in RDS… Materialize will actually let us unify the Kafka event stream with other records that are sitting in SQL to create a view that combines those two things.
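A sketch of what that unification could look like, with hypothetical topic, table, and field names: a Kafka source for the high-volume events, joined in a single materialized view with relational data still coming from Postgres.

```sql
-- Hypothetical high-volume event stream offloaded from RDS to Kafka.
CREATE CONNECTION kafka_conn TO KAFKA (BROKER '<broker>:9092');

-- JSON-formatted Kafka messages land in a single jsonb column.
CREATE SOURCE interaction_events
    FROM KAFKA CONNECTION kafka_conn (TOPIC 'interaction-events')
    FORMAT JSON;

-- Join the event stream with relational records; Materialize keeps
-- the combined view incrementally updated as either side changes.
CREATE MATERIALIZED VIEW enriched_interactions AS
SELECT
    c.creator_id,
    c.display_name,
    count(*) AS event_count
FROM interaction_events e
JOIN creators c
  ON c.creator_id = (e.data ->> 'creator_id')::int
GROUP BY c.creator_id, c.display_name;
```

The application queries one view and never needs to know that half the data lives in Kafka and half in Postgres.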

This approach allows Delphi to maintain high performance while simplifying the logic needed to support increasingly complex views. By expressing these views in SQL, the team can reason about them declaratively and version-control them like code. Delphi is also exploring DBT to manage these data products with configuration files and DAGs, making the architecture easier for new team members to understand and extend.
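Under dbt, a view like the ones above becomes a version-controlled model file. This is a sketch assuming the dbt-materialize adapter; the model name and source are hypothetical, and the exact materialization keyword depends on the adapter version.

```sql
-- models/creator_message_counts.sql (hypothetical dbt model)
{{ config(materialized='materialized_view') }}

SELECT
    creator_id,
    count(*) AS total_messages
FROM {{ source('delphi', 'messages') }}
GROUP BY creator_id
```

dbt then handles dependency ordering between models (the DAG), so new team members can trace how each data product is derived from its sources.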

Supporting Agent Workloads: Instant Queries for AI-Driven Experiences

Delphi is deeply invested in agent-based experiences, with AI agents operating behind every Delphi. These agents increasingly rely on rapid, context-rich queries. Sam highlighted how Materialize is well-suited for this shift:

Agents and LLMs are just making queries directly and accessing the data at incredible scale... and especially if you have complex transformations... how are we going to do that at scale with agents? We need something that's actually doing the work upfront.

Materialize’s incremental view maintenance ensures that data transformations happen ahead of time, enabling consistent sub-100ms latencies even under agent-level query volume.
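In practice, serving those agent queries fast means indexing the maintained view (names hypothetical): the index keeps the view's results in memory, so each agent query is a point lookup rather than an on-the-fly transformation.

```sql
-- Index the maintained view on the lookup key so each
-- agent-issued query is an in-memory point read.
CREATE INDEX creator_insights_idx ON creator_insights (creator_id);

-- An agent query against the indexed view returns in milliseconds:
SELECT * FROM creator_insights WHERE creator_id = 42;
```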

Looking Ahead: Expanding Real-Time Use Cases and Data Source Integration

Delphi’s next steps include deepening their use of Materialize on the consumer side to serve agents even faster, with a goal of ensuring that all queries executed by their agents—known as Delphis—are completed in under 100 milliseconds. As they continue to deliver richer insights to creators, the complexity of these views is increasing, opening up significant greenfield opportunities for growth and optimization.

The team is also planning to evolve their backend architecture. For tables that have outgrown RDS, they are beginning a migration to Kafka and Materialize, enabling them to unify event streams with existing SQL records in a seamless way:

We’re gonna rip it out and use Kafka, and Materialize will actually let us unify that event stream with records sitting in SQL.

Conclusion: Building Magical Experiences for the AI-Native Era

Delphi didn’t just improve performance—they fundamentally upgraded their architecture for the AI-native future. Materialize helped them meet rising user expectations without pulling engineers away from building magical product experiences.

It’s not sufficient anymore to have a product that’s just okay or just good. It needs to be really good and it needs to be magical. And you have to be leveraging new tools like Materialize if you want to meet that bar.

Get Started with Materialize