

PostgreSQL Anchors Enterprise Data as AI Workloads Demand Reliability at Scale

AI Data Press - News Team | January 13, 2026

Robert Haas, VP and Chief Database Architect at EnterpriseDB, explains why AI hasn’t changed the fundamentals of data, and why PostgreSQL endures by prioritizing reliability, openness, and longevity.


Key Points

  • As AI moves into production, it exposes a persistent problem: advanced models fail without data that stays reliable, accessible, and usable over time.

  • Robert Haas, VP and Chief Database Architect at EnterpriseDB, frames AI as an evolution that reinforces long-standing data fundamentals rather than replacing them.

  • PostgreSQL endures by prioritizing stability, openness, and careful engineering, allowing new AI capabilities to layer onto a platform designed to protect data for decades.

AI is revolutionary, but it doesn't change the fundamental need to interact with your data. The core need to make sure data is accessible, to have high availability, and to run queries never goes away.

Robert Haas

VP, Chief Database Architect
EnterpriseDB


As AI pushes deeper into production systems, PostgreSQL is reasserting itself as foundational infrastructure. While the models evolve rapidly, the demands placed on data remain the same: reliability, availability, and long-term usability still matter most. In an environment obsessed with speed, PostgreSQL earns its place by prioritizing endurance.

A recent conversation on the AI & Data Horizons podcast between Robert Haas, VP and Chief Database Architect at EnterpriseDB, and EDB CMO Michael Gale offers a grounded lens on why PostgreSQL continues to matter as AI reshapes data workloads. Haas is a longtime contributor and committer to the open-source PostgreSQL project whose work has shaped core features such as parallel query. Drawing on decades of experience inside the Postgres community, he brings a steady, systems-level perspective to an industry increasingly distracted by novelty, reminding listeners that AI may be new, but the fundamentals of managing, accessing, and preserving data are not.

"AI is revolutionary, but it doesn't change the fundamental need to interact with your data. The core need to make sure data is accessible, to have high availability, and to run queries never goes away. That's why I see this as an evolution, not a revolution," says Haas. In his view, the technologies may shift quickly, but the responsibility of the data layer is to remain stable, predictable, and usable over time.

PostgreSQL has endured by doing a few things exceptionally well. Its technical robustness, permissive licensing, and broad ecosystem allow it to absorb new capabilities without destabilizing its core. Even as the database market reorients around AI, Postgres evolves through integration rather than reinvention.

  • A test of time: Its architecture continues to align well with modern, data-intensive workloads, offering reassurance to organizations that care about long-term viability as much as near-term performance. "If you put your data in PostgreSQL, you are going to be able to read that data five years from now, 10 years from now, 20 years from now," explains Haas. "It's large and established enough that it’s not just going to cease to exist and leave you wondering how to interact with your data."

  • Lacking core strength: That focus on longevity helps explain why PostgreSQL has endured while many specialized databases have faded. According to Haas, those systems failed because their narrow focus on a particular problem came at the expense of the enduring work of data management. "Many databases that have disappeared were special purpose systems. They might have been very good at solving one particular problem, but they couldn't do the core work that never goes away: the need to store your data, query it, and make it accessible."

The Postgres community follows a pragmatic slow-follower approach to avoid that fate. Rather than chasing sweeping reinvention, it integrates proven ideas into a mature foundation, sidestepping the "perfect data" trap that often leads to over-engineered systems, missed timelines, and eventual failure.

  • Engineering ethics: That conservatism, Haas says, comes from the stakes involved. "If your web server crashes, you have an outage, but you haven’t permanently lost data. If the database corrupts or loses data, you have a categorically worse problem that’s much harder to recover from," he explains. "As database engineers, we have a responsibility to be conservative. Users may want new features, but what they care about most is that we take good care of their data and keep it secure. We have to stay true to that responsibility first, and only then add new functionality on top of it."

That sense of responsibility also shapes how Haas thinks about PostgreSQL’s future. He points to openness as the central concern, especially as more managed and proprietary variants enter the market and recreate familiar forms of vendor lock-in. For a platform grounded in stability and trust, openness is inseparable from longevity. "We need to make sure it truly does stay open," Haas concludes. "That’s what lets people run it where they want, how they want, and trust it over the long term."