All articles

Leadership

The Memory Keychain: Who Holds the Keys to Trust in the Era of Stateful AI?

AI Data Press - News Team
|
November 7, 2025

Sanver Gozen, ex-AI/ML leader at Expedia and Founder of Nemroot, on AI memory and hybrid architecture.

Credit: Outlever

Key Points

  • The rise of stateful AI transformed the enterprise build-vs-buy decision from a simple cost trade-off into a strategic question of control, trust, and long-term vision.
  • Sanver Gozen, ex-AI/ML leader at Expedia Group and Founder of a new agentic workflow startup called Nemroot, explains that because AI memory creates a personalized, trust-based user relationship, companies must decide if that relationship is a core differentiator they need to own.
  • Gozen advocates for a hybrid architecture where organizations initially partner with hosted vendors, while building their own core memory systems in the background to eventually gain more control and reduce costs.

 

The mistake I often see is teams trying to build from scratch just because it feels innovative. It makes them feel like, 'I did this. I own this.' But the solution is already out there; you could just buy it. Without the right architecture, governance, or even a clear use case, that approach can backfire really quickly.

Sanver Gozen

ex-AI/ML Leader at Expedia
Founder, Nemroot


The sudden viability of enterprise-grade AI has flipped the script on cautious technology implementation. For years, the choice to build or buy technology was a straightforward trade-off between speed, cost, and human-capital constraints. Today, the rise of stateful AI with persistent memory has turned that simple calculation into a broader question of strategy, control, and trust, where ease of adoption and a fast track to ROI must be weighed against the risks of a still-nascent technology.

We spoke with Sanver Gozen, Founder of the newly launched agentic AI startup Nemroot and an ex-AI/ML Technical Program Manager at Expedia Group, where he led initiatives to enhance personalized recommendations for global platforms. Gozen noted that as AI evolves from a stateless tool that "forgets" after every interaction into an intelligent partner that remembers and grows with the user, it creates immense opportunity alongside new liabilities. If that memory is inaccurate, biased, or insecure, the consequences can be severe. Enterprises aren't just adopting AI tools anymore; they are deciding who owns the trust relationship with their users. For Gozen, the entire debate comes down to the human element at the center of the machine.

Gozen highlighted the need to "own the memory keychain" and to educate business users about the gap between what they think generative AI is doing and how it actually works. While off-the-shelf providers like OpenAI offer the illusion of memory in consumer-facing products such as ChatGPT, enterprise tools must be held to a higher standard.

  • The stateless illusion: "We are no longer building software that generates predictable outputs. Generative and agentic AI appears to carry database-like, long-term persistent context, which often earns artificial user trust. When a user gets personalized answers, they think, 'Okay, it knows me. I trust it because it knows me.' People think that AI can 'remember things,' but that's not true. When you are dealing with APIs in the background, each response is really just a stateless transaction. You send the conversation, it returns a response, and that's it."

  • The architectural blueprint: "To build true, persistent memory, enterprises must go beyond simple RAG. You need to create an agentic system that has a database built into it, and this database ideally needs to be managed locally. This way, you can perform CRUD operations closest to where the actual work is being done."
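The two points above — stateless API calls paired with a locally managed memory database — can be sketched roughly as follows. This is a minimal illustration under assumed names (`LocalMemoryStore`, `stateless_model_call`, the one-table schema), not a reference implementation; the model call is a stand-in for a real hosted API.

```python
import sqlite3

# Sketch: the hosted model API is stateless, so persistent "memory"
# lives in a local database the agent performs CRUD operations against
# around each call. All names here are illustrative assumptions.

class LocalMemoryStore:
    """Locally managed database holding the agent's persistent memory."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memory (key TEXT PRIMARY KEY, value TEXT)"
        )

    def create_or_update(self, key, value):
        # Upsert: the C and U of CRUD.
        self.db.execute(
            "INSERT OR REPLACE INTO memory (key, value) VALUES (?, ?)", (key, value)
        )
        self.db.commit()

    def read(self, key):
        row = self.db.execute(
            "SELECT value FROM memory WHERE key = ?", (key,)
        ).fetchone()
        return row[0] if row else None

    def delete(self, key):
        self.db.execute("DELETE FROM memory WHERE key = ?", (key,))
        self.db.commit()


def stateless_model_call(messages):
    """Stand-in for a hosted LLM API: it sees only what this call passes in."""
    context = [m["content"] for m in messages if m["role"] == "memory"]
    return f"Known context: {context}"


store = LocalMemoryStore()
store.create_or_update("preferred_airport", "SEA")

# Each API call is a fresh transaction; remembered facts must be
# injected explicitly from the local store before every request.
with_memory = stateless_model_call(
    [{"role": "memory", "content": store.read("preferred_airport")}]
)
without_memory = stateless_model_call([])  # the "model" knows nothing on its own
```

Keeping the store local is what puts CRUD "closest to where the actual work is being done": the user data never has to leave the organization's boundary to make the model appear to remember.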

The defining question for leaders is whether memory is a differentiator worth building or a commodity best outsourced. With memory now at the core of the user experience, the traditional build-versus-buy decision requires a new framework. The choice is no longer about budget alone, but about whether an AI capability is a strategic differentiator that an organization must own and control.

  • The control premium: "With data ownership, you also need to manage regulatory and legal risk. From an architect's point of view, you need solid versioning, access control, and a clear boundary between what the model knows versus what it remembers. Without that, you're inviting chaos. You need to define what's being remembered, for how long, and who can see or delete it."

  • The power of persistence: "If persistent memory is a strategic part of your product and you have the right people, building gives you more control for a long-term vision. But if speed and proven outcomes matter more, buying or partnering can save time and money. Now the risk belongs to the others because you've delegated the whole thing to a third party." Gozen warned against a common pitfall where the allure of creation overrides strategic sense. "The mistake I often see is teams trying to build from scratch just because it feels innovative. It makes them feel like, 'I did this. I own this.' But the solution is already out there; you could just buy it. Without the right architecture, governance, or even a clear use case, that approach can backfire really quickly."

  • The hybrid evolution: "Most mature orgs end up with a hybrid approach," Gozen said. "They start with a partner to learn, then selectively build out their own version of what becomes core later." This strategy allows companies to move fast while preparing for long-term ownership. "After you say, 'Okay, this is not enough for me, I need more freedom,' you can start to build something hybrid in the background while the product is still running reliably."

 

Creating a memory system is only half the battle. Ensuring that memory remains accurate and uncorrupted over time introduces questions of governance. This risk of longer-horizon hallucinations is forcing companies to create entirely new functions.

  • The new auditors: Gozen described a two-pronged approach to governance, combining tactical process with strategic oversight. The first is disciplined data hygiene. "In companies like Expedia, we were always checking the data over time, and constantly cleaning the data if there is a problem." The second is an organizational shift, with responsibility assigned to a new department, such as an AI audit team tasked with checking whether tools are obeying the rules and working securely, consistently, and without bias. "When I hear people discuss the prospect of being 'fired by AI,' I point to the fact that many will actually be 'rehired by AI' for the exact same reasons."
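The data-hygiene half of that approach could be sketched as a periodic audit pass over stored memories. The record shape, field names, and 90-day staleness threshold below are assumptions chosen for illustration, not anything Gozen or Expedia prescribes.

```python
# Hedged sketch of a periodic memory-hygiene audit: flag entries that are
# stale or that contradict another entry for the same key, so a human (or
# an AI audit team) can review them before they corrupt future answers.

def audit_memories(records, now, max_age_days=90):
    """Return (key, reason) pairs that deserve review.

    Each record is assumed to be a dict with "key", "value", and
    "updated_at" (epoch seconds); records are scanned in stored order.
    """
    flagged = []
    latest_value = {}
    for rec in records:
        age_days = (now - rec["updated_at"]) / 86400
        if age_days > max_age_days:
            flagged.append((rec["key"], "stale"))
        prior = latest_value.get(rec["key"])
        if prior is not None and prior != rec["value"]:
            flagged.append((rec["key"], "conflicting"))
        latest_value[rec["key"]] = rec["value"]
    return flagged
```

Run on a schedule, a check like this turns "constantly cleaning the data" from an ad hoc habit into an auditable process the new oversight function can own.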