Competing data regulations are making old methods of storing and using data obsolete, and some companies are now building automated systems solely to manage their data. Meanwhile, regulations like the EU AI Act are creating new rights and litigation risks with implications for organizations worldwide. The regulatory pressure is significant enough that even some major cloud providers are adapting their services.
For an expert's take, we spoke with Rebecca Evans, Director of Product Marketing at OneTrust, who leads the go-to-market strategy for the company's AI Governance product. From her perspective, the traditional waterfall approach to governance is proving inadequate for the democratized, general-purpose nature of modern AI.
"AI adoption is forcing a complete rethink of governance. Legacy tools and the old waterfall approach break under the weight of AI. With today's pace of innovation, leaders must modernize governance by embedding it directly into workflows and tools by design," Evans says.
The early stages of AI adoption often reveal unexpected challenges, especially for leaders who assumed their data systems were already in order, Evans explains. But any technical failure is just a symptom of a deeper business problem.
Everybody's problem: As many as 95% of AI projects fail due to a "miscalibrated ROI," Evans continues. "Visibility and accountability are non-negotiable. Without clear ownership, governance becomes fragmented, and that's when teams either slow down or bypass controls entirely. To solve this, governance must become everybody's responsibility through new structures like hub-and-spoke or federated models, rather than relying on a single, siloed team."
The trust deficit: Un-costed variables push out timelines and erode value, Evans continues. But the most costly consequence is a breakdown in trust. "A majority of end users don't trust the AI tools they're being asked to use. If you've spent a lot of money on a new AI-powered tool and your end users don't trust it, they won't use it. You will not see any of the promised value."
One solution to this mix of geopolitical risk, technical debt, and human distrust is to embed rules directly into the machine. By turning policy into code, the system can automatically check and enforce it in real time, she explains.
Code becomes law: Today, the trend toward automated compliance even has government endorsement, Evans notes. For her, the evolution feels natural simply because it's rooted in decades of established practice. Put simply, every conversation about AI governance is a conversation about data governance, she says. "AI policies need to become more programmatic and enforced at run time, acting as controls at the point where AI is being accessed."
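The "programmatic, enforced at run time" pattern Evans describes is often called policy as code: policies are expressed as machine-checkable rules and evaluated at the point where AI is accessed. As a rough illustration only (the policy names, fields, and thresholds below are hypothetical, not OneTrust's product or API), a minimal sketch might look like this:

```python
from dataclasses import dataclass

# Hypothetical sketch of policy-as-code enforcement at the point of
# AI access. All policy names and request fields are illustrative.

@dataclass
class AIRequest:
    model: str           # which model the caller wants to invoke
    data_region: str     # where the input data resides
    contains_pii: bool   # whether the prompt includes personal data
    purpose: str         # declared business purpose for the call

# Each policy is a (name, predicate) pair; a request is allowed
# only if every predicate passes.
POLICIES = [
    ("eu-data-stays-in-eu",
     lambda r: r.data_region != "EU" or r.model.endswith("-eu")),
    ("no-pii-to-external-models",
     lambda r: not (r.contains_pii and r.model.startswith("external-"))),
    ("purpose-must-be-declared",
     lambda r: bool(r.purpose.strip())),
]

def enforce(request: AIRequest) -> list[str]:
    """Return the names of violated policies (empty list = allowed)."""
    return [name for name, check in POLICIES if not check(request)]

# A request that routes EU data with PII to an external, non-EU model
# trips two policies and would be blocked before the model is called.
violations = enforce(AIRequest(
    model="external-gpt",
    data_region="EU",
    contains_pii=True,
    purpose="customer support summarization",
))
print(violations)  # → ['eu-data-stays-in-eu', 'no-pii-to-external-models']
```

The point of the pattern is that the check runs automatically on every request, acting as the runtime control Evans describes rather than a policy document someone must remember to consult.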
Before implementing technical controls, however, leaders must reframe governance first. "Instead of the old 'brake pads' model, I see governance as the 'steering wheel' that allows an organization to capture AI's value responsibly and sustainably, all while staying in control." The organizations that win, Evans concludes, will be those that transform governance from a perceived cost center into a strategic, competitive advantage that lets them innovate confidently at scale. "Done right, guardrails turn into green lights. It becomes a driver of trust, not a checkpoint. This is the biggest mindset shift organizations need to make."