AI adoption is dividing Governance, Risk, and Compliance organizations into two camps. One is redesigning governance around automation, technical fluency, and controls that can actually run in production. The other is clinging to legacy compliance workflows built for a slower regulatory cycle and hoping scrutiny doesn’t catch up. The distance between them is growing, and regulators are already closing in.
Elena Garvin is an AI Governance and Regulatory Analyst and a GRC and Data Governance professional with over 15 years of audit and compliance experience. A Certified Fraud Examiner with a Master of Legal Studies, Garvin has spent her career leading multi-state audit programs and standardizing processes at various firms. Now a Regional Ambassador for Women in CyberSecurity (WiCyS) Florida, Garvin has a blunt message: clinging to legacy operating models creates significant compliance and operational risk.
"Automation in GRC is going to be a non-negotiable. If you're not building policies and controls with automation in mind, there's no way to keep up," says Garvin. The core tension is the persistent disconnect between policy and implementation, she explains. Organizations write governance frameworks that satisfy regulatory language but lack the technical infrastructure to make those frameworks operational. Legacy equipment, fragmented systems, and under-resourced IT departments mean that what looks compliant on paper often fails in practice.
Paper compliance breaks down: "You can write the most beautiful policy in the world, but if it doesn't translate into operations, it's just paper," Garvin says. "Regulatory requirements state how things should be controlled, but sometimes the organization's infrastructure doesn't respond to that. They have legacy equipment, they don't have the controls. There's always that disconnect between policies and operations." Bridging that gap, she argues, requires technical literacy at the policy formation stage, not just the implementation stage.
Regulators want logs, not screenshots: The regulatory side of the equation is evolving just as fast. Garvin points to a shift in what auditors and regulators expect as evidence of compliance. "It's no longer a case of 'give me a screenshot,' because that screenshot is already outdated by the time you print it," she says. "Regulators are becoming more savvy. They know enough now to say, 'There should be a record here.' They may not ask to see your script, but they'll know when something should have been logged and wasn't."
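One way to make evidence log-shaped rather than screenshot-shaped is to emit every control check as a timestamped, machine-readable record. A minimal Python sketch follows; the field names are illustrative, not a regulatory schema:

```python
import datetime
import json

def record_evidence(control_id: str, passed: bool, detail: str) -> str:
    """Emit one timestamped, machine-readable record for a control check.

    Appended to an evidence log, these records answer "there should be a
    record here" in a way a screenshot never can. Field names are illustrative.
    """
    entry = {
        "control_id": control_id,
        "result": "pass" if passed else "fail",
        "detail": detail,
        # UTC timestamp captured at check time, not at audit time
        "checked_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    return json.dumps(entry)
```

Because each record carries its own timestamp and result, the evidence trail is produced as a side effect of running the control, not assembled after the audit letter arrives.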
That shift changes the baseline for what counts as adequate governance. Percentage-based audit sampling, once standard, gives way to full-population testing enabled by automation. Garvin sees this as the most concrete expression of the broader transformation, where GRC engineering becomes a core discipline rather than a support function.
From sampling to scripting: "The big thing with sampling is that we used to take a percentage. Now the movement is toward companies that can sample 100%," Garvin says. "GRC engineering is already a non-negotiable. That's the main reason I started learning Python and cloud security. In order to sample a population, you need to run the script that allows for that type of sampling. That goes a long way toward bridging the gap between policy and operations."
Fumbling until enforcement: Garvin is candid about the likely trajectory for most organizations. Larger enterprises, in particular, tend to react to enforcement rather than anticipate it. "A lot of times, businesses are going to fumble their way through until they get in trouble. AI is evolving at a rapid pace, and companies that are rightsizing and being lean probably don't have a department focused on this. They're lumping it onto IT," she says. "Until there's a problem, that's when we start checking who had access and who shouldn't have." She sees case law and regulatory precedent as the likely forcing function, mirroring how compliance standards have historically emerged across industries.
The skill set required to operate in this environment is expanding. Garvin frames continuous learning as a shared obligation between professionals and the organizations that employ them. The information exists. Free training platforms, professional communities, and industry associations like ISACA and WiCyS provide entry points. But availability alone does not close the gap.
What closes it, Garvin argues, is a cultural commitment to making governance technically executable before regulatory pressure forces the issue. Map controls to frameworks now. Ensure data lineage and access governance are in place now. Build the automation capability before the audit letter arrives. "AI needs to be understood. It needs to be controlled. It needs to be secure. The train has already left. There's no way to go back," she concludes.
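Garvin's "map controls to frameworks now" can itself be sketched as a coverage check: hold the mapping as data and flag requirements no control implements, before an auditor does. The control and requirement identifiers below are illustrative; a real mapping would come from the organization's own control catalog:

```python
# Illustrative mapping of internal controls to framework requirements.
# Identifiers are examples only, not an authoritative crosswalk.
CONTROL_MAP = {
    "AC-01-access-review": ["NIST-AC-2", "ISO-A.9.2"],
    "LG-03-central-logging": ["NIST-AU-2"],
}

def unmapped_requirements(required: set[str]) -> set[str]:
    """Return framework requirements that no implemented control covers."""
    covered = {req for reqs in CONTROL_MAP.values() for req in reqs}
    return required - covered
```

Kept under version control and checked in CI, the gap list makes "what looks compliant on paper" a testable claim rather than an assumption.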