
Board AI Oversight in 2026

AI risk has moved from a procurement question to a board-level governance concern. Five questions every public-company board should be able to answer this proxy season.

May 8, 2026 · ICR AI & Intelligence · 5 min read

Two things changed in 2025 that turned AI from a CIO line item into a board agenda item. The SEC settled its first AI-washing cases against advisors who overstated their AI capabilities in client communications. And Edelman, FGS Global, and Brunswick all began publicly counseling boards on AI governance, signaling that institutional investors and proxy advisors are starting to score it.

By proxy season 2026, every public-company board should be able to answer five questions clearly, in plain language, on the record.

1. What is our AI risk inventory?

Boards do not need a vendor list. They need a categorized inventory of where AI touches the business: customer-facing, employee-facing, regulated-output (financial, medical, legal), and decision-support. Each row needs a named owner, a stated risk class, and a control framework. If the company cannot produce this in two pages, that itself is the disclosure issue.
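The inventory described above is, in practice, a small structured dataset: one row per AI touchpoint, with a category, a named owner, a risk class, and a control framework. A minimal sketch of that schema in Python follows; every system name, owner, and framework mapping here is hypothetical and purely illustrative.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class InventoryRow:
    system: str             # where AI touches the business
    category: str           # customer-facing, employee-facing, regulated-output, decision-support
    owner: str              # named accountable officer
    risk_class: str         # stated risk class, e.g. high / medium / low
    control_framework: str  # governing control framework for this row

# Hypothetical example rows -- not a real company's inventory
inventory = [
    InventoryRow("Support chatbot", "customer-facing", "VP Customer Ops", "high", "NIST AI RMF"),
    InventoryRow("Earnings-draft assistant", "regulated-output", "CFO", "high", "Disclosure controls"),
    InventoryRow("Sales forecasting model", "decision-support", "CRO", "medium", "Model risk policy"),
]

# Group systems by category -- the two-page board view
by_category = defaultdict(list)
for row in inventory:
    by_category[row.category].append(row.system)
```

The point of the sketch is the shape, not the tooling: if the fields above cannot be filled in for every system, the inventory is incomplete.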

2. Who is accountable when an AI system causes harm?

Models hallucinate. Vendors fail. Data leaks. The board's question is not whether these will happen but who is on the line when they do. The answer should reference a specific officer, a specific committee charter, and a specific reporting cadence.

3. What is our policy on AI-generated content in disclosures?

If the press release that announced last quarter's earnings was drafted by a language model and reviewed by a junior staffer, the board should know that, know who validated it, and know what controls exist to prevent a hallucinated number from making it into the next 10-K.

4. How would we respond to a deepfake or AI-driven misinformation event?

FGS Global acquired Memetica in early 2026 specifically because deepfake detection and AI-misinformation response have become baseline crisis-comms expectations. The board's question: do we have a vendor on retainer, a holding-statement template grounded in approved messaging, and a protocol for forensic verification before responding?

5. What is our AI governance structure?

There is no settled best practice here, but there are now three reference patterns: a dedicated AI committee of the board (large-cap, regulated industries), a sub-charter under audit or technology (most common), or full-board oversight with management reporting (smaller companies). Pick one and document the choice. Saying nothing is the worst posture.

The right answer to most of these questions is not the same answer your peers have. The right answer is the one your board can defend in the proxy, in front of an activist, and in a deposition.

ICR's AI & Intelligence practice helps boards translate these questions into governance documentation, disclosure language, and crisis playbooks. The goal is not to over-disclose. It is to be ready.
