AI Governance in the Boardroom: Advocating for responsible adoption

NPOs operate in a rapidly evolving information landscape in which artificial intelligence (AI) is becoming integral to mission delivery. Has your Board considered the implications of AI? AI governance is no longer merely an emerging concept — it is a reality that cannot be ignored.

AI Governance in the Boardroom

The Institute of Directors (IOD), in its Business Paper entitled AI Governance in the Boardroom, underscores that overseeing AI is a responsibility at the Board level, not solely an IT concern. The IOD states that: “For board directors, AI regulation is no longer a speculative concern - it is a growing strategic, legal, and ethical issue. Boards must understand how national policy, international standards, and foreign legislation may impact their business operations, directly or indirectly.” Organisations are urged to monitor emerging regulations and geopolitical changes, regularly audit AI systems, and conduct comprehensive risk and impact assessments for both business and stakeholders.

The IOD’s Policy Voice survey, conducted in March 2025 among nearly 700 directors and business leaders, highlights a dynamic landscape in AI adoption. Use of AI tools is increasing — nearly two-thirds of directors now employ them in their roles, and half report organisational usage — yet many organisations lack clear internal policies, strategies, or data governance frameworks for AI.

The survey identifies significant barriers to AI adoption, particularly limited AI expertise at both management and board levels. Leaders also express concerns about trust in AI outcomes, security risks, and ethical issues such as bias. Addressing these barriers through targeted skills development, transparent governance, and effective risk management is essential if nonprofits are to harness AI’s potential responsibly. The IOD concludes that: “AI adoption and governance must be treated as a strategic boardroom issue, not a purely technical matter. Boards must lead on risk oversight, capacity building, and stakeholder communication to ensure the responsible and effective adoption of AI.”

What guidance does the draft King V provide?

Governing bodies must understand how AI intersects with established information disciplines to preserve trust, ensure compliance and harness new opportunities. The draft King V Report makes express reference to artificial intelligence as part of emerging technologies, which must be governed to safeguard ethical deployment and human oversight. The draft King V report recommends that every AI system “bought, built, used or sold” should adhere to ethical and trustworthy principles, with clear processes for human intervention where risks are material.

King V further recommends that arrangements be in place to safeguard ongoing oversight of AI systems that perform continuous learning and change their behaviour, so that these systems continue to be deployed and used responsibly. The final iteration of King V may differ from the current draft; it is nevertheless evident that, since King IV, AI has become an emerging topic that requires Board engagement.

Has AI made it onto the Board’s Agenda?

The IOD states that Boards are increasingly addressing AI as it becomes more widely used by directors and within organisations. Although adoption is rising, many boards do not yet have comprehensive internal policies, defined strategies, or data governance frameworks, raising important questions about responsible implementation.

Deloitte, in its paper entitled Governance of AI: A critical imperative for today’s boards, states that:

Overseeing a company’s AI adoption and its related opportunities and risks requires a board to first understand the company’s current AI maturity. Questions for the board to consider in this area, especially if they are looking to add it to the board agenda, might include:

    • Do management and the board understand how AI is impacting or will impact the company, whether directly or indirectly?
    • Is there an “inventory” of how the company is currently leveraging—or potentially missing opportunities to leverage—AI?
    • Does the board have a clear vision of how AI initiatives will be overseen across the governance structure?

Including AI on the Board’s agenda shifts governance from a reactive stance to a proactive strategy. We invite you to explore this transformative journey.

Ricardo Wyngaard

The NPO Lawyer | Ricardo Wyngaard Attorneys

Ricardo Wyngaard is passionate about the non-profit sector and has been focusing on non-profit law since 1999. He is a lawyer by profession who has obtained his LLB degree at the University of the Western Cape in South Africa and his LLM degree at the University of Illinois in the USA. He has authored a number of articles and booklets on non-profit law and governance.
