Governance and leadership experts Professors Nada and Andrew Kakabadse argue that while artificial intelligence (AI) is transforming the corporate landscape, boards must focus on their governance responsibilities rather than acquiring AI-specific expertise.
The AI era: Opportunities and risks
As we move into 2025, the disruptive potential of generative AI is evident. Tools like ChatGPT can enhance creativity and efficiency, but they also pose risks such as spreading misinformation or exposing organisations to security breaches.
For instance, a March 2023 bug in OpenAI’s ChatGPT exposed some users’ chat histories and payment details, prompting Italy’s data protection authority to temporarily ban the service. Incidents like this underscore the need for robust oversight.
AI adoption introduces unprecedented governance complexity, requiring boards to integrate AI thoughtfully into organisational strategies while managing associated risks.
Four key challenges for boards
The Kakabadses highlight that much of the existing AI guidance for boards is too vague to be effective. To address this, they say, boards should consider:
- Competitive advantage: How can AI technologies deliver clear benefits, differentiate the organisation, and justify associated costs?
- Risk: What vulnerabilities might emerge from adopting or ignoring AI advancements?
- Reputation: What are the potential impacts—positive or negative—on the organisation’s image?
- Value: How does AI adoption align with strategic objectives, balancing competitive advantage, risk, and reputation?
The pathway forward: Strategic governance
Boards must pursue a disciplined, evidence-based approach to AI. The professors propose a four-step plan to ensure strategic clarity:
- Define a clear, accountable action plan for AI integration
- Embrace the operational improvements AI offers, including efficiency, cost savings, and predictive accuracy
- Establish sound controls to mitigate risks, particularly concerning misinformation and data security
- Engage with both C-suite executives and general management to align strategic vision with operational realities
Insights from management
General managers, with their localised knowledge, play a vital role in implementing board-approved strategies. Listening to their insights helps avoid strategic distortion and ensures actionable, market-aligned decisions.
Oversight, not expertise
The Kakabadses further stress that board directors don’t need to be experts in AI, ESG, or sustainability. Instead, they should provide meaningful oversight, guided by expert input, and apply the disciplines of competitive advantage, risk, reputation, and value.
The broader implications
The global economic impact of AI continues to grow. In the UK, AI-specific businesses contributed £9.1 billion to the economy, while cybercrime, fuelled in part by AI vulnerabilities, is projected to cost the global economy $10.5 trillion annually by 2025.
To navigate this evolving landscape, boards must exercise rigorous governance, using technology to strengthen organisational resilience without becoming over-reliant on technology-driven oversight.
Professors Nada and Andrew Kakabadse conclude: “Boards don’t need a new generation of AI experts. They need directors who effectively exercise their governance duties, informed by expert advice and disciplined scrutiny, to protect and enhance the organizations they serve.”