Corporate Adviser

The AI challenge – unpicking risks and opportunities for investors

Eva Cairns, Head of Responsible Investment at Scottish Widows, on why the widespread use of AI highlights the need for robust governance.

by Scottish Widows
July 1, 2025

Sponsored Content

AI is fast becoming an essential part of our daily lives and often in surprising ways. From the road network and the healthcare we receive, to the news we read and food we order, AI is transforming the world we live in.

AI will shape the next era of corporate strategy, economic growth and market transformation. Indeed, in the UK alone 75% of financial firms are already using AI [1], and a further 10% are planning to use it in the future.

But with such rapid and widespread adoption, AI is quickly becoming a core governance and sustainability challenge that presents material risks to companies. This is a core theme in our new report, Governing the Algorithm: Investor Priorities for Responsible AI, in which we analyse the clear opportunities – and responsibilities – for investors to help shape the governance standards needed to manage the emerging risks of AI.

Ultimately, it’s in everyone’s interests to ensure that AI is developed and deployed in a way that supports not only innovation, but inclusion, stability and shared prosperity. 

This is exactly why we incorporated AI and ethics into our stewardship priorities in 2023. Since then, we have been researching and engaging with asset managers to explore how AI oversight can be more effectively embedded into ESG analysis and investment practice.

AI governance priorities

While AI offers efficiency and innovation, it also introduces systems that lack transparency, where the decision-making logic is hard to pinpoint.

These so-called ‘black box’ AI models – where even the model’s developers cannot determine how it makes decisions – raise serious risks related to bias, misinformation, privacy and operational integrity. This could create serious challenges for businesses around issues such as legal exposure, reputational damage and eroded stakeholder trust.

Despite the scale of adoption, Stanford’s 2024 AI Index [2] finds that fewer than 20% of public companies currently disclose details about their AI risk mitigation strategies, and only 10% report on fairness or bias assessments.

This lack of transparency presents a material blind spot for both investors and regulators. In our report, we found that this transparency gap makes it increasingly difficult for investors to understand how AI is being governed, especially in high-impact sectors such as healthcare, finance and retail.

To tackle this, boards must consider AI as a cross-cutting governance concern – much like cybersecurity or climate risk – that requires appropriate oversight and clear risk mitigation processes.

Our framework for investor action

Our report highlights analysis by ISS-Corporate [3] that reveals only 15% of S&P 500 companies disclosed some form of board oversight of AI in their proxy statements. Even fewer, just 1.6%, provide explicit disclosure of full board or committee-level responsibility.

To help address this, we advocate a three-part approach. 

First, we believe AI governance should be integrated into ESG investment analysis, with investors assessing how companies disclose AI use, establish internal safeguards, and assign oversight to executive or board-level leaders. 

Second, stewardship and engagement must focus on how companies govern AI day-to-day. This includes engaging on bias assessments, explainability mechanisms, and ensuring human oversight is embedded in high-impact use cases. Where transparency or risk management is lacking, escalation through proxy voting can be an appropriate tool.

And lastly, investors have a crucial role in setting clear expectations. This means aligning stewardship practices with global standards like the OECD AI Principles and the EU AI Act. By setting and following clear standards, we can help shape an investment environment where innovation is matched by accountability.

Supporting responsible AI 

With long-term investment horizons and systemic influence, pension schemes are uniquely placed to drive stronger governance standards across the economy. As long-term stewards of capital, we are accountable not only for today’s performance, but for the sustainability and resilience of people’s futures.

By encouraging better governance and disclosure, pension funds can carefully guide the widespread adoption of AI and contribute to more transparent, equitable, and future-fit corporate behaviour. This doesn’t mean limiting innovation but ensuring that it is guided in a way that aligns with societal expectations and legal standards, and supports long-term economic stability, inclusion and accountability.

We’re committed to working alongside companies, policymakers and industry peers to ensure this governance evolves in step with innovation. Because contributing towards a resilient, trustworthy economy in the age of AI is not just good governance, but an essential part of our duty to current and future beneficiaries.

 

Read the report: Governing the Algorithm: Investor Priorities for Responsible AI

 

Sources:

  1. Artificial intelligence in UK financial services – 2024 | Bank of England
  2. https://hai.stanford.edu/ai-index/2024-ai-index-report
  3. https://adviser.scottishwidows.co.uk/assets/literature/docs/61448.pdf

 

Read the latest news, expertise and thought leadership from Scottish Widows’ workplace pensions experts – here.

© 2017-2024 Definite Article Media Limited. Design by 71 Media Limited.