AI: the legal impact for charities
AI is the fourth industrial revolution. It is changing how we interact with the world and each other in ways we don’t yet fully understand. Its impact and potential are transformative, in ways that may prove both damaging and beneficial.
Given that the financial and fundraising outlook continues to appear bleak as we move into 2025 – and with the additional burden of employers’ national insurance contributions – charities would be remiss not to consider AI’s potential to save costs and redeploy personnel and time.
On 11 February, Stone King will be running a webinar which (i) demystifies the current legal position on AI and (ii) explains the practical steps which we think charities should take to use AI safely and lawfully.
Topics we will cover include:
What is the UK Government’s view?
Policymakers face a difficult balance: regulating too heavily risks stifling innovation and positive impact, while regulating too lightly risks failing to guard against problematic outcomes such as misinformation and discrimination. The current Government has continued to apply its predecessor’s “pro-innovation” (i.e. light-regulation) approach, which is similar to the US position but differs from that of the EU (which has now enacted the first comprehensive AI legislation, the EU AI Act, with its provisions being phased in).
What is the state of currently applicable law?
In brief, there is currently no AI-specific legislation, regulation or regulator in the UK, and case law is only beginning to emerge. For the time being, the UK AI legal framework can be divided into three areas:
- The UK Government’s 10 “regulatory principles”
- AI-specific guidance and codes of practice from industry regulators such as the Charity Commission or the ICO (which are currently high-level in nature)
- Established practice areas which apply in an AI context just as they do in others; the most commonly considered are:
  - Data protection & privacy
  - Intellectual property
  - Libel
  - Equality laws
What does this mean now for charities?
We think that charities should absolutely take advantage of this light-regulation period to see how AI might help achieve their charitable objects or use resources more effectively. At the same time, they should put recorded and visible measures in place to show that they have considered the applicable law and how to use AI safely and responsibly. There are benefits to be gained but legal risks to navigate, especially where negative outcomes – such as bias, discrimination or infringement of legal rights – amount to non-compliance with trustees’ duties.
How can Stone King help?
We have created a 10-step practical plan for charities to follow, which will enable them to feel confident that their AI use is lawful, safe and responsible. In brief, those steps are:
- Ensuring meaningful human intervention
- Determining the level of knowledge required
- Understanding AI’s limits and flaws
- Assessing whether your use is ethical
- Identifying available and appropriate safeguards
- Choosing the appropriate tool
- Defining the appropriate use cases
- Establishing an organisational permission structure
- Putting in-house governance and oversight in place
- Maintaining a documentary compliance trail
We have also created AI-specific legal compliance tools, such as software impact assessments, trustee training and AI-specific due diligence & contract terms.
If you are available, we’d love to see you on 11 February. Sign up to this free webinar here.