Date updated: Wednesday 28th May 2025

AI is becoming ever more present in our working lives, and the opportunities it presents are regularly discussed by everyone from the Government to the press. However, in an employment context, charities need to proceed with care.

Artificial intelligence is a catch-all term that covers everything from automated processes to generative tools such as ChatGPT.

Automated processes

Automated processes offer the opportunity to increase efficiency and reduce costs, an important consideration for charities in the current climate. However, automation can affect employment law considerations in a variety of ways.

Automated recruitment and staff management processes, such as using AI to manage recruitment exercises or performance reviews, are often discussed. However, care needs to be taken with automation.

Firstly, although some argue that automation removes unconscious bias, there is evidence that AI systems adopt human biases and, consequently, discrimination may still occur. It is important to remember that, for direct discrimination, it does not matter how the discrimination occurs. Equally, for indirect discrimination, it may be difficult to establish the defence of “proportionate means of achieving a legitimate aim” if no human thought has been involved in the process.

Similar discrimination risks arise where performance processes are automated. For example, if AI identifies patterns of lateness or absence that lead to a disciplinary process, the employee may not be afforded the opportunity to explain “why” and, in turn, those with protected characteristics may be disproportionately affected. Automation may also make unfair dismissal claims harder to defend: if the employee has not been given the opportunity to explain “why” before disciplinary proceedings are instigated, the investigation stage of a fair process may effectively be missed.

ChatGPT

ChatGPT and equivalent systems enable users to create content and undertake research. They can be a useful resource for employees, enabling more efficient working and reducing workload and, in turn, pressure and workplace stress. However, real care needs to be taken in relying on ChatGPT output, as it is not always accurate or factually correct. Employees’ use of ChatGPT therefore creates a potential liability for charities and, in turn, may raise conduct or capability questions for the employee. Capability issues may also arise where an employee relies on ChatGPT to mask poor performance, a lack of training, or a lack of knowledge; its use may therefore make it more difficult for employers to identify capability issues at an early stage. It may also be difficult to evidence an employee’s use of AI, making disciplinary and dismissal processes more challenging.

Managing the risks

We recommend that charities put clear policies in place, communicated clearly to staff, setting out how AI may and may not be used in the workplace. Where AI is used, it may be appropriate to train staff in its proper use and to ensure they understand and can manage the risks.

In addition to the employment considerations, there are also important data protection, confidentiality, and intellectual property risks that need to be appropriately understood and managed. 

For more information on the opportunities and threats posed by AI, listen to Lucas Atkin, Senior Associate in Stone King’s Information Law Team, and Zoe Dipple, a Paralegal in the same team, in this informative audio bite on LinkedIn.

The Employment Team at Stone King are involved in wider ongoing work educating organisations about AI. If you would like advice on such matters, please do get in touch.