"We created 14 different guides to cover common knowledge and skill gaps. But then we had a brand new problem – how on earth do we get people to engage with content over 14 different guides?"
The challenge
The Royal College of Nursing (RCN) wanted to ensure all officers felt confident having difficult conversations with members about discrimination cases.
To help officers build their knowledge and confidence, the RCN developed 14 comprehensive guides. These covered everything from technical legal information to advice on handling emotional conversations.
However, with around 180 officers across the organisation – ranging from those with 20+ years' experience to brand new starters – making this guidance accessible and usable to all was a challenge.
Officers were unlikely to read through multiple lengthy documents when they needed quick answers, and searching through them the traditional way wasn't practical, particularly with complex legislation like the Equality Act. People simply gave up trying to navigate such large documents.
The solution
The team decided to build an AI chatbot using Microsoft Copilot Studio that could search across all the discrimination guidance and provide quick, relevant answers to officers' questions.
How it works
The chatbot, officially named the "Discrimination Conversation Assistant," connects to a curated SharePoint site containing:
All 14 internally written and approved guides
Trusted external resources, including a full copy of the Equality Act and the Equality and Human Rights Commission Code of Practice
Crucially, the chatbot is not connected to the internet – it only searches the approved resources on the SharePoint site. This means it cannot pull in information from sources the organisation hasn't vetted.
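To make this concrete, here is a simplified, hypothetical sketch of how such grounding constraints can be expressed in the Microsoft 365 declarative agent manifest format. The RCN built their agent through the Copilot Studio interface rather than by writing a manifest, so the site URL, names and wording below are illustrative placeholders, not their actual configuration.

```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/copilot/declarative-agent/v1.0/schema.json",
  "version": "v1.0",
  "name": "Discrimination Conversation Assistant",
  "description": "Answers officers' questions from approved discrimination guidance only.",
  "instructions": "Answer only from the knowledge sources provided. If the guides do not cover a question, say so rather than guessing.",
  "capabilities": [
    {
      "name": "OneDriveAndSharePoint",
      "items_by_url": [
        { "url": "https://example.sharepoint.com/sites/DiscriminationGuidance" }
      ]
    }
  ]
}
```

What is absent matters as much as what is present: with no WebSearch capability in the list, the agent has nothing to draw on beyond the curated SharePoint site.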
"It literally brings up the content. The content is actually our content. And that content was written with loads of people in the organisation based on that original piece of work with the users."
Officers can access the chatbot in two ways: through a dedicated landing page for discrimination queries, or directly through the Copilot icon on their desktop by selecting the agent.
More than 500 union staff have now been through one of the Unions 21 AI training courses.
What Kevin and the team have done with AI is an example of what our AI training is all about: knowing where and how AI can help union staff focus on what matters most – working with members and workers to build collective strength and improve conditions for all.
Think AI could support you in your work?
A few spaces remain on our AI Fluency for Union Staff course, with versions tailored to industrial officers and to communications officers. The next course starts on 26 January 2026.
Key benefits
Faster access to guidance: Officers can get answers in seconds rather than spending time searching through multiple lengthy documents
Handles both technical and human questions: The chatbot can answer detailed legal questions but also provide guidance on difficult conversations, such as "How do I tell my member that I don't think they've got a case when they think they do?"
Learning tool: Officers can use it to test their own knowledge by asking it questions
Psychological safety: Officers can ask questions they might feel uncomfortable raising with colleagues, without any tracking of who asks what
Consistent, approved answers: All responses come from vetted, signed-off guidance rather than potentially unreliable external sources
Common questions
How difficult was it to implement?
The actual construction was relatively straightforward, but testing and refining the responses took considerable time. Getting IT environments set up for deployment also added to the timeline.
What skills were needed?
The core team was just three people: someone to lead the project and build the bot (learning Copilot Studio along the way), IT support for technical deployment, and information governance oversight.
What were the main challenges?
Getting the responses right required significant testing. Early versions were either too detailed or too creative. The team had to adjust the "creativity" settings and add specific instructions about response length and format.
"Sometimes responses were really brilliant, but massively detailed, and that isn't how officers were likely to engage with them. So then we were having to tell it specific prompts like ‘give a short answer, a couple of sentences, always direct to the top two guides’."
The biggest ongoing challenge is adoption. Despite positive reactions when officers try the chatbot in training sessions, actual usage remains lower than hoped.
How do you prevent “hallucinations”?
The team took several steps to avoid a well-known weakness of generative AI – that it can make things up. They:
Restricted the knowledge base to curated, approved resources only
Turned off internet access completely
Adjusted the "creativity" setting to a moderate level
Added explicit instructions not to make things up (sketched below)
Included a disclaimer reminding users that AI can make mistakes
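The first three measures are configuration choices of the kind sketched earlier; the last two can be rendered as instruction text. Again an assumed, illustrative wording rather than the team's actual prompt:

```json
{
  "instructions": "Use only the approved guides and legislation in the knowledge sources. Never invent legislation, case law or guidance. If you are not confident the sources answer a question, say so and suggest which guide to consult. End every answer with a reminder that AI can make mistakes and the source guide should be checked."
}
```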
"I think the reason it rarely hallucinates is because it doesn't have a vast data set."
What about data protection?
The choice of Microsoft Copilot Studio was deliberate. Because the RCN already runs on Microsoft, using Copilot meant all data stayed within the existing data environment. The chatbot cannot access the wider internet, and the team turned off the option to collect user queries in the back end, addressing concerns about psychological safety.
Getting started
How would you replicate the RCN's approach with your own guidance? Here is the process they followed.
First steps
Start with your content, not the technology – having well-organised, approved guidance is essential
Get information governance and IT involved early – if they're going to say no, you'll save time by finding out upfront
Keep the team small – a small team can move quickly and iterate
Learn by doing – build a prototype without organisational data first to understand how it works
Common pitfalls to avoid
Don't point the bot at your entire SharePoint as the results will be chaotic
Don't expect officers to adopt it immediately, even when impressed in demonstrations
Don't underestimate how much testing will be needed to get responses right
Resources needed
Microsoft 365 with Copilot Studio access
A curated set of approved guidance documents
IT support for deployment through test and production environments
Time for iterative testing and refinement
Looking ahead
The team plans to continue promoting the chatbot and monitoring adoption. The broader recommendation being developed is that all officers and admin staff should undertake basic AI fundamentals training to help them feel confident engaging with tools like this.
There's recognition that the chatbot sits within a wider organisational need to address AI literacy. Many staff have access to Copilot but have never clicked on it because they don't understand what it is or don't feel they have permission to use it.
AI Transparency statement
This case study was generated using AI based on a recorded conversation between Nick Scott (Centre for Responsible Union AI) and Kevin Michael (RCN) in December 2025.


