HOW TO Reduce Hallucinations in Microsoft 365 Copilot
Reducing AI hallucinations is one of the most practical skills you can master to make Copilot actually work for your business. It is frustrating when the tool provides confident but incorrect information, but most of these errors stem from how the prompt is structured.
In this video, Jonathan Pollinger breaks down the specific techniques you can use to keep Copilot grounded in facts. He covers how to provide better context, how to set clear boundaries for the AI, and the exact phrases that force the system to admit when it does not know an answer.
#ai #copilot #hallucination
⌚ TIMESTAMPS:
00:00 – AI Hallucinations Explained
00:59 – Prompt for Reducing Hallucinations
02:50 – Creating Copilot Custom Instructions
03:44 – Outro
🎓 Explore our AI Courses:
https://www.nilc.co.uk/category/business-skills/artificial-intelligence/
➡️ Subscribe for weekly tutorials:
https://www.youtube.com/channel/UCh337MvWCeVjBJyhXIsxt5A
🚩 Connect with us:
Facebook: https://www.facebook.com/nilctraining
LinkedIn: https://www.linkedin.com/company/nilctraining/