HOW TO Reduce Hallucinations in Microsoft 365 Copilot

NILC Training · Beginner · 💻 AI-Assisted Coding · 3 months ago
Reducing AI hallucinations is one of the most practical skills you can master to make Copilot actually work for your business. It is frustrating when the tool provides confident but incorrect information, but most of these errors stem from how the prompt is structured. In this video, Jonathan Pollinger breaks down the specific techniques you can use to keep Copilot grounded in facts. We look at how to provide better context, how to set clear boundaries for the AI, and the exact phrases that force the system to admit when it does not know an answer.

#ai #copilot #hallucination

⌚ TIMESTAMPS:
00:00 – AI Hallucinations Explained
00:59 – Prompt for Reducing Hallucinations
02:50 – Creating Copilot Custom Instructions
03:44 – Outro

🎓 Explore our AI Courses: https://www.nilc.co.uk/category/business-skills/artificial-intelligence/
➡️ Subscribe for weekly tutorials: https://www.youtube.com/channel/UCh337MvWCeVjBJyhXIsxt5A
🚩 Connect with us:
Facebook: https://www.facebook.com/nilctraining
LinkedIn: https://www.linkedin.com/company/nilctraining/
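The description mentions exact phrases that push Copilot to admit uncertainty, but does not quote them; a custom instruction along these lines is a common example of the technique (hypothetical wording, not taken from the video):

```text
Base your answers only on the documents and data I provide.
If the information needed is not in those sources, reply
"I don't know" rather than guessing. For each claim, cite
the specific file or passage that supports it.
```

Instructions like this combine the three ideas the video covers: supplying context (the provided documents), setting boundaries (answer only from those sources), and giving the model an explicit escape hatch so it is not pressured into inventing an answer.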
