
Microsoft 365 Copilot Chat: Explainable AI with deep and inline citations for connectors

Microsoft Copilot introduces an explainable AI experience that shows how responses from connected data sources are generated by using inline and deep citations.
Details:
What changed:
Copilot now provides inline citations for connector-based results. Hovering over a citation displays a glance card that summarizes the cited entity, and selecting the citation opens a deep citation view with more detail.
Why:
Users want clearer insight into how Copilot-generated results are created, especially when responses draw on external or connected data sources. This update improves transparency by surfacing the origin of information through richer citation details.
Try this:
Hover over an inline citation in a Copilot response to view a summary of the referenced entity.
Select the citation to open a detailed view that explains how the result was generated from the connected source.
Why this matters:
Business impact:
Improves trust and confidence in Copilot responses by making data sources and reasoning more transparent for teams using connected systems.
Personal impact:
Helps individuals better understand and validate Copilot results by clearly showing where information comes from and how it was used.
Brand impact:
Microsoft Copilot now shows exactly where its answers come from by displaying clickable citations that reveal which connected data sources were used to generate each response. This transparency lets users verify the accuracy of recommendations before acting on them, reducing reliance on less trustworthy sources. For brands and products, the change makes it harder for recommendations based on incomplete or biased data to go unquestioned: companies will need to ensure their data is accurate and well integrated into enterprise systems to maintain positive visibility in Copilot results.
Original source ↗ Crawled: 11 Apr 2026 16:11