Microsoft Teams: Voice tethering [MC1230459]

Message ID: MC1230459

[Introduction]

Voice tethering builds on the recent introduction of Sign Language Mode in Microsoft Teams. When a sign language interpreter voices on behalf of a Deaf or hard‑of‑hearing (D/HH) participant, Teams will now attribute captions, transcripts, and meeting intelligence—such as Copilot notes, summaries, action items, and insights—to the D/HH participant rather than the interpreter. This update ensures accurate representation in meetings, clarifies who is contributing to the conversation, and improves downstream meeting accuracy and accountability.

This message is associated with Microsoft 365 Roadmap ID 553223.

[When this will happen]

  • Targeted Release (Worldwide): We will begin rolling out in mid-March 2026 and expect to complete by late March 2026.
  • General Availability (Worldwide, GCC): We will begin rolling out in early April 2026 and expect to complete by mid-April 2026.

[How this affects your organization]

Who is affected

  • Organizations with meeting participants who are Deaf or hard‑of‑hearing and use sign language interpreters in Teams meetings.
  • Any users who participate in meetings with Sign Language Mode enabled.

What will happen

  • Voice contributions made by interpreters will be attributed to the D/HH participant across:
    • Live captions
    • Meeting transcripts
    • Copilot notes, summaries, action items, and insights
    • Other Teams meeting intelligence
  • Meeting data becomes more accurate because the correct participant is represented.
  • The interpreter's identity is no longer conflated with that of the signer they support.
  • Sign Language Mode is already available to all users; voice tethering enhances it automatically.
  • The feature is on by default when Sign Language Mode is enabled and an interpreter is assigned.
  • No admin controls are required to enable or manage the feature.

[What you can do to prepare]

No action is required.

Optional preparation steps:

  • Inform D/HH users and interpreters that speech attribution in meetings will change.
  • Update internal training or accessibility resources if you document interpreter workflows.
  • Notify helpdesk or support teams that captioning and transcript attribution will appear differently for interpreted meetings.

[Compliance considerations]

No compliance considerations identified. Review as appropriate for your organization.


Source: Microsoft
