Microsoft has recently launched a new feature in Teams: Sign Language Mode. The feature is aimed at deaf and hard-of-hearing (D/HH) users and their interpreters, with the goal of making meetings more accessible and inclusive. It was developed in collaboration with Microsoft’s internal D/HH community.
What is Sign Language Mode?
With Sign Language Mode, participants can indicate whether they are sign language users or interpreters. This makes interpreters easier to locate and highlight during meetings, ensuring that sign language remains clearly visible on screen. Video feeds of sign language users are also delivered in higher quality, enabling more natural communication. In addition, an automatic recognition function gives a person who is signing the same visibility and status as someone speaking via audio.
How it works in practice
The feature lets participants see and follow sign language interpretation during a meeting without having to search for the right person in the participant list. A new detection model recognizes when someone is signing and highlights that person on the meeting screen, making the interpretation easier to follow for sign language users and other participants alike.
More information and future development
Microsoft plans to develop the feature further in the coming months, with a continued focus on improving visibility and participation for sign language users and interpreters in Teams. If you want to learn how to activate and use Sign Language Mode, or read about Microsoft’s work on inclusive design in AI and sign language, you can find more information via the links below.
Read more about Sign Language Mode in Microsoft Teams
Learn about the background to the development of Sign Language Mode