MTS Link video meetings get real-time subtitles

The MTS Link platform for business communications, training and collaboration has launched a real-time speech-to-text feature. The technology is built on a large language model (LLM) from MTS AI.

Generated by the Dall-E neural network

Artificial intelligence converts participants' speech into text in real time and displays it in the online call window. The feature works like subtitles: the words on the screen are synchronized with the audio track.
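The article does not describe MTS Link's internal architecture, but the general principle of streaming captioning can be sketched as follows: short audio chunks are transcribed as they arrive and shown with their timestamps so the text stays aligned with the audio. The sketch below is a simplified illustration under that assumption; `microphone_stream` and `transcribe_chunk` are hypothetical stand-ins, not the actual MTS Link or MTS AI API.

```python
import time
from dataclasses import dataclass
from typing import Iterator


@dataclass
class AudioChunk:
    start: float     # chunk start time within the call, in seconds
    duration: float  # chunk length, in seconds
    samples: bytes   # raw PCM audio (simulated here)


def microphone_stream() -> Iterator[AudioChunk]:
    """Stand-in for a real capture loop: yields short audio chunks as they arrive."""
    for i in range(5):
        yield AudioChunk(start=float(i), duration=1.0, samples=b"\x00" * 16000)
        time.sleep(0.2)  # simulate real-time arrival of audio


def transcribe_chunk(chunk: AudioChunk) -> str:
    """Hypothetical ASR call; in a real system this would invoke a speech-to-text model."""
    return f"[recognized text for {chunk.start:.0f}-{chunk.start + chunk.duration:.0f} s]"


def stream_subtitles() -> None:
    """Core subtitle loop: transcribe each chunk and print it with its timestamp,
    keeping the on-screen text synchronized with the audio track."""
    for chunk in microphone_stream():
        text = transcribe_chunk(chunk)
        print(f"{chunk.start:6.1f}s  {text}")


if __name__ == "__main__":
    stream_subtitles()
```

In a production system the transcription step would run on a streaming ASR or LLM-based model and push partial results to the call UI over a websocket rather than printing to the console; the loop structure, however, stays the same.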

Currently, speech recognition is available only in Russian. The company plans to add subtitles in English and other languages, as well as streaming translation.

Dmitry Kryukov, Head of Machine Learning at MTS Link, said:

Streaming transcription is a technology that has long been part of the everyday digital environment: we are used to subtitles in videos and on social networks. But it plays a special role in work communications, helping those who have difficulty following spoken speech. Our task is to create tools that make online communication accessible to everyone. Artificial intelligence makes this simpler and faster to deliver: the technology provides highly accurate transcription and helps participants perceive speech without distortion. As a result, they can focus on what matters most — the content of the conversation.

AI subtitles are available to MTS Link corporate users upon request to their personal manager and require an additional payment on top of the base tariff.

Sources
MTS Link
