This is a fascinating topic! How does this sound: near real-time live translation in meetings? Yes, it sounds a bit like sci-fi, but it is a reality today. As Microsoft announced at the Ignite 2024 conference, we will be able to enjoy this in Microsoft Teams meetings early this year. No, you don’t have it in your Teams yet, unless you are part of the preview group – but there are some videos out in the wild about this agent. We have seen these videos, so what can we expect from this feature?

Luckily, there are already some videos and materials out there that can be used to gain some insight into the matter.

Microsoft has an Agents in Microsoft 365 page on its Adoption site. That is where we can see a (rather small) video and a description of the Interpreter agent.

The Interpreter agent enables real-time speech-to-speech interpretation in Teams meetings so each participant can speak and listen in the language of their choice in up to nine languages.

Public preview coming in early 2025.

Now, what does that mean? Let’s combine this with the information from the Ignite blog post.

Today we’re introducing the new Interpreter agent – this will allow each user to enable real-time speech-to-speech interpretation in Teams meetings so they can speak and listen in the language of their choice, instantly overcoming language barriers. Users will also be able to have the Interpreter simulate their voice for a more inclusive experience. The Interpreter will start with supporting nine languages including Chinese (Mandarin), English, French, German, Italian, Japanese, Korean, Portuguese (Brazil), and Spanish. This new agent will be coming in public preview in early 2025 and will require a Microsoft 365 Copilot license to access the preview.

So, what’s been revealed so far:

A Microsoft 365 Copilot license is required. As this is a personal agent, I assume that if I have the license, I will be able to use the Interpreter agent.

Coming to public preview in early 2025. That is... now! Unfortunately, we can’t see it yet.

It will support nine languages: Chinese (Mandarin), English, French, German, Italian, Japanese, Korean, Portuguese (Brazil), and Spanish.

Users will be able to have the interpreter simulate the speaker’s voice.

The video on the Microsoft page is a bit small, but luckily someone has uploaded the same video to YouTube.

So, let’s dissect it and take a look at the feature in more detail.

First, the feature is activated via the three-dots (…) menu in a Teams meeting: under Language and speech, it is possible to Turn on interpreter.

After that, you choose the language you want to listen to the meeting in. This is what makes the interpreter personal – each participant may want to listen to what others say in their own native tongue.

You get a dropdown to select the interpreter language.

You can choose to use the original audio (original language) or one of those nine languages.

After that, you can balance how much of the original language you want to hear alongside the interpretation.
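Conceptually, that balance control is just a crossfade between two audio streams. Here is a minimal sketch of the idea in Python – purely my own illustration of the concept, not how Teams actually implements it:

def mix_audio(original, interpreted, balance):
    """Blend two audio streams sample by sample.

    balance = 0.0 -> only the original speaker is heard,
    balance = 1.0 -> only the interpreter is heard.
    (Purely illustrative - not Microsoft's implementation.)
    """
    if not 0.0 <= balance <= 1.0:
        raise ValueError("balance must be between 0.0 and 1.0")
    return [(1.0 - balance) * o + balance * i
            for o, i in zip(original, interpreted)]

# Example: original audio quietly in the background, interpreter on top.
original = [0.8, 0.6, -0.4, 0.2]      # pretend PCM samples
interpreted = [0.1, -0.3, 0.5, 0.7]
print(mix_audio(original, interpreted, balance=0.8))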

In the video, we can hear the person speak the other language first, and after a while the interpreter kicks in.

In that video, I didn’t notice anything about the option to simulate the user’s voice (“Users will also be able to have the Interpreter simulate their voice”).

There is also another video about this, from Marco Casalaina, who is Vice President of Products, Azure AI at Microsoft.

What I noticed is that in the second video there is no adjustment for the balance between the original and interpreted audio.

Looking at both videos, we can see the big potential of this feature. It isn’t truly real-time, not yet. You also need to learn to speak with pauses, so the interpreter can do its job – it won’t keep up with the fast pace some meetings have. Perhaps in the future we will be closer to Star Trek’s universal translator, with true real-time translation, but this is very impressive already.
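That pause-and-wait rhythm follows naturally from how this kind of interpretation has to work: speech is buffered until an utterance ends, and only then can it be translated and spoken out. A toy model in Python makes the built-in delay obvious – this is my assumption about the general pipeline shape, not Microsoft’s published architecture:

def interpret_stream(words, pause_token="<pause>"):
    """Toy model of utterance-by-utterance interpretation.

    Buffers incoming words until the speaker pauses, then emits a
    'translated' utterance. The listener always hears the interpretation
    at least one pause behind the speaker - hence the delay in the demos.
    (Assumed pipeline for illustration; Microsoft has not published
    the actual architecture.)
    """
    buffer = []
    for word in words:
        if word == pause_token:
            if buffer:
                yield "translate+synthesize: " + " ".join(buffer)
                buffer = []
        else:
            buffer.append(word)
    if buffer:  # flush whatever was said after the last pause
        yield "translate+synthesize: " + " ".join(buffer)

speech = ["hyvää", "huomenta", "<pause>", "mitä", "kuuluu", "<pause>"]
for utterance in interpret_stream(speech):
    print(utterance)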

There is an another multi-lingual meeting feature coming to Teams during April: Teams Meeting transcription support for multiple spoken languages. This is scheduled for April 2025 and will work only on Desktop and Mac.

Meeting participants can now set their own spoken language during meetings, allowing each participant to communicate in their preferred language. Previously, Teams meetings required a common spoken language to generate a transcript. With this update, a meeting transcript can be accurately generated reflecting each participant’s language of choice.

The key is here: meeting participants can now set their own spoken language during meetings. Instead of a single language covering everyone in the meeting, everybody can set their own. Transcription will capture what everyone said, even when each person speaks a different language, and the interpreter helps everyone participate in multilingual, multicultural meetings!
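To see what that means for the transcript itself, here is a hypothetical data model in Python – the field names and structure are my own illustration, not Microsoft’s schema – where every transcript entry carries the speaker’s own language:

from dataclasses import dataclass

@dataclass
class TranscriptEntry:
    """One utterance in a multilingual meeting transcript.

    Hypothetical structure for illustration only -
    these field names are mine, not Microsoft's schema.
    """
    speaker: str
    language: str   # the language this participant set for themselves
    text: str

transcript = [
    TranscriptEntry("Vesa", "fi-FI", "Hyvää huomenta kaikille!"),
    TranscriptEntry("Marco", "it-IT", "Buongiorno a tutti!"),
    TranscriptEntry("Dana", "en-US", "Good morning, everyone!"),
]

# Each entry is transcribed in the speaker's own language, so no single
# meeting-wide language needs to be picked up front.
for entry in transcript:
    print(f"[{entry.language}] {entry.speaker}: {entry.text}")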

There is also one more, quite recent, video on LinkedIn where the interpreter is demonstrated. It certainly looks like we are getting closer and closer to experiencing this ourselves.

What is already there is the calling policy parameter AIInterpreter, which can be used to control whether the interpreter is available to people. By default, the AI Interpreter is enabled. Another parameter, -VoiceSimulationInInterpreter, states whether voice simulation can be used. By default, voice simulation is disabled.
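To make the two toggles and their defaults concrete, here is a small Python sketch of the combinations an admin can end up with – a toy model only; the real settings live in the Teams calling policy and are managed with Teams PowerShell, not with code like this:

def interpreter_experience(ai_interpreter="Enabled",
                           voice_simulation="Disabled"):
    """Toy model of the two policy toggles and their defaults.

    Defaults mirror what has been documented so far: AIInterpreter is
    enabled and VoiceSimulationInInterpreter is disabled out of the box.
    (Illustration only - real configuration happens in the Teams
    calling policy, not through this function.)
    """
    if ai_interpreter != "Enabled":
        return "No interpreter available in meetings"
    if voice_simulation == "Enabled":
        return "Interpreter available, may simulate the speaker's voice"
    return "Interpreter available with a standard synthetic voice"

print(interpreter_experience())                            # the defaults
print(interpreter_experience(voice_simulation="Enabled"))
print(interpreter_experience(ai_interpreter="Disabled"))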

I am definitely waiting for this feature to arrive. It should boost collaboration in meetings where not everyone is that comfortable in English. The interpreting delay is longer than I hoped, but these examples are clearly from an early version, and there are limits to what is realistically possible today. Near real-time will do just fine when you need it; just learn the proper pace when speaking and give others a break to hear the translation. Multilingual meetings are already a thing. While live translated captions and transcriptions are great, these two features (the interpreter and multiple spoken languages in a meeting) will be a long step forward!

Multiple spoken languages will arrive (per the roadmap) during April 2025 and will work only on desktop Teams. The interpreter will probably also work only on the desktop client, like in-meeting AI notes do today.

Once I learn more and this feature hits public preview, I will take another look at the Interpreter agent.

Published by Vesa Nopanen

Vesa “Vesku” Nopanen, Principal Consultant and Microsoft MVP (M365 and AI Platform) working on Future Work at Sulava.

I work, blog and speak about Future Work: AI, Microsoft 365, Copilot, Microsoft Mesh, Metaverse, and other services & platforms in the cloud, connecting the digital and physical worlds and people together.

I have about 30 years of experience in the IT business across multiple industries, domains, and roles.


