
What the Skype translator can't do for Microsoft global partners

The Skype translator demoed at Microsoft's Worldwide Partner Conference promises to break new ground in multilingual communication, but Microsoft global partners still have trouble understanding the keynote speech.

Microsoft demoed its upcoming Skype translator at its Worldwide Partner Conference in Washington, D.C., this week -- software that will enable real-time audio translation from one language into another. In the demo, an English-speaking Microsoft executive had a conversation on Skype with a woman in Germany, each speaking their native language. It was perhaps the highlight of the technology the company demoed during its keynote addresses this week.

For many Microsoft global partners -- from 135 countries outside of the U.S. -- at the event, the technology might have seemed frustratingly out of reach. I've been speaking English since I uttered my first words so many years ago, but at the keynote address on Monday, I had a hard time keeping up with the various presentations, between taking notes, tweeting and trying to absorb it all. After the keynote, I asked one attendee from Sao Paulo, Brazil -- whose English was not very difficult for me to understand -- how easy it had been for him to follow the keynote speakers and how much of what they'd said he'd understood. He gave the universal sign for "very little": index and thumb positioned parallel to one another and separated by less than an inch.

I was struck by the implications of his response. The overwhelming majority of attendees came from outside the United States -- only 3,000 of the 16,000 attendees were from the U.S., according to one estimate I'd heard. I don't know how many of them came from native-English-speaking countries, but having spent most of my time this week on the show floor, in the press room and at after-hours events, I would bet that native-English speakers were in the minority. Many of the partners from countries where English isn't an official language did indeed have some skill with the English language, but that doesn't mean they had an easy time following the keynote presentations, where they could not ask the speaker to slow down or repeat themselves or give nonverbal cues to indicate that they didn't understand. Those who took the stage at the Verizon Center clearly took great care to speak very slowly and deliberately, and the big screens above their heads displayed very accurate English subtitles, yet my impression is that communication was still lacking.

This communication difficulty manifested in different ways. An MSP CEO from New York told me that in an interactive session -- it might have been the Partner Pitch Pit -- one partner made a zealous speech in very broken English. "We could barely understand him, but he tried so hard that he got a big round of applause," the MSP told me.

(And then there was the interaction I had with a table full of Germans, one of whom told me a joke, in English. It was a very funny joke, but I had such a hard time understanding him that he had to repeat the punch line four times. Now, if I could just remember the punch line …)

It's true that no critical news is likely to come out of these partner events. And the networking opportunity is just as important as the educational aspects of the conference. Yet there is valuable information that gets shared, to greater or lesser degrees, in the keynotes, breakouts and interactive discussions.

In the face-to-face, one-on-one conversations that are typical of networking events, there's room for one speaker to give clues to the other that indicate they're having trouble understanding. The first speaker might take a different tack, rephrase the statement or motion with their hands to get the point across. A keynote address or breakout presentation, however, doesn't allow that type of communication. So not being able to understand much of it must detract from the value of the conference.

I'd asked the attendee from Sao Paulo why, if he could understand so little of what was being said, he and his colleagues had made the 10-hour flight to D.C. "To show Microsoft that we are committed to them," he replied. If you multiply that out to all the other non-native speakers who went to WPC yet had trouble understanding the speeches and discussions, that's a lot of commitment. That so many partners came from all around the world is a testament to Microsoft's global importance -- as it is to other vendors with big contingents from outside the U.S. at their global partner conferences.

For non-U.S.-based partners, the Microsoft global event is valuable enough -- in its opportunities to network with Microsoft, other vendors and their peers, as well as to show loyalty to the mothership -- that they'll spend the considerable time and money to attend. Yet once they're there, the sharing of information is still contingent upon the most basic form of communication: face-to-face human speech.

The Skype translator won't solve the problem of face-to-face communications between two people who don't speak the same language, so the novice English speakers will have the same issues at WPC 2015 in Orlando, Florida -- unless Microsoft decides to hire a whole bunch of UN-style interpreters and distribute audio equipment to the attendees. Will we ever solve the problem through computing technology?
