The Guardian reports that the mistranslation went viral after it was spotted by Saudi social media users. For an unknown period of time, Microsoft’s Bing Translator service, a rival to the likes of Google Translate, was found to be translating “Daesh,” the Arabic name for Islamic State, into “Saudi Arabia.”
Outraged by the mistake, Saudi officials and social media users called for a nationwide boycott of Microsoft’s products. Microsoft said it had apologised to Saudi officials and corrected the translation within hours of being informed. It has put measures in place to prevent the problem recurring.
The cause of the error remains uncertain. However, Microsoft’s vice president for Saudi Arabia, Dr. Mamdouh Najjar, told the Huffington Post that it was probably down to Bing Translator’s use of crowdsourced translations. To improve the accuracy of its service, Microsoft allows users to suggest translations for words and phrases in their native language. If the same suggestion is received from over 1,000 people, Bing adds credence to it and it becomes the default.
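The reported mechanism — user suggestions that are automatically promoted to the default once they cross a vote threshold — can be sketched as follows. This is a hypothetical illustration, not Microsoft's actual implementation; the class and method names are invented, and only the 1,000-suggestion threshold comes from the article.

```python
from collections import Counter

# Figure reported in the article; everything else here is an assumption.
SUGGESTION_THRESHOLD = 1000

class TranslationStore:
    """Minimal sketch of a crowdsourced translation store."""

    def __init__(self):
        self.suggestions = {}   # phrase -> Counter of translation -> votes
        self.defaults = {}      # phrase -> current default translation

    def suggest(self, phrase, translation):
        votes = self.suggestions.setdefault(phrase, Counter())
        votes[translation] += 1
        # Once a suggestion crosses the threshold it is promoted to the
        # default automatically -- the behaviour that left the system
        # open to manipulation.
        if votes[translation] >= SUGGESTION_THRESHOLD:
            self.defaults[phrase] = translation

    def translate(self, phrase, fallback):
        return self.defaults.get(phrase, fallback)

store = TranslationStore()
for _ in range(1000):                    # a coordinated flood of suggestions
    store.suggest("daesh", "Saudi Arabia")
print(store.translate("daesh", "ISIS"))  # prints "Saudi Arabia"
```

The key weakness is that promotion is fully automatic: once the vote count is reached, no human ever sees the candidate translation before it goes live.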
Najjar speculated that the service had been manipulated through user suggestions. It appears that Bing was flooded with suggestions that “Daesh” should be translated as “Saudi Arabia.” The system developed a bias towards that translation and began serving it in place of the correct one. Without any human intervention, the suggestion became the default.
“As an employee of Microsoft, I apologise personally to the great Saudi people and this country, dear to all our hearts, for this unintentional mistake,” Najjar said.
The company has not detailed the mechanisms implemented to prevent such an error occurring again. It may now require human intervention when a translation gains enough suggestions to become the default, forcing an operator to make the final decision on whether it is shown to all Bing Translator users. The Daesh translation is now fixed.
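The safeguard speculated about above — routing threshold-crossing suggestions to a human reviewer rather than promoting them automatically — might look something like this. All names are invented for illustration; only the 1,000-suggestion threshold is from the article.

```python
# Assumed threshold from the article; the review-queue design is hypothetical.
SUGGESTION_THRESHOLD = 1000

class ModeratedStore:
    """Sketch of a store where an operator gates every promotion."""

    def __init__(self):
        self.votes = {}      # (phrase, translation) -> vote count
        self.pending = []    # candidates awaiting human review
        self.defaults = {}   # phrase -> approved default translation

    def suggest(self, phrase, translation):
        key = (phrase, translation)
        self.votes[key] = self.votes.get(key, 0) + 1
        if self.votes[key] == SUGGESTION_THRESHOLD:
            self.pending.append(key)   # queue for review, do NOT auto-promote

    def review(self, phrase, translation, approved):
        # An operator makes the final call before a candidate goes live.
        key = (phrase, translation)
        if key in self.pending:
            self.pending.remove(key)
            if approved:
                self.defaults[phrase] = translation

store = ModeratedStore()
for _ in range(1000):
    store.suggest("daesh", "Saudi Arabia")
print(store.defaults.get("daesh"))    # prints None -- awaiting review
store.review("daesh", "Saudi Arabia", approved=False)
print(store.defaults.get("daesh"))    # prints None -- rejected by operator
```

The design trades responsiveness for safety: crowd input still surfaces candidates, but a coordinated flood can no longer change what users see without a person signing off.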
The issues raised by the false translation could be particularly impactful on webpages. Microsoft recently launched the Windows 10 Anniversary Update, bringing extensions to its Edge web browser for the first time. One of the available extensions is Bing Translator, which uses the Bing Translator service to convert the text of foreign webpages into a user’s native language. Had the error persisted, it could have resulted in “Daesh” being falsely translated as “Saudi Arabia” when users read foreign news sites.
The error highlights the problems with crowdsourcing components of key services. Allowing users to contribute suggestions is beneficial the majority of the time. It takes only one coordinated campaign of false suggestions for Bing to develop a bias, however. Left unchecked, that bias can have severe consequences, as this case shows.
It’s the second time this year that Microsoft has felt the effects of users abusing its services. In March, it was forced to pull its Tay chatbot offline after less than a day. The bot was designed to analyse how people communicate online and generate human-like replies. Within 16 hours, however, trolls had turned Tay into a Hitler-supporting racist endorsing genocide and white supremacy. An embarrassed Microsoft pulled the “casual and playful” Tay offline. It has not yet returned.
