Microsoft’s AI-powered translation service now supports over 100 languages and dialects

Microsoft Corp. today added support for 12 languages and dialects to the Translator service in its Azure public cloud, which uses artificial intelligence to automatically translate text.

The languages and dialects added by Microsoft are Bashkir, Dhivehi, Georgian, Kyrgyz, Macedonian, Mongolian (Cyrillic), Mongolian (Traditional), Tatar, Tibetan, Turkmen, Uyghur and Uzbek (Latin). The update represents a major milestone for Translator, bringing the total number of languages and dialects supported by the service to more than 100, up from about 80 at the beginning of the year.

Microsoft allows users to access Translator’s machine translation features in several ways. The service is available to businesses through an application programming interface in the company’s Azure public cloud. Consumers, meanwhile, can access Translator through the Bing search engine’s built-in translation tool or through Microsoft’s standalone translation apps for iOS and Android. An integration is also available for the Office suite of productivity apps.
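For developers, the Azure route means sending an HTTP request to the Translator text API. The sketch below builds such a request against the standard v3 REST endpoint; the key and region values are placeholders you would replace with your own Azure resource credentials.

```python
# Minimal sketch of calling the Azure Translator text API (v3 REST
# endpoint). The key and region below are placeholders; you must
# supply values from your own Azure Translator resource.
import json
import urllib.request

ENDPOINT = "https://api.cognitive.microsofttranslator.com"

def build_request(text, to_lang, key, region, from_lang=None):
    """Construct the HTTP request for /translate without sending it."""
    url = f"{ENDPOINT}/translate?api-version=3.0&to={to_lang}"
    if from_lang:
        url += f"&from={from_lang}"  # otherwise the service auto-detects
    body = json.dumps([{"Text": text}]).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Ocp-Apim-Subscription-Region": region,
            "Content-Type": "application/json",
        },
        method="POST",
    )

def translate(text, to_lang, key, region):
    """Send the request and pull the translated string out of the response."""
    req = build_request(text, to_lang, key, region)
    with urllib.request.urlopen(req) as resp:
        # Response shape: [{"translations": [{"text": "...", "to": "..."}]}]
        return json.loads(resp.read())[0]["translations"][0]["text"]
```

Passing one of the newly added language codes (for example, Mongolian Cyrillic) as `to_lang` would target the languages described in this update.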

The Azure version of Translator offers features not included in the consumer implementations. Companies can customize the AI models that power the translation service by adding support for industry-specific terms such as product names. Microsoft says that one of the companies using Translator, Volkswagen AG, translates about 1 billion words each year into more than 60 languages.

In today’s update, Microsoft shared new details about how it’s working to improve the neural networks that power Translator. The company’s researchers have developed a multilingual AI model dubbed Z-Code that allows neural networks optimized for different languages to learn from each other and improve their accuracy. The ability to reuse certain information reduces the amount of training data Microsoft must assemble to develop new AI models.

The process of sharing knowledge between neural networks is known as transfer learning. Optimizing a neural network to perform two similar but distinct tasks, such as determining the topic of books and the topic of scientific papers, normally requires providing the neural network with two separate sets of training data, one for each task. In theory, transfer learning could accomplish two related tasks with a single set of training data, greatly simplifying AI development.
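The idea can be illustrated with a deliberately tiny example (this is a generic sketch of transfer learning, not Microsoft's Z-Code): a simple perceptron is first trained on one task, and its learned weights are then reused as the starting point for a related task, so the second task needs far fewer updates than training from scratch.

```python
# Toy illustration of transfer learning: a perceptron trained on a
# "book topics" task is reused as the initialization for a related
# "paper topics" task. All data and feature names are made up.

def train_perceptron(data, weights=None, epochs=50, lr=0.1):
    """Train a perceptron; `data` is a list of (features, label) pairs.

    If `weights` is given, training starts from those values instead
    of zeros -- this reuse is the "transfer" step.
    """
    n = len(data[0][0])
    w = list(weights) if weights else [0.0] * (n + 1)  # last slot = bias
    for _ in range(epochs):
        updates = 0
        for x, y in data:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + w[-1] > 0 else 0
            if pred != y:
                for i, xi in enumerate(x):
                    w[i] += lr * (y - pred) * xi
                w[-1] += lr * (y - pred)
                updates += 1
        if updates == 0:  # converged on the training data
            break
    return w

def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + w[-1] > 0 else 0

# Task A: toy "book topic" data (features = crude word counts).
books = [([3, 0], 1), ([2, 1], 1), ([0, 3], 0), ([1, 2], 0)]
# Task B: related "paper topic" data with a similar decision boundary.
papers = [([4, 1], 1), ([1, 4], 0)]

base = train_perceptron(books)                               # pretrain on A
adapted = train_perceptron(papers, weights=base, epochs=5)   # fine-tune on B
```

Because the two tasks share structure, the fine-tuning step converges almost immediately; real systems apply the same principle to neural networks with far more parameters.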

Researchers have not yet worked out exactly how transfer learning works. Microsoft’s efforts in this area could help advance the field and, along the way, allow the company to compete more effectively with Google Translate. Like Microsoft, Google LLC offers a cloud-based version of its translation service for businesses alongside consumer editions.

Microsoft is also developing new AI technology to improve its other services, including Bing. Earlier this year, the company detailed its work on MEB, a neural network with 135 billion parameters that it has deployed in Bing to deliver better search results to users.

Photo: Microsoft
