Google not lost on translation
Friday, 09 June 2017

News article written by Corbett Communications. The statements made or opinions expressed do not necessarily reflect the views of Engineers Australia.

Imagine trying to translate between 103 languages and deliver meaningful answers that make sense to speakers of each one. Language is a complex and extensive skill, and that fact isn’t lost on the engineers at Google. The digital giant now employs machine learning for Google Translate, backed by hardware capable of 180,000 billion calculations a second.

This is what Google’s new second-generation tensor processing units (TPUs) do; they’re specialist circuit boards built for machine learning. Senior research scientist Mike Schuster and the Google Translate team are taking the lead in digital language translation, which has morphed from translating individual phrases a decade ago to translating entire sentences today.

Schuster, who has a Master’s and a PhD in electrical engineering, revealed that the original Google Translate built its tables of phrases from translations found on the internet.

“You’d find small segments like ‘My dog is red’; you’d find the same segment in a Japanese sentence and you’d make a big table of that,” Schuster recently told The Australian. “Then you’d search your tables and put together your sentences.”
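The phrase-table approach Schuster describes can be sketched in a few lines. This is a toy illustration only: the phrase pairs below are invented, and real phrase-based systems score many competing segmentations rather than greedily matching the longest phrase.

```python
# Toy phrase-table translation: aligned phrase pairs mined from parallel
# text are stored in a table, and a sentence is translated by greedily
# matching its longest known phrases. The pairs are invented examples.

PHRASE_TABLE = {
    "my dog": "mein Hund",
    "is red": "ist rot",
    "my": "mein",
    "dog": "Hund",
}

def translate(sentence: str) -> str:
    words = sentence.lower().split()
    out = []
    i = 0
    while i < len(words):
        # Try the longest phrase starting at position i first.
        for j in range(len(words), i, -1):
            phrase = " ".join(words[i:j])
            if phrase in PHRASE_TABLE:
                out.append(PHRASE_TABLE[phrase])
                i = j
                break
        else:
            out.append(words[i])  # unknown word: pass it through unchanged
            i += 1
    return " ".join(out)

print(translate("My dog is red"))  # → "mein Hund ist rot"
```

Stitching phrases together this way is why the old system handled short segments well but lost coherence over whole sentences.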

The machine learning project for language translation at Google began in August 2015. Soon after, Google started writing an AI system to make use of the new TPUs. Schuster said it was only about five months later that the first results came through. Then, a year later, Google Translate was ready to launch its first language pair – Mandarin to English.

Fast enough for everyone

“At first, the AI-based translations were too slow,” Schuster said. “It took 10 seconds for a 10-word sentence to translate; 10 seconds is way too long … and if you want to use it for a billion people, you won’t have enough machines.”

So, how did they ramp up the speed? Schuster said Google used its TPUs to make everything faster and developed better algorithms. Over two months, the team brought the time for a 10-word sentence down to 200 milliseconds, which Schuster said was fast enough to launch the system.

And now, instead of maintaining tables, the new AI system trawls the web to take in billions of sentences and their translated equivalents in a targeted language. Google Translate doesn’t analyse these sentences; it only looks at the differences in new translations compared to old ones. The machine learning system then guesstimates a translation when it sees similar sentences.
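The idea of guessing a translation from similar previously seen sentences can be illustrated with a toy nearest-neighbour lookup. The example pairs and the word-overlap similarity here are invented for illustration; the real system trains a neural model on its sentence pairs rather than looking up neighbours at translation time.

```python
# Toy illustration of "guessing from similar sentences": given a store of
# (source, translation) pairs, return the stored translation whose source
# sentence has the highest word overlap (Jaccard similarity) with the input.
# The pairs are invented examples, not real training data.

EXAMPLES = [
    ("the cat is black", "die Katze ist schwarz"),
    ("my dog is red", "mein Hund ist rot"),
]

def jaccard(a: set, b: set) -> float:
    # Fraction of shared words between two sentences.
    return len(a & b) / len(a | b)

def nearest_translation(sentence: str) -> str:
    words = set(sentence.lower().split())
    best = max(EXAMPLES, key=lambda pair: jaccard(words, set(pair[0].split())))
    return best[1]

print(nearest_translation("the dog is red"))  # → "mein Hund ist rot"
```

Even this crude similarity measure picks the right neighbour; the appeal of the neural approach is that it generalises from such pairs instead of memorising them.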

At present, Google Translate only handles individual sentences, but it is working towards whole paragraphs. The system went live in September 2016 with English to Mandarin, after which Google added eight more languages in November and another seven in March this year. Then in April, 26 more. So far, 41 languages use the AI system to translate.

Author: Desi Corbett

Image: Google TPU.