Amazon is one of the few players in the machine learning landscape to have invested heavily in transfer learning, alongside Facebook, Microsoft and DeepMind. The results are visible in its Alexa virtual assistant, which has made significant strides in the last few years and competes closely with Google Assistant and Apple's Siri.
For those unaware, transfer learning is a technique in which knowledge from a related task that has already been learned is transferred to a new task. For instance, knowing how to ride a bicycle makes learning to ride a motorcycle easier. The same idea applies in machine learning: rather than developing a large-scale language model with billions of parameters from scratch, an existing model's knowledge can be reused.
Doing so is not only time-consuming but also expensive. Developers and researchers have to collect thousands of voice samples and annotate/label them manually, a process that can easily take weeks or months. That is why researchers at Amazon Alexa have pursued transfer learning, which leverages a neural network trained on a large dataset of previously labelled samples to learn a new domain with sparse data.
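The idea can be illustrated with a minimal sketch (not Amazon's actual system): a small network is pre-trained on a "source" task with abundant synthetic data, then its feature layer is frozen and only a new output layer is fitted on a related "target" task with very few labels. All task names, data generators and hyperparameters here are hypothetical, chosen only to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical source task with plenty of labels: classify 2-D points
# by whether they fall inside a ring of given radii.
def make_ring_data(n, inner=0.5, outer=1.0):
    X = rng.uniform(-1.5, 1.5, size=(n, 2))
    r = np.linalg.norm(X, axis=1)
    y = ((r > inner) & (r < outer)).astype(float)
    return X, y

def train_net(X, y, W1=None, b1=None, freeze_features=False,
              hidden=16, lr=0.5, epochs=500):
    """Train a 2-layer net; optionally reuse and freeze the first layer."""
    if W1 is None:
        W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
        b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden)
    b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)               # shared feature layer
        p = 1 / (1 + np.exp(-(H @ W2 + b2)))   # sigmoid output
        g = (p - y) / len(y)                   # gradient of logistic loss
        W2 -= lr * (H.T @ g)
        b2 -= lr * g.sum()
        if not freeze_features:                # transfer = skip this update
            gH = np.outer(g, W2) * (1 - H**2)
            W1 -= lr * (X.T @ gH)
            b1 -= lr * gH.sum(axis=0)
    return W1, b1, W2, b2

def accuracy(params, X, y):
    W1, b1, W2, b2 = params
    p = 1 / (1 + np.exp(-(np.tanh(X @ W1 + b1) @ W2 + b2)))
    return ((p > 0.5) == y).mean()

# 1. Pre-train on the source task with abundant data.
Xs, ys = make_ring_data(2000)
W1, b1, _, _ = train_net(Xs, ys)

# 2. Target task: a slightly shifted ring with only 40 labels.
#    Reuse the frozen feature layer; fit just a new output layer.
Xt, yt = make_ring_data(40, inner=0.4, outer=0.9)
params = train_net(Xt, yt, W1=W1, b1=b1, freeze_features=True)

X_eval, y_eval = make_ring_data(1000, inner=0.4, outer=0.9)
print("target-task accuracy:", round(accuracy(params, X_eval, y_eval), 2))
```

Freezing the feature layer is what makes the target-task training cheap: only a handful of output weights are fitted, which is why a few dozen labels can suffice where training from scratch would need far more.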
Today, the Amazon Alexa team uses transfer learning extensively to carry knowledge across language models and features and to improve machine translation capabilities.
Transfer Learning Everywhere
Two years ago, Amazon introduced two new features, Newscaster and Neural Text-to-Speech (NTTS), to its cloud-based TTS service, Amazon Polly. Launched in 2016, Amazon Polly turns text into human-like speech, allowing users to develop speech-enabled products and applications.
Besides Polly, Amazon offers multiple APIs for tasks such as text analysis, which can also use transfer learning to further customise their underlying models. These include Amazon Personalize, Amazon Forecast, Amazon Transcribe, Amazon Rekognition, Amazon Comprehend, Amazon Lex, Amazon Textract and Amazon Translate.
Here’s a list of all the research work done in transfer learning by Amazon researchers in the last four years.
Shaping Transfer Learning
Four years ago, Amazon founder Jeff Bezos (now executive chair) described Alexa's success and its improvements from semi-supervised and transfer learning in his annual shareholder letter. He said the company had dramatically reduced the time required to teach Alexa new languages by using machine translation and transfer learning techniques, which allowed it to serve customers in more countries, including India and Japan.
Last month, Amazon released new tools like notebooks, text models and solutions for multimodal financial analysis within Amazon SageMaker JumpStart. Using these tools, you can easily retrieve public financial documents, including SEC filings, and process financial text documents with features like summarization and scoring for various attributes such as sentiment, risk and readability.
In addition, users can access pre-trained language models trained on financial text for transfer learning and use example notebooks for data retrieval, text feature engineering, regression models and multimodal classification.
Today, transfer learning has become one of the most popular techniques in deep learning, as it can train deep neural networks with limited data in a short period of time. At NIPS 2016, Andrew Ng had said that transfer learning would be, after supervised learning, the next driver of machine learning's commercial success. Cut to the present, and Amazon is certainly leading the way.