BERT Is So Popular That Google Had To Release A Website To Collate All Developments

With the advent of transformer-based machine translation models, researchers have achieved state-of-the-art performance across natural language processing (NLP) tasks. In 2018, Google open-sourced its groundbreaking state-of-the-art technique for NLP pre-training called Bidirectional Encoder Representations from Transformers, or BERT…

Top 7 Free NLP Books To Read

Natural Language Processing, or NLP, has enabled computers to interpret human language, which in turn has opened doors to new innovations. For this very reason, interest in learning more about the subject has grown in recent years, and as…

Are Larger Models Better For Compression

When OpenAI released its GPT-2 model, it had 1.5 billion parameters, making it the biggest model at the time. It was soon eclipsed by NVIDIA’s Megatron, which had 8 billion parameters. Last month, Microsoft released the world’s largest language model…

Top 8 Baselines For NLP Models

The ability to handle natural language has, so far, been elusive for machines. However, over the last couple of years, at least since the advent of Google’s BERT model, there has been tremendous innovation in this space. With NVIDIA and Microsoft releasing…

Microsoft Introduces First Bimodal Pre-Trained Model for Natural Language Generation

Over the past few years, large pre-trained models such as BERT, ELMo and XLNet, among others, have brought significant improvements to almost every natural language processing (NLP) task across organisations. Microsoft has been doing a lot of research around natural language processing…

Top NLP Open Source Projects For Developers In 2020

The year 2019 was an excellent one for developers, as almost all industry leaders open-sourced their machine learning toolkits. Open-sourcing helps not only the users but also the tools themselves, as developers can contribute and add customisations…

Transformers Simplified: A Hands-On Intro To Text Classification Using Simple Transformers 

In the past few years, we have seen tremendous improvements in the ability of machines to deal with natural language. We saw algorithms breaking the state of the art one after another on a variety of language-specific tasks, all thanks to transformers…
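As a rough illustration of the kind of workflow such a hands-on intro covers, here is a minimal sketch of binary text classification with the Simple Transformers library. The toy DataFrame, label values and training arguments below are illustrative assumptions, not taken from the article.

# Minimal sketch: text classification with Simple Transformers (pip install simpletransformers)
import pandas as pd
from simpletransformers.classification import ClassificationModel

# Toy training data: Simple Transformers expects a DataFrame with 'text' and 'labels' columns
train_df = pd.DataFrame(
    [["the movie was wonderful", 1],
     ["a dull, forgettable film", 0],
     ["an absolute delight to watch", 1],
     ["poorly written and badly acted", 0]],
    columns=["text", "labels"],
)

# Wrap a pre-trained BERT checkpoint for sequence classification
model = ClassificationModel(
    "bert", "bert-base-uncased",
    num_labels=2,
    use_cuda=False,  # set to True if a GPU is available
    args={"num_train_epochs": 1, "overwrite_output_dir": True},
)

# Fine-tune on the toy data, then classify a new sentence
model.train_model(train_df)
predictions, raw_outputs = model.predict(["what a great film"])
print(predictions)  # e.g. [1] for the positive class

The appeal of the library is exactly this brevity: model choice, tokenisation, the training loop and prediction are hidden behind a three-call API.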

How Hike Is Using NLP-Based Stickers To Help Users Express Themselves In Indian Languages

Popular messenger service Hike has been working on various innovations within its platform. Ever since it came into existence, the platform has acquired more than 100 million users, giving the company enough impetus to keep improving its product. The…

5 Latest Data Science Skills That Are On The Rise In 2019

In the ever-changing data science landscape, skill sets evolve as new tools and techniques surface. Based on these trends, data scientists should focus on emerging and in-demand skills to stay abreast of changing needs. Although data scientists possess several…

Why Transformers Play A Crucial Role In NLP Development

Recent advances in modern Natural Language Processing (NLP) research have been dominated by the combination of transfer learning methods with large-scale Transformer language models. Creating these general-purpose models remains an expensive and time-consuming process, restricting the use of these methods…
