
Hugging Face Gets An Amazon Treatment

  • AWS partners with Hugging Face to boost NLP technologies.

To speed up the adoption of AI-powered natural language processing (NLP) models, Amazon Web Services (AWS) has chosen to collaborate with Hugging Face, a company pursuing various ways to make the technology simpler to use.

The main goal of NLP is to understand and interpret the language humans use in a way that is meaningful to machines. Hugging Face got Amazon’s attention by creating Transformers, a library that provides well over a hundred pretrained NLP models for tasks such as summarization, translation, and textual interpretation, making them easier for everyone to utilise. These models apply algorithms that turn unstructured natural language data into representations that computers can readily process.
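To illustrate the kind of high-level interface Transformers exposes, here is a minimal sketch using the library’s pipeline helper for summarization; the input text and length limits are chosen purely as examples, not taken from the partnership announcement.

```python
# A minimal sketch of the Transformers "pipeline" interface for summarization.
# The input text and length limits are illustrative only.
from transformers import pipeline

# Downloads a default pretrained summarization model on first use.
summarizer = pipeline("summarization")

article = (
    "Amazon Web Services has partnered with Hugging Face to make it easier "
    "for developers to train and deploy natural language processing models "
    "in the cloud using Amazon SageMaker."
)

summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

The same helper covers other tasks such as translation and text classification simply by swapping the task name.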



Nuts & Bolts Of The Partnership

NLP models must keep evolving with the ever-changing slang and usage patterns of language; according to experts, models that are not kept up to date quickly become ineffective. However, staying at the cutting edge of NLP can be a costly affair in terms of both money and time.

To combat these issues, Hugging Face created Transformers to not only advance but also democratise NLP. This can be seen in the current initiative, where the company provides more than 7,000 up-to-date NLP models attuned to a wide variety of languages, reaching as far as Ndonga, a language spoken in Namibia, and Breton, a Celtic language used in parts of France.
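As a rough illustration of how that catalogue can be browsed programmatically, the sketch below uses the huggingface_hub client to list models tagged with a given language. It assumes a recent version of the client library, and the language code is only an example.

```python
# A rough sketch of browsing the Hugging Face Hub by language tag.
# Assumes a recent version of the huggingface_hub client library.
from huggingface_hub import list_models

# "br" is the ISO code for Breton; swap in any other language tag.
for model in list_models(language="br", limit=5):
    print(model.id)
```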

Transformers has also picked up steam among a wide range of users, from students to professionals. Given the library’s popularity and versatility, Amazon Web Services intended to take it to the next level with Amazon SageMaker.

SageMaker, released in November 2017, is Amazon’s own cloud platform for building, training, and deploying machine learning models on the cloud.

As a first step, the creators of Transformers chose AWS as their preferred cloud provider and released two new offerings: AutoNLP and the Accelerated Inference API. AutoNLP is an automated way to train, evaluate, and deploy up-to-date NLP models for various tasks, while the Accelerated Inference API speeds up the deployment of NLP models by serving them from the cloud.
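Because the Inference API is exposed as an HTTP endpoint, a hosted model can be queried with a plain POST request. The sketch below is illustrative only; the model ID and the YOUR_API_TOKEN placeholder are assumptions you would replace with your own values.

```python
# A minimal sketch of calling the hosted Inference API over HTTP.
# The model ID and API token are placeholders, not values from the article.
import requests

API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
headers = {"Authorization": "Bearer YOUR_API_TOKEN"}

response = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "I love this new partnership!"},
)
print(response.json())
```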


In return for the adoption of SageMaker, AWS has helped make Hugging Face’s Deep Learning Containers (DLCs) more accessible, equipping programmers to build applications utilising NLP. A key advantage of Amazon SageMaker is that experiments which once took days can be tested within minutes.
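In practice, those containers are typically picked up through the SageMaker Python SDK’s Hugging Face estimator. The sketch below assumes the sagemaker SDK with Hugging Face support is installed, and that a suitable IAM role, a training script named train.py, and an S3 data path exist; the framework versions, instance type, and hyperparameters are illustrative placeholders.

```python
# A sketch of launching a Transformers training job on SageMaker using the
# Hugging Face Deep Learning Containers. The role, script, versions, instance
# type, and S3 path are illustrative placeholders.
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

huggingface_estimator = HuggingFace(
    entry_point="train.py",          # your fine-tuning script
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    role=role,
    transformers_version="4.6",
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters={"epochs": 1, "model_name": "distilbert-base-uncased"},
)

# Starts a training job inside the Hugging Face DLC; "train" points to data in S3.
huggingface_estimator.fit({"train": "s3://your-bucket/path/to/train"})
```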

According to the official blog post, Ohio-based Quantum Health, a health-tech provider, and New York-based Kustomer, a customer service CRM platform, are two companies already using Hugging Face technology and fine-tuning it on Amazon SageMaker. While Quantum Health is looking to provide effective and efficient healthcare navigation for its members, Kustomer uses NLP to manage customer relationships and interactions.


Wrapping Up

In combination with Hugging Face’s technology, the infrastructure provided by Amazon Web Services will pave the way for integrating NLP models into our daily lives. With this partnership, users will be able to train their own language models and reduce the impact of language barriers, the blog post concluded.

