NLP gets a quantum boost

QSANN is effective and scalable on larger data sets and can be deployed on near-term quantum devices.

Quantum computing has applications across many areas of artificial intelligence, including natural language processing (NLP). However, owing to heavy syntactic preprocessing and syntax-dependent network architectures, existing quantum NLP (QNLP) methods are ineffective on large real-world data sets. Researchers from Baidu have proposed a simple network architecture, the quantum self-attention neural network (QSANN), to tackle these limitations.

The researchers introduced the self-attention mechanism into quantum neural networks and then used a Gaussian projected quantum self-attention as a sensible quantum version of self-attention. 

QSANN is scalable on larger data sets and can be deployed on near-term quantum devices.

QNN for text classification

Text classification is one of the most fundamental tasks in natural language processing: a given text sequence is assigned to one of several predefined categories. In this paper, the researchers from Baidu Research Institute and the University of Technology Sydney take sentiment analysis and topic classification as examples of text classification. The standard machine-learning approach is to train a model on a set of pre-labelled sequences (supervised learning); when the model then encounters a new sequence, it predicts the category of that sequence based on what it learned from the training data set.
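As a purely classical illustration of this supervised setup (a toy sketch, not the paper's model), a minimal bag-of-words sentiment classifier might look like the following; the training sentences and labels here are invented for illustration:

```python
from collections import Counter

# Toy labelled training set: (sentence, label) pairs.
train = [
    ("great food and friendly staff", "positive"),
    ("loved the movie, wonderful acting", "positive"),
    ("terrible service and cold food", "negative"),
    ("boring plot, awful acting", "negative"),
]

# Count how often each word appears under each label.
word_counts = {"positive": Counter(), "negative": Counter()}
for sentence, label in train:
    word_counts[label].update(sentence.replace(",", "").split())

def predict(sentence):
    """Assign the label whose training vocabulary overlaps most with the input."""
    words = sentence.replace(",", "").split()
    scores = {label: sum(counts[w] for w in words)
              for label, counts in word_counts.items()}
    return max(scores, key=scores.get)

print(predict("the staff were friendly"))  # "positive"
```

A real model replaces the word-count scoring with a learned network (classical or, in QSANN's case, quantum), but the train-then-predict loop is the same.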

Quantum Self-attention layer

QSANN uses a Gaussian projected quantum self-attention (GPQSA) method to work around the unitary nature of quantum circuits, which makes the usual inner-product self-attention hard to realise directly. In the assessments, QSANN surpassed the CSANN benchmark on the Yelp, IMDb and Amazon datasets and achieved 100 percent accuracy on the MC task.
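A rough classical sketch of the idea: instead of a softmax over query-key inner products, the attention weight between two positions is a Gaussian of the distance between scalar "measurement outcomes" for the query and key. The scalar outcomes below are made up for illustration; in QSANN they would come from measuring parameterised quantum circuits.

```python
import numpy as np

def gaussian_self_attention(queries, keys, values):
    """Gaussian projected self-attention over scalar query/key outcomes.

    The weight between positions s and j is exp(-(q_s - k_j)^2),
    normalised so each row of coefficients sums to one.
    """
    # Pairwise squared differences between query and key outcomes.
    diff = queries[:, None] - keys[None, :]
    weights = np.exp(-diff ** 2)
    weights = weights / weights.sum(axis=1, keepdims=True)
    # Weighted combination of the value vectors.
    return weights @ values

# Made-up outcomes for a 3-token sequence with 2-dimensional values.
q = np.array([0.1, 0.9, 0.5])
k = np.array([0.2, 0.8, 0.4])
v = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
out = gaussian_self_attention(q, k, v)
print(out.shape)  # (3, 2)
```

Because the Gaussian depends only on the distance between outcomes, it sidesteps the difficulty of computing inner products between states produced by unitary circuits.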

Sophisticated approaches such as positional encoding and multi-head attention could also be brought into quantum neural networks in the future, for generative models and other more complex tasks.

Experiment results

The researchers tested the model on public datasets; however, in these experiments, the quantum part was accomplished via classical simulation. The performance of QSANN was tested against two models:

  1. The syntactic analysis-based quantum model (DisCoCat): this model was tested on two simple tasks, i.e., meaning classification (MC) and relative clause (RP) evaluation.
  2. The classical self-attention neural network (CSANN): this model was tested on three public sentiment analysis data sets, i.e., Amazon, Yelp and IMDb.

The feasibility of the Gaussian projected quantum self-attention (GPQSA) is demonstrated through visualisation of the self-attention coefficients. The researchers also showed the robustness of QSANN under noisy quantum channels. All the simulations and optimisation loops were implemented via Paddle Quantum on the PaddlePaddle deep learning platform.

In the MC task, QSANN outperformed DisCoCat with an accuracy of 100 percent while using only 25 parameters (18 in the query-key-value part and seven in the fully connected part). The RP task showed similar results, albeit with a lower test accuracy due to a large bias between the training and test sets.

Since the Yelp, IMDb and Amazon datasets had not been tested with quantum algorithms before, QSANN set the benchmark by outperforming both the naive method and CSANN while using only 49 parameters for Yelp and IMDb and 61 for Amazon.

The test results demonstrate the vast potential of QSANN for binary text classification.

QNN vs classical NN

Quantum neural networks offer unprecedented possibilities in solving problems beyond the abilities of classic neural networks. In comparison to traditional approaches, the quantum neural network demonstrates the following advantages:

  • Exponentially larger memory capacity
  • Higher performance with fewer parameters
  • Faster learning process
  • Elimination of catastrophic forgetting due to the absence of pattern interference
  • Single-layer solution of linearly inseparable problems
  • Processing speed (10¹⁰ bits/s)
  • Small scale (10¹¹ neurons/mm³)
  • Higher stability and reliability

Quantum natural language processing (QNLP) strives to develop quantum-native NLP models that can be implemented on quantum devices. Most QNLP proposals lack scalability since they are based on syntactic analysis. Further, the syntax-based methods employ different parameterised quantum circuits (PQCs) for sentences with different syntactic structures and are therefore not flexible enough to process complex human language.

The performance of QSANN on text classification is a testament to the potential of quantum models for other NLP tasks such as language modelling, machine translation and question answering.

Kartik Wali