NLP gets a quantum boost

QSANN is effective and scalable on larger data sets and can be deployed on near-term quantum devices.

Quantum computing has applications across many areas of artificial intelligence, including natural language processing (NLP). However, owing to heavy syntactic preprocessing and syntax-dependent network architectures, existing quantum NLP (QNLP) approaches are ineffective on large real-world data sets. Researchers from Baidu have proposed a simple network architecture called the quantum self-attention neural network (QSANN) to tackle these limitations.

The researchers introduced the self-attention mechanism into quantum neural networks and then used a Gaussian projected quantum self-attention as a sensible quantum version of self-attention. 

QNN for text classification

Text classification is one of the most fundamental tasks in natural language processing. It entails taking a given text sequence and assigning it to one of a set of predefined categories. In this paper, the researchers from Baidu Research Institute and the University of Technology Sydney have taken sentiment analysis and topic classification as examples of text classification. The standard approach in machine learning is to train a model on a set of pre-labelled sequences (supervised learning). When the model then encounters a new sequence, it can predict the category of that sequence based on what it learned from the training data set.
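
As a rough illustration of this supervised pipeline, here is a minimal classical sketch (using scikit-learn, with a handful of invented, pre-labelled sentences) that trains a linear classifier and then predicts the category of an unseen sequence:

```python
# Minimal supervised text-classification sketch. The tiny dataset and its
# labels are invented purely for illustration (1 = positive, 0 = negative).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = ["a wonderful film", "truly awful plot",
               "great acting throughout", "boring and slow"]
train_labels = [1, 0, 1, 0]

# Train: turn each sequence into a bag-of-words vector, then fit a classifier.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# Predict: assign an unseen sequence to one of the learned categories.
print(model.predict(["a wonderful plot"]))  # expected: [1]
```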

Quantum self-attention layer

QSANN uses a Gaussian projected quantum self-attention (GPQSA) method to work around the restrictions imposed by the unitary nature of quantum circuits. In the evaluations, QSANN surpassed the classical self-attention neural network (CSANN) baseline on the Yelp, IMDb, and Amazon datasets and achieved 100 percent accuracy on the MC task.
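
GPQSA replaces the usual softmax of query-key inner products with a Gaussian function of the distance between query and key outputs, which is what lets it sidestep the linearity constraints of unitary circuits. Below is a purely classical NumPy sketch of that weighting scheme, assuming (as a simplification) one scalar query and key outcome per word; the random numbers stand in for the measurement outcomes that parameterised quantum circuits would produce in the actual model:

```python
# Classical sketch of Gaussian projected self-attention. The query, key and
# value arrays below are random placeholders for quantum-circuit measurement
# outcomes; only the Gaussian weighting itself is illustrated here.
import numpy as np

rng = np.random.default_rng(0)
n_words, d = 5, 4                          # sequence length, value dimension

queries = rng.normal(size=n_words)         # stand-in scalar query per word
keys = rng.normal(size=n_words)            # stand-in scalar key per word
values = rng.normal(size=(n_words, d))     # stand-in value vector per word

# Gaussian-kernel attention: alpha[s, j] is proportional to
# exp(-(q_s - k_j)^2), normalised over j for each query word s.
alphas = np.exp(-(queries[:, None] - keys[None, :]) ** 2)
alphas /= alphas.sum(axis=1, keepdims=True)

# Each output is an attention-weighted mixture of the value vectors.
outputs = alphas @ values
print(outputs.shape)                       # (5, 4)
```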

More sophisticated approaches, such as positional encoding and multi-head attention, could also be incorporated into quantum neural networks in the future for generative models and other, more complex tasks.

Experimental results

The researchers tested the model on public datasets. However, in these experiments, the quantum part was carried out via classical simulation. The performance of QSANN was tested against two models:

  1. The syntactic analysis-based quantum model (DisCoCat): tested on two simple tasks, i.e., meaning classification (MC) and relative pronoun (RP) evaluation.
  2. The classical self-attention neural network (CSANN): tested on three public sentiment analysis data sets, i.e., Amazon, Yelp, and IMDb.

The feasibility of GPQSA is demonstrated through the visualisation of the self-attention coefficients. The researchers also showed the robustness of QSANN under noisy quantum channels. All the simulations and optimisation loops were implemented via Paddle Quantum on the PaddlePaddle Deep Learning Platform.
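
As a hedged sketch of what such a visualisation might look like, the attention coefficient matrix can be rendered as a heatmap, with rows indexing the query words and columns the attended (key) words; the matrix here is randomly generated for illustration:

```python
# Heatmap visualisation of self-attention coefficients (a random matrix is
# used as a placeholder for coefficients produced by a trained model).
import matplotlib.pyplot as plt
import numpy as np

alphas = np.random.default_rng(1).random((5, 5))
alphas /= alphas.sum(axis=1, keepdims=True)   # each row sums to 1

fig, ax = plt.subplots()
im = ax.imshow(alphas, cmap="viridis")
ax.set_xlabel("key (attended) word index")
ax.set_ylabel("query word index")
fig.colorbar(im, ax=ax, label="attention coefficient")
plt.show()
```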

In the MC task, QSANN outperformed DisCoCat with an accuracy of 100 percent while using only 25 parameters (18 in the query-key-value part and 7 in the fully connected part). The RP task demonstrated similar results, albeit with a lower test accuracy due to a massive bias between the training and test sets.

Since the Yelp, IMDb, and Amazon datasets had not been tested with quantum algorithms before, QSANN set the benchmark on them, outperforming both the naive method and CSANN while using only 49 parameters for Yelp and IMDb and 61 for Amazon.

The test results demonstrate the vast potential of QSANN for binary text classification.

QNN vs classical NN

Quantum neural networks offer unprecedented possibilities for solving problems beyond the reach of classical neural networks. Compared with traditional approaches, quantum neural networks demonstrate the following advantages:

  • Exponentially larger memory capacity
  • Higher performance with fewer parameters
  • Faster learning
  • Elimination of catastrophic forgetting due to the absence of pattern interference
  • Single-layer network solutions to linearly inseparable problems
  • Higher processing speed (10¹⁰ bits/s)
  • Smaller scale (10¹¹ neurons/mm³)
  • Higher stability and reliability

Quantum natural language processing (QNLP) strives to develop quantum-native NLP models that can be implemented on quantum devices. Most QNLP proposals lack scalability since they are based on syntactic analysis. Further, syntax-based methods employ different parameterised quantum circuits (PQCs) for sentences with different syntactic structures and are therefore not flexible enough to process complex human language.

QSANN's performance on text classification, and the prospect of extending quantum self-attention to tasks such as language modelling, machine translation, and question answering, is a testament to the huge potential of quantum models.
