A Striking Relevance of Data Sketching in LLMs

Can data sketching be used in large language models for building chatbots?


At the Data Engineering Summit (DES) 2023, Sudarshan Pakrashi, Director of Data Engineering at Zeotap, presented a talk titled Unleashing the Power of Probabilistic Data Structures: Optimizing Storage and Performance for Big Data, covering statistical algorithms designed to optimise memory use when storing and querying large datasets. During the Q&A, he was asked whether data sketching can be used in current generative AI models such as LLMs.

To this, Pakrashi responded that it is indeed possible, calling it a “great analogy”. He explained that every language model has to maintain word associations over a huge dataset of words. “Imagine the permutations and combinations that you want to have and sketches are in fact used to maintain those because then your model is actually going to query the frequencies for those combinations,” he explained.

Data sketching is a method of summarising large datasets using compact data structures that can provide approximate answers to queries about the data. In the context of LLMs, data sketching can be used to summarise the text corpus used to train the model, which can help reduce the memory requirements of the model and improve its training efficiency.

Why Do We Need Data Sketching in LLMs

“Do you ever feel overwhelmed by a constant stream of information?” reads the first line of Graham Cormode’s 2017 paper ‘What is Data Sketching, and Why Should I Care?’. Filtering information down to only what is required is, in essence, what data sketching does. And this can greatly benefit the training of LLMs.

Data sketching can help improve the efficiency and scalability of generative models in the following ways:

  • Data compression: By summarising large datasets with sketching techniques, LLMs can be trained on representations much smaller than the original data, reducing the memory and computational resources needed for training and deployment. This is especially helpful when resources are limited or datasets are very large.
  • Faster training: Sketching reduces the amount of data the model needs to process. Shrinking the data in this way can lead to faster convergence and shorter training times without significantly compromising the quality of the generated samples.
  • Real-time data: Sketches can be updated incrementally, enabling LLMs to process and learn from data streams efficiently and to refresh their internal representations on the fly as new data arrives.
  • Anomaly detection: Sketching and sampling techniques can be used to identify outliers or anomalies in the training data. Identifying, and potentially removing, anomalous data points lets LLMs focus on learning the underlying structure and patterns in the data, leading to better generated samples.
  • Data exploration: Sketches of large datasets reveal their structure and characteristics, and those insights can guide the design and configuration of LLMs, such as selecting appropriate architectures, hyperparameters, or loss functions.

Data Sketching Techniques in LLMs

Although data sketching has previously been used in natural language processing (NLP) tasks, the recent rise of compute-hungry LLMs like GPT-3 has made it even more relevant: sketching can, and in fact does, make the training and deployment of AI models more efficient.

One commonly used data sketching technique in LLMs is the Bloom filter, a probabilistic data structure that can efficiently test whether an item is in a set, allowing occasional false positives but never false negatives. Bloom filters can be used to represent the vocabulary of the text corpus used to train the model, storing it in a much smaller memory footprint.
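The idea fits in a few lines of code. Here is a minimal, illustrative Python sketch of a Bloom filter holding a toy vocabulary; the class, sizes, and hash scheme are this article’s own toy example rather than any production library:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash probes into a bit array.

    May return false positives, never false negatives.
    """

    def __init__(self, num_bits=1024, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8 + 1)

    def _positions(self, item):
        # Derive k independent positions by salting one strong hash.
        for i in range(self.num_hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.num_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# Store a toy vocabulary and test membership.
vocab = BloomFilter()
for word in ["model", "token", "sketch"]:
    vocab.add(word)
```

The bit-array size and hash count trade memory for accuracy: more bits and more hashes lower the false-positive rate, but the filter never reports a stored word as absent.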

Another technique used in LLMs for data sketching is the Count-min sketch, a probabilistic data structure that can efficiently estimate the frequency of items in a set. Count-min sketches can be used to estimate the frequency of words in the text corpus, which can in turn be used to optimise the training of the model.
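A minimal count-min sketch looks similar. This toy Python version (the dimensions and hashing are illustrative assumptions, not a library API) counts word frequencies within a fixed memory budget:

```python
import hashlib

class CountMinSketch:
    """Minimal count-min sketch: d rows of w counters.

    Estimates never undercount; overestimates come from hash collisions.
    """

    def __init__(self, width=2048, depth=4):
        self.width = width
        self.depth = depth
        self.table = [[0] * width for _ in range(depth)]

    def _cols(self, item):
        # One salted hash per row picks that row's counter column.
        for row in range(self.depth):
            h = hashlib.sha256(f"{row}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.width

    def add(self, item, count=1):
        for row, col in enumerate(self._cols(item)):
            self.table[row][col] += count

    def estimate(self, item):
        # The minimum across rows is the least collision-inflated count.
        return min(self.table[row][col]
                   for row, col in enumerate(self._cols(item)))

# Count word frequencies in a toy corpus.
cms = CountMinSketch()
for word in "the cat sat on the mat the end".split():
    cms.add(word)
```

Taking the minimum over several rows is the key trick: a collision may inflate one row’s counter, but it is unlikely to inflate the same item’s counter in every row.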

Similar techniques include HyperLogLog, another probabilistic algorithm, which estimates the number of distinct elements in a large dataset, such as the number of unique tokens in a corpus.
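To illustrate, here is a stripped-down HyperLogLog in Python. The register layout, bias constant, and small-range correction follow the standard algorithm, but this is a didactic sketch under simplifying assumptions, not a production implementation:

```python
import hashlib
import math

class HyperLogLog:
    """Minimal HyperLogLog with 2**p single-byte registers."""

    def __init__(self, p=10):
        self.p = p
        self.m = 1 << p
        self.registers = [0] * self.m
        # Bias-correction constant for m >= 128.
        self.alpha = 0.7213 / (1 + 1.079 / self.m)

    def add(self, item):
        h = hashlib.sha256(str(item).encode()).digest()
        x = int.from_bytes(h[:8], "big")          # 64-bit hash
        idx = x >> (64 - self.p)                  # first p bits pick a register
        rest = x & ((1 << (64 - self.p)) - 1)     # remaining bits
        # Rank = position of the leftmost 1-bit in the remaining bits.
        rank = (64 - self.p) - rest.bit_length() + 1
        self.registers[idx] = max(self.registers[idx], rank)

    def count(self):
        z = 1.0 / sum(2.0 ** -r for r in self.registers)
        estimate = self.alpha * self.m * self.m * z
        # Small-range correction via linear counting.
        zeros = self.registers.count(0)
        if estimate <= 2.5 * self.m and zeros:
            estimate = self.m * math.log(self.m / zeros)
        return int(estimate)

hll = HyperLogLog()
for i in range(10_000):
    hll.add(i)
print(hll.count())  # close to 10,000, using only 1,024 registers
```

With p=10 the sketch uses 1,024 registers yet estimates cardinality to within a few percent, regardless of how many duplicates the stream contains.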

Moreover, quantile sketches are another technique, providing approximate answers to queries about percentiles, medians, and other order statistics of a dataset. This is closely related to sampling, which selects a small subset of the data to represent the entire dataset.
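Both ideas combine naturally: reservoir sampling keeps a fixed-size uniform sample of a stream of unknown length, and order statistics of that sample approximate quantiles of the full data. A minimal Python sketch (function names and sizes here are illustrative, and real quantile sketches such as KLL or t-digest give tighter guarantees):

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Keep a uniform random sample of k items from a stream (Algorithm R)."""
    rng = random.Random(seed)
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)
        else:
            # Replace a stored item with probability k / (i + 1).
            j = rng.randrange(i + 1)
            if j < k:
                sample[j] = item
    return sample

def approx_quantile(sample, q):
    """Estimate the q-quantile from a sorted copy of the sample."""
    s = sorted(sample)
    return s[min(int(q * len(s)), len(s) - 1)]

# Estimate the median of 0..99,999 from a 1,000-item sample.
sample = reservoir_sample(range(100_000), k=1_000)
median_est = approx_quantile(sample, 0.5)
```

The reservoir never grows beyond k items, so the memory cost is fixed no matter how long the stream runs, which is exactly the property streaming LLM data pipelines need.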


Mohit Pandey

Mohit dives deep into the AI world to bring out information in simple, explainable, and sometimes funny words. He also holds a keen interest in photography, filmmaking, and the gaming industry.