
Redis Unveils Redis Vector Library for Generative AI Development

The library operates within the Redis Enterprise platform, functioning as a real-time vector database that supports vector search, LLM caching, and chat history.

Real-time database company Redis has introduced the Redis Vector Library to streamline generative AI application development. The library operates within the Redis Enterprise platform, functioning as a real-time vector database that supports vector search, LLM caching, and chat history.

Key Features of the Library

The Redis Vector Library introduces a simplified client focused on vector embeddings for search, making AI-driven tasks more accessible. The Python Redis Vector Library (redisvl) extends the widely used redis-py client, enabling seamless integration with Redis for generative AI applications. Setting it up involves installing the library via pip, while Redis itself can be deployed either on Redis Cloud as a managed service or from a Docker image for local development. The library also ships with a dedicated CLI tool called rvl.
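For illustration, a minimal setup sketch, assuming a local Redis Stack container and the redisvl package; the commands and connection URL are examples rather than the only options:

```python
# Installation and deployment happen outside Python; for reference:
#   pip install redisvl                                        # install the library
#   docker run -p 6379:6379 redis/redis-stack-server:latest    # local Redis for development
#   rvl index listall                                          # bundled CLI, e.g. list indices

from redis import Redis

# redisvl extends the standard redis-py client, so a plain connection to a
# local instance (or a Redis Cloud URL) is the starting point.
client = Redis.from_url("redis://localhost:6379")
print(client.ping())  # True if the server is reachable
```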

To optimise search performance in production, redisvl allows explicit configuration of index settings and the dataset schema. Custom schemas can be defined, loaded, and managed straightforwardly through YAML files.
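A sketch of what that can look like in code, using an inline dict rather than a YAML file so the example is self-contained; the index name, field names, and vector dimensions are illustrative assumptions:

```python
import numpy as np
from redisvl.index import SearchIndex

# Schema equivalent to what would normally live in a YAML file,
# expressed here as a dict for a self-contained example.
schema = {
    "index": {"name": "docs", "prefix": "doc"},
    "fields": [
        {"name": "content", "type": "text"},
        {"name": "category", "type": "tag"},
        {
            "name": "embedding",
            "type": "vector",
            "attrs": {
                "dims": 384,
                "algorithm": "hnsw",
                "distance_metric": "cosine",
                "datatype": "float32",
            },
        },
    ],
}

index = SearchIndex.from_dict(schema)
index.connect("redis://localhost:6379")
index.create(overwrite=True)  # build the search index in Redis

# Load a record; the vector is stored as raw float32 bytes.
index.load([
    {
        "content": "Redis is an in-memory data store.",
        "category": "databases",
        "embedding": np.random.rand(384).astype(np.float32).tobytes(),
    }
])
```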

The VectorQuery feature, a fundamental component of redisvl, aims to simplify vector searches with optional filters, improving retrieval precision. Beyond basic querying, filters make it possible to combine searches over structured data with vector similarity. The library also includes a vectoriser module for generating embeddings, with access to popular embedding providers such as Cohere, OpenAI, Vertex AI, and Hugging Face.
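A hedged example of a filtered vector query, assuming the index defined above and a local Hugging Face embedding model; the model name, field names, and category value are illustrative:

```python
from redisvl.query import VectorQuery
from redisvl.query.filter import Tag
from redisvl.utils.vectorize import HFTextVectorizer

# Embed the user's question with a local Hugging Face model
# (384 dimensions, matching the schema above).
vectorizer = HFTextVectorizer(model="sentence-transformers/all-MiniLM-L6-v2")
query_vector = vectorizer.embed("What is Redis?")

# A vector search over the 'embedding' field, restricted by a tag filter so
# structured filtering and similarity search are combined in one query.
query = VectorQuery(
    vector=query_vector,
    vector_field_name="embedding",
    return_fields=["content", "category"],
    num_results=3,
    filter_expression=Tag("category") == "databases",
)

results = index.query(query)  # 'index' is the SearchIndex created earlier
for doc in results:
    print(doc["content"])
```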

Redisvl also includes semantic caching to improve the efficiency of applications that interact with LLMs by caching responses based on semantic similarity. The feature is claimed to reduce response times and API costs by reusing previously cached responses for similar queries. In future, the library aims to add abstractions for LLM session management and contextual access control.
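A brief sketch of how the semantic cache can be used, assuming a running Redis instance; call_llm is a hypothetical stand-in for an actual LLM call, and the module path may vary between redisvl versions:

```python
from redisvl.extensions.llmcache import SemanticCache

# Cache keyed on prompt embeddings; the distance threshold controls how
# similar two prompts must be to count as a hit.
llmcache = SemanticCache(
    name="llmcache",
    redis_url="redis://localhost:6379",
    distance_threshold=0.1,
)

prompt = "What is the capital of France?"

# Check the cache before paying for an LLM call.
if hits := llmcache.check(prompt=prompt):
    answer = hits[0]["response"]              # reuse the cached response
else:
    answer = call_llm(prompt)                 # hypothetical LLM call
    llmcache.store(prompt=prompt, response=answer)

print(answer)
```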

Read more: How Redis Finds Moat in the Indian Market


Shritama Saha
Shritama (she/her) is a technology journalist at AIM who is passionate about exploring generative AI, with a special focus on big tech, databases, healthcare, DE&I, hiring in tech and more.