
Redis Unveils Redis Vector Library for Generative AI Development

It operates within the Redis Enterprise platform, functioning as a real-time vector database catering to vector search, LLM caching, and chat history. 


Real-time database company Redis has introduced the Redis Vector Library to streamline generative AI application development.

Key Features of the Library

The Redis Vector Library introduces a simplified client focused on vector embeddings for search, making Redis more accessible for AI-driven tasks. The Python Redis Vector Library (redisvl) extends the widely used redis-py client, enabling seamless integration with Redis for generative AI applications. The library is installed via pip, and Redis itself can be deployed either on Redis Cloud as a managed service or via a Docker image for local development. The library also ships with a dedicated CLI tool called rvl.
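The setup steps above can be sketched as shell commands. This is a minimal sketch: the `redis/redis-stack` image name and default port are assumptions based on Redis's published Docker images, and the exact `rvl` subcommands may vary across library versions.

```shell
# Install the library (this also installs the rvl CLI entry point)
pip install redisvl

# Option A: use a managed Redis Cloud instance (no local setup needed)
# Option B: run Redis locally via Docker; the Redis Stack image bundles
# the search/vector features that redisvl relies on (image name assumed)
docker run -d --name redis-stack -p 6379:6379 redis/redis-stack

# Verify the dedicated CLI is on the path
rvl version
```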

To optimise production search performance, it allows explicit configuration of index settings and dataset schema using redisvl. Defining, loading, and managing a custom schema is made straightforward with YAML files.
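A schema file of this kind might look like the following. All names, dimensions, and attribute values here are illustrative, not taken from the article, and the exact YAML layout accepted by redisvl can differ between versions.

```yaml
# Hypothetical schema for a document-search index
version: '0.1.0'

index:
  name: docs
  prefix: docs

fields:
  - name: content
    type: text
  - name: embedding
    type: vector
    attrs:
      dims: 384                # must match the embedding model's output size
      algorithm: flat          # or hnsw for approximate search at scale
      distance_metric: cosine
```

With such a file in place, the index can be defined once and then loaded and managed programmatically or from the `rvl` CLI.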

The VectorQuery feature, a fundamental component of redisvl, aims to simplify vector searches with optional filters, improving retrieval precision. Beyond basic querying, filters enable combining searches over structured data with vector similarity. The library also includes a vectoriser module for generating embeddings, providing access to popular embedding providers like Cohere, OpenAI, VertexAI, and HuggingFace.
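The idea behind combining structured filters with vector similarity can be illustrated in plain Python. This is a conceptual sketch only, not the redisvl `VectorQuery` API: the documents, embeddings, and the `category` filter are made up, and a real deployment would rank candidates inside Redis rather than in application code.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Toy documents: structured fields plus a tiny, made-up embedding.
docs = [
    {"title": "redis intro",   "category": "database", "embedding": [0.9, 0.1, 0.0]},
    {"title": "cooking pasta", "category": "food",     "embedding": [0.1, 0.9, 0.2]},
    {"title": "vector search", "category": "database", "embedding": [0.8, 0.2, 0.1]},
]

def vector_query(query_vec, category=None, num_results=2):
    """Filter on structured data first, then rank by vector similarity."""
    candidates = [d for d in docs if category is None or d["category"] == category]
    ranked = sorted(candidates,
                    key=lambda d: cosine_similarity(query_vec, d["embedding"]),
                    reverse=True)
    return ranked[:num_results]

results = vector_query([1.0, 0.0, 0.0], category="database")
print([d["title"] for d in results])  # most similar "database" docs first
```

The filter narrows the candidate set over structured data before similarity ranking, which is the retrieval-precision benefit the library's filtered vector queries aim at.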

Redisvl also includes semantic caching to improve the efficiency of applications interacting with LLMs by caching responses based on semantic similarity. This feature is claimed to reduce response times and API costs by reusing previously cached responses for similar queries. The library aims to add abstractions for LLM session management and contextual access control in the future.
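The mechanism behind semantic caching can be sketched in plain Python. This toy class is a conceptual illustration of the idea, not redisvl's `SemanticCache` API: the embeddings, the 0.9 threshold, and the class itself are assumptions for demonstration, and a real cache would store entries in Redis and embed prompts with a vectoriser.

```python
import math

def _cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

class ToySemanticCache:
    """Cache LLM responses keyed by prompt embeddings; a lookup hits when a
    stored prompt's embedding is similar enough to the new prompt's."""

    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.entries = []  # list of (embedding, response) pairs

    def store(self, embedding, response):
        self.entries.append((embedding, response))

    def check(self, embedding):
        best, best_sim = None, self.threshold
        for emb, response in self.entries:
            sim = _cosine(embedding, emb)
            if sim >= best_sim:
                best, best_sim = response, sim
        return best  # None on a cache miss -> call the LLM, then store()

cache = ToySemanticCache(threshold=0.9)
cache.store([1.0, 0.0], "Redis is an in-memory data store.")
print(cache.check([0.99, 0.05]))  # near-duplicate prompt: cache hit
print(cache.check([0.0, 1.0]))    # unrelated prompt: None (miss)
```

A hit on a semantically similar prompt skips the LLM call entirely, which is where the claimed savings in latency and API cost come from.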




Shritama Saha

Shritama (she/her) is a technology journalist at AIM who is passionate about exploring generative AI, with a special focus on big tech, databases, healthcare, DE&I, hiring in tech, and more.