Redis
Redis is a fast, open-source, in-memory data store. As part of the Redis Stack, the RediSearch module enables vector similarity (semantic) search, along with many other types of search.
Compatibility
Only available on Node.js.
LangChain.js accepts node-redis as the client for the Redis vector store.
Setup
- Run Redis with Docker on your computer by following the docs; a quick connectivity check is sketched after this list.
- Install the node-redis JS client with your preferred package manager:

npm install -S redis
yarn add redis
pnpm add redis

- Install the LangChain integration packages:

npm install @langchain/openai @langchain/redis @langchain/community @langchain/core
yarn add @langchain/openai @langchain/redis @langchain/community @langchain/core
pnpm add @langchain/openai @langchain/redis @langchain/community @langchain/core
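Once Redis Stack is running, it can be worth verifying the connection before wiring up the vector store. The following is a minimal sketch, assuming a local Redis Stack instance reachable at the default URL; it opens a node-redis client, pings the server, and lists the loaded modules (vector search requires the search module, which ships with Redis Stack).

import { createClient } from "redis";

// Assumes a local Redis Stack instance; adjust REDIS_URL as needed.
const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

// PING returns "PONG" when the server is reachable.
console.log(await client.ping());

// List loaded modules to confirm the search module is available.
console.log(await client.sendCommand(["MODULE", "LIST"]));

await client.disconnect();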
Index docs
import { createClient } from "redis";
import { OpenAIEmbeddings } from "@langchain/openai";
import { RedisVectorStore } from "@langchain/redis";
import { Document } from "@langchain/core/documents";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

const docs = [
  new Document({
    metadata: { foo: "bar" },
    pageContent: "redis is fast",
  }),
  new Document({
    metadata: { foo: "bar" },
    pageContent: "the quick brown fox jumped over the lazy dog",
  }),
  new Document({
    metadata: { baz: "qux" },
    pageContent: "lorem ipsum dolor sit amet",
  }),
  new Document({
    metadata: { baz: "qux" },
    pageContent: "consectetur adipiscing elit",
  }),
];

const vectorStore = await RedisVectorStore.fromDocuments(
  docs,
  new OpenAIEmbeddings(),
  {
    redisClient: client,
    indexName: "docs",
  }
);

await client.disconnect();
API Reference:
- OpenAIEmbeddings from @langchain/openai
- RedisVectorStore from @langchain/redis
- Document from @langchain/core/documents
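You don't have to rebuild the index every time; once it exists you can append to it. The sketch below is a minimal example, assuming the "docs" index created above: it points a RedisVectorStore at the existing index and uses addDocuments to embed and index one more document.

import { createClient } from "redis";
import { OpenAIEmbeddings } from "@langchain/openai";
import { RedisVectorStore } from "@langchain/redis";
import { Document } from "@langchain/core/documents";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

// Point at the existing "docs" index instead of recreating it.
const vectorStore = new RedisVectorStore(new OpenAIEmbeddings(), {
  redisClient: client,
  indexName: "docs",
});

// Embed and index an additional document alongside the ones added earlier.
await vectorStore.addDocuments([
  new Document({
    metadata: { foo: "bar" },
    pageContent: "redis supports vector similarity search",
  }),
]);

await client.disconnect();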
Query docs
import { createClient } from "redis";
import { ChatOpenAI, OpenAIEmbeddings } from "@langchain/openai";
import { RedisVectorStore } from "@langchain/redis";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createStuffDocumentsChain } from "langchain/chains/combine_documents";
import { createRetrievalChain } from "langchain/chains/retrieval";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

const vectorStore = new RedisVectorStore(new OpenAIEmbeddings(), {
  redisClient: client,
  indexName: "docs",
});

/* Simple standalone search in the vector DB */
const simpleRes = await vectorStore.similaritySearch("redis", 1);
console.log(simpleRes);
/*
[
  Document {
    pageContent: "redis is fast",
    metadata: { foo: "bar" }
  }
]
*/

/* Search in the vector DB using filters */
const filterRes = await vectorStore.similaritySearch("redis", 3, ["qux"]);
console.log(filterRes);
/*
[
  Document {
    pageContent: "consectetur adipiscing elit",
    metadata: { baz: "qux" },
  },
  Document {
    pageContent: "lorem ipsum dolor sit amet",
    metadata: { baz: "qux" },
  }
]
*/

/* Usage as part of a chain */
const model = new ChatOpenAI({ model: "gpt-3.5-turbo-1106" });
const questionAnsweringPrompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "Answer the user's questions based on the below context:\n\n{context}",
  ],
  ["human", "{input}"],
]);

const combineDocsChain = await createStuffDocumentsChain({
  llm: model,
  prompt: questionAnsweringPrompt,
});

const chain = await createRetrievalChain({
  retriever: vectorStore.asRetriever(),
  combineDocsChain,
});

const chainRes = await chain.invoke({ input: "What did the fox do?" });
console.log(chainRes);
/*
{
  input: 'What did the fox do?',
  chat_history: [],
  context: [
    Document {
      pageContent: 'the quick brown fox jumped over the lazy dog',
      metadata: [Object]
    },
    Document {
      pageContent: 'lorem ipsum dolor sit amet',
      metadata: [Object]
    },
    Document {
      pageContent: 'consectetur adipiscing elit',
      metadata: [Object]
    },
    Document { pageContent: 'redis is fast', metadata: [Object] }
  ],
  answer: 'The fox jumped over the lazy dog.'
}
*/

await client.disconnect();
API Reference:
- ChatOpenAI from @langchain/openai
- OpenAIEmbeddings from @langchain/openai
- RedisVectorStore from @langchain/redis
- ChatPromptTemplate from @langchain/core/prompts
- createStuffDocumentsChain from langchain/chains/combine_documents
- createRetrievalChain from langchain/chains/retrieval
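If you also need relevance scores alongside the documents, the vector store exposes similaritySearchWithScore. A minimal sketch, assuming the same "docs" index as above:

import { createClient } from "redis";
import { OpenAIEmbeddings } from "@langchain/openai";
import { RedisVectorStore } from "@langchain/redis";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

const vectorStore = new RedisVectorStore(new OpenAIEmbeddings(), {
  redisClient: client,
  indexName: "docs",
});

// Returns [Document, score] pairs so you can inspect or threshold on relevance.
const resultsWithScores = await vectorStore.similaritySearchWithScore("redis", 2);
for (const [doc, score] of resultsWithScores) {
  console.log(score, doc.pageContent);
}

await client.disconnect();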
Create index with options
To pass arguments for index creation, use the options offered by node-redis via the createIndexOptions parameter.
import { createClient } from "redis";
import { OpenAIEmbeddings } from "@langchain/openai";
import { RedisVectorStore } from "@langchain/redis";
import { Document } from "@langchain/core/documents";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

const docs = [
  new Document({
    metadata: { foo: "bar" },
    pageContent: "redis is fast",
  }),
  new Document({
    metadata: { foo: "bar" },
    pageContent: "the quick brown fox jumped over the lazy dog",
  }),
  new Document({
    metadata: { baz: "qux" },
    pageContent: "lorem ipsum dolor sit amet",
  }),
  new Document({
    metadata: { baz: "qux" },
    pageContent: "consectetur adipiscing elit",
  }),
];

const vectorStore = await RedisVectorStore.fromDocuments(
  docs,
  new OpenAIEmbeddings(),
  {
    redisClient: client,
    indexName: "docs",
    createIndexOptions: {
      TEMPORARY: 1000,
    },
  }
);

await client.disconnect();
API Reference:
- OpenAIEmbeddings from @langchain/openai
- RedisVectorStore from @langchain/redis
- Document from @langchain/core/documents
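To confirm the index was created with your options, you can inspect it directly with node-redis. A short sketch, assuming the search commands bundled with node-redis (client.ft) are available:

import { createClient } from "redis";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

// FT.INFO reports the index definition, field schema, and document counts.
const info = await client.ft.info("docs");
console.log(info);

await client.disconnect();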
Delete an index
import { createClient } from "redis";
import { OpenAIEmbeddings } from "@langchain/openai";
import { RedisVectorStore } from "@langchain/redis";
import { Document } from "@langchain/core/documents";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

const docs = [
  new Document({
    metadata: { foo: "bar" },
    pageContent: "redis is fast",
  }),
  new Document({
    metadata: { foo: "bar" },
    pageContent: "the quick brown fox jumped over the lazy dog",
  }),
  new Document({
    metadata: { baz: "qux" },
    pageContent: "lorem ipsum dolor sit amet",
  }),
  new Document({
    metadata: { baz: "qux" },
    pageContent: "consectetur adipiscing elit",
  }),
];

const vectorStore = await RedisVectorStore.fromDocuments(
  docs,
  new OpenAIEmbeddings(),
  {
    redisClient: client,
    indexName: "docs",
  }
);

await vectorStore.delete({ deleteAll: true });

await client.disconnect();
API Reference:
- OpenAIEmbeddings from @langchain/openai
- RedisVectorStore from @langchain/redis
- Document from @langchain/core/documents
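As an alternative, you can drop the index directly with node-redis, outside of LangChain. A minimal sketch, assuming the bundled search commands (client.ft) are available; note that FT.DROPINDEX by default leaves the underlying hashes in Redis:

import { createClient } from "redis";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

// Removes the index definition; the indexed hashes stay in Redis
// unless you also request document deletion.
await client.ft.dropIndex("docs");

await client.disconnect();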