Integrations · Updated 2026-03-18
LangGraph Integration
Use RetainDB as memory and checkpoints for LangGraph agents with stateful conversations.
Applies to: LangGraph
Integrate RetainDB with LangGraph for stateful agent applications.
Installation
```bash
npm install @retaindb/sdk langgraph
```

Quick Start
```typescript
import { RetainDBClient } from "@retaindb/sdk";
import { createLangGraphCheckpointAdapter } from "@retaindb/sdk/langgraph";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { ChatOpenAI } from "@langchain/openai";

const client = RetainDBClient.fromEnv();

// Create checkpoint adapter
const checkpoint = createLangGraphCheckpointAdapter(client, {
  user_id: "user-123",
});

// Use with LangGraph agent
const agent = createReactAgent({
  llm: new ChatOpenAI({ temperature: 0 }),
  tools,
  checkpointer: checkpoint,
});
```

LangGraph Checkpoint Adapter
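A LangGraph checkpointer persists a snapshot of graph state per thread so conversations survive process restarts. As a rough mental model of what gets stored, here is an illustrative record shape; this is not RetainDB's actual schema, just a sketch of the idea:

```typescript
// Illustrative shape of a stored checkpoint record (not the actual schema).
interface CheckpointRecord {
  threadId: string;     // Conversation identifier (the thread_id from config)
  checkpointId: string; // Ordered ID within the thread
  state: unknown;       // Serialized graph state (e.g. the message list)
  createdAt: string;    // ISO timestamp
}

const record: CheckpointRecord = {
  threadId: "conversation-1",
  checkpointId: "0001",
  state: { messages: [] },
  createdAt: new Date().toISOString(),
};
```

Each new step in a thread appends a record like this, which is how a later invoke with the same `thread_id` can pick up where the conversation left off.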
Basic Setup
```typescript
import { RetainDBClient } from "@retaindb/sdk";
import { createLangGraphCheckpointAdapter } from "@retaindb/sdk/langgraph";

const client = RetainDBClient.fromEnv();

const checkpoint = createLangGraphCheckpointAdapter(client, {
  user_id: "user-123",    // Required
  session_id: "chat-456", // Optional - for multi-session apps
});
```

Options
```typescript
interface LangGraphCheckpointOptions {
  user_id: string;              // Required
  session_id?: string;          // Optional
  checkpointNamespace?: string; // Optional - for organizing checkpoints
}
```

Using with Agents
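Every conversation an agent handles is keyed by a `thread_id` passed through the run config, as in the examples below. Deriving that ID deterministically from the user and session keeps checkpoints resumable across restarts. A minimal sketch, where `makeThreadId` is a hypothetical helper and not part of the SDK:

```typescript
// Hypothetical helper: derive a stable thread ID from user and session.
// The same inputs always produce the same ID, so a restarted process
// can resume the conversation's saved checkpoints.
function makeThreadId(userId: string, sessionId: string): string {
  return `user-${userId}-session-${sessionId}`;
}

const config = {
  configurable: { thread_id: makeThreadId("123", "chat-456") },
};
```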
Prebuilt Agent
```typescript
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { ChatOpenAI } from "@langchain/openai";

const agent = createReactAgent({
  llm: new ChatOpenAI({ temperature: 0 }),
  tools,
  checkpointer: checkpoint,
});

// Thread ID identifies the conversation
const config = { configurable: { thread_id: "conversation-1" } };

// First message
await agent.invoke(
  { messages: ["Hi, I'm looking for restaurant recommendations"] },
  config
);

// Second message - continues the same conversation
const response = await agent.invoke(
  { messages: ["What type of food do I prefer?"] },
  config
);
```

Custom Graph
```typescript
import { StateGraph, END } from "@langchain/langgraph";

const workflow = new StateGraph(AgentState)
  .addNode("agent", agentNode)
  .addEdge("__start__", "agent")
  .addEdge("agent", END)
  .compile({ checkpointer: checkpoint });

// Use with checkpoint
const config = { configurable: { thread_id: "thread-123" } };
const result = await workflow.invoke(state, config);
```

State Management
Saving State
The checkpoint adapter automatically saves agent state:
```typescript
// Agent state is automatically persisted on each invoke
await agent.invoke(
  { messages: ["Remember I prefer Italian food"] },
  config
);
// State saved to RetainDB
```

Loading State
```typescript
// State is automatically loaded from RetainDB
const result = await agent.invoke(
  { messages: ["What food do I prefer?"] },
  config
);
// Agent remembers "Italian food"
```

Memory Integration
Combine checkpoint with memory for richer context:
```typescript
import { RetainDBClient } from "@retaindb/sdk";
import { createLangChainMemoryAdapter } from "@retaindb/sdk/langchain";
import { createLangGraphCheckpointAdapter } from "@retaindb/sdk/langgraph";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { ChatOpenAI } from "@langchain/openai";

const client = RetainDBClient.fromEnv();

// Memory adapter for conversation history
const memory = createLangChainMemoryAdapter(client, {
  user_id: "user-123",
  session_id: "chat-456",
});

// Checkpoint adapter for LangGraph state
const checkpoint = createLangGraphCheckpointAdapter(client, {
  user_id: "user-123",
  session_id: "chat-456",
});

// Agent with both
const agent = createReactAgent({
  llm: new ChatOpenAI(),
  tools,
  memory,                   // LangChain memory
  checkpointer: checkpoint, // LangGraph checkpoint
});
```

Complete Example
```typescript
import { RetainDBClient } from "@retaindb/sdk";
import { createLangGraphCheckpointAdapter } from "@retaindb/sdk/langgraph";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { ChatOpenAI } from "@langchain/openai";

const client = RetainDBClient.fromEnv({
  baseUrl: "https://api.retaindb.com",
});

// Create checkpoint adapter
const checkpoint = createLangGraphCheckpointAdapter(client, {
  user_id: "user-123",
  session_id: "support-chat",
});

// Create agent with checkpoint
const agent = createReactAgent({
  llm: new ChatOpenAI({ temperature: 0 }),
  tools,
  checkpointer: checkpoint,
});

// Conversation config
const config = { configurable: { thread_id: "thread-456" } };

async function handleChat(message: string) {
  // Invoke with checkpoint
  const result = await agent.invoke(
    { messages: [message] },
    config
  );
  return result.messages[result.messages.length - 1].content;
}

// Use
await handleChat("I need help with my order");
await handleChat("It's order #12345");
await handleChat("What was my order number?");
// Agent remembers order #12345
```

Error Handling
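Adapter construction can throw, as shown below, but network-backed calls can also fail at invoke time (timeouts, transient connection errors). One option is a small retry wrapper with exponential backoff around any async call; the following is a sketch, and `invokeWithRetry` is illustrative rather than part of the SDK:

```typescript
// Hypothetical helper: retry an async operation with exponential backoff.
async function invokeWithRetry<T>(
  fn: () => Promise<T>,
  retries = 3,
  baseDelayMs = 200
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < retries; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      // Wait before the next attempt: 200ms, 400ms, 800ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  // All attempts failed - surface the last error to the caller
  throw lastError;
}

// Usage (agent and config as in the examples above):
// const result = await invokeWithRetry(() =>
//   agent.invoke({ messages: [message] }, config)
// );
```

Whether to retry should depend on the error: transient network failures are worth retrying, while validation errors like the one below are not.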
```typescript
try {
  const checkpoint = createLangGraphCheckpointAdapter(client, {
    user_id: "user-123",
  });
} catch (error) {
  if (error.code === "INVALID_USER_ID") {
    console.error("Invalid user ID");
  }
}
```

Best Practices
1. Use Consistent Thread IDs
```typescript
// Good - deterministic thread ID, so the same conversation can be resumed
const threadId = `user-${userId}-session-${sessionId}`;

// Bad - random thread ID, so saved state can never be found again
const badThreadId = Math.random().toString();
```

2. Combine Memory and Checkpoints
```typescript
// Use both for maximum context
const agent = createReactAgent({
  llm,
  tools,
  memory: RetainDBMemory,             // Conversation history
  checkpointer: RetainDBCheckpointer, // Agent state
});
```

Next steps
- SDK LangGraph Adapter — Adapter details
- LangGraph Docs — Official LangGraph documentation
- SDK Quickstart — Getting started