File System Chat Message History
FileSystemChatMessageHistory stores chat message history in a JSON file. For longer-term persistence across chat sessions, you can swap it in for the default in-memory chatHistory that backs chat memory classes like BufferMemory.
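To illustrate the idea (this is a hypothetical sketch using only Node built-ins, not the library's actual implementation), a JSON-file-backed history simply reads and rewrites a file on each operation, so messages survive process restarts:

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Hypothetical minimal JSON-file-backed message store, for illustration only.
type StoredMessage = { role: "human" | "ai"; content: string };

class JsonFileHistory {
  constructor(private filePath: string) {}

  getMessages(): StoredMessage[] {
    // Reading from disk means a new process sees earlier sessions' messages.
    if (!fs.existsSync(this.filePath)) return [];
    return JSON.parse(fs.readFileSync(this.filePath, "utf8"));
  }

  addMessage(message: StoredMessage): void {
    const messages = this.getMessages();
    messages.push(message);
    fs.writeFileSync(this.filePath, JSON.stringify(messages, null, 2));
  }
}

const file = path.join(os.tmpdir(), `demo-history-${process.pid}.json`);
const history = new JsonFileHistory(file);
history.addMessage({ role: "human", content: "Hi! I'm Jim." });
history.addMessage({ role: "ai", content: "Hi Jim! How can I assist you today?" });
console.log(history.getMessages().length); // → 2
```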
Setup
You'll first need to install the @langchain/community package:
- npm: npm install @langchain/community @langchain/core
- Yarn: yarn add @langchain/community @langchain/core
- pnpm: pnpm add @langchain/community @langchain/core
The usage example below also requires the @langchain/openai package:
- npm: npm install @langchain/openai @langchain/community @langchain/core
- Yarn: yarn add @langchain/openai @langchain/community @langchain/core
- pnpm: pnpm add @langchain/openai @langchain/community @langchain/core
Usage
import { ChatOpenAI } from "@langchain/openai";
import { FileSystemChatMessageHistory } from "@langchain/community/stores/message/file_system";
import { RunnableWithMessageHistory } from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
const model = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});
const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a helpful assistant. Answer all questions to the best of your ability.",
  ],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);
const chain = prompt.pipe(model).pipe(new StringOutputParser());
const chainWithHistory = new RunnableWithMessageHistory({
  runnable: chain,
  inputMessagesKey: "input",
  historyMessagesKey: "chat_history",
  getMessageHistory: async (sessionId) => {
    const chatHistory = new FileSystemChatMessageHistory({
      sessionId,
      userId: "user-id",
    });
    return chatHistory;
  },
});
const res1 = await chainWithHistory.invoke(
  { input: "Hi! I'm Jim." },
  { configurable: { sessionId: "langchain-test-session" } }
);
console.log({ res1 });
/*
 { res1: 'Hi Jim! How can I assist you today?' }
 */
const res2 = await chainWithHistory.invoke(
  { input: "What did I just say my name was?" },
  { configurable: { sessionId: "langchain-test-session" } }
);
console.log({ res2 });
/*
 { res2: 'You said your name was Jim.' }
 */
// Give this session a title
const chatHistory = (await chainWithHistory.getMessageHistory(
  "langchain-test-session"
)) as FileSystemChatMessageHistory;
await chatHistory.setContext({ title: "Introducing Jim" });
// List all sessions for the user
const sessions = await chatHistory.getAllSessions();
console.log(sessions);
/*
 [
 { id: 'langchain-test-session', context: { title: 'Introducing Jim' } }
 ]
 */
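The session listing above can be sketched with plain Node APIs. This is a hypothetical illustration (one JSON file per session, with a freeform context object stored alongside the messages), not the library's actual on-disk layout; the setContext and getAllSessions names mirror the example above:

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Hypothetical layout: one JSON file per session, each holding its
// messages plus a freeform context object (e.g. a title).
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "sessions-"));

function setContext(sessionId: string, context: Record<string, string>): void {
  const file = path.join(dir, `${sessionId}.json`);
  const data = fs.existsSync(file)
    ? JSON.parse(fs.readFileSync(file, "utf8"))
    : { messages: [] };
  data.context = context;
  fs.writeFileSync(file, JSON.stringify(data));
}

function getAllSessions(): { id: string; context?: Record<string, string> }[] {
  // Each file in the directory corresponds to one session.
  return fs.readdirSync(dir).map((name) => {
    const data = JSON.parse(fs.readFileSync(path.join(dir, name), "utf8"));
    return { id: path.basename(name, ".json"), context: data.context };
  });
}

setContext("langchain-test-session", { title: "Introducing Jim" });
console.log(getAllSessions());
// [ { id: 'langchain-test-session', context: { title: 'Introducing Jim' } } ]
```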
API Reference:
- ChatOpenAI from @langchain/openai
- FileSystemChatMessageHistory from @langchain/community/stores/message/file_system
- RunnableWithMessageHistory from @langchain/core/runnables
- StringOutputParser from @langchain/core/output_parsers
- ChatPromptTemplate from @langchain/core/prompts
- MessagesPlaceholder from @langchain/core/prompts