Because PlanetScale works via a REST API, you can use it with Vercel Edge, Cloudflare Workers, and other serverless environments. For longer-term persistence across chat sessions, you can swap out the default in-memory chatHistory that backs chat memory classes like BufferMemory for a PlanetScale database instance.

Setup

You will need to install @planetscale/database in your project:
npm
npm install @langchain/openai @planetscale/database @langchain/community @langchain/core
You will also need a PlanetScale account and a database to connect to. See the PlanetScale docs for instructions on how to create an HTTP client.

Usage

Each chat history session stored in your PlanetScale database must have a unique id. The config parameter is passed directly into the new Client() constructor of @planetscale/database and accepts all the same arguments.
import { BufferMemory } from "langchain/memory";
import { PlanetScaleChatMessageHistory } from "@langchain/community/stores/message/planetscale";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

const memory = new BufferMemory({
  chatHistory: new PlanetScaleChatMessageHistory({
    tableName: "stored_message",
    sessionId: "lc-example",
    config: {
      url: "ADD_YOURS_HERE", // Override with your own database instance's URL
    },
  }),
});

const model = new ChatOpenAI({
  model: "gpt-4o-mini",
});
const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{
  res1: {
    text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
  }
}
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
{
  res2: {
    text: "You said your name was Jim."
  }
}
*/
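Because config accepts the same options as the @planetscale/database Client constructor, you can also connect with separate credentials instead of a single connection URL. The sketch below assumes the host/username/password form of the driver's connection options; all values shown are placeholders:

```typescript
import { BufferMemory } from "langchain/memory";
import { PlanetScaleChatMessageHistory } from "@langchain/community/stores/message/planetscale";

// Any options accepted by `new Client()` can be passed via `config`,
// e.g. host/username/password instead of a single URL.
const memory = new BufferMemory({
  chatHistory: new PlanetScaleChatMessageHistory({
    tableName: "stored_message",
    sessionId: "lc-example",
    config: {
      host: "ADD_YOURS_HERE", // Placeholder values — use your own credentials
      username: "ADD_YOURS_HERE",
      password: "ADD_YOURS_HERE",
    },
  }),
});
```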

Advanced Usage

You can also directly pass in a previously created @planetscale/database client instance:
import { BufferMemory } from "langchain/memory";
import { PlanetScaleChatMessageHistory } from "@langchain/community/stores/message/planetscale";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { Client } from "@planetscale/database";

// Create your own Planetscale database client
const client = new Client({
  url: "ADD_YOURS_HERE", // Override with your own database instance's URL
});

const memory = new BufferMemory({
  chatHistory: new PlanetScaleChatMessageHistory({
    tableName: "stored_message",
    sessionId: "lc-example",
    client, // You can reuse your existing database client
  }),
});

const model = new ChatOpenAI({
  model: "gpt-4o-mini",
});
const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{
  res1: {
    text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
  }
}
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
{
  res2: {
    text: "You said your name was Jim."
  }
}
*/
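For lower-level control — for example, seeding, inspecting, or wiping a stored session — you can also use the message history class on its own, without a chain. This is a sketch assuming the standard getMessages(), addMessage(), and clear() methods from the base chat message history interface:

```typescript
import { PlanetScaleChatMessageHistory } from "@langchain/community/stores/message/planetscale";
import { AIMessage, HumanMessage } from "@langchain/core/messages";

const chatHistory = new PlanetScaleChatMessageHistory({
  tableName: "stored_message",
  sessionId: "lc-example",
  config: {
    url: "ADD_YOURS_HERE", // Override with your own database instance's URL
  },
});

// Append messages to the stored session...
await chatHistory.addMessage(new HumanMessage("Hi! I'm Jim."));
await chatHistory.addMessage(new AIMessage("Hello Jim! How can I help?"));

// ...read them back...
const messages = await chatHistory.getMessages();
console.log(messages.length);

// ...and wipe the session when you're done.
await chatHistory.clear();
```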