Because PlanetScale works via a REST API, you can use it with Vercel Edge, Cloudflare Workers, and other serverless environments. For longer-term persistence across chat sessions, you can swap out the default in-memory `chatHistory` that backs chat memory classes like `BufferMemory` for a PlanetScale database instance.
Each chat history session stored in your PlanetScale database must have a unique id.
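One way to guarantee this is to generate a fresh id for every conversation. The sketch below is an assumption about how you might do it in your own app (the `userId` prefix is hypothetical), not part of the integration itself:

```typescript
import { randomUUID } from "node:crypto";

// Derive a unique session id for each conversation. Prefixing with a
// stable user identifier keeps different users' histories easy to tell
// apart while the random UUID keeps individual sessions isolated.
function newSessionId(userId: string): string {
  return `${userId}-${randomUUID()}`;
}

const first = newSessionId("user-42");
const second = newSessionId("user-42");
// Two sessions for the same user still get distinct ids.
```

You would then pass the generated value as the `sessionId` option when constructing `PlanetScaleChatMessageHistory`.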
The `config` parameter is passed directly into the `new Client()` constructor of `@planetscale/database` and accepts all the same arguments.
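For example, instead of a single connection `url` you can pass the discrete credential fields that the `@planetscale/database` `Client` constructor accepts. The values below are placeholders for illustration only; this fragment will not connect until you substitute your own credentials:

```typescript
import { PlanetScaleChatMessageHistory } from "@langchain/community/stores/message/planetscale";

const chatHistory = new PlanetScaleChatMessageHistory({
  tableName: "stored_message",
  sessionId: "lc-example",
  config: {
    // Discrete credentials instead of a connection URL:
    host: "ADD_YOURS_HERE",
    username: "ADD_YOURS_HERE",
    password: "ADD_YOURS_HERE",
  },
});
```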
```typescript
import { BufferMemory } from "langchain/memory";
import { PlanetScaleChatMessageHistory } from "@langchain/community/stores/message/planetscale";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

const memory = new BufferMemory({
  chatHistory: new PlanetScaleChatMessageHistory({
    tableName: "stored_message",
    sessionId: "lc-example",
    config: {
      url: "ADD_YOURS_HERE", // Override with your own database instance's URL
    },
  }),
});

const model = new ChatOpenAI({
  model: "gpt-4o-mini",
});

const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });
/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
```
You can also directly pass in a previously created @planetscale/database client instance:
```typescript
import { BufferMemory } from "langchain/memory";
import { PlanetScaleChatMessageHistory } from "@langchain/community/stores/message/planetscale";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { Client } from "@planetscale/database";

// Create your own PlanetScale database client
const client = new Client({
  url: "ADD_YOURS_HERE", // Override with your own database instance's URL
});

const memory = new BufferMemory({
  chatHistory: new PlanetScaleChatMessageHistory({
    tableName: "stored_message",
    sessionId: "lc-example",
    client, // You can reuse your existing database client
  }),
});

const model = new ChatOpenAI({
  model: "gpt-4o-mini",
});

const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });
/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
```