`ChatKinetica.load_messages_from_context()` will retrieve the context information from the database so that it can be used to create a chat prompt.
The chat prompt consists of a `SystemMessage` and pairs of `HumanMessage`/`AIMessage` that contain the samples, which are question/SQL pairs. You can append sample pairs to this list, but it is not intended to facilitate a typical natural-language conversation.
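Since the prompt is just a list of messages, adding another sample is a list append. A minimal sketch of the structure, using LangChain's `(role, text)` tuple shorthand rather than the message classes; the table name and questions are invented for illustration:

```python
# Sketch of the prompt structure: one system message followed by
# alternating question/SQL sample pairs. The demo.user_profiles
# table and the sample questions are made-up examples.
messages = [
    ("system", "Generate SQL for the demo.user_profiles table."),
    ("human", "How many male users are there?"),
    ("ai", "SELECT count(1) FROM demo.user_profiles WHERE sex = 'M';"),
]

# Appending another sample pair must preserve the human/ai alternation.
messages.append(("human", "What is the most common last name?"))
messages.append(("ai",
    "SELECT last_name FROM demo.user_profiles "
    "GROUP BY last_name ORDER BY count(1) DESC LIMIT 1;"))

roles = [role for role, _ in messages[1:]]
```

The alternation matters because the samples are few-shot examples, not conversation turns.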
When you create a chain from the chat prompt and execute it, the Kinetica LLM will generate SQL from the input. Optionally, you can use `KineticaSqlOutputParser` to execute the SQL and return the result as a dataframe.
Currently, two LLMs are supported for SQL generation:
Set the connection in the `.env` file of the project:

- `KINETICA_URL`: Database connection URL
- `KINETICA_USER`: Database user
- `KINETICA_PASSWD`: Secure password

If you can create an instance of `KineticaChatLLM`, then you are successfully connected.
We will use the `faker` package to create a dataframe with 100 fake profiles.
An LLM context can be created manually with the `CREATE OR REPLACE CONTEXT` syntax. Here we create a context from the SQL syntax, referencing the table we created.
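A hedged sketch of what such a statement can look like; the context name, table, comment, and sample question below are illustrative, and the exact grammar is defined by Kinetica's own documentation:

```sql
-- Illustrative context definition: expose one table and one
-- question/SQL sample pair to the SQL-generating LLM.
CREATE OR REPLACE CONTEXT demo.test_llm_ctx
(
    TABLE = demo.user_profiles,
    COMMENT = 'Contains user profiles.'
),
(
    SAMPLES = (
        'How many male users are there?' =
        'select count(1) as num_users
         from demo.user_profiles
         where sex = ''M'';'
    )
);
```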
The `load_messages_from_context()` function will retrieve a context from the DB and convert it into a list of chat messages that we use to create a `ChatPromptTemplate`.
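Conceptually, the conversion has this shape; the function below is our illustrative stand-in, not the library API, which needs a live Kinetica connection:

```python
# Stand-in showing the shape of the conversion: a retrieved context
# becomes one system message plus a human/ai pair per SQL sample.
def messages_from_context(table_comment: str, samples: dict) -> list:
    messages = [("system", table_comment)]
    for question, sql in samples.items():
        messages.append(("human", question))
        messages.append(("ai", sql))
    return messages

ctx_messages = messages_from_context(
    "Contains user profiles.",
    {"How many male users are there?":
     "SELECT count(1) FROM demo.user_profiles WHERE sex = 'M';"},
)
```

Appending a final `("human", "{input}")` placeholder to this list before building the template lets the chain substitute the user's question at invoke time.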
The chain ends with a `KineticaSqlOutputParser` that will execute the SQL and return a dataframe. This step is optional; if we left it out, only the SQL would be returned.
The result is a `KineticaSqlResponse` containing the generated SQL and data. The question must be relevant to the LLM context we used to create the prompt.
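The response bundles both artifacts; a rough stdlib sketch of its shape (the dataclass and its field names are our assumption, not the package's definition):

```python
from dataclasses import dataclass
from typing import Any

# Rough sketch of a response carrying both the generated SQL and
# the executed result. The real KineticaSqlResponse is defined by
# the langchain-community package; plain rows stand in for a
# pandas DataFrame here.
@dataclass
class SqlResponse:
    sql: str
    dataframe: Any

response = SqlResponse(
    sql="SELECT count(1) FROM demo.user_profiles WHERE sex = 'M';",
    dataframe=[(48,)],  # illustrative row, not a real query result
)
```

Keeping the SQL alongside the data makes it easy to log or inspect what the LLM actually generated for each question.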