Timbr integrates natural language inputs with Timbr’s ontology-driven semantic layer. Building on Timbr’s ontology capabilities, the SDK connects to Timbr data models and uses their semantic relationships and annotations, enabling users to query data in business-friendly language.
Timbr provides a pre-built SQL agent, `TimbrSqlAgent`, which covers the end-to-end flow from user prompt, through semantic SQL query generation and validation, to query execution and result analysis.
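As a minimal sketch, assuming an OpenAI chat model and placeholder connection details (verify parameter names and the invoke input against your installed version):

```python
from langchain_openai import ChatOpenAI
from langchain_timbr import TimbrSqlAgent

# Placeholder connection details -- replace with your own Timbr environment.
agent = TimbrSqlAgent(
    llm=ChatOpenAI(model="gpt-4o", temperature=0),
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
)

# One call covers concept identification, SQL generation, validation,
# execution, and answer generation for the prompt.
result = agent.invoke("What are the total sales for last month?")
```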
For customizations and partial usage, you can use LangChain chains and LangGraph nodes with our 5 main tools:
- `IdentifyTimbrConceptChain` & `IdentifyConceptNode` - Identify relevant concepts from user prompts
- `GenerateTimbrSqlChain` & `GenerateTimbrSqlNode` - Generate SQL queries from natural language prompts
- `ValidateTimbrSqlChain` & `ValidateSemanticSqlNode` - Validate SQL queries against Timbr knowledge graph schemas
- `ExecuteTimbrQueryChain` & `ExecuteSemanticQueryNode` - Execute (semantic and regular) SQL queries against Timbr knowledge graph databases
- `GenerateAnswerChain` & `GenerateResponseNode` - Generate human-readable answers based on a given prompt and data rows
Additionally, `langchain-timbr` provides `TimbrLlmConnector` for manual integration with Timbr’s semantic layer using LLM providers.
For a comprehensive example of the `langchain-timbr` integration, see the demo notebook.
Setting up
Installation
Install the package
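For example, using pip:

```bash
pip install langchain-timbr
```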
Optional: Install with selected LLM provider
Choose one of: openai, anthropic, google, azure_openai, snowflake, databricks, vertex_ai (or 'all')
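For instance, assuming the standard pip extras syntax, to pull in the OpenAI provider dependencies (or all of them):

```bash
pip install "langchain-timbr[openai]"

# or install every supported provider
pip install "langchain-timbr[all]"
```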
Configuration
Starting from `langchain-timbr` v2.0.0, all chains, agents, and nodes support optional environment-based configuration. You can set the following environment variables to provide default values and simplify setup for the provided tools:
Timbr Connection Parameters
- `TIMBR_URL`: Default Timbr server URL
- `TIMBR_TOKEN`: Default Timbr authentication token
- `TIMBR_ONTOLOGY`: Default ontology/knowledge graph name
When these variables are set, the connection parameters (url, token, ontology) become optional in all chain and agent constructors and will use the environment values as defaults.
LLM Configuration Parameters
- LLM_TYPE: The type of LLM provider (one of the langchain_timbr LlmTypes enum values: 'openai-chat', 'anthropic-chat', 'chat-google-generative-ai', 'azure-openai-chat', 'snowflake-cortex', 'chat-databricks')
- LLM_API_KEY: The API key for authenticating with the LLM provider
- LLM_MODEL: The model name or deployment to use
- LLM_TEMPERATURE: Temperature setting for the LLM
- LLM_ADDITIONAL_PARAMS: Additional parameters as dict or JSON string
When these variables are set, the llm parameter becomes optional and the tools will fall back to an LlmWrapper configured from the environment.
Example environment setup:
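For example, in a shell (all values below are placeholders):

```bash
# Timbr connection defaults
export TIMBR_URL="https://your-timbr-app.com/"
export TIMBR_TOKEN="tk_XXXXXXXXXXXX"
export TIMBR_ONTOLOGY="timbr_knowledge_graph"

# LLM defaults
export LLM_TYPE="openai-chat"
export LLM_API_KEY="sk-..."
export LLM_MODEL="gpt-4o"
export LLM_TEMPERATURE="0"
export LLM_ADDITIONAL_PARAMS='{"max_tokens": 1024}'
```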
Querying the semantic layer
We can now use Timbr’s chains to query the semantic layer. Import and utilize your intended chain/node, or use `TimbrLlmConnector` to manually integrate with Timbr’s semantic layer.

ExecuteTimbrQueryChain example
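A minimal sketch, assuming an OpenAI chat model and placeholder connection details; the "prompt" input key and "rows" output key follow the package documentation, but verify them against your installed version:

```python
from langchain_openai import ChatOpenAI
from langchain_timbr import ExecuteTimbrQueryChain

llm = ChatOpenAI(model="gpt-4o", temperature=0)

# url/token/ontology may be omitted when TIMBR_URL, TIMBR_TOKEN, and
# TIMBR_ONTOLOGY are set in the environment (v2.0.0+).
execute_timbr_query_chain = ExecuteTimbrQueryChain(
    llm=llm,
    url="https://your-timbr-app.com/",   # placeholder
    token="tk_XXXXXXXXXXXX",             # placeholder
    ontology="timbr_knowledge_graph",    # placeholder
)

result = execute_timbr_query_chain.invoke(
    {"prompt": "What are the total sales for last month?"}
)
print(result["rows"])  # result rows returned by the semantic query
```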
Using multiple chains with SequentialChain
You can combine multiple Timbr chains to create more complex workflows, as sketched below with LangChain’s SequentialChain.
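A sketch that pipes ExecuteTimbrQueryChain into GenerateAnswerChain; the key names used to wire the chains ("prompt", "rows", "answer") are assumptions based on the chain descriptions above, so adjust them to your installed version:

```python
from langchain.chains import SequentialChain
from langchain_openai import ChatOpenAI
from langchain_timbr import ExecuteTimbrQueryChain, GenerateAnswerChain

llm = ChatOpenAI(model="gpt-4o", temperature=0)

# Connection parameters are taken here from TIMBR_URL / TIMBR_TOKEN / TIMBR_ONTOLOGY;
# pass url/token/ontology explicitly if you are not using environment configuration.
execute_chain = ExecuteTimbrQueryChain(llm=llm)
answer_chain = GenerateAnswerChain(llm=llm)

# SequentialChain wires the chains by their input/output keys: the executed
# rows from the first chain feed the answer generation in the second.
pipeline = SequentialChain(
    chains=[execute_chain, answer_chain],
    input_variables=["prompt"],
    output_variables=["answer"],
)

result = pipeline.invoke({"prompt": "What are the total sales for last month?"})
print(result["answer"])
```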
Using the TimbrLlmConnector

For manual integration with Timbr’s semantic layer, you can use the `TimbrLlmConnector`, which includes the following methods (a usage sketch follows the list):
- `get_ontologies` - List Timbr’s semantic knowledge graphs
- `get_concepts` - List the concepts of the selected knowledge graph ontology
- `get_views` - List the views of the selected knowledge graph ontology
- `determine_concept` - Identify relevant concepts from user prompts
- `generate_sql` - Generate SQL queries from natural language prompts
- `validate_sql` - Validate SQL queries against Timbr knowledge graph schemas
- `run_timbr_query` - Execute (semantic and regular) SQL queries against Timbr knowledge graph databases
- `run_llm_query` - Execute the agent pipeline to determine the concept, generate SQL, and run the query from a natural language prompt
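A brief usage sketch; the constructor arguments and call signatures shown here are assumptions based on the method list above, so verify them against your installed version:

```python
from langchain_openai import ChatOpenAI
from langchain_timbr import TimbrLlmConnector

# Placeholder connection details -- replace with your own Timbr environment.
connector = TimbrLlmConnector(
    llm=ChatOpenAI(model="gpt-4o", temperature=0),
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
)

ontologies = connector.get_ontologies()  # list available knowledge graphs
concept = connector.determine_concept("What are the total sales for last month?")

# Run the full determine-concept -> generate-SQL -> execute pipeline in one call.
result = connector.run_llm_query("What are the total sales for last month?")
```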