The SAP HANA Knowledge Graph Engine is the triple store of the SAP HANA Cloud database: it stores RDF data and is queried with SPARQL.
Setup & Installation
You must have an SAP HANA Cloud instance with the triple store feature enabled. For detailed instructions, refer to Enable Triple Store.

To use the SAP HANA Knowledge Graph Engine with LangChain, install the `langchain-hana` package:
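```bash
pip install -U langchain-hana
```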
HanaRdfGraph Class
Creating a HanaRdfGraph instance
The constructor requires:
- `connection`: an active `hdbcli.dbapi.connect(...)` instance (see the connection sketch after this list)
- `graph_uri`: the named graph (or `"DEFAULT"`) where your RDF data lives
- One of:
  - `ontology_query`: a SPARQL CONSTRUCT query to extract schema triples
  - `ontology_uri`: a hosted ontology graph URI
  - `ontology_local_file` + `ontology_local_file_format`: a local Turtle/RDF file
  - `auto_extract_ontology=True` (not recommended for production; see note below)
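The examples in the following sections assume an open `connection`. A minimal sketch of creating one with `hdbcli`; the host and credentials are placeholders:

```python
from hdbcli import dbapi

# Placeholder connection details for an SAP HANA Cloud instance
connection = dbapi.connect(
    address="<your-instance>.hanacloud.ondemand.com",
    port=443,
    user="<username>",
    password="<password>",
    autocommit=True,
)
```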
graph_uri vs. Ontology
- `graph_uri`: the named graph in your SAP HANA Cloud instance that contains your instance data (sometimes 100k+ triples). If `None`, `""`, or `"DEFAULT"` is provided, the default graph is used.
- Ontology: a lean schema (typically ~50-100 triples) describing classes, properties, domains, ranges, labels, comments, and subclass relationships. The ontology guides SPARQL generation and result interpretation.
Creating a graph instance with DEFAULT graph
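A minimal sketch, assuming `HanaRdfGraph` is importable from the top level of `langchain_hana` and reusing the `connection` created above:

```python
from langchain_hana import HanaRdfGraph  # import path is an assumption

# Work against the default graph; the ontology is auto-extracted here for brevity
graph = HanaRdfGraph(
    connection=connection,
    graph_uri="DEFAULT",
    auto_extract_ontology=True,
)
```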
More info on the DEFAULT graph can be found at DEFAULT Graph and Named Graphs.

Creating a graph instance with a graph_uri
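Point `graph_uri` at the named graph that holds your instance data. A sketch with a placeholder graph URI:

```python
# Hypothetical named graph holding the instance data
graph = HanaRdfGraph(
    connection=connection,
    graph_uri="http://example.org/puppets",
    auto_extract_ontology=True,  # swap in an explicit ontology option for production use
)
```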
Creating a graph instance with a remote ontology_uri
Load the schema directly from a hosted graph URI.
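A sketch, assuming the ontology is published as its own named graph; both URIs are placeholders:

```python
graph = HanaRdfGraph(
    connection=connection,
    graph_uri="http://example.org/puppets",               # data graph (placeholder)
    ontology_uri="http://example.org/puppets/ontology",   # hosted ontology graph (placeholder)
)
```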
Creating a graph instance with a custom ontology_query
Use a custom CONSTRUCT query to selectively extract schema triples.
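For example, a hypothetical CONSTRUCT query that keeps only schema-level predicates from an ontology graph; the graph URI is a placeholder:

```python
ontology_query = """
PREFIX rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

CONSTRUCT { ?s ?p ?o }
FROM <http://example.org/puppets/ontology>
WHERE {
    ?s ?p ?o .
    FILTER (?p IN (rdf:type, rdfs:label, rdfs:comment,
                   rdfs:domain, rdfs:range, rdfs:subClassOf))
}
"""

graph = HanaRdfGraph(
    connection=connection,
    graph_uri="http://example.org/puppets",
    ontology_query=ontology_query,
)
```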
Creating a graph instance with a local RDF file
(`ontology_local_file` + `ontology_local_file_format`): Load the schema from a local RDF ontology file.
Supported RDF formats are Turtle, RDF/XML, JSON-LD, N-Triples, Notation-3, TriG, TriX, and N-Quads.
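A sketch with a hypothetical local Turtle file; treat the exact format identifier passed to `ontology_local_file_format` as an assumption:

```python
graph = HanaRdfGraph(
    connection=connection,
    graph_uri="http://example.org/puppets",
    ontology_local_file="puppets_ontology.ttl",   # hypothetical local file
    ontology_local_file_format="turtle",          # format name is an assumption
)
```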
Auto extraction of ontology
(`auto_extract_ontology=True`): Infer schema information directly from your instance data.
Note: Auto-extraction is not recommended for production, as it generally omits important triples such as `rdfs:label`, `rdfs:comment`, and `rdfs:subClassOf`.
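For quick exploration it can still be convenient; a minimal sketch:

```python
graph = HanaRdfGraph(
    connection=connection,
    graph_uri="http://example.org/puppets",
    auto_extract_ontology=True,  # fine for exploration; see the note above for production
)
```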
Executing SPARQL queries
You can use the `query()` method to execute arbitrary SPARQL queries (SELECT, ASK, CONSTRUCT, etc.) on the data graph.
The method has the following parameters (a usage sketch follows the list):
- `query`: the SPARQL query string.
- `content_type`: the response format for the output (default is CSV):
  - CSV: `"sparql-results+csv"`
  - JSON: `"sparql-results+json"`
  - XML: `"sparql-results+xml"`
  - TSV: `"sparql-results+tsv"`
Note: CONSTRUCT and ASK queries return results in `turtle` and `boolean` formats, respectively.

Let us insert some data into the Puppets graph.
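A sketch of such an insert via `query()`, assuming the method also accepts SPARQL UPDATE statements; the graph and resource URIs are illustrative placeholders:

```python
insert_query = """
PREFIX ex: <http://example.org/puppets/>

INSERT DATA {
    GRAPH <http://example.org/puppets> {
        ex:kermit a ex:Puppet ;
                  ex:name "Kermit" ;
                  ex:friendOf ex:missPiggy .
        ex:missPiggy a ex:Puppet ;
                     ex:name "Miss Piggy" .
    }
}
"""

graph.query(insert_query)
```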