Oracle Cloud Infrastructure (OCI) Generative AI is a fully managed service that provides a set of state-of-the-art, customizable large language models (LLMs) covering a wide range of use cases, all available through a single API.
Using the OCI Generative AI service, you can access ready-to-use pretrained models, or create and host your own fine-tuned custom models based on your own data on dedicated AI clusters. Detailed documentation of the service and API is available here and here. This notebook explains how to use OCI's Generative AI models with LangChain.
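Before running the examples below, you will need the OCI Python SDK and the LangChain community integrations installed; a minimal setup step (package names as published on PyPI):

```shell
# install the OCI SDK and the LangChain community integration package
pip install -U oci langchain-community
```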
```python
from langchain_community.embeddings import OCIGenAIEmbeddings

# use the default authentication method (API key)
embeddings = OCIGenAIEmbeddings(
    model_id="MY_EMBEDDING_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

query = "This is a query in English."
response = embeddings.embed_query(query)
print(response)

documents = ["This is a sample document", "and here is another one"]
response = embeddings.embed_documents(documents)
print(response)
```
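A common next step after embedding a query and a set of documents is to rank the documents by cosine similarity to the query. The sketch below uses plain Python and hypothetical stand-in vectors in place of real `embed_query`/`embed_documents` output:

```python
import math


def cosine_similarity(a, b):
    # dot product divided by the product of the vector magnitudes
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# hypothetical stand-ins for embeddings returned by the service
query_vec = [0.1, 0.9, 0.2]
doc_vecs = [[0.1, 0.8, 0.3], [0.9, 0.1, 0.0]]

# rank document indices by similarity to the query, best first
ranked = sorted(
    range(len(doc_vecs)),
    key=lambda i: cosine_similarity(query_vec, doc_vecs[i]),
    reverse=True,
)
print(ranked)  # [0, 1] — document 0 is closer to the query
```

With real embeddings, the same ranking logic applies; vector stores in LangChain perform an equivalent similarity search for you at scale.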
```python
# use a session token to authenticate
embeddings = OCIGenAIEmbeddings(
    model_id="MY_EMBEDDING_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
    auth_type="SECURITY_TOKEN",
    auth_profile="MY_PROFILE",  # replace with your profile name
    auth_file_location="MY_CONFIG_FILE_LOCATION",  # replace with the config file location containing the profile
)

query = "This is a sample query"
response = embeddings.embed_query(query)
print(response)

documents = ["This is a sample document", "and here is another one"]
response = embeddings.embed_documents(documents)
print(response)
```
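Because the examples above require live OCI credentials, it can be useful to test downstream pipeline code against a stand-in that exposes the same `embed_query`/`embed_documents` surface. The stub below is purely illustrative (not part of the OCI SDK or LangChain) and returns deterministic vectors without any network calls:

```python
class FakeEmbeddings:
    """Deterministic stand-in for OCIGenAIEmbeddings, useful in unit tests."""

    def __init__(self, size=4):
        self.size = size  # dimensionality of the fake vectors

    def embed_query(self, text):
        # derive a deterministic vector from the text's character codes
        total = sum(ord(c) for c in text)
        return [(total % (i + 7)) / 10.0 for i in range(self.size)]

    def embed_documents(self, texts):
        return [self.embed_query(t) for t in texts]


emb = FakeEmbeddings()
vecs = emb.embed_documents(["This is a sample document", "and here is another one"])
print(len(vecs), len(vecs[0]))  # 2 4
```

Swapping a real `OCIGenAIEmbeddings` instance in for the stub should then be a one-line change, since both objects are used through the same two methods.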