This will help you get started with OpenAI embedding models using LangChain. For detailed documentation on OpenAIEmbeddings features and configuration options, please refer to the API reference.
Overview
Integration details
Setup
To access OpenAI embedding models you'll need to create an OpenAI account, get an API key, and install the langchain-openai integration package.
Credentials
Head to platform.openai.com to sign up for OpenAI and generate an API key. Once you've done this, set the OPENAI_API_KEY environment variable:
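For example (a minimal sketch; prompting with getpass is just one common way to supply the key):

```python
import getpass
import os

# Prompt for the key only if it isn't already set in the environment
if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
```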
Installation
The LangChain OpenAI integration lives in the langchain-openai package:
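For instance, assuming a pip-based environment:

```bash
pip install -qU langchain-openai
```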
Instantiation
Now we can instantiate our model object and generate embeddings:
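A minimal sketch (text-embedding-3-large is used as an illustrative model name; any supported OpenAI embedding model works):

```python
from langchain_openai import OpenAIEmbeddings

# Instantiate the embeddings model; the API key is read from OPENAI_API_KEY
embeddings = OpenAIEmbeddings(model="text-embedding-3-large")
```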
Azure OpenAI v1 API support: As of langchain-openai>=1.0.1, OpenAIEmbeddings can be used directly with Azure OpenAI endpoints using the new v1 API, including support for Microsoft Entra ID authentication. See the Using with Azure OpenAI section below for details.
Indexing and Retrieval
Embedding models are often used in retrieval-augmented generation (RAG) flows, both as part of indexing data as well as later retrieving it. For more detailed instructions, please see our RAG tutorials. Below, see how to index and retrieve data using the embeddings object we initialized above. In this example, we will index and retrieve a sample document in the InMemoryVectorStore.
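A minimal sketch of that flow, assuming the embeddings object from the Instantiation step and an illustrative sample text:

```python
from langchain_core.vectorstores import InMemoryVectorStore

# A sample document to index (illustrative text)
text = "LangChain is the framework for building context-aware reasoning applications"

# Index the document using the embeddings object initialized above
vectorstore = InMemoryVectorStore.from_texts([text], embedding=embeddings)

# Use the vector store as a retriever and fetch the most similar document
retriever = vectorstore.as_retriever()
retrieved_documents = retriever.invoke("What is LangChain?")
print(retrieved_documents[0].page_content)
```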
Direct Usage
Under the hood, the vectorstore and retriever implementations are calling embeddings.embed_documents(...) and embeddings.embed_query(...) to create embeddings for the text(s) used in from_texts and retrieval invoke operations, respectively.
You can directly call these methods to get embeddings for your own use cases.
Embed single texts
You can embed single texts or documents with embed_query:
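For example (the sample text is illustrative):

```python
text = "LangChain is the framework for building context-aware reasoning applications"
single_vector = embeddings.embed_query(text)
print(str(single_vector)[:100])  # Show the first 100 characters of the vector
```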
Embed multiple texts
You can embed multiple texts with embed_documents:
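For example (both sample texts are illustrative):

```python
text = "LangChain is the framework for building context-aware reasoning applications"
text2 = "LangGraph is a library for building stateful, multi-actor applications with LLMs"
two_vectors = embeddings.embed_documents([text, text2])
for vector in two_vectors:
    print(str(vector)[:100])  # Show the first 100 characters of each vector
```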
Using with Azure OpenAI
Azure OpenAI v1 API support: As of langchain-openai>=1.0.1, OpenAIEmbeddings can be used directly with Azure OpenAI endpoints using the new v1 API. This provides a unified way to use OpenAI embeddings whether hosted on OpenAI or Azure. For the traditional Azure-specific implementation, continue to use AzureOpenAIEmbeddings.
Using Azure OpenAI v1 API with API Key
To use OpenAIEmbeddings with Azure OpenAI, set the base_url to your Azure endpoint with /openai/v1/ appended:
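A sketch of what this might look like, assuming a placeholder resource name and an AZURE_OPENAI_API_KEY environment variable:

```python
import os

from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(
    model="text-embedding-3-large",
    # Placeholder resource name; replace with your Azure OpenAI endpoint
    base_url="https://your-resource-name.openai.azure.com/openai/v1/",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
)
```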
Using Azure OpenAI with Microsoft Entra ID
The v1 API adds native support for Microsoft Entra ID authentication with automatic token refresh. Pass a token provider callable to the api_key parameter:
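A sketch following that pattern (the resource name is a placeholder; the token scope shown is the standard Azure Cognitive Services scope):

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from langchain_openai import OpenAIEmbeddings

# Token provider that fetches and refreshes Microsoft Entra ID tokens
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

embeddings = OpenAIEmbeddings(
    model="text-embedding-3-large",
    # Placeholder resource name; replace with your Azure OpenAI endpoint
    base_url="https://your-resource-name.openai.azure.com/openai/v1/",
    api_key=token_provider,  # callable token provider instead of a static key
)
```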
Installation requirements: To use Microsoft Entra ID authentication, install the Azure Identity library:
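For example, assuming a pip-based environment:

```bash
pip install -qU azure-identity
```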
You can also pass an async token provider callable to the api_key parameter when using asynchronous functions. You must import DefaultAzureCredential from azure.identity.aio:
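A sketch under those constraints (the resource name is a placeholder; the credential and scope mirror the synchronous example above):

```python
import asyncio

from azure.identity.aio import DefaultAzureCredential, get_bearer_token_provider
from langchain_openai import OpenAIEmbeddings


async def main() -> None:
    # Async credential and token provider from azure.identity.aio
    token_provider = get_bearer_token_provider(
        DefaultAzureCredential(),
        "https://cognitiveservices.azure.com/.default",
    )

    embeddings = OpenAIEmbeddings(
        model="text-embedding-3-large",
        # Placeholder resource name; replace with your Azure OpenAI endpoint
        base_url="https://your-resource-name.openai.azure.com/openai/v1/",
        api_key=token_provider,
    )

    # An async token provider requires the async methods (aembed_query / aembed_documents)
    vector = await embeddings.aembed_query("Hello, world!")
    print(len(vector))


asyncio.run(main())
```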
When using an async callable for the API key, you must use async methods (aembed_query, aembed_documents). Sync methods will raise an error.
API reference
For detailed documentation on OpenAIEmbeddings features and configuration options, please refer to the API reference.