ModelScope (Home | GitHub) is built upon the notion of “Model-as-a-Service” (MaaS). It seeks to bring together the most advanced machine learning models from the AI community and streamline the process of leveraging AI models in real-world applications. The core ModelScope library open-sourced in this repository provides the interfaces and implementations that allow developers to perform model inference, training, and evaluation. This guide will help you get started with the ModelScope Chat Endpoint.
Overview
Integration details
| Provider | Class | Package | Serializable | Downloads | Version |
|---|---|---|---|---|---|
| ModelScope | ModelScopeChatEndpoint | langchain-modelscope-integration | ❌ | | |
Setup
To access the ModelScope chat endpoint you’ll need to create a ModelScope account, get an SDK token, and install the `langchain-modelscope-integration` integration package.
Credentials
Head to ModelScope to sign up and generate an SDK token. Once you’ve done this, set the `MODELSCOPE_SDK_TOKEN` environment variable:
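For example, a minimal sketch that prompts for the token only when it is not already present in the environment (the environment variable name comes from this page; the prompt wording is just illustrative):

```python
import getpass
import os

# Set the ModelScope SDK token if it isn't already configured.
if not os.getenv("MODELSCOPE_SDK_TOKEN"):
    os.environ["MODELSCOPE_SDK_TOKEN"] = getpass.getpass(
        "Enter your ModelScope SDK token: "
    )
```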
Installation
The LangChain ModelScope integration lives in the `langchain-modelscope-integration` package:
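In a notebook, for example, the package can be installed with pip:

```python
%pip install -qU langchain-modelscope-integration
```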
Instantiation
Now we can instantiate our model object and generate chat completions:
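A minimal sketch, assuming the package exposes the class from the `langchain_modelscope` module and that the model name shown is available through the endpoint; substitute whichever model you intend to use:

```python
from langchain_modelscope import ModelScopeChatEndpoint

llm = ModelScopeChatEndpoint(
    # Illustrative model choice; any model served by the ModelScope
    # API-Inference endpoint should work here.
    model="Qwen/Qwen2.5-Coder-32B-Instruct",
    temperature=0,
)
```
Invocation
Chat models in LangChain share the standard `invoke` interface, so a basic call might look like this (the translation prompt is only an example):

```python
messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```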
API reference
For detailed documentation of all `ModelScopeChatEndpoint` features and configurations, head to the reference: modelscope.cn/docs/model-service/API-Inference/intro

