LangSmith Observability provides full visibility into your LLM application: from individual traces to production-wide performance metrics.
LangSmith works with many frameworks and providers. Browse the available integrations, including OpenAI, Anthropic, CrewAI, Vercel AI SDK, Pydantic AI, and more, to connect your stack.

Get started

Set up tracing

Add tracing to your app in minutes with environment variables, framework integrations, or the SDK.
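As a minimal sketch in Python, tracing can be enabled with the langsmith SDK's `traceable` decorator; the environment variable names shown here are assumptions, so check the setup guide for the exact names your SDK version expects:

```python
# Minimal tracing sketch with the langsmith SDK.
# Assumes LANGSMITH_TRACING, LANGSMITH_API_KEY, and LANGSMITH_PROJECT are the
# variable names used by your SDK version; see the setup guide to confirm.
import os

from langsmith import traceable

os.environ["LANGSMITH_TRACING"] = "true"              # turn tracing on
os.environ["LANGSMITH_API_KEY"] = "<your-api-key>"    # from LangSmith settings
os.environ["LANGSMITH_PROJECT"] = "my-first-project"  # traces land in this project

@traceable  # each call to this function becomes a run in the project above
def answer(question: str) -> str:
    # Replace with a real LLM call; whatever runs inside is captured in the trace.
    return f"You asked: {question}"

answer("What does LangSmith trace?")
```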

Trace a RAG application

Follow a step-by-step tutorial to instrument a retrieval-augmented generation app from start to finish.
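A rough sketch of what the tutorial covers, assuming the same SDK setup as above; `retrieve` and `generate` are hypothetical stand-ins for your vector store lookup and model call:

```python
# Sketch of instrumenting a RAG pipeline so each step appears as a nested run.
from langsmith import traceable

@traceable(run_type="retriever")  # rendered as a retrieval step in the trace
def retrieve(query: str) -> list[str]:
    # Hypothetical: query your vector store here.
    return ["Doc about tracing", "Doc about evaluation"]

@traceable  # hypothetical generation step; call your model here
def generate(query: str, docs: list[str]) -> str:
    return f"Answer to {query!r} based on {len(docs)} documents."

@traceable  # parent run; retrieve and generate show up as child runs
def rag_pipeline(query: str) -> str:
    docs = retrieve(query)
    return generate(query, docs)

rag_pipeline("How do I enable tracing?")
```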

Investigate and monitor

View traces

Filter, export, share, and compare traces via the UI or API.
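For the API side, a sketch using the LangSmith client, assuming the "my-first-project" project from the tracing example and an API key in the environment:

```python
# Sketch: pull recent errored root runs from a project programmatically.
from langsmith import Client

client = Client()

runs = client.list_runs(
    project_name="my-first-project",  # assumed project name
    is_root=True,                     # only top-level traces
    error=True,                       # only runs that failed
)

for run in runs:
    print(run.id, run.name, run.status)
```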

Monitor performance

Build dashboards and set alerts to track quality and catch issues early.

Configure automations

Automate workflows with rules, webhooks, and online evaluations.
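Rules and online evaluations are configured in the UI; for webhooks, a hypothetical receiver might look like the sketch below. The path and payload handling are assumptions, so adapt them to the payload shape documented for automation rules:

```python
# Hypothetical webhook receiver for a LangSmith automation rule, using Flask.
from flask import Flask, request

app = Flask(__name__)

@app.route("/langsmith-webhook", methods=["POST"])
def handle_webhook():
    payload = request.get_json(silent=True) or {}
    # Forward to Slack, open a ticket, trigger a re-run, etc.
    print("Automation fired with payload keys:", list(payload.keys()))
    return {"ok": True}

if __name__ == "__main__":
    app.run(port=8000)
```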

Collect feedback

Annotate outputs and gather user feedback using queues or inline annotation.
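Feedback can also be attached programmatically with the SDK, as in this sketch; the run ID is assumed to come from your own tracing or logging, and the "user_score" key name is an arbitrary choice:

```python
# Sketch: attach user feedback to a traced run.
from langsmith import Client

client = Client()

client.create_feedback(
    run_id="<run-uuid>",          # the traced run to annotate
    key="user_score",             # name of the feedback dimension
    score=1,                      # e.g. 1 = thumbs up, 0 = thumbs down
    comment="Accurate and concise answer.",
)
```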

Analyze traces with Polly

Polly, LangSmith’s built-in AI assistant, analyzes your traces and surfaces insights about performance, errors, and quality without manual investigation.
For terminology and core concepts, refer to Observability concepts.
To set up a LangSmith instance, visit the Platform setup section and choose a cloud, hybrid, or self-hosted deployment. All options include observability, evaluation, prompt engineering, and deployment.