
# LangSmith Observability

> Instrument your LLM application, investigate traces, and monitor performance in production with LangSmith.

LangSmith Observability provides full visibility into your LLM application: from individual traces to production-wide performance metrics.

<Callout icon="plug" color="#4F46E5" iconType="regular">
  LangSmith works with many frameworks and providers. Browse [available integrations](/langsmith/integrations) to connect your stack, including OpenAI, Anthropic, CrewAI, the Vercel AI SDK, Pydantic AI, and more.
</Callout>

## Get started

<CardGroup cols={2}>
  <Card title="Set up tracing" icon="settings" href="/langsmith/observability-quickstart" arrow="true">
    Add tracing to your app in minutes with environment variables, framework integrations, or the SDK.
  </Card>

  <Card title="Trace a RAG application" icon="notebook" href="/langsmith/observability-llm-tutorial" arrow="true">
    Follow a step-by-step tutorial to instrument a retrieval-augmented generation app from start to finish.
  </Card>
</CardGroup>
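The fastest setup path mentioned above is environment variables. As a minimal sketch (variable names follow the LangSmith quickstart; confirm them there, as older SDK versions used `LANGCHAIN_`-prefixed equivalents):

```shell
# Enable tracing for any LangSmith-instrumented code in this shell session.
export LANGSMITH_TRACING=true

# API key from your LangSmith settings page (placeholder shown here).
export LANGSMITH_API_KEY="<your-api-key>"
```

With these set, supported frameworks and the LangSmith SDKs pick up the configuration automatically and begin logging traces, with no code changes required.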

## Investigate and monitor

<CardGroup cols={2}>
  <Card title="View traces" icon="route" href="/langsmith/filter-traces-in-application" arrow="true">
    Filter, export, share, and compare traces via the UI or API.
  </Card>

  <Card title="Monitor performance" icon="chart-area" href="/langsmith/dashboards" arrow="true">
    Build dashboards and set alerts to track quality and catch issues early.
  </Card>

  <Card title="Configure automations" icon="robot" href="/langsmith/rules" arrow="true">
    Automate workflows with rules, webhooks, and online evaluations.
  </Card>

  <Card title="Collect feedback" icon="users" href="/langsmith/attach-user-feedback" arrow="true">
    Annotate outputs and gather user feedback using queues or inline annotation.
  </Card>
</CardGroup>

<Card title="Analyze traces with Polly" icon="sparkles" href="/langsmith/polly" arrow="true">
  LangSmith's built-in AI assistant analyzes your traces and surfaces insights about performance, errors, and quality, without manual investigation.
</Card>

For terminology and core concepts, refer to [Observability concepts](/langsmith/observability-concepts).

<Note>
  To set up a LangSmith instance, visit the [Platform setup section](/langsmith/platform-setup) to choose a cloud, hybrid, or self-hosted deployment. All options include observability, evaluation, prompt engineering, and deployment.
</Note>

