LangSmith Deep Dive

LangSmith Setup & Tracing

3 min read

LangSmith is LangChain's platform for debugging, testing, and monitoring LLM applications. It provides tracing, evaluation, and observability for any LLM pipeline.

Why LangSmith?

Capability    Benefit
Tracing       See every step of your LLM pipeline
Debugging     Find exactly where things go wrong
Evaluation    Measure quality at scale
Monitoring    Track production performance

Installation

pip install -U langsmith openai

Environment Setup

Set these environment variables to enable tracing:

export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=your-api-key
export LANGCHAIN_PROJECT=my-project

Get your API key from smith.langchain.com.
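If you prefer to keep configuration in code, the same variables can be set from Python before any traced calls run. A minimal sketch (substitute your real key and project name):

import os

# Set tracing configuration before importing or calling traced code
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "your-api-key"  # from smith.langchain.com
os.environ["LANGCHAIN_PROJECT"] = "my-project"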

Basic Tracing with @traceable

The @traceable decorator automatically logs each call's inputs, outputs, and timing to LangSmith:

from langsmith import traceable
from openai import OpenAI

client = OpenAI()

@traceable
def answer_question(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": question}
        ]
    )
    return response.choices[0].message.content

# Every call is now traced
result = answer_question("What is the capital of France?")

Wrapping OpenAI Client

For automatic tracing of all OpenAI calls:

from langsmith import wrappers
from openai import OpenAI

# Wrap the client for automatic tracing
client = wrappers.wrap_openai(OpenAI())

# All calls are now traced automatically
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}]
)
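The wrapper also composes with @traceable: call the wrapped client inside a decorated function and the OpenAI call is recorded as a child run under that function's trace. A small sketch (the summarize function is illustrative):

from langsmith import traceable, wrappers
from openai import OpenAI

client = wrappers.wrap_openai(OpenAI())

@traceable
def summarize(text: str) -> str:
    # The wrapped call appears as a child run beneath summarize
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Summarize in one sentence: {text}"}]
    )
    return response.choices[0].message.content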

Trace Structure

Each trace contains:

Run (parent)
├── Input: The function arguments
├── Output: The return value
├── Metadata: Timing, tokens, model info
└── Child Runs: Nested function calls
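Child runs come from nesting: when one @traceable function calls another, the inner call is recorded under the parent in the trace tree. A minimal sketch with hypothetical retrieve and pipeline functions:

from langsmith import traceable

@traceable
def retrieve(query: str) -> list[str]:
    # Stand-in retrieval step; shows up as a child run
    return [f"doc about {query}"]

@traceable
def pipeline(query: str) -> str:
    docs = retrieve(query)  # nested call becomes a child run
    return f"Answered using {len(docs)} document(s)"

pipeline("capital of France")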

Adding Metadata

Attach custom metadata to traces:

@traceable(
    run_type="llm",
    metadata={"version": "1.0", "team": "support"}
)
def generate_response(prompt: str) -> str:
    # Uses the OpenAI client defined above
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}]
    )
    return response.choices[0].message.content
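Metadata can also be attached per call rather than per function: traced functions accept a reserved langsmith_extra keyword argument (the user_tier value here is illustrative):

# Per-call metadata and tags via the reserved langsmith_extra argument
generate_response(
    "How do I reset my password?",
    langsmith_extra={"metadata": {"user_tier": "pro"}, "tags": ["support"]}
)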

Viewing Traces

In the LangSmith UI you can:

  • See full execution traces
  • Inspect inputs and outputs at each step
  • View latency breakdowns
  • Filter by project, time, or metadata
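The same data is available programmatically. Assuming your API key is set, a quick sketch with the SDK's Client:

from langsmith import Client

client = Client()

# Fetch the five most recent runs in the project
for run in client.list_runs(project_name="my-project", limit=5):
    print(run.name, run.run_type, run.start_time)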

Tip: Enable tracing in development first. Once comfortable, enable it in production for full observability.

Next, we'll explore LangSmith's Insights Agent for discovering patterns and failure modes in your traces.

Take Quiz