LangChain

Tips, Tricks, and Resources

Callbacks

In this guide, you’ll learn how to use callbacks in LangChain to hook into chain events—such as chain start, prompt formatting, and chain completion—to enable custom logging, monitoring, and integrations.

Table of Contents

  1. What Are Callbacks?
  2. Basic LLMChain without Callbacks
  3. Adding a Callback Handler
  4. Recap of Logging Techniques
  5. Links and References

What Are Callbacks?

A callback is a function that runs automatically when specific events occur in a LangChain component. With callbacks you can:

  • Log events to stdout, files, or cloud services
  • Track prompts and responses for auditing
  • Integrate with monitoring platforms (e.g., LangSmith)
  • Execute custom logic on chain events

Note

Callbacks provide fine-grained control over how your application logs, monitors, and reacts to chain activities.

Basic LLMChain without Callbacks

Start with a minimal example that formats a system and user prompt, sends it to the LLM, and returns the result:

from langchain.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langchain.chains import LLMChain

llm = ChatOpenAI()

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a {subject} teacher"),
    ("human", "Tell me about {concept}")
])

chain = LLMChain(llm=llm, prompt=prompt)

response = chain.invoke({"subject": "physics", "concept": "galaxy"})
print(response)

Example response:

{'subject': 'physics', 'concept': 'galaxy', 'text': 'A galaxy is a vast system of stars, gas, dust, and dark matter bound together by gravity. It is the basic building block of the universe...'}
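
Since the result dictionary echoes the input variables alongside the generated answer, the model output itself lives under the text key:

# Print only the generated answer, not the whole result dictionary
print(response["text"])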

Adding a Callback Handler

Below, we register LangChain’s standard stdout handler so that chain events are logged to the console.

from langchain.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langchain.chains import LLMChain
from langchain.callbacks import StdOutCallbackHandler

# 1. Create the callback handler
handler = StdOutCallbackHandler()

# 2. Initialize the LLM and prompt
llm = ChatOpenAI()
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a {subject} teacher"),
    ("human", "Tell me about {concept}")
])

# 3. Register the handler with the chain
chain = LLMChain(
    llm=llm,
    prompt=prompt,
    callbacks=[handler]
)

# 4. Invoke the chain; events will be printed to stdout
response = chain.invoke({"subject": "physics", "concept": "galaxy"})
print(response)

Example console output:

> Entering new LLMChain chain...
Prompt after formatting:
System: You are a physics teacher
Human: Tell me about galaxy

> Finished chain.
{'subject': 'physics', 'concept': 'galaxy', 'text': "A galaxy is a massive, gravitationally bound system..."}
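
Handlers passed to the constructor fire on every call of the chain. If you only need logging for specific invocations, recent LangChain releases (where chains expose the Runnable-style invoke signature) also accept callbacks per call through the config argument; a minimal sketch:

# Construct the chain without constructor-level callbacks...
chain = LLMChain(llm=llm, prompt=prompt)

# ...and attach the handler only for this particular invocation
response = chain.invoke(
    {"subject": "physics", "concept": "galaxy"},
    config={"callbacks": [handler]},
)
print(response)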

With callbacks, you can replace or extend StdOutCallbackHandler to:

  • Format events as HTML or JSON
  • Write logs to files or databases (see the sketch after the warning below)
  • Integrate with external monitoring or alerting services

Warning

Custom callback handlers must implement the BaseCallbackHandler interface to ensure compatibility.
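
As a sketch of such a custom handler, the class below subclasses BaseCallbackHandler and appends chain start and end events to a local file. The class name FileLoggingHandler and the file name chain_events.log are illustrative choices, not part of LangChain:

from langchain.callbacks.base import BaseCallbackHandler

class FileLoggingHandler(BaseCallbackHandler):
    """Append chain start and end events to a local log file."""

    def __init__(self, path="chain_events.log"):
        self.path = path

    def on_chain_start(self, serialized, inputs, **kwargs):
        # Fired when the chain begins; `inputs` holds the prompt variables
        with open(self.path, "a") as f:
            f.write(f"Chain started with inputs: {inputs}\n")

    def on_chain_end(self, outputs, **kwargs):
        # Fired when the chain finishes; `outputs` holds the result dict
        with open(self.path, "a") as f:
            f.write(f"Chain finished with outputs: {outputs}\n")

chain = LLMChain(llm=llm, prompt=prompt, callbacks=[FileLoggingHandler()])

Because BaseCallbackHandler provides no-op defaults for every event, you only override the hooks you care about; all other events (LLM start, new tokens, errors, and so on) are simply ignored.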

Recap of Logging Techniques

When building production-grade LangChain systems, consider these three approaches:

| Technique                 | Scope                        | Configuration Example                              |
| ------------------------- | ---------------------------- | -------------------------------------------------- |
| Global debug flag         | Entire LangChain application | export LANGCHAIN_DEBUG=true                         |
| Component-level verbosity | Individual chains            | LLMChain(..., verbose=True)                         |
| Callback handlers         | Fine-grained events          | LLMChain(..., callbacks=[StdOutCallbackHandler()])  |
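
These techniques can be combined, and the global flag can also be toggled from Python instead of the shell via langchain.globals.set_debug (available in recent LangChain releases); a minimal sketch:

from langchain.globals import set_debug

# Turn on verbose debug output for every LangChain component
set_debug(True)

# ... invoke chains as usual; inputs, prompts, and outputs are printed ...

# Switch debug output off again once you are done troubleshooting
set_debug(False)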
