LangChain

Introduction to LCEL

LCEL Demo 5

In this tutorial, you’ll learn how to compose multiple mini-chains into a single, end-to-end content_chain using LangChain. By the end, you will have created a workflow that:

  1. Generates an impactful blog post title
  2. Produces a detailed outline
  3. Writes a 200-word blog post
  4. Crafts a concise summary for social media

Each step is modular, letting you swap models or parsers as needed for maximum flexibility.

Table of Contents

  1. Prerequisites
  2. Imports
  3. Defining Individual Chains
  4. Composing the Content Chain
  5. Invoking the Chain
  6. Customizing Your LLMs
  7. References

1. Prerequisites

  • Python 3.8+
  • An active OpenAI API key
  • The langchain_core and langchain_openai packages installed
  • Familiarity with basic LLM concepts
pip install langchain_core langchain_openai

2. Imports

Begin by importing the necessary classes and functions:

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langchain_core.runnables import RunnablePassthrough

3. Defining Individual Chains

Each mini-chain handles one stage of the workflow. We parse the LLM output with StrOutputParser, then pipe the resulting string into a dict. That dict step maps the plain string onto the key the next stage's prompt expects (for example, "title" fills the {title} variable in the outline prompt); RunnablePassthrough simply forwards the value unchanged.
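If the trailing dict step feels opaque, here is a rough mental model in plain Python (an illustration of the data flow only, not LangChain's actual implementation): piping a value into a dict of runnables applies each entry to that value and collects the results under the dict's keys.

```python
# A minimal sketch of LCEL's dict-step semantics in plain Python.
# This illustrates the data flow; it is not LangChain's implementation.

def passthrough(x):
    """Stand-in for RunnablePassthrough: returns its input unchanged."""
    return x

def apply_dict_step(step, value):
    """Piping a value into a dict runs each entry on that same value."""
    return {key: fn(value) for key, fn in step.items()}

# The title chain's final step tags the raw string under "title",
# which is exactly the {title} variable the outline prompt expects.
tagged = apply_dict_step({"title": passthrough}, "AI and the Future of Work")
print(tagged)  # {'title': 'AI and the Future of Work'}
```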

Title Chain

Generates an engaging title from a raw topic.

title = (
    ChatPromptTemplate.from_template("Generate an impactful title for {input}")
    | ChatOpenAI()
    | StrOutputParser()
    | {"title": RunnablePassthrough()}
)

Outline Chain

Creates a detailed outline based on the generated title.

outline = (
    ChatPromptTemplate.from_template("Generate a detailed outline for {title}")
    | ChatOpenAI()
    | StrOutputParser()
    | {"outline": RunnablePassthrough()}
)

Blog Chain

Builds a 200-word blog post using the outline.

blog = (
    ChatPromptTemplate.from_template(
        "Generate a 200-word blog post based on the outline: {outline}"
    )
    | ChatOpenAI()
    | StrOutputParser()
    | {"blog": RunnablePassthrough()}
)

Summary Chain

Produces a concise, social-media-ready summary from the blog content.

summary = (
    ChatPromptTemplate.from_template("Generate a concise summary for the post: {blog}")
    | ChatOpenAI()
    | StrOutputParser()
)

Note

Each chain uses the same parser (StrOutputParser) and can be extended with custom parsers or validators as needed.
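As a sketch of what such a validator might look like, here is a plain-Python function (hypothetical, not part of LangChain) that rejects empty LLM output; in a real chain you could wrap a function like this with RunnableLambda from langchain_core.runnables and pipe it after StrOutputParser().

```python
# Hypothetical post-parse validator, sketched in plain Python.
# In a real chain: ... | StrOutputParser() | RunnableLambda(validate_nonempty)

def validate_nonempty(text: str) -> str:
    """Fail fast on blank LLM output instead of passing it downstream."""
    cleaned = text.strip()
    if not cleaned:
        raise ValueError("LLM returned empty output")
    return cleaned

print(validate_nonempty("  An Impactful Title  "))  # "An Impactful Title"
```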

4. Composing the Content Chain

Link the mini-chains in sequence so the output of one stage feeds into the next:

content_chain = title | outline | blog | summary
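To see how the stages hand data to each other, here is a plain-Python sketch of the same pipeline shape (no LangChain, no API calls); each LLM stage is replaced by a placeholder function with the same input and output shape as its mini-chain.

```python
from functools import reduce

# Placeholder stages mirroring each mini-chain's input/output shape.
def title_stage(inputs):
    return {"title": f"Title for: {inputs['input']}"}

def outline_stage(inputs):
    return {"outline": f"Outline for: {inputs['title']}"}

def blog_stage(inputs):
    return {"blog": f"Blog based on: {inputs['outline']}"}

def summary_stage(inputs):
    # The last stage returns a plain string, like the summary chain.
    return f"Summary of: {inputs['blog']}"

def compose(*stages):
    """Mimic LCEL's | operator: feed each stage's output to the next."""
    return lambda value: reduce(lambda acc, stage: stage(acc), stages, value)

content_chain = compose(title_stage, outline_stage, blog_stage, summary_stage)
result = content_chain({"input": "The impact of AI on jobs"})
print(result)
```

Each dict key produced by one stage matches the template variable the next stage's prompt consumes; that key agreement is what lets the real `title | outline | blog | summary` pipeline work.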

5. Invoking the Chain

Run the full workflow by providing a topic. The chain executes each stage in order; the final result is the summary string produced by the last stage:

result = content_chain.invoke({"input": "The impact of AI on jobs"})
print(result)

Depending on your LLM configuration, this multi-step call may take a bit longer than a single API request—but it yields structured, stage-wise outputs.

Warning

Large chains incur multiple API calls; this one makes four model calls per invocation, one per stage. Monitor your usage to avoid unexpected costs.

6. Customizing Your LLMs

You can assign different models to each stage to optimize for speed, cost, or quality. For example:

Stage   | Task                           | Example Model
--------|--------------------------------|--------------
Title   | Creative title generation      | Gemini
Outline | Structured outline creation    | GPT-3.5
Blog    | Long-form content (200 words)  | GPT-4
Summary | Short social-media summary     | Claude

To swap a model, replace ChatOpenAI() with another chat model class. For example, Gemini models are available through the langchain_google_genai package:

from langchain_google_genai import ChatGoogleGenerativeAI

title = (
    ChatPromptTemplate.from_template("…")
    | ChatGoogleGenerativeAI(model="gemini-pro")
    | StrOutputParser()
    | {"title": RunnablePassthrough()}
)

Then reassemble:

content_chain = title | outline | blog | summary

7. References

For more detail on LCEL and runnable composition, see the official LangChain documentation.

Happy chaining!

