LangChain
Interacting with LLMs
Parsing Model Output Demo 1
In this guide, you’ll learn how to convert large language model (LLM) responses into native Python structures using LangChain’s output parsers. We’ll demonstrate:
- Turning a numbered list into a comma-separated string
- Parsing that string into a Python list
This approach makes LLM outputs easy to manipulate in your codebase.
Table of Contents
- Prerequisites
- Initializing LangChain and OpenAI
- Generating an Unstructured List
- Formatting as CSV with CommaSeparatedListOutputParser
- Parsing the CSV into a Python List
- Next Steps
- References
Prerequisites
- Python 3.8+
- An OpenAI API key
- The langchain and langchain-openai packages installed:

pip install langchain langchain-openai
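The OpenAI client reads the API key from the `OPENAI_API_KEY` environment variable when one isn't passed explicitly, so you can set it once before creating the client. A minimal sketch (the placeholder value is ours, substitute your real key):

```python
import os

# langchain-openai looks up OPENAI_API_KEY when no key is passed
# to the OpenAI(...) constructor, so setting it once is enough.
os.environ.setdefault("OPENAI_API_KEY", "sk-...")  # replace with your real key
```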
Initializing LangChain and OpenAI
Import the core components:
from langchain_openai import OpenAI
from langchain.prompts import PromptTemplate
from langchain.output_parsers import CommaSeparatedListOutputParser
Create the OpenAI LLM client:
llm = OpenAI()
Generating an Unstructured List
Define a simple prompt template without format constraints:
prompt = PromptTemplate(
    template="List 3 {things}.",
    input_variables=["things"]
)
Invoke the model to list three World Cup cricket teams:
response = llm.invoke(input=prompt.format(things="countries that play cricket in the World Cup"))
print(response)
Output might look like:
1. India
2. Australia
3. England
Note
This human-readable list is great for display but difficult to process in scripts. Next, we’ll add format instructions for a comma-separated response.
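To see why, here is a sketch of what parsing the numbered output by hand might look like (the regex is our own illustration, not part of LangChain):

```python
import re

# Simulated model reply in the numbered format shown above.
raw = """1. India
2. Australia
3. England"""

# Strip the "1. ", "2. ", ... prefixes line by line -- brittle,
# since any change in the model's numbering style breaks it.
teams = [re.sub(r"^\s*\d+\.\s*", "", line) for line in raw.splitlines() if line.strip()]
print(teams)  # ['India', 'Australia', 'England']
```

Format instructions let the model do this work for us instead.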
Formatting as CSV with CommaSeparatedListOutputParser
Instantiate the CSV parser and retrieve its instructions:
output_parser = CommaSeparatedListOutputParser()
format_instructions = output_parser.get_format_instructions()
print(format_instructions)
Expected instruction:
Your response should be a list of comma separated values, eg: `foo, bar, baz`
Embed these instructions into a new prompt template:
prompt_with_format = PromptTemplate(
    template="List 3 {things}.\n{format_instructions}",
    input_variables=["things"],
    partial_variables={"format_instructions": format_instructions}
)
Invoke the LLM with the CSV constraint:
final_prompt = prompt_with_format.format(things="countries that play cricket in the World Cup")
output = llm.invoke(input=final_prompt).strip()
print(output)
Now the model returns:
India, Australia, England
Parsing the CSV into a Python List
Convert the raw string into a Python list:
# Before parsing
print(type(output)) # <class 'str'>
# Parse into list
things = output_parser.parse(output)
print(things) # ['India', 'Australia', 'England']
print(type(things)) # <class 'list'>
Now you can work with the `things` list directly in your application.
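For example, ordinary list operations now apply (the list is hardcoded here so the snippet runs on its own):

```python
# Parsed result from the previous step, hardcoded for a
# self-contained example.
things = ["India", "Australia", "England"]

# Standard list operations work directly on the parsed output.
print(len(things))        # 3
print("India" in things)  # True
print(sorted(things))     # ['Australia', 'England', 'India']
```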
Next Steps
Explore more advanced parsers for structured outputs:
| Parser Type | Use Case | Example |
|---|---|---|
| CommaSeparatedListOutputParser | Simple comma-separated lists | foo, bar, baz |
| ListOutputParser | Numbered or bulleted lists | 1. foo\n2. bar |
| JsonOutputParser | Complex nested data structures | { "name": "Alice", "age": 30 } |
You can also try the `JsonOutputParser` to extract richer data types from LLM responses.
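In the simplest case, parsing a JSON reply comes down to `json.loads` on the model's text. A minimal stand-in using only the standard library (the sample reply is ours):

```python
import json

# Simulated model reply that follows JSON format instructions.
raw_output = '{ "name": "Alice", "age": 30 }'

# Convert the JSON string into a native Python dict.
data = json.loads(raw_output.strip())
print(data["name"], data["age"])  # Alice 30
print(type(data))                 # <class 'dict'>
```

Note that the real parser is more forgiving (it can handle replies wrapped in markdown code fences, for instance), but the end result is the same native structure.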
References
- LangChain Output Parsers
- PromptTemplate Documentation
- OpenAI LLM Client
- CommaSeparatedListOutputParser Source