In this guide, we’ll walk through creating a custom LangChain tool that retrieves flight status details. You’ll learn how to define the tool, inspect its metadata, and wire it into a prompt–LLM chain for concise, one-word or short answers.
First, define your tool using the `@tool` decorator from LangChain:
```python
from langchain.tools import tool

@tool
def GetFlightStatus(flight_no: str) -> str:
    """Gets flight status and schedule"""
    # In production, replace this stub with a real API call.
    return (
        f"Flight {flight_no} departed at 5:20 PM. "
        "It is on-time and expected to arrive at 8:10 PM at Gate B12."
    )
```
The `@tool` decorator registers the function's name, description, and argument schema automatically.
After defining `GetFlightStatus`, you can verify its registered metadata:
```python
print(GetFlightStatus.name)         # -> GetFlightStatus
print(GetFlightStatus.description)  # -> Gets flight status and schedule
print(GetFlightStatus.args)         # -> {'flight_no': {'title': 'Flight No', 'type': 'string'}}
```
You can also view the complete `StructuredTool` representation:
```python
from langchain.tools import StructuredTool

print(StructuredTool(
    name="GetFlightStatus",
    description="Gets flight status and schedule",
    args_schema=GetFlightStatus.args_schema,
    func=GetFlightStatus.func,
))
```
For a quick overview, here's how the metadata maps out:

| Attribute | Value |
| --- | --- |
| `name` | `GetFlightStatus` |
| `description` | `Gets flight status and schedule` |
| `args` | `{'flight_no': {'title': 'Flight No', 'type': 'string'}}` |
Each invocation returns a concise answer shaped by your prompt design.

You now have a reusable flight status tool. Integrate real-world APIs inside `GetFlightStatus` to fetch live data, and combine multiple tools to build sophisticated LangChain agents.