In this guide, you'll learn how to use configurable fields in LangChain to override default runtime parameters. This is ideal for switching between OpenAI models (e.g., GPT-3.5 and GPT-4) based on your prompt's complexity or cost requirements.
Defining a Configurable Model
Begin by importing the required classes and creating a ChatOpenAI instance where model_name is exposed as a configurable field. By default, this example uses GPT-3.5 Turbo.
If you don't override model_name, the chain will always use gpt-3.5-turbo.

Building and Invoking the Chain
Next, define a simple haiku prompt template and compose it with your configurable model. Invoking the chain without further configuration uses the default settings.

Overriding the Model at Runtime
To leverage a more advanced model like GPT-4 for complex prompts, call .with_config() with a dictionary matching your configurable field's id:
Switching to higher-capability models (like gpt-4) may increase your API costs. Monitor usage via your OpenAI dashboard.

Benefits of Configurable Fields
Configurable fields provide a flexible, cost-effective way to manage runtime parameters:

| Benefit | Description | Example |
|---|---|---|
| Cost Optimization | Default to a less expensive model, upgrade only when needed. | Start with GPT-3.5, switch to GPT-4 selectively |
| Developer Flexibility | Let users or downstream code adjust parameters without redeploy. | Expose temperature, max_tokens, or model_name |
| Seamless Integration | Combine with memory, callbacks, and dynamic prompts for workflows. | Multi-turn chatbots with user preferences |