LangChain’s package structure may change over time. Always refer to the official LangChain documentation for the latest import paths and best practices.
Prerequisites
- Python 3.7+
- `langchain` and `openai` packages installed
- Valid OpenAI API key
1. Configure Your OpenAI API Key
Before you start, make sure your OpenAI API key is set as an environment variable.

Never commit your API keys to a public repository. Use environment variables or secret management tools to keep credentials secure.
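On macOS or Linux, for example, you might export the key in your shell (the value shown is a placeholder, not a real key):

```shell
# Replace the placeholder with your actual key.
# Never hard-code the key in source files or commit it to version control.
export OPENAI_API_KEY="your-api-key-here"
```

On Windows, the equivalent is `set OPENAI_API_KEY=...` in cmd or `$env:OPENAI_API_KEY = "..."` in PowerShell.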
2. Import Modules and Initialize the Chat Model
First, import the modules you need from LangChain. Then create an instance of `ChatOpenAI` with a deterministic configuration.
3. Understand the Message Classes
LangChain uses distinct classes to represent different roles in a conversation. Below is a quick reference:

| Message Class | Role | Description |
|---|---|---|
| SystemMessage | System | Defines behavior/persona of the assistant |
| HumanMessage | User | Represents the user’s prompt or question |
| AIMessage | Assistant | The model’s reply (returned by the API call) |
4. Construct and Send Your Chat
Define a system message to set the assistant’s persona (for example, a physics teacher), followed by a human message asking a question. In the response object, you’ll find an AIMessage whose content attribute holds the model’s answer. This three-step pattern—system message, human message, then AI response—is fundamental to chat-based workflows.
5. Next Steps
- Experiment with different `temperature` settings to control response creativity.
- Chain multiple messages for multi-turn conversations.
- Explore advanced prompt-crafting techniques like few-shot examples and function calling.