The default model
By default, ControlFlow uses OpenAI’s GPT-4o model. GPT-4o is an extremely powerful and popular model that provides excellent out-of-the-box performance on most tasks. This does mean that to run an agent with no additional configuration, you will need to provide an OpenAI API key.

Selecting a different LLM
Every ControlFlow agent can be assigned a specific LLM. When instantiating an agent, you can pass a model parameter to specify the LLM to use.
ControlFlow agents can use any LangChain LLM class that supports chat-based APIs and tool calling. For a complete list of available models, settings, and instructions, please see LangChain’s LLM provider documentation.
ControlFlow includes the required packages for OpenAI, Azure OpenAI, and Anthropic models by default. To use other models, you’ll need to first install the corresponding LangChain package and supply any required credentials. See the model’s documentation for more information.
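For example, here is a minimal sketch that assigns a LangChain chat model object to an agent. The model name and agent setup are illustrative, and assume the langchain_openai package is installed and an OpenAI API key is available:

```python
import controlflow as cf
from langchain_openai import ChatOpenAI

# Any LangChain chat model that supports tool calling can be assigned to an agent
agent = cf.Agent(model=ChatOpenAI(model="gpt-4o-mini"))
```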
Automatic configuration
ControlFlow can automatically load LLMs from certain providers based on a single string parameter. The model parameter must have the form {provider key}/{model name}.
For example:
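A brief sketch of the string form (the model names are illustrative; the provider keys come from the table below):

```python
import controlflow as cf

# The provider key and model name are separated by a slash
openai_agent = cf.Agent(model="openai/gpt-4o")
anthropic_agent = cf.Agent(model="anthropic/claude-3-opus-20240229")
```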
| Provider | Provider key | Required dependencies |
| --- | --- | --- |
| OpenAI | openai | (included) |
| Azure OpenAI | azure-openai | (included) |
| Anthropic | anthropic | (included) |
| Google | google | langchain_google_genai |
| Groq | groq | langchain_groq |
Manual configuration
To configure a different LLM, follow these steps:

1. Install required packages
To use an LLM, first make sure you have installed the appropriate provider package. For example, to use a Google model, run:
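For example, using pip (the package name comes from the table above):

```bash
pip install langchain_google_genai
```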
2. Configure API keys

You must provide the correct API keys and configuration for the LLM you want to use. These can be provided as environment variables or when you create the model in your script. For example, to use an OpenAI model, you must set the OPENAI_API_KEY environment variable. For model-specific instructions, please refer to the provider’s documentation.
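For example, setting the key in your shell before running your script (replace the placeholder with your own key):

```bash
export OPENAI_API_KEY="<your-openai-api-key>"
```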
3. Create the model
Create the LLM model in your script, including any additional parameters. For example, to use Claude 3 Opus:
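A minimal sketch, assuming the langchain_anthropic package is installed and your Anthropic API key is configured (the model ID and temperature are illustrative):

```python
from langchain_anthropic import ChatAnthropic

# Create the LangChain chat model, including any additional parameters
model = ChatAnthropic(
    model="claude-3-opus-20240229",  # illustrative Claude 3 Opus model ID
    temperature=0.2,
)
```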
4. Pass the model to an agent
Finally, configure an agent with the model:
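Continuing from the model created above (the agent name is arbitrary):

```python
import controlflow as cf

# Pass the LangChain model object to the agent via the model parameter
agent = cf.Agent(name="Writer", model=model)
```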
Changing the default model
ControlFlow has a few ways to customize the default LLM.

ControlFlow includes the required packages for OpenAI, Azure OpenAI, and Anthropic models by default. To use other models, you’ll need to first install the corresponding LangChain package and supply any required credentials. See the model’s documentation for more information.
From a model object
To use any model as the default LLM, create the model object in your script and assign it to controlflow.defaults.model. It will be used by any agent that does not have a model specified.
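For instance, a sketch that makes an Anthropic model the global default (assuming the langchain_anthropic package is installed and credentials are configured; the model ID is illustrative):

```python
import controlflow
from langchain_anthropic import ChatAnthropic

# Any agent created without an explicit model will now use this one
controlflow.defaults.model = ChatAnthropic(model="claude-3-opus-20240229")
```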
From a string setting
You can also specify a default model using a string, which is convenient though it doesn’t allow you to configure advanced model settings. The string must have the form {provider key}/{model name}, following the same guidelines as automatic LLM configuration.
You can apply this setting either by using an environment variable before you import ControlFlow or in your script at runtime. For example, to use GPT 3.5 Turbo as the default model:
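For instance, a sketch using an environment variable set before your script imports ControlFlow; the exact variable name (CONTROLFLOW_LLM_MODEL) is assumed from ControlFlow’s settings naming, so confirm it against the settings reference:

```bash
# Assumed environment variable name; set this before importing ControlFlow
export CONTROLFLOW_LLM_MODEL=openai/gpt-3.5-turbo
```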
The default model can only be set by environment variable before importing ControlFlow. Once ControlFlow is imported, it reads the controlflow.settings.llm_model value to create the default model object.