ControlFlow supports a variety of LLMs and model providers.
Each agent accepts a `model` parameter that specifies the LLM it should use.
ControlFlow agents can use any LangChain LLM class that supports chat-based APIs and tool calling. For a complete list of available models, settings, and instructions, please see LangChain’s LLM provider documentation.
ControlFlow can automatically create a model from a string in the format `{provider key}/{model name}`. For example:
Provider | Provider key | Required dependencies |
---|---|---|
OpenAI | openai | (included) |
Azure OpenAI | azure-openai | (included) |
Anthropic | anthropic | (included) |
Google | google | langchain_google_genai |
Groq | groq | langchain_groq |
**Install required packages** In addition to ControlFlow, install any dependencies listed in the table above for your provider.
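For example, to prepare for a Google model, install the dependency listed in the table (a sketch; substitute the package for your provider):

```shell
pip install langchain_google_genai
```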
**Configure API keys** Provide the correct API keys for the LLM you want to use, typically as environment variables. For OpenAI, set the `OPENAI_API_KEY` environment variable.

**Create the model** Instantiate a LangChain model class for your provider:
**Pass the model to an agent** Provide the model object when creating your agent:
To change the default model, assign a new model object to `controlflow.defaults.model`. It will be used by any agent that does not have a model specified.
You can also set the default model as a string in the format `{provider key}/{model name}`, following the same guidelines as automatic LLM configuration.
You can apply this setting either via an environment variable set before ControlFlow is imported, or at runtime in your script. For example, to use GPT-3.5 Turbo as the default model:
ControlFlow will use the `controlflow.settings.llm_model` value to create the default model object.