How to Configure Agents
In this guide, we will explore how to configure agents using the agent_setups.json file. The entries in this file are parsed as AgentSetup objects and used to initialize agents and their dependencies. Understanding this configuration is crucial for building complex agents with multiple dependencies.
Prerequisites
Before you begin, make sure you have completed the Getting Started with the Agents SDK tutorial and have a basic understanding of how to create agents.
Schema of agent_setups.json
The agent_setups.json file contains configuration entries for each agent. Each entry is parsed into an AgentSetup object, which is then used to initialize the agent. Note that parts of this configuration are also defined in the env/agent_setups.json file; in particular, that file contains secrets and other sensitive information that should not be stored in version control. The two files are merged to create the final agent configuration.
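For illustration, the env/agent_setups.json file could hold just the sensitive fragment of an entry, such as a vendor API key, while the rest of the entry lives in agent_setups.json. This is a minimal sketch: it assumes the entry is keyed by the same agent identifier and mirrors the structure of the main file, and the key value is a placeholder.

{
  "my_rag_agent": {
    "llm_client_configuration": {
      "vendor_configuration": {
        "openai": { "openai_api_key": "your-openai-api-key" }
      }
    }
  }
}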
The agent_setups.json file is only relevant for local development and self-hosted deployments. In managed deployments, the agent configuration lives in your tenant configuration.

Here is the schema of each entry in the agent_setups.json file:
{
  "agent_identifier": "string",
  "agent_name": "string",
  "llm_client_configuration": {
    "vendor": "string",
    "vendor_configuration": {
      "vendor_value": {
        "vendor_specific_key": "value"
      }
    },
    "model_configuration": {
      "name": "string",
      "type": "string",
      "temperature": "number",
      "json_output": "boolean",
      "max_tokens": "number"
    }
  },
  "agent_configuration": {
    "key": "value"
  },
  "sub_agent_mapping": {
    "sub_agent_name": "sub_agent_identifier"
  }
}
Fields
- agent_identifier: A unique identifier for the agent. This is the value used in the API to select the agent.
- agent_name: The name of the agent. This should match the agent_name attribute in the ChatAgent class.
- llm_client_configuration: Configuration for the language model client.
  - vendor: The vendor of the language model (e.g., "openai").
  - vendor_configuration: Vendor-specific configuration options (e.g., {"openai": {"openai_api_key": "your_api"}}).
  - model_configuration: Configuration for the language model.
    - name: The name of the model (e.g., "gpt-4").
    - type: The type of the model (e.g., "chat").
    - temperature: The temperature setting for the model.
    - json_output: Whether the model should output JSON.
    - max_tokens: The maximum number of tokens to generate.
- agent_configuration: Optional custom configuration for the agent. This can include any key-value pairs that the agent needs.
- sub_agent_mapping: An optional mapping of sub-agent names to sub-agent identifiers. This is used to assign different configurations to nested agents, as shown in the sketch below.
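For example, a parent agent could route one of its nested agents to a dedicated configuration entry. The names below are hypothetical, other fields are omitted for brevity, and the sketch assumes the mapping key is the nested agent's agent_name while the value is the agent_identifier of the entry to use:

{
  "my_parent_agent": {
    "agent_identifier": "my_parent_agent",
    "agent_name": "my_parent_agent",
    "sub_agent_mapping": {
      "my_summarizer_agent": "my_fast_summarizer"
    }
  },
  "my_fast_summarizer": {
    "agent_identifier": "my_fast_summarizer",
    "agent_name": "my_summarizer_agent"
  }
}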
Example Configuration
Here is an example of an agent_setups.json file with two agents:
{
  "my_rag_agent": {
    "agent_identifier": "my_rag_agent",
    "agent_name": "my_rag_agent",
    "llm_client_configuration": {
      "vendor": "openai",
      "vendor_configuration": {},
      "model_configuration": {
        "name": "gpt-4",
        "type": "chat",
        "temperature": 0.0
      }
    },
    "agent_configuration": {
      "foo": "bar",
      "baz": 42
    }
  },
  "my_other_agent": {
    "agent_identifier": "my_other_agent",
    "agent_name": "my_other_agent",
    "llm_client_configuration": {
      "vendor": "anthropic",
      "vendor_configuration": {
        "anthropic": {}
      },
      "model_configuration": {
        "name": "claude-3-5-sonnet-20240620",
        "type": "chat",
        "temperature": 0.5
      }
    },
    "agent_configuration": {
      "foo": "baz",
      "qux": 123
    }
  }
}
Request parameters
In addition to the configuration from the AgentSetup object, the agent can also receive additional parameters at request time through the REST API. These include:

- tenant and index_id query parameters. These are important when using the Zeta Alpha retriever to fetch documents.
- Authorization headers. These are normally propagated to downstream calls to the Zeta Alpha services.
- bot_params in the body of the request. These are additional parameters that can be passed to the agent.
- conversation_context in the body of the request. This payload can be used to restrict the conversation to a subset of the data when calling the Zeta Alpha retriever.
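To make the shape concrete, a chat request could carry these parameters roughly as follows. The endpoint path and the placeholder values are illustrative only and the full request schema is outside the scope of this sketch; the parameters shown are the ones listed above.

POST /<chat-endpoint>?tenant=<tenant>&index_id=<index_id>
Authorization: Bearer <token>

{
  "bot_params": { "some_param": "some_value" },
  "conversation_context": { "...": "..." }
}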
Initializing Agents
The information from the AgentSetup object and the request parameters are used to initialize the agent. These are passed to the agent's __init__ method when the agent is created.

Any typed parameter in the agent's __init__ method will be automatically populated with the corresponding value using the following rules, in order of precedence:
- If the parameter type is a class registered in the AgentDependencyRegistry, then the value will be the corresponding initialized class instance. The arguments passed to initialize the class instance follow these rules recursively. More information on injectable dependencies can be found in the How to Create Injectable Dependencies guide.
- If the parameter type is ConversationContext, then the value will be the conversation_context payload from the request.
- If the parameter type is a subclass of ChatAgent, this means it is a nested agent. The value will be the initialized nested agent instance.
- If the parameter type is LLMClientConfiguration, then the value will be the llm_client_configuration from the AgentSetup object.
- If the parameter name matches a key in the agent_configuration section of the AgentSetup object, then the value will be the corresponding value from that section. If the type of the parameter is a pydantic BaseModel, the value will be deserialized into the corresponding model.
- If the parameter name matches a key in the bot_params from the request or any of the other request parameters (e.g., tenant, index_id, etc.), then the value will be extracted from there. If the type of the parameter is a pydantic BaseModel, the value will be deserialized into the corresponding model.
- If the parameter is optional or has a default value, then the value will be None or the default value, respectively.
- Otherwise, an error will be raised.
Example
Here is an example of an agent that uses the agent_configuration section of the AgentSetup object:
from typing import List, Optional

from zav.agents_sdk import ChatAgent, ChatAgentFactory, ChatMessage


@ChatAgentFactory.register()
class MyRAGAgent(ChatAgent):
    agent_name = "my_rag_agent"

    def __init__(self, foo: str, baz: int):
        self.foo = foo
        self.baz = baz

    async def execute(self, conversation: List[ChatMessage]) -> Optional[ChatMessage]:
        return ChatMessage(sender="bot", content=f"Foo: {self.foo}, Baz: {self.baz}")
In this example, the foo and baz parameters are passed to the agent's __init__ method from the agent_configuration section of the agent_setups.json file defined earlier. In this case, the foo parameter will be set to "bar" and the baz parameter will be set to 42.
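The same rules also cover parameters that do not come from agent_configuration. As a hedged sketch of that behavior, the agent below declares a ConversationContext parameter (filled from the conversation_context payload), a tenant parameter (filled from the tenant query parameter), and an optional top_k parameter with a default value. The agent name, the top_k key, and the import path of ConversationContext are assumptions, so adjust them to your SDK version.

from typing import List, Optional

from zav.agents_sdk import ChatAgent, ChatAgentFactory, ChatMessage
# Assumption: ConversationContext is exposed at the package root; check the
# actual module path in your version of the SDK.
from zav.agents_sdk import ConversationContext


@ChatAgentFactory.register()
class MyContextAwareAgent(ChatAgent):
    # Hypothetical agent, for illustration only.
    agent_name = "my_context_aware_agent"

    def __init__(
        self,
        conversation_context: ConversationContext,  # injected from the request's conversation_context payload
        tenant: str,  # injected from the tenant query parameter
        top_k: int = 5,  # hypothetical key; taken from agent_configuration or bot_params if present, otherwise defaults to 5
    ):
        self.conversation_context = conversation_context
        self.tenant = tenant
        self.top_k = top_k

    async def execute(self, conversation: List[ChatMessage]) -> Optional[ChatMessage]:
        return ChatMessage(
            sender="bot",
            content=f"Tenant: {self.tenant}, top_k: {self.top_k}",
        )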