The AnthropicLLMUnitSpec class configures the interaction with Anthropic’s LLM models within the Eidolon framework, defaulting to the Claude Opus model. It allows customization of the model’s behavior, including the temperature setting, token limits, and extra client arguments.


model
Type: AnnotatedReference[LLMModel, claude_opus]
Default: claude_opus
Description: Specifies the Claude Opus model from Anthropic, known for its capabilities in language understanding and generation.

temperature
Type: float
Default: 0.3
Description: Sets the creativity of the model’s responses. A lower value results in more deterministic and predictable responses.

max_tokens
Type: Optional[int]
Default: None
Description: Limits the number of tokens in the model’s responses, which is useful for controlling response length or computational load.

client_args
Type: dict
Default: {}
Description: Allows additional arguments to be passed to the model client, providing flexibility to customize the model’s behavior based on specific requirements.
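Taken together, the fields above might be set in a YAML resource roughly like the sketch below. Only the field names and defaults (model, temperature, max_tokens, client_args) come from this page; the surrounding resource structure, the agent name, and the client_args values are assumptions for illustration.

```yaml
# Hypothetical Eidolon resource -- structure and names other than the
# AnthropicLLMUnitSpec fields are illustrative assumptions.
apiVersion: eidolon/v1
kind: Agent
metadata:
  name: example-agent
spec:
  llm_unit:
    implementation: AnthropicLLMUnit
    model: claude_opus        # default Claude Opus reference
    temperature: 0.3          # lower = more deterministic output
    max_tokens: 1024          # cap response length (None = no explicit cap)
    client_args:              # passed through to the underlying client
      timeout: 60             # assumed client option, shown as an example
```

Fields left out of such a config fall back to the defaults listed above, so a minimal setup could specify only the implementation and rely on claude_opus at temperature 0.3.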