evals.models.bedrock

class BedrockModel(default_concurrency=20, _verbose=False, _rate_limiter=<factory>, model_id='anthropic.claude-v2', temperature=0.0, max_tokens=256, top_p=1, top_k=256, stop_sequences=<factory>, session=None, client=None, max_content_size=None, extra_parameters=<factory>, initial_rate_limit=5)

Bases: BaseModel

An interface for using LLMs via AWS Bedrock.

This class wraps the boto3 Bedrock client for use with Phoenix LLM evaluations. Calls to the AWS API are dynamically throttled when encountering rate limit errors. Requires the boto3 package to be installed.

Supports Async: 🟡

boto3 does not support async calls, so synchronous calls are wrapped in an executor.

Parameters:
  • model_id (str) – The Bedrock model ID to use.

  • temperature (float, optional) – Sampling temperature to use. Defaults to 0.0.

  • max_tokens (int, optional) – Maximum number of tokens to generate in the completion. Defaults to 256.

  • top_p (float, optional) – Total probability mass of tokens to consider at each step. Defaults to 1.

  • top_k (int, optional) – The number of highest-probability tokens the model considers at each sampling step. Defaults to 256.

  • stop_sequences (List[str], optional) – If the model encounters a stop sequence, it stops generating further tokens. Defaults to an empty list.

  • session (Any, optional) – A boto3 Bedrock session. If provided, a new Bedrock client will be created from this session. Defaults to None.

  • client (Any, optional) – The Bedrock client to use. If unset, a new one is created with boto3. Defaults to None.

  • max_content_size (Optional[int], optional) – If using a fine-tuned model, set this to the maximum content size. Defaults to None.

  • extra_parameters (Dict[str, Any], optional) – Any extra parameters to add to the request body (e.g., countPenalty for AI21 models). Defaults to an empty dictionary.

  • initial_rate_limit (int, optional) – The initial internal rate limit in allowed requests per second for making LLM calls. This limit adjusts dynamically based on rate limit errors. Defaults to 5.
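As a quick reference for the parameters above, a minimal sketch of a more customized configuration; every value below is an illustrative assumption rather than a recommendation:

from phoenix.evals import BedrockModel

# All values here are illustrative assumptions, not recommended settings.
model = BedrockModel(
    model_id="anthropic.claude-v2",
    temperature=0.2,                # lower temperature for more deterministic output
    max_tokens=512,                 # allow longer completions than the 256-token default
    top_p=0.9,                      # consider only the top 90% of probability mass at each step
    stop_sequences=["\n\nHuman:"],  # stop generating when this sequence is produced
    extra_parameters={},            # extra request-body fields, e.g. countPenalty for AI21 models
    initial_rate_limit=5,           # starting requests-per-second budget; adjusts on rate limit errors
)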

Example

# configure your AWS credentials using the AWS CLI
# https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-files.html

from phoenix.evals import BedrockModel
model = BedrockModel(model_id="anthropic.claude-v2")
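
If AWS credentials are already managed through boto3 in your application, a session or pre-built client can be passed in directly instead. A minimal sketch, assuming a named profile and region that exist in your local AWS configuration; the final call relies on phoenix.evals models being callable on a prompt string:

import boto3

from phoenix.evals import BedrockModel

# "my-profile" and "us-east-1" are placeholders for your own AWS configuration.
session = boto3.Session(profile_name="my-profile", region_name="us-east-1")
model = BedrockModel(model_id="anthropic.claude-v2", session=session)

# Invoke the model on a single prompt and print the completion text.
print(model("Hello, Claude. Reply with a single word."))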