evals.models.mistralai#
- DEFAULT_MISTRAL_MODEL = 'mistral-large-latest'#
Use the latest large Mistral model by default.
- class MistralAIModel(default_concurrency: int = 20, _verbose: bool = False, _rate_limiter: RateLimiter = <factory>, model: str = 'mistral-large-latest', temperature: float = 0, top_p: float | None = None, random_seed: int | None = None, response_format: Dict[str, str] | None = None, safe_mode: bool = False, safe_prompt: bool = False)#
Bases: BaseModel
A model class for Mistral AI. Requires the mistralai package to be installed. A usage sketch follows the attribute list below.
- invocation_parameters() → Dict[str, Any]#
- model: str = 'mistral-large-latest'#
- random_seed: int | None = None#
- response_format: Dict[str, str] | None = None#
- safe_mode: bool = False#
- safe_prompt: bool = False#
- temperature: float = 0#
- top_p: float | None = None#
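A minimal usage sketch, assuming MistralAIModel is importable from this module and that, like other BaseModel subclasses, an instance can be called with a prompt string. The prompt text and random_seed value are illustrative only, and an API key is assumed to be configured for the underlying mistralai client.

```python
# Sketch only: assumes the `mistralai` package is installed and a
# MISTRAL_API_KEY is available in the environment for the Mistral client.
from phoenix.evals.models.mistralai import MistralAIModel

model = MistralAIModel(
    model="mistral-large-latest",  # same value as DEFAULT_MISTRAL_MODEL
    temperature=0,
    random_seed=42,  # illustrative: fix the seed for more reproducible output
)

# BaseModel subclasses are callable with a prompt string and return the
# model's text completion.
print(model("Reply with the single word 'ok'."))
```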
- exception MistralRateLimitError#
Bases: Exception
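A hedged sketch of catching MistralRateLimitError in caller code. The retry loop and backoff values are illustrative assumptions, not part of the library, which already applies its own RateLimiter internally.

```python
# Sketch only: manual retry on rate-limit errors with simple exponential backoff.
import time

from phoenix.evals.models.mistralai import MistralAIModel, MistralRateLimitError

model = MistralAIModel()

for attempt in range(3):
    try:
        answer = model("Classify the sentiment of: 'Great product!'")
        break
    except MistralRateLimitError:
        # Back off before retrying; delays here are arbitrary for illustration.
        time.sleep(2 ** attempt)
```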