Mistral Client

class codestral_ros2_gen.models.mistral_client.ModelUsage(prompt_tokens, completion_tokens, total_tokens)[source]

Bases: object

Dataclass to represent token usage statistics.

Parameters:
  • prompt_tokens (int)

  • completion_tokens (int)

  • total_tokens (int)

prompt_tokens

Number of tokens used in the prompt.

Type:

int

completion_tokens

Number of tokens used in the completion.

Type:

int

total_tokens

Total number of tokens used.

Type:

int
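The dataclass above can be sketched as follows — a minimal reconstruction from the documented fields, not the package's actual source:

```python
from dataclasses import dataclass


@dataclass
class ModelUsage:
    """Token usage statistics, mirroring the documented fields."""

    prompt_tokens: int
    completion_tokens: int
    total_tokens: int


# Example: usage stats for a small request.
usage = ModelUsage(prompt_tokens=120, completion_tokens=80, total_tokens=200)
print(usage.total_tokens)  # 200
```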

class codestral_ros2_gen.models.mistral_client.MistralClient(api_key=None, config=None)[source]

Bases: object

Client for interacting with the Mistral AI API.

Parameters:
  • api_key (str | None)

  • config (Dict[str, Any] | None)

DEFAULT_CONFIG

Default configuration for the Mistral client.

Type:

Dict[str, Any]

api_key

API key for authenticating with the Mistral AI API.

Type:

str

client

Mistral client instance.

Type:

Mistral

config

Configuration for the Mistral client.

Type:

Dict[str, Any]

__init__(api_key=None, config=None)[source]

Initialize the MistralClient.

Parameters:
  • api_key (Optional[str]) – API key for authenticating with the Mistral AI API.

  • config (Optional[Dict[str, Any]]) – Configuration for the Mistral client.

_get_api_key(api_key, config)[source]

Get the API key from the provided sources, in order of precedence.

Parameters:
  • api_key (Optional[str]) – API key provided directly.

  • config (Dict[str, Any]) – Configuration dictionary.

Returns:

The resolved API key.

Return type:

str

Raises:

RuntimeError – If the API key is not found in any of the provided sources.
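A typical resolution order is: explicit argument first, then the config dictionary, then an environment variable. The sketch below illustrates that precedence; the config key name and the `MISTRAL_API_KEY` variable name are assumptions for illustration, not confirmed by this documentation:

```python
import os
from typing import Any, Dict, Optional


def resolve_api_key(api_key: Optional[str], config: Dict[str, Any]) -> str:
    """Illustrative precedence: direct argument > config entry > environment."""
    if api_key:
        return api_key
    from_config = config.get("api_key")  # assumed config layout
    if from_config:
        return from_config
    from_env = os.environ.get("MISTRAL_API_KEY")  # assumed variable name
    if from_env:
        return from_env
    raise RuntimeError("API key not found in argument, config, or environment.")


print(resolve_api_key("sk-direct", {"api_key": "sk-config"}))  # direct argument wins
print(resolve_api_key(None, {"api_key": "sk-config"}))         # falls back to config
```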

complete(prompt, system_prompt=None, model_type=None, temperature=None)[source]

Get a completion from the model.

Parameters:
  • prompt (str) – Main prompt text.

  • system_prompt (Optional[str]) – Optional override for system prompt.

  • model_type (Optional[str]) – Optional override for model type.

  • temperature (Optional[float]) – Optional override for temperature.

Returns:

Tuple of (generated_text, usage_stats).

Return type:

Tuple[str, ModelUsage]

Raises:
  • ValueError – If the prompt is empty.

  • ConnectionError – If there is a connection error with the Mistral API.

  • RuntimeError – If there is an API error.
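Callers receive a (generated_text, usage_stats) tuple and should handle the documented exceptions. The snippet below sketches that calling pattern against a hypothetical stand-in client, since a live call needs a real API key; the stand-in's echo behavior and token counting are illustrative only:

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class ModelUsage:
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int


class StubClient:
    """Hypothetical stand-in exposing the documented complete() signature."""

    def complete(
        self,
        prompt: str,
        system_prompt: Optional[str] = None,
        model_type: Optional[str] = None,
        temperature: Optional[float] = None,
    ) -> Tuple[str, ModelUsage]:
        if not prompt:
            raise ValueError("Prompt must not be empty.")
        text = f"echo: {prompt}"  # a real client returns model output here
        used_in, used_out = len(prompt.split()), len(text.split())
        return text, ModelUsage(used_in, used_out, used_in + used_out)


client = StubClient()
try:
    text, usage = client.complete("Write a ROS2 publisher node")
    print(text, usage.total_tokens)
except (ValueError, ConnectionError, RuntimeError) as exc:
    print(f"completion failed: {exc}")
```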

_prepare_messages(prompt, system_prompt=None)[source]

Prepare messages list for the API call.

Parameters:
  • prompt (str) – Main prompt text.

  • system_prompt (Optional[str]) – Optional system prompt.

Returns:

List of messages for the API call.

Return type:

List[Dict[str, str]]
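The returned message list follows the common chat-API shape of role/content dictionaries. A minimal sketch of that preparation, assuming the usual "system" and "user" roles, looks like:

```python
from typing import Dict, List, Optional


def prepare_messages(
    prompt: str, system_prompt: Optional[str] = None
) -> List[Dict[str, str]]:
    """Build a chat message list: optional system message first, then the user prompt."""
    messages: List[Dict[str, str]] = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})
    return messages


print(prepare_messages("Generate a ROS2 node", system_prompt="You are a code generator."))
```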