LiteLLMModel
- model_type (Union[ModelType, str]): Model for which a backend is created, such as GPT-3.5-turbo, Claude-2, etc.
- model_config_dict (Optional[Dict[str, Any]], optional): A dictionary that will be fed into `completion()`. If `None`, `LiteLLMConfig().as_dict()` will be used. (default: `None`)
- api_key (Optional[str], optional): The API key for authenticating with the model service. (default: `None`)
- url (Optional[str], optional): The URL to the model service. (default: `None`)
- token_counter (Optional[BaseTokenCounter], optional): Token counter to use for the model. If not provided, `LiteLLMTokenCounter` will be used. (default: `None`)
- timeout (Optional[float], optional): The timeout value in seconds for API calls. If not provided, falls back to the MODEL_TIMEOUT environment variable or defaults to 180 seconds. (default: `None`)
- **kwargs (Any): Additional arguments to pass to the client initialization.
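The timeout fallback described above (explicit value, then the MODEL_TIMEOUT environment variable, then 180 seconds) can be sketched as follows; `resolve_timeout` is a hypothetical helper for illustration, not part of the class's public API:

```python
import os

def resolve_timeout(timeout=None):
    # Explicit argument wins; otherwise consult the MODEL_TIMEOUT
    # environment variable; otherwise use the 180-second default.
    if timeout is not None:
        return float(timeout)
    env = os.environ.get("MODEL_TIMEOUT")
    return float(env) if env is not None else 180.0
```

For example, `resolve_timeout(30)` returns `30.0`, while `resolve_timeout()` with no MODEL_TIMEOUT set returns `180.0`.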
init
_convert_response_from_litellm_to_openai
- response (LiteLLMResponse): The response object from LiteLLM.
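A minimal sketch of the kind of field mapping such a conversion performs, assuming the LiteLLM response is dict-like with OpenAI-style `choices` and `usage` fields; `convert_response` and its exact field handling are illustrative assumptions, not the actual implementation:

```python
def convert_response(litellm_resp: dict) -> dict:
    # Map a LiteLLM-style response dict onto the OpenAI
    # chat.completion shape: id, choices (with message/finish_reason),
    # and usage are carried over field by field.
    return {
        "id": litellm_resp["id"],
        "object": "chat.completion",
        "choices": [
            {
                "index": c["index"],
                "message": {
                    "role": c["message"]["role"],
                    "content": c["message"]["content"],
                },
                "finish_reason": c.get("finish_reason"),
            }
            for c in litellm_resp["choices"]
        ],
        "usage": litellm_resp.get("usage", {}),
    }
```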
token_counter
_run
- messages (List[OpenAIMessage]): Message list with the chat history in OpenAI format.
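The OpenAI message format referred to above is a list of dicts, each with a `role` and `content` key, for example:

```python
# Chat history in OpenAI message format: each entry carries a role
# ("system", "user", or "assistant") and the message text.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is CAMEL-AI?"},
]
```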