
# Drop Unsupported Params

Drop OpenAI params that are not supported by your LLM provider.

## Quick Start

```python
import litellm
import os

# set keys
os.environ["COHERE_API_KEY"] = "co-.."

litellm.drop_params = True # 👈 KEY CHANGE

response = litellm.completion(
    model="command-r",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    response_format={"key": "value"},
)
```

LiteLLM maps the supported OpenAI params for each provider + model (e.g. function calling is supported by Anthropic models on Bedrock, but not by Titan).

See `litellm.get_supported_openai_params("command-r")`.

If a provider/model doesn't support a particular param, you can drop it.
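Conceptually, dropping a param is just filtering the request kwargs against the set of params the provider/model supports before the request is built. The sketch below illustrates that idea only; it is not LiteLLM's actual internals, and the `SUPPORTED` table is a made-up example:

```python
# Illustrative only: filter request params against a per-model table of
# supported OpenAI params. The table below is a hypothetical example.
SUPPORTED = {
    "command-r": {"messages", "temperature", "max_tokens", "stream"},
}

def drop_unsupported(model: str, params: dict, drop_params: bool) -> dict:
    """Return params with unsupported keys removed (or raise if dropping is off)."""
    supported = SUPPORTED.get(model, set())
    unsupported = set(params) - supported
    if unsupported and not drop_params:
        raise ValueError(f"unsupported params for {model}: {sorted(unsupported)}")
    return {k: v for k, v in params.items() if k in supported}

cleaned = drop_unsupported(
    "command-r",
    {"messages": [], "temperature": 0.5, "response_format": {"key": "value"}},
    drop_params=True,
)
# response_format is silently dropped; messages and temperature are kept
```

With `drop_params=False` (the default), the same call would raise instead of silently dropping `response_format`.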

## OpenAI Proxy Usage

```yaml
litellm_settings:
  drop_params: true
```
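For context, here is how that setting might sit in a fuller proxy `config.yaml` sketch — the model entry and the `os.environ/...` key reference are illustrative placeholders, following the proxy's config conventions:

```yaml
model_list:
  - model_name: command-r
    litellm_params:
      model: command-r
      api_key: os.environ/COHERE_API_KEY

litellm_settings:
  drop_params: true
```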

## Pass drop_params in completion(..)

Set `drop_params=True` only when calling specific models:

```python
import litellm
import os

# set keys
os.environ["COHERE_API_KEY"] = "co-.."

response = litellm.completion(
    model="command-r",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    response_format={"key": "value"},
    drop_params=True
)
```

## Specify params to drop

To drop specific params when calling a provider (e.g. 'logit_bias' for vllm), use `additional_drop_params`:

```python
import litellm
import os

# set keys
os.environ["COHERE_API_KEY"] = "co-.."

response = litellm.completion(
    model="command-r",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    response_format={"key": "value"},
    additional_drop_params=["response_format"]
)
```

`additional_drop_params`: List or null - a list of OpenAI params you want to drop when making a call to the model.
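The effect of `additional_drop_params` can be pictured as a filter applied to the request params before they are sent. A minimal sketch, assuming a hypothetical helper (this is illustrative, not LiteLLM's internals):

```python
# Illustrative helper: remove explicitly listed OpenAI params from a request.
def apply_additional_drop_params(params: dict, additional_drop_params=None) -> dict:
    """Return a copy of params with the listed keys removed."""
    if not additional_drop_params:
        return dict(params)
    return {k: v for k, v in params.items() if k not in additional_drop_params}

cleaned = apply_additional_drop_params(
    {"messages": [], "response_format": {"key": "value"}},
    additional_drop_params=["response_format"],
)
# → {"messages": []}
```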