[BETA] Prompt Management

info

This feature is currently in beta and may change unexpectedly. We expect it to be more stable by February 2025.

Run experiments or swap the model (e.g. from gpt-4o to a gpt-4o-mini fine-tune) from your prompt management tool (e.g. Langfuse) instead of changing your application code.

Supported Integrations:

  • Langfuse

Quick Start

import os
import litellm

os.environ["LANGFUSE_PUBLIC_KEY"] = "public_key" # [OPTIONAL] set here or in `.completion`
os.environ["LANGFUSE_SECRET_KEY"] = "secret_key" # [OPTIONAL] set here or in `.completion`

litellm.set_verbose = True # see raw request to provider

resp = litellm.completion(
    model="langfuse/gpt-3.5-turbo",
    prompt_id="test-chat-prompt",
    prompt_variables={"user_message": "this is used"}, # [OPTIONAL]
    messages=[{"role": "user", "content": "<IGNORED>"}],
)

Expected Logs:

POST Request Sent from LiteLLM:
curl -X POST \
https://api.openai.com/v1/ \
-d '{'model': 'gpt-3.5-turbo', 'messages': <YOUR LANGFUSE PROMPT TEMPLATE>}'

How to set model

Set the model on LiteLLM

You can use the format langfuse/<litellm_model_name>:

litellm.completion(
    model="langfuse/gpt-3.5-turbo", # or `langfuse/anthropic/claude-3-5-sonnet`
    ...
)

Set the model in Langfuse

If the model is specified in the Langfuse prompt config, it will be used.

model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: azure/chatgpt-v-2
      api_key: os.environ/AZURE_API_KEY
      api_base: os.environ/AZURE_API_BASE
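
The model name coming from the Langfuse prompt config is then resolved against the proxy's model_list above. A minimal sketch of setting it on the Langfuse side, using the Langfuse Python SDK and the `test-chat-prompt` name from the Quick Start (the config keys are an assumption; check Langfuse's docs for the exact create_prompt signature):

from langfuse import Langfuse

langfuse = Langfuse()  # reads LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY from env

# Hypothetical example: pin the model in the prompt's config so it can be
# changed in Langfuse without touching application code.
langfuse.create_prompt(
    name="test-chat-prompt",
    type="chat",
    prompt=[{"role": "user", "content": "{{user_message}}"}],
    config={"model": "gpt-3.5-turbo"},  # assumption: model carried in prompt config
    labels=["production"],
)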

What is 'prompt_variables'?

  • prompt_variables: A dictionary of variables that will be used to replace parts of the prompt.
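
For example, Langfuse templates use {{variable}} placeholders, and prompt_variables fills them in before the request is sent. A sketch reusing the Quick Start prompt (assuming its template contains {{user_message}}):

resp = litellm.completion(
    model="langfuse/gpt-3.5-turbo",
    prompt_id="test-chat-prompt",
    # a template like "Answer briefly: {{user_message}}" is rendered to
    # "Answer briefly: what is litellm?" before being sent to the provider
    prompt_variables={"user_message": "what is litellm?"},
    messages=[{"role": "user", "content": "<IGNORED>"}],
)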

What is 'prompt_id'?

  • prompt_id: The ID of the prompt that will be used for the request.
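
For the Langfuse integration, prompt_id is the prompt's name in Langfuse. A quick way to sanity-check that it resolves, sketched with the Langfuse Python SDK:

from langfuse import Langfuse

langfuse = Langfuse()

# The prompt_id passed to litellm.completion should resolve to an
# existing Langfuse prompt (here, the name used in the Quick Start).
prompt = langfuse.get_prompt("test-chat-prompt")
print(prompt.prompt)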

What will the formatted prompt look like?

/chat/completions messages

The messages field sent in by the client is ignored.

The Langfuse prompt will replace the messages field.

To replace parts of the prompt, use the prompt_variables field (see 'What is prompt_variables?' above).

If the Langfuse prompt is a string, it will be sent as a user message (not all providers support system messages).

If the Langfuse prompt is a list, it will be sent as is (Langfuse chat prompts are OpenAI compatible).
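
Put together, the mapping behaves roughly like this (an illustrative sketch, not LiteLLM's actual internals):

def to_messages(langfuse_prompt):
    # A string prompt becomes a single user message
    # (not all providers support system messages).
    if isinstance(langfuse_prompt, str):
        return [{"role": "user", "content": langfuse_prompt}]
    # A chat prompt (list) is already OpenAI-compatible, so it is sent as is.
    return langfuse_prompt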

Architectural Overview

API Reference

These are the params you can pass to the litellm.completion function in the SDK, or under litellm_params in the proxy config.yaml:

prompt_id: str # required
prompt_variables: Optional[dict] # optional
langfuse_public_key: Optional[str] # optional
langfuse_secret: Optional[str] # optional
langfuse_secret_key: Optional[str] # optional
langfuse_host: Optional[str] # optional
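
For example, on the proxy these can sit under litellm_params (a sketch; the model_name, prompt_id, and env var names are placeholders):

model_list:
  - model_name: my-langfuse-model
    litellm_params:
      model: langfuse/gpt-3.5-turbo
      prompt_id: "test-chat-prompt"
      api_key: os.environ/OPENAI_API_KEY
      langfuse_public_key: os.environ/LANGFUSE_PUBLIC_KEY
      langfuse_secret_key: os.environ/LANGFUSE_SECRET_KEY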