Cost tracking, observability, and more for Open WebUI
Create Portkey API Key
Add Your Provider
Get Your Model Slugs
Model slugs follow the format `@provider-slug/model-name` (for example, `@openai-test/gpt-4o`); use this slug in the `model` field of API requests.
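As a quick check that a slug resolves, you can call Portkey's OpenAI-compatible endpoint directly. Below is a minimal sketch using the Python OpenAI SDK; `@openai-test/gpt-4o` is a placeholder for whatever slug your Model Catalog shows.

```python
from openai import OpenAI

# Point the standard OpenAI SDK at Portkey's OpenAI-compatible endpoint.
client = OpenAI(
    api_key="YOUR_PORTKEY_API_KEY",        # Portkey API key from the first step
    base_url="https://api.portkey.ai/v1",
)

# The model slug combines the provider slug and the model name.
response = client.chat.completions.create(
    model="@openai-test/gpt-4o",           # @provider-slug/model-name
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```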
Access Admin Panel
Enable Direct Connections
Configure Portkey Connection
In the connection settings, set the URL to `https://api.portkey.ai/v1`, enter your Portkey API key, use `portkey` as the prefix ID (or any name you prefer), and list the model slugs (e.g., `@openai/gpt-4o`, `@anthropic/claude-3-sonnet`) from Step 1.
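If the connection fails to save, it can help to verify the same base URL, API key, and model ID outside Open WebUI first. A rough sketch with the `requests` library, assuming `@openai/gpt-4o` is one of your model IDs:

```python
import requests

# Same values you enter in Open WebUI's direct connection form.
resp = requests.post(
    "https://api.portkey.ai/v1/chat/completions",
    headers={"Authorization": "Bearer YOUR_PORTKEY_API_KEY"},
    json={
        "model": "@openai/gpt-4o",  # one of the model IDs listed in the connection
        "messages": [{"role": "user", "content": "Connection test"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```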
Select and Use Your Model
Choose your model (e.g., `@model-name`) from the dropdown at the top.
Step 1: Implement Budget Controls & Rate Limits
Step 2: Define Model Access Rules
Step 3: Set Routing Configuration
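Routing is defined as a Portkey config attached to the API key Open WebUI uses. As an illustration only, the sketch below prints the JSON for a simple fallback config; the provider slugs are placeholders, and the exact config shape should be checked against the Portkey configs documentation.

```python
import json

# Hypothetical fallback config: try the primary provider first, then the fallback.
# Paste the resulting JSON into a new config in the Portkey dashboard and attach
# it to the Portkey API key that Open WebUI uses.
routing_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"override_params": {"model": "@openai-prod/gpt-4o"}},
        {"override_params": {"model": "@anthropic-prod/claude-3-sonnet"}},
    ],
}

print(json.dumps(routing_config, indent=2))
```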
Step 4: Implement Access Controls
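One way to attribute usage to teams or individual users is to issue team-specific Portkey API keys and tag requests with Portkey metadata. The sketch below assumes the `x-portkey-metadata` header and the placeholder key and names shown; adapt it to your own key and team structure.

```python
import json
from openai import OpenAI

# Hypothetical per-team attribution: pass Portkey metadata on every request so
# usage shows up against the right team in Portkey's analytics.
client = OpenAI(
    api_key="TEAM_SPECIFIC_PORTKEY_API_KEY",
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-metadata": json.dumps({"_user": "jane@example.com", "team": "engineering"})
    },
)

response = client.chat.completions.create(
    model="@openai-prod/gpt-4o",
    messages=[{"role": "user", "content": "Tagged request"}],
)
print(response.choices[0].message.content)
```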
Step 5: Deploy & Monitor
Access Image Settings
Configure Image Generation Engine
Point the image generation engine at your Portkey connection and set the default model to your image model slug (e.g., `@openai-test/dall-e-3`).
Configure Model-Specific Settings
Test Your Configuration
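To confirm the image slug works through Portkey before testing it in the Open WebUI interface, here is a minimal sketch with the Python OpenAI SDK; the slug is a placeholder from the earlier step.

```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_PORTKEY_API_KEY",
    base_url="https://api.portkey.ai/v1",
)

# Generate one test image through the same slug configured in Open WebUI.
image = client.images.generate(
    model="@openai-test/dall-e-3",
    prompt="A lighthouse at dusk, watercolor style",
    size="1024x1024",
    n=1,
)
print(image.data[0].url)
```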
You can set request defaults by creating a config with `override_params` and attaching it to your Portkey API key; see the Portkey docs for a guide to configs. To default to a specific model, include its model slug in your default config object.
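For example, a minimal default config that pins a model through `override_params` could look like the sketch below; create the config in the Portkey dashboard and attach it to your API key. The slug is a placeholder.

```python
import json

# Hypothetical default config: every request through this API key uses the
# pinned model unless the caller overrides it.
default_config = {
    "override_params": {
        "model": "@openai-test/gpt-4o",
    }
}

print(json.dumps(default_config, indent=2))
```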
Can I use multiple LLM providers with the same API key?
How do I track costs for different teams?
What happens if a team exceeds their budget limit?