Integrate Langfuse observability with Portkey’s AI gateway for comprehensive LLM monitoring and advanced routing capabilities
Langfuse is an open-source LLM observability platform that helps you monitor, debug, and analyze your LLM applications. When combined with Portkey, you get the best of both worlds: Langfuse's detailed observability and Portkey's advanced AI gateway features. This integration allows you to:
Track all LLM requests in Langfuse while routing through Portkey
Use Portkey’s 250+ LLM providers with Langfuse observability
Implement advanced features like caching, fallbacks, and load balancing
Maintain detailed traces and analytics in both platforms
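As a quick illustration of the basic flow, here is a minimal sketch (assuming Langfuse's drop-in OpenAI wrapper and the portkey_ai helper package; the virtual key and model name below are placeholders): the client points at Portkey's gateway URL, and Langfuse records each call as a trace.

# Langfuse's OpenAI wrapper records every request as a trace
from langfuse.openai import OpenAI
# Portkey helpers provide the gateway URL and a header builder
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="dummy_key",  # the real provider key is stored in Portkey
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        virtual_key="YOUR_OPENAI_VIRTUAL_KEY",  # placeholder virtual key
    ),
)

# This call is routed through Portkey and traced in Langfuse
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)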
Use Portkey’s config system for advanced features while tracking in Langfuse:
# Create a config in the Portkey dashboard first, then reference it by its ID
from langfuse.openai import OpenAI  # Langfuse-instrumented OpenAI client
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="dummy_key",  # the real provider key is managed by Portkey
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        config="pc-langfuse-prod"  # your saved config ID
    )
)
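Requests made with this client behave like ordinary OpenAI calls; Portkey applies whatever the saved config defines (caching, fallbacks, load balancing) while Langfuse still records the trace. A short usage sketch, assuming the client above and a placeholder model name:

# Standard call; routing behavior comes from the saved Portkey config
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; the config may override routing
    messages=[{"role": "user", "content": "Summarize our latest release notes"}],
)
print(response.choices[0].message.content)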