Use Portkey with AWS’s Strands Agents to take your AI Agents to production
Quickstart: Install
Quickstart: Configure OpenAIModel with Portkey
Quickstart: Run
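A minimal sketch of the configure-and-run steps, assuming strands-agents is installed with its OpenAI model provider and that your Portkey API key already has a provider or config attached (model ID and params here are placeholders):

```python
from strands import Agent
from strands.models.openai import OpenAIModel

# Route the OpenAI-compatible model through Portkey's gateway instead of api.openai.com
model = OpenAIModel(
    client_args={
        "api_key": "YOUR_PORTKEY_API_KEY",        # Portkey API key, not a provider key
        "base_url": "https://api.portkey.ai/v1",  # Portkey Gateway URL
    },
    model_id="gpt-4o",
    params={"temperature": 0.7, "max_tokens": 1000},
)

agent = Agent(model=model)
response = agent("What is 2 + 2?")
print(response)
```

Every request the agent makes now flows through Portkey, so it appears in your Portkey logs with cost, latency, and token details.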
Create Provider
Create Config
Create API Key
Install Packages
Configure Portkey Client
In the OpenAIModel, set the base_url to Portkey's Gateway URL and pass your Portkey API key directly as the main API key.
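For example (a sketch; the saved-config header is optional and the config ID is a placeholder):

```python
from strands.models.openai import OpenAIModel

model = OpenAIModel(
    client_args={
        "api_key": "YOUR_PORTKEY_API_KEY",        # Portkey API key from the steps above
        "base_url": "https://api.portkey.ai/v1",  # Portkey Gateway URL, not api.openai.com
        # Optional: route through a specific saved config instead of the key's default
        "default_headers": {"x-portkey-config": "YOUR_CONFIG_ID"},
    },
    model_id="gpt-4o",
)
```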
View the Log
1. Create a Virtual Key
2. Create a Config
3. Create an API Key
4. Deploy & Monitor
RateLimitError
When your requests exceed quota, catch RateLimitError:
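A sketch, assuming the Strands agent surfaces the underlying OpenAI SDK exceptions:

```python
import openai
from strands import Agent
from strands.models.openai import OpenAIModel

model = OpenAIModel(
    client_args={"api_key": "YOUR_PORTKEY_API_KEY", "base_url": "https://api.portkey.ai/v1"},
    model_id="gpt-4o",
)
agent = Agent(model=model)

try:
    response = agent("Summarize this quarter's numbers.")
except openai.RateLimitError as err:
    # Back off before retrying, or attach a Portkey config with fallbacks/load balancing
    print(f"Rate limited by the provider: {err}")
```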
TimeoutError
Set timeouts and catch TimeoutError:
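A sketch, assuming the timeout value passes through client_args to the underlying OpenAI client, which raises openai.APITimeoutError when it fires:

```python
import openai
from strands import Agent
from strands.models.openai import OpenAIModel

model = OpenAIModel(
    client_args={
        "api_key": "YOUR_PORTKEY_API_KEY",
        "base_url": "https://api.portkey.ai/v1",
        "timeout": 30,  # seconds for the whole request
    },
    model_id="gpt-4o",
)
agent = Agent(model=model)

try:
    response = agent("Draft the weekly status update.")
except openai.APITimeoutError as err:
    print(f"Request timed out: {err}")
```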
AuthenticationError
Verify your API key and header settings:
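For example (a sketch, assuming the OpenAI SDK's AuthenticationError surfaces from the agent call):

```python
import openai
from strands import Agent
from strands.models.openai import OpenAIModel

model = OpenAIModel(
    client_args={
        "api_key": "YOUR_PORTKEY_API_KEY",        # must be a valid Portkey API key
        "base_url": "https://api.portkey.ai/v1",  # must be the Portkey gateway
    },
    model_id="gpt-4o",
)
agent = Agent(model=model)

try:
    response = agent("Hello")
except openai.AuthenticationError:
    # Check the key in the Portkey dashboard and confirm the base_url above before retrying
    raise
```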
Best-practice Retry
Use a simple exponential backoff:
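A sketch of client-side backoff; run_with_backoff is a hypothetical helper, and Portkey configs can also retry at the gateway level:

```python
import random
import time

import openai

def run_with_backoff(agent, prompt, max_retries=5, base_delay=1.0):
    """Retry transient failures with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return agent(prompt)
        except (openai.RateLimitError, openai.APITimeoutError, openai.APIConnectionError):
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
            time.sleep(delay)
```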
How does Portkey enhance Strands Agents?
Can I use Portkey with existing Strands Agents applications?
Does Portkey work with all Strands Agents features?
Can I track usage across multiple agents in a workflow?
Yes. Pass the same trace_id across multiple agents and requests to track the entire workflow. This is especially useful for multi-agent systems where you want to understand the full execution path.
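A sketch, assuming the trace ID is forwarded via Portkey's x-portkey-trace-id header (the ID value and make_agent helper are illustrative):

```python
from strands import Agent
from strands.models.openai import OpenAIModel

TRACE_ID = "support-workflow-1234"  # reuse one ID for the whole workflow

def make_agent(system_prompt: str) -> Agent:
    model = OpenAIModel(
        client_args={
            "api_key": "YOUR_PORTKEY_API_KEY",
            "base_url": "https://api.portkey.ai/v1",
            "default_headers": {"x-portkey-trace-id": TRACE_ID},
        },
        model_id="gpt-4o",
    )
    return Agent(model=model, system_prompt=system_prompt)

# Both agents share TRACE_ID, so Portkey groups their calls into one trace
researcher = make_agent("You research topics and return key facts.")
writer = make_agent("You turn research notes into a short summary.")
```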
How do I filter logs and traces for specific agent runs?
You can add custom metadata such as agent_name, agent_type, or session_id to easily find and analyze specific agent executions.
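For example, a sketch using Portkey's x-portkey-metadata header (field values are placeholders):

```python
import json

from strands.models.openai import OpenAIModel

metadata = {
    "agent_name": "research-agent",
    "agent_type": "researcher",
    "session_id": "sess-42",
}

model = OpenAIModel(
    client_args={
        "api_key": "YOUR_PORTKEY_API_KEY",
        "base_url": "https://api.portkey.ai/v1",
        "default_headers": {"x-portkey-metadata": json.dumps(metadata)},
    },
    model_id="gpt-4o",
)
# Requests from this model can now be filtered by agent_name, agent_type,
# or session_id in Portkey's logs and analytics.
```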
Can I use my own API keys with Portkey?