# Amazon Bedrock Mantle
OpenClaw includes a bundled Amazon Bedrock Mantle provider that connects to the Mantle OpenAI-compatible endpoint. Mantle hosts open-source and third-party models (GPT-OSS, Qwen, Kimi, GLM, and similar) through a standard `/v1/chat/completions` surface backed by Bedrock infrastructure.
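Because the surface is OpenAI-compatible, a chat completion is an ordinary bearer-authenticated POST. The sketch below builds such a request with the Python standard library; the base URL and model ID are placeholders, not documented values, and the final send is left commented out.

```python
import json
import urllib.request

# Placeholder base URL -- substitute the actual Mantle endpoint for your region.
BASE_URL = "https://bedrock-mantle.us-east-1.amazonaws.com/v1"

def build_chat_request(token: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat completion request for Mantle."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("example-token", "qwen3-32b", "Hello")
# urllib.request.urlopen(req) would actually send it; omitted here.
```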
## What OpenClaw supports
- Provider: `amazon-bedrock-mantle`
- API: `openai-completions` (OpenAI-compatible)
- Auth: explicit `AWS_BEARER_TOKEN_BEDROCK` or IAM credential-chain bearer-token generation
- Region: `AWS_REGION` or `AWS_DEFAULT_REGION` (default: `us-east-1`)
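The region lookup above can be sketched as a simple ordered fallback (the function name is illustrative, not OpenClaw's actual internal API):

```python
import os

def resolve_mantle_region(env=None) -> str:
    """Resolve the Mantle region: AWS_REGION wins, then AWS_DEFAULT_REGION,
    then the documented default of us-east-1."""
    env = os.environ if env is None else env
    return env.get("AWS_REGION") or env.get("AWS_DEFAULT_REGION") or "us-east-1"
```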
## Automatic model discovery
When `AWS_BEARER_TOKEN_BEDROCK` is set, OpenClaw uses it directly. Otherwise, OpenClaw attempts to generate a Mantle bearer token from the AWS default credential chain, including shared credentials/config profiles, SSO, web identity, and instance or task roles. It then discovers available Mantle models by querying the region's `/v1/models` endpoint. Discovery results are cached for 1 hour, and IAM-derived bearer tokens are refreshed hourly.
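The 1-hour caching behavior amounts to a simple TTL cache around the discovery call. A minimal sketch (this is an illustration of the pattern, not OpenClaw's actual implementation):

```python
import time

class HourlyCache:
    """Reuse a fetched value until it is older than the TTL, then refetch."""

    def __init__(self, fetch, ttl_seconds: float = 3600.0):
        self._fetch = fetch              # callable that queries /v1/models
        self._ttl = ttl_seconds
        self._value = None
        self._fetched_at = -float("inf") # force a fetch on first use

    def get(self, now=None):
        now = time.monotonic() if now is None else now
        if now - self._fetched_at >= self._ttl:
            self._value = self._fetch()
            self._fetched_at = now
        return self._value
```

Passing `now` explicitly makes the expiry logic easy to exercise without waiting an hour.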
Supported regions: `us-east-1`, `us-east-2`, `us-west-2`, `ap-northeast-1`, `ap-south-1`, `ap-southeast-3`, `eu-central-1`, `eu-west-1`, `eu-west-2`, `eu-south-1`, `eu-north-1`, `sa-east-1`.
## Onboarding
- Choose one auth path on the gateway host: set `AWS_BEARER_TOKEN_BEDROCK` explicitly, or provide AWS SDK-compatible IAM credentials and let OpenClaw mint the token.
- Verify models are discovered: they should appear under the `amazon-bedrock-mantle` provider. No additional config is required unless you want to override defaults.
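To spot-check discovery by hand, you can hit the same `/v1/models` endpoint the provider queries. The helper below is a sketch using the Python standard library; pass your region's actual Mantle base URL, which is not documented here.

```python
import json
import urllib.request

def parse_model_ids(payload: dict) -> list:
    """Pull model IDs out of an OpenAI-style /v1/models response body."""
    return [m["id"] for m in payload.get("data", [])]

def list_mantle_models(base_url: str, token: str) -> list:
    """Query the OpenAI-compatible /v1/models endpoint with a bearer token."""
    req = urllib.request.Request(
        f"{base_url}/v1/models",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_model_ids(json.load(resp))
```

An empty result here matches the "returns no models" case in the notes below, where the provider is skipped.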
## Manual configuration
If you prefer explicit config instead of auto-discovery, set the provider, base URL, region, and bearer token explicitly in your OpenClaw config.
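As a sketch only (the field names and base URL below are assumptions for illustration, not the documented OpenClaw schema; check your OpenClaw config reference for the exact shape), an explicit provider entry might look like:

```json
{
  "models": {
    "providers": {
      "amazon-bedrock-mantle": {
        "api": "openai-completions",
        "baseUrl": "https://bedrock-mantle.us-east-1.amazonaws.com/v1",
        "apiKey": "${AWS_BEARER_TOKEN_BEDROCK}"
      }
    }
  }
}
```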
## Notes

- OpenClaw can mint the Mantle bearer token for you from AWS SDK-compatible IAM credentials when `AWS_BEARER_TOKEN_BEDROCK` is not set.
- The bearer token is the same `AWS_BEARER_TOKEN_BEDROCK` used by the standard Amazon Bedrock provider.
- Reasoning support is inferred from model IDs containing patterns like `thinking`, `reasoner`, or `gpt-oss-120b`.
- If the Mantle endpoint is unavailable or returns no models, the provider is silently skipped.
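The ID-based reasoning check amounts to a substring match against those patterns. A minimal sketch (the function name is illustrative, and the example model IDs other than `gpt-oss-120b` are hypothetical):

```python
# Patterns the notes above say mark reasoning-capable models.
REASONING_PATTERNS = ("thinking", "reasoner", "gpt-oss-120b")

def infers_reasoning(model_id: str) -> bool:
    """Return True if the model ID contains any reasoning marker pattern."""
    lowered = model_id.lower()
    return any(pattern in lowered for pattern in REASONING_PATTERNS)
```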