Description
Product
BAML
Problem Statement / Use Case
The current aws-bedrock client works well with AWS credentials, but it does not provide a way to pass guardrail information (identifier and version) so that requests go through Amazon Bedrock Guardrails checks. The modular API could be an alternative, but it does not support streaming. If BAML could natively support passing guardrail information through the LLM client, it would be a great feature, especially for organizations that have enabled guardrails as the protection layer for their LLM requests and responses. I don't want to miss out on the value BAML brings in this AI world because of this small gap. I'd appreciate it if you could add this feature at your earliest convenience.
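For context, here is a minimal sketch of the equivalent raw call today, assuming boto3's bedrock-runtime Converse API; the guardrailConfig block (with hypothetical identifier/version values) is the information I would like to be able to supply through a BAML client:

import boto3

# Hypothetical guardrail values, for illustration only.
GUARDRAIL_ID = "abcdefghijk"
GUARDRAIL_VERSION = "DRAFT"

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# The Converse API accepts a guardrailConfig block; this is exactly the
# information that currently cannot be passed via a BAML aws-bedrock client.
response = client.converse(
    modelId="anthropic.claude-sonnet-4-20250514-v1:0",
    messages=[{"role": "user", "content": [{"text": "Extract the resume ..."}]}],
    guardrailConfig={
        "guardrailIdentifier": GUARDRAIL_ID,
        "guardrailVersion": GUARDRAIL_VERSION,
        "trace": "enabled",
    },
)
print(response["output"]["message"]["content"][0]["text"])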
Proposed Solution
client CustomBedrock {
  provider aws-bedrock
  options {
    model "anthropic.claude-sonnet-4-20250514-v1:0"
    region "us-east-1"
    guardrail_identifier "abcdefghijk"
    guardrail_version "DRAFT"
  }
}
Alternatively, we could pass it to the BAML function via baml_options, for example:
b.ExtractResume("Vaibhav Gupta ....", baml_options={"guardrail_identifier": "abcdefghijk", "guardrail_version": "DRAFT"})
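If a dedicated baml_options key is not feasible, a per-call route through the existing ClientRegistry could also work. A rough sketch, assuming the aws-bedrock provider were extended to accept the proposed guardrail_identifier / guardrail_version options (not supported today):

from baml_py import ClientRegistry
from baml_client import b

cr = ClientRegistry()
# guardrail_identifier / guardrail_version are the proposed options from
# this request; the other options mirror the client definition above.
cr.add_llm_client(
    name="GuardedBedrock",
    provider="aws-bedrock",
    options={
        "model": "anthropic.claude-sonnet-4-20250514-v1:0",
        "region": "us-east-1",
        "guardrail_identifier": "abcdefghijk",
        "guardrail_version": "DRAFT",
    },
)
cr.set_primary("GuardedBedrock")

resume = b.ExtractResume("Vaibhav Gupta ....", baml_options={"client_registry": cr})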
Alternative Solutions
No response
Additional Context
No response