Add CacheConfig with strategy="auto" for BedrockModel to automatically inject cache points at the end of assistant messages in multi-turn conversations.

- Add CacheConfig dataclass in model.py with strategy field
- Add supports_caching property to check Claude model compatibility
- Implement _inject_cache_point() for automatic cache point management
- Export CacheConfig from models/__init__.py

Closes strands-agents#1432
- Add warning when cache_config is enabled but the model doesn't support caching
- Make supports_caching private (_supports_caching)
- Fix log formatting to follow style guide
- Clean up tests and imports
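The pieces listed above can be sketched as follows. This is a hypothetical illustration based only on the names in the PR description (`CacheConfig`, `_supports_caching`, `_inject_cache_point`); the actual field defaults, model-ID check, and `cachePoint` block shape are assumptions, not the real implementation.

```python
# Hypothetical sketch of the CacheConfig feature described in the PR.
# Names follow the PR description; defaults and block shapes are assumptions.
import logging
from dataclasses import dataclass
from typing import Optional

logger = logging.getLogger(__name__)


@dataclass
class CacheConfig:
    """Configuration for automatic prompt caching."""

    strategy: str = "auto"  # "auto" injects cache points automatically


class BedrockModel:
    def __init__(self, model_id: str, cache_config: Optional[CacheConfig] = None):
        self.model_id = model_id
        self.cache_config = cache_config
        # Warn when caching is requested on a model that cannot use it.
        if cache_config and not self._supports_caching:
            logger.warning(
                "cache_config enabled but model %s does not support caching",
                model_id,
            )

    @property
    def _supports_caching(self) -> bool:
        # Assumed heuristic: only Claude models support prompt caching on Bedrock.
        return "claude" in self.model_id

    def _inject_cache_point(self, messages: list) -> None:
        # Append a cachePoint block to the most recent assistant message.
        for message in reversed(messages):
            if message.get("role") == "assistant":
                message["content"].append({"cachePoint": {"type": "default"}})
                break
```

A caller would pass `CacheConfig(strategy="auto")` when constructing the model and let `_inject_cache_point` run before each request.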
/strands review
src/strands/models/bedrock.py
```python
for block_idx, block in reversed(list(enumerate(content))):
    if "cachePoint" in block:
        del content[block_idx]
        logger.warning(
```
Issue: Using logger.warning here seems too aggressive - this is expected behavior when cache_config manages cache points automatically.
Suggestion: Consider using logger.debug instead since stripping existing cache points is the intended behavior in auto mode, not a warning condition.
Issue: The PR description mentions "Integration test with Claude models on Bedrock confirming cache hits" but I couldn't find any integration test in this PR.

Suggestion: If the integration test exists, please ensure it's included in this PR. If it was tested manually, consider documenting how to verify cache hits in the PR description.
Review Summary

Assessment: Comment (Minor changes requested)

Key Themes:
The implementation correctly injects cache points into assistant messages.

Minor Items:
Nice work on this feature! 🎉
Force-pushed from 9e23a68 to cc661af
Summary
`CacheConfig` with `strategy="auto"` for automatic prompt caching in `BedrockModel`

Usage
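A minimal usage sketch is shown below. In the actual package the imports would be `from strands.models import CacheConfig` and `from strands.models.bedrock import BedrockModel` per the PR description; local stand-ins are defined here so the example is self-contained, and the model ID is illustrative.

```python
# Self-contained usage sketch; in the real package, CacheConfig and
# BedrockModel would be imported from strands.models.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CacheConfig:
    strategy: str = "auto"


class BedrockModel:
    def __init__(self, model_id: str, cache_config: Optional[CacheConfig] = None):
        self.model_id = model_id
        self.cache_config = cache_config


# Enable automatic prompt caching by passing a CacheConfig.
model = BedrockModel(
    model_id="anthropic.claude-3-5-sonnet-20240620-v1:0",
    cache_config=CacheConfig(strategy="auto"),
)
```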
Test plan
Closes strands-agents#1432