Conversation

@lukecold
Contributor

📝 Pull Request Template

1. Related Issue

Closes # (issue number)

2. Type of Change (select one)

Type of Change: Bug Fix

3. Description

Previously, the LLM invocation within the compose method was not subject to a specific timeout, which could lead to indefinite blocking if the underlying LLM service failed to respond or was extremely slow.

This change adds max_llm_wait_time, which defaults to 10 minutes, so that the LLM composition process times out after the configured duration. This prevents the agent from being stuck indefinitely and allows for graceful error handling.
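The guard described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the `call_llm` helper is hypothetical, and only the parameter name `max_llm_wait_time` and its 10-minute default come from the PR description. It assumes the LLM call is awaitable and uses `asyncio.wait_for` to bound it:

```python
import asyncio

# Default matches the PR description: 10 minutes, expressed in seconds.
DEFAULT_MAX_LLM_WAIT_TIME = 600.0

async def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for the real LLM client call."""
    await asyncio.sleep(0.01)  # simulate network latency
    return f"response to: {prompt}"

async def compose(prompt: str,
                  max_llm_wait_time: float = DEFAULT_MAX_LLM_WAIT_TIME) -> str:
    try:
        # Bound the LLM invocation so a hung or very slow service
        # cannot block the agent indefinitely.
        return await asyncio.wait_for(call_llm(prompt), timeout=max_llm_wait_time)
    except asyncio.TimeoutError:
        # Graceful handling: raise a clear error instead of hanging forever.
        raise RuntimeError(
            f"LLM did not respond within {max_llm_wait_time} seconds"
        )
```

With this shape, a caller sees either a normal result or a well-defined timeout error after `max_llm_wait_time` seconds, rather than an indefinitely pending call.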

4. Testing

  • I have tested this locally.
  • I have updated or added relevant tests.

5. Checklist

@vcfgv (Collaborator) left a comment

LGTM

@vcfgv vcfgv merged commit f8c70c2 into ValueCell-ai:main Dec 23, 2025
3 checks passed
