
Conversation

@adrianlyjak (Contributor) commented Jan 23, 2026

  • Add Runtime abstract base class with BasicRuntime as default implementation
  • Split context into external, internal, and pre-context modules
  • Replace workflow_registry with workflow_tracker for workflow lifecycle management
  • Remove broker.py in favor of runtime-managed execution
  • Add plugins package with get_current_runtime() for context-aware runtime access
  • Update handler and workflow to use new runtime interfaces (a rough sketch of how these pieces fit together follows below)
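As a rough sketch of how these pieces could fit together (everything below except the names Runtime, BasicRuntime, and get_current_runtime() is an illustrative assumption, not the actual API):

# Illustrative sketch only: method names and signatures beyond the class and
# function names listed above are assumptions, not the real workflows API.
from abc import ABC, abstractmethod


class Runtime(ABC):
    """Owns execution of workflow runs (the role previously played by broker.py)."""

    @abstractmethod
    def register_run(self, run_id: str) -> None:
        """Assumed hook for the workflow_tracker to record a run's lifecycle."""


class BasicRuntime(Runtime):
    """Default in-process, asyncio-based runtime."""

    def register_run(self, run_id: str) -> None:
        # Minimal bookkeeping; a real runtime would also drive the control loop.
        pass


def get_current_runtime() -> Runtime:
    """Return the runtime bound to the current execution context, falling back
    to a shared BasicRuntime (assumed fallback behavior)."""
    return BasicRuntime()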


@changeset-bot commented Jan 23, 2026

🦋 Changeset detected

Latest commit: bc659fd

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 2 packages:

| Name | Type |
| --- | --- |
| llama-index-utils-workflow | Minor |
| llama-index-workflows | Minor |


@coveralls commented Jan 23, 2026

Pull Request Test Coverage Report for Build 21276663443

Details

  • 475 of 748 (63.5%) changed or added relevant lines in 16 files are covered.
  • 248 unchanged lines in 22 files lost coverage.
  • Overall coverage decreased (-0.2%) to 88.469%

| Changes Missing Coverage | Covered Lines | Changed/Added Lines | % |
| --- | --- | --- | --- |
| packages/llama-index-workflows/src/workflows/runtime/control_loop.py | 8 | 9 | 88.89% |
| packages/llama-index-workflows/src/workflows/resource.py | 2 | 4 | 50.0% |
| packages/llama-index-workflows/src/workflows/context/state_store.py | 12 | 15 | 80.0% |
| packages/llama-index-workflows/src/workflows/context/pre_context.py | 24 | 29 | 82.76% |
| packages/llama-index-workflows/src/workflows/workflow.py | 17 | 25 | 68.0% |
| packages/llama-index-workflows/src/workflows/runtime/types/plugin.py | 96 | 109 | 88.07% |
| packages/llama-index-workflows/src/workflows/plugins/basic.py | 113 | 148 | 76.35% |
| packages/llama-index-workflows/src/workflows/context/external_context.py | 42 | 91 | 46.15% |
| packages/llama-index-workflows/src/workflows/handler.py | 47 | 96 | 48.96% |
| packages/llama-index-workflows/src/workflows/context/context.py | 42 | 94 | 44.68% |
| Files with Coverage Reduction | New Missed Lines | % |
| --- | --- | --- |
| packages/llama-index-workflows/src/workflows/context/context.py | 1 | 52.9% |
| packages/llama-index-workflows/src/workflows/runtime/types/plugin.py | 1 | 88.24% |
| packages/llama-index-workflows/src/workflows/runtime/types/step_function.py | 1 | 79.31% |
| tests/server/test_server_endpoints.py | 1 | 99.86% |
| tests/test_handler.py | 1 | 95.65% |
| src/workflows/resource.py | 2 | 98.0% |
| tests/test_streaming.py | 2 | 98.0% |
| tests/server/test_idle_release.py | 4 | 98.28% |
| src/workflows/runtime/types/plugin.py | 5 | 95.8% |
| src/workflows/runtime/types/step_function.py | 6 | 93.1% |
Totals Coverage Status

  • Change from base Build 21150089190: -0.2%
  • Covered Lines: 11777
  • Relevant Lines: 13312

💛 - Coveralls

@devin-ai-integration bot left a comment

Devin Review found 2 potential issues. View issues and 13 additional flags in Devin Review.

@devin-ai-integration bot left a comment

Devin Review found 1 new potential issue. View issue and 18 additional flags in Devin Review.

Comment on lines 140 to 152
async def stream_published_events(self) -> AsyncGenerator[Event, None]:
    async with self._queues.stream_lock:
        if self._queues.complete.done() and self._queues.publish_queue.empty():
            return
        while True:
            item = await self._queues.publish_queue.get()
            yield item
            if isinstance(item, StopEvent):
                break


🟡 WorkflowHandler.stream_events() may be callable multiple times if adapter stream ends without yielding StopEvent

WorkflowHandler.stream_events() enforces single-consumption via _all_events_consumed, but it only flips that flag when it sees a StopEvent (workflows/handler.py:150-163).

With the new asyncio runtime adapter, ExternalAsyncioAdapter.stream_published_events() can terminate without yielding a StopEvent: it returns immediately when the run is complete and the publish queue is empty (workflows/plugins/basic.py:140-148, specifically the early return at workflows/plugins/basic.py:141-143). This happens when another consumer already drained the publish queue (including the StopEvent) and a later caller starts streaming.

Actual behavior:

  • The async generator ends without yielding StopEvent, so _all_events_consumed remains False.
  • A second call to handler.stream_events() will not raise WorkflowRuntimeError and will start another (empty) stream, violating the documented “only be streamed once per handler instance” contract.

Expected behavior:

  • Either the adapter should always yield a terminal StopEvent to late subscribers, or the handler should treat a clean end-of-stream as “consumed” and set _all_events_consumed=True.

Impact: breaks API contract and can confuse server/clients that rely on the one-time streaming invariant, especially in multi-consumer scenarios (e.g., server draining + user code streaming).

Recommendation: Fix at one layer:

  • Adapter-level: for completed runs, yield the final StopEvent (e.g., cache it and yield it to late subscribers), instead of returning with an empty stream; or
  • Handler-level: if the underlying stream terminates without a StopEvent, set _all_events_consumed=True before returning, so subsequent calls raise as documented (see the sketch below).
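A minimal sketch of the handler-level option, assuming the handler iterates the adapter's stream roughly like this (self._adapter and the overall method structure are assumptions; _all_events_consumed, StopEvent, and WorkflowRuntimeError are taken from the review above):

async def stream_events(self) -> AsyncGenerator[Event, None]:
    if self._all_events_consumed:
        raise WorkflowRuntimeError(
            "Events can only be streamed once per handler instance."
        )
    try:
        async for event in self._adapter.stream_published_events():
            yield event
    finally:
        # Mark the stream as consumed even when the adapter's stream ends
        # without a StopEvent (e.g. another consumer already drained the
        # publish queue), so a second call raises as documented.
        self._all_events_consumed = True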

@devin-ai-integration bot left a comment

Devin Review found 3 new potential issues. View issues and 22 additional flags in Devin Review.


🟡 ResourceManager caching treats falsey cached resources as missing and recomputes them

ResourceManager.get() uses elif resource.cache and not self.resources.get(resource.name, None): to decide if a cached resource exists. If a resource factory legitimately returns a falsey value (e.g., 0, False, "", empty list), the manager will repeatedly re-create it instead of returning the cached value.

Code:

elif resource.cache and not self.resources.get(resource.name, None):
    val = await resource.call()
    await self.set(resource.name, val)
else:
    val = self.resources.get(resource.name)

packages/llama-index-workflows/src/workflows/resource.py:234-241

Actual: falsey resources are effectively non-cacheable. Expected: cache presence should be based on key existence, not truthiness.

(Refers to lines 232-241)

Recommendation: Use if resource.name not in self.resources (or self.resources.get(...) is None if None is the only sentinel) to test cache presence.
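For illustration, the branch from the excerpt above could become (same surrounding if/elif structure assumed):

# Decide cache hit/miss by key presence rather than by truthiness of the value.
elif resource.cache and resource.name not in self.resources:
    val = await resource.call()
    await self.set(resource.name, val)
else:
    val = self.resources.get(resource.name)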


@adrianlyjak (Contributor, Author) commented:

Good feedback, but it seems kind of unrelated to the changes here. Addressed elsewhere.

Comment on lines 55 to 65
@property
def is_running(self) -> bool:
    """Whether the workflow is currently running."""
    return self._require_v2_runtime_compatibility().is_running


🔴 Context.is_running can raise on non-v2 runtimes because ExternalContext.is_running requires V2RuntimeCompatibilityShim

Context.is_running delegates to ExternalContext.is_running when the context is in the external face. But ExternalContext.is_running calls _require_v2_runtime_compatibility() and raises WorkflowRuntimeError if the runtime does not implement the deprecated V2RuntimeCompatibilityShim.

Code:

  • ExternalContext.is_running -> self._require_v2_runtime_compatibility().is_running packages/llama-index-workflows/src/workflows/context/external_context.py:56-59
  • Context.is_running uses that for external face packages/llama-index-workflows/src/workflows/context/context.py:159-165

Impact: Any call to ctx.is_running (including internal library usage such as Workflow.run() checking whether it should build a start event) will crash for runtimes that implement the new adapter interfaces but not the v2 shim. Actual: WorkflowRuntimeError thrown from a basic property access. Expected: is_running should be supported for all runtimes, or Workflow.run() should not rely on it.

Recommendation: Make ExternalContext.is_running work without the v2 shim (e.g., infer from get_result_or_none equivalent, or maintain running state in the adapter), or avoid calling ctx.is_running in places that must support non-v2 runtimes.
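One possible shape of a shim-free property, following the first suggestion above (self._runtime and get_result_or_none() are assumed names for the context's runtime adapter and result accessor):

@property
def is_running(self) -> bool:
    """Whether the workflow is currently running."""
    runtime = self._runtime
    if isinstance(runtime, V2RuntimeCompatibilityShim):
        # Keep the existing behavior for runtimes that still provide the shim.
        return runtime.is_running
    # For adapter-only runtimes, infer running state from whether a result
    # has been produced yet (assumes the run has already been started).
    return self.get_result_or_none() is None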


@adrianlyjak (Contributor, Author) commented:

Made this a bit more compatible.
