
Conversation

@ugur-vaadin
Contributor

Description

This PR adds SpringAiLLMProvider, which implements the LLMProvider interface.

  • Supports both streaming and non-streaming chat models (see the sketch after this description)
  • Supports Spring AI Tool annotations
  • Limits chat history to a maximum of 30 messages
  • Limits tool executions to 20 per message to avoid infinite loops
  • Supports image, text, audio, video, and PDF (text content) attachments

Fixes #8469
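
For context, here is a minimal sketch of the Spring AI ChatClient calls a provider like this typically wraps. It is not taken from this PR: only the Spring AI API names (ChatClient, ChatModel, @Tool, Flux) are real, while SpringAiChatSketch and ClockTools are hypothetical illustrations.

```java
// Illustrative sketch only: this is NOT the PR's SpringAiLLMProvider.
// It shows the Spring AI ChatClient calls such a provider would typically
// build on, covering blocking and streaming completions plus @Tool methods.
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.tool.annotation.Tool;
import reactor.core.publisher.Flux;

public class SpringAiChatSketch {

    // Hypothetical example of a @Tool-annotated method the model may invoke.
    static class ClockTools {
        @Tool(description = "Returns the current server time")
        String currentTime() {
            return java.time.ZonedDateTime.now().toString();
        }
    }

    private final ChatClient chatClient;

    public SpringAiChatSketch(ChatModel chatModel) {
        // ChatClient wraps whichever chat model is configured (OpenAI, Ollama, ...).
        this.chatClient = ChatClient.create(chatModel);
    }

    // Non-streaming: blocks and returns the whole completion as a String.
    public String complete(String userMessage) {
        return chatClient.prompt()
                .user(userMessage)
                .tools(new ClockTools()) // register @Tool methods for tool calling
                .call()
                .content();
    }

    // Streaming: returns the completion incrementally as a Flux of text chunks.
    public Flux<String> stream(String userMessage) {
        return chatClient.prompt()
                .user(userMessage)
                .stream()
                .content();
    }
}
```

The non-streaming path returns the full completion in a single call, while the streaming path exposes a reactive Flux that an LLMProvider implementation can forward chunk by chunk to the UI.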

Type of change

  • Bugfix
  • Feature

Checklist

  • I have read the contribution guide: https://vaadin.com/docs/latest/contributing/overview
  • I have added a description following the guideline.
  • The issue is created in the corresponding repository and I have referenced it.
  • I have added tests to ensure my change is effective and works as intended.
  • New and existing tests are passing locally with my change.
  • I have performed self-review and corrected misspellings.
  • Enhancement / new feature was discussed in a corresponding GitHub issue and Acceptance Criteria were created.

@ugur-vaadin force-pushed the feat-add-spring-ai-implementation-for-llmprovider branch from 46b52c3 to 024d1e2 on January 23, 2026 at 11:38
@ugur-vaadin force-pushed the feat-add-spring-ai-implementation-for-llmprovider branch from dac1e6c to 30068dd on January 26, 2026 at 13:19


Development

Successfully merging this pull request may close these issues.

Add SpringAI LLMProvider implementation
