LCORE-1240: Konflux requirements #1068
Changes from all commits
@@ -196,15 +196,15 @@ langdetect==1.0.9 \
 litellm==1.81.3 \
     --hash=sha256:3f60fd8b727587952ad3dd18b68f5fed538d6f43d15bb0356f4c3a11bccb2b92 \
     --hash=sha256:a7688b429a88abfdd02f2a8c3158ebb5385689cfb7f9d4ac1473d018b2047e1b
-llama-stack==0.4.2 \
-    --hash=sha256:38caaed133139c1de8c4ef2d352f562c98d7a2797f97f2e4558015762787b20e \
-    --hash=sha256:f4dbd043704d5e3b382a3fba690536b54baa58ae8ec27dae3ceba8ec7a377427
+llama-stack==0.4.3 \
+    --hash=sha256:423207eae2b640894992a9075ff9dd6300ff904ab06a49fe38cfe0bb809d4669 \
+    --hash=sha256:70d379ae9dbb5b1d0693f14054d9817aba183ffcd805133f0a4442baee132c6d
+llama-stack-api==0.4.3 \
+    --hash=sha256:89cf8eaf1468b50ac426e14d39beb4f93994e0807b857f729765e4af0e9b53a1 \
+    --hash=sha256:f6c3e4d81d27a3332983a44f9295567e375136d4a402084958ed247f5676c1ee
-llama-stack-client==0.4.2 \
-    --hash=sha256:1277bf563531d9bc476e305f2d2bead9900986d426a2e32c9adf4b6a464804c3 \
-    --hash=sha256:d6e1c73391bdc3494fe1fa9ce7575a4749d13d111718e458e874ece544988729
+llama-stack-client==0.4.3 \
+    --hash=sha256:97b8cc5032bad4f0cdd1b0ae992cf44f5554679d315b7c40f46deb358c041f50 \
+    --hash=sha256:cb807be258206e8fedeb5e5ceba7be7108d3badb31d74199406808c3d1679c35
 markupsafe==3.0.3 \
     --hash=sha256:0303439a41979d9e74d18ff5e2dd8c43ed6c6001fd40e5bf2e43f7bd9bbc523f \
     --hash=sha256:068f375c472b3e7acbe2d5318dea141359e6900156b5b2ba06a30b169086b91a \
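Aside on how these pins behave at install time: because every requirement above carries --hash entries, pip installs the file in hash-checking mode and rejects any artifact whose digest is not listed. A minimal sketch, assuming the file being edited here is requirements.hashes.source.txt (the name queried by the review scripts further down); this is illustrative and not necessarily the project's actual install command:

```shell
# Hash-checking install: every requirement (including transitive ones) must be
# pinned with at least one matching sha256, or pip aborts the install.
pip install --require-hashes -r requirements.hashes.source.txt
```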
@@ -702,9 +702,9 @@ rich==14.3.1 \
 semver==3.0.4 \
     --hash=sha256:9c824d87ba7f7ab4a1890799cec8596f15c1241cb473404ea1cb0c55e4b04746 \
     --hash=sha256:afc7d8c584a5ed0a11033af086e8af226a9c0b206f313e0301f8dd7b6b589602
-sentence-transformers==5.2.1 \
-    --hash=sha256:388483e174223958ae71b6888c49a6cc1480306a12655a690b605c7a176162d0 \
-    --hash=sha256:9c676411becbf2a0e7515501788a7f84b99ca2dcc124ce392639bcdda4a256ff
+sentence-transformers==5.2.2 \
+    --hash=sha256:280ac54bffb84c110726b4d8848ba7b7c60813b9034547f8aea6e9a345cd1c23 \
+    --hash=sha256:7033ee0a24bc04c664fd490abf2ef194d387b3a58a97adcc528783ff505159fa
Comment on lines +705 to +707

Contributor

🧩 Analysis chain

🌐 Web query:
💡 Result: Sentence-Transformers v5.2.2 (Jan 27, 2026) — breaking-change relevant notes

Sources: GitHub release notes [1] and the merged PR (

🌐 Web query:

💡 Result:

Sources: [1] PyPI release history, [2] Snyk advisory SNYK-PYTHON-SENTENCETRANSFORMERS-8161344, [3] Safety DB (PVE-2024-73169)

🏁 Script executed:

# Check context around the sentence-transformers line in the requirements file
rg -B5 -A5 'sentence-transformers==5.2.2' requirements.hashes.source.txt | head -30

Repository: lightspeed-core/lightspeed-stack
Length of output: 817

🏁 Script executed:

# Check if there are other versions of sentence-transformers in the file to understand the migration
rg 'sentence-transformers==' requirements.hashes.source.txt

Repository: lightspeed-core/lightspeed-stack
Length of output: 106

Remove or update sentence-transformers==5.2.2 — version does not exist on PyPI yet.

Version 5.2.2 has not been released to PyPI (as of Jan 27, 2026). The latest published version is 5.2.0 (Dec 11, 2025). This pin will fail during installation with "No matching distribution found." Either use the current 5.2.0 release or wait for 5.2.2 to be published.

Note: 5.2.2 will introduce a breaking change—the internal HTTP client changes from
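One way to reproduce this existence check before merging (a sketch against the public PyPI JSON API; the commands are illustrative and not part of this PR):

```shell
# List the most recent published releases of sentence-transformers to confirm
# whether 5.2.2 is actually available on PyPI.
curl -s https://pypi.org/pypi/sentence-transformers/json \
  | jq -r '.releases | keys[]' | sort -V | tail -n 5

# pip can answer the same question directly (recent pip versions).
pip index versions sentence-transformers
```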
 sse-starlette==3.2.0 \
     --hash=sha256:5876954bd51920fc2cd51baee47a080eb88a37b5b784e615abb0b283f801cdbf \
     --hash=sha256:8127594edfb51abe44eac9c49e59b0b01f1039d0c7461c6fd91d4e03b70da422
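The same check can be swept across every pin in the file in one pass; a rough sketch (the file name is assumed from the review scripts, and the grep pattern is only an approximation of the requirements syntax):

```shell
# For each 'name==version' pin, ask PyPI whether that exact version exists.
# HTTP 200 means the release is published; 404 flags pins like llama-stack==0.4.3.
grep -oE '^[A-Za-z0-9._-]+==[^ ]+' requirements.hashes.source.txt |
while read -r pin; do
  pkg=${pin%%==*}
  ver=${pin##*==}
  code=$(curl -s -o /dev/null -w '%{http_code}' "https://pypi.org/pypi/${pkg}/${ver}/json")
  echo "${pin}: HTTP ${code}"
done
```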
🧩 Analysis chain
🏁 Script executed:
Repository: lightspeed-core/lightspeed-stack
Length of output: 1678
🌐 Web query:
llama-stack 0.4.3 release notes changelog

💡 Result:
I couldn’t find any llama-stack 0.4.3 release notes/changelog—there doesn’t appear to be a 0.4.3 release published.
What I can confirm from official channels:
- PyPI (llama-stack): latest published version is 0.4.2, uploaded Jan 16, 2026. [1]
- GitHub (meta-llama/llama-stack): currently shows releases up through v0.3.4 (Dec 3, 2025), not 0.4.x. [2]

If you meant a different package/repo (e.g., llama-stack-client-*), tell me which one and I'll pull the exact 0.4.3 notes (if it exists).

Sources: [1] [2]
🌐 Web query:
llama-stack-client 0.4.3 PyPI release notes

💡 Result:
llama-stack-client 0.4.3 was released on PyPI on Jan 26, 2026, but PyPI does not include any per-version release notes/changelog text for this release (it's just the package description + files). [1]

On GitHub, the corresponding tag v0.4.3 exists and its tag message is simply "Release version 0.4.3"—no additional release notes are shown there either. [2]

Sources: [1] [2]
🌐 Web query:
llama-stack 0.4.3 security advisory vulnerability

💡 Result:
I couldn't find any security advisory specifically for llama-stack version 0.4.3, and I also don't see a published llama-stack 0.4.3 release/tag in the project's GitHub releases (they show up to v0.3.4) or in third-party package trackers that list newer 0.4.x versions (e.g., 0.4.1). [4], [3]

What is publicly documented for llama-stack are these advisories/vulnerabilities:

- < 0.2.20; patched: 0.2.20. [1]
- pickle; upgrade to 0.0.41 or higher. [2]

If you meant a different package (for example llama-stack-client-*) or meant 0.4.0/0.4.1 instead of 0.4.3, tell me the exact artifact name (PyPI/npm/Maven) and I'll narrow it down.

Sources: [1] [2] [3] [4]
Resolve version conflict: llama-stack==0.4.3 does not exist on PyPI.

- llama-stack: the latest published version is 0.4.2 (Jan 16, 2026); 0.4.3 has not been released, and GitHub releases only show v0.3.4 as the latest. This dependency specification will fail during installation. Verify the intended version and update accordingly.
- llama-stack-client==0.4.3 exists (released Jan 26, 2026) but has no available release notes or changelog for impact assessment.
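If the pin is rolled back to the published 0.4.2 release, the hash lines need to be regenerated as well, since hash-checking mode rejects entries whose digests are missing or stale. A possible workflow sketch, assuming the file is produced with pip-tools from an input such as requirements.in (the repo's actual generation command is not shown in this diff, so treat the names here as placeholders):

```shell
# Recompile the hash-pinned file after correcting the version pin;
# --generate-hashes emits the sha256 entries seen in the diff above.
pip-compile --generate-hashes --output-file requirements.hashes.source.txt requirements.in

# Or compute the digest for a single artifact to patch one entry by hand:
pip download 'llama-stack==0.4.2' --no-deps -d /tmp/wheels
pip hash /tmp/wheels/llama_stack-0.4.2-*.whl
```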