Add Intel® OpenVINO Whisper large v3 turbo model #243
Open
Anirudh-Swaminathan wants to merge 1 commit into microsoft:main
Conversation
Anirudh-Swaminathan commented on Feb 18, 2026
- Introduced audio_processor_config_default.json for audio feature extraction configuration.
- Created convert_whisper_to_ovir.py for converting Whisper models to OpenVINO format using Olive (a hedged driver sketch appears below this list).
- Added info.yml to define model architecture and recipes for OpenVINO execution.
- Implemented whisper_large_v3_turbo_default_ov_npu.json for model conversion configuration targeting NPU.
- Developed whisper_large_v3_turbo_encapsulate.json for encapsulating the converted models with necessary settings.
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
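For orientation, a driver along these lines could chain the two Olive configs added in this PR. The config file names match the PR, but the invocation below is a hedged sketch under the assumption that the script uses Olive's Python workflow entry point, not a copy of the actual convert_whisper_to_ovir.py:

```python
# Hedged sketch of a driver that chains the two Olive configs from this PR.
# olive.workflows.run is Olive's documented Python entry point; the config
# file names match the PR, but everything else here is illustrative only.
from olive.workflows import run as olive_run

# Step 1: Optimum export of openai/whisper-large-v3-turbo to OpenVINO IR,
# targeting the Intel NPU.
olive_run("whisper_large_v3_turbo_default_ov_npu.json")

# Step 2: Encapsulation pass that wraps the converted models with the
# genai/provider-option settings needed at runtime.
olive_run("whisper_large_v3_turbo_encapsulate.json")
```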
Pull request overview
Adds an Intel® OpenVINO NPU conversion + encapsulation recipe for openai/whisper-large-v3-turbo, including an Olive-based conversion script, model metadata, and audio feature-extraction configuration. Also updates several OpenVINO-related requirement pins and bumps Intel NPU toolchain versions.
Changes:
- Add OpenVINO NPU Olive recipe configs + metadata (info.yml) for Whisper large v3 turbo.
- Add convert_whisper_to_ovir.py helper to run conversion + encapsulation and post-process genai_config.json (see the sketch after this list).
- Update OpenVINO/Intel NPU dependency versions and loosen some model-specific OpenVINO requirements.
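The genai_config.json post-processing step mentioned above might look roughly like the following. The file location and the keys being patched are assumptions for illustration only and are not taken from the PR's script:

```python
# Hedged sketch: patch genai_config.json after encapsulation.
# The path and the specific keys touched here are assumptions; the PR's
# script may adjust different fields.
import json
from pathlib import Path

config_path = Path("model/whisper_large_v3_turbo/genai_config.json")  # assumed location
config = json.loads(config_path.read_text(encoding="utf-8"))

# Example adjustment: point the decoder session at OpenVINO NPU provider
# options produced by the encapsulation pass (key names are illustrative).
config.setdefault("model", {}).setdefault("decoder", {})["session_options"] = {
    "provider_options": [{"OpenVINO": {"device_type": "NPU"}}]
}

config_path.write_text(json.dumps(config, indent=4), encoding="utf-8")
```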
Reviewed changes
Copilot reviewed 11 out of 11 changed files in this pull request and generated 5 comments.
Summary per file:
| File | Description |
|---|---|
| openai-whisper-large-v3-turbo/OpenVINO/whisper_large_v3_turbo_encapsulate.json | Adds encapsulation pass config for OpenVINO NPU (genai override + provider options). |
| openai-whisper-large-v3-turbo/OpenVINO/whisper_large_v3_turbo_default_ov_npu.json | Adds Olive Optimum conversion + IO update config targeting NPU. |
| openai-whisper-large-v3-turbo/OpenVINO/info.yml | Registers the OpenVINO recipe for discovery/automation. |
| openai-whisper-large-v3-turbo/OpenVINO/convert_whisper_to_ovir.py | New conversion/encapsulation driver script with optional reshape and NPU weight sharing. |
| openai-whisper-large-v3-turbo/OpenVINO/audio_processor_config_default.json | Adds default audio feature-extraction pipeline configuration (see the sketch after this table). |
| openai-whisper-large-v3-turbo/OpenVINO/README.md | Documents usage and expected output artifacts for the OpenVINO pipeline. |
| microsoft-Phi-4/OpenVINO/requirements.txt | Loosens Optimum-Intel and Transformers requirements. |
| microsoft-Phi-4-reasoning/OpenVINO/requirements.txt | Loosens Optimum-Intel and Transformers requirements. |
| microsoft-Phi-4-mini-reasoning/OpenVINO/requirements.txt | Loosens Optimum-Intel and Transformers requirements. |
| microsoft-Phi-4-mini-instruct/OpenVINO/requirements.txt | Loosens Optimum-Intel and Transformers requirements. |
| .aitk/requirements/requirements-IntelNPU.txt | Bumps OpenVINO/Olive/ONNX/ORT stack versions for Intel NPU environment. |
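As context for the audio_processor_config_default.json entry above, the default Whisper feature-extraction parameters can be dumped from the Hugging Face feature extractor. This is only a sketch of how such a config could be produced; the PR's file may have been authored differently:

```python
# Hedged sketch: serialize the default feature-extraction parameters for
# openai/whisper-large-v3-turbo (sampling rate, mel bins, hop length, ...)
# to a JSON file. Whether the PR's config was generated this way is an
# assumption; only the output file name matches the PR.
from transformers import WhisperFeatureExtractor

extractor = WhisperFeatureExtractor.from_pretrained("openai/whisper-large-v3-turbo")
print(extractor.sampling_rate, extractor.feature_size, extractor.hop_length)
extractor.to_json_file("audio_processor_config_default.json")
```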
Comments suppressed due to low confidence (1)
openai-whisper-large-v3-turbo/OpenVINO/convert_whisper_to_ovir.py:7
`sys` is imported but never used in this script. Please remove the unused import to avoid lint warnings and keep dependencies clear.

```python
import sys
import shutil
```
Resolved review comments on openai-whisper-large-v3-turbo/OpenVINO/whisper_large_v3_turbo_encapsulate.json (content collapsed).