
Revert "[Bugfix] Support compile for Transformers multimodal (#23095)" Test#17

Open
MitchLewis930 wants to merge 1 commit into bench/PR_003_base from bench/PR_003_bug__CodeRabbit

Conversation

MitchLewis930 (Collaborator) commented on Jan 21, 2026

This reverts commit 0e3bb54.

PLEASE FILL IN THE PR DESCRIPTION HERE ENSURING ALL CHECKLIST ITEMS (AT THE BOTTOM) HAVE BEEN CONSIDERED.

Purpose

Test Plan

Test Result


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as the test command used.
  • The test results, such as a before/after comparison or e2e results.
  • (Optional) Any necessary documentation updates, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft in the Google Doc.

BEFORE SUBMITTING, PLEASE READ https://docs.vllm.ai/en/latest/contributing (anything written below this line will be removed by GitHub Actions)

Summary by CodeRabbit

  • Chores
    • Removed Torch compile support for multimodal language models.


coderabbitai bot commented on Jan 21, 2026

📝 Walkthrough

Walkthrough

The @support_torch_compile(...) decorator was removed from the TransformersForMultimodalLM class in the transformers model executor file, eliminating Torch compile support and dynamic argument dimension configuration. The class maintains its existing inheritance and functionality.
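The effect described above can be illustrated with a toy stand-in. The decorator below (`support_torch_compile_toy`) is hypothetical and only mimics the shape of vLLM's real `support_torch_compile` decorator; it shows why removing a class decorator of this kind drops the compile-related configuration while leaving the class's inheritance and behavior intact:

```python
# Hypothetical stand-in for a class decorator like @support_torch_compile.
# It only attaches compile metadata and returns the same class object,
# so removing it does not change the class's methods or inheritance.

def support_torch_compile_toy(dynamic_arg_dims=None):
    """Toy decorator: records compile metadata on the decorated class."""
    def wrap(cls):
        cls._compiled = True
        cls._dynamic_arg_dims = dynamic_arg_dims or {}
        return cls  # class identity is preserved
    return wrap

class LanguageModelBase:
    def forward(self, x):
        return [v * 2 for v in x]

# Before the revert: decorator applied, compile metadata present.
@support_torch_compile_toy(dynamic_arg_dims={"input_ids": 0})
class CompiledLM(LanguageModelBase):
    pass

# After the revert: decorator removed; forward() behaves identically,
# only the compile metadata is gone.
class PlainLM(LanguageModelBase):
    pass

print(CompiledLM().forward([1, 2]))   # [2, 4]
print(PlainLM().forward([1, 2]))      # [2, 4]
print(hasattr(PlainLM, "_compiled"))  # False
```

This mirrors why the change is reviewable as trivial: the revert only deletes the decorator line, so runtime behavior of the class body is unchanged.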

Changes

Decorator Removal — vllm/model_executor/models/transformers.py
Removed the @support_torch_compile(...) decorator from the TransformersForMultimodalLM class, eliminating Torch compile support and dynamic argument dimension configuration.

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~3 minutes

Poem

🐰 A decorator departs with grace,
Torch compile leaves this special place,
Yet multimodal still stands tall,
Simpler now, yet serving all!
Hopping onward, clean and bright,
This refactor feels just right!

🚥 Pre-merge checks | ✅ 3 passed
  • Description Check — ✅ Passed: check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check — ✅ Passed: the title accurately describes the main change, reverting a specific bugfix commit for Torch compile support in the Transformers multimodal LM.
  • Docstring Coverage — ✅ Passed: no functions found in the changed files to evaluate, so the docstring coverage check was skipped.



📜 Recent review details

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 0e3bb54 and 8108871.

📒 Files selected for processing (1)
  • vllm/model_executor/models/transformers.py
💤 Files with no reviewable changes (1)
  • vllm/model_executor/models/transformers.py



