Refactor(handler part 15): extract training preset switching into dedicated mixin #620
Conversation
📝 Walkthrough

Extracts training-preset logic from AceStepHandler into a new exported TrainingPresetMixin.
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
🚥 Pre-merge checks: ✅ 3 passed
Conflicts need to be resolved.
Force-pushed from 788fc13 to 4369fdc.
🧹 Nitpick comments (1)
acestep/core/generation/handler/training_preset_test.py (1)
10-35: Consider documenting why custom module loading is used.

The custom module loading approach isolates the test from heavyweight package imports (e.g., torch, torchaudio). A brief inline comment explaining this rationale would help future maintainers understand this pattern.
📝 Suggested documentation addition
```diff
 def _load_training_preset_module():
-    """Load training_preset module directly from file to avoid package side effects."""
+    """Load training_preset module directly from file to avoid package side effects.
+
+    This bypasses the heavyweight acestep package imports (torch, torchaudio, etc.)
+    allowing fast, isolated unit tests without GPU or audio library dependencies.
+    """
     repo_root = Path(__file__).resolve().parents[4]
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `acestep/core/generation/handler/training_preset_test.py` around lines 10-35, add a brief inline comment at the top of the `_load_training_preset_module` function explaining that it manually constructs package entries and loads training_preset.py to avoid importing the full acestep package graph (which pulls heavyweight dependencies like torch/torchaudio) so tests remain fast and isolated; mention that this prevents package side effects and ensures the test only loads the target module.
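For context on the pattern under discussion, here is a minimal sketch of direct-from-file module loading with the stdlib `importlib` machinery. The helper name and path mirror the review comment, but the body is an illustration, not the PR's actual code:

```python
# Illustrative sketch: load one module straight from its file, skipping the
# package __init__ chain (and therefore torch/torchaudio). The repo-relative
# path and helper name are assumptions based on the review comment above.
import importlib.util
import sys
from pathlib import Path


def _load_training_preset_module():
    """Load training_preset directly from file to avoid package side effects."""
    repo_root = Path(__file__).resolve().parents[4]
    target = repo_root / "acestep" / "core" / "generation" / "handler" / "training_preset.py"
    spec = importlib.util.spec_from_file_location("training_preset", target)
    module = importlib.util.module_from_spec(spec)
    sys.modules[spec.name] = module  # register before exec so self-references resolve
    spec.loader.exec_module(module)
    return module
```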
@ChuxiJ Conflict resolved.
Summary
Extract `switch_to_training_preset` from `AceStepHandler` into a dedicated `TrainingPresetMixin` to continue handler decomposition while preserving behavior.

What Changed
- New `TrainingPresetMixin`: `acestep/core/generation/handler/training_preset.py`.
- Exports updated in `acestep/core/generation/handler/__init__.py` and `acestep/handler.py`.
- Removed the `switch_to_training_preset` implementation from `AceStepHandler`.
- New test: `acestep/core/generation/handler/training_preset_test.py`.
- The mixin now pulls `use_mlx_dit` from cached init params.

Behavioral Notes
- `AceStepHandler.switch_to_training_preset(...)` remains available via mixin inheritance.
- `switch_to_training_preset` now explicitly forwards:
  - `prefer_source` (existing behavior)
  - `use_mlx_dit` (regression fix for MLX users)
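A minimal sketch of the shape this describes: `TrainingPresetMixin` and `switch_to_training_preset` come from the PR, while the internal names (`_cached_init_params`, `_load_pipeline`) are placeholders, not the actual implementation:

```python
# Sketch only: internal attribute and method names are hypothetical.
class TrainingPresetMixin:
    """Carries switch_to_training_preset, split out of AceStepHandler."""

    def switch_to_training_preset(self, preset_name, prefer_source=None):
        # Read use_mlx_dit from the init params cached at construction time,
        # so switching presets no longer drops the MLX DiT setting.
        use_mlx_dit = self._cached_init_params.get("use_mlx_dit", False)
        return self._load_pipeline(
            preset_name,
            prefer_source=prefer_source,  # existing behavior
            use_mlx_dit=use_mlx_dit,      # regression fix for MLX users
        )


class AceStepHandler(TrainingPresetMixin):
    # switch_to_training_preset(...) remains available via mixin inheritance.
    ...
```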
Tests

- `python acestep/core/generation/handler/training_preset_test.py` (6 tests passing)
- Also attempted `python acestep/core/generation/handler/init_service_test.py`, but this environment lacks `torchaudio` (import failure unrelated to this PR logic).

Scope / Risk
- Behavior-preserving refactor; the only intended behavioral change is the `use_mlx_dit` passthrough on preset switching.