
Distribute PathoROB feature extraction across GPUs #6

Open
clemsgrs wants to merge 1 commit into main from feature/distributed-pathorob-extraction

Conversation


@clemsgrs clemsgrs commented Feb 6, 2026

Summary

  • Distributed feature extraction for PathoROB: all GPUs now participate via DistributedSampler + extract_multiple_features (with all_gather), matching the existing pattern in StandardProbePlugin. Previously, N-1 GPUs sat idle during PathoROB evaluation. A sketch of the pattern follows this list.
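
As a rough illustration of the pattern (not the plugin's actual code: the model, dataset, batch size, and the assumption that the dataset yields (image, index) pairs are placeholders), each rank extracts features for its own shard via DistributedSampler, and the shards are then combined with all_gather:

```python
import torch
import torch.distributed as dist
from torch.utils.data import DataLoader, DistributedSampler


@torch.no_grad()
def extract_features_distributed(model, dataset, batch_size=64, device="cuda"):
    # Each rank gets a disjoint shard of the dataset; shuffle=False keeps a
    # deterministic order within each shard.
    sampler = DistributedSampler(dataset, shuffle=False)
    loader = DataLoader(dataset, batch_size=batch_size, sampler=sampler, num_workers=4)

    model.eval()
    local_feats, local_idx = [], []
    for images, indices in loader:  # assumes the dataset yields (image, index) pairs
        local_feats.append(model(images.to(device, non_blocking=True)))
        local_idx.append(indices.to(device))
    local_feats = torch.cat(local_feats)
    local_idx = torch.cat(local_idx)

    # Gather every rank's shard so each process ends up with the full feature set.
    world_size = dist.get_world_size()
    feat_buckets = [torch.empty_like(local_feats) for _ in range(world_size)]
    idx_buckets = [torch.empty_like(local_idx) for _ in range(world_size)]
    dist.all_gather(feat_buckets, local_feats)
    dist.all_gather(idx_buckets, local_idx)
    feats, idx = torch.cat(feat_buckets), torch.cat(idx_buckets)

    # DistributedSampler pads shards to equal length, so drop duplicate indices.
    order = idx.argsort()
    feats, idx = feats[order], idx[order]
    keep = torch.ones_like(idx, dtype=torch.bool)
    keep[1:] = idx[1:] != idx[:-1]
    return feats[keep], idx[keep]
```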

Test plan

  • Single-GPU training with PathoROB enabled — should produce identical metrics to current behavior
  • Multi-GPU training — all GPUs should participate in feature extraction (visible via GPU utilization), and the metrics should match the single-GPU run

🤖 Generated with Claude Code

@clemsgrs clemsgrs changed the base branch from master to main on February 6, 2026 at 14:27
PathoROB was running feature extraction on a single GPU while N-1 GPUs
sat idle. Switch to the same distributed pattern as StandardProbePlugin:
DistributedSampler + extract_multiple_features with all_gather. Metric
computation (RI, APD, clustering) stays main-process-only.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
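
The main-process-only metric step described in the commit message would look roughly like the following gate; compute_pathorob_metrics and log_metrics are hypothetical placeholders, not functions from this repository:

```python
import torch.distributed as dist

# Features come from the distributed extraction sketched earlier in this PR.
features, indices = extract_features_distributed(model, dataset)

is_main = (not dist.is_initialized()) or dist.get_rank() == 0
if is_main:
    # Only rank 0 computes RI, APD, and the clustering-based metrics.
    metrics = compute_pathorob_metrics(features, indices)  # hypothetical helper
    log_metrics(metrics)  # hypothetical helper
if dist.is_initialized():
    dist.barrier()  # keep the other ranks in sync before training resumes
```
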
@clemsgrs clemsgrs force-pushed the feature/distributed-pathorob-extraction branch from 8dba99d to 406c7b1 on February 6, 2026 at 14:36