Release 1.7.0 - Major changes with quantms-rescoring #625
Conversation
Increase dev version
fixing of bug in msstats_tmt.R (Issue: Error in the msstats_tmt.R function parse_contrasts #577)
Pass correct flag to ProteinQuantifier
fixed rescoring range bugs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
update quantms-utils 0.0.24 & pmultiqc 0.0.39
Co-authored-by: ypriverol <52113+ypriverol@users.noreply.github.com>
Fix trailing whitespace and code formatting
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
bug fix when parameter model_dir is wrongly provided
Error of quantms-rescoring process with AWS
Co-authored-by: jpfeuffer <8102638+jpfeuffer@users.noreply.github.com>
Co-authored-by: ypriverol <52113+ypriverol@users.noreply.github.com>
Add support for .dia file format and improve file format documentation
Walkthrough
This pull request refactors the MS2 rescoring feature namespace from the old ms2rescore/rescore_* parameters to the new ms2features_* namespace.
Changes
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Areas requiring extra attention:
Pre-merge checks and finishing touches: ✅ Passed checks (3 passed)
Actionable comments posted: 16
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
subworkflows/local/file_preparation/main.nf (1)
115-123: Potential issue: .dia files may be lost when peak picking is enabled.
When params.openms_peakpicking is true, ch_results is overwritten with only OPENMS_PEAK_PICKER.out.mzmls_picked (line 122), which would lose any .dia files that were previously mixed in at line 106.
🔎 Suggested fix to preserve .dia files
  if (params.openms_peakpicking) {
      // If the peak picker is enabled, it will over-write not bypass the .d files
      OPENMS_PEAK_PICKER ( indexed_mzml_bundle )
      ch_versions = ch_versions.mix(OPENMS_PEAK_PICKER.out.versions)
-     ch_results = OPENMS_PEAK_PICKER.out.mzmls_picked
+     ch_results = OPENMS_PEAK_PICKER.out.mzmls_picked.mix(ch_branched_input.dia)
  }
subworkflows/local/dda_id/main.nf (1)
113-113: Critical: Inconsistent parameter name will break by_project rescoring.
Line 113 still uses the old parameter name params.rescore_range instead of the new params.ms2features_range. This inconsistency will cause the by_project rescoring branch to never execute, as the condition will always evaluate to false when using the new parameter namespace.
🔎 Required fix for parameter name
- } else if (params.rescore_range == "by_project"){
+ } else if (params.ms2features_range == "by_project"){
nextflow.config (1)
265-269: Duplicate config_profile_description definition.
config_profile_description is defined on both line 265 and line 269. The second assignment will silently overwrite the first. Remove the duplicate.
🔎 Proposed fix
  config_profile_name        = null
  config_profile_description = null
  custom_config_version      = 'master'
  custom_config_base         = "https://raw.githubusercontent.com/nf-core/configs/${params.custom_config_version}"
- config_profile_description = null
  config_profile_contact     = null
  config_profile_url         = null
🧹 Nitpick comments (12)
.gitignore (1)
13-17: Minor formatting convention improvements.
The ignore patterns are appropriate and syntactically correct. Consider these optional style refinements:
- Line 15: Add a space after the # in the comment for consistency with gitignore conventions: # Ignore cursor AI rules
- Lines 13-14: Two consecutive blank lines are slightly unusual; a single blank line before the comment is typically sufficient for readability.
These are cosmetic preferences and do not affect functionality.
modules/local/utils/extract_sample/main.nf (1)
26-26: Error suppression may complicate debugging.
Redirecting stderr to /dev/null will silence any error messages from parse_sdrf --version. While version commands typically don't produce critical errors, this change could make troubleshooting harder if the command fails.
.devcontainer/setup.sh (1)
10-10: Consider version constraints for Nextflow update.
The nextflow self-update command will fetch the latest Nextflow version, which may not align with the specific version (25.04.0) mentioned in the PR. Consider pinning to a specific version or documenting the expected version range.
🔎 Suggested version-pinned approach
-# Update Nextflow
-nextflow self-update
+# Update Nextflow to specific version
+NXF_VER=25.04.0 nextflow -version || curl -s https://get.nextflow.io | bash -s -- -v 25.04.0
Or add error handling if keeping self-update:
-# Update Nextflow
-nextflow self-update
+# Update Nextflow (if installed)
+if command -v nextflow &> /dev/null; then
+    nextflow self-update
+else
+    echo "Warning: Nextflow not found in PATH"
+fi
subworkflows/local/psm_fdr_control/main.nf (1)
6-6: LGTM! Redundant alias removed.
The change simplifies the import by removing the unnecessary as ID_FILTER alias, improving code clarity without any functional impact.
modules/local/openms/isobaric_analyzer/main.nf (1)
43-49: Consider adding a comment explaining the matrix value count logic.
The conditional determines the number of correction values based on labelling type. A brief comment explaining why tmt18plex/tmt16plex use 8 values while others use 4 would improve maintainability.
🔎 Suggested documentation addition
  def values = line.split('/')
+ // tmt18plex and tmt16plex use extended 8-value correction matrices,
+ // while other labelling types use the standard 4-value format
  // Handle different labelling types
  if (meta.labelling_type == 'tmt18plex' || meta.labelling_type == 'tmt16plex') {
20-20: Variable name is misleading after logic change.
The variable disable_pmultiqc now adds --quantms_plugin when pmultiqc is enabled (and mzTab export is on), which is the opposite of what the name suggests. Consider renaming to quantms_plugin_flag or similar for clarity.
🔎 Suggested rename for clarity
- def disable_pmultiqc = (params.enable_pmultiqc) && (params.export_mztab) ? "--quantms_plugin" : ""
+ def quantms_plugin_flag = (params.enable_pmultiqc) && (params.export_mztab) ? "--quantms_plugin" : ""
And update the usage on line 36:
- ${disable_pmultiqc} \
+ ${quantms_plugin_flag} \
subworkflows/local/utils_nfcore_quantms_pipeline/meta.yml (1)
2-3: Minor: Name and description mismatch.
The subworkflow is named pipeline_completion, but the description refers to "utils in the nf-core quantms pipeline." Consider updating the description to better reflect the pipeline completion functionality, e.g., "Subworkflow for pipeline completion handling in the nf-core quantms pipeline."
subworkflows/local/create_input_channel/main.nf (1)
98-99: Consider using Nextflow's error() instead of exit(1).
Using exit(1, ...) directly may not integrate well with Nextflow's error handling and reporting. Consider using Nextflow's built-in error() function for consistent pipeline error behavior.
🔎 Suggested change
- exit(1, "ERROR: Please check input file -> File Uri does not exist!\n${filestr}") + error("ERROR: Please check input file -> File Uri does not exist!\n${filestr}")Apply similar changes to other
exit(1)calls throughout the file.modules/local/msstats/msstats_tmt/main.nf (1)
15-15: Correct syntax for optional output parameter.The output declaration has been updated to use proper named parameter syntax (
optional: trueinstead ofoptional true), which aligns with Nextflow DSL2 best practices.Consider verifying that this pattern is consistently applied across all module files:
#!/bin/bash # Find any remaining uses of old optional syntax rg -n 'path\s+"[^"]+"\s+optional\s+true' -g '*.nf'modules/nf-core/multiqc/tests/main.nf.test (1)
39-63: Consider adding version snapshot assertion for consistency.The custom prefix test case omits the version snapshot assertion that's present in other test cases (lines 33 and 85). If this is intentional (e.g., version output is identical to the basic test), consider adding a brief comment explaining why. Otherwise, adding a version assertion improves test coverage consistency.
🔎 Proposed addition
then { assertAll( { assert process.success }, { assert process.out.report[0] ==~ ".*/custom_prefix.html" }, - { assert process.out.data[0] ==~ ".*/custom_prefix_data" } + { assert process.out.data[0] ==~ ".*/custom_prefix_data" }, + { assert snapshot(process.out.versions).match("multiqc_versions_prefix") } ) }workflows/quantms.nf (2)
88-93: Verify exit() syntax in Nextflow DSL2.
The exit(1, 'message') syntax is Groovy-compatible, but in Nextflow workflows consider using error(), which is the idiomatic way to abort with a message.
🔎 Suggested change
  if (params.database) {
      ch_db_for_decoy_creation = Channel.from(file(params.database, checkIfExists: true))
  } else {
-     exit(1, 'No protein database provided')
+     error('No protein database provided')
  }
96-101: Complex channel pipeline may be simplified.
This pipeline combines ISO and LFQ configs, takes the first, combines with the database, then extracts the last element. The intent appears to be ensuring the database channel only emits when there's at least one DDA config. Consider adding a comment explaining the purpose.
🔎 Suggested clarification
+ // Only emit database when there's at least one DDA (ISO or LFQ) configuration
+ // This ensures decoy generation only runs for DDA workflows
  CREATE_INPUT_CHANNEL.out.ch_meta_config_iso.mix(
      CREATE_INPUT_CHANNEL.out.ch_meta_config_lfq
  ).first()
  | combine(ch_db_for_decoy_creation)
  | map { it[-1] }
  | set { ch_db_for_decoy_creation_or_null }
📜 Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
- assets/nf-core-quantms_logo_light.png is excluded by !**/*.png
- modules/nf-core/multiqc/tests/main.nf.test.snap is excluded by !**/*.snap
📒 Files selected for processing (81)
.devcontainer/setup.sh(1 hunks).editorconfig(0 hunks).github/actions/nf-test/action.yml(2 hunks).github/workflows/ci.yml(1 hunks).github/workflows/clean-up.yml(1 hunks).github/workflows/diann_private.yml(1 hunks).github/workflows/extended_ci.yml(2 hunks).github/workflows/fix_linting.yml(4 hunks).github/workflows/linting.yml(3 hunks).github/workflows/linting_comment.yml(1 hunks).github/workflows/template-version-comment.yml(1 hunks).gitignore(1 hunks).gitpod.yml(0 hunks).nf-core.yml(1 hunks).pre-commit-config.yaml(1 hunks).prettierignore(1 hunks)CHANGELOG.md(1 hunks)README.md(2 hunks)assets/methods_description_template.yml(0 hunks)assets/multiqc_config.yml(1 hunks)assets/schema_input.json(1 hunks)assets/tmt18plex_default_correction.matrix(1 hunks)bin/msstats_tmt.R(18 hunks)conf/modules/modules.config(1 hunks)conf/tests/test_dda_id_alphapeptdeep.config(2 hunks)conf/tests/test_dda_id_ms2pip.config(1 hunks)docs/output.md(1 hunks)docs/usage.md(2 hunks)main.nf(1 hunks)modules.json(2 hunks)modules/local/diann/convert_results/main.nf(2 hunks)modules/local/diann/generate_cfg/main.nf(1 hunks)modules/local/msstats/msstats_lfq/main.nf(1 hunks)modules/local/msstats/msstats_tmt/main.nf(1 hunks)modules/local/openms/id_merger/main.nf(1 hunks)modules/local/openms/id_ripper/main.nf(1 hunks)modules/local/openms/isobaric_analyzer/main.nf(1 hunks)modules/local/openms/protein_quantifier/main.nf(1 hunks)modules/local/openms/proteomicslfq/main.nf(1 hunks)modules/local/pmultiqc/main.nf(3 hunks)modules/local/preprocess_expdesign/main.nf(2 hunks)modules/local/samplesheet_check/main.nf(1 hunks)modules/local/sdrf_parsing/main.nf(2 hunks)modules/local/thermorawfileparser/main.nf(1 hunks)modules/local/utils/extract_sample/main.nf(2 hunks)modules/local/utils/msrescore_features/main.nf(2 hunks)modules/local/utils/mzml_statistics/main.nf(1 hunks)modules/local/utils/psm_clean/main.nf(0 hunks)modules/local/utils/psm_conversion/main.nf(1 hunks)modules/local/utils/spectrum_features/main.nf(0 hunks)modules/nf-core/custom/dumpsoftwareversions/templates/dumpsoftwareversions.py(1 hunks)modules/nf-core/multiqc/environment.yml(1 hunks)modules/nf-core/multiqc/main.nf(1 hunks)modules/nf-core/multiqc/meta.yml(1 hunks)modules/nf-core/multiqc/tests/custom_prefix.config(1 hunks)modules/nf-core/multiqc/tests/main.nf.test(1 hunks)modules/nf-core/multiqc/tests/nextflow.config(1 hunks)nextflow.config(7 hunks)nextflow_schema.json(20 hunks)ro-crate-metadata.json(2 hunks)subworkflows/local/create_input_channel/main.nf(2 hunks)subworkflows/local/dda_id/main.nf(7 hunks)subworkflows/local/file_preparation/main.nf(2 hunks)subworkflows/local/id/main.nf(2 hunks)subworkflows/local/psm_fdr_control/main.nf(1 hunks)subworkflows/local/psm_fdr_control/meta.yml(1 hunks)subworkflows/local/psm_rescoring/main.nf(4 hunks)subworkflows/local/utils_nfcore_quantms_pipeline/main.nf(1 hunks)subworkflows/local/utils_nfcore_quantms_pipeline/meta.yml(1 hunks)subworkflows/nf-core/utils_nextflow_pipeline/tests/tags.yml(0 hunks)subworkflows/nf-core/utils_nfcore_pipeline/main.nf(1 hunks)subworkflows/nf-core/utils_nfcore_pipeline/tests/tags.yml(0 hunks)subworkflows/nf-core/utils_nfschema_plugin/main.nf(2 hunks)subworkflows/nf-core/utils_nfschema_plugin/tests/main.nf.test(5 hunks)subworkflows/nf-core/utils_nfschema_plugin/tests/nextflow.config(1 hunks)tests/.nftignore(1 hunks)tests/default.nf.test(0 hunks)workflows/dia.nf(7 hunks)workflows/lfq.nf(3 hunks)workflows/quantms.nf(4 hunks)workflows/tmt.nf(2 hunks)
💤 Files with no reviewable changes (8)
- modules/local/utils/psm_clean/main.nf
- subworkflows/nf-core/utils_nextflow_pipeline/tests/tags.yml
- assets/methods_description_template.yml
- modules/local/utils/spectrum_features/main.nf
- tests/default.nf.test
- .gitpod.yml
- .editorconfig
- subworkflows/nf-core/utils_nfcore_pipeline/tests/tags.yml
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-08-14T09:53:36.777Z
Learnt from: enryH
Repo: bigbio/quantms PR: 574
File: .github/actions/get-shards/action.yml:31-41
Timestamp: 2025-08-14T09:53:36.777Z
Learning: The .github/actions/get-shards/ and .github/actions/nf-test/ actions in quantms are part of the nf-core template (introduced in template version 3.3.2) for nf-test framework integration, not custom pipeline code.
Applied to files:
- .github/workflows/fix_linting.yml
- .github/workflows/linting.yml
- .github/workflows/extended_ci.yml
- .github/actions/nf-test/action.yml
- .nf-core.yml
- README.md
- main.nf
- ro-crate-metadata.json
🪛 LanguageTool
docs/output.md
[uncategorized] ~258-~258: Do not mix variants of the same word (‘labelling’ and ‘labeling’) within a single text.
Context: ...rs, the SDRF file has precedence. > > - labelling_type > - dissociationmethod > - fixedmo...
(EN_EXACT_COHERENCY_RULE)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (20)
- GitHub Check: Run pipeline with test data (25.04.0, test_localize, docker)
- GitHub Check: Run pipeline with test data (25.04.0, test_dda_id_ms2pip, docker)
- GitHub Check: Run pipeline with test data (25.04.0, test_lfq, docker)
- GitHub Check: Run pipeline with test data (25.04.0, test_dia, docker)
- GitHub Check: Run pipeline with test data (25.04.0, test_tmt, docker)
- GitHub Check: Run pipeline with test data (25.04.0, test_dda_id_alphapeptdeep, docker)
- GitHub Check: Docker Tests (latest-everything, test_dda_id_alphapeptdeep)
- GitHub Check: Docker Tests (25.04.0, test_lfq)
- GitHub Check: Docker Tests (25.04.0, test_tmt_corr)
- GitHub Check: Docker Tests (latest-everything, test_lfq)
- GitHub Check: Docker Tests (latest-everything, test_localize)
- GitHub Check: Docker Tests (latest-everything, test_tmt_corr)
- GitHub Check: Docker Tests (latest-everything, test_dda_id_ms2pip)
- GitHub Check: Docker Tests (latest-everything, test_tmt)
- GitHub Check: Docker Tests (latest-everything, test_dia)
- GitHub Check: Docker Tests (25.04.0, test_localize)
- GitHub Check: Docker Tests (25.04.0, test_dda_id_alphapeptdeep)
- GitHub Check: Docker Tests (25.04.0, test_dda_id_ms2pip)
- GitHub Check: Docker Tests (25.04.0, test_dia)
- GitHub Check: Docker Tests (25.04.0, test_tmt)
🔇 Additional comments (96)
modules/nf-core/custom/dumpsoftwareversions/templates/dumpsoftwareversions.py (1)
40-40: LGTM!
The quote style change from single to double quotes is functionally equivalent and has no impact on the code's behavior.
bin/msstats_tmt.R (3)
98-103: Critical bug fix: Correct contrast construction for pairwise control mode.
The fix correctly addresses a critical logic error where contrasts were built using an undefined or wrong index i instead of control. The corrected code now properly:
- Places -1 at the control index position (baseline)
- Places 1 at the j index position (test condition)
- Labels rows as lvls[j]-lvls[control] (test vs. control)
This ensures contrasts are correctly defined when a control condition is specified.
94-94: Improved error message formatting.
The error message now ends with a period and newline, improving consistency and readability.
123-963: Formatting improvements throughout the file.
The extensive formatting changes adopt idiomatic R style conventions:
- Consistent use of leftward assignment (<-)
- Improved spacing and indentation
- Better function parameter alignment
These changes enhance code readability and maintainability without altering functionality.
.nf-core.yml (1)
28-28: Template update from nf-core 3.3.2 to 3.5.1 is a routine sync.
The changes between these versions are incremental. Ensure template merge conflicts are properly resolved and verify the pipeline still passes linting, but no special compatibility concerns exist for this minor version bump.
tests/.nftignore (2)
3-3: LGTM: Multiqc file ignore pattern updated.
The change from BETA-multiqc.parquet to multiqc.parquet reflects the removal of the BETA designation, consistent with multiqc becoming a core module in this release.
8-8: LGTM: New multiqc output added to ignore list.
The addition of llms-full.txt to the ignore list is appropriate for handling new multiqc data outputs.
.prettierignore (1)
13-16: LGTM: Standard nf-core ignore patterns added.
The added paths (.nf-test/, modules/nf-core/, subworkflows/nf-core/) are standard exclusions for nf-core pipelines to prevent formatting of managed dependencies.
.github/workflows/linting_comment.yml (1)
24-28: LGTM: Improved linting comment action.
The action version has been updated, and the explicit path input adds clarity by specifying the exact location of lint results.
.pre-commit-config.yaml (1)
9-9: Review pre-commit-hooks v6.0.0 breaking changes.
v6.0.0 requires Python >= 3.9 and removes the check-byte-order-marker and fix-encoding-pragma hooks (migrate to fix-byte-order-marker and pyupgrade). Verify:
- Repository supports Python >= 3.9
- No removed hooks are referenced in the configuration
- file-contents-sorter hook (if used) does not specify both --unique and --ignore-case
.github/workflows/clean-up.yml (1)
13-13: Verify runner version compatibility for actions/stale v10.
The v10.0.0 breaking change is the upgrade to Node.js 24, which requires your runner to be on version v2.327.1 or later to ensure compatibility. All existing configuration options remain compatible with v10; no behavior changes affecting issue/PR management.
.github/workflows/template-version-comment.yml (1)
12-12: actions/checkout v5 is safe to use; no breaking changes identified.
Credential storage was improved in v5 to store credentials in a separate file under $RUNNER_TEMP instead of directly in .git/config, but no workflow changes are required; git fetch, git push, etc. continue to work automatically. The ref configuration parameter remains fully supported and functional. The main update involves using Node 24 and requires a minimum Actions Runner version of v2.327.1. This upgrade is safe for production use.
modules/local/samplesheet_check/main.nf (1)
7-8: quantms-utils 0.0.24 version update confirmed.
The container image has been updated from version 0.0.23 to 0.0.24, and the updated version is available in both biocontainers and Galaxy Depot registries. The release (Nov 27, 2025) includes a fix for DIA-NN PTM parsing, lazy initialization for UnimodDatabase, and code cleanup improvements.
modules/local/preprocess_expdesign/main.nf (2)
10-11: Container version update matches the pattern across modules.
This follows the same quantms-utils update to 0.0.24 as in other modules. Refer to the verification comment in modules/local/utils/extract_sample/main.nf.
36-36: Consistent error suppression pattern.
The stderr redirection to /dev/null matches the pattern in other modules. See the earlier comment regarding potential debugging impact.
main.nf (1)
59-59: Informational comment noted.
The comment suggests a potential future enhancement to use the samplesheet channel from UTILS_NEXTFLOW_PIPELINE. No action required for this release.
modules/local/thermorawfileparser/main.nf (2)
13-13: Good fix: Using proper task.executor property.
The change from bare executor to task.executor correctly references the Nextflow task property. This improves code correctness.
Also applies to: 19-19
8-9: Container version 1.4.5 confirmed available; verify intermediate breaking changes if applicable.
ThermoRawFileParser 1.4.5 was released on October 14, 2024 and is available through biocontainers at quay.io/biocontainers/thermorawfileparser. When upgrading from version 1.3.4 (April 2021), note that version 1.4.2 introduced a breaking command-line change where the -b flag was repurposed from base64-encoding to output file specification, and version 1.4.4 includes performance improvements for MS level handling. Review your workflow's command-line arguments and MS data processing if they depend on these features.
.devcontainer/setup.sh (1)
1-13: Devcontainer setup looks reasonable.
The script appropriately customizes the shell environment and sets up a welcome message for the development container.
modules/local/sdrf_parsing/main.nf (1)
42-42: Consistent error suppression pattern.
The stderr redirection follows the same pattern as other modules in this PR.
subworkflows/nf-core/utils_nfcore_pipeline/main.nf (1)
101-101: Lowercase channel.of() syntax is correct.
The lowercase channel has been an alias of the uppercase Channel since version 20.07.0, allowing factory methods to be specified as channel.of() or Channel.of(). The change to lowercase is valid and follows modern Nextflow conventions reflected in current documentation.
modules/local/openms/proteomicslfq/main.nf (1)
19-26: The output syntax using optional: true is correct.
Nextflow DSL2 supports optional: true as a named parameter in output path declarations. This syntax properly declares optional outputs that may not be produced by the process.
modules/local/utils/extract_sample/main.nf (1)
6-7: Verify the container version exists and flag any security advisories.
The quantms-utils container has been updated to version 0.0.24. Please confirm this version exists in both the Galaxy depot and Biocontainers registries.
#!/bin/bash # Description: Verify quantms-utils:0.0.24 container availability echo "Checking Biocontainers registry..." curl -s "https://api.biocontainers.pro/ga4gh/trs/v2/tools/quantms-utils/versions/0.0.24--pyh7e72e81_0" | jq -r '.name, .meta_version' echo "Checking for security advisories..." # Note: Container vulnerability checking would require specific tools like trivy echo "Manual verification recommended: scan with trivy or similar tool"assets/tmt18plex_default_correction.matrix (1)
1-20: LGTM! New TMT18plex correction matrix added.The correction matrix format is consistent with 18 channels and 8 correction values per channel, aligning with the conditional logic added in
modules/local/openms/isobaric_analyzer/main.nffor handling tmt18plex labelling types.subworkflows/local/psm_fdr_control/meta.yml (1)
10-12: LGTM! Components documentation added.The components list now properly documents the two modules used in this subworkflow (false/discovery/rate and id/filter), aligning with the imports in
main.nf.modules/local/openms/isobaric_analyzer/main.nf (1)
43-49: TMT16plex correctly uses 8 correction values.
The conditional logic grouping TMT18plex and TMT16plex together with 8 isotope correction values is correct. Both formats require 8 values per reporter ion: <-2C13>/<-N15-C13>/<-C13>/<-N15>/<+N15>/<+C13>/<+N15+C13>/<+2C13>.
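To make the 8-versus-4 value convention concrete, here is a minimal Groovy sketch; it is illustrative only (the sample row and closure are not the module's actual code):

```groovy
// Illustrative only: tmt18plex/tmt16plex correction-matrix rows carry 8
// values separated by '/', while other labelling types carry 4.
def expectedValueCount = { String labellingType ->
    labellingType in ['tmt18plex', 'tmt16plex'] ? 8 : 4
}

def line = '0.0/0.3/9.1/0.5/0.1/3.2/0.0/0.0'   // hypothetical tmt16plex row
def values = line.split('/')

assert values.size() == expectedValueCount('tmt16plex')
println "Parsed ${values.size()} correction values: ${values}"
```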
subworkflows/local/id/main.nf (3)
37-37: Ensure PSM_RESCORING always emits versions output.
Removing the ifEmpty(null) fallback requires verifying that PSM_RESCORING.out.versions always emits at least one value. If this workflow has conditional logic or optional outputs that could result in an empty channel, the .mix() operation could fail without the fallback.
31-31: Verify that PEPTIDE_DATABASE_SEARCH unconditionally emits versions.
The ifEmpty(null) fallback has been removed from the version channel mixing at line 31. However, PEPTIDE_DATABASE_SEARCH.out.versions is only conditionally populated: it remains an empty channel if none of the search engines (msgf, comet, sage) are enabled via params.search_engines. Without the fallback, mixing an empty channel could cause blocking. Ensure that either at least one search engine is guaranteed to be enabled, or restore the ifEmpty(null) fallback to handle the edge case.
59-59: Both PSM_FDR_CONTROL and PHOSPHO_SCORING workflows are guaranteed to emit versions in all execution paths.
PSM_FDR_CONTROL: Whether the single search engine branch (ID_FILTER) or multi-engine branch (FDR_CONSENSUSID) executes, both emit versions that are mixed into ch_version before emission.
PHOSPHO_SCORING: Both the multi-engine branch (ID_SCORE_SWITCHER + LUCIPHOR) and single-engine branch (LUCIPHOR) result in versions being mixed into ch_version, since LUCIPHOR is called in both paths.
The removal of ifEmpty(null) fallbacks is safe.
Likely an incorrect or invalid review comment.
.github/workflows/fix_linting.yml (1)
16-16: Python 3.14 is available and all action versions are compatible.
Python 3.14 is available in GitHub Actions with versions 3.14.1 and 3.14.2 released, and setup-python supports Python 3.14. All actions in this workflow are pinned to specific commit SHAs following best practices. Since these updates originate from the nf-core template, they have been tested and validated as compatible by the nf-core community before template release.
modules/local/openms/protein_quantifier/main.nf (1)
25-25: Parameter namespace format appears correct but requires validation against tool output.
The code already uses -consensus:fix_peptides, which aligns with OpenMS subsection parameter conventions seen in related tools. However, confirm this parameter format is supported by running ProteinQuantifier --helphelp in the 2025.04.14 container to verify the parameter exists under the consensus section.
40-43: Shell script file renaming logic is correct.The loop correctly identifies idXML files that don't have the
_perc.idXMLsuffix and renames them using the appropriate pattern based on the search engine used.
10-16: Parameter removal is compatible with all callers.The
qval_scoreparameter removal is not a breaking change. All callers invokeID_RIPPER(PERCOLATOR.out.id_files_perc), which outputstuple val(meta), path(...). This matches the new input signature exactly. No caller was passing or expectingqval_score, making this removal a clean-up of unused parameters.README.md (2)
9-10: Version badges updated appropriately.The Nextflow version requirement badge (≥25.04.0) and nf-core template version badge (3.4.1) are correctly updated to reflect the release requirements.
81-96: Well-documented file format support.The new "Supported file formats" section clearly documents the supported input formats and their compressed variants. This aligns with the implementation changes in
file_preparation/main.nf.modules/local/pmultiqc/main.nf (1)
23-23: New contaminant affix parameter added correctly.The
contaminant_affixoption is properly conditionally included based onparams.contaminant_string..github/workflows/linting.yml (2)
14-14: Action versions updated with SHA pinning.The GitHub Actions have been properly updated with SHA pins for security, following best practices.
16-19: No actionable concern—Python 3.14 is stable and available.Python 3.14 was released on October 7, 2025, so the version is stable and readily available on CI runners. Python 3.14.1 is the first maintenance release of 3.14, released December 2, 2025, further confirming stability. No action needed.
docs/output.md (2)
246-251: Documentation improved for pipeline_info outputs.The output documentation now clearly lists the files in
pipeline_info/including the newparams_<timestamp>.jsonfile that captures command line parameters.
255-267: SDRF parameter precedence clearly documented.The note correctly lists parameters that are overwritten when an SDRF file is used. The spelling of
labelling_typeis the actual parameter name, so the static analysis warning about variant mixing can be safely ignored..github/actions/nf-test/action.yml (2)
27-30: Python version updated to 3.14.Same version bump as in the linting workflow. Based on learnings, this action is part of the nf-core template.
55-56: Conda channel configuration improved.Setting
channels: conda-forgeandchannel-priority: strictensures consistent package resolution, which is a best practice for reproducible environments.subworkflows/local/file_preparation/main.nf (1)
46-64: DIA file support and unsupported format warnings added.Good addition of
.diafile passthrough and user-friendly warning for unsupported formats. The warning mechanism properly collects files before logging.subworkflows/local/utils_nfcore_quantms_pipeline/main.nf (2)
30-69: PIPELINE_COMPLETION workflow is well-structured.The workflow properly handles completion notifications, email sending, and error logging with appropriate troubleshooting references.
11-15: Import structure updated to use nf-schema plugin.The imports have been updated to use
paramsSummaryMapdirectly from the nf-schema plugin (v2.5.1 as noted in the comment). ThePIPELINE_INITIALISATIONworkflow has been removed, while the remaining utility imports andPIPELINE_COMPLETIONworkflow are intact.subworkflows/local/utils_nfcore_quantms_pipeline/meta.yml (1)
1-32: Structure looks good.The metadata manifest follows the nf-core subworkflow schema and correctly declares components, I/O channels, and maintainers. This provides a clean declarative interface for the
pipeline_completionsubworkflow.modules.json (2)
13-17: New multiqc module addition looks correct.The new
multiqcmodule is properly added with branch, git_sha, and installed_by fields following the nf-core modules.json schema.
22-26: No action needed.utils_nextflow_pipelineandutils_nfcore_pipelineare separate nf-core subworkflows with different purposes and responsibilities. They are independently maintained and should not be expected to share the same git_sha.subworkflows/local/create_input_channel/main.nf (1)
36-40: Wrapper refactoring to map is a good simplification.Replacing the public
Wrapperclass with a local map is cleaner and reduces complexity. The map structure appropriately captures the same state (labelling_type,acquisition_method,experiment_id).modules/nf-core/multiqc/tests/custom_prefix.config (1)
1-5: LGTM!Standard nf-core test configuration for validating the custom prefix functionality of the MULTIQC process.
modules/nf-core/multiqc/tests/nextflow.config (1)
1-5: LGTM!Baseline test configuration that explicitly sets
ext.prefixtonull, ensuring the default MULTIQC behavior is tested. This correctly complements thecustom_prefix.configfor comprehensive test coverage.modules/local/diann/generate_cfg/main.nf (1)
6-7: LGTM - Consistent container version update.This container version bump from quantms-utils 0.0.23 to 0.0.24 aligns with the repository-wide upgrade pattern.
modules/local/utils/psm_conversion/main.nf (1)
6-7: LGTM - Container version update is consistent.The quantms-utils container bump to 0.0.24 is consistent with updates across other modules.
modules/local/msstats/msstats_lfq/main.nf (1)
15-15: Good syntax improvement to use named parameters.The change from positional syntax to named parameter syntax (
optional: true) is more explicit and aligns with modern Nextflow best practices.modules/nf-core/multiqc/environment.yml (1)
1-7: MultiQC version 1.32 is correctly specified. Version 1.32 is the latest available in the bioconda channel. The conda environment definition is correctly formatted with valid configuration..github/workflows/ci.yml (1)
38-39: Both test profile configurations (test_dda_id_alphapeptdeep.configandtest_dda_id_ms2pip.config) are properly defined in./conf/tests/, confirming the test profiles referenced in the CI workflow exist as expected.modules/local/utils/mzml_statistics/main.nf (1)
7-8: Verify container availability for quantms-utils 0.0.24 before approval.The latest confirmed version on PyPI is 0.0.23 (May 14, 2025). Version 0.0.24 must be verified as available in both the Galaxy Singularity depot and Biocontainers/Quay.io registries. Run the provided verification script to confirm both container endpoints are accessible and return successful HTTP responses before merging this PR.
docs/usage.md (2)
22-39: Excellent documentation addition for supported file formats.This new section clearly documents the supported mass spectrometry file formats and their compressed variants, which will help users understand what inputs the pipeline accepts.
130-130: Correct URL update for Charliecloud documentation.The domain change from
hpc.github.iotocharliecloud.ioreflects the project's current official website.modules/local/openms/id_merger/main.nf (1)
22-38: Consistent parameter renaming from rescore_range to ms2features_range.The parameter renaming is correctly applied while preserving the conditional logic for "by_project" and "by_sample" modes.
subworkflows/nf-core/utils_nfschema_plugin/tests/main.nf.test (2)
28-34: Correct extension of test inputs for new workflow signature.The existing tests have been properly updated to initialize the new input parameters (input[3] through input[8]) with default values, maintaining backward compatibility while supporting the expanded workflow interface.
Also applies to: 60-66, 92-98, 124-130
142-172: Good test coverage for help message generation.The new test properly validates the help message generation functionality with all required parameters configured. This enhances test coverage for the nf-schema plugin integration.
subworkflows/nf-core/utils_nfschema_plugin/tests/nextflow.config (1)
2-2: Verify nf-schema plugin version 2.5.1 availability.nf-schema@2.5.1 has been used in nf-core pipelines and is documented as a valid version in the official nf-schema repository. Ensure compatibility with Nextflow 25.04.0 during pipeline execution.
ro-crate-metadata.json (1)
189-189: Nextflow 25.04.0 requirement is appropriate.Nextflow 25.04.0 was released on May 8, 2025 and multiple stable patch releases exist (v25.04.7, v25.04.8), confirming availability and production stability.
modules/local/diann/convert_results/main.nf (2)
6-7: Correct the container version reference.The quantms-utils:0.0.24 tag does not exist. Latest available version is 0.0.23. Update the container references to use a valid version (e.g., quantms-utils:0.0.23--pyh7e72e81_0) or verify the intended version number.
Likely an incorrect or invalid review comment.
22-22: No action needed - mzTab output is already optional and not consumed downstream.The mzTab output in CONVERT_RESULTS is already marked
optional: true(line 22), and the DIA workflow'sfinal_resultemission of this output is not consumed by any downstream process inquantms.nf. Unlike the TMT and LFQ workflows which mix theirfinal_resultintoch_pipeline_results, the DIA workflow selectively consumes onlydiann_report,diann_report_parquet,msstats_in, andversions. Optional outputs in Nextflow cause the task to not fail and emit nothing when the file is missing, which is the expected and safe behavior here.workflows/lfq.nf (1)
41-41: LGTM: Version channel propagation simplified.The removal of
.ifEmpty(null)fallbacks makes version propagation more explicit and consistent with the broader changes across workflows (dia.nf, tmt.nf). This ensures that all subworkflows properly emit version information.Also applies to: 57-57, 66-66
workflows/dia.nf (1)
53-53: LGTM: Consistent version channel handling.All version channel propagation has been updated to remove
.ifEmpty(null)fallbacks, aligning with the workflow-wide refactoring pattern.Also applies to: 92-92, 105-105, 119-119, 142-142, 159-159, 168-168
subworkflows/local/psm_rescoring/main.nf (2)
24-24: LGTM: Parameter namespace refactored consistently.The renaming from
ms2rescore/add_snr_feature_percolator/rescore_rangeto the newms2features_*namespace is applied consistently across all conditional branches.Also applies to: 37-37, 44-44, 48-48, 87-87
83-83: Verify downstream consumers handle the updated tuple structure.
The consensus input mapping has been updated from a 3-element tuple [it[1], it[2], "MS:1001491"] to a 2-element tuple [it[1], it[2]]. Ensure that all downstream processes expecting this input (e.g., CONSENSUSID) have been updated to handle the new structure without the annotation.
#!/bin/bash
# Description: Find all usages of ch_consensus_input to verify tuple structure compatibility
# Search for ch_consensus_input usage in workflow and module files
rg -n -C3 'ch_consensus_input' --type=groovy
Also applies to: 114-114
.github/workflows/extended_ci.yml (1)
50-57: LGTM: Improved step labeling and simplified checkout.The checkout step label is clearer, and removing
fetch-depth: 0simplifies the configuration. The Java setup step label is also more descriptive.subworkflows/local/dda_id/main.nf (3)
49-49: LGTM: Parameter namespace updated correctly (except line 113).The ms2features_* parameter namespace is applied consistently in these conditional branches. However, note the critical issue flagged separately at line 113.
Also applies to: 60-60, 67-67, 71-71
38-38: LGTM: Version channel propagation simplified.Consistent with the broader refactoring across workflows, version channels now propagate without
.ifEmpty(null)fallbacks.Also applies to: 156-156, 166-166, 170-170
109-109: Downstream consumers already handle the 2-element tuple correctly.The CONSENSUSID module expects
tuple val(meta), path(id_file), which matches the 2-element tuple produced by.map{ [it[1], it[2]]}on lines 109 and 140. The invocation on line 155 passes this tuple correctly to CONSENSUSID without compatibility issues.modules/nf-core/multiqc/meta.yml (1)
1-92: LGTM: Well-structured MultiQC module metadata.The module manifest is comprehensive and follows nf-core conventions with proper EDAM ontology annotations for inputs and outputs. The metadata clearly defines optional configuration files, logos, and sample mapping capabilities.
Based on learnings, this is part of the nf-core template integration for the MultiQC module.
conf/tests/test_dda_id_ms2pip.config (1)
1-47: LGTM: Comprehensive test configuration for ms2pip workflow.The test configuration properly utilizes the new
ms2features_*parameter namespace and provides appropriate resource limits and test parameters. This will help ensure the ms2pip feature generation path is tested in CI.workflows/tmt.nf (1)
41-41: LGTM: Consistent version channel handling.All version channel propagation updated to remove
.ifEmpty(null)fallbacks, maintaining consistency with the refactoring across lfq.nf and dia.nf workflows.Also applies to: 47-47, 53-53, 59-59, 65-65, 74-74
conf/tests/test_dda_id_alphapeptdeep.config (1)
38-45: LGTM! Well-structured ms2features configuration for alphapeptdeep testing.The new
ms2features_*parameters are properly configured for alphapeptdeep testing. The inline comments clearly document the purpose of using alphapeptdeep and the generic model choice.modules/nf-core/multiqc/tests/main.nf.test (2)
70-71: External URL dependency may cause test fragility.The test fetches config from the nf-core/tools
devbranch. If this file is moved, renamed, or the branch changes, the test could fail unexpectedly. Consider using a versioned tag reference or bundling a local test config file for more stable tests.
1-37: Well-structured test suite following nf-core conventions.The test file properly covers core MULTIQC functionality with appropriate assertions for process success, output paths, and version snapshots.
subworkflows/nf-core/utils_nfschema_plugin/main.nf (3)
28-44: Verify the params.help string handling logic.
Line 41 checks params.help instanceof String ? params.help : "". This implies params.help could be either a boolean or a string (for topic-specific help). Ensure this behavior is documented and that the workflow caller passes the expected type. The help input parameter (line 19) is boolean, but params.help is used separately here.
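As a standalone illustration of the type handling described above (the closure name is made up for the example and is not the subworkflow's actual code):

```groovy
// Illustrative only: --help may arrive as a boolean flag (--help) or as a
// string naming a help topic (--help <topic>); only a String is forwarded.
def resolveHelpTopic = { help ->
    help instanceof String ? help : ''
}

assert resolveHelpTopic(true) == ''            // bare --help flag
assert resolveHelpTopic('input') == 'input'    // --help input
assert resolveHelpTopic(false) == ''           // help not requested
```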
63-69: Clean refactoring of validation logic.The validateParameters call now uses an options map pattern, which is more extensible and aligns with the nf-schema plugin conventions.
50-57: LGTM! Clear parameter summary logging flow.The summary section properly handles optional
parametersSchemaand logs the before/after text with the parameter summary.modules/nf-core/multiqc/main.nf (3)
1-21: Well-structured MULTIQC process definition following nf-core conventions.The process correctly handles:
- Dynamic container selection for singularity/docker
- Flexible input staging with the
stageAs: "?/*"pattern- Optional outputs for plots directory
52-63: Stub implementation is complete.The stub correctly creates all expected output artifacts (data directory, plots directory, HTML report, and versions file), enabling workflow testing without running the full tool.
46-49: The version extraction pattern is correct. Themultiqc --versioncommand outputs in the formatmultiqc, version X.Y.Z, and the sed patterns/multiqc, version //gappropriately strips the prefix to extract only the version number. This approach is widely adopted across nf-core pipelines and remains compatible with current MultiQC versions.modules/local/utils/msrescore_features/main.nf (3)
25-28: Good defensive handling of edge case.
The check for params.ms2features_model_dir != true properly handles the Nextflow quirk where a flag passed without a value becomes boolean true. The comment clearly explains the rationale.
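A minimal sketch of the quirk being guarded against; the option name and closure below are placeholders, not the module's actual command line:

```groovy
// Illustrative only: a CLI parameter passed without a value becomes boolean
// true in params, so `true` must be treated the same as "not provided".
def modelDirArg = { modelDir ->
    (modelDir && modelDir != true) ? "--model_dir ${modelDir}" : ''
}

assert modelDirArg(null) == ''                         // parameter not set
assert modelDirArg(true) == ''                         // flag given without a value
assert modelDirArg('/models/generic') == '--model_dir /models/generic'
```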
61-83: Feature flags are correctly implemented.The new
ms2features_debugflag (lines 79-83) properly toggles debug logging. All flags follow a consistent pattern.
85-99: Script execution properly integrates new parameters.The rescoring command correctly includes the new
--ms2_tolerance_unitand${debug_log_level}arguments aligned with the ms2features refactoring.nextflow.config (3)
112-125: MS2Features parameter block looks well-structured.The new
ms2features_*namespace is cleanly organized with sensible defaults. The tolerance values and units are properly defined.
298-311: New ARM64 and emulate_amd64 profiles look appropriate.The arm64 profile correctly configures Wave for container builds on ARM architecture. The emulate_amd64 profile provides a fallback for running x86 containers on ARM hosts.
509-516: Verify pipeline testing and Java 17 compatibility for Nextflow 25.04.0 upgrade.Nextflow 25.04.0 now requires Java 17 or newer, which represents a breaking change from previous versions. Confirm the pipeline has been tested against this requirement. Additionally, nf-schema 2.5.1 fixes parameter validation issues and is stable, but verify it works correctly with your pipeline's parameter schema and sample sheet validation.
workflows/quantms.nf (2)
179-209: MultiQC workflow refactor is well-structured.
The explicit channel construction improves readability. Using .ifEmpty([]) for optional channels and .collect() to gather all inputs before invoking SUMMARY_PIPELINE is the correct pattern.
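A small self-contained Nextflow sketch of that pattern (channel names are illustrative, not the pipeline's actual ones):

```nextflow
// Illustrative only: an optional report channel falls back to an empty list so
// that collect() still emits one value and the summary step is not skipped.
workflow {
    ch_reports  = Channel.empty()              // may be empty in some runs
    ch_versions = Channel.of('tool: 1.0')

    ch_summary_in = ch_reports
        .ifEmpty([])                           // emit [] instead of nothing
        .mix(ch_versions)
        .collect()                             // one emission with all inputs

    ch_summary_in.view { "summary inputs: ${it}" }
}
```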
167-176: Version filtering and collection looks correct.Filtering out null versions and collecting to YAML is appropriate. The filename pattern
'nf_core_' + 'quantms_software_' + 'mqc_' + 'versions.yml'could be simplified to a single string, but this is a minor style preference.nextflow_schema.json (2)
502-599: MS2Features schema definitions are comprehensive.The new
ms2features_*parameter definitions are well-documented with appropriatehelp_textexplaining usage. The enum constraints and defaults align withnextflow.config.
1455-1466: Union type forhelpparameter.The
helpparameter uses"type": ["boolean", "string"]which allows both boolean (--help) and string (--help topic) usage. This is valid JSON Schema 2020-12 and provides flexibility.
NXF_VER:
- "24.10.5"
test_profile: ["test_lfq", "test_dia", "test_localize", "test_tmt", "test_dda_id"]
- "25.04.0"
Nextflow 25.04.0 has breaking changes requiring review.
Nextflow 25.04 was released on May 8, 2025 and is stable. However, Nextflow now requires Java 17 or newer (upgraded from Java 11). Additionally, there are multiple breaking changes: The publish: section can only be specified in the entry workflow, workflow outputs are assigned instead of using the >> operator, and the syntax for dynamic publish paths has changed. The process shell section has been deprecated. Verify that your pipeline's Java runtime, workflow output declarations, and shell directives are compatible with these changes before upgrading.
uses: nf-core/setup-nextflow@v2
with:
version: "24.10.5" # Or the Nextflow version you prefer
version: "25.04.0" # Or the Nextflow version you prefer
Test Nextflow 25.04.0 thoroughly before deploying to production.
Nextflow 25.04.0 introduces breaking changes from 24.10.5:
- Java 17+ is now required
- Process shell directives containing newlines trigger warnings (may cause failures on some systems)
- Dynamic publish paths and workflow outputs syntax changed
- HyperQueue executor requires version 0.20.0+
Run the test_latest_dia profile and verify all pipeline outputs with 25.04.0 locally before using in CI/CD.
🤖 Prompt for AI Agents
.github/workflows/diann_private.yml around line 41: the workflow bumps Nextflow
to 25.04.0 but the review warns of breaking changes; before committing this
change, either revert/pin the version to the known-good 24.10.5 or, if you
intend to move to 25.04.0, run the test_latest_dia profile locally with Nextflow
25.04.0 and Java 17+, update any pipeline code to fix new dynamic publish
paths/workflow outputs and newline-in-shell directive issues, ensure HyperQueue
executor is upgraded to >=0.20.0, and only then update the CI file to 25.04.0
once all pipeline outputs match the expected results.
matrix:
NXF_VER:
- "24.10.5"
- "25.04.0"
Verify compatibility with Nextflow 25.04.0 and address breaking changes.
Upgrading from 24.10.5 to 25.04.0 introduces breaking changes:
- Dynamic publish path syntax changed (must use >> operator)
- Mapper index directive removed
- Java 17+ now required
- HyperQueue executor requires 0.20.0+
- Process shell section deprecated; newlines in shell directives trigger warnings
Test thoroughly and review the migration guide at https://www.nextflow.io/docs/latest/migrations/25-04.html
test_profile:
  [
    "test_lfq",
    "test_dia",
    "test_localize",
    "test_tmt",
    "test_dda_id_alphapeptdeep",
    "test_dda_id_ms2pip",
    "test_tmt_corr",
  ]
Test profile inconsistency between Docker and Singularity.
The Docker test matrix includes test_dda_id_ms2pip (line 40), but the Singularity test matrix omits it (line 125). This creates inconsistent test coverage between execution profiles.
🔎 Suggested fix to add missing test profile
test_profile:
- ["test_lfq", "test_dia", "test_localize", "test_tmt", "test_dda_id_alphapeptdeep", "test_tmt_corr"]
+ ["test_lfq", "test_dia", "test_localize", "test_tmt", "test_dda_id_alphapeptdeep", "test_dda_id_ms2pip", "test_tmt_corr"]Also applies to: 124-125
🤖 Prompt for AI Agents
.github/workflows/extended_ci.yml lines 33-42 (and also ensure lines ~124-125):
the Singularity test_profile is missing test_dda_id_ms2pip which is present in
the Docker profile; update the Singularity test_profile array to include
"test_dda_id_ms2pip" so both matrices run the same tests, and verify the item is
added in the same position/format as the Docker list to maintain consistency.
- .github/workflows/awstest.yml
- .github/workflows/awsfulltest.yml
Remove duplicate entries in files_unchanged list.
Lines 23-24 duplicate the entries already present at lines 18-19. YAML will include both occurrences, which may cause nf-core pipelines lint to fail or produce warnings.
🔎 Proposed fix
- conf/modules.config
- conf/igenomes_ignored.config
- - .github/workflows/awstest.yml
- - .github/workflows/awsfulltest.yml
  modules_config: false
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In .nf-core.yml around lines 23 to 24, the entries
".github/workflows/awstest.yml" and ".github/workflows/awsfulltest.yml" are
duplicates of the entries at lines 18-19; remove the duplicate lines so each
file appears only once in the files_unchanged list (or deduplicate the list),
then save the YAML ensuring indentation/format remains valid.
| "description": "The reference channel, e.g. for calculating ratios.", | ||
| "fa_icon": "fas fa-list-ol", | ||
| "default": "126" | ||
| "default": 126 | ||
| }, |
Type mismatch for reference_channel.
The schema defines reference_channel with a numeric default 126, but nextflow.config (line 74) defines it as a string '126'. This inconsistency may cause validation issues. Align the types.
🔎 Proposed fix in schema
"reference_channel": {
"type": "string",
"description": "The reference channel, e.g. for calculating ratios.",
"fa_icon": "fas fa-list-ol",
- "default": 126
+ "default": "126"
},
🤖 Prompt for AI Agents
In nextflow_schema.json around lines 794 to 797, the schema defines
reference_channel with a numeric default (126) while nextflow.config uses a
string ('126'); update the schema so the type and default match the config by
changing reference_channel to a string type and set its default to "126" (adjust
any validation/description if needed to reflect string usage).
| "fa_icon": "fas fa-list-ol", | ||
| "enum": [0, 1, 2], | ||
| "default": 2 | ||
| "default": 2.0 | ||
| }, |
Type mismatch for pg_level default.
The pg_level enum is [0, 1, 2] (integers) but the default is 2.0 (float). JSON Schema validation may fail or behave unexpectedly. Use an integer default.
🔎 Proposed fix
"pg_level": {
"type": "number",
"description": "Controls the protein inference mode",
"fa_icon": "fas fa-list-ol",
"enum": [0, 1, 2],
- "default": 2.0
+ "default": 2
},
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| "fa_icon": "fas fa-list-ol", | |
| "enum": [0, 1, 2], | |
| "default": 2 | |
| "default": 2.0 | |
| }, | |
| "pg_level": { | |
| "type": "number", | |
| "description": "Controls the protein inference mode", | |
| "fa_icon": "fas fa-list-ol", | |
| "enum": [0, 1, 2], | |
| "default": 2 | |
| }, |
🤖 Prompt for AI Agents
In nextflow_schema.json around lines 1088 to 1091, the pg_level property has an
enum of integers [0, 1, 2] but the default is the float 2.0; change the default
to the integer 2 so the default value matches the enum type and JSON Schema
validation behaves correctly.
def Set enzymes = []
def Set files = []
Invalid Groovy syntax for Set declarations.
def Set enzymes = [] is not valid Groovy syntax. The def keyword and explicit type declaration cannot be combined this way. This will cause a compilation error.
🔎 Proposed fix
- def Set enzymes = []
- def Set files = []
+ Set enzymes = []
+ Set files = []Or alternatively, use:
def enzymes = [] as Set
def files = [] as Set📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
🤖 Prompt for AI Agents
In subworkflows/local/create_input_channel/main.nf around lines 32 to 33, the
Groovy declarations "def Set enzymes = []" and "def Set files = []" use invalid
syntax (cannot combine def with explicit type that way); replace with valid
Groovy set declarations by either using a typed variable without def (e.g., Set
enzymes = ...) or keep def and coerce the literals to sets (e.g., def enzymes =
[] as Set and def files = [] as Set) so the variables are proper Set instances.
// Pass through .dia files without conversion (DIA-NN handles them natively)
// Note: .dia files bypass peak picking (when enabled) as they are only used with DIA-NN
ch_results = ch_results.mix(ch_branched_input.dia)
MZML_STATISTICS(ch_results)
🧩 Analysis chain
🏁 Script executed:
cat modules/local/utils/mzml_statistics/main.nf
Repository: bigbio/quantms
Length of output: 1430
🏁 Script executed:
sed -n '90,120p' subworkflows/local/file_preparation/main.nf
Repository: bigbio/quantms
Length of output: 1400
Exclude .dia files from MZML_STATISTICS processing.
The .dia files mixed into ch_results (line 106) and passed to MZML_STATISTICS (line 109) are incompatible with the module. The underlying quantmsutilsc mzmlstats tool supports only .mzML or Bruker .d formats, not .dia files. Filter .dia files before passing ch_results to MZML_STATISTICS, or create a separate statistics collection path for them.
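One possible shape of that fix, shown as a hedged sketch (the branch criteria and names are illustrative, not the subworkflow's actual code):

```nextflow
// Illustrative only: route .dia files around the statistics step, which
// expects mzML or Bruker .d input, while keeping them for downstream use.
workflow {
    ch_results = Channel.of(
        [[id: 's1'], file('s1.mzML')],
        [[id: 's2'], file('s2.dia')]
    )

    ch_results
        .branch {
            dia:   it[1].name.endsWith('.dia')   // bypass statistics
            stats: true                          // everything else
        }
        .set { ch_split }

    ch_split.stats.view { "to MZML_STATISTICS: ${it[1].name}" }
    ch_split.dia.view   { "bypasses statistics: ${it[1].name}" }
}
```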
// nothing will be run for the 'CONSENSUSID' process
CONSENSUSID(PSM_RESCORING.out.results.groupTuple(size: n_unique_search_engines))
- ch_software_versions = ch_software_versions.mix(CONSENSUSID.out.versions.ifEmpty(null))
+ ch_software_versions = ch_software_versions.mix(CONSENSUSID.out.versions)
🧩 Analysis chain
🏁 Script executed:
find . -type f -name "*.nf" | xargs grep -l "process CONSENSUSID" 2>/dev/null | head -20
Repository: bigbio/quantms
Length of output: 100
🏁 Script executed:
cat ./modules/local/openms/consensusid/main.nf
Repository: bigbio/quantms
Length of output: 1363
🏁 Script executed:
cat -n subworkflows/local/id/main.nf | sed -n '40,60p'
Repository: bigbio/quantms
Length of output: 1253
🏁 Script executed:
cat -n subworkflows/local/id/main.nf | tail -20
Repository: bigbio/quantms
Length of output: 927
🏁 Script executed:
git log --oneline -p subworkflows/local/id/main.nf | grep -A 3 -B 3 "ifEmpty" | head -30
Repository: bigbio/quantms
Length of output: 40
🏁 Script executed:
grep -r "ifEmpty" subworkflows/local/ --include="*.nf"Repository: bigbio/quantms
Length of output: 40
🏁 Script executed:
git log --all --oneline -- subworkflows/local/id/main.nf | head -5
Repository: bigbio/quantms
Length of output: 280
🏁 Script executed:
git show c482ab1:subworkflows/local/id/main.nf | grep -A 5 -B 5 "CONSENSUSID.out.versions"
Repository: bigbio/quantms
Length of output: 683
🏁 Script executed:
git show 4d19300:subworkflows/local/id/main.nf | grep -A 5 -B 5 "CONSENSUSID.out.versions"
Repository: bigbio/quantms
Length of output: 697
🏁 Script executed:
grep -A 10 "emit:" subworkflows/local/id/main.nf | head -15Repository: bigbio/quantms
Length of output: 272
CONSENSUSID emits versions only when executed. When n_unique_search_engines <= 1, the CONSENSUSID process does not run at all, resulting in an empty channel. While CONSENSUSID always emits versions when it executes, the process itself is conditionally invoked (line 46-49), so the ifEmpty(null) fallback was necessary to handle the case where the process doesn't run. Removing it may cause issues with empty channel handling in downstream version collection.
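A minimal sketch of the fallback pattern under discussion (the conditional invocation is simulated with plain channels; names are placeholders, not the workflow's actual code):

```nextflow
// Illustrative only: when a process runs conditionally, its versions channel
// may be empty, and ifEmpty(null) guarantees a value is still emitted so the
// downstream collection (which filters out nulls) keeps working.
workflow {
    ch_versions = Channel.of('search_engine: 3.1')

    run_consensus = false                      // e.g. only one search engine configured
    ch_consensus_versions = run_consensus
        ? Channel.of('consensusid: 2.9')
        : Channel.empty()

    ch_versions
        .mix(ch_consensus_versions.ifEmpty(null))
        .filter { it != null }
        .view { "collected version entry: ${it}" }
}
```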
🤖 Prompt for AI Agents
In subworkflows/local/id/main.nf around line 51, the CONSENSUSID output channel
may be empty when the process is not executed (n_unique_search_engines <= 1), so
revert the removal of the empty-channel fallback: restore a safe fallback such
as mixing CONSENSUSID.out.versions.ifEmpty(null) (or only mix when CONSENSUSID
was executed) so downstream version collection doesn't break on an absent
channel; ensure the mix operation receives a channel (possibly containing null)
rather than failing on an empty channel.
PR checklist
- Make sure your code lints (nf-core pipelines lint).
- Ensure the test suite passes (nextflow run . -profile test,docker --outdir <OUTDIR>).
- Check for unexpected warnings in debug mode (nextflow run . -profile debug,test,docker --outdir <OUTDIR>).
- Usage documentation in docs/usage.md is updated.
- Output documentation in docs/output.md is updated.
- CHANGELOG.md is updated.
- README.md is updated (including new tool citations and authors/contributors).
Summary by CodeRabbit
New Features
Documentation
Chores
ms2features_* namespace