
Conversation


@dependabot dependabot bot commented on behalf of github Jun 12, 2021

Bumps pytorch-lightning from 1.0.3 to 1.3.5.
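
The diff itself is presumably a one-line bump of the pinned requirement; the branch name points at python/requirements/tune/. A sketch of what that change looks like (the exact file path and pin style are assumptions):

    -pytorch-lightning==1.0.3
    +pytorch-lightning==1.3.5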

Release notes

Sourced from pytorch-lightning's releases.

Standard weekly patch release

[1.3.5] - 2021-06-08

Added

  • Added warning to Training Step output (#7779)
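
The release note above does not restate what triggers the new warning, so the sketch below only shows the conventional training_step output contract (a toy model, assumed for illustration): a loss tensor or a dict with a "loss" key.

    import torch
    import torch.nn.functional as F
    import pytorch_lightning as pl

    class ToyModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(4, 1)

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = F.mse_loss(self.layer(x), y)
            # conventional output: the loss tensor, or a dict containing "loss"
            # (plus optional extras such as "hiddens"); other shapes are
            # presumably what the new warning guards against
            return {"loss": loss}

        def configure_optimizers(self):
            return torch.optim.SGD(self.parameters(), lr=0.1)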

Fixed

  • Fixed LearningRateMonitor + BackboneFinetuning (#7835)
  • Minor improvements to apply_to_collection and type signature of log_dict (#7851)
  • Fixed docker versions (#7834)
  • Fixed sharded training check for fp16 precision (#7825)
  • Fixed support for torch Module type hints in LightningCLI (#7807)
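
The LightningCLI item above (#7807) concerns constructor arguments annotated with torch.nn.Module. A hedged sketch of the pattern it enables (the Classifier class is made up; in the 1.3 series LightningCLI lives in pytorch_lightning.utilities.cli):

    import torch
    from pytorch_lightning import LightningModule
    from pytorch_lightning.utilities.cli import LightningCLI

    class Classifier(LightningModule):
        # a torch.nn.Module type hint on an __init__ argument; after #7807,
        # LightningCLI can populate it from a class_path entry in the config
        def __init__(self, backbone: torch.nn.Module):
            super().__init__()
            self.backbone = backbone

    if __name__ == "__main__":
        LightningCLI(Classifier)

The backbone would then be chosen in the YAML config via a class_path/init_args block rather than hard-coded in the model.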

Changed

  • Move training_output validation to after train_step_end (#7868)

Contributors

@Borda, @justusschock, @kandluis, @mauvilsa, @shuyingsunshine21, @tchaton

If we forgot someone due to not matching commit email with GitHub account, let us know :]

Standard weekly patch release

[1.3.4] - 2021-06-01

Fixed

  • Fixed info message when max training time reached (#7780); see the sketch after this list
  • Fixed missing __len__ method to IndexBatchSamplerWrapper (#7681)
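
The max-time fix (#7780) touches the wall-clock stopping condition introduced in the 1.3 series. A minimal sketch of how that budget is set (the value is illustrative):

    from datetime import timedelta
    from pytorch_lightning import Trainer

    # stop once the wall-clock budget is exhausted; 1.3.4 fixes the info
    # message logged when this limit is reached
    trainer = Trainer(max_time=timedelta(hours=12))  # also accepts "DD:HH:MM:SS"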

Contributors

@awaelchli, @kaushikb11

If we forgot someone due to not matching commit email with GitHub account, let us know :]

Standard weekly patch release

[1.3.3] - 2021-05-26

Changed

  • Changed calling of untoggle_optimizer(opt_idx) out of the closure function (#7563)

Fixed

  • Fixed ProgressBar pickling after calling trainer.predict (#7608)
  • Fixed broadcasting in multi-node, multi-gpu DDP using torch 1.7 (#7592)

... (truncated)

Changelog

Sourced from pytorch-lightning's changelog.

[1.3.5] - 2021-06-08

Added

  • Added warning to Training Step output (#7779)

Fixed

  • Fixed LearningRateMonitor + BackboneFinetuning (#7835)
  • Minor improvements to apply_to_collection and type signature of log_dict (#7851); usage sketch after this list
  • Fixed docker versions (#7834)
  • Fixed sharded training check for fp16 precision (#7825)
  • Fixed support for torch Module type hints in LightningCLI (#7807)
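
For the log_dict item (#7851), a minimal usage sketch; the method is assumed to live in a LightningModule like the toy model above, and the metric name is illustrative:

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.mse_loss(self.layer(x), y)
        # log several metrics in one call; #7851 touches the mapping type
        # accepted by this signature
        self.log_dict({"train_loss": loss}, on_step=True, on_epoch=True)
        return loss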

Changed

  • Move training_output validation to after train_step_end (#7868)
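
The change above defers output validation until after train_step_end. For orientation, a hedged sketch of where that hook sits (both are LightningModule methods; the hook matters mainly under dp/ddp2, where per-device outputs are aggregated):

    def training_step(self, batch, batch_idx):
        x, y = batch
        return {"loss": F.mse_loss(self.layer(x), y)}

    def training_step_end(self, outputs):
        # under dp/ddp2 this receives the per-device step outputs; as of
        # 1.3.5 output validation runs only after this hook returns
        return outputs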

[1.3.4] - 2021-06-01

Fixed

  • Fixed info message when max training time reached (#7780)
  • Fixed missing __len__ method to IndexBatchSamplerWrapper (#7681)

[1.3.3] - 2021-05-27

Changed

  • Changed calling of untoggle_optimizer(opt_idx) out of the closure function (#7563)
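
The untoggle_optimizer change (#7563) is internal to the automatic-optimization loop, but it is visible when a module defines several optimizers. A hedged sketch of that setup (a GAN-style module with hypothetical generator_loss/discriminator_loss helpers is assumed):

    def configure_optimizers(self):
        opt_g = torch.optim.Adam(self.generator.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(self.discriminator.parameters(), lr=2e-4)
        return [opt_g, opt_d]

    def training_step(self, batch, batch_idx, optimizer_idx):
        # with multiple optimizers, Lightning toggles requires_grad so only
        # the active optimizer's parameters are trainable; #7563 moves the
        # matching untoggle_optimizer(opt_idx) call out of the closure
        if optimizer_idx == 0:
            return self.generator_loss(batch)
        return self.discriminator_loss(batch)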

Fixed

  • Fixed ProgressBar pickling after calling trainer.predict (#7608)
  • Fixed broadcasting in multi-node, multi-gpu DDP using torch 1.7 (#7592)
  • Fixed dataloaders not being reset when tuning the model (#7566); see the sketch after this list
  • Fixed print errors in ProgressBar when trainer.fit is not called (#7674)
  • Fixed global step update when the epoch is skipped (#7677)
  • Fixed training loop total batch counter when accumulate grad batches was enabled (#7692)
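
The tuning fix (#7566) concerns Trainer.tune. A minimal sketch (auto_lr_find is a real 1.3 Trainer flag; model stands for any LightningModule):

    from pytorch_lightning import Trainer

    trainer = Trainer(auto_lr_find=True)
    trainer.tune(model)  # runs the LR finder; #7566 resets the dataloaders afterwards
    trainer.fit(model)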

[1.3.2] - 2021-05-18

Changed

  • DataModules now avoid duplicate {setup,teardown,prepare_data} calls for the same stage (#7238)
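
A minimal LightningDataModule sketch to illustrate the deduplication above (toy tensors assumed): setup for a given stage now runs once, even if several Trainer entry points would otherwise trigger it.

    import torch
    from torch.utils.data import DataLoader, TensorDataset
    from pytorch_lightning import LightningDataModule

    class ToyDataModule(LightningDataModule):
        def setup(self, stage=None):
            # per #7238, called once per stage ("fit", "validate", "test"),
            # not once per Trainer call reaching the same stage
            self.train_set = TensorDataset(torch.randn(64, 4), torch.randn(64, 1))

        def train_dataloader(self):
            return DataLoader(self.train_set, batch_size=8)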

Fixed

... (truncated)

Commits
  • c292788 [bugfix] Minor improvements to apply_to_collection and type signature of `l...
  • 8a5a56b v1.3.5 & CHANGELOG
  • 9afab6f Move training_output validation to after train_step_end (#7868)
  • 6c7ae16 [bugfix] Resolve LearningRateMonitor + BackboneFinetuning (#7835)
  • 933aebc update fsspec to 2021.06.0 (#7869)
  • 29d72a9 Fix NVIDIA docker versions (#7834)
  • 290a3f0 Add warning to trainstep output (#7779)
  • 55f52a1 [sharded plugin] Fix check for fp16 precision (#7825)
  • 057f05f Fix support for torch Module type hints in LightningCLI (#7807)
  • 61525f6 update changelog + increment version
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [pytorch-lightning](https://github.com/PyTorchLightning/pytorch-lightning) from 1.0.3 to 1.3.5.
- [Release notes](https://github.com/PyTorchLightning/pytorch-lightning/releases)
- [Changelog](https://github.com/PyTorchLightning/pytorch-lightning/blob/1.3.5/CHANGELOG.md)
- [Commits](https://github.com/PyTorchLightning/pytorch-lightning/compare/1.0.3...1.3.5)

---
updated-dependencies:
- dependency-name: pytorch-lightning
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies label Jun 12, 2021

dependabot bot commented on behalf of github Jun 19, 2021

Superseded by #36.

@dependabot dependabot bot closed this Jun 19, 2021
@dependabot dependabot bot deleted the dependabot/pip/python/requirements/tune/pytorch-lightning-1.3.5 branch June 19, 2021 07:06