
Conversation


@dependabot dependabot bot commented on behalf of github Dec 3, 2025

Updates the requirements on lightning to permit the latest version.

Release notes

Sourced from lightning's releases.

Lightning v2.6.0

Changes in 2.6.0

PyTorch Lightning

  • Added WeightAveraging callback that wraps the PyTorch AveragedModel class (#20545)
  • Added Torch-TensorRT integration with LightningModule (#20808)
  • Added time-based validation support through val_check_interval (#21071)
  • Added attributes to access stopping reason in EarlyStopping callback (#21188)
  • Added support for variable batch size in ThroughputMonitor (#20236)
  • Added EMAWeightAveraging callback that wraps Lightning's WeightAveraging class (#21260)
  • Expose weights_only argument for Trainer.{fit,validate,test,predict} and let torch handle default value (#21072)
  • Default to RichProgressBar and RichModelSummary if the rich package is available. Fallback to TQDMProgressBar and ModelSummary otherwise (#20896)
  • Add MPS accelerator support for mixed precision (#21209)
  • Fixed edge case when max_trials is reached in Tuner.scale_batch_size (#21187)
  • Fixed case where LightningCLI could not be initialized with trainer_default containing callbacks (#21192)
  • Fixed missing reset when ModelPruning is applied with lottery ticket hypothesis (#21191)
  • Fixed recursive symlink creation when save_last='link' and save_top_k=-1 (#21186)
  • Fixed last.ckpt being created and not linked to another checkpoint (#21244)
  • Fixed bug that prevented BackboneFinetuning from being used together with LearningRateFinder (#21224)
  • Fixed ModelPruning sparsity logging bug that caused incorrect sparsity percentages (#21223)
  • Fixed LightningCLI loading of hyperparameters from ckpt_path failing for subclass model mode (#21246)
  • Fixed init-args check so it only runs when the given frames are in the __init__ method (#21227)
  • Fixed how ThroughputMonitor calculated training time (#21291)
  • Fixed synchronization of gradients in manual optimization with DDPStrategy(static_graph=True) (#21251)
  • Fixed FSDP mixed precision semantics and added user warning (#21361)
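The new WeightAveraging and EMAWeightAveraging callbacks maintain a running average of model weights during training. As a rough illustration of the underlying exponential-moving-average update they build on (a minimal stdlib sketch, not Lightning's implementation — the callback itself wraps torch.optim.swa_utils.AveragedModel):

```python
def ema_update(avg_weights, new_weights, decay=0.999):
    """One EMA step: avg <- decay * avg + (1 - decay) * new.

    Weights are shown as plain lists of floats here; the real
    callback operates on model parameter tensors.
    """
    return [decay * a + (1 - decay) * w for a, w in zip(avg_weights, new_weights)]

# Averaged weights drift toward the training weights over steps.
avg = [0.0, 0.0]
for step_weights in ([1.0, 2.0], [1.0, 2.0], [1.0, 2.0]):
    avg = ema_update(avg, step_weights, decay=0.5)
```

A large decay (e.g. 0.999) keeps the average smooth and slow-moving, which is why EMA weights often evaluate better than the raw training weights.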

Lightning Fabric

  • Expose weights_only argument for Trainer.{fit,validate,test,predict} and let torch handle default value (#21072)
  • Set _DeviceDtypeModuleMixin._device from torch's default device function (#21164)
  • Added kwargs-filtering for Fabric.call to support different callback method signatures (#21258)
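The kwargs-filtering change (#21258) lets Fabric.call invoke callbacks whose hook methods declare different subsets of arguments. A minimal sketch of that filtering idea using only the standard library (the hook name and callable below are hypothetical, not Fabric's actual code):

```python
import inspect

def call_with_filtered_kwargs(fn, **kwargs):
    """Invoke fn with only the keyword arguments its signature accepts."""
    params = inspect.signature(fn).parameters
    # If fn declares **kwargs itself, forward everything unchanged.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return fn(**kwargs)
    accepted = {k: v for k, v in kwargs.items() if k in params}
    return fn(**accepted)

def on_train_end(trainer):  # hypothetical callback hook taking one argument
    return f"done with {trainer}"

# 'extra' is silently dropped because on_train_end does not accept it.
result = call_with_filtered_kwargs(on_train_end, trainer="t0", extra=123)
```

This way a caller can pass a superset of context to every registered callback, and each hook receives only the parameters it asks for.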

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [lightning](https://github.com/Lightning-AI/lightning) to permit the latest version.
- [Release notes](https://github.com/Lightning-AI/lightning/releases)
- [Commits](Lightning-AI/pytorch-lightning@2.0.0...2.6.0)

---
updated-dependencies:
- dependency-name: lightning
  dependency-version: 2.6.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the maintenance Continuous integration, unit testing & package distribution label Dec 3, 2025