
Conversation


@dependabot dependabot bot commented on behalf of github May 9, 2023

You can trigger a rebase of this PR by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
> **Note**
> Automatic rebases have been disabled on this pull request as it has been open for over 30 days.

---
updated-dependencies:
- dependency-name: torch
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies label May 9, 2023

mzweilin commented Jun 2, 2023

We should also upgrade to the latest template: ashleve/lightning-hydra-template@v1.4.0...v2.0.2

Notable changes

  • pytorch_lightning -> lightning.pytorch
  • use lightning.fabric for TPU (see the sketch after this list)
  • upgrade to Hydra 1.3
  • upgrade the tools used in pre-commit
  • config datamodule -> data
  • add a CPU trainer config
  • move src.tasks to src
  • stop exporting an extra log after a task exception
  • add aim as a logger
  • split src.utils.utils into several .py files
  • update .gitignore for the aim logger
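
For the lightning.fabric item, a minimal sketch of what a TPU training loop with Fabric looks like in lightning 2.x. This is not the template's actual code; the function name, batch format, and hyperparameters are illustrative assumptions.

```python
import torch
from lightning.fabric import Fabric


def train_on_tpu(model: torch.nn.Module, dataloader, epochs: int = 1):
    # Fabric moves model, optimizer, and data to the accelerator;
    # accelerator="tpu" assumes a TPU runtime is actually available.
    fabric = Fabric(accelerator="tpu", devices=8)
    fabric.launch()

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    model, optimizer = fabric.setup(model, optimizer)
    dataloader = fabric.setup_dataloaders(dataloader)

    for _ in range(epochs):
        for x, y in dataloader:  # assumes the dataloader yields (input, target) pairs
            optimizer.zero_grad()
            loss = torch.nn.functional.cross_entropy(model(x), y)
            fabric.backward(loss)  # replaces loss.backward()
            optimizer.step()
```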

@mzweilin mzweilin self-assigned this Jun 5, 2023

mzweilin commented Jun 5, 2023

Changes in pytorch-lightning ~= 1.6.5 -> lightning ~= 2.0.2.

  • import pytorch_lightning as pl -> from lightning import pytorch as pl
  • LightningModule: training_epoch_end(self, outputs) -> on_train_epoch_end(self)
  • LightningModule: remove training_step_end()
  • LightningModule: the arguments of configure_gradient_clipping() change (see the sketch below)
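
A small sketch of these hook changes on an illustrative classifier (the class, layer sizes, and metric names below are not ours):

```python
import torch
from lightning import pytorch as pl  # was: import pytorch_lightning as pl


class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)
        self.training_step_outputs = []

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.cross_entropy(self.layer(x), y)
        # on_train_epoch_end() no longer receives `outputs`, so anything
        # needed at epoch end has to be cached manually.
        self.training_step_outputs.append(loss.detach())
        return loss

    # 1.x: def training_epoch_end(self, outputs): ...
    def on_train_epoch_end(self):
        self.log("train/loss_epoch", torch.stack(self.training_step_outputs).mean())
        self.training_step_outputs.clear()

    # 1.x passed optimizer_idx as the second argument; 2.x drops it.
    def configure_gradient_clipping(self, optimizer, gradient_clip_val=None, gradient_clip_algorithm=None):
        self.clip_gradients(optimizer, gradient_clip_val=gradient_clip_val,
                            gradient_clip_algorithm=gradient_clip_algorithm)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())
```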


mzweilin commented Jun 5, 2023

lightning 2.0.2 depends on torchmetrics >=0.7.0, <2.0.

However, we want to keep torchmetrics == 0.6.0 because mAP is super slow in later versions.

I hope torchmetrics changes the backend of mAP in an upcoming release.

Changes in torchmetrics == 0.6.0 -> torchmetrics == 0.11.4

  • Accuracy now requires num_classes in its arguments (see the sketch below)
  • torchmetrics.detection.MAP -> torchmetrics.detection.mean_ap.MeanAveragePrecision
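
A quick sketch of the call-site changes under 0.11.4 (the task choice and tensor values are illustrative; note that 0.11.x also wants a task argument alongside num_classes):

```python
import torch
from torchmetrics import Accuracy
# 0.6.0: torchmetrics.detection.MAP
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# 0.6.0 allowed Accuracy() with defaults; 0.11.4 needs the task and class count up front.
accuracy = Accuracy(task="multiclass", num_classes=10)
accuracy(torch.randint(0, 10, (8,)), torch.randint(0, 10, (8,)))

# MAP was renamed to MeanAveragePrecision; update/compute work as before.
mean_ap = MeanAveragePrecision()
preds = [{
    "boxes": torch.tensor([[0.0, 0.0, 10.0, 10.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[0.0, 0.0, 10.0, 10.0]]),
    "labels": torch.tensor([0]),
}]
mean_ap.update(preds, targets)
print(mean_ap.compute())
```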


dxoigmn commented Jun 5, 2023

The reason we used torchmetrics == 0.6.0 is that MeanAveragePrecision is super slow in newer versions. It looks like they're finally going to revert to the original implementation that uses the COCO API: Lightning-AI/torchmetrics#1024.


dxoigmn commented Jul 14, 2023

Should this be closed @mzweilin?
