Difference between calling self.log(..., on_step=False, on_epoch=True)
in training_step() and training_epoch_end()
#485
-
lightning-hydra-template/src/models/mnist_module.py Lines 75 to 76 in 75b44ff

In src/models/mnist_module.py, I wonder what the difference is between calling self.log(..., on_step=False, on_epoch=True) in training_step() and logging in training_epoch_end(). The resulting values seem to be the same either way. Is there any difference in the internal mechanism?
Replies: 1 comment 2 replies
-
No difference as far as I'm aware, as long as we're logging the torchmetrics object directly. I'm not sure whether there might be slight differences when we log a plain value through self.log(), though.
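As a pure-Python sketch (no Lightning involved) of why logging a plain value could differ: if the per-value epoch reduction were a simple unweighted mean of per-step values (an assumption for illustration, not a claim about Lightning's exact internals), then with uneven batch sizes it would not match the sample-weighted result that a torchmetrics object computes from raw counts:

```python
# Hypothetical batches of (num_correct, batch_size); the last one is smaller.
batches = [(8, 10), (9, 10), (2, 5)]

# Logging a plain per-step value: sketch assumes an unweighted mean
# of the per-step accuracies at epoch end.
per_step_acc = [correct / size for correct, size in batches]
unweighted_mean = sum(per_step_acc) / len(per_step_acc)

# Logging a torchmetrics-style object: the metric accumulates raw counts
# and computes the sample-weighted accuracy once per epoch.
total_correct = sum(correct for correct, _ in batches)
total_samples = sum(size for _, size in batches)
weighted_acc = total_correct / total_samples

print(round(unweighted_mean, 4))  # 0.7
print(weighted_acc)               # 0.76
```

So as long as the torchmetrics object itself is logged, the aggregation logic lives in the metric and both call sites behave the same; differences can only creep in when a raw value is handed to the logger.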
In training_epoch_end(), the flags on_step=False, on_epoch=True don't matter, as you're always logging only once per epoch. In training_step(), setting on_epoch=True will make Lightning average the given value over the logs from all steps.
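A minimal sketch of that averaging (mimicking the behavior, not Lightning's actual code): accumulating step-logged values and emitting their mean at epoch end lands on the same number as computing the epoch mean yourself and logging it once:

```python
# Hypothetical per-step loss values for one epoch.
step_values = [0.5, 1.5, 2.0, 4.0]

# Path 1: self.log("train/loss", v, on_step=False, on_epoch=True) each
# step; conceptually, a running sum and count are kept, and sum / count
# is emitted once at epoch end.
running_sum, count = 0.0, 0
for v in step_values:
    running_sum += v
    count += 1
epoch_value_via_steps = running_sum / count

# Path 2: aggregate yourself and log the result once from
# training_epoch_end() -- same number, emitted at the same point.
epoch_value_direct = sum(step_values) / len(step_values)

print(epoch_value_via_steps)  # 2.0
print(epoch_value_via_steps == epoch_value_direct)  # True
```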