
Using PyTorch's TensorBoard support, I can log my training and validation losses in a single TensorBoard chart like this:

import torch.utils.tensorboard

writer = torch.utils.tensorboard.SummaryWriter()

# Both series use the same main tag 'loss', so they end up in one chart.
for i in range(1, 100):
    writer.add_scalars('loss', {'train': 1 / i}, i)

for i in range(1, 100):
    writer.add_scalars('loss', {'valid': 2 / i}, i)

(Screenshot: a single TensorBoard chart showing both the train and valid curves.)

How can I achieve the same thing with PyTorch Lightning's default TensorBoard logger?

def training_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> Tensor:
    inputs_batch, labels_batch = batch

    outputs_batch = self(inputs_batch)
    loss = self.criterion(outputs_batch, labels_batch)

    self.log('loss/train', loss.item())  # creates separate graph

    return loss

def validation_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> None:
    inputs_batch, labels_batch = batch

    outputs_batch = self(inputs_batch)
    loss = self.criterion(outputs_batch, labels_batch)

    self.log('loss/valid', loss.item(), on_step=True)  # creates separate graph

2 Answers


The documentation describes this as self.logger.experiment.some_tensorboard_function(), where some_tensorboard_function is a function provided by TensorBoard's SummaryWriter. So for your problem, what you want to use is

self.logger.experiment.add_scalars() 

The PyTorch Lightning documentation for the TensorBoard logger can be found here.
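
For example, from inside a LightningModule hook this might look like the following (a minimal sketch assuming the default TensorBoardLogger is attached to the Trainer, and that loss is the tensor computed in your own step):

# With the default TensorBoardLogger, self.logger.experiment is the underlying
# torch.utils.tensorboard.SummaryWriter, so its methods can be called directly,
# e.g. inside training_step:
self.logger.experiment.add_scalars('loss', {'train': loss.item()}, self.global_step)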

Answered 2021-02-21T20:41:19.493

Just to clarify, the code above would then become, in PyTorch Lightning:

def training_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> Tensor:
    inputs_batch, labels_batch = batch

    outputs_batch = self(inputs_batch)
    loss = self.criterion(outputs_batch, labels_batch)

    self.logger.experiment.add_scalars('loss', {'train': loss}, self.global_step)

    return loss

def validation_step(self, batch: Tuple[Tensor, Tensor], _batch_idx: int) -> None:
    inputs_batch, labels_batch = batch

    outputs_batch = self(inputs_batch)
    loss = self.criterion(outputs_batch, labels_batch)

    self.logger.experiment.add_scalars('loss', {'valid': loss}, self.global_step)

Answered 2022-03-02T03:14:57.063