I'm training a neural net in Theano and lasagne, running the code in an IPython notebook. I like having the train and valid loss displayed at each epoch, like this:

epoch    train loss    valid loss    train/val    valid acc  dur
-------  ------------  ------------  -----------  -----------  -----
  1       0.53927       0.22774      2.36794      0.93296  5.45s
  2       0.28789       0.16561      1.73840      0.95033  5.40s

but I would also like to see the live/dynamic plot of the two losses. Is there a built-in way to do so?

I have tried creating a custom class and adding it to my net's on_epoch_finished, but either I get a new plot at each epoch (I'd like a single one that updates in place), or I have to erase the previous output at each epoch, which means I can no longer see the text log (which I want to keep).
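Roughly the kind of callback I tried (a simplified sketch, not my exact code):

from matplotlib import pyplot as plt

class NaivePlotLosses(object):
    def __call__(self, nn, train_history):
        # Drawing into a fresh figure on every call: the notebook then stacks
        # a brand-new plot under the cell output after each epoch.
        plt.figure()
        plt.plot([h["train_loss"] for h in nn.train_history_], label="train")
        plt.plot([h["valid_loss"] for h in nn.train_history_], label="valid")
        plt.legend()
        plt.show()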

1 Answer

I finally managed to get the updating loss plot I wanted with the following class:

from IPython import display
from matplotlib import pyplot as plt
import numpy as np
from lasagne import layers
from lasagne.updates import nesterov_momentum
from nolearn.lasagne import NeuralNet
from lasagne import nonlinearities

class PlotLosses(object):
    def __init__(self, figsize=(8, 6)):
        # Create the figure once; every epoch redraws into the same axes.
        plt.figure(figsize=figsize)
        plt.plot([], [])

    def __call__(self, nn, train_history):
        # nolearn calls this after every epoch with the net and its history.
        train_loss = np.array([i["train_loss"] for i in nn.train_history_])
        valid_loss = np.array([i["valid_loss"] for i in nn.train_history_])

        plt.gca().cla()  # clear the axes instead of opening a new figure
        plt.plot(train_loss, label="train")
        plt.plot(valid_loss, label="valid")

        plt.legend()
        plt.draw()
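A note on backends: in my notebook the plt.draw() call above refreshes the existing figure while the printed table keeps scrolling below (this is what %matplotlib notebook gives you). If you are on the plain inline backend, where figures are only rendered when the cell finishes, a variant along the lines of the following sketch (not part of the original class, using the IPython display module imported above) is needed instead; the trade-off is that clear_output also wipes the printed log:

class PlotLossesInline(object):
    # Sketch for the %matplotlib inline backend.
    def __init__(self, figsize=(8, 6)):
        self.figsize = figsize

    def __call__(self, nn, train_history):
        # Redraw the curves, then replace the cell output with the new figure.
        plt.figure(figsize=self.figsize)
        plt.plot([i["train_loss"] for i in nn.train_history_], label="train")
        plt.plot([i["valid_loss"] for i in nn.train_history_], label="valid")
        plt.legend()
        display.clear_output(wait=True)  # also clears the printed epoch table
        display.display(plt.gcf())
        plt.close()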

And a code example to reproduce it:

net_SO = NeuralNet(
    layers=[(layers.InputLayer, {'name': 'input', 'shape': (None, 1, 28, 28)}),
            (layers.Conv2DLayer, {'name': 'conv1', 'filter_size': (3, 3), 'num_filters': 5}),
            (layers.DropoutLayer, {'name': 'dropout1', 'p': 0.2}),
            (layers.DenseLayer, {'name': 'hidden1', 'num_units': 50}),
            (layers.DropoutLayer, {'name': 'dropout2', 'p': 0.2}),
            (layers.DenseLayer, {'name': 'output', 'nonlinearity': nonlinearities.softmax, 'num_units': 10})],
    # optimization method:
    update=nesterov_momentum,
    update_learning_rate=10**(-2),
    update_momentum=0.9,

    regression=False,
    max_epochs=200,
    verbose=1,

    on_epoch_finished=[PlotLosses(figsize=(8, 6))],  # this is the important line
)

net_SO.fit(X, y)  # X and y from the MNIST dataset
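For completeness, a minimal sketch of one way to prepare X and y in the shape the input layer expects (this assumes a recent scikit-learn with fetch_openml available and is not part of the original snippet):

from sklearn.datasets import fetch_openml

# Fetch MNIST, scale pixels to [0, 1], and reshape to the
# (samples, channels, rows, cols) layout of the (None, 1, 28, 28) InputLayer.
mnist = fetch_openml('mnist_784', version=1, as_frame=False)
X = (mnist.data / 255.0).reshape(-1, 1, 28, 28).astype(np.float32)
y = mnist.target.astype(np.int32)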
answered 2015-08-18 at 15:47